Dr King spent his last precious hours advocating for the economic rights of African American sanitation workers in Memphis. In his broader vision, this was one arm of a struggle for justice for the poor and powerless that spanned divides of gender and race.
I recently met Ryan Harrison at the Conference on Fairness, Accountability, and Transparency, and he shared his amazing post on socially responsible, or as he puts it, solidarity-based investing.
The post, at least for me, presents a new way of thinking about “return on investment”. Here, the “return” is the uplift and empowerment of our communities in ways that seek to build equity for all instead of maximal profits for a few.
In our brief conversation, Ryan schooled me on bail bond funds as one example. Many people can’t afford the bond for minor traffic violations and misdemeanors, so they end up doing jail time, missing work, losing jobs, and falling into a downward poverty spiral. Since it’s not supposed to be a crime to be Black, Brown, and Poor, non-profit funds such as the Bronx Freedom Fund were set up to provide a route out of this particular trap. An investment in a bail bond fund is a direct investment in the economic viability of a given community — like the South Bronx.
As Ryan points out, the move away from the traditional 401k/IRA can be gradual — say 10% of your investment funds allocated to solidarity investments. It is the start of the journey that matters.
The options for where to put your solidarity dollars range from grant-based investing (like bail bond funds or local food cooperatives like the one in the featured image by Steven) to direct lending programs (like Canopy Coop in Boston) to more traditional equity investments like the Shared Capital Cooperative.
Gayatri Sethi (my life partner) is working on an education platform called Alt-College that’s based on this solidarity model.
Do you have any suggestions on efforts to invest in? Strategies that you have put into place for socially conscious investing? Please share!
The Atlanta WordCamp is an annual gathering for people who use and develop WordPress sites. Although it is put on by and for the Atlanta WordPress community, I met people from all over.
I gave a talk there Sunday (4/15/18) on the state of inclusion in distributed companies. Since WordPress is maintained by a distributed company (Automattic, my employer) and an open source community, the subject is of great relevance.
Let me know what you think. There are more unanswered (and unasked) questions than answers.
Mind filling out this survey if you work at a distributed company or work remotely?
The discussion was lively and thought provoking. A few takeaways:
- It’s important to be explicit about the excluded groups in your company. Only through getting the discussion going can progress be made.
- Many people are still concerned about revealing their race/ethnicity/physical ability (even on EEOC questions at end of hiring applications).
- How do we deal with the bias in reaching out to more diverse populations?
- How do excluded groups even know where to look for positions, when even job search has built-in exclusivity?
- How can independent consultants and freelancers be advocates in this space?
- Is the Internet really the equalizer we think it is?
- How do we start?
I was delighted by the inclusiveness evident in the conference organizers and attendees. One of the many beautiful things about Atlanta.
Please comment and add your questions!
We learned of Stephen Hawking’s passing today. I learned that one of the technologists behind the assistive technology that amplified the continuous flow of so many of his groundbreaking insights is Lama Nachman.
Her story and the implications for better assistive technology are fascinating.
We are both mourning the passing of Stephen Hawking and celebrating Women’s History Month in the US (wait, so that means the other 49% get the rest of the year?). It reminds me of the legions of Joan Feynmans (her brother got the spotlight), Vera Sóss (other Erdös-1’s seem to get the spotlight — wait, can we get an Anna Erdös number?), Katherine Johnsons (took a while to get that spotlight), and Maryam Mirzakhanis who are working away, far from the spotlight, building and unfolding the universe.
In the US, the African American scholar (and February 1st Google doodle subject) Carter G Woodson began working in 1926 to establish “Negro History Week”, for in Woodson’s day the contributions of Black people were “overlooked, ignored, and even suppressed by the writers of history textbooks and the teachers who use them.” Woodson’s Negro History Week evolved into today’s US Black History Month thanks to the efforts of student activists of the 1970s.
My partner, Dr Gayatri Sethi, reminds me that the aspiration of marginalized and minoritized peoples to be heard, to enter into equity in whatever place they call home is universal.
With that in mind, it is no surprise then that Black History Month has been celebrated in the UK for the last 30 years in October. This October a group of mathematicians at University College London — Sean Jamshidi, Nikoleta Kalaydzhieva and Rafael Prieto Curiel — decided to make October Black Mathematicians Month.
During the month they presented interviews with UK mathematicians starting with Dr Nazar Miheisi who does research in Analysis at King’s College. The Aperiodical blog also ran pieces highlighting Black mathematicians, among them Dr Caleb Ashley who gives this Numberphile segment on the fifth postulate.
Building an equitable mathematics community, or better yet an equitable world, should not be confined to a single month — it is an undertaking that will require continuous and deliberate effort. But it is encouraging and inspiring to see many hopeful signs on a global scale.
Do you know of similar efforts in other countries to encourage the participation of marginalized peoples in science and mathematics? If so, please leave a comment or drop an email!
A.I. and Big Data Could Power a New War on Poverty is the title of an op-ed in today’s New York Times by Elisabeth Mason. I fear that AI and Big Data are more likely to fuel a new War on the Poor unless a radical rethinking occurs. In fact, this algorithmic War on the Poor seems to have been going on for quite some time, and the Poor are not winning.
Mason posits that AI and Big Data provide three paths forward from the trap of inequality: 1. the ability to match people to available jobs; 2. the ability to deliver customized training that enables people to perform those jobs; and 3. the ability to algorithmically deliver social welfare programs in a more efficient manner.
The first objective seems within the realm of Indeed.com and LinkedIn’s recommendation algorithms, and the second — personalized training — has a long history in AI systems development. The problem is access: how do you get one of the “good middle-class jobs” in San Francisco when you live in Atlanta and attend a high school that lacks the coursework to prepare you for Stanford? How do you get access to an immersive 3D training environment when your family can’t afford to put down $100 a month for high-speed internet and your school also lacks the equipment?
The third part of Mason’s strategy is the most problematic. We’ve seen AI (meaning machine learning and decision making algorithms) used to enforce biased sentencing practices; seen how skewed training data can lead to racial bias in facial recognition; and the use of data-driven methods in predatory lending has also been documented. These examples are the tip of a deep and still largely unaddressed problem in AI. In short, if the algorithms on which our hopes for transformation are pinned learn from data that reifies the structural racism at the root of social inequity, then we’re simply finding a more optimal route to oppression.
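The mechanism is easy to demonstrate. Here is a minimal sketch, with entirely made-up numbers and a hypothetical lending scenario: two groups are equally qualified, but historical decisions approved one group less often. A model that simply learns the historical approval rates — which is what an unconstrained learner given group membership as a feature converges toward — reproduces the disparity.

```python
import random

random.seed(0)

# Hypothetical historical lending data: qualification rates are identical
# across groups, but past (biased) decisions approved group B less often.
def make_history(n=10000):
    data = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        qualified = random.random() < 0.5          # same rate in both groups
        if qualified:
            # Biased decision-makers: qualified A approved 90%, B only 60%.
            approved = random.random() < (0.9 if group == "A" else 0.6)
        else:
            approved = False
        data.append((group, qualified, approved))
    return data

history = make_history()

# A naive "model": predict approval using the historical approval rate
# observed for each group.
def group_rate(data, group):
    outcomes = [approved for g, _, approved in data if g == group]
    return sum(outcomes) / len(outcomes)

rate_a = group_rate(history, "A")
rate_b = group_rate(history, "B")
print(f"learned approval rate, group A: {rate_a:.2f}")
print(f"learned approval rate, group B: {rate_b:.2f}")
# Despite identical qualification rates, the model inherits the disparity.
```

Nothing in the training step is "wrong" in a narrow statistical sense — the model faithfully fits its data. The harm enters upstream, in the labels, which is why auditing training data matters as much as auditing the algorithm.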
Before we hand over the lives and futures of the most vulnerable members of society to algorithms that we are still trying to fathom, we should strive first for accountability and transparency in algorithms. The efforts underway in New York City to ensure ethical accountability for algorithms are one start.
But if machine learning and AI are the new tools of our age, we should empower all people to put the computational tools and conceptual frameworks of data science to work for them. Black Lives Matter activists took up social networking tools to organize protests and share video that has changed minds and empowered communities. What could a coming generation do with additional visualization and analytical tools?
It was the prospect of using AI to empower education that first attracted me to the field. I think that the emerging technology has some good to do. But the process must necessarily be participatory. When artists, educators, poets, activists, grocery store owners, gardeners — everyone — have access to the tools, then I’ll bet on the human capacity to find new paths to expression and opportunity.
The Black in AI workshop at this year’s NIPS conference could be among the most important events in artificial intelligence this year.
Why? If you think the AI in your phone, car, or bathroom is free of racist or sexist biases, then respectfully give this op-ed by mathematician Cathy O’Neil a close read and begin educating yourself.
Further, the societal threat posed by systems capable of intentionally exploiting racial fears and prejudice should be evident by now.
Developing AI that is free of the gender, racial and other prejudices that continue to mar our society is an immense task and as many have pointed out, there is no one algorithm or tool that will get us there. One part of the solution is opening up the field to scientists, developers, and thinkers from all backgrounds so that norms of oppression and exclusion are questioned and ultimately ushered into the museum.
In that sense, the Black in AI workshop is an important contribution. The stated goal of the workshop is to provide a forum to nurture and develop researchers who are Black — thus promoting the inclusivity of a field that is shamefully homogeneous.
Does this mean that the inclusion of Black people in AI will spell the end of racist AI? Probably not, but clearly the near exclusion of Black folk has given biased systems a free pass. Perhaps a Black AI researcher might be more inclined to raise concerns about algorithmic bias in facial training sets, or even design commercial facial recognition systems with ethnic diversity baked in. But more importantly, a Black blogger, professor, lecturer, or developer might foster important shifts in the way their readers, students, and customers view the world. That has to be a step forward for humanity.