Tech Industry Needs to Innovate Anti-Racist Principles

By Bion Johnson

In tech, we take the concept of specialization pretty far. Not only are roles for people highly specialized, but the focus of a given product’s development often is as well. There’s good reason for this.

When businesses attempt to make functioning, marketable products out of massively complex systems of logic, they need to winnow down the scope until they’re hopefully left with something that is both useful to users and technically feasible. In doing so, technologists effectively train themselves not to think about things outside of their purview.

The problem with this approach is that, all too often, we don’t notice that our creations have broader, unexpected consequences beyond what we intended. In the context of a society with clearly documented systems of oppression, this means we can build software that enables or even amplifies those systems without it ever crossing our minds.

One doesn’t have to look hard to find examples of this. I doubt Google’s facial recognition team intended to make a system that classified the faces of black people as gorillas, or that their search team made it a goal for the top results for inquiries about Jewish people to be links to anti-Semitic hate groups. Airbnb probably did not consciously set out to give racial discrimination a foothold in travel accommodations.

Without an understanding of systemic oppression in our society, we as technologists are prone to becoming complicit in those systems simply by following our industry’s best practices and doing our jobs as best we can. However, given the growing list of high-profile cases such as those above, technologists have no excuse for ignorance of this dynamic.

Furthermore, we are entering a period of technological integration into our society in which decision-making algorithms are beginning to enshrine the aggregate biases of our institutions into processes for which no individual can be held accountable.

The tech industry needs to build an understanding of how to counteract these trends, and in Seattle we are fortunate to have some excellent resources readily available for doing so. Organizations such as the People’s Institute Northwest and Moral Choice offer workshops and curricula that not only help individuals understand how systems of racial oppression function but also help institutions limit their contributions to those systems.

A key lesson from anti-racism training is how to think proactively about systemic oppression, as opposed to making bewildered stabs at shutting the problem down after it crops up.

Companies need to be proactive rather than reactive. Google fixed its search results and facial-recognition system for specific phrases and images only after widespread, highly publicized complaints, and even then the fix is at best a patchwork. There are still plenty of earnest questions one can ask Google about marginalized groups that will return page after page of hate and misinformation.

It’s hard to believe that a company that has tasked itself with curing death and eradicating aging can’t figure out how to filter white-supremacist websites from the results of general inquiries about populations facing prejudice. A search engine that approached anti-racism proactively would build safeguards against clearly racist results into its core feature set.

In order to work proactively against oppression, tech companies must make themselves accountable to people who are most vulnerable to prejudice. It shouldn’t take articles in international news for well-funded tech companies to notice they’re increasing the vulnerability of marginalized people to prejudiced behavior.

Technologists trained in anti-racism will know that their businesses need to build in systems of accountability, not to news organizations and tech columnists, but to historically oppressed communities and people.

How can Google, which white supremacists are using as a conduit for racial prejudice, make itself accountable to the people of color who are the targets of that prejudice? How can Airbnb make itself accountable to travelers who are denied lodging on account of race or disability? Answering those questions is complex and difficult, and will require exactly the kind of intelligence and willingness to innovate that tech workers often pride themselves on.

However, that won’t be enough without the expertise of people who have both studied oppression and lived it. A good first step is engaging with organizations like the People’s Institute Northwest and Moral Choice.

This is not to say that these organizations will have ready-made solutions. What they offer is a way forward in bringing anti-racist methodology into our institutions and practices. The ultimate responsibility is on us.

Featured image is a CC-licensed photo attributed to Tech Jobs Tour.
