Building an Ustopia: A Conversation on Tech & Equity with Dr. Ruha Benjamin
We are quickly approaching the point where the majority of the workforce is millennials and Gen Z. In the United States, post-millennials are the most ethnically and racially diverse generation ever. The conversations and requirements for creating empowered talent have changed over the years, and for the better. Employees want to work for a great company that offers rewarding work and total rewards, makes an impact in the community, protects the planet, is mission-led, and prioritizes sustainability, fairness, equity, and inclusion. Mission, Trust, Sustainability, Equity and Impact are essential to unlocking the energy of a company.
At OpenText, we embrace these beliefs. We are setting the bar high, holding a mirror to ourselves, learning and advancing, and driving action to be more. We are committed to creating a more diverse industry and company. We will have challenging and even uncomfortable conversations, and take strong action to fight discrimination.
Recently, as part of our celebrations for Black History Month, Dr. Ruha Benjamin, Professor of African American Studies at Princeton University, spoke with OpenTexters about the complex dynamic between innovation and equity. She explored why it is so vital that we learn the history of technology alongside the history of race.
I am deeply grateful to Dr. Benjamin, who both challenged and inspired us to think about the world in new ways. Here are a few highlights from our conversation—which covered everything from Black Mirror and discriminatory design, to how we train future engineers and build better organizations to do good in the world.
I am very pleased to share our conversation.
Mark: You’re a professor at Princeton, with a deep love for education and learning. Can you tell us about the curriculum you put together at Princeton?
Dr. Benjamin: Absolutely. One of the classes I enjoy teaching is called “Black Mirror: Race, Technology and Justice.” It’s a riff on the TV show Black Mirror, which I think is a really important cultural touchstone in terms of having us think about how technology reflects existing social processes. What’s interesting about the show, which we build on in the class, is that technology is not the problem. It is reflecting existing social problems—perhaps amplifying them, perhaps hiding them. So, in that way, we want to put technology in its place. Rather than blaming it for everything, we want to think about it reflecting us, so we need to look back at ourselves.
That class is probably one of the few you could find anywhere in the country that tries to combine the humanities and social sciences with engineering and computer science in this way.
Mark: You talked about technology and society, and the two stories—a dystopian view of the world and a utopian view of the world, whether we think technology is harmful or helpful. I’d like to hear more about how you view that glass. Half empty, half full? Dystopian, utopian?
Dr. Benjamin: Part of it is to think about what other options there are. If it’s not dystopia, if it’s not utopia, it might be what the writer Margaret Atwood calls ustopia. This goes back to the idea that whatever we’re dealing with is a reflection of us, of society, as it is. So that is one way to describe what that third frame is. We need to figure out who the “us” is that’s shaping the world that we have.
One of the things I want us to consider is that there’s a very small sliver of all of humanity who currently monopolizes the power and the resources to shape the world for everyone else. And so, if we want to begin to undo this polarity, we have to broaden whose voices are heard and whose lives are considered as we build both our physical and our digital structures. In that ustopia, we need to broaden the “us” in terms of who’s actually participating in the process.
Mark: You use the term discriminatory design, and have written that racism isn’t actually a form of ignorance, but rather a distorted form of knowledge.
Dr. Benjamin: Yes. Some years ago, a video went viral showing two friends, one Black and one White, at a hotel restroom where the soap dispenser wouldn’t dispense for the friend with darker skin. The video spread under the label “racist soap dispenser,” which is in a way funny, because we know the soap dispenser doesn’t have intentions to be discriminatory. Essentially, because darker skin absorbs more light, the infrared beam wasn’t bouncing back to trigger the dispenser. It could be that everyone the device was tested on had lighter skin, and because of who was behind the scenes, the glitch never came to light. This is a simple, analog example of how harm doesn’t require an intention to discriminate.
But we can think about it in more complex hiring algorithms that many companies now use to streamline the process of recruitment, where in many cases the training data is based on people who already work in the company. So, if you love your employees, they’re doing a great job, that becomes training data to find more employees like that.
But if, for the last 10 or 50 years, you’ve hired mostly men, or mostly people from North America, or only people who spoke one language, then proxies for those patterns get reproduced. You might think of this hiring algorithm as more neutral and unbiased than, say, a human resources officer doing the interviewing, whose own bias and subjectivity may come into the screening process. But if you’re using historical data—and all data is historical—and that history has had explicit or implicit discrimination built into it, you’re going to reproduce that discrimination under the guise of neutrality.
So, intention is the wrong metric because this can happen without the intention to do harm. It can actually happen because you’re not asking the right questions or thinking about the legacy you’re building into these systems.
“Cosmetic” diversity is not enough.
Mark: A few years ago, there was a term being used—and COVID definitely accentuated this—the digital divide. In your view, is the digital divide widening, holding constant, or narrowing? And what can we do?
Dr. Benjamin: I think the consequences of the digital divide are becoming more severe. That’s because more and more of our lives—not just our recreational or leisure time—rely on these tools. Everyday basic needs are now mediated through the internet and through digital tools.
And certainly, as you mentioned, during COVID a lot of this came to light. In New York City, when we went remote, it was found that something like 200,000 schoolchildren were housing insecure or homeless, so they couldn’t do remote learning. The consequences of a problem that already existed became more severe, because now those children relied on these tools to get their basic education.
Not having access to these technologies is really important and something we need to address, but some of these same communities are also hyper-surveilled. Technologies are deployed against them without their knowledge. They’re exposed to technologies in a way that is disempowering. So, in any conversation about access, we need to remember both of those things. We need to understand when people need to be included, but also when people are exposed to technologies and surveillance that are harming them, that are extracting their data and weaponizing it against them.
Inclusion is not a straightforward good, because there are all kinds of predatory “inclusions.” Many of these same communities who don’t have access to the internet and broadband and other digital platforms are experiencing technology, but not of their own design and not of their own choice.
Mark: This predatory inclusion of technology is worthy of awareness and education. It wasn’t something I had been thinking about, but I will be reflecting on it deeply.
25,000 OpenTexters. 10,000 programmers. What advice do you have for us—even if challenging, uncomfortable—to build a better company, one more aware, more inclusive, and to go do good in the world?
Dr. Benjamin: You know, when it comes to technology, if we’re talking about ethics or we’re talking about equity, I think we train our attention on the outcomes, how technology’s impacting X, Y and Z.
What I would love us to do is start much earlier and think about the inputs to the process: the starting values, the assumptions, the incentives, the forms of knowledge we’re feeding in. Reevaluate the things we take most for granted, scrutinize them, and ask ourselves: how is this contributing to the common good, to more equity, inclusion, and justice in the world—or how is it undermining that?
And put each thing under the microscope, rather than approaching this work with big buzzwords, big initiatives, and flashy campaigns that get attention but often don’t scrutinize the nitty-gritty—the small things that actually build up to larger harms.
A key element from my conversation with Dr. Benjamin was the importance of re-inventing “design” and ensuring the design has the right inputs and perspectives. When we improve the design, we improve the chance of positive change. I am a big believer in the idea that we need both long-range telescopes and high-power microscopes when it comes to designing and building software.
Thank you, Dr. Benjamin.
Keep visiting this space for more insights from “OpenTalk with Mark J. Barrenechea,” my conversations with some of the world’s greatest thinkers and leaders.