The Future Needs You Today: A Conversation on AI & Decolonization with Karen Palmer


Mark J. Barrenechea

February 29, 2024 · 8 minute read

AI is bringing us into a new epoch of human society—it is a force multiplier for human potential.

OpenText is about Information Management + Data + AI + Trust.

AI also reflects its creators. We are currently at a critical point with AI. This is our moment to build the future that we want to live in.

AI can carry implicit bias and perpetuate inequitable power structures. We are on a journey to hear from a wide variety of voices and learn new perspectives on how to build AI that is sustainable, ethical, and inclusive.

As part of our celebrations for Black History Month, I recently had an incredible conversation with Karen Palmer, Storyteller from the Future, Award-winning XR Artist, and TED Speaker who explores the implications of AI and technology on societal structures and inequality. Karen won the XR Experience Competition at South by Southwest (SXSW) 2023 with her most recent project, Consensus Gentium, designed to drive discussion about data privacy, unconscious biases, and the power of technology.

I am thankful to Karen for sharing her powerful and insightful ideas with OpenText, and exploring with us the idea of decolonizing AI. You can read highlights from our conversation below.

***

Mark: As humans, why do you think we need AI?

Karen: The most important aspect or characteristic of AI is efficiency and speed. So, not about accuracy off the bat. It’s going to make things more efficient for you and it’s going to make it quicker for you. That’s a service they’re providing for you.

It’s all being driven around commerce, capitalism, and then the other side of it is surveillance. Like when Charles was crowned king, that was when they switched on the most complicated and far-reaching facial recognition system that’s ever been used in England.

You and I may think, “hey, do I need AI?” But we haven’t got a choice in what’s happening today, because it’s being suggested to us.

My view on smart cities is that I really call them “surveillance cities,” because everything is sold to us as “it’s going to be more efficient, speedy, and make our life more safe.” But what it does is that it brings in more measures of security.

So, for example, Robert Williams in Detroit was the first person arrested because a facial recognition system got it wrong. That system was called Project Green Light, and it was presented to the city of Detroit and recommended because there was so much crime. The pitch was that if they put this surveillance grid in, it would be better for fighting crime and keep people safer. And what happened is that it’s now surveilling and arresting people of color, and they can’t dismantle that system. It’s here now.

So we have to be very aware of what is being sold to us and how we would like to use it. And by “us,” I mean all people. It might impact people of color or black people or women or minorities first, but it’s going to impact all of us eventually.

Mark: Thank you for sharing that. We’re here to challenge ourselves today. You used this expression, “chains of colonial algorithms,” and you also used the term “decolonizing AI.” I’d love to hear what that means to you and what we should take away from it.

Karen: I’ve been looking at bias in AI since 2016-2017. But the deeper I go into it, the more I feel that maybe that term is a little bit of an understatement. That we really need to look at decolonizing AI and dismantling the colonial biases which are deeply embedded within these artificial intelligence systems—which are built on historical injustices, dominance, and prejudices—and really enable different types of code to be brought to the forefront, such as Indigenous AI.

Let’s create AI systems from an Indigenous perspective, from a different cultural lens, from an African lens, or Hawaiian lens, or a Māori lens. Not coming at it like, “okay, you’ve got to be diverse for the quota of diversity.” This will make systems better for everybody.

What about building solutions from us, the people? What would that look like? How would we actually go around decolonizing society? How would we go around decolonizing AI, and what would that look like? And that’s my work that I’m embarking on.

Mark: So, to bring in wider data sets that express a full picture of society, is that another way to say it?

Karen: Yes! Holistic. Total. Authentic. Representative. Something which is reflective and authentic of the world in which we live.

Mark: OpenAI is in the news almost every day, and they recently announced their video generator, Sora. Some of the early imagery is phenomenal. Google, meanwhile, recently announced that it is pausing image generation in Gemini because historically inaccurate imagery was coming out of it.

I’d welcome your thoughts.

Karen: Let me just backtrack a little, with the writers’ strike in America. That happened because Hollywood and the studios were exploiting people’s rights. Their data, their digital data, their digital identity.

Everybody is nervous about AI taking their jobs, wherever you are, whether you’re a driver, whether you’re an artist like myself. AI is reflective of society. So with the writers’ strike, the studios tend to be quite exploitative of talent. What they were trying to implement through the contracts was also exploitative. So it’s very important for our society to reflect the best part of ourselves, because the AI is going to do that.

Technology reinforces inequality. And when it does that, it’s not a glitch. It’s a signal that we need to redesign our systems to create a more equitable world. And so, as we’re moving forward in this ever-changing world to whatever role is being lost and whatever jobs are being discovered, it’s really important that it’s a world which is accessible for all of us.

And in terms of Gemini, that was the other extreme of bias in AI, where it was too diverse. It generated images of Nazi-era soldiers as Asian women or Black men. It wasn’t historically accurate. So that’s why they paused it.

There’s got to be this middle ground—we’ve gone too far one way in terms of bias and too far the other way in terms of diversity in the data sets—to find something which is more representative.

And that, again, is probably where Indigenous AI and that decolonizing will create a bit more of an authentic representation.

Mark: Yeah, I don’t know how one really regulates it or oversees it. Other than the market going, “good tool / bad tool.” Where is that authentic voice to say, “this whole market’s moving in the right direction?”

Karen: That position of good or bad, that just comes down to perspective. That’s why we’re going to move into the age of perception and greater understanding. Because we’re in a time now of real division, and we’ve got to understand that what you may deem good, someone else might deem bad.

And that’s why, by democratizing more AI, more people can develop their own, more independent systems. So that people can have and code whatever they need to. They’re not dependent on a body doing it for them. Take Joe Biden, two weeks ago: they’ve announced an organization now that’s going to regulate AI. But we don’t really know whose interests it’s actually going to represent, because there’s this history of governments and big business working together.

So that’s why what’s good for someone may not be good for you. It’s about us having a seat at the table of what’s happening.

Mark: Looking 5-10 years out, I’d love to hear your view of how the next few years play out in the world of AI.

Karen Discusses Her View of What’s Next from the Perspective of a Time Traveler from the Future

***

Karen invites us to envision a future where we have already created the world we would like to live in, using technology. What does it look like? Now, work backwards. What steps do we need to take today to get there?

I was inspired by her words: “The future is not something that happens to us. It’s something which we build together.”

I believe AI will be a force multiplier for human potential. To realize this, AI must be combined with our capacity for compassion, justice, and ethical behavior—our humanity, in a nutshell. AI will herald a new era of prosperity if—and only if—we prioritize the humanist impact of technology. Let’s apply AI for the betterment of our world and use it to help us solve our greatest, most pressing challenges. Let’s use it to become more human, not less.

And never forget: the future needs you today.

Thank you, Karen Palmer.

The comments of Karen Palmer are her own and do not necessarily represent the views of Open Text Corporation or its employees.


Mark J. Barrenechea

Mark J. Barrenechea joined OpenText as President and Chief Executive Officer in January 2012, and also serves as a member of the Board of Directors. In January 2016, Mark took on the role of Chief Technology Officer and was appointed Vice Chair in September 2017.
