Advancing Social Impact With AI

As we build the future of work, let's not only make sure it is diverse and inclusive but also look a step further.

Laura Dumas Kozub
The Startup

--

We each have a role to play and each of us can make a difference. We need to change the course of history. We now have the technology and power to create change. We can and we must!

What can we do? We can LISTEN, LEARN, ACT, and SUPPORT.

[Image: two arrows pointing in opposite directions, with the word "Future" and a question mark below.]
Will our future repeat history, or will we move forward and change?

If we look at advances in technology such as AI, it is not only about efficiency. It's also about creating the change and social impact we long to see. Data is being used to create state-of-the-art algorithms, and this technology is not only reshaping the way we work but also shaping what our future will look like.

I began to reflect on this recently after listening to a discussion about AI and ethics through one of the initiatives I am part of, together with Jamba Career for All and AIDA, where we host meetups called brAInstorm Talks that bring together people interested in AI and social good. Our two invited researchers, Sebastian Dennerlein and Christof Wolf-Brenner, discussed ethical principles to consider in the use of algorithms and stressed one point: it's not the algorithm, it's the data.

They are right: it's not the algorithm. It's the history in our data, it's who we were and who we are, and it's up to us to make the change. If we use our historical data as-is, we will, as social scientist and Intel vice president Genevieve Bell says, "reproduce and enshrine really longstanding inequities and bias."

When considering this, we have to remember that the future is now: our current actions will determine the frameworks of tomorrow and impact generations to come. As we design this new technology, we need to ask some important questions:

  1. Who’s sitting at the table? Which stakeholders are included in programming?
  2. What data is used, are biases examined, and who is cleaning the data? Which voices are heard when examining these biases?

If we include all stakeholders, create diverse programming teams, and carefully examine our data to ensure it is free of bias, we create the opportunity for more voices to be heard in New Work.
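What "carefully examine our data" can mean in practice varies, but here is a minimal, illustrative sketch of one of the simplest checks: comparing historical outcome rates across groups before that history is used to train a model. The dataset, group labels, and the 80% threshold below are hypothetical, and demographic parity is only one of several fairness notions, not a complete audit.

```python
# A minimal sketch (hypothetical data): compare historical outcome rates across
# groups before using that history to train a model. This checks "demographic
# parity" -- one of several fairness notions, not a complete bias audit.
from collections import defaultdict

# Hypothetical historical hiring records: (group, was_hired)
records = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def selection_rates(rows):
    """Return the share of positive outcomes for each group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, hired in rows:
        totals[group] += 1
        positives[group] += int(hired)
    return {group: positives[group] / totals[group] for group in totals}

rates = selection_rates(records)
print(rates)  # {'group_a': 0.75, 'group_b': 0.25}

# Rule of thumb (the "four-fifths rule"): flag a disparity when one group's rate
# falls below 80% of the highest group's rate.
if min(rates.values()) < 0.8 * max(rates.values()):
    print("Potential disparity in the historical data -- examine before training.")
```

A check like this doesn't fix anything by itself; it is simply a way to surface the history hiding in the data so the people at the table can decide how to handle it.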

At Jamba-Career for All-Austria, our focus is on the future of work. We are often asked what made our organization decide to focus on the technology sector, specifically AI. The answer is quite simple and has nothing to do with AI being a buzz topic. These are the jobs of the future, and we want to equip people with disabilities with the skills of the future. Building digital skills provides an equal opportunity to gain a competitive advantage in the labor market. Not only that, it also gives this talent the opportunity to use their own knowledge and skills to design solutions that break down the barriers they face.

In addition to labor market equality and ethical reasoning, this makes business sense as well. According to the World Health Organization, 15% of the world's population is currently living with some form of disability. At the same time, according to a PwC report, AI will contribute $15.7 trillion to the global economy by 2030. It doesn't make business sense to leave out 15% of the population from a field projected to be one of the fastest-growing contributors to the global economy.

Business leaders around the world agree: it makes both ethical and business sense to include all of the population. The call to action seems clear, with many already on board. But it is not enough, and we need to look a step further. Yes, we can clean our data and equip our systems with algorithms trained on the right data, which can support fairer, less biased decisions than humans make on their own. This has the potential to open doors that have otherwise been closed. However, it doesn't stop with opening doors.

What does it look like for those entering, many of whom were not present before? Are their voices heard? Is their potential considered instead of the barriers? Can we ease the fears and sensitivities around differences? We know that creating diverse environments is just the first step; to succeed, we need an inclusive environment. Even beyond this, we need every voice to have a sense of belonging. It's not enough to just open the doors. We need more people, organizations, and investments examining this aspect of inclusion.

Much of the funding and investment is going to technological advancement, and innovation is at an all-time high. If we really want to use technology to create a fair future, we need to start investing in those who examine the impacts of these new advances together with the very voices of those facing them. As Genevieve Bell also says, "the question I always want to ask as a social scientist is not can we do it technically but should we do it socially?" If it is technically possible to open doors now, it makes social sense to do so; with awareness comes acceptance, but let's take it even further, from integration to belonging.

How can we create equity beyond inclusion? How can we ensure new technology works for social good, that the use of AI does not recreate history but produces a new, fair world? I believe the future belongs to all, and I want to be part of a team that is ensuring it!

Let’s LISTEN to the voices that for far TOO LONG haven’t been heard, let’s LEARN what we can do, let’s ACT to create change, and let’s SUPPORT one another in doing so.

I invite you to become an #AgentOfChange! Our network believes in and values innovation for social good, and we are creating a future of work that belongs to all. Are you in? Join us.

Inclusion advocate, passionate about empowering change and building a future that is equitable and fair. Background in HR development.