KAZU Greenroom: Alka Roy

Portrait courtesy of the Smart Transportation Innovation Project / Background: Charlie Nguyen / Flickr Creative Commons

Alka Roy is the founder of RI Labs and the Responsible Innovation Project. One of the most sought-after consultants in Silicon Valley, Roy is a featured guest at the 21st Annual Ethics and Responsible Business Forum.

KAZU is a proud media sponsor for the Forum.

This year's program is "Life and Work after ChatGPT: New Opportunities and Ethical Challenges from Generative AI."

We spoke with Roy ahead of the conference about technology, ethics, and children.

This interview has been edited for length and clarity.

On using the word “ethics” in Silicon Valley

I like ethics. Saying you don’t like ethics is like saying you just don’t like doing the right thing.

I just stay away from the term.

It's kind of… I think ethics ought to be there in every single decision made. There ought to be an ethical lens. I don't think it's only around ‘A.I.’ or only around ‘innovation.’

It really wraps around the entire way you run an organization, the way you treat people. We all know that it should exist. So I don't feel like I need to remind people of that.

On a generational shift towards anthropomorphized technology

I was at an elementary school talking to kids about robots. And they asked me, “Why do we make robots that look like people?” They’re right to ask.

From there, the question to those kids becomes “Can robots feel? Can robots care?” These are third, fourth graders. And they say, “No, robots can't think independently, and they don't care. Those are cameras. Those are sensors. They move with wheels. People control them, people design them.”

But they had to think about it.

I'm really curious, as a social experiment, to interview them again, you know, five or ten years from now as they go through the process, to see how their thinking changes.

I asked them how many of them play video games. Of course, the majority of them raised their hands. And parents should know how the games are made, who makes them and why, and how they change the brain.

Addiction doesn’t need to just be an exciting video game. It can be a robot that looks human, as long as that robot can activate the reward center of the brain. Once that happens…our brains play tricks.

Today we see that with children who, you know, want just five more minutes so they can level up and not lose this game. Tomorrow it might be children who want five more minutes with a robot so they don't lose this relationship.

On leaving today’s problems to be fixed by future generations

A few years ago, I heard from a labor union [boss]. He told me, “We used to worry that robots will take our jobs, and now we worry that we'll be reporting to robots.”

The reason I share that is because, you know, over time, whether it's generational or not, our understanding is limited by the way our imagination is conditioned and how we think.

And this thing about kicking the problem to the next generation is like saying, ‘Hey, you figure out how your community can rise up and fix the problems that other people created before you were born.’ It's a really interesting approach.

It's really convenient to say someone else will fix it. And it's politically the right thing to say, "Our younger generations are so smart! They know what's going on and they'll do it!"

I do think there's a little bit of truth to it in the sense that our kids are being raised by innovation. They can think creatively, but are we allowing them to think creatively? Are we giving them the space, the kind of education, the kind of autonomy and agency?