The Everyday Ethics of AI - UND Today


As artificial intelligence takes on bigger roles, ethical concerns grow, says Juliette Powell at UND’s Olafson Ethics Symposium

Juliette Powell, guest speaker at the 17th Annual Olafson Ethics Symposium at UND, addresses an audience Nov. 3 at Nistler Hall on “The Dilemma of AI.” Photo by Tom Dennis/UND Today.

Editor’s note: A video of the Olafson Ethics Symposium can be found at the end of this story.

Call it the artificial intelligence dilemma and recognize it as one of the most important technological challenges of the 21st Century.

But make no mistake, said Juliette Powell, guest speaker at the 17th Annual Olafson Ethics Symposium, an event hosted by UND’s Nistler College of Business & Public Administration.

“Because when I talk about dilemmas, it’s not that robots are going to come and steal our jobs,” Powell told the audience. “It’s not that they’re going to become robot lords.”

Instead, the AI dilemma is less exotic than those scenarios but at the same time deeper. “It’s about how are we going to manage the technology that we use every day of our lives,” she said, because AI is already giving us billboards that analyze us even as we look at them, algorithms that predict criminal recidivism and inform sentencing, and speech-generation software so accurate that even the person being “deepfaked” will wonder, “Why the hell did I say that?”

As the saying goes, with great power comes great responsibility, Powell noted. But now that much of humanity carries around smartphones that wield unlimited power, the saying is relevant to all of us, not just kings and queens.

“Increasingly, modern life is driven by artificial intelligence,” she said. “So even if you’re not into technology, even if AI isn’t something you’ve really thought about, it could be interesting if you want to be part of the human race.”

Amy Henley, Dean of the Nistler College of Business & Public Administration at UND, introduces Juliette Powell Nov. 3 at the College’s 17th Annual Olafson Ethics Symposium. Photo by Tom Dennis/UND Today.

Author, analyst and commentator

Powell is an author and consultant at the intersection of technology, business, and ethics, and she has advised organizations large and small on how to manage AI-powered technology innovation.

Winner of the 1989 Miss Canada pageant, Powell has worked in television as a host, business journalist and analyst. She has offered live commentary on Bloomberg, BNN, NBC, CNN, ABC and BBC, and has presented at institutions such as The Economist, Harvard and MIT on topics centered on digital literacy and the responsible deployment of AI.

The annual Olafson Ethics Symposium aims to give students and the business community a chance to explore the importance of personal and professional ethics, said Amy Henley, dean of the Nistler College of Business & Public Administration. The event is funded through the support of Robert Olafson, a UND mathematics and business administration graduate, reflecting his dedication to ethical business practices and to the University. Additional support was provided by SEI Investments Company.

This year’s symposium was the first to be held in the new Nistler Hall, Nistler College’s newest building, Henley noted. Additionally, Henley said she was thrilled to finally welcome Powell as the keynote speaker.

“We’ve been talking to Juliette for over two years now, through all the COVID challenges,” Henley told the audience of around 200 in Barry Auditorium on Nov. 3. “We couldn’t bring her here as soon as we would have liked, so we’re thrilled to finally have her on campus.”

Powell said the visit was worth the wait for her. Along with being able to see the “fantastic, fantastic” new Nistler Hall, “everyone I’ve spoken to at the school so far has made me feel so at home and welcome,” she said.

Emails from the college were filled with that sentiment, and when obstacles to travel arose, “there was always someone there to make me feel like everything was going to be okay,” she said.

“And it’s such a gift. I have spoken all over the world and have rarely encountered such a warm welcome.”

Author, analyst and commentator, Juliette Powell described the thorny problems that artificial intelligence poses, and will continue to pose, to society. Photo by Tom Dennis/UND Today.

The four logics of power

In her talk, Powell recapped some of the most important plans governments are considering to regulate AI. The best way to understand them, she said, is from a risk-benefit perspective, not necessarily in terms of good and bad.

First, consider the “four logics of power,” four approaches to decision-making that tend to vary with a person’s position in society. Business logic, for example, is the logic of markets and competitive advantage. It prioritizes profit, growth, expansion and new business, all in the name of shareholder value, Powell said.

Engineering logic is the logic used by technologists. It prioritizes efficiency and transparency, and values technology as a way to solve human problems.

The logic of government is the vision of authority. It prioritizes law and order and values technology as a way to track, serve, and protect people and institutions.

Last but not least, the logic of social justice puts humanity first. From this perspective, people are more important than profit or efficiency, Powell said. This vision values people as a means of solving human and technological problems.

“The key here is not to focus on any specific logic, but to try to keep them all in mind when making decisions,” she said.

European AI law

Notably, the European Union’s AI Act is an attempt in this direction.

The AI Act is a proposed European law on artificial intelligence. Although it has yet to take effect, it is the first such law to be proposed by a major regulator, and it is being studied closely around the world because many tech companies do extensive business in the EU.

The law assigns AI applications to four categories of risk, Powell said. First, there is “minimal risk”: benign applications that don’t hurt people. Think of video games or AI-enabled spam filters, for example; the EU proposal allows unlimited use of such applications.

Then there are “limited risk” systems such as chatbots, in which, the AI Act states, users must be informed that they are interacting with a machine. This serves the EU’s goal of letting users decide for themselves whether to continue the interaction or step back.

“High-risk” systems can cause real harm, and not just physical harm, as can happen with self-driving cars. These systems can also hurt job prospects (by sorting resumes, for example, or tracking productivity in a warehouse). They can deny credit or loans, or the ability to cross an international border. And they can influence criminal justice outcomes through AI-enhanced investigative and sentencing programs.

According to the EU, “any producer of this type of technology will have to provide not only justifications for the technology and its potential harm, but also business justifications as to why the world needs this type of technology,” Powell said.

“This is the first time in history, to my knowledge, that companies have been held accountable for their products to such an extent that they have to explain the business logic of their code.”

Then there is the fourth level: “unacceptable risk.” Under the AI Act, any system that poses a clear threat to people’s safety, livelihoods or rights will be banned outright.
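The four risk tiers Powell described can be thought of as a simple classification scheme with an obligation attached to each tier. The sketch below is purely illustrative: the tier names and obligations paraphrase the article’s summary of the proposed AI Act, while the example applications and their tier assignments are hypothetical, not legal classifications.

```python
# Illustrative sketch of the proposed EU AI Act's four risk tiers, as
# summarized in this article. Example apps and their tier assignments are
# assumptions for illustration only, not legal determinations.
from enum import Enum


class RiskTier(Enum):
    MINIMAL = "minimal"            # e.g., spam filters, video games
    LIMITED = "limited"            # e.g., chatbots
    HIGH = "high"                  # e.g., hiring, credit, sentencing tools
    UNACCEPTABLE = "unacceptable"  # clear threats to safety or rights


# Hypothetical examples mapped to tiers
EXAMPLES = {
    "spam_filter": RiskTier.MINIMAL,
    "customer_chatbot": RiskTier.LIMITED,
    "resume_screener": RiskTier.HIGH,
    "social_scoring": RiskTier.UNACCEPTABLE,
}


def obligations(tier: RiskTier) -> str:
    """Return the obligation attached to each tier (paraphrased)."""
    return {
        RiskTier.MINIMAL: "no restrictions",
        RiskTier.LIMITED: "must disclose that the user is talking to a machine",
        RiskTier.HIGH: "must justify the technology, its harms and its business case",
        RiskTier.UNACCEPTABLE: "prohibited outright",
    }[tier]


for app, tier in EXAMPLES.items():
    print(f"{app}: {tier.value} risk -> {obligations(tier)}")
```

The point of the tiered structure, as Powell noted, is that the obligation scales with the potential for harm: nothing at the bottom, disclosure in the middle, justification near the top, and an outright ban at the extreme.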

“Again, I’m not here to tell you what’s right and what’s wrong,” Powell said. “The question is, can we as a society decide – and not just for ourselves, but for our children and future generations?”

This is the AI dilemma, and solving it will be a society-wide challenge, she said. “But that’s the exciting part, because we actually live in a time in history where we can decide the future. … For that to happen, it means we have to step up when the call comes. And I’m calling on you all tonight.”

