The Vatican Releases Its Own AI Ethics Handbook

The Vatican is getting in on the AI craze. The Holy See has released a handbook on the ethics of artificial intelligence, developed with the Pope’s backing.

The guidelines are the result of a partnership between Francis and Santa Clara University’s Markkula Centre for Applied Ethics. Together, they’ve formed a new organisation called the Institute for Technology, Ethics, and Culture (ITEC). The ITEC’s first project is a handbook titled Ethics in the Age of Disruptive Technologies: An Operational Roadmap, meant to guide the tech industry through the murky waters of ethics in AI, machine learning, encryption, tracking, and more.

His Holiness and his associates might not seem like an obvious choice to weigh in on artificial intelligence. But according to Father Brendan McGuire, pastor of St. Simon Parish in Los Altos and an advisor to ITEC, the initiative is the culmination of longstanding interests for the church. He argues the Vatican wields a unique ability to bring key players to the table.

“The Pope has always had a large view of the world and of humanity, and he believes that technology is a good thing. But as we develop it, it comes time to ask the deeper questions,” Father Brendan told Gizmodo in an interview. “Technology executives from all over Silicon Valley have been coming to me for years and saying, ‘You need to help us, there’s a lot of stuff on the horizon and we aren’t ready.’ The idea was to use the Vatican’s convening power to bring executives from the entire world together.”

Where many advocates, academics, and observers focus their efforts on appeals to regulators, the ITEC handbook takes a different approach. Rather than wait for governments to set rules for industry, the ITEC hopes to provide guidance for people within tech companies who are already wrestling with AI’s most difficult questions.

“There’s a consensus emerging around things like accountability and transparency, with principles that align from company to company,” said Ann Skeet, Senior Director of Leadership Ethics at the Markkula Centre, and one of the handbook’s authors. “That’s great, but there’s less consensus about what to actually do and how you actually apply those standards to the design and employment of technology.”

In general, the book argues for building values organised around a set of principles into technology and the companies that develop it from the start, rather than looking back to fix problems post facto. The manual spells out one anchor principle for companies: ensuring that “Our actions are for the Common Good of Humanity and the Environment.” That’s all well and good and, obviously, extremely vague. But the ITEC handbook is organised to break big ideas everyone can agree on into a cascading series of specific advice and actionable steps.

That big anchor principle is broken down into seven guidelines, such as “Respect for Human Dignity and Rights” and “Promote Transparency and Explainability.” Those seven guidelines are then broken down into 46 specific, actionable steps, complete with definitions and examples.

For example, the principle “Respect for Human Dignity and Rights” includes a focus on “Privacy and confidentiality.” To put that idea in practice, the book calls for a commitment to “not collect more data than necessary,” and says “collected data should be stored in a manner that optimises the protection of privacy and confidentiality.” It spells out that companies should consider specific protections for medical and financial data, and focus on responsibilities to users, not just legal requirements.

“The goal is to actually empower the people inside the company as they go about their everyday work, whether it’s writing code or a technical manual, or thinking about issues around workplace culture,” Skeet said. “We’ve tried to write in the language of business and engineers so the resources will actually get used, and so they’re similar to standards they’ve seen before.”

The Vatican isn’t the only organisation asking big questions about the future of AI and technology. Just months after OpenAI released ChatGPT on the world, the company’s CEO Sam Altman was already meeting with President Biden and testifying before Congress about how AI should be regulated.

But if you listen to the tech CEOs working on the technology, they seem most focused on a distant, hypothetical future where robots bring about some version of the end of the world. Hundreds of tech executives recently signed a one-sentence statement about what we should do about AI: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”

As real (or not) as these concerns may be, some critics argue the focus on the future is part of an industry-led effort to distract regulators from the problems we’re already facing thanks to AI technology that actually exists right now.

Fortunately for the tech business, the Vatican has a lot of experience answering questions about how we should consider the apocalypse. Father Brendan said that AI’s possible existential threats are serious, but the near-term AI issues deserve just as much attention. He did not, however, have insight on whether or not the Pope has used ChatGPT.

“Major guardrails are absolutely necessary, and countries and governments will implement them in time,” Father Brendan said. “But this book plays a significant role in fast-tracking the approach to design and consumer implementation. That’s where we’re trying to enable companies to meet the standards we need way ahead of time.”
