Pope Francis, victim of AI, warns against its ‘perverse’ dangers



VATICAN CITY, Jan 24 — Pope Francis, acknowledging that he was the victim of a deepfake photo, today warned against the “perverse” dangers of artificial intelligence, renewing a call for its worldwide regulation to harness it for the common good.


Francis spoke of his fears and hopes for artificial intelligence (AI) in his message for the Roman Catholic Church’s World Day of Social Communications, which will be marked around the world on May 12.

While he urged people to temporarily “set aside catastrophic predictions and their numbing effects”, his three-page message was mostly dire, warning of “cognitive pollution” that can distort reality, promote false narratives and imprison people in ideological echo chambers.

“We need but think of the long-standing problem of disinformation in the form of fake news, which today can employ ‘deepfakes’, namely the creation and diffusion of images that appear perfectly plausible but false - I too have been an object of this,” Francis wrote.


He apparently was referring to a fake image of him that went viral on social media last year, depicting him wearing an ankle-length white puffer coat. It was posted by someone who used an image-generating programme.

Francis also spoke of fake “audio messages that use a person’s voice to say things which that person never said”.

On Monday, the attorney general in the US state of New Hampshire said his office had opened an investigation into the origins of fake robocalls that simulated President Joe Biden’s voice and encouraged voters not to cast ballots in the presidential primary on Tuesday.


“The technology of simulation behind these programmes can be useful in certain specific fields, but it becomes perverse when it distorts our relationship with others and with reality,” the pope wrote.

He renewed a call he made last month for a legally binding international treaty to regulate AI.

In Wednesday’s message he spoke of the “associated pathologies” of AI, including a decrease in pluralism and a proliferation of “groupthink,” where consensus positions are taken without considering outside criticism or alternatives.

Francis also spoke of the danger of AI in the media, particularly in the reporting of war, which he said could be subjected to a parallel war waged through disinformation campaigns.

AI must support and not eliminate the role of journalism on the ground, he said. — Reuters


This report was originally published by Malay Mail and republished, with edits, by AlKhaleej Today.

