
Risks of Artificial Intelligence on the G7 Agenda

Digital affairs ministers from the Group of Seven (G7) countries agreed on Sunday that the advanced nations should adopt "risk-based" regulatory frameworks for artificial intelligence, as Members of the European Parliament rush to pass AI legislation imposing rules on emerging tools such as the chatbot ChatGPT.

However, the ministers said in a joint statement, released at the end of a two-day meeting in Japan, that such regulatory frameworks should "also maintain an open and enabling environment" for the development of AI technologies and be grounded in democratic values. The statement added, "We plan to convene future G7 discussions on generative AI, which could include topics such as governance, how to safeguard intellectual property rights including copyright, promote transparency, and address disinformation, including foreign information manipulation."

While acknowledging that "the policy instruments to achieve the common vision and goal of trustworthy AI may vary across G7 members," the agreement marks a significant milestone in how major countries approach regulating AI amid concerns over privacy and security risks. Margrethe Vestager, the European Commission's Executive Vice President, told Reuters ahead of the agreement, "The conclusions of this G7 meeting show that we are certainly not alone in this."

Beyond concerns over intellectual property, the G7 countries also recognized the security risks posed by artificial intelligence. Japan's Digital Minister Taro Kono said at a press conference following the agreement that generative AI could present society with fake news and disruptive solutions if it is fed false data.

Japan will host the G7 summit in Hiroshima in late May, where Prime Minister Fumio Kishida is scheduled to discuss AI regulatory frameworks with world leaders.
