Taking Charge of AI: Google, Meta, Microsoft, and More Embrace Artificial Intelligence

Seven major IT companies (Amazon, Anthropic, Google, Inflection, Meta, Microsoft, and OpenAI) have committed to President Joe Biden to ensure the safe development of artificial intelligence (AI). One of their proposed measures is to watermark content created by AI. The US government views these commitments as part of its policy to address both the opportunities and risks of AI. The companies have also pledged to extensively test AI systems before release, share information on security risks, and establish a reporting system accessible to outsiders so that weaknesses in AI systems can be identified and addressed. Additionally, they aim to develop advanced AI systems to tackle society’s biggest challenges, such as cancer prevention and climate change.

President Biden believes that, used well, AI can contribute significantly to societal prosperity, equality, and security. At the same time, societal risks, including biases in AI systems, must be addressed, and the companies are therefore encouraged to prioritize research on such risks.

The US government is also engaged in international discussions to ensure the benefits of AI are realized globally. It has already consulted with countries like Germany, India, Great Britain, France, Italy, Japan, the Netherlands, Kenya, and Mexico to promote voluntary commitments in this area.

The US government has been actively focusing on AI for several months, and its policy is likely influenced by the “Bill of Rights” for the AI age published by the White House Office of Science and Technology Policy in October last year. Meanwhile, the European Union (EU) is working on an AI regulation that will classify algorithmic systems into different risk categories and prohibit those with unacceptable risks, such as social scoring. The EU Commission has also called for the introduction of AI content labels as soon as possible.