The firms, including Anthropic and Inflection AI, are also making new commitments to share information with governments, civil society, and academics to improve risk mitigation, and to report vulnerabilities as they emerge. Leading AI companies will also incorporate digital watermarks into the material they generate, offering a way to help distinguish real images and video from those created by computers.
The package formalizes and expands some of the steps already underway at major AI firms, which have seen immense public interest in their emerging technology, matched only by concern over the corresponding societal risks.
Nick Clegg, the president of global affairs at Meta, said the voluntary commitments were an “important first step in ensuring responsible guardrails are established for AI and they create a model for other governments to follow.”
“AI should benefit the whole of society. For that to happen, these powerful new technologies need to be built and deployed responsibly,” he said in a statement released early Friday.
White House aides say the pledge helps balance the promise of artificial intelligence against its risks, and is the result of months of intensive behind-the-scenes lobbying. Many of the executives expected at the White House on Friday attended a meeting with Biden and Vice President Kamala Harris in May, where the administration warned the industry it was responsible for ensuring the safety of its technology.
“We’ve got to make sure that the companies are pressure testing their products as they develop them and certainly before they release them, to make sure that they don’t have unintended consequences, like being vulnerable to cyberattacks or being used to discriminate against certain people,” White House Chief of Staff Jeff Zients said in an interview. “And the important thing, and you’ll see this throughout all the work, is they can’t grade their own homework here.”