Thursday, June 15, 2023

Robot rights: IPA survey shows our bizarre attitude to AI


How do we stop AI taking over the world? There are many doom scenarios floating around, and plenty of leaders promising to hold talks about curbing its powers.

The Institute of Practitioners in Advertising has come up with a sensible suggestion in the meantime: transparency. Brands could disclose their use of AI-generated content so that we can keep tabs on ChatGPT’s bid for world domination.

In a survey of 2,000 adults, 74% said brands should be transparent about their use of AI, and 75% said they want to be notified when they are not dealing with a real person.

A bizarre 24% of us are in favour of “robot rights” and are prepared to stand up for the belief that we should treat AI as we would our fellow humans. That doesn’t mean we want AI to act like us: 67% think AI shouldn’t pretend to be human or act as if it has a personality (although that’s a tall order for anything built to hold a conversation).

Surprisingly, despite frustration with ever more ubiquitous bots, nearly half of us (48%) still bother to be polite when interacting with virtual assistants, although that figure is down from 64% in 2018.

Josh Krichefski, IPA President and CEO, EMEA & UK, GroupM, says: “AI provides incredible opportunities for our business. As these findings demonstrate, however, the public are understandably cautious about its use – and increasingly so in some areas. It is therefore our responsibility to be transparent and accountable when using AI to ensure the trust of our customers.”

WPP’s recent deal with Nvidia and Publicis Groupe’s tie-up with Adobe are both about building platforms to supply content and images using generative AI.

With a few guidelines in place and some major reputations at stake, there should be a decent chance of policing transparency in these defined communications ecosystems. It will be more of a challenge in the rest of the world, where AI scams and deepfakes are already proliferating.



