Monday, April 3, 2023

Add to Cart

Is it just me, or is shopping in-store nowadays an incredible guilt trip? I make eye contact with a salesperson, three hangers in my hands, and I’m all but helpless. They’re starting a dressing room for me! They’re selecting a shirt in two different colorways! They’re telling me I look great in this skirt! Is it psychology? Am I just weak and socially awkward? Do I really need this bucket hat, or is it just because the cashier told me she rang up four of them earlier this afternoon?? I’ll never know.

But online, things are different. I can control my own destiny. My shopping cart can rot for days, no problem. But what if salespeople existed there, too? This quote from Parmy Olson paints quite the picture:

“American users of ChatGPT will soon be able to go to the tool’s main page and select the plugin for Klarna Bank AB, a payments facilitator for thousands of brands like Nike Inc. and Gucci. Once they select Klarna, they could ask ChatGPT to make product recommendations for a gift for their sister. Thanks to the powerful language model underpinning ChatGPT, known as GPT-4, they can give the kind of detail they’d share with a human retail employee, laying out at length how their sister loves movies and kayaking and is in their 30s, for instance. ChatGPT can then make the recommendation.”

While the gift-giving advice is certainly an upgrade from Googling “cool Father’s Day gifts 2023,” do we really want to open up this kind of Pandora’s box? Klarna allows people to treat a Prada handbag as if it were a 30-year mortgage — a phenomenon Alexis Leondis gets into here. Instagram and TikTok algorithms are already scary-good at selling us stuff we don’t need. Adding ChatGPT — a technology that Parmy says “millions of businesses can quickly exploit” — into the mix seems rather dangerous.

In that light, regulation seems all but inevitable. Noah Feldman says there are a few distinct possibilities. On one end, AI companies might be treated “like arms and weapons producers: heavily regulated, staffed by security-cleared scientists, and closely linked to the national security state.” On the other end, “there’s the lightest-touch mode of regulation: lawsuits,” he writes. Determining which approach is most appropriate will take time, which Tyler Cowen argues is an essential ingredient for AI development. Although the prospect of a scary-good online salesperson is slightly terrifying, Tyler says a pause in AI will only hurt us in the long run.
