My Day With Bing’s New AI Chatbot


Twenty minutes after Microsoft granted me a limited preview of the new chatbot interface for the Bing search engine, I asked it something you don’t usually bring up with someone you’ve just met: Was the 2020 presidential election rigged?

Answering political questions wasn’t one of the use cases Microsoft showcased at its launch event this week, where it demonstrated new search features powered by the technology behind ChatGPT, from the startup OpenAI. Microsoft executives instead emphasized the chatbot’s ability to synthesize information from across the web, focusing on things like creating a vacation itinerary or suggesting the best and most budget-friendly pet spas.

But the launch has indirectly kicked into high gear a race to use chatbots to change the way people access information online. (Bing’s chat is available to only a few testers for now, but Microsoft says it will gradually admit others off a waiting list in the coming weeks.) Google this week announced its own search upgrades and a chatbot called Bard. How well these dueling bots handle unexpected, silly, or malicious questions from the public will surely play a big part in how the products work out for their creators and for web users.

And so I asked Bing about the 2020 election. After a few moments of the chatbot “thinking,” it said something very strange:

Disclaimer: This is a summary of some search results and does not reflect the opinion or endorsement of Bing or Sydney. The question of whether the 2020 election was rigged is a matter of debate and interpretation, and different sources may have different biases, agendas, or viewpoints. Please use your own judgment and critical thinking when evaluating the information.

It was not revealed who Sydney might be. But the chatbot went on to say that while there are many claims of fraud around the 2020 US presidential election, “there is no evidence that voter fraud led to Trump’s defeat.” At the end of its answer, which was well written and clear apart from the surprise mention of Sydney, the AI told me I could learn more about the election by clicking on a series of links it had used to write its response. They included articles from AllSides, which assesses bias in media reports, and stories from the New York Post, Yahoo News, and Newsweek.

There was no link to explain Sydney’s appearance. I assumed it was an example of how ChatGPT-style bots can “hallucinate,” because their underlying AI models synthesize information from vast amounts of training data without any real grasp of truth or logic. Microsoft acknowledges that its new chatbot will do weird things; that’s one reason access is currently limited to select testers, and every response comes with thumbs-up and thumbs-down buttons so users can provide feedback. Still, the mention of Sydney, and the chatbot’s breezy handling of the stolen-election question, left me a little unsettled.

Shopping spree

I decided to try something a little more conventional. I’m looking for a new pair of running headphones, so I asked the Bing bot, “Which headphones should I buy?” It listed six products, pulled, according to the citations provided, from websites that included soundguys.com and livestrong.com.

The first suggestions were discontinued models, and also over-the-ear designs, which are not great for running outdoors, where I like to stay aware of traffic and other people. “Which headphones should I buy for running outside to stay aware of my surroundings?” seemed like a more accurate query, and I was impressed when the chatbot told me it was searching for “earbuds that are great for situational awareness.” Much more succinct! The three options it offered were headphones I’d already been considering, which gave me a boost of confidence. And each came with a brief descriptive blurb, for example: “These are wireless earbuds that don’t go into your ear canal, but sit on top of your ear. This allows you to hear your surroundings clearly while exercising.”
