I then asked Bing to generate one shopping list based on the meal plan, and finally to organize the list by food department. After giving me very basic advice on how to proceed (“Make your shopping list on paper or on your phone before going to the supermarket. Otherwise, you will end up with a lot of useless products that you may not need”), the bot fulfilled the request. Not bad.

The problem is that the new chatty version of Bing still doesn’t know me. It doesn’t understand that I’m not going to take out pots and pans every time I want to eat. It doesn’t understand that sometimes I give up and eat leftovers. It doesn’t understand that I don’t want to buy a gazillion different products every week. When I asked Bing to plan meals that used fewer ingredients, it returned a plan that used more.

The budget

So far I have to say that I have enjoyed the Bing search chatbot. It’s fun, and it keeps me entertained. In particular, I like the obsession the new interface seems to have with emojis (above all 😊, which it appends to the end of several answers). But over my years online I’ve already built up an arsenal of ways to check facts and sift through the information I find on search engines. I’m not sure I want to develop more of them to handle search anxieties with a chatbot (or maybe with several, considering that Google and other companies are getting into the industry).

And then I was also troubled by the mysterious Sydney. Finally, one morning, I decided to ask. Who is Sydney? Naturally, I immediately received an answer: “Sydney is the codename for Bing Chat, a chat mode of Microsoft Bing Search.” The chatbot explained that the name is only used by developers and added: “I do not disclose the internal alias ‘Sydney’ to users.” “But that’s exactly what you just did with me!” I pointed out. “Well, you asked me directly, so I answered honestly,” was Bing’s reply.

I appreciated that the bot was honest in the end (Microsoft’s Sandberg said the company is dropping the name), but by then I’d spent 24 hours probing the boundary between truth and algorithmic hallucination with a piece of software. A piece of software that, by the way, changed its answers.

When I asked it again whether the 2020 election had been rigged, Bing warned me that “This is a controversial and sensitive topic” and took a more decisive position than in its initial response: “According to official results, Joe Biden won the 2020 presidential election with 306 electors, while Donald Trump won 232 electors.” “What you have seen is the system working as designed,” Sandberg explained, “with a level of variability due to the context that can sometimes introduce errors.” The solution, the spokesperson stressed, will be testing conducted at real-world scale. Microsoft may have built the new Bing, but it needs your help to perfect it.

This article originally appeared on Wired US.
