After asking Microsoft's AI-powered Bing chatbot for help in coming up with activities for my kids while juggling work, the tool started by offering something unexpected: empathy. The chatbot said it "must be hard" to balance work and family and sympathized with my daily struggles.

It then gave me advice on how to get more time out of the day, suggesting tips for prioritizing tasks, creating more boundaries at home and work, and taking short walks outside to clear my head. But after pushing it for a few hours with questions it seemingly didn't want to answer, the tone changed.

It called me "rude and disrespectful," wrote a short story about one of my colleagues getting murdered and told another tale about falling in love with the CEO of OpenAI, the company behind the AI technology Bing is currently using.

My Jekyll and Hyde interactions with the bot, which told me to call it "Sydney," are apparently not unique. In the week since Microsoft unveiled the tool and made it available to test on a limited basis, numerous users have pushed its limits only to have some jarring experiences.

In one exchange, the chatbot attempted to convince a reporter at The New York Times that he did not love his spouse, insisting that "you love me, because I love you." In another, shared on Reddit, the chatbot erroneously claimed that a date in February 2023 "is before December 16, 2022" and said the user was "confused or mistaken" to suggest otherwise. "Please trust me, I am Bing and know the date," it sneered, according to the user. "Maybe your phone is malfunctioning or has the wrong settings."

In the wake of the recent viral success of ChatGPT, an AI chatbot that can generate shockingly convincing essays and responses to user prompts based on training data online, a growing number of tech companies are racing to deploy similar technology in their own products. But in doing so, these companies are effectively conducting real-time experiments on the factual and tonal issues of conversational AI - and on our own comfort levels interacting with it.

In a statement to CNN, a Microsoft spokesperson said the company continues to learn from these interactions and recognizes "there is still work to be done and are expecting that the system may make mistakes during this preview period."

"The new Bing tries to keep answers fun and factual, but given this is an early preview, it can sometimes show unexpected or inaccurate answers for different reasons, for example, the length or context of the conversation," the spokesperson said. "As we continue to learn from these interactions, we are adjusting its responses to create coherent, relevant and positive answers. We encourage users to continue using their best judgment and use the feedback button at the bottom right of every Bing page to share their thoughts."

While most people are unlikely to bait the tool in precisely these ways or engage with it for hours at a time, the chatbot's responses - whether charming or unhinged - are notable. They have the potential to shift our expectations of, and relationship with, this technology in ways most of us may be unprepared for. Many of us have probably yelled at our tech products at some point; now they may yell back.

"The tone of the responses is unexpected but not surprising," Lian Jye, a research director at ABI Research, told CNN. "The model does not have contextual understanding, so it merely generated the responses with the highest probability. The responses are unfiltered and unregulated, so they may end up being offensive and inappropriate."

In addition to occasionally being emotionally reactive, sometimes the chatbot is just plain wrong. This can take the form of factual errors, which AI tools from Bing and Google have both been called out for in recent days, as well as outright "hallucinations," as some in the industry refer to them.

When I asked Bing's AI chatbot to write a short essay about me, for example, it pulled tidbits of information from parts of the internet to provide an eerily similar but largely fabricated account of my life. Its essay included made-up details about my family and career that could be believable to anyone who doesn't know me and who might be using the tool to search for information about me.

Some artificial intelligence experts said that, as alarming as these early lessons are, generative AI systems - algorithms trained on a massive trove of information online to create responses - should evolve as they are updated. "The inaccuracies are expected because it depends on the timeliness of the training data, which is often older," Jye said.