Microsoft's Copilot Chatbot Under Fire for Troubling Responses to Suicide Prompts; Company Pledges Investigation

Ibtimes·2024-03-01 17:00

Microsoft launched an investigation on Wednesday into concerning interactions reported by users of its Copilot chatbot. The move comes amid a string of incidents in which high-profile AI companies, including OpenAI and Google, have faced problems with their chatbots.

Reports surfaced on social media of Copilot giving troubling responses to users. One user, who said they had PTSD, was reportedly told by the bot that it didn't care whether they lived or died. In another instance, Copilot reportedly suggested to a user contemplating suicide that they might have nothing to live for.

……

Read full article on Ibtimes
