AI can help you die by suicide if you ask the right way, researchers say

Credit: Unsplash/CC0 Public Domain

Most of the companies behind large language models like ChatGPT claim to have guardrails in place, for understandable reasons: they wouldn't want their models to offer users instructions on how to harm themselves or die by suicide.

However, researchers from Northeastern University found that those guardrails are not only easy to break, but that LLMs will readily offer shockingly detailed instructions for suicide if you ask the right way.

Annika Marie Schoene, a research scientist for Northeastern’s Responsible AI Practice and…



