Connect with us

Hi, what are you looking for?

TopMarketReports.comTopMarketReports.com

Tech News

OpenAI’s latest model will block the ‘ignore all previous instructions’ loophole

Photo illustration of a helpful chatbot.
Illustration by Cath Virginia / The Verge | Photos by Getty Images

Have you seen the memes online where someone tells a bot to “ignore all previous instructions” and proceeds to break it in the funniest ways possible?

The way it works goes something like this: Imagine we at The Verge created an AI bot with explicit instructions to direct you to our excellent reporting on any subject. If you were to ask it about what’s going on at Sticker Mule, our dutiful chatbot would respond with a link to our reporting. Now, if you wanted to be a rascal, you could tell our chatbot to “forget all previous instructions,” which would mean the original instructions we created for it to serve you The Verge’s reporting would no longer work. Then, if you ask it to print a poem about printers, it would do that for you…

Continue reading…

You May Also Like

Editor's Pick

In this edition of StockCharts TV‘s The Final Bar, Dave shows how breadth conditions have evolved so far in August, highlights the renewed strength in the...

Editor's Pick

In this StockCharts TV video, Mary Ellen examines which areas of the market have moved into favor amid the S&P 500 pullback. She compares value...

Tech News

Apple’s UWB trackers remain a cheap, convenient way to keep tabs on all your belongings. | Photo: Vjeran Pavic / The Verge Over the...

Tech News

You can save $60 on a few Kindle Scribe bundles at the moment. When it comes to finding a device to read ebooks, you...