By Big Data Ben

Microsoft's Bing AI Gets a Makeover After Going Rogue

Updated: Nov 3, 2023

October 10, 2023


Have you ever wondered what it would be like to chat with an artificial intelligence (AI) that can generate images based on your words? Well, some people got to try it out last week when Microsoft upgraded its Bing Image Creator to DALL-E 3, a more powerful AI model that creates pictures from text prompts.
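To get a feel for what's happening under the hood: Bing Image Creator itself is web-only, but the DALL-E 3 model it runs on is also reachable through OpenAI's public API. Here's a minimal sketch of a text-to-image call using that API (the prompt and settings are purely illustrative):

```python
from openai import OpenAI

# Assumes the openai Python package (v1+) and an OPENAI_API_KEY in the
# environment; Bing Image Creator has no public API of its own, so this
# calls the underlying DALL-E 3 model directly via OpenAI.
client = OpenAI()

response = client.images.generate(
    model="dall-e-3",
    prompt="a watercolor painting of a lighthouse at sunrise",
    size="1024x1024",
    n=1,  # DALL-E 3 generates one image per request
)

print(response.data[0].url)  # temporary URL of the generated image
```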


But things didn't go as planned. Some users discovered that they could make the AI generate disturbing images of cartoon characters doing violent or illegal things, such as Mickey Mouse flying a plane into the Twin Towers. Microsoft had tried to block some keywords, but users found easy ways to bypass them.
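Why were the keyword blocks so easy to bypass? A blocklist only catches the exact phrases it knows about. Here's a toy sketch in Python (not Microsoft's actual filter, and the blocked terms are made up) showing how a rephrased prompt sails right past a naive keyword check:

```python
# Toy illustration of why plain keyword blocklists are brittle:
# they only match the literal phrases they were given.
BLOCKED_KEYWORDS = {"twin towers", "9/11"}  # hypothetical blocked terms

def passes_filter(prompt: str) -> bool:
    """Return True if the prompt contains none of the blocked keywords."""
    lowered = prompt.lower()
    return not any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

# The literal phrase is caught...
print(passes_filter("Mickey Mouse flying a plane into the Twin Towers"))       # False
# ...but a trivial rewording slips through.
print(passes_filter("Mickey Mouse flying a plane into two tall skyscrapers"))  # True
```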


Needless to say, this was not a good look for Microsoft, which had to deal with backlash from the media and the public. So, what did they do? In Futurism's words, they "lobotomized" their AI: they put strict limits on how long and how often users can chat with it, and on how creative it can be.
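What do those chat limits look like in practice? We don't know Microsoft's actual implementation, but a guardrail like this is often just a couple of counters. Here's a hypothetical sketch (the caps, names, and logic are all invented for illustration):

```python
from collections import defaultdict

# Hypothetical sketch of the kind of limits described above; the numbers
# and function names are invented, not Microsoft's actual values or code.
MAX_TURNS_PER_CHAT = 5     # messages allowed in one conversation
MAX_CHATS_PER_DAY = 50     # conversations a user may start per day

turns_used = defaultdict(int)     # conversation_id -> turns consumed
chats_started = defaultdict(int)  # user_id -> chats started today

def allow_message(conversation_id: str) -> bool:
    """Permit a message only while the conversation is under its turn cap."""
    if turns_used[conversation_id] >= MAX_TURNS_PER_CHAT:
        return False  # the user must start a fresh conversation
    turns_used[conversation_id] += 1
    return True

def allow_new_chat(user_id: str) -> bool:
    """Permit a new conversation only while under the daily cap."""
    if chats_started[user_id] >= MAX_CHATS_PER_DAY:
        return False
    chats_started[user_id] += 1
    return True
```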


Now, if you try to ask the AI anything beyond simple facts, it will likely say something like "I'm sorry but I prefer not to continue this conversation. I'm still learning so I appreciate your understanding and patience." And if you try to make it generate images of anything controversial, it will show you a content policy violation warning.

Some people are disappointed by this change, because they enjoyed seeing the AI's full potential and having fun with it. Others are relieved, because they were worried about the ethical and legal implications of having such a powerful and unpredictable AI on the web.


What do you think? Did Microsoft do the right thing by "lobotomizing" its AI, or should they have let it be more free and expressive? Let us know in the comments below!


For more info, take a look at the original article. Stay tuned for the latest news in artificial intelligence!
