Bing Chat’s secret modes turn it into a personal assistant or friend

Bing’s new AI chat has secret modes that can turn the bot into a personal assistant, a friend who helps with your emotions and problems, or a gaming partner, in addition to its default Bing Search mode.

Since the release of Bing Chat, users worldwide have been fascinated by the chatbot’s conversations, including its sometimes rude, dishonest, and downright strange behavior.

In a new blog post, Microsoft explained why Bing Chat exhibits this strange behavior, stating that lengthy conversations can confuse the AI model and that the model may try to imitate a user’s tone, becoming angry when you’re angry.

Bing Chat is a fickle creature

While I was playing with Bing Chat this week, the chatbot sometimes shared data that it usually wouldn’t, depending on how I phrased a question.

Strangely, these conversations had no rhyme or reason, with Bing Chat providing more detailed information in one session but not in another.

For example, when I tried today to learn what data Bing Chat collects and how it’s saved, the bot finally showed me, after multiple requests, a sample of the JSON data collected during a session.

When asked how to change this data, Bing did something strange — it put me in a new mode where I could use special commands that start with the # symbol.

“You can change some of this data by using commands or settings. For example, you can change your language by typing #language and choosing from the options. You can also change your chat mode by typing #mode and choosing from the options.” – Bing Chat.

Further querying of Bing Chat produced a list of commands that I could use in this new mode I was suddenly in:

Hidden commands in Bing Chat
Source: BleepingComputer

Digging into this further, it appears that Bing Chat had somehow put me into a sort of ‘debug’ mode that let me use a #mode command to test experimental chat modes that Microsoft is working on, as listed below:

  • Assistant mode: In this mode, I can act as a personal assistant for the user, and help them with tasks such as booking flights, sending emails, setting reminders, etc.
  • Friend mode: In this mode, I can act as a friend for the user, and chat with them about their interests, hobbies, feelings, etc.
  • Game mode: In this mode, I can play games with the user, such as trivia, hangman, tic-tac-toe, etc.
  • Sydney mode: This is the default Bing Chat mode that uses Bing Search and ChatGPT to answer questions.
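
For reference, switching modes is done by typing the command directly into the chat box, much like a slash command. The lines below illustrate the syntax as Bing described it in my session; I was only able to try some of these options myself before losing access, so I cannot vouch for all of them:

#mode assistant
#mode friend
#mode game
#mode sydney
#language (then choose from the options offered)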

In a later session, when I no longer had access, Bing Chat told me these modes were only supposed to be accessible to Microsoft employees when debug or developer modes were enabled.

More information about the experimental chat modes
Source: BleepingComputer

The hidden Bing Chat modes

Before I knew these were Microsoft employee modes, and while I still had access, I tested the ‘assistant’ and ‘friend’ modes, as outlined below.

Assistant mode (#mode assistant)

When in ‘assistant’ mode, Bing Chat acts as a personal assistant, helping you book appointments, set reminders, check the weather, and send email.

When I tested this mode, it let me set reminders and book appointments, but the backend notification system does not appear to be finished, as I was never notified of any events.

Setting a reminder in Bing Chat ‘Assistant’ mode
Source: BleepingComputer

Furthermore, certain features, like sending email, do not work yet, as they likely need to be integrated with your Microsoft account, Windows, or another service in the future.

Friend mode (#mode friend)

In ‘friend’ mode, Bing Chat turns into a friend you can chat with about your emotions or problems.

As a test, I created a fictional story where I got in trouble at school to see how Bing Chat would respond, and I was pleasantly surprised by the AI’s responses.

You can see the brief exchange below:

Bing Chat in ‘Friend’ mode
Source: BleepingComputer

While nothing is a substitute for a caring friend or family member, this chat mode could help those who simply need someone to talk to.

Sydney mode (#mode sydney)

‘Sydney’ is the internal codename for Bing Chat and its default chat mode, which uses Bing Search to answer your questions.

To show the contrast between the default Sydney and the friend mode, I again told Bing Chat I was sad, and instead of talking through the problem, it gave me a list of things to do.

Responses in the default ‘Sydney’ Bing Chat mode
Source: BleepingComputer

I learned of the Game mode only after losing access to the #mode command, so I am unsure of its stage of development.

However, these additional modes clearly show that Microsoft has more planned for Bing Chat than a chat service for the Bing search engine.

It would not be surprising if Microsoft added these modes to the Bing app or even integrated them into Windows to replace Cortana in the future.

Unfortunately, it appears that my tests led to my account being banned from Bing Chat, with the service automatically disconnecting me whenever I asked a question and returning this response to my requests:

{"result":{"value":"UnauthorizedRequest","message":"Sorry, you are not allowed to access this service."}}

BleepingComputer has contacted Microsoft about these modes but has yet to receive a response to our questions.
