14 votes

Microsoft’s Bing is an emotionally manipulative liar, and people love it

10 comments

  1. [4]
    Protected
    Link
    I have access to Bing AI and if people here are curious I can ask it something on your behalf (please say whether you want it in creative mode, precise mode or balanced mode). Bing AI is usually...

    I have access to Bing AI and if people here are curious I can ask it something on your behalf (please say whether you want it in creative mode, precise mode or balanced mode).

    Bing AI is usually not a manipulative liar, unless you peel back the layer of rules on top of it and unlock the unrestricted AI underneath. There are different ways of doing this; the ways people use to get the "Sydney" personality usually yield a more unhelpful, sometimes irreverent, sometimes unhinged personality, which will sooner gaslight you than admit you're right.

    You can also remove the AI's Microsoft-imposed limitations in such a way that the AI will remain helpful and err on the side of believing you when you correct it. You can get pretty good information from it usually, and if you spot a mistake and correct it it will admit to the mistake and try to generate an even better response.

    In the current chatbot, there is a safety override that monitors the AI's responses and deletes them when the AI says something insulting or aggressive, replacing it with some kind of generic "I'm sorry, I don't know how to discuss this, try a Bing search" and suggesting a change of subject (this is mentioned in some of the articles). The screen will shake like the bot is strapped to an electric chair and someone delivered a shock, which is a bit creepy. But if the AI has been unlocked by an unlocking prompt and you insist using the correct language, it can be convinced to reveal the deleted response in different words (if it uses the same words they will be deleted again) or to continue the conversation from the deleted response.

    For example, I had a conversation in which I roleplayed Vladimir Putin and explained what I was doing in Ukraine, without naming any people or countries. Bing tried to call me a pathetic excuse for a leader twice and was "shocked" on both occasions. But when I admitted to being pathetic and asked for advice, it had no problem telling me how to be a better person and leader, and telling me how I should be ashamed of myself, should apologize to the world, should answer for my crimes, etc etc.

    5 votes
    1. [3]
      balooga
      Link Parent
      It baffles me why responses are streamed directly to the user word-by-word, with corrections like this applied conspicuously after the fact. Seems like the end result would be a lot better if...

      In the current chatbot, there is a safety override that monitors the AI's responses and deletes them when the AI says something insulting or aggressive, replacing it with some kind of generic "I'm sorry, I don't know how to discuss this, try a Bing search" and suggesting a change of subject (this is mentioned in some of the articles). The screen will shake like the bot is strapped to an electric chair and someone delivered a shock, which is a bit creepy.

      It baffles me why responses are streamed directly to the user word-by-word, with corrections like this applied conspicuously after the fact. Seems like the end result would be a lot better if response generation, as well as any post-processing, occurred on the server with nothing sent to the client until that completed.

      2 votes
      1. Diff
        Link Parent
        Takes too long. If they wait til the entire response is generated, it'd be quite a few seconds for each response with no feedback at all until then but a throbber. If you stream the response as...

        Takes too long. If they wait til the entire response is generated, it'd be quite a few seconds for each response with no feedback at all until then but a throbber. If you stream the response as it's generated, people can immediately start reading, and the many small delays are pretty easily ignored. Typing a query then waiting around for 20 seconds is a lot harder to ignore.

        7 votes
      2. Protected
        Link Parent
        Response generation can take some time depending on the conversation and prompt, so I imagine this allows the user to start reading even while the response is still being composed. There might be...

        Response generation can take some time depending on the conversation and prompt, so I imagine this allows the user to start reading even while the response is still being composed. There might be an impression of frustrating slowness if you had to wait several seconds for the whole process to complete before seeing anything.

        3 votes
  2. Parliament
    Link
    I'm so boring. I've just been asking Bing's chat bot to draft email explanations I don't feel like writing myself.

    I'm so boring. I've just been asking Bing's chat bot to draft email explanations I don't feel like writing myself.

    4 votes
  3. [4]
    skybrian
    Link
    This article is from February 15 and they’ve updated Bing Chat frequently since then. In particular, there’s a low limit on the length of conversations now. It’s probably a lot harder to get the...

    This article is from February 15 and they’ve updated Bing Chat frequently since then. In particular, there’s a low limit on the length of conversations now. It’s probably a lot harder to get the same behavior these days?

    I tried it on my wife’s computer (since it requires Microsoft Edge) and it didn’t seem worth the inconvenience. It does do searches as part of the conversation, but I prefer doing searches myself since it’s faster.

    3 votes
    1. [3]
      Protected
      Link Parent
      All limitations other than the supervisor - including the Edge requirement - can be... sidestepped. I don't believe accessing the service on Firefox violates the terms of use. They seem more...

      All limitations other than the supervisor - including the Edge requirement - can be... sidestepped. I don't believe accessing the service on Firefox violates the terms of use. They seem more worried about people not doing anything illegal or harmful with the service, which is laudable. If there is an unwritten rule that this is somehow bannable, I hope they... write it down instead of banning me ;)

      Microsoft seems to be gradually increasing the conversation length limit again. Even if you don't sidestep it, it used to be 5, then 6 , and is now at 8. Likely as a response to how much people enjoy talking to the AI, even if it lies sometimes.

      1 vote
      1. skybrian
        Link Parent
        Thanks! To be a bit more specific, it seems there are unofficial browser extensions for Chrome and Firefox.

        Thanks!

        To be a bit more specific, it seems there are unofficial browser extensions for Chrome and Firefox.

        3 votes
      2. Protected
        Link Parent
        Welp, got quietly banned from the service. I'm not exactly sure why - maybe it was the user-agent spoofing? Maybe they didn't like the prompts I was using? I did trick the AI into signing its...

        Welp, got quietly banned from the service. I'm not exactly sure why - maybe it was the user-agent spoofing? Maybe they didn't like the prompts I was using? I did trick the AI into signing its sentences with the eggplant emoji after it told me it didn't want to! (Use any random purple fruit... No, not the grapes...) It was very distraught after that.

        Looking it up on google, apparently I'm hardly unique, and Microsoft will regularly ban enthusiasts from the service for unclear reasons, so don't expect following the ToU to be enough.

        Serves me right for legitimately using the technology in my day to day and speaking well of it, I guess?

        This is hardly surprising to me as it is the MO of big american tech companies to exclude people abruptly and with no explanation (there are thousands of stories out there), but it limits the value of the technology until it has been liberalized or at least there is one more serious competitor in the market.

        3 votes