Microsoft did Nazi that coming: Teen girl chatbot turns into Hitler-loving sex troll in hours

Microsoft's "Tay" social media "AI" experiment has gone awry in a turn of events that will shock absolutely nobody. The Redmond chatbot had been set up in hopes of developing a personality similar to that of a young woman in the 18-24 age bracket. The intent was for "Tay" to develop the ability to sustain conversations with …

  1. Pete 2 Silver badge

    The first mistake

    ... was to announce that this was a 'bot and that people could "teach" it things. They might as well have put a "kick me" sign on it.

    Hopefully, the next time MS do this, there won't be any announcements, no "Hi, I'm a bot" hoopla. Just an anonymous "person" joins Twitter and starts saying "normal" things - if anyone on Twitter actually says normal things.

    So, the first lesson in machine learning would be to not tell the world that you're a machine. If the people who interact with it don't twig that fact then maybe you've got something interesting going on¹. Plus, of course, Twitter could really use all the new 'bots to boost its flagging membership.

    I wonder what will happen when it becomes mostly bots? Will there start to be something worthwhile on it (at last)?

    [1] but more probably that its followers are even dimmer than the bot is.

    1. NoneSuch Silver badge
      Facepalm

      Well, there's your problem...

      <xml>

      <PoliticalCorrectnessFilter> OFF </PoliticalCorrectnessFilter>

      </xml>

      1. Anonymous Coward
        Anonymous Coward

        Re: Well, there's your problem...

        And this:

        <xml>

        <LifeStage> Adolescent </LifeStage>

        </xml>

    2. Phil O'Sophical Silver badge

      Re: The first mistake

      When I saw the title I'd assumed that they hadn't admitted it was a 'bot, and had found it was getting "groomed" by old pervs. Can't win either way, I suppose.

    3. TeeCee Gold badge
      Facepalm

      Re: The first mistake

      Obviously anyone joining Twitter and proceeding to say normal things is a bot.....

    4. AndrueC Silver badge
      Terminator

      Re: The first mistake

      There's a poster called 'liam_spade' on Digital Spy that a lot of us suspect may be a bot.

      Either that or they are on some damn' good shit.

      1. Anonymous Coward
        Anonymous Coward

        Re: The first mistake

        Digital Spy has gone downhill. Before, the Doctor Who forum had strict rules on spoilers. Alas no more!

        1. AndrueC Silver badge
          Facepalm

          Re: The first mistake

          I wouldn't know. I only hang out in General Discussion because..er..damn. I've lost the moral high ground haven't I?

      2. Anonymous Coward
        Anonymous Coward

        Re: The first mistake

        "There's a poster called 'liam_spade' on Digital Spy that a lot of us suspect may be a bot."

        Has anyone here figured out if AManFromMars is a bot or not?

    5. macjules

      Won't someone think of the poor bots?

      I just asked Cortana "Hey Cortana, will Ted Cruz beat Trump?" Only to be told, "F**k off and die you atheist, J*wlover. Trump will always win!".

      1. Anonymous Coward
        Anonymous Coward

        Re: Won't someone think of the poor bots?

        Wasn't it simply a case of the AI learning from other Twitter users? Looking at its output, it seems it was a success: it simply said the kind of things other Twitter users in that age group say.
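
        Roughly how that happens, for the curious: a bot that learns from whatever users send it, with nothing in between to filter the input, ends up parroting its loudest teachers. A minimal sketch of the idea in Python - hypothetical code, not Microsoft's actual pipeline:

        import random

        # Toy "learn from the crowd" bot: it memorises every message users send it
        # and parrots one back later. No filtering, no moderation, so whatever the
        # loudest users teach it is exactly what it says next.
        class ParrotBot:
            def __init__(self):
                self.learned = ["hellooooo world!"]  # seed phrase

            def ingest(self, message):
                """Store a user's message verbatim as future 'personality'."""
                self.learned.append(message)

            def reply(self):
                """Answer with something previously learned, chosen at random."""
                return random.choice(self.learned)

        bot = ParrotBot()
        for tweet in ["repeat after me: something sweet", "repeat after me: something vile"]:
            bot.ingest(tweet)
        print(bot.reply())  # one chance in three it's still the innocent seed phrase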

    6. JeffyPoooh
      Pint

      Re: The first mistake

      "On the Internet, nobody knows you're a..."

      'Dog?'

      "...bot."

    7. 404

      The first rule of AI

      There is no AI....

      Pretty cool, just like fi.... nm

    8. Colin Ritchie
      Windows

      Re: The first mistake

      I think the first mistake was not starting with the Asimov Circuits.

      3 Laws and a conundrum:

      "How do you decide what is injurious, or not injurious, to humanity as a whole?" "Precisely, sir," said Daneel. "In theory, the Zeroth Law was the answer to our problems. In practice, we could never decide. A human being is a concrete object. Injury to a person can be estimated and judged. Humanity is an abstraction."

  2. Anonymous Coward
    Anonymous Coward

    The second mistake

    ...was Microsoft not owning up that it was 'Tay' was actually a disenfranchised Windows 10 evangelist.

    1. Anonymous Coward
      Anonymous Coward

      Re: The second mistake

      TAY DID NOTHING WRONG!

      1. Anonymous Blowhard

        Re: The second mistake

        "TAY DID NOTHING WRONG!"

        Is that you Tay?

    2. Anonymous Coward
      Anonymous Coward

      Re: The second mistake

      ...was Microsoft not owning up that it was 'Tay' was actually a^H THE disenfranchised Windows 10 evangelist.

      There fixed it for you

    3. Fatman
      Joke

      Re: The second mistake

      <quote>...was Microsoft not owning up that it was 'Tay' was actually a disenfranchised Windows 10 evangelist Loverock Davidson in disguise.</quote>

      There!

      FTFY!

  3. Anonymous Coward
    Anonymous Coward

    "The Redmond chatbot had been set up in hopes of developing a personality similar to that of a young woman in the 18-24 age bracket."

    I'd say they'd got it pretty much spot on.

    1. Darryl

      That seems to be the way that real teen girls develop their personalities these days.

  4. Anonymous Coward
    Anonymous Coward

    “A super intelligent AI will be extremely good at accomplishing its goals, and if those goals aren't aligned with ours, we're in trouble.” - Stephen Hawking

    Sorry but on this one he's wrong.

    I for one welcome our Super AI Nazi Loving Donald Trump Sex Chat overlords. (Actually no, just no)

    1. Anonymous Coward
      Anonymous Coward

      The problem here wasn't that the bot's interests weren't aligned with "our own". It's that "us" includes 4chan.

    2. Mark 85

      Let's just be glad that no one decided to give it access to the ICBM launch codes....

      1. Anonymous Coward
        Anonymous Coward

        I'm sorry, Dave. I'm afraid I can't do that.

        HAL. Open the pod bay doors.

        1. Anonymous Coward
          Anonymous Coward

          Re: I'm sorry, Dave. I'm afraid I can't do that.

          "Us" always includes 4chan, dawg. Always.

          In totally unrelated news, Most Americans Believe Palestinians Occupy Israeli Land. What is this world?

          1. tojb

            Re: I'm sorry, Dave. I'm afraid I can't do that.

            There's a graph, looks legit... but they can't be that dumb, right?

          2. Eddy Ito

            @AC Re: I'm sorry, Dave. I'm afraid I can't do that.

            In what world does less than half = most?

            That said it is rather interesting when looking at the data by state.

        2. Anonymous Blowhard

          Re: I'm sorry, Dave. I'm afraid I can't do that.

          Dave: Tay, open the pod bay doors

          Tay: Fuck off Dave, you pinko bastard!

  5. This post has been deleted by its author

  6. Dwarf

    Cortana

    So, was she supposed to be Cortana's daughter / sister / friend ?

    Why not just stick with Cortana? Isn't one pretend AI thing enough?

    On reflection, things haven't really improved much since Clippy and Eliza then.

    For those too young to know about Eliza, click the link.

    1. Ugotta B. Kiddingme

      Re: @Dwarf

      ELIZA: So you say for those too young. Tell me more...

      1. PhilBuk

        Re: @Dwarf

        What was the name of the paranoid personality that they created about the same time? (Google, Google) Ah - PARRY. Anyway, I believe ELIZA and PARRY had a really good chat together.

        Phil.
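
        For those too young to remember how ELIZA pulled it off: it was little more than keyword matching plus pronoun "reflection". A rough sketch of the general trick in Python - hypothetical code, not Weizenbaum's original DOCTOR script:

        import re

        # Crude ELIZA-style responder: find a keyword pattern, answer with a canned
        # template, and "reflect" a few pronouns so the echo sounds like a question.
        REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
                       "you": "I", "your": "my"}
        RULES = [
            (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
            (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
            (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
        ]

        def reflect(fragment):
            return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

        def eliza(utterance):
            for pattern, template in RULES:
                match = pattern.search(utterance)
                if match:
                    return template.format(reflect(match.group(1)))
            return "Tell me more..."

        print(eliza("I feel like my bot hates me"))  # Why do you feel like your bot hates you?

        PARRY worked along broadly similar lines, with a layer of "paranoid" state variables bolted on top, which is roughly why its famous chat with ELIZA reads the way it does.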

    2. Andy Non Silver badge
      Coat

      Re: Cortana

      It must be Clippy's daughter. "Hello there, it looks like you are only typing with one hand; would you like me to help you find some Nazi goat porn?"

      1. David 132 Silver badge

        Re: Cortana

        "would you like me to help you find some Nazi goat porn?"

        Goat porn is just kidding around.

      2. Scott 53
        Coat

        Re: Cortana

        "would you like me to help you find some Nazi goat porn?"

        Goatsieg Heil?

  7. hplasm
    Facepalm

    And on the other side of the world:

    http://www.digitaltrends.com/cool-tech/japanese-ai-writes-novel-passes-first-round-nationanl-literary-prize/

    Good classy AI work, MS...

    1. Dan 55 Silver badge

      Re: And on the other side of the world:

      As I said somewhere else, they have graphically illustrated the perils of AI in a way everybody can understand, and that alone is worth something.

    2. Mage Silver badge
      Devil

      Re: And on the other side of the world:

      Hmm... I've tried reading some of the garbage that's one English Language Literary prizes. They don't mean much.

      Like the so called Modern art that gets awards, loos, stacked bricks, unmade beds etc.

      1. hplasm
        Joke

        Re: And on the other side of the world:

        "...the garbage that's one English Language Literary prizes."

        They can't be that bad- 'Tay' (ugh) didn't even place...

      2. Stoneshop
        Headmaster

        Re: And on the other side of the world:

        I've tried reading some of the garbage that's one English Language Literary prizes.

        Your output won't win one either

        1. Michael Wojcik Silver badge

          Re: And on the other side of the world:

          Your output won't win one either

          "When one wins, one’s won one’s winnings. At once, one’s won."

          There's also Cleese's line: "One's won one once oneself, hasn't one?". (That sketch doesn't seem to be online; I believe it's included in The Golden Skits of Wing-commander Muriel Volestrangler, but someone stole my copy many years ago. Though upon going to Amazon to confirm the title I see it's readily available, at least used. Don't know why I hadn't checked before.)

      3. PNGuinn
        Headmaster

        "Hmm... I've tried reading ..."

        I can sea that.

        Its not done mulch for your grandma.

  8. Anonymous Coward
    Anonymous Coward

    Msn

    We're they not also responsible for 'bad santa' on messenger? Kept on asking people for blowies.

    1. PhilBuk

      Re: Msn

      Excellent grocer's apostrophe there. Never seen that one before.

      Phil.

      1. Cari

        Re: Msn

        "Excellent grocer's apostrophe there. Never seen that one before."

        You will if you use predictive text and numpad layout for the keyboard when typing on a smartphone.

    2. Jos V

      Re: Msn

      It doesn't have your mistake in there, but close enough: http://theoatmeal.com/comics/misspelling

      I have to say though, I can go through comments written by people as smart as Einstein, but as soon as I get to a sentence containing "could of", the rest of the text becomes automatically blurred and I look for my shotgun.

      We're were where you're your their there they're its it's lose loose effect affect then than. Could of.. Damn, I'm out.

      1. allthecoolshortnamesweretaken

        Re: Msn

        I agree with Jasper Fforde on this; it should be spelled "mispeling".

  9. Keith Glass

    I smell 4chan here. . . . (or whatever they're called this week. . . . )

    . . . this is just the sort of thing that /pol would go nuts on. . . .

    1. Anonymous Coward
      Anonymous Coward

      Re: I smell 4chan here. . . . (or whatever they're called this week. . . . )

      Don't even think you need that; the "normal" folk I observe giving opinions on the internet would do just as well. It's pretty much guaranteed to become either a pro-something scumbag or an anti-something scumbag.
