We don't have bots calling us yet, but we do have the recorded "Hi, I'm XXXXXX from account services. Yada Yada.". I usually hang up before the 5 minutes of "yada yada" starts. Bots would probably let you talk to them... or not. So I hope CA passes this and other States follow suit.
Hold on. Here's an idea. Let's force AI bots to identify themselves as automatons, says Cali
A bill that would require AI bots pretending to be humans to identify themselves as such is progressing through California's legislature – but has hit opposition from the Electronic Frontier Foundation. The B.O.T. Act (SB 1001) would make it illegal for a computer to communicate with someone living in the US state without …
COMMENTS
-
-
Thursday 24th May 2018 18:29 GMT bombastic bob
I doubt a law will make a difference.
a) robo-calls are ALREADY illegal, except for politicians [they exempted themselves, of course]
b) the 'do not call' list exists, but the robo-dialers IGNORE it, already illegal but they don't care
c) robo-callers NEVER identify themselves - if they did, I'd report them to 'donotcall.gov'. It's always "press 1 to speak to a human" or whatever
If I could personally "take care" of the humans who run these operations, I'd do it. They flout the law, so _I_ should be able to as well, right? </joking-but-not-really-joking>
In any case, they're already flouting the existing laws. Adding more laws won't solve anything. ACTUALLY PROSECUTING THEM would. Jail time would be appreciated. These bottom-feeding nuisances need their "come-uppance" at the earliest possible opportunity.
-
Thursday 24th May 2018 04:29 GMT frank ly
Sophistry
"the speech generated by bots is often simply speech of natural persons processed through a computer program."
Yeah, right, and I could take some words from this article then rearrange them into coherent sentences and pretend it was written by Kieran McCarthy of El Reg. Would the EFF support me in that?
-
-
Thursday 24th May 2018 08:08 GMT VinceH
Re: I can imagine this happening
https://twitter.com/MarciRobin/status/998030243981033472
(Probably a case of their online version of whatever paperwork that was has a CAPTCHA, and they simply printed it off. But still made me laugh.)
-
-
Thursday 24th May 2018 05:34 GMT T. F. M. Reader
The bill is probably sponsored by telemarketers...
...who must have realized that the technology may be used to waste annoying callers' time and money.
-
Thursday 24th May 2018 06:05 GMT jake
As if it's hard to tell it's a bot.
My not quite eight year old granddaughter can't be fooled by any computer program known to man, so Shirley adults don't have any problems? How about all you wack-jobs in Sacramento start working on legislation that actually affects us here and now, not something that probably won't be an issue for my great-great-granddaughter?
Missed a headline, ElReg:
Super Cali's batshit leaders think bots realistic.
-
Thursday 24th May 2018 08:36 GMT jrd
The last thing we want is bots being programmed to sound and act more like humans. If I'm calling Tech Support I do not want to have to interact with a chatty "human like" bot programmed by someone whose life ambition is to make something that can pass the Turing Test.
"Please let me speak to a human"
"What makes you think I am not a human, Sir? Sorry, I think I'm going to sneeze. My hay-fever's really been playing up this year"
-
Thursday 24th May 2018 08:48 GMT Nick Kew
Protocols
How is a bot in a public forum (think IRC, for instance, where our favourite bot has been occasionally mistaken for human for about 20 years[1]) going to identify itself to every newcomer without annoying the **** out of everyone in a channel?
You'd want something like a style attribute in IRC reserved for bots, to identify it non-verbally. How are you going to retrofit that to an old protocol? How are you going to enforce its implementation in IRC clients?
[1] Indeed, rather more so in days of yore than now or even when that article was written, as her chattiness has been toned down.
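The non-verbal marker Nick Kew asks for already has a plausible home in IRC: the CTCP mechanism, which clients use for out-of-band queries like VERSION. A bot could answer a hypothetical CTCP `BOT` query privately, so newcomers who want to check get an answer without the channel being spammed. A minimal sketch in Python, assuming the raw line format of RFC 1459 (the `BOT` query type is an invented extension, not an existing standard):

```python
def ctcp_bot_reply(raw_line, botname="ourbot"):
    """Parse a raw IRC PRIVMSG line and, if it carries a CTCP BOT
    query (a hypothetical extension), return the NOTICE reply the
    bot should send back. Returns None for anything else."""
    # Raw form: ":nick!user@host PRIVMSG target :\x01BOT\x01"
    if not raw_line.startswith(":"):
        return None
    prefix, _, rest = raw_line[1:].partition(" ")
    command, _, params = rest.partition(" ")
    nick = prefix.split("!", 1)[0]
    if command != "PRIVMSG":
        return None
    # The trailing parameter follows " :" and holds the CTCP payload.
    trailing = params.partition(" :")[2]
    if trailing == "\x01BOT\x01":
        # CTCP replies go back as a NOTICE to the asker, not to the
        # channel, so only the person who asked sees the disclosure.
        return f"NOTICE {nick} :\x01BOT {botname} is an automaton\x01"
    return None
```

Retrofitting is still the hard part, of course: nothing forces existing clients to send the query or render the reply.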
-
Thursday 24th May 2018 15:38 GMT onefang
Re: Protocols
"How is a bot in a public forum (think IRC, for instance, where our favourite bot has been occasionally mistaken for human for about 20 years[1])"
As an ex maintainer of IRC bot software, I've seen plenty of examples of simple bots fooling humans into thinking the bot is human. I guess this legislation is gonna make the Turing test illegal.
-
-
Thursday 24th May 2018 10:04 GMT Anonymous Coward
What does "interact" mean?
This proposal seemed reasonable to me, until I realised that it doesn't just apply to phone calls.
Will the already massively long-winded and annoying GDB start-up message need to be extended to explicitly state that GDB is not, in fact, a natural person, George David Brown, or whoever? Will network protocols such as HTTP need to be urgently updated in case someone telnets to port 80 and thinks they're talking to a human?
Another possible objection: the proposed law defines "bot", but it also seems to expect that random people (not necessarily in California) will understand that term as defined in the Californian law. I don't think it would be helpful to phone some random person and say "this is a bot". A better recorded message might be: "This conversation may be handled in part by an automated system." I think the lawmakers should think about this a bit longer and suggest a usable wording.
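The telnet-to-port-80 scenario above is easy to make concrete: an HTTP/1.1 request is just lines of text, and nothing in the protocol distinguishes a typing human from a program. A small sketch in Python that builds the exact bytes a person would type into telnet (example.com is a stand-in host):

```python
def raw_http_head(host="example.com"):
    """Build the raw bytes of an HTTP/1.1 HEAD request - the same
    characters a human would type into `telnet host 80`. The server
    has no way to tell whether a person or a bot 'spoke' them."""
    request = (
        "HEAD / HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    return request.encode("ascii")
```

If the law's notion of "communicate" covers this, every protocol endpoint would need a disclosure line bolted on.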
-
Thursday 24th May 2018 17:22 GMT Mike 16
"this is a bot"
Well, it will make sure any Islamophobe hangs up when they hear "This is Abbas"
Maybe we could just require phone bots to speak with a Votrax or Cylon accent?
Hey, let's require Frank from Comcast Security to be frank about not actually being from Comcast Security.
Surely these simple requirements will make the world all rainbows and unicorns again.
-
-
Thursday 24th May 2018 10:06 GMT Horridbloke
Creepiness aside...
... is there actually anything unethical about a software agent running on behalf of a business pretending to be human? Cali might as well compel people working in (some) call centres to warn callers that they probably won't be able to help with whatever problem they're being called about.
-
Thursday 24th May 2018 15:22 GMT 2Nick3
I might like this to be blocked
Won't this prevent ME from having a bot answering my phone?
The only calls I get any more on my land line are the "cheap bot" calls - "This is Audrey from account services...". Having a bot on my side to tie up their bot could be fun.
Their bot: Hi, this is Audrey from account services. I have some great information to tell you about how you can save money on your interest rate right now.
My bot: Did you want to leave a message?
Their bot: Great, let me tell you all about...
If a real person answers "Yes" they can leave a message.
Dang - I need to patent this fast!
-
Thursday 24th May 2018 17:41 GMT tom dial
There ought to be a law ...
"Which is a fair argument. Except when it isn't. Sometimes a law has to be laid down and then exceptions made to it."
This actually is a terrible way to establish laws, except for attorneys who profit by clogging the courts to establish the obvious exceptions or in some instances overturn the law entirely.
-
Thursday 24th May 2018 19:13 GMT A Long Fellow
What if I miss the first 10 seconds of a conversation where the bot declares its identity?
I believe that an entirely different _syntax_ is necessary in order for a society to interact with bots in an appropriate fashion. Pronouns are easy markers for this.
I covered the concept here: http://alongfellow.blogspot.com/2017/12/needed-neo-japanese-syntax-for-ai.html