The NAKED truth: Why flashing us your nude pics is a good idea – by Facebook's safety boss

Amid days of intense debate over its controversial plan to block revenge porn on its social network, Facebook sought to calm fears about the program. Antigone Davis, Facebook's global head of safety, on Thursday attempted to clarify details about the system, which is being tested right now in Australia, and is heading to …

  1. Mark 85

    One solution is to stop taking naked photos of each other, or sharing intimate snaps.

    That is probably the best solution of all. I thought FB didn't allow such photos to be posted or am I wrong?

    The best solution would be if abusive scumbags could stop being so awful.

    Wishful thinking much? Might as well wish in one hand and piss in the other and see which fills up faster. Scumbags will be scumbags and there's no cure for that except death.

    1. Francis Boyle Silver badge

      So your solution

      to the problem of privacy is just not to do anything that requires privacy.

      Bank details please.

      1. Anonymous Coward
        Anonymous Coward

        Re: So your solution

        You can have my bank details but only if you promise not to post them on facebook if we break up.

        1. bombastic bob Silver badge
          Devil

          Re: So your solution

          "just don't take/share the pictures" is not a BAD solution [i.e. don't share pics that you NEVER want to see on "teh intarwebs"] because "hacking" "law enforcement" "TSA" "device theft" "revenge pr0n" "BOFH".

          [Just pointing out the obvious]

          I'd expect to see some BOFH-types at FakeBook frequently rummaging through a pr0n collection that was "accidentally" copied to their personal computers, and kept locked up in triple-encrypted files with non-obvious names like "accounting".

  2. MooJohn

    My nudes

    Every time I upload pics of myself, Facebook asks if I want to tag Burt Reynolds in the photo.

    "where a trained staffer will verify the photo"

    What are the qualifications for this job? And how do you "train" for it?! I would love to see the textbooks on this subject. I assume they're illustrated.

    1. Anonymous Coward
      Anonymous Coward

      Re: My nudes

      Isn't Burt in his 80s now?

  3. Field Commander A9

    How hard can it be to change the sig. of a lossy JPEG?

    Without using AI, I don't think there's any reliable way to identify a JPEG file that has been through even the most basic modifications.

    1. Anonymous Coward
      Anonymous Coward

      Re: How hard can it be to change the sig. of a lossy JPEG?

      I thought the same, not sure why you've been downvoted, especially given this article:

      https://www.theregister.co.uk/2017/10/30/fool_ai_image_classifier_by_changing_single_pixel/

      I'd like more info on how these signatures work and prevent the above.

      1. tiggity Silver badge

        Re: How hard can it be to change the sig. of a lossy JPEG?

        Why do you think they like to keep the methods "secret" and not have client side apps etc?

        Because then people could readily see how poor the techniques are

    2. joed

      Re: How hard can it be to change the sig. of a lossy JPEG?

      Even with AI this could be a daunting task. The sheer number of pics, all over the place (and outside FB control for that matter), can't be easily compared (unless the harasser is really trying to link them to the ex, thus limiting the scope of the search). Simple resizing of the photo makes a simple file signature irrelevant. A mirror image will likely fool any content-aware AI and render search/compare of a bunch of images next to impossible. OTOH, with millions of nudes online, one may probably just care less, especially considering he/she was brave/dumb enough to snap a nude in the 1st place (unless again the sharer explicitly is trying to link them to the victim just to harass her/him). Similar to SSNs - we may assume that ours is already out there, but we're not so special that anyone really cares about it.
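
      To make the resizing and re-encoding point above concrete, here is a minimal difference-hash (dHash) sketch in Python, assuming Pillow is installed. It is not PhotoDNA (which is proprietary) but a rough stand-in, showing why a perceptual hash barely moves when an image is shrunk or re-saved, while a mirrored copy misses entirely; the filename is hypothetical.

      # A rough stand-in for a robust image hash. Small resizes barely move the
      # hash; mirroring changes most of its bits, the weakness described above.
      from PIL import Image, ImageOps

      def dhash(image, hash_size=8):
          # Shrink to (hash_size+1) x hash_size greyscale, then record whether
          # each pixel is brighter than its right-hand neighbour.
          small = image.convert("L").resize((hash_size + 1, hash_size))
          pixels = list(small.getdata())
          bits = []
          for row in range(hash_size):
              for col in range(hash_size):
                  left = pixels[row * (hash_size + 1) + col]
                  right = pixels[row * (hash_size + 1) + col + 1]
                  bits.append(1 if left > right else 0)
          return bits

      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      original = Image.open("photo.jpg")                       # hypothetical file
      shrunk = original.resize((original.width // 2, original.height // 2))
      mirrored = ImageOps.mirror(original)

      print(hamming(dhash(original), dhash(shrunk)))    # typically a handful of bits out of 64
      print(hamming(dhash(original), dhash(mirrored)))  # typically around half the bits: no match

      A matcher would treat anything within a few bits as the same picture, which is why trivial re-saves don't defeat it, but a flip or a heavy crop can.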

  4. Anonymous Coward
    Anonymous Coward

    WTF???

    "Humans rather than algorithms will view the naked images voluntarily sent to Facebook in a scheme being trialled in Australia to combat revenge porn."

    http://www.bbc.co.uk/news/41928848

    1. Anonymous Coward
      Anonymous Coward

      Re: WTF???

      Wombat revenge porn? That sounds terrible!

      1. I ain't Spartacus Gold badge

        Re: WTF???

        What's a wombat?

        Is it a thing you need to play wom?

        1. Chemical Bob

          Re: WTF???

          "What's a wombat? Is it a thing you need to play wom?"

          Well, you wouldn't want to use a cricket bat - they're used for whacking zombies' heads off.

    2. Anonymous Coward
      WTF?

      Re: WTF???

      "Humans rather than algorithms will view the naked images voluntarily sent to Facebook in a scheme being trialled in Australia to combat revenge porn."

      Hopefully, Facebook are providing them with plenty of boxes of tissues.

    3. Anonymous Coward
      Anonymous Coward

      Humans, which pervs are signing up for the job?

      We have 3 new hires just in, folks. Say hi to your new colleagues:

      [El Reg censored me, oh well,..... But imagine the names of 3 hotshot Hollywood types who've fallen from grace very recently]

  5. Anonymous Coward
    Anonymous Coward

    This is comical

    What about multiple partners, multiple devices, the span of time, plus any lost devices or accidentally deleted photos etc.? It's more likely that the 'victim' does NOT have all the images even to hash. Plus, even if Facebook falls back on its photo-tagging / matching AI filters, AI has a long way to go:

    "AI image recognition fooled by single pixel change"

    http://www.bbc.co.uk/news/technology-41845878

  6. Anonymous Coward
    Anonymous Coward

    Better option: Follow the Facebook ex-President:

    Delete your Facebook & Snap accounts now peeps:

    ~

    Facebook founding president sounds alarm - BBC News -

    "Sean Parker (Napster) "no longer used social media as it was too much of a time sink” “I use these platforms, I just don’t let these platforms use me”

    ~

    http://www.bbc.co.uk/news/technology-41936791

  7. Mark Solaris

    I'm going to register facebook-image-block-request@gmail.com and advertize it around....

    1. Anonymous Coward
      Anonymous Coward

      'I'm going to register facebook-image-block-request@gmail.com'

      Enjoy! You'll get 10 times more versus the official FB a/c.

  8. DainB Bronze badge

    Staffer they said, mmmok...

    So how long will it take until lovely chaps from 4chan who have too much free time on their hands write a bot that will submit a few million pictures an hour for a poor staffer to go through?

    1. phuzz Silver badge
      Big Brother

      Re: Staffer they said, mmmok...

      "for a poor staffer to go through"

      Don't be daft: if they were staff then they'd have some employment rights. Facebook use contractors to do the nasty work; that way they can just let them go when they get PTSD.

      (for example)

    2. Agamemnon

      Re: Staffer they said, mmmok...

      Well, now that you've said something about it:

      About ten minutes.

  9. Anonymous Coward
    Anonymous Coward

    abusive scumbags

    "The best solution would be if abusive scumbags could stop being so awful."

    or you know, start by not dating or showing yourself naked in front of abusive scumbags?

    There are plenty of brainless people who straight out love to date scumbags (they call them bad boys) because they are great at creating excitement by being dominating. They then feel traumatized after the scumbags start abusing them. (The exceptions are those who got tricked by scumbags who hid their true selves.) The worst part is that they would, after ending one scumbag relationship, find yet another scumbag to date.

    Whatever Facebook do will only be a patch on the wound; it doesn't fix the real problem or stop the scumbag from posting elsewhere. The victim can only fix the real problem themselves, by learning from their mistake.

    1. Francis Boyle Silver badge

      Ah, the old "women like bad boys" fantasy

      The reality is that the real 'bad boys' i.e. psychopaths have one very powerful weapon in their arsenal - they lie. Then they lie again. And they keep lying until they get what they want or tire of the game. Look up 'psychopathic charm' if you want the details. Faced with someone who is prepared to say whatever it takes, and knows how to say it well, very few people, not just women, have any real defense. Yes, you may know that they're bad boys but they're not trying to fool you. To the mark, they're a wonderful person who is just misunderstood.

      Telling the victim to fix the problem themselves doesn't wash. You may think you learn from the experience and I'm sure many women do. But again, we are dealing with psychopaths here. They explicitly target people who they detect are vulnerable. Blame them.

    2. LewisRage

      Re: abusive scumbags

      And so the /r/incel fallout lands at the register...

  10. StargateSg7

    YECH !!!! While I'm speaking from the MALE point of view, I CAN make one startling conclusion from my many explorations of the naked female form.

    Conclusion:

    Men are mostly ugly naked! and Women mostly ARE NOT UGLY when naked!

    From a pure artistic point of view, our male parts, while EXTREMELY FUNCTIONAL, ain't exactly the most aesthetic of forms, while the female form in its entirety is MOST DEFINITELY AESTHETIC! But since most men don't give a rat's arse about sending naked photos of themselves to ANYONE, I can see where this request will end up!

    Sooooo.....Facebook asking people to send naked photos of themselves is an exercise in FUTILITY as it will BE MOSTLY MEN who would actually EVEN CONSIDER RESPONDING to this request --- I know of almost no reasonably sound-minded woman who would EVEN THINK of responding!

    Just get a computer algorithm to produce synthetic rendered photo-realistic naked humans and train the software on those!

    1. DainB Bronze badge

      "Women mostly ARE NOT UGLY when naked!"

      Actually once you fix your Caps Lock and look around MOST Of Women Are NOT THAT Attractive NAKED.

      1. Pompous Git Silver badge

        "MOST Of Women Are NOT THAT Attractive NAKED."
        Speaking as a 66 y.o. almost every young woman I see is beautiful, clothed or naked, especially when they smile. I think you need to get a grip...

        1. DainB Bronze badge

          That's your personal opinion not backed by hard facts, pun intended.

      2. 's water music
        Unhappy

        fix your Caps Lock

        Would that that were the highest priority

      3. bombastic bob Silver badge
        FAIL

        "Actually once you fix your Caps Lock and look around MOST Of Women Are NOT THAT Attractive NAKED."

        yeah, snark snark because YOU (apparently) haven't mastered the proper use of *emphasis*

    2. Pompous Git Silver badge

      "From pure artistic point of view, our male parts while EXTREMELY FUNCTIONAL ain't exactly the most aesthetic of forms"
      In 1970, I was the undraped model at the Launceston Art School. At least two of the students, both female, took a very great interest in my penis. (I was the first model to not wear a jockstrap; it never occurred to me to do so.) One girl drew my member in exquisite detail, the other considerably larger than real life.

      1. David 132 Silver badge
        Happy

        @Pompous Git In 1970, I was the undraped model at the Launceston Art School.

        That's hilarious (although I have to point out that given the ubiquity of long hair on both sexes in the early 1970s, it's amazing they could see ANY flesh to paint...). Have an upvote.

        And you've reminded me of this take on the T-Rex "tiny arms..." trope by the Oatmeal:

        http://theoatmeal.com/comics/tiny_arms

    3. Anonymous Coward
      Paris Hilton

      Evolutionary biology

      Men are mostly ugly naked! and Women mostly ARE NOT UGLY when naked!

      So speaks the voice of evolutionary biology. Male cockroaches find female cockroaches physically attractive too.

      Similarly, many times I have heard women remark of small babies "what a beautiful child", when plainly they all look the same and none is remotely pleasant to look at. But if mothers didn't think neonates look beautiful, then the human race wouldn't last a single generation.

      Paris, obviously, because evolutionary biology.

    4. Boo Radley

      Men Aren't Ugly

      There are SO many men that I'd love to see naked! Women, meh...

  11. ToFab

    Microsoft PhotoDNA

    They are using Microsoft PhotoDNA, which is also used by Gmail, Twitter, Google, Facebook, Microsoft and others to identify child pornography. https://en.wikipedia.org/wiki/PhotoDNA

  12. jake Silver badge

    How long before ...

    ... one of the "trained staffer" units manages to l install a screen grabber (or whatever) on their corporate porn viewing computer, and manages to collect quite a stash?

    How long after that will it be before a miscreant manages to grab the stash? Or the "trained staffer" shares with his buds, who in turn post them online outside of Facebook's reach?

    I'm guessing less than two months for the first, and a week or ten days for the second and third.

    The wife just read ElReg's article on Facebook's plan. Her words, verbatim: "Fuck that! Do the idiots in charge of Facebook really think we're even stupider than they are?".

    Enough said, I think.

    1. Tikimon
      Thumb Down

      Re: How long before ...

      Is anyone addressing the real danger here? This is Facebook, who never ever delete ANYTHING, no matter if they tell you they did. They will keep every nude photo uploaded forever. It's run by a sociopathic asshole who cares nothing for anyone else's rights or lives. I wouldn't put it past YuckyZuck to use them as a personal amateur porn stash. ICK!!!

      I'm sure they'll justify permanent retention to themselves, for psych research or to train image recognition or some such BS.

  13. EveryTime

    I think this idea is flawed in many, many ways.

    But there is one flaw that leaps out first.

    Why does a human need to look at the picture first?

    Why can't the system automatically create the hash, and then only have human review when there is a match?

    The system can still discard the image. If there is a hash match, there will be a second copy. And at that point the image has been posted, so the humans reviewing will be limiting the 'damage' of the public release, rather than actually doing what the victim is trying to prevent (people seeing them in a compromising position).

    1. Flocke Kroes Silver badge

      Better yet ...

      If you must take a nude selfie, send the hash to facebook not the photograph. Give it a month and I am sure there will be a dozen Android apps to tag your photo collection that promise to send hashes to multiple social media sites and not send the pictures to newdamators.com, honest - really - cross my heart. (It took me three guesses to find a domain name that hadn't been taken.)
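
      Something along the lines of the app described above could be as small as the sketch below; the endpoint URL is hypothetical, and imagehash's phash is only a stand-in for whatever robust hash the real service expects.

      # Client-side sketch of "send the hash, not the photo", assuming the
      # third-party imagehash and requests libraries. Only a hex string ever
      # leaves the device; the image itself is never uploaded anywhere.
      from PIL import Image
      import imagehash
      import requests

      def submit_hash(path, endpoint="https://example.invalid/nude-hash-submission"):
          h = imagehash.phash(Image.open(path))   # 64-bit perceptual hash
          resp = requests.post(endpoint, json={"hash": str(h)}, timeout=10)
          resp.raise_for_status()

      submit_hash("selfie.jpg")   # hypothetical file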

  14. fidodogbreath

    Welcome

    The best solution would be if abusive scumbags could stop being so awful.

    This must be your first day using the internet.

  15. Anonymous Coward
    Anonymous Coward

    What could possibly

    go wrong ?

  16. JWLong

    One of my oldest hostfile entries is:

    0.0.0.0 www.facebook.com

    Absolutely

    1. Anonymous Coward
      Anonymous Coward

      'One of my oldest hostfile entries is'

      Make sure you get all of them. Unfortunately FB central sucks data from lots of different domains, and is adding new ones all the time. Ultimately none of this will save you from FB scraping data from shadow profiles (friends / family / colleagues) and lately 'real-world' sources: credit cards and Experian data etc...

      Experian:

      http://www.theregister.co.uk/2016/05/09/experian_audience_engine_knows_almost_as_much_about_you_as_google/

      ~~~~~~~~~~~~~~~~~~~

      Facebook and Experian:

      https://marketingland.com/9-new-semi-secret-facebook-targeting-options-184810

      https://www.lrb.co.uk/v39/n16/john-lanchester/you-are-the-product

  17. Fruit and Nutcase Silver badge
    Joke

    Monetising opportunity

    Get a look at the privates of their punters and target adverts as appropriate

    Women:

    Top:

    Small Bust -> ads for enlargement options

    Huge Bust -> ads for adult movie actors

    Muffin top belly -> ads for liposuction, corsets, gym memberships

    Bottom:

    All -> ads for topiarists, vajazzles, brazilians

    Men:

    Top:

    Thin/Flabby -> ads for body building/sculpting

    Beer Belly -> ads for liposuction, corsets, gym memberships

    Bottom:

    Average -> ads for enlargement options

    Huge -> ads for adult movie actors

    Micro -> consult Mark for suggestions

    1. Anonymous Coward
      Anonymous Coward

      Re: Monetising opportunity

      We knew that's exactly what you'd say Zuk!

  18. ratfox

    viewed by a "specially trained representative from our community operations team," to check the photos are actually legit nude snaps

    I'm surprised that in this day and age, a major tech company would actually claim it's necessary to have a human in the loop. Surely that's what machine learning is for? Seriously, even without going to machine learning, Google and others have had porn filters for ages, and they're pretty reliable most of the time.

    At the very least, I'd suggest a two-tier system: a first check is done with an automated detector, and if the image is not detected as a nude, then the user is asked for permission for an actual human to look at it and check the actual nakedness. This way, most pics can be submitted without anybody having to see them, and Facebook even saves money on the manual work.
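
    The two-tier flow suggested here is easy to sketch. In the snippet below nudity_score() is a hypothetical placeholder for any off-the-shelf NSFW classifier, and the threshold is purely illustrative, not anything Facebook has published.

    # Two-tier triage: only images the classifier is unsure about ever reach
    # a human, and only with the submitter's explicit consent.
    def nudity_score(image_bytes):
        raise NotImplementedError("plug in an NSFW classifier here")

    def triage(image_bytes, nude_threshold=0.8):
        if nudity_score(image_bytes) >= nude_threshold:
            return "hash_automatically"            # no human ever sees it
        return "ask_consent_for_human_review"      # ambiguous cases only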

    1. Flocke Kroes Silver badge

      No need to categorise the submitted image at all

      Facebook can collect a large number of hashes and it does not matter what the image is until an image with the same hash is posted. At that time a human or AI must decide if the posted image is revenge p0rn. If a submitted hash regularly catches pictures of the Eiffel Tower then the person who submitted it can get an increased "cries wolf" score.
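
      A match-time check with the "cries wolf" score could look roughly like this; the 64-bit integer hashes, the distance threshold and the in-memory dict are all illustrative assumptions rather than anything Facebook has described.

      # Hashes are stored blind; nothing is judged until an upload lands close
      # to one of them, and repeated false matches count against the submitter.
      submitted = {}   # hash (int) -> {"submitter": "...", "false_flags": 0}

      def hamming(a, b):
          return bin(a ^ b).count("1")

      def check_upload(upload_hash, max_distance=6):
          for stored, record in submitted.items():
              if hamming(upload_hash, stored) <= max_distance:
                  return stored, record     # escalate to human/AI review
          return None, None                 # no match: publish as normal

      def record_review(stored_hash, was_revenge_porn):
          if not was_revenge_porn:
              submitted[stored_hash]["false_flags"] += 1   # "cries wolf" score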

      1. Ropewash

        Re: No need to categorise the submitted image at all

        Not my fault that hashes of my privates routinely match with the Eiffel tower.

        I'm a bit put out it wasn't the Burj Khalifa.

  19. Anonymous Coward
    Anonymous Coward

    Whilst I am not sure about the service, it's good to see Facebook actually trying to think about how to solve this problem and a shame that they get knocked for any attempt at innovation in a difficult area.

  20. phy445

    Why does FB have to do the hashing?

    I agree with the comments about the hashing being easy to get round; however, the process is a step in the right direction.

    What I can't see is why the hashing has to be done by FB. Surely someone can create an app that creates the PhotoDNA hashes and uploads those to the system? Of course a checking or appeal mechanism will still be required, but there is no need to "trust" your images to FB, Google, MS, etc.

    1. John Robson Silver badge

      Re: Why does FB have to do the hashing?

      The argument is that if the PhotoDNA algorithm was available publicly (even only in a compiled binary) then miscreants could keep tweaking photos and seeing what they needed to do to change the hash sufficiently to get away with posting the image they want to post.

      There isn't an easy answer, but I'm pretty certain that any answer with humans in the loop at submission time is not a good one.

      1. phy445

        Re: Why does FB have to do the hashing?

        If that is the argument then I don't think it is correct.

        An app that creates the hash then sends it in encrypted form to FB etc. would stop miscreants playing with the hash in the way described. Besides, as many comments have pointed out it doesn't take much to change a photo so that the hash has changed sufficiently.

        1. John Robson Silver badge

          Re: Why does FB have to do the hashing?

          I did think about internal encryption of the hash as well, shouldn't be that hard - but you need to be able to stop people reading the hash out of memory before it's encrypted - and you need to add some other information (date and time, some random stuff etc?) to the hash to ensure that the encrypted message isn't as valuable as the hash itself.

          As for the 'minor changes' argument - they're using hashes which are designed to be resistant to changes in the image.
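
          Binding the hash to a timestamp and a random nonce before sealing it to the service's public key, as suggested above, might look like the sketch below, using PyNaCl. The keypair is generated on the spot purely so the example runs; in practice the service would publish its public key. It does nothing about the harder problem of reading the hash out of memory before encryption.

          # Seal {hash, timestamp, nonce} to the service's public key, so an
          # intercepted blob is single-use and useless without the private key.
          import json, os, time
          from nacl.public import PrivateKey, SealedBox

          service_key = PrivateKey.generate()   # stand-in for the service's real keypair

          def seal_hash(image_hash_hex):
              payload = json.dumps({
                  "hash": image_hash_hex,
                  "timestamp": int(time.time()),
                  "nonce": os.urandom(16).hex(),
              }).encode()
              return SealedBox(service_key.public_key).encrypt(payload)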

  21. Anonymous Coward
    Anonymous Coward

    Damned if you do, damned if you don't

    Nobody would have commented if facebook had done nothing. But there's already a bunch of complaints and 'it could be done better'.

    Of course it could be done better. Pretty much every computer system could be done better.

    Of course there are potential risks with other people viewing the pictures, and one would hope that the checkers were rigorously checked before being assigned.

    But it's a start. It's a move in the right direction, given the number of abusive people there are on the internet.

    Me, I had a facebook account for about a month. Then my head exploded. Too much information took on a whole new meaning. And no, I didn't post naked selfies, thank you very much.

    But don't just slag them off because they're facebook so what they're doing must be wrong. Slag them off for what they do do wrong by all means. But not for at least trying to do something about a bad situation.

  22. Potemkine! Silver badge

    I'm not convinced by FB's solution: it seems inefficient and potentially dangerous.

    There may be another way to fight revenge porn: so-called "social media" providers could warn that they will file a complaint and provide law enforcement agencies with all possible details on the scumbags making that kind of post, to help them be identified and punished. And do it, of course.

    Lawmakers should also be much more severe against the aforementioned scumbags.

  23. fLaMePrOoF

    If the entire process was handled by AI and completely closed then this would potentially be a great idea, but without that cast-iron assurance of a closed system the potential / probability of bad actors at FB is just too high...

    Of course such a closed system comes with its own issues and potential for abuse by users, so I won't hold my breath on this one...

  24. M7S
    Coat

    Worrying about people seeing pictures of me naked in the wild rather than just one FB staffer

    Just before you read any further, "in the wild" in the title above alludes to "beyond my control" rather than images of my muffin top cavorting about the local, um, beauty spot.

    If there are any images of various bits of me that people enjoy, they're either medical professionals or going to need some form of help rather more than I will. In a few cases, possibly both.

    Ah yes, the coat, I should have kept that on, but better late than never. Thank you.

    1. Teiwaz

      Re: Worrying about people seeing pictures of me naked in the wild rather than just one FB staffer

      Amusing, and good point.

      either medical professionals or going to need some form of help

      What with devaluation in 'privacy' regarding medical records, possibly both at some point.

      Chances are, the job offers from Freak Shows might be in the offing.

      Of course, if this sounds like an exciting opportunity in showbiz.....Congratulations.

  25. Allan George Dyer
    Holmes

    Why do they need nude photos?

    Ask the person who thinks they may be victimised to send:

    i) image of their face

    ii) image of normally-visible tattoos or distinguishing marks

    iii) images of location where nude snaps were taken

    Then flag for investigation any image that AI says contains a nude plus one or more of the above.

    Is this a better solution?
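
    Structurally this proposal boils down to ANDing a nudity check with an identity match. In the sketch below all three detector functions are hypothetical placeholders; the point is only that the would-be victim submits identifying features, never a nude.

    def contains_nudity(image):
        raise NotImplementedError   # any NSFW classifier
    def matches_face(image, face_reference):
        raise NotImplementedError   # any face matcher
    def matches_marks(image, mark_references):
        raise NotImplementedError   # tattoo / distinguishing-mark matcher

    def should_flag_for_investigation(image, face_reference, mark_references):
        # Flag only if the image is a nude AND it appears to show this person.
        return contains_nudity(image) and (
            matches_face(image, face_reference)
            or matches_marks(image, mark_references)
        )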

  26. Richard Parkin

    Copyright

    How does FB establish that you have any rights to the photo or that it is of you? Copyright etc.

  27. MiguelC Silver badge

    Antigone Davis, Facebook's global head of safety

    Nominative determinism for something that's surely going to end as a tragedy?

  28. Sebastian P.

    Now, let's ignore for a moment the limitations of the system itself (it only applies to the Facebook ecosystem, seems trivial to bypass, only works if the potential victim has advance knowledge that someone wants to post pictures of them, etc.).

    The system will increase the risk for everyone else, by introducing new threat vectors. How long do you think it will be until we see phishing emails and fake FB "upload your nudes here" sites? How many abusive partners (or just pranksters) will make their partners think they have nude photos of them, just to make them go through the indignity of photographing themselves nude and uploading the pictures?

    Furthermore, don't forget that the victims will have to take their pictures with their phones or cameras. Which often sync said pictures with PCs, NAS, clouds etc. - sometimes even without their owners realizing. So you end up with a lot more attack points from which the nude pictures can be leaked or stolen.

    You solve the potential problem of some, by creating a bigger problem for more people. I'm not seeing this as an improvement.

    Risk transfer is an acceptable risk mitigation strategy only if the parties to which you're transferring the risk are cognizant of it, and able to accept it.

    1. Anonymous Coward
      Anonymous Coward

      People proposing solutions to "morality problems" often wear blinkers. Their tunnel vision leads them to ignore collateral effects that make life complicated for everyone else.

  29. Simon Harris

    I just worry...

    that getting your kit off in front of the camera will become the new FaceBook secure login procedure.

    Snapping a picture of my balls brings a new meaning to 2 factor authentication.

  30. Anonymous Coward
    Anonymous Coward

    "[...] preemptively submitted images will be viewed by a "specially trained representative from our community operations team," to check the photos are actually legit nude snaps [...]"

    Does "legit" exclude under-age*** pictures? Are Facebook obliged to report such a hash submission to the police? Even posting/possessing an under-age selfie is illegal in UK jurisdictions - and probably Australia too.

    ***under-age illegality in the UK is defined as "looks under 18"

  31. DagD

    I'm afraid

    Someone might steal my identity! Quick, here's a copy of everything needed for my identity. Protect it, trustworthy stranger!

    Meanwhile, in next year's post:

    Facebook has changed its security standards and will no longer be offering free protection for revenge porn. This will now be a paid service.

    If you have submitted your pictures freely to us, we recommend you buy into our paid service as soon as possible, as the security setting on all freely submitted pictures will change to "public".

  32. Bob Dole (tm)
    Holmes

    Facial recognition?

    I'm trying to understand why they want to compare hashes rather than just continuing to use the same facial recognition they already have to see who is in the photo.

    Couldn't they just have a security setting for people to say "I have to approve any photos of me before they are visible?"

  33. Bucky 2
    Coat

    The "revenge" part of "revenge porn" in my case would be posting a nude picture of myself and then saying, "This is how low so-and-so's standards are."

  34. Harry Stottle

    A better idea would be Face only...

    This suggestion is predicated on the notion that a nude photo without a face (or name) is rarely a hostage to fortune.

    If users could submit one or two "face only" photographs, with some sensible evidence that it was indeed their own face (e.g. an automated webcam session using the face recognition they're already experimenting with), then farcebook could introduce a new rule.

    No photograph which includes a recognizable face (nude or otherwise) can be posted, except by the owner of the face, or with the explicit recorded permission of the face-holder. That would kill many birds with one stone...

  35. jpat14
    Joke

    Hashing

    It astounds me that a tech giant of this size has no fundamental understanding of how cryptographic hashing works. A single pixel alteration, a different compression algorithm, even in most cases just opening the file and resaving it in a different photo editor would output a file that would return a non-matching hash. A "deleted" file is never gone, and AI would require access to the image itself to analyze it, not the hash. This makes no sense from any angle.
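
    Since the comment above assumes cryptographic hashing, while earlier comments point at PhotoDNA-style robust hashes, here is a short sketch contrasting the two behaviours under a one-pixel edit. Pillow and the imagehash library are assumed installed, and the filename is hypothetical.

    # A one-pixel edit flips a cryptographic digest completely but barely
    # moves a perceptual hash such as imagehash's phash.
    import hashlib
    from PIL import Image
    import imagehash

    img = Image.open("photo.jpg").convert("RGB")
    edited = img.copy()
    edited.putpixel((0, 0), (0, 0, 0))        # change exactly one pixel

    print(hashlib.sha256(img.tobytes()).hexdigest()[:16])      # completely different
    print(hashlib.sha256(edited.tobytes()).hexdigest()[:16])   # from this one

    print(imagehash.phash(img) - imagehash.phash(edited))      # Hamming distance, usually 0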

  36. Anonymous Coward
    Anonymous Coward

    a trained staffer will verify the photo

    Think about Facebook's actions in the past, its problems with honesty and its complete disregard for the privacy of others.

    Then ask yourself if these are the type of people that you want to trust with that type of photo.

    Not a Facebook user, obviously.

  37. Anonymous Coward
    Anonymous Coward

    Make Them Suffer

    My definition of revenge porn is when I send someone a photo of me in all my naked, tumescent glory. Obviously, only the dregs of humanity should be subjected to that type of punishment.
