The NAKED truth: Why flashing us your nude pics is a good idea – by Facebook's safety boss

Amid days of intense debate about its controversial plan to block revenge porn on its social network, Facebook sought to calm fears about the program. Antigone Davis, Facebook's global head of safety, on Thursday attempted to clarify details about the system, which is being tested right now in Australia, and is heading to …


  1. Mark 85

    One solution is to stop taking naked photos of each other, or sharing intimate snaps.

    That is probably the best solution of all. I thought FB didn't allow such photos to be posted or am I wrong?

    The best solution would be if abusive scumbags could stop being so awful.

    Wishful thinking much? Might as well wish in one hand and piss in the other and see which fills up faster. Scumbags will be scumbags and there's no cure for that except death.

    1. Francis Boyle Silver badge

      So your solution

      to the problem of privacy is just not to do anything that requires privacy.

      Bank details please.

      1. Anonymous Coward
        Anonymous Coward

        Re: So your solution

        You can have my bank details but only if you promise not to post them on facebook if we break up.

        1. bombastic bob Silver badge
          Devil

          Re: So your solution

          "just don't take/share the pictures" is not a BAD solution [i.e. don't share pics that you NEVER want to see on "teh intarwebs"] because "hacking" "law enforcement" "TSA" "device theft" "revenge pr0n" "BOFH".

          [Just pointing out the obvious]

          I'd expect to see some BOFH-types at FakeBook frequently rummaging through a pr0n collection that was "accidentally" copied to their personal computers, and kept locked up in triple-encrypted files with non-obvious names like "accounting".

  2. MooJohn

    My nudes

    Every time I upload pics of myself, Facebook asks if I want to tag Burt Reynolds in the photo.

    "where a trained staffer will verify the photo"

    What are the qualifications for this job? And how do you "train" for it?! I would love to see the textbooks on this subject. I assume they're illustrated.

    1. Anonymous Coward
      Anonymous Coward

      Re: My nudes

Isn't Burt in his 80s now?

  3. Field Commander A9

    How hard can it be to change the sig. of a lossy JPEG?

Without using AI, I don't think there's any reliable way to identify a JPEG file that has been through even the most basic modification.

    1. Anonymous Coward
      Anonymous Coward

      Re: How hard can it be to change the sig. of a lossy JPEG?

      I thought the same, not sure why you've been downvoted, especially given this article:

      https://www.theregister.co.uk/2017/10/30/fool_ai_image_classifier_by_changing_single_pixel/

      I'd like more info on how these signatures work and prevent the above.

      1. tiggity Silver badge

        Re: How hard can it be to change the sig. of a lossy JPEG?

        Why do you think they like to keep the methods "secret" and not have client side apps etc?

        Because then people could readily see how poor the techniques are

    2. joed

      Re: How hard can it be to change the sig. of a lossy JPEG?

Even with AI this could be a daunting task. The sheer number of pics, all over the place (and outside FB's control for that matter), can't easily be compared (unless the harasser is really trying to link them to the ex, thus limiting the scope of the search). Simple resizing of the photo makes a simple file signature irrelevant. A mirror image will likely fool any content-aware AI and render searching/comparing a bunch of images next to impossible. OTOH, with millions of nudes online, one may probably just care less, especially considering he/she was brave/dumb enough to snap a nude in the first place (unless, again, the sharer is explicitly trying to link them to the victim just to harass her/him). Similar to SSNs - we may assume that ours is already out there, but we're not so special that anyone really cares about it.
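The resizing-vs-mirroring point above can be made concrete with a toy perceptual hash. The sketch below is a pure-Python difference hash (dHash), purely illustrative and nothing like whatever Facebook actually deploys: because it block-averages the image down to a fixed grid, a simple 2× upscale leaves the hash unchanged, but a horizontal mirror flips every brightness comparison and yields a completely different hash.

```python
def dhash(img, hash_w=8, hash_h=8):
    """Toy difference hash: downsample to (hash_w+1) x hash_h by block
    averaging, then set one bit per pixel brighter than its right neighbour."""
    h, w = len(img), len(img[0])
    small = []
    for r in range(hash_h):
        row = []
        for c in range(hash_w + 1):
            r0, r1 = r * h // hash_h, (r + 1) * h // hash_h
            c0, c1 = c * w // (hash_w + 1), (c + 1) * w // (hash_w + 1)
            block = [img[y][x] for y in range(r0, r1) for x in range(c0, c1)]
            row.append(sum(block) / len(block))
        small.append(row)
    bits = 0
    for r in range(hash_h):
        for c in range(hash_w):
            bits = (bits << 1) | (small[r][c] > small[r][c + 1])
    return bits

# A 64x64 horizontal-gradient stand-in for a photo
photo = [[x * 4 for x in range(64)] for y in range(64)]
# Nearest-neighbour 2x upscale: block averaging absorbs it
upscaled = [[photo[y // 2][x // 2] for x in range(128)] for y in range(128)]
# Horizontal mirror: every left/right brightness comparison flips
mirrored = [list(reversed(row)) for row in photo]

print(dhash(photo) == dhash(upscaled))   # True  - survives resizing
print(dhash(photo) == dhash(mirrored))   # False - defeated by a mirror flip
```

Real systems can partly counter the mirror trick by also storing the hash of the flipped image, but that only covers transformations someone thought of in advance.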

  4. Anonymous Coward
    Anonymous Coward

    WTF???

    "Humans rather than algorithms will view the naked images voluntarily sent to Facebook in a scheme being trialled in Australia to combat revenge porn."

    http://www.bbc.co.uk/news/41928848

    1. Anonymous Coward
      Anonymous Coward

      Re: WTF???

      Wombat revenge porn? That sounds terrible!

      1. I ain't Spartacus Gold badge

        Re: WTF???

        What's a wombat?

        Is it a thing you need to play wom?

        1. Chemical Bob

          Re: WTF???

          "What's a wombat? Is it a thing you need to play wom?"

          Well, you wouldn't want to use a cricket bat - they're used for whacking zombies' heads off.

    2. Anonymous Coward
      WTF?

      Re: WTF???

      "Humans rather than algorithms will view the naked images voluntarily sent to Facebook in a scheme being trialled in Australia to combat revenge porn."

      Hopefully, Facebook are providing them with plenty of boxes of tissues.

    3. Anonymous Coward
      Anonymous Coward

      Humans, which pervs are signing up for the job?

We have 3 new hires just in, folks. Say hi to your new colleagues:

      [El Reg censored me, oh well,..... But imagine the names of 3 hotshot Hollywood types who've fallen from grace very recently]

  5. Anonymous Coward
    Anonymous Coward

    This is comical

What about multiple partners, multiple devices, the span of time, plus any lost devices or accidentally deleted photos, etc.? It's more likely that the 'victim' does NOT even have all the images to hash. Plus, even if Facebook falls back on its photo-tagging / matching AI filters, AI has a long way to go:

    "AI image recognition fooled by single pixel change"

    http://www.bbc.co.uk/news/technology-41845878

  6. Anonymous Coward
    Anonymous Coward

    Better option: Follow the Facebook ex-President:

    Delete your Facebook & Snap accounts now peeps:

    ~

    Facebook founding president sounds alarm - BBC News -

Sean Parker (Napster) "no longer used social media as it was too much of a time sink" ... "I use these platforms, I just don't let these platforms use me"

    ~

    http://www.bbc.co.uk/news/technology-41936791

  7. Mark Solaris

I'm going to register facebook-image-block-request@gmail.com and advertise it around....

    1. Anonymous Coward
      Anonymous Coward

      'I'm going to register facebook-image-block-request@gmail.com'

      Enjoy! You'll get 10 times more versus the official FB a/c.

  8. DainB Bronze badge

    Staffer they said, mmmok...

So how long will it take until those lovely chaps from 4chan who have too much free time on their hands write a bot that submits a few million pictures an hour for a poor staffer to go through?

    1. phuzz Silver badge
      Big Brother

      Re: Staffer they said, mmmok...

      "for a poor staffer to go through"

      Don't be daft, if they were staff then they'd have some employment rights. Facebook use contractors to do the nasty work, that way they can just let them go when they get PTSD.

      (for example)

    2. Agamemnon

      Re: Staffer they said, mmmok...

      Well, now that you've said something about it:

      About ten minutes.

  9. Anonymous Coward
    Anonymous Coward

    abusive scumbags

    "The best solution would be if abusive scumbags could stop being so awful."

    or you know, start by not dating or showing yourself naked in front of abusive scumbags?

There are plenty of brainless people who straight-out love to date scumbags (they call them bad boys) because they are great at creating excitement by being dominating. They then feel traumatized after the scumbags start abusing them. (The exceptions are those who got tricked by scumbags who hid their true selves.) The worst part is that, after ending one scumbag relationship, they would find yet another scumbag to date.

Whatever Facebook does will only be a patch over the wound; it doesn't fix the real problem or stop the scumbag from posting elsewhere. The victims can only fix the real problem themselves, by learning from their mistake.

    1. Francis Boyle Silver badge

      Ah, the old "women like bad boys" fantasy

      The reality is that the real 'bad boys' i.e. psychopaths have one very powerful weapon in their arsenal - they lie. Then they lie again. And they keep lying until they get what they want or tire of the game. Look up 'psychopathic charm' if you want the details. Faced with someone who is prepared to say whatever it takes, and knows how to say it well, very few people, not just women, have any real defense. Yes, you may know that they're bad boys but they're not trying to fool you. To the mark, they're a wonderful person who is just misunderstood.

Telling the victim to fix the problem themselves doesn't wash. You may think you'd learn from the experience, and I'm sure many women do. But again, we are dealing with psychopaths here. They explicitly target people they detect are vulnerable. Blame them.

    2. LewisRage

      Re: abusive scumbags

      And so the /r/incel fallout lands at the register...

  10. StargateSg7

    YECH !!!! While I'm speaking from the MALE point of view, I CAN make one startling conclusion from my many explorations of the naked female form.

    Conclusion:

    Men are mostly ugly naked! and Women mostly ARE NOT UGLY when naked!

    From pure artistic point of view, our male parts while EXTREMELY FUNCTIONAL ain't exactly the most aesthetic of forms while the female form in its entirety is MOST DEFINITELY AESTHETIC! But since most men don't give a rats-arse about sending naked photos of themselves to ANYONE, I can see where this request will end up!

Sooooo..... Facebook asking people to send naked photos of themselves is an exercise in FUTILITY, as it will BE MOSTLY MEN who would actually EVEN CONSIDER RESPONDING to this request --- I know of almost no reasonably sound-minded woman who would EVEN THINK of responding!

    Just get a computer algorithm to produce synthetic rendered photo-realistic naked humans and train the software on those!

    1. DainB Bronze badge

      "Women mostly ARE NOT UGLY when naked!"

      Actually once you fix your Caps Lock and look around MOST Of Women Are NOT THAT Attractive NAKED.

      1. Pompous Git Silver badge

        "MOST Of Women Are NOT THAT Attractive NAKED."
        Speaking as a 66 y.o. almost every young woman I see is beautiful, clothed or naked, especially when they smile. I think you need to get a grip...

        1. DainB Bronze badge

          That's your personal opinion not backed by hard facts, pun intended.

      2. 's water music
        Unhappy

        fix your Caps Lock

        Would that that were the highest priority

      3. bombastic bob Silver badge
        FAIL

        "Actually once you fix your Caps Lock and look around MOST Of Women Are NOT THAT Attractive NAKED."

        yeah, snark snark because YOU (apparently) haven't mastered the proper use of *emphasis*

    2. Pompous Git Silver badge

      "From pure artistic point of view, our male parts while EXTREMELY FUNCTIONAL ain't exactly the most aesthetic of forms"
In 1970, I was the undraped model at the Launceston Art School. At least two of the students, both female, took a very great interest in my penis. (I was the first model not to wear a jockstrap; it never occurred to me to do so.) One girl drew my member in exquisite detail, the other considerably larger than real life.

      1. David 132 Silver badge
        Happy

        @Pompous Git In 1970, I was the undraped model at the Launceston Art School.

        That's hilarious (although I have to point out that given the ubiquity of long hair on both sexes in the early 1970s, it's amazing they could see ANY flesh to paint...). Have an upvote.

        And you've reminded me of this take on the T-Rex "tiny arms..." trope by the Oatmeal:

        http://theoatmeal.com/comics/tiny_arms

    3. Anonymous Coward
      Paris Hilton

Evolutionary biology

      Men are mostly ugly naked! and Women mostly ARE NOT UGLY when naked!

      So speaks the voice of evolutionary biology. Male cockroaches find female cockroaches physically attractive too.

      Similarly, many times I have heard women remark of small babies "what a beautiful child", when plainly they all look the same and none is remotely pleasant to look at. But if mothers didn't think neonates look beautiful, then the human race wouldn't last a single generation.

      Paris, obviously, because evolutionary biology.

    4. Boo Radley

      Men Aren't Ugly

      There are SO many men that I'd love to see naked! Women, meh...

  11. ToFab

    Microsoft PhotoDNA

They are using Microsoft PhotoDNA, which is also used by Gmail, Twitter, Google, Facebook, Microsoft and others to identify child pornography. https://en.wikipedia.org/wiki/PhotoDNA

  12. jake Silver badge

    How long before ...

... one of the "trained staffer" units manages to install a screen grabber (or whatever) on their corporate porn-viewing computer, and manages to collect quite a stash?

    How long after that will it be before a miscreant manages to grab the stash? Or the "trained staffer" shares with his buds, who in turn post them online outside of Facebook's reach?

    I'm guessing less than two months for the first, and a week or ten days for the second and third.

    The wife just read ElReg's article on Facebook's plan. Her words, verbatim: "Fuck that! Do the idiots in charge of Facebook really think we're even stupider than they are?".

    Enough said, I think.

    1. Tikimon
      Thumb Down

      Re: How long before ...

      Is anyone addressing the real danger here? This is Facebook, who never ever delete ANYTHING, no matter if they tell you they did. They will keep every nude photo uploaded forever. It's run by a sociopathic asshole who cares nothing for anyone else's rights or lives. I wouldn't put it past YuckyZuck to use them as a personal amateur porn stash. ICK!!!

      I'm sure they'll justify permanent retention to themselves, for psych research or to train image recognition or some such BS.

  13. EveryTime

    I think this idea is flawed in many, many ways.

    But there is one flaw that leaps out first.

    Why does a human need to look at the picture first?

    Why can't the system automatically create the hash, and then only have human review when there is a match?

    The system can still discard the image. If there is a hash match, there will be a second copy. And at that point the image has been posted, so the humans reviewing will be limiting the 'damage' of the public release, rather than actually doing what the victim is trying to prevent (people seeing them in a compromising position).
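The flow proposed above — hash automatically, store only the hash, and bring in a human only when an upload matches — can be sketched in a few lines. Everything here is hypothetical (the class and method names, the Hamming-distance threshold); it is a sketch of the commenter's suggestion, not Facebook's actual pipeline:

```python
def hamming(a, b):
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

class RevengePornBlocklist:
    """Hypothetical match-then-review flow: no human ever sees the
    original submission, only uploads that land near a stored hash."""

    def __init__(self, threshold=10):
        self.hashes = set()          # only hashes are retained, never images
        self.threshold = threshold   # max bit distance counted as a match

    def submit(self, image_hash):
        """Victim submits a hash; the image itself is discarded."""
        self.hashes.add(image_hash)

    def needs_human_review(self, upload_hash):
        """Flag an upload for review only when it nearly matches."""
        return any(hamming(upload_hash, h) <= self.threshold
                   for h in self.hashes)

blocklist = RevengePornBlocklist()
blocklist.submit(0xDEADBEEFCAFEF00D)
print(blocklist.needs_human_review(0xDEADBEEFCAFEF00F))  # True  - 1 bit off
print(blocklist.needs_human_review(0x0123456789ABCDEF))  # False - unrelated
```

The fuzzy (near-match) comparison is what distinguishes a perceptual hash from a cryptographic one: a re-encoded or resized copy lands a few bits away rather than at a completely different value.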

    1. Flocke Kroes Silver badge

      Better yet ...

      If you must take a nude selfie, send the hash to facebook not the photograph. Give it a month and I am sure there will be a dozen Android apps to tag your photo collection that promise to send hashes to multiple social media sites and not send the pictures to newdamators.com, honest - really - cross my heart. (It took me three guesses to find a domain name that hadn't been taken.)

  14. fidodogbreath

    Welcome

    The best solution would be if abusive scumbags could stop being so awful.

    This must be your first day using the internet.

  15. Anonymous Coward
    Anonymous Coward

    What could possibly

    go wrong ?

  16. JWLong

    One of my oldest hostfile entries is:

    0.0.0.0 www.facebook.com

    Absolutely

    1. Anonymous Coward
      Anonymous Coward

      'One of my oldest hostfile entries is'

      Make sure you get all of them. Unfortunately FB central sucks data from lots of different domains, and is adding new ones all the time. Ultimately none of this will save you from FB scraping data from shadow profiles (friends / family / colleagues) and lately 'real-world' sources: Credit-Cards and Experian data etc...

      Experian:

      http://www.theregister.co.uk/2016/05/09/experian_audience_engine_knows_almost_as_much_about_you_as_google/

      ~~~~~~~~~~~~~~~~~~~

      Facebook and Experian:

      https://marketingland.com/9-new-semi-secret-facebook-targeting-options-184810

      https://www.lrb.co.uk/v39/n16/john-lanchester/you-are-the-product
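For what it's worth, a fuller hosts-file block along the lines JWLong describes might look like the fragment below. The domain list is illustrative and certainly not exhaustive — as noted above, new ones appear all the time, and none of this stops shadow-profile or third-party data collection:

```
0.0.0.0 www.facebook.com
0.0.0.0 facebook.com
0.0.0.0 graph.facebook.com
0.0.0.0 connect.facebook.net
0.0.0.0 static.xx.fbcdn.net
```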

  17. Fruit and Nutcase Silver badge
    Joke

    Monetising opportunity

    Get a look at the privates of their punters and target adverts as appropriate

    Women:

    Top:

    Small Bust -> ads for enlargement options

    Huge Bust -> ads for adult movie actors

    Muffin top belly -> ads for liposuction, corsets, gym memberships

    Bottom:

    All -> ads for topiarists, vajazzles, brazilians

    Men:

    Top:

    Thin/Flabby -> ads for body building/sculpting

    Beer Belly -> ads for liposuction, corsets, gym memberships

    Bottom:

    Average -> ads for enlargement options

    Huge -> ads for adult movie actors

    Micro -> consult Mark for suggestions

    1. Anonymous Coward
      Anonymous Coward

      Re: Monetising opportunity

      We knew that's exactly what you'd say Zuk!

  18. ratfox

    viewed by a "specially trained representative from our community operations team," to check the photos are actually legit nude snaps

    I'm surprised that in this day and age, a major tech company would actually claim it's necessary to have a human in the loop. Surely that's what machine learning is for? Seriously, even without going to machine learning, Google and others have had porn filters for ages, and they're pretty reliable most of the time.

At the very least, I'd suggest a two-tier system: a first check is done with an automated detector, and if the image is not detected as a nude, then the user is asked for permission for an actual human to look at it and check the actual nakedness. This way, most pics can be submitted without anybody having to see them, and Facebook even saves money on the manual work.

    1. Flocke Kroes Silver badge

No need to categorise submitted images at all

Facebook can collect a large number of hashes, and it does not matter what the image is until an image with the same hash is posted. At that point a human or AI must decide if the posted image is revenge p0rn. If a submitted hash regularly catches pictures of the Eiffel Tower, then the person who submitted it can get an increased "cries wolf" score.

      1. Ropewash

        Re: No need categorise submitted image at all

        Not my fault that hashes of my privates routinely match with the Eiffel tower.

        I'm a bit put out it wasn't the Burj Khalifa.
