yeah, we're just going to select what posts you see, mmkay
it's for your own good, mmmkay.
we're just going to delete the ones we don't think you should be seeing, mmmkay.
don't worry about it, there's no need for you to be aware of this, mmmmmkay
Facebook has hit back at its critics after the social network instructed researchers to meddle with its users' "news feeds" in order to manipulate their emotions. The free-content ad network sparked anger when it emerged its data scientist Adam Kramer gave a green light to researchers to filter out positive and negative posts …
Facebook does this all the freakin' time. They make no attempt to hide it.
The average Facebook user has 338 "friends". If the average user makes 1 post per day, and you see (say) 15 posts when you log in, how do you think it selects those 15 out of the 338 available?
I don't know. My bet is, it's an algorithm that's constantly being tweaked, that takes into account the age of each update, number of likes, and how much you tend to interact with that particular friend on FB, among probably many other factors (such as how much they're paying FB to promote their posts).
And for a trial period, they introduced another factor into this algorithm - the "mood" of each post - and fed a differently-weighted version to each of two different subsets of users.
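The speculation above can be sketched in code. This is purely illustrative, assuming the kind of weighted scoring the comment guesses at: every field name, weight and decay constant here is invented, and it is certainly not Facebook's actual algorithm.

```python
# A minimal sketch of the speculated weighted feed ranking.
# All weights, field names and constants are invented for illustration.
import math
import time
from dataclasses import dataclass

@dataclass
class Post:
    author_id: int
    timestamp: float   # Unix time the post was made
    likes: int
    sentiment: float   # -1.0 (very negative) .. +1.0 (very positive)

def rank_score(post, affinity, mood_weight=0.0, now=None):
    """Higher score = more likely to be among the ~15 posts shown.

    affinity: how much the viewer interacts with this friend (0..1).
    mood_weight: the hypothetical experimental knob -- positive values
    favour positive posts, negative values favour negative ones.
    """
    if now is None:
        now = time.time()
    age_hours = max(0.0, (now - post.timestamp) / 3600.0)
    recency = math.exp(-age_hours / 24.0)     # decays over roughly a day
    popularity = math.log1p(post.likes)       # diminishing returns on likes
    base = recency * (1.0 + popularity) * (0.5 + affinity)
    return base + mood_weight * post.sentiment

def select_feed(posts, affinities, mood_weight=0.0, k=15):
    """Pick the top-k posts out of all candidates (e.g. 15 of 338)."""
    return sorted(
        posts,
        key=lambda p: rank_score(p, affinities.get(p.author_id, 0.1), mood_weight),
        reverse=True,
    )[:k]
```

On this reading, the experiment would just be a different `mood_weight` for each user bucket, with everything else unchanged.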
Seriously, I don't get how this is such a big deal.
NSA trawling everybody's data but not actually doing anything with most of it, that's a problem. Facebook actually submitting millions of users to a psychological test without consent and you don't get it?
I'll be honest, I've personally been "meh" about it, but only for myself and only because outside of the occasional visit home when my mother wants me to send her some stuff for Farmville, I haven't touched my accounts there in about two years, maybe three. But I DO get that if I were a regular FB user, particularly one who was attempting to use it to stay in touch with Friends and Family, I would be pissed (US, not UK) at this sort of revelation.
I don't use Facebook so I don't know much about it. I would guess that it is in business to make money, which is ok, and to look after its own interests which may or may not be ok. The traditional approach to this situation is free competition and the handicapping of monopolies.
As a disinterested onlooker may I suggest nuking from LEO and seeing if a replacement occurs and if it is any different; purely as a sociological experiment, and in the interests of pure science, of course.
From Kramer's Facebook post:
"[...]
The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.
[...]"
In other words, Facebook are prepared to distort reality to preserve their own business.
Adverts are generally obviously trying to sell something.
Most people are clearly ok with seeing adverts a lot of the time, or even deliberately seeking out an advert-laden medium as they watch commercial TV.
However, this is manipulating the users by artificially changing the content, hiding posts which they probably will have wanted to see.
It's like broadcasting two versions of Corrie - one where everything went wrong for the characters and one where everything went right - and seeing whether it made the viewers happy or sad, without their knowledge.
- Except that a week of a soap opera without disaster for someone would be suspicious in itself, which isn't true of Facebook.
If you aren't going to show all news items (which is a challenge in itself), you have to have a method / algorithm to decide which items you are going to show. It seems reasonable to test the effect of potential biases. Although Kramer doesn't have an academic affiliation, the other authors do, which usually obliges researchers to go through ethics committees for such experiments. There's no evidence in the (pretty lightweight) paper to say they did, but that's usually down to the journal's policies - still, it'd be nice to know.
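Testing the effect of a bias in practice usually means splitting users deterministically into experiment arms. A hypothetical sketch, with arm names and percentages invented purely for illustration:

```python
# Hypothetical deterministic A/B bucketing for a feed experiment.
# Arm names and percentages are invented; the point is that hashing the
# user id gives every user a stable, reproducible assignment.
import hashlib

def experiment_arm(user_id, experiment="mood-study-2012"):
    """Assign a user to one of three arms. The same user always lands in
    the same arm, so their feed weighting stays consistent for the trial."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    bucket = digest[0] % 100          # 0..99, roughly uniform
    if bucket < 5:
        return "reduced-positive"     # positive posts down-weighted
    if bucket < 10:
        return "reduced-negative"     # negative posts down-weighted
    return "control"
```

A scheme like this lets the researchers compare outcomes across arms without storing any per-user assignment table, which is presumably part of why "none of the data used was associated with a specific person's account".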
"...conducted for a single week in 2012"
read: fuck off, two years in Web 2.0 is like a millennium in the old analogue world. Sentence served, and we're good now!
"none of the data used was associated with a specific person’s Facebook account"
read: we say we anonymize data, so fuck off
"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible."
This one is straight from the Book of "fuck-off quotes".
read: we do research to spam our sheep into oblivion
"We carefully consider what research we do..."
read: "we know how to shear the sheep, so fuck off"
"There is no unnecessary collection of people’s data"
read: prove us wrong and have a good day
Oh dear, no sheep?! But it's such a fitting analogy! You give them "free" grass and you shear them and give them more free stuff and shear them more. You shoo them, or give them a kick, and all they do is - bleat. Same with milking cows.
No sheep... well, hamsters maybe? I guess you feed them (...) bit by bit and they just stuff it in their pouches, til they can hardly move along. Pretty comical, eh?
"It's hackneyed to the point of uselessness, comes off as smug and superior, and turns away the very people you might hope to sway."
AKA "You catch more flies with honey than vinegar."
And it's amazing how many people these days forget this simple lesson. For example, I myself would probably be more supportive of the ideals of the various "equality" movements, were their proponents less disposed to snottily dismiss anyone who opposes them as an unenlightened overprivileged bigot, instead of logically explaining where they think the opponent is wrong. I wonder how much opposition to their causes results from this vinegar/honey effect, as opposed to genuine political disagreement with their ideas?
I'd have thought that it was a given that the data was anonymised.
My problem with it is that they were deliberately trying to manipulate people's moods.
There are probably a fair few people that rely on seeing positive posts from others (family etc) to get through their day/week/month/year. This is bloody disgraceful, FB.
"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account."
We're not telling anyone who we messed with. Including them. So that's ok.
"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. "
This is an important step in determining how to make our cattle happy and most likely to click the adverts provided by our advertisers. The cattle will eat whatever they're fed.
"We carefully consider what research we do and have a strong internal review process."
We consider carefully if the research is in our interests and if there's money to be made. No, we're not sharing how we decided. We're very comfortable about this, so what's the problem?
"There is no unnecessary collection of people’s data in connection with these research initiatives"
We decide what's necessary.
"and all data is stored securely."
It's our data, isn't it? So you can bet it's secure. The sooner you get this into your thick heads the quicker this discussion will be over.
"Facebook knows you don't want to know **EVERYTHING** your friends do, so using complex algorithms, we carefully select the most miserable posts that we know you won't want to read. Going to work in the morning? Check your Facebook news feed before you leave and feel like shit! Thanks to Facebook, you can be made to feel like a warmed up turd all week"
After dicking with the UI and the news feed filters, and people going nuts because of it, this is the one that shows that all Facebook users are Zuck's bitches, and nobody among the baby-sharing, bacon-snapping morons will care:
Manipulate my mind, but don't mess with the UI!
Hadn't really thought about it before, but yeah, that was around the time I stopped using FB. That and it was synergistically linked to Zynga's free game server performance going to shit. And I mostly hung out for the Zynga games in the first place. I had even just reached the point where I would have been willing to toss them a Hamilton a month for the service.
Farcebook.
You cannot believe ANYTHING you see on this site (farcebook), because we manipulate it to get the most emotional response from you. Since MOST of you are "sheeple" (meaning, for the obviously ignorant among you: you are frikking sheep people who act like herd animals, enjoy getting screwed in the posterior and are too stupid to make up your own minds), you deserve this crap, along with the people who complain about being labeled "sheeple".
...A peddler of creepy advertising, like a tobacco executive who peddles cigarettes to children in developing countries.... I'd wake up and ask myself: What am I doing with my life...???
Every single day I cloak Facebook in PR spin about 'being social', but secretly I know that Facebook is highly 'addictive' nicotine, an advertising delivery device, that tricks people into clicking on ads....
I would drop my head in shame and ask: Why am I not trying to change the world: get us to Mars for example...? Then I'd probably go and eat a bullet...