Quote:
For one week in early 2012, Facebook changed the content mix in the News Feeds of almost 690,000 users. Some people were shown a higher number of positive posts, while others were shown more negative posts.
"What?" You might be asking. Why did they do that?
Quote:
The results of the experiment, conducted by researchers from Cornell, the University of California, San Francisco and Facebook, were published this month in the prestigious academic journal Proceedings of the National Academy of Sciences.
It's not that prestigious; don't let them fool you.
The study found that users who were shown more negative content were slightly more likely to produce negative posts. Users in the positive group responded with more upbeat posts.
One of the study's authors, Facebook data scientist Adam Kramer, later responded to the backlash:
Quote:
"I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused," Kramer wrote. "In hindsight, the research benefits of the paper may not have justified all of this anxiety."
And how did the journal's editor feel about publishing this?
Quote:
"I was concerned," she told The Atlantic, "until I queried the authors and they said their local institutional review board had approved it -- and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time... I understand why people have concerns. I think their beef is with Facebook, really, not the research."
So it's okay to manipulate your mood for this experiment, because Facebook does it all the time anyway.
Comforting.
What say you?
Facebook is evil and must be exterminated.: 7 (33.3%)
Facebook is a good company getting thrown under the wagon by alarmists.: 1 (4.8%)
Meh.: 9 (42.9%)
Face-what?: 4 (19.0%)
Total: 21