Facebook Emotion Study

Facebook Is Unethical

I tried not to be dramatic about it, but I sort of left Facebook. The problem is, you can never really leave Facebook. At least not when you manage over 10 accounts for clients, you have a blog, and you go to log in on random sites only to find that Facebook has its tentacles in everything on the Internet. So I decided to minimize my personal dealings on Facebook and just concentrate on my professional work there. I realize this probably only hurts me and does nothing to Facebook, but at least I won't be so angry every day. It gives me the opportunity to still check in on friends, message them, and see events, while ignoring many of the things that drive me absolutely bonkers.

Why am I so upset? You have probably heard about this, but a study was published this month by a data scientist at Facebook in which, in a nutshell, he manipulated the news feeds of almost 700,000 Facebook users to show either positive or negative updates and see whether the emotions in those feeds could affect the emotions of the users. Yup, they did! And, in and of itself, that is not too surprising, and from a research perspective it is actually fairly interesting.

My problem with this is that Facebook felt that their loosely worded terms of service would cover informed consent for human research subjects (even though their IRB did think it was "creepy"). I used to co-own a private research company that built websites to test behavioral communication strategies, and we had an outside, independent IRB that governed everything we did. Strictly. When I read about the Facebook study, my emotions were certainly manipulated, in that I felt like I was having a heart attack at the same time my head was exploding with disbelief. When you work with human subjects, they MUST be informed they are involved in a research study and have the opportunity to opt out. Every IRB training references the Tuskegee syphilis study, which changed the way we do research in America.

Did you feel yucky when you heard about this? You should. It’s absolutely wrong.

Since then, the main author has issued a mea culpa (on Facebook, of course) that reads like a 13-year-old boy who got caught throwing snowballs into traffic. For me, personally, this is not even close to being good enough when we're talking about a gigantic company that does internal research with hardly any oversight, manipulates users without their knowledge, practices unethical business and research standards, and then somehow gets published for it.

I have long had problems with Facebook because whenever they make changes to their privacy settings, they default them to the least private options. Ethically, when you build websites, you make everything most private by default and allow users to choose whether they want to open themselves up. Tricking people is not how you build trust and respect with your users, and Facebook seems to forget that if they don't have users, they don't have data.

Well, at least they now have less data from me.

This article has 23 comments

  1. Ben D

    Such assholes.

  2. zipper

    so glad I only read the internet.

  3. john

    It’s not just Facebook – it’s all of them.

  4. Matthew V

    And I have a strong feeling this is merely the tip of the iceberg, so to speak. You know they probably do all kinds of things like this, but the knowledge of them stays internal. I mean, for goodness sake, they even save everything you’ve ever started to type as a status update then deleted without posting. If we know they do that, imagine what we don’t know they do.

    • Aimee

      Yeah, people have been citing other things that Facebook has done prior to this. I am glad that they are getting investigated.

  5. Lindsey

    I can always tell when a friend has been on Facebook too much; they are always in need of someone telling them that not everyone is sick or dying, not all children are starving, and they aren't less of a person for missing a birthday on Facebook. It's ridiculous. Personally, I have stuck with Instagram as my platform, and I can get just as "real" there too! Love the new site and new name, A.

  6. Jessica

    While I think it’s creepy that they use our information without permission for things like this, I also don’t consider Facebook to be a private forum at all. My profile is private, but the website is specifically designed for sharing information, and share it does. I accidentally announced a pregnancy by asking a question in a group, only to have friends tell me that activity popped up on their side bar.

    I think it’s safe to say these days that if you’re writing about it on the internet, it better be something you don’t mind everyone seeing, sharing, or using without permission.

    • Aimee

      Jessica, therein lies the rub. If we accept that, then we have given up all control. There are some new documentaries coming out about how we blindly accept terms of service without reading them, and my guess is there will probably be a shift in the way companies are forced to give us terms of service, instead of these big long pieces of legalese that no one reads or understands. And Facebook in particular has been a hot mess regarding privacy, because they pretend that they're private when they're not. They made a huge mistake by positioning themselves as a private social network when they need public data to make money and work.

      I don't actually have a problem with Facebook or Twitter or anyone studying the data that their social network creates, because that is how they make money and are able to provide a service for free. But what people need to understand about what Facebook did here is that, first of all, they added the word "research" to the terms of service after the study was started, and then they performed a study where they changed the stream in positive and negative ways to see how emotions would change. They went in and did that. They did not passively study people who had lost friends or whose dogs had died to see how sad they got. They went in and showed sad news to people and saw how sad they got from it, without informing people that they were participating in a research study or giving them the option to opt out. This is the huge problem, and that is why scientific bodies around the world are starting to call for an investigation, because it is completely wrong and crosses so many lines I can't even handle it.

  7. Amy Evans

    Thank you so much for writing about this. I don’t think enough people understand the issue. Sure, study data – but don’t mess with us Facebook!

  8. Suzanne

    I get the impression that the main researcher has no formal education in research ethics. If he did, he would have been exposed to the Tuskegee syphilis study and the Stanford Prison Experiment and KNOWN that full disclosure is paramount to ethical research.

    • Aimee Giese (Greeblehaus)

      Suzanne – it is so crazy. I am a graphic designer/web developer with a BA in graphic design. Everyone in my company that came in touch with data or human subjects, no matter their level of education, was made to do IRB training. I am simply flabbergasted.

  9. Lucrecer

    They are bastards from the depths of hell. I’m at the point I don’t even comment as much because the stream is so full of crap. Bastards.

  10. Sarah Patterson

    Facebook are jerks. I am so glad I have such a small group of friends there (including you!)

  11. Aero Bina

    There may be more to this than we think.

  12. Krista

    I left too.
    I have an account with a made-up name so no one can find me. I manage my pages through Hootsuite so that I never have to log in to FB at all. No friends, no drama, no newsfeed. I didn't leave because of FB, the company, really. I left because I was tired of getting sucked into people's everyday drama. I also realized that I spent too much time scrolling and not enough time actually being productive.

