I tried not to be dramatic about it, but I sort of left Facebook. The problem is that you can never really leave Facebook. At least not when you manage over 10 accounts for clients, you have a blog, you go to log in on random sites, and Facebook has its tentacles in everything on the Internet. So I decided to minimize my personal dealings on Facebook and concentrate only on my professional work there. I realize this probably only hurts me and does nothing to Facebook, but at least I won’t be so angry every day. It gives me the opportunity to still check in on friends, message them, and see events, while ignoring many of the things that drive me absolutely bonkers.
Why am I so upset? You have probably heard about this: a study was published this month by a data scientist at Facebook in which, in a nutshell, he manipulated the streams of almost 700,000 Facebook users to show either more positive or more negative updates, to see whether the emotions in those streams could affect the emotions of the users. Yup, they did! And that, in and of itself, is not too surprising, and from a research perspective it is actually fairly interesting.
My problem with this is that Facebook felt its loosely worded terms of service would cover informed consent for human research subjects (even though their IRB did think it was “creepy”). I used to co-own a private research company that built websites to test behavioral communication strategies, and we had an outside, independent IRB that governed everything we did. Strictly. When I read about the Facebook study, my emotions were certainly manipulated, in that I felt like I was having a heart attack at the same time my head was exploding with disbelief. When you work with human subjects, they MUST be informed that they are involved in a research study and have the opportunity to opt out. Every IRB training references the Tuskegee syphilis study, which changed the way we do research in America.
Did you feel yucky when you heard about this? You should. It’s absolutely wrong.
Since then, the main author has offered a mea culpa (on Facebook, of course) that reads like a 13-year-old boy who got caught throwing snowballs into traffic. For me, personally, this is not even close to good enough when we’re talking about a gigantic company that does internal research with hardly any oversight, manipulates users without their knowledge, practices unethical business and research standards, and then somehow gets published for it.
I have long had problems with Facebook because whenever they change their privacy settings, they default them to the least private option. Ethically, when you build websites, you make everything as private as possible and let users choose whether to open themselves up. Tricking people is not how you build trust and respect with your users, and Facebook seems to forget that if they don’t have users, they don’t have data.
Well, at least they now have less data from me.