
"Facebook: Millennial Frankenstein's Monster" - A Data Privacy Editorial by Jenna Waters


In both theory and practice, Security and Privacy intersect in a near-perfect symbiosis: it is widely accepted that when you focus on one, you typically achieve the other as well. Understanding their respective definitions and applications is key to knowing how to protect yourself in the modern, connected world. Security refers to the measures one takes to ensure that sensitive data is protected against unauthorized access or use, while Privacy is often defined by what is or is not appropriate use of that data. As fast friends, Security and Privacy go hand in hand and typically enjoy a congenial, mutually beneficial relationship, except, that is, when personal data becomes profitable, as in the case of Facebook. It's no new concept that the monetization of personal data can blur ethical lines around privacy practices. Watching the Facebook drama play out on a global stage, however, even loyal users have to ask themselves: How far is too far? When will users finally draw the line?

When questioned, most of our parents and peers like to justify their presence on Facebook as their outlet for "seeing pictures of people's kids" or "keeping up with old high school friends," but is it worth the cost? Is the ability to avoid actually picking up the phone and calling someone so valuable that we are willing to give away our personal privacy? For many of us in the Security and Privacy industries, Facebook is Frankenstein's monster, originally birthed from eager millennial innovation to connect friends and enable them to share their lives. As in Shelley's Frankenstein, however, good intentions were ultimately corrupted. Few (aside from our grandparents, who don't notice anyway) would disagree that the Facebook experience has been ruined by greed, cyberstalking, and Farmville.

In its defense, Facebook has taken steps to implement controls around data security, and both the application and website are currently relatively secure to use. Returning to the intersection of Security and Privacy, though, security is only half of the equation. If you follow global media headlines, you are already well aware that this platform does not exactly stand as a bastion of privacy. In the past year, prominent media voices, the U.S. Congress, the EU Parliament, and users worldwide have scrutinized Facebook's questionable relationship with data privacy and consent. So, are we really surprised that Facebook has recently stumbled into yet another controversial pit of its own digging?

The company's business model relies heavily on auctioning off private user data to the highest-bidding advertiser, and user health data appears to be the latest ethically questionable tool in Facebook's money-making toolbox. Last month, the Wall Street Journal (WSJ) reported that 11 out of 70 tested iOS applications that integrate with Facebook Analytics brazenly share private health data with Facebook servers, even if the user does not have a Facebook account. Why does Facebook want, collect, and store non-users' health data? Through these third-party applications, Facebook receives highly sensitive and federally protected health information without users' explicit consent. As the WSJ discovered, the data was shared with Facebook because the applications used Facebook's software development kit (SDK). By definition, an SDK provides developers with the guides, tools, libraries, code samples, and processes needed to create applications for a specific platform. Facebook's SDK includes an analytics service that allows developers to track user trends and customize application features. The WSJ asserts that developers who send sensitive information to Facebook use "custom app events" to transmit HIPAA-protected user health data, such as daily heart rates, ovulation times, and pregnancy timelines. Let's let that sink in: ovulation times.
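To make the mechanics concrete, here is a minimal, hypothetical sketch in Python of what a "custom app event" carrying health data might look like. This is not the actual Facebook SDK; the function, field names, and payload are illustrative assumptions based on the WSJ's description of how these events work.

```python
import json

# Hypothetical sketch -- NOT the real Facebook SDK. The field names and
# payload shape are assumptions, meant only to illustrate how a
# developer-defined "custom app event" can carry arbitrary app data.

def build_custom_app_event(event_name, parameters, advertiser_id):
    """Assemble an analytics event the way a third-party app's SDK might.

    The advertiser_id stands in for the device-level identifier that lets
    an analytics backend pair the event with an existing profile, whether
    or not the person sending it has an account on the platform.
    """
    return {
        "event_name": event_name,          # developer-chosen label
        "custom_parameters": parameters,   # arbitrary app data -- here, health data
        "advertiser_id": advertiser_id,    # unique ID enabling profile matching
    }

# A health-tracking app logging a sensitive event:
event = build_custom_app_event(
    event_name="cycle_day_logged",
    parameters={"ovulation_day": True, "heart_rate_bpm": 72},
    advertiser_id="ABCD-1234-EXAMPLE",
)
payload = json.dumps(event)  # what would be sent to the analytics servers
```

The point of the sketch is that nothing in the payload marks `custom_parameters` as protected health information; the analytics pipeline treats it as ordinary event data, which is why consent and disclosure have to happen at the application layer, if they happen at all.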

The social network behemoth can now match the unique IDs within custom app events to pair users and their data with existing Facebook profiles. Facebook then uses this pairing to improve its ad-targeting tools and provide the "service" of showing users more customized advertisements. The data is also given back to third-party application developers and others tapped into Facebook's data market, helping them target their own advertising campaigns to existing and new users while they browse Facebook. It is also troubling to note that all 11 of these applications forwarded private data without notifying users via a privacy policy or terms and conditions of service, the two methods of disclosure with which users are most familiar. Facebook states in its own terms of service for the developer kit that it can use any shared data to "improve other experiences on Facebook, including News Feed and Search content ranking capabilities." It is possible that these 11 applications, and potentially even Facebook, are at risk of violating HIPAA statutes, which require a patient's consent before health data is shared with a third party. If we hold doctors accountable for a breach of patient data, will we also hold social media magnates accountable? Does having a place to post pictures of grandkids justify what may be gross violations of privacy law in the U.S.?

For its part, Facebook has officially stated that it does not collect sensitive health and financial information to use more broadly and that it "require[s] app developers to be clear with their users about the information they are sharing," a Facebook spokesperson told the WSJ. Yet this is similar verbiage to what was laughably claimed before the EU Parliament last year, and there is still no way to verify that Facebook isn't tracking your every move. So how can you protect yourself, your employees, and your business in an era of questionable privacy practices? While I can definitively say that trusting Facebook is not the answer, there are ways to mitigate and limit data tracking, though eliminating it altogether may require becoming a Luddite hermit (not an altogether bad option).

 

  • Check, re-check, and stay abreast of relevant and new Facebook privacy settings.
  • Research an application before downloading it to a personal or work-issued device.
  • Look for and inspect a Terms of Service agreement or Privacy Policy; it is a red flag if an application has neither.
  • As an organization, train employees on the importance of security and how to incorporate security principles into their private lives. (See our last blog post by Corey Bolger.)
  • Be wary of when and how you share private health data, and be aware of what you consent to have shared.
  • As an organization, do not use Facebook's Software Development Kit or Pixel web tracker for custom applications that may interact with private health or financial data.

 

Read more about how to protect yourself here.