Reflections on the FTC Social Networking Privacy Roundtable

Yesterday, I had the honor of being part of the Privacy Roundtable put on by the Federal Trade Commission, held at the University of California, Berkeley.  The other six panelists were Chris Conley from the ACLU, Tim Sparapani of Facebook, Nicole Wong of Google, Erika Rottenberg of LinkedIn, Lillie Coney of the Electronic Privacy Information Center, and Ian Costello of LivingSocial. I want to thank Peder Magee and Michelle Rosenthal of the FTC for inviting me to be part of this esteemed panel.

Highlights and personal observations:

  • To what degree is privacy an issue, and what role should regulation have?  Google, LinkedIn, and Facebook echoed a similar response: users are, for the most part, quite savvy.  “Users trust us,” and because of that fact, they feel safe sharing their lives with their friends and community. “And we trust them,” by allowing communication to flow relatively freely and not taking a heavy hand in moderation.  Technology changes so fast that it’s difficult to write rules that don’t inadvertently cause friction, since “data is the lifeblood of social networks”.
  • Portability is key: Nicole mentioned that Google engineers have a penchant for starting their own movements– and one of them is the “Data Liberation Front”, which wants users to be able to migrate or remove their data freely if they so choose.  Of the couple dozen services that Google offers, the majority do allow you to leave freely and easily.  For example, you can export all your Gmail and relationships and move to a competing service.  Users knowing they can leave freely at any time, as opposed to having to act on it, has been important to the transparency and growth of their services. Erika of LinkedIn and Tim of Facebook noted that there are many social networks– the bar for entry is low and users can pick up their data and go.
  • “We absolutely compete on privacy”: There are dozens of social networks and search engines out there, and the leaders are those that maintain user trust. Google has cross-functional privacy teams that meet regularly to review issues and find ways to give more control to users. At the same time, implementing fine-grained privacy controls is difficult– allowing users to place their friends into groups, or to set sharing rules at the individual-message level, is an engineering challenge.
  • No one size fits all: It’s nearly impossible to set smart defaults for what data is shared with whom. Chris Conley of the ACLU noted that a gay student was outed when friends in his hometown discovered that he was a fan of a GLBT page on Facebook.  As an aside, the relationships between people are so powerful that we can predict who you are just by knowing what your friends like and do– we don’t need to know much about you directly.
  • A free model creates incentives for advertising abuse: Third-party developers often have to choose between playing by the rules or maximizing their earnings. There is certainly a trade-off.  This trade-off exists for the social networks themselves, too, as they do not charge a monthly membership fee (LinkedIn being an exception).  Yet, market forces encourage playing by the rules in the long run, since short-term abuse by running deceptive ads will hamper long-term growth, as this Dennis Yu guest post details.  The crowdsourced feedback model, where users can report bad behavior, helps create a self-policing system.

I’m excited to see how regulation adapts to the growth of publicly shared data across an increasing number of devices.  Certainly, privacy is not “dead”, nor can it be unilaterally and suddenly stripped away from hundreds of millions of users. The problem with regulation is that it applies not just to the bad actors hiding in the shadows, but also to the good guys.  And abuse is not necessarily malicious in intent. Should there be sufficient examples of unintentional sharing– perhaps a senior citizen’s health records are accidentally revealed because a mobile app accesses that information on the phone, or a woman’s child is publicly revealed to be autistic– we will potentially see HIPAA-like laws come down. The lack of harmonization on privacy laws across countries makes this even more complex.

I concluded in my final remarks that regulation is often a blunt tool– like trying to fix a broken watch with a sledgehammer.  Education is the better answer, as it will help minors and adults alike protect themselves.