COMMENTARY Getting caught when you least wanted to -- whatever it is you're doing -- is never pleasant. And that is exactly what is happening in the smartphone industry.
Google (GOOG) is bypassing privacy settings on Apple's (AAPL) Safari browser, the Wall Street Journal reported on Friday. If that weren't enough, the Federal Trade Commission says that third-party smartphone application developers must provide better parental disclosures about the data they collect on children.
It's no surprise that wireless vendors of all stripes are doing things that trip privacy alarms. What makes the problem particularly thorny is that they can skirt privacy barriers in ways that don't arouse obvious consumer concern.
Take Google. According to the WSJ, the company wanted to support its so-called +1 website feature that would allow people to show approval or interest, much the way that Facebook has integrated "like" buttons onto the websites of partners:
But Google faced a problem: Safari blocks most tracking by default. So Google couldn't use the most common technique -- installation of a small file known as a "cookie" -- to check if Safari users were logged in to Google.
To get around Safari's default blocking, Google exploited a loophole in the browser's privacy settings. While Safari does block most tracking, it makes an exception for websites with which a person interacts in some way--for instance, by filling out a form. So Google added coding to some of its ads that made Safari think that a person was submitting an invisible form to Google. Safari would then let Google install a cookie on the phone or computer.
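The mechanics the Journal describes can be sketched in a few lines of JavaScript. This is a reconstruction for illustration only, not Google's actual code; the function name, tracker URL, and markup details are hypothetical. The idea is to generate an invisible, auto-submitting form pointed at the tracker's domain, so the browser records an "interaction" with that site and accepts its cookie:

```javascript
// Hypothetical sketch of the invisible-form technique (not Google's code).
// Safari's old third-party cookie policy made an exception for sites the
// user had "interacted" with -- and submitting a form counted as interaction.
function buildInvisibleForm(targetUrl) {
  // Build markup for a zero-size iframe containing a form that submits
  // itself on load. The POST to targetUrl looks like a user interaction,
  // so the tracker's Set-Cookie response is honored by the browser.
  return [
    '<iframe style="width:0;height:0;border:0" srcdoc="',
    '<form id=f method=POST action=' + targetUrl + '></form>',
    '<script>document.getElementById(&quot;f&quot;).submit()</script>',
    '">',
    '</iframe>'
  ].join('');
}
```

Embedded inside an ad, markup like this runs without the user ever seeing a form, which is why the workaround went unnoticed until researchers looked for it.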
What Google may have failed to consider is how this would appear to consumers, just as in past privacy flaps.
Given that ad networks have engaged in similar practices, it's tough to extend the same benign theory to all of them. Some networks likely knew they were collecting personal data, which has great value to them.
However, many unintentional privacy breaches may have been conceived by the businesses behind them as something far different. For example, do numerous app developers really set out to pry into people's privacy? What could most of them do with the data, even if they snagged it? Few would have the infrastructure and wherewithal to make profitable use of it.
A mistake, or something unsavory?
More likely is that they thought they were doing something else, much like Web developer Anant Garg, who conceived of the same tactic that Google and the ad networks used. He wanted a "consistent experience" for users of a chat system.
The same logic probably extends to apps that collect children's data. Apps might be gathering contact data for social-network integration and usage data to better understand what people want from the app. Many small companies have a poor grasp of regulatory compliance, even with something as established as the Children's Online Privacy Protection Rule, or COPPA. Under COPPA, a mobile app is an online service, requiring the same parental disclosures and permissions as a major website.
The problem is that what starts as one thing can turn into another, like Isaac Asimov's science fiction story "The Dead Past," in which the government keeps tight control over a technology that lets someone view the past. Only vetted projects looking at periods hundreds of years ago get access.
A researcher, wanting to free the technology, makes the plans for a time viewer available to anyone. Only after it is too late and the information has been disseminated does an official explain that the government suppressed time viewing because it would become the biggest privacy invasion ever known. After all, when does the past begin? A moment ago. Now everyone's secrets will be laid open.
In other words, sometimes the greatest danger is not from deliberate nefariousness, but from the law of unintended consequences.
Image courtesy of Flickr user Anonymous 9000