In his Time magazine article from two weeks ago, Apple CEO Tim Cook declared that people deserve privacy online. “Consumers shouldn’t have to tolerate another year of companies irresponsibly amassing huge user profiles,” he wrote. Back in October 2018, in a keynote speech at the 40th International Conference of Data Protection and Privacy Commissioners in Brussels, he warned, “Our own information, from the everyday to the deeply personal, is being weaponized against us with military efficiency.”
He was giving voice to the growing concern over corporate misuse, and even abuse, of personal information. While Facebook, followed perhaps by Google, bears the brunt of negative privacy publicity, almost all large companies either misuse personal information or fail to protect it adequately.
The right to privacy from government spying is alluded to in the Fourth Amendment to the U.S. Constitution. Justice Louis Brandeis called it “the right to be let alone.” If people must be given privacy from government snooping, they should surely expect the same protection from commercial snooping.
European law is even more explicit: Article 8 of the EU Charter of Fundamental Rights states, “Everyone has the right to the protection of personal data concerning him or her.” On these bases, it is fair to say that companies have a moral, if not constitutional, obligation to protect users’ data privacy.
But companies have an additional reason to protect privacy: the combined effect on profits of lost brand image, the cost of mitigation and recovery, and the potential for increasingly severe compliance fines provides a compelling economic argument for protecting personal privacy.
It is against this background that the Internet Society published on Monday (Data Privacy Day) its Privacy Code of Conduct (PDF) — nine steps that all companies should take to ensure data privacy. The first principle captures this dual moral and economic need: Become Data Stewards. “Act as custodians of users’ personal data — protect the data, not just out of business necessity [legal and economic], but on behalf of the people who have trusted you with it [moral].”
The remaining eight steps comprise:
Be accountable. This effectively means ‘be transparent’. Submit to independent privacy audits; and if anything goes wrong, be open about it.
Don’t hide behind ‘user consent’. A user might consent to the collection of certain personal data, but that does not give a company carte blanche over how that data is used.
Provide user-friendly privacy information. Companies should do this as a matter of course — but it should be noted that failure to do so is not without legal ramifications. On 21 January, the French data protection regulator (CNIL) fined Google €50 million because, in part, “the information provided by GOOGLE is not easily accessible for users,” and where it is accessible, “is not always clear nor comprehensive”.
Give people control over their privacy. This combines some of the other principles: allow users to see how their data is used, and give them control over that usage.
Respect context. Again, this overlaps with other principles: privacy controls should be easy to use, and privacy should be the default, not an option.
Protect “anonymized” data as if it were personal data. Just because personal data has been anonymized does not mean companies can be cavalier about its use. De-anonymization is relatively easy, especially when anonymized data is combined with other, identifying datasets; individuals can still be recognized.
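How easily de-anonymization can happen is worth illustrating. The following Python sketch uses entirely invented records and field names (a hypothetical “anonymized” medical dataset and a hypothetical public voter roll): joining the two on quasi-identifiers such as ZIP code, date of birth and sex is enough to put a name back on a sensitive record.

```python
# Hypothetical illustration only: all data and field names are invented.
# An "anonymized" record set that still carries quasi-identifiers can be
# linked against a public dataset sharing those same fields.
anonymized_records = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
]
public_voter_roll = [
    {"name": "J. Doe",   "zip": "02139", "dob": "1950-01-02", "sex": "M"},
    {"name": "A. Smith", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def reidentify(records, roll):
    """Link 'anonymized' records to named individuals by matching quasi-identifiers."""
    matches = []
    for rec in records:
        key = tuple(rec[q] for q in QUASI_IDENTIFIERS)
        for person in roll:
            if tuple(person[q] for q in QUASI_IDENTIFIERS) == key:
                # A unique combination of quasi-identifiers exposes the name.
                matches.append((person["name"], rec["diagnosis"]))
    return matches

print(reidentify(anonymized_records, public_voter_roll))
```

The point of the sketch is that no field in the “anonymized” set is a direct identifier, yet the combination of ordinary attributes is unique enough to re-identify the individual, which is why the code of conduct insists such data be protected as personal data.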
Encourage researchers to highlight privacy flaws. The days of companies trying to protect their reputation by threatening legal action against researchers should be long gone. We are now in the era of bug bounties, and this is a good thing. Paying researchers to find flaws makes economic sense — and is generally more effective and efficient than relying on in-house staff alone. Companies should now “provide an open, transparent process for responsible disclosure.”
The final principle brings us full circle to the combination of moral and legal requirements for data privacy: ‘Set privacy standards above and beyond what the law requires’. It is companies, says the Internet Society, that “should set the next generation of privacy standards.”
The nine steps in the code of conduct will not by themselves make a company compliant with data protection regulations — but if they are incorporated into a company’s business DNA, the processes, procedures and controls needed to make them operate effectively will put any company in a strong compliance position.
The Internet Society has more than 95,000 individual members. It is the organizational home of the Internet Engineering Task Force (IETF) and the Public Interest Registry (PIR), which manages the .ORG, .NGO and .ONG domain names. Vint Cerf and Bob Kahn, who are considered the “Fathers of the Internet,” are founding members of the Internet Society.
Related: Industry Reactions to Data Privacy Day
Related: Privacy Fears Raised Over Facebook Messaging Apps Integration
Related: Flood of Complaints to EU Countries Since Data Law Adopted
Related: Data is Currency. Treat it That Way to Strengthen Privacy