A new ranking of major technology companies has given them low marks for protecting the privacy and freedom of expression of their users. #mediadev talks to the project's director, Rebecca MacKinnon, about the results.
The Corporate Accountability Index 2015 ranked 16 international technology and telecommunications companies on their commitment to human rights including privacy and freedom of expression.
To compile the digital rights ranking, researchers combed through user agreements, privacy policies, terms of service and corporate reports of companies such as Facebook, Vodafone and Bharti Airtel.
Google ranked the highest, followed by Yahoo, while the Asian telecommunications company Axiata and the Emirates-based Etisalat ranked lowest.
Rebecca MacKinnon is the director of Ranking Digital Rights, the non-profit research initiative behind the study. MacKinnon has long been active in the fields of freedom of expression and privacy, and is a founding member of the Global Voices Online citizen media network. She is also the author of Consent of the Networked: The Worldwide Struggle for Internet Freedom, which came out in 2012.
#mediadev: In what area did the companies in your digital rights ranking stumble most?
Rebecca MacKinnon: One of the big problems is that companies are not clear with users about what information they collect and what information they share, who they share it with and under what circumstances, and how long the information is kept. There is not enough transparency about this. To be informed users of technology, if someone collects a dossier of information on us, we need to know what they know. Companies need to allow us to know what they collect and make it possible for us to get a copy of this information.
And they don't do this?
Some do in a limited form but many companies don't at all. In some countries, the law requires it but regardless of whether the law requires it or not, it should be a standard practice that companies let us know what they collect.
Is there another area where many companies failed to make the grade in your index?
Another area is transparency reports. Some companies are starting to issue data about how many government requests they receive to remove content or hand over user information. But many companies have no transparency around this. There are also companies which receive requests from private groups, from anti-hate speech or child protection groups, and there is much less transparency about these. And there is no transparency at all about how much content companies are taking down or what type of content they are removing.
We are increasingly seeing governments putting pressure on companies to stop terror or violence against women or stop different kinds of bad behavior. We are not saying companies should allow free-for-all bad behavior. We recognize there are reasons why companies have rules and enforce rules and there are reasons why companies collaborate with law enforcement. But the process needs to be transparent and accountable.
Fundamentally, you want to have accountable governance. Companies are creating a layer of private governance on top of government governance. If you don't have mechanisms for accountability and if these companies don't have a clear commitment to our rights, including freedom of expression and privacy, and aren't held accountable when it comes to respecting those rights, it is going to be very difficult to preserve the kind of society that we want to have.
When I talk to people about Internet privacy, they often say that they have nothing to hide. What would you say to these people?
A lot of people using Facebook and other social media don't want their boss to know, let's say, their sexual orientation or their political preferences – maybe they are discussing politics online and they don't want to be discriminated against in the workplace because their boss belongs to a different party.
What about the importance of Internet privacy in developing countries?
People are getting arrested every day all over the world because of things they write on Facebook. Or if they are not getting arrested, their phones are bugged or they get a warning from their boss who got a warning from the police, so it is a real concern. Global Voices Advocacy, for instance, gets reports every week about people being arrested for things they posted on Facebook. You can get tortured to give your Facebook and Gmail passwords to the police. So the extent to which these companies have privacy policies that help people protect their information really makes a difference.
There is a real battle right now around freedom of expression and privacy online. Freedom House puts out country reports on Internet freedom, and for the past five years it has found the level is going down. In part, this is because governments are putting growing pressure on companies to engage in censorship and surveillance for them. So we are trying to find out the extent to which companies are contributing to the problem, or at least get companies to be transparent so users can make up their own minds.
How could you convince companies to change their practices?
One of the inspirations for our project is the Who Has Your Back ranking by the Electronic Frontier Foundation (EFF), which looks at tech companies' handling of US government requests for data. EFF has already seen some companies change their policies in order to get higher scores in the ranking. We are hoping some of the companies in our index will make an effort to be more transparent and be clearer about their policies, particularly if NGOs use the rankings as a tool for advocacy. Companies are less likely to change if we just put the data out there and nobody says anything. But if people ask questions such as, “Why did you do so badly on sharing data?” or “Why can't you do better than that?” then hopefully they might change.
What would be your main recommendation to companies?
First of all, companies need to do human rights impact assessments. They need to assess how their business affects people's freedom of expression and privacy, and they need to have a process for monitoring this as well as a process for accountability within the company.
Then they need to be clear to their users about what they collect and what happens to user information. They also need to be transparent about their policies for handling third-party requests from governments and other private entities to restrict or remove content, or to hand over user information. This doesn't mean they should publish the names of users whose information they share. But they can publish the number of requests and their processes for handling the requests. This will go a long way in helping people understand what is going on.
Some people would say that this would undermine these companies’ business models since they live off this data.
Even if companies don't change their business models, they could still improve their scores tremendously. What is not acceptable is lying to your users or not telling them what is going on. There are plenty of users who will still use these services with the same business model. If services are more open about what is being collected and how it is being used, people will have more trust in them.
Right now in Europe, for example, there is a backlash against Google partly because people don't feel they know enough about what is happening. If Google were clearer with users about how it shares data with third parties and how it collects data from other third parties – indicators in our index that Google didn't do very well on – this would help build trust. It would also contribute to a more informed debate about business models, because there are different models among these companies. For example, Apple has made the decision that it is not going to collect and share user data in the same way, and it has gotten a lot of publicity for that. Microsoft in some of its advertising has been implying that it doesn't do certain things that Google does. So I think that companies can use this as a competitive advantage.
At the 2015 Internet Governance Forum in Brazil, you took part in a workshop organized by DW Akademie and iRights Lab with African Internet experts. Did you learn anything from them that could inform your work?
Hearing from the group helped me understand how people want to use the data in the digital rights rankings. For instance, someone suggested that NGOs should engage with regulators about how regulations affect companies’ practices both in positive and negative ways. Several were interested in doing national level indices, taking a deeper look at a number of companies in the same market.
Looking five years in the future, are you optimistic?
If you look at the overall trends right now around human rights, the Internet and freedom of expression and privacy, it is really tough. Regulations and corporate approaches are going in the wrong direction. Ultimately, if we are to make progress, all of us need to think about what piece of this problem we can help with. My response is to contribute the Ranking Digital Rights tool, which could help push things in the right direction. If we want to have a global Internet that is compatible with human rights, we need to have everybody doing their bit.