"There is a serious debate to have about these algorithms"

Article  by  François QUINTON  •  Published 13.12.2016  •  Updated 12.12.2016
Privacy, algorithms, human rights… Interview with Sherif Elsayed-Ali (Amnesty International).

Sherif Elsayed-Ali is Head of Technology and Human Rights at Amnesty International. We met him during the Web Summit 2016.
 
 
In 1999, Sun Microsystems CEO Scott McNealy said we had “zero privacy anyway. Get over it”. He was followed by people such as Facebook CEO Mark Zuckerberg, who said that “public is the new social norm”. What do you think of the hypothesis that new technologies mean the end of privacy?

Sherif Elsayed-Ali: I think privacy means different things to different people, and it changes over time. It changes with changes in people’s lives; it changes with the technology they use. What we think of as privacy now is very different from what we thought it was before the internet. But that doesn’t mean people want everything to be public, or that they don’t want to have private conversations.
People might be sharing a lot on Facebook, but if you share something with 300 or 500 friends it’s not fully public and it’s not really private either; it’s semi-public. That doesn’t mean you’ll put everything there.
 
Even when you look at the younger generation, lots of people say young people don’t care about privacy. All the polls on this issue show that this is not true: they just have a different view of privacy. They know exactly what they are sharing, with whom they are sharing it and on which platforms. Many are not using Facebook as much because their parents are on Facebook and they want to keep things within their friends’ circles.
 
I think saying that privacy is dead is wrong and also quite dangerous, in the sense that if you start building your products and new technologies on the assumption that people don’t care, you underestimate what people expect. The difficulty is there: a lot of the time, there is very little information about how privacy is protected, how data is being shared and how data is being used by companies, and that is very disempowering for people. If people have clear information about how their data is being used and who has access to it, they can make informed decisions about which services they use or don’t use. That’s something that is really lacking.
 
 
Are people sufficiently aware of their rights regarding their private data?

Sherif Elsayed-Ali: No, I think it’s very obscure, and I think there is a very serious failure on the part of most companies to make people aware and empower them to choose what they want to share and with whom. For any one of us who uses the internet every day, for email, Facebook and so on, there is a lot of data that gets recorded and tracked, and your activity on the internet is tracked by so many websites. It’s all very unclear. People don’t know, because there is nothing that gives them the information in an understandable way.

Many people say it’s a lost cause, that there is no privacy on the internet, because we don’t know what’s happening and we don’t know who is doing what. But that has an impact: in many cases people stop using certain things, stop doing certain things. We have to think about devices, for instance the Amazon Echo or the new Google speakers. Are people actually adopting them at the same rate as they would if they knew, and trusted, that their information was private? Once you have a device in your room that is always listening, that is a very serious issue. Many people may not be rushing to buy these new things because they don’t know what they do and are worried about the privacy implications.
 
And I think that lack of transparency from companies, and the fact that they are not taking this seriously, can itself slow the adoption of technology.
 
 
How do you see social networks: as a means to empower people and promote freedom of expression, or as a trap, given advertising and political targeting? Facebook, for instance, can classify its US users into 98 political categories…

Sherif Elsayed-Ali: It’s important to say that, overall, the internet and social media have empowered people, with access to information, access to communications and the ability to speak out; many people who couldn’t reach the public before are able to do so now. So I think it has been really empowering and democratizing, and we should not minimize that.
 
But just because there is a lot of positive doesn’t mean there are no problems. On privacy, I think the biggest issue is that people can’t make an informed choice. Now we have these cookie notifications, which, as well intentioned as they were, are probably the most disempowering thing possible: they don’t do anything, there is nothing you can do, they’re completely useless. It’s not as if you can say “no, I don’t want to be followed”, because every website does it.
 
But at the same time you get issues, particularly with Facebook, around the social bubble, i.e. the algorithms make it so that you only see things that match your opinions. I think there is a serious debate to have about these algorithms, about how they work and about what you see or don’t see, but there is also a personal responsibility.

 
We are the bubble in our everyday lives as well…

Sherif Elsayed-Ali: Yes, exactly. There is nothing that stops people in most countries – there are of course some exceptions where people are restricted in what they can access – from reading newspapers that take an opposing view, just to see the arguments; you can follow different Twitter accounts. There are lots of things you can do if you want to put the time into it.
 
I don’t think it’s necessarily an issue limited to social media, because on TV, on the radio and in magazines you can do the same thing: you can just read the things you agree with. But because, for so many people, something like Facebook is such a big window on information, there is a big responsibility.
 
With Google News – and it’s maybe a good difference – you select the topics you want to follow, but within each topic there might be a greater diversity of opinion. But I think it’s essential to have a serious discussion about algorithms, because some companies have so much control over what everyone sees.
 

How do human rights and privacy issues differ between democratic countries and authoritarian regimes? Are the questions different, or is it a matter of intensity?

Sherif Elsayed-Ali: I think it’s a little of both. In some of the most established democracies, where there is freedom of expression generally, we don’t see that much censorship or cracking down on the opposition, but we do see a lot of surveillance – a lot of mass surveillance and targeted surveillance. A few days ago came the revelation that the Montreal police had been spying on six journalists in Canada. Last year, Amnesty International found out that GCHQ had been spying on Amnesty’s communications. So these things still happen, even in those places.
 
On the other hand, we see other places, like Bahrain, where if you are an opposition activist you run a very serious risk of being targeted with surveillance that then feeds into patterns of violations of your rights and freedoms. In many places, if you are an activist you are at risk of detention, at risk of torture, in some cases even of being killed. The use of surveillance to target the communications of activists and the opposition, to spy on them, can lead to very serious consequences for people. It’s also interesting to see that a lot of these technologies come from Western companies.

 
Do you think these companies don’t take sufficient account of the context in these countries?

Sherif Elsayed-Ali: Absolutely. It’s very clear, if you look at international human rights standards, that if you are a company providing surveillance technology, you shouldn’t be selling it to a country like Ethiopia that has a track record of cracking down on dissent and imprisoning activists.
 
 
Do you think governments are sometimes slow to understand what is at stake with new technologies when it comes to political issues such as human rights? And if so, how does Amnesty International deal with it?

Sherif Elsayed-Ali: If you look around the world, even on issues that are years old, like privacy and surveillance or violence against women online, very few countries are dealing with them: very few laws are up to date and very few countries are trying to update them. It’s just starting now, a little bit, and it’s already quite late. But there are new issues, like the regulation of employment in driving services such as Uber and others, or the use of predictive policing, which is unregulated even though the technology is already being used. You have more and more artificial intelligence coming, as well as automated decision-making systems that affect people’s access to finance, to loans, to benefits and so on.
 
One of the most serious issues over the next ten years is the potential for huge mass automation. If you are a government and you know that the World Bank says that in OECD countries, on average, just under 60% of jobs are at risk of automation; if you have seen that very authoritative report and you know that the technology trend is going in that direction, that big companies are looking at automation in factories, in call centers, in service industries… how can you not be taking this issue seriously? How come, as a government, you don’t already have a commission drawing up a national plan of action to adapt to these changes? How are you not looking at your education system and how to change it? If you talk to people working in the technology sector, they will tell you the same thing: the skills one learns at school or at university will probably be obsolete by the time one graduates. What are you doing about this? What does that mean for your workers? I think governments are for the most part – there are probably a few exceptions – really failing to address these challenges. They are not easy challenges, but if they are not among the key challenges you are trying to address, then you are failing your country.

--
Photo credit: Ina
