Last week was an intense week of lecturing, which means that I have fallen behind with other work – including writing up lecture notes. One of the lectures, Dangerous bits of information, was presented at the NOKIOS conference in Trondheim, Norway. Unfortunately I did not have much time in Trondheim itself, but what I saw was a wonderful sunny city with plenty of places to sit and relax by the river that flows through the center. There was, however, not much sitting outside on this trip.
The lecture was part of the session “Ny teknologi i offentlig forvaltning – sikkerhet og personvern” (New Technology in Public Administration – Security and Privacy). In the same session were Bjørn Erik Thon, head of the Norwegian Data Protection Authority (Datatilsynet), and Storm Jarl Landaasen, Chief Security Officer Market Divisions at Telenor Norge.
My lecture began with an introduction to the way in which many organizations fail to think through the implications of cloud technology. As an illustration I described the process surrounding my university's adoption of a student email system. When the university realized that it was not particularly good at maintaining a student email system, it decided to resolve the situation.
The resolution was not to let individuals choose their own systems. Instead a technical group (it was, after all, seen as a tech problem) was convened and presented with an either/or decision: go with Google or go with Microsoft. The group chose Google out of a preference for the interface.
When I wrote a critique of this decision I was told that it was formally correct, since all the right people (i.e. representatives) were present at the meeting. My criticism, however, was not about the formality of the process but about the way in which the decision was framed and the lack of information given to the students who would be affected by the system.
My critique rests on four dangers of cloud computing (especially when adopted by public bodies) and the lack of discussion around them. The first issue is surveillance. The Swedish FRA legislation, which allows the state to monitor communication, was passed with the explicit (though rather useless) understanding that only cross-border communication would be monitored. The exception is rather useless because most Internet communication crosses borders even when both sender and receiver are within the same small state. And cross-border transfer becomes a certainty when the email servers are based abroad – as those of Gmail are.
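To make that last point concrete, here is a minimal sketch (it assumes the third-party dnspython package; any DNS lookup tool would show the same thing) that lists the mail exchangers for gmail.com. Mail to a Gmail-hosted student address is handed to Google's own infrastructure, wherever that happens to be located:

```python
# Minimal sketch: look up which servers receive mail for gmail.com.
# Assumes the third-party dnspython package (pip install dnspython).
import dns.resolver

for record in dns.resolver.resolve("gmail.com", "MX"):
    print(record.preference, record.exchange)

# The answers point at hosts such as gmail-smtp-in.l.google.com,
# i.e. Google-run infrastructure rather than servers in the
# sender's own country.
```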
The second problem is that some of the communication between student and lecturer is sensitive data. Moreover, a lecturer in Sweden is a government official – a fact most of us tend to forget but should not. So we have sensitive data being transferred to a third party. This is legal, since the users (i.e. the students) have all clicked to agree to the licensing agreements that Gmail sets. The problem is that the students have no choice (or only a very limited and uninformed one – see below) but to sign away their rights.
The third problem is that nothing is ever really deleted. This is because – as the well-known quote puts it, “If you are not paying for it you are not the customer but the product being sold” – the business model is to collect, analyze and market the data generated by the users.
But for me the most annoying of the problems is the lack of interest public authorities have in protecting citizens from eventual integrity abuses arising from the cloud. My university, a public authority, happily delivered 40,000 new customers (and an untold future number, due to technology lock-in) to Google and, adding insult to injury, thanked Google for the privilege.
Public authorities should be more concerned about their actions in the cloud. People who choose to give away their data need information about what they are doing. Maybe they even need to be limited. But when public bodies force users to give away data to third parties, something is wrong. Or, as I pointed out: smart people do dumb things.
The lecture continued by pointing out that European privacy law has a mental age of pre-1995 (the year of the Data Protection Directive). But do you remember the advice we gave and took about integrity and the Internet in the early days? It included things like:
- Never reveal your identity
- Never reveal your address
- Never reveal your age
- Never reveal your gender
Post-Facebook, points such as these seem almost silly. Our technology has developed rapidly, but our society and law are still based on the older analogue norms – the focus in law remains on protecting people from an outer gaze looking in. This matters less when information is spread by us individuals and our friends.
The problem in this latter situation is that it is extremely difficult to write laws that protect against the salami method (i.e. where personal data is given away slice by slice instead of all at once), as the toy sketch below illustrates.
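A small sketch of the idea (the data is invented for illustration): each slice looks harmless on its own, yet merged together the slices form a profile that no one ever consented to hand over in one piece.

```python
# Toy illustration of the salami method: each disclosure is one
# innocuous slice, but the slices accumulate into a full profile.
slices = [
    {"first_name": "Anna"},        # given to one service
    {"city": "Göteborg"},          # given to another
    {"birth_year": 1990},          # a quiz, a sign-up form...
    {"employer": "a university"},  # a professional network
]

profile = {}
for s in slices:
    profile.update(s)  # nobody agreed to reveal the merged whole

print(profile)
```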
At this stage I presented research carried out by Jan Nolin and myself on social media policies in local municipalities. We studied 26 policies, ranging from less than one page to twenty pages in length. The policies made some interesting points, but a strong analogue bias was clear throughout and there were serious omissions. They lacked clear definitions of social media, and they confused social media use during work time with use during free time. More importantly, the policies did not address cloud issues or topics such as copyright. (Our work is published in To Inform or to Interact, that is the question: The role of Freedom of Information & Disciplining social media: An analysis of social media policies in 26 Swedish municipalities.)
Social media poses an interesting problem for regulators in that it is not a neutral infrastructure and it does not fall under the control of the state. The lecture closed with a discussion of the dangers of social media – in particular the increase in personalization, which leads to Eli Pariser's filter bubble. In this scenario organizations tailor information to suit our needs, or rather our wants. We increasingly get what we want rather than what we need. To take a food analogy: we want food with high fat and high sugar content, but this is not what our bodies need. The same applies to information. I may want entertainment, but I probably need less of it than I want. Overdosing on fatty information will probably harm me and make me less of a balanced social animal.
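The mechanism is easy to caricature in a few lines. Here is a toy model (entirely my own invention, not any real recommender): the system boosts whatever the user clicks, so the range of what gets shown narrows over time.

```python
import random

# Toy model of personalization: clicking a topic increases its weight,
# so the recommender shows more of what the user already wants.
topics = {"news": 1.0, "science": 1.0, "entertainment": 1.0}

def recommend():
    """Pick a topic with probability proportional to its weight."""
    r = random.uniform(0, sum(topics.values()))
    for topic, weight in topics.items():
        r -= weight
        if r <= 0:
            return topic
    return topic  # floating-point fallback

for _ in range(100):
    if recommend() == "entertainment":  # the user clicks what they want
        topics["entertainment"] *= 1.2  # ...and the system feeds the want

print(topics)  # entertainment dominates: a filter bubble in miniature
```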
Is there an answer? Probably not. The only way to control this would be to limit individuals' autonomy. In much the same way as we have been forced to wear seat belts for our own safety, we may need to do the same with information. But that would probably be a political disaster for any politician who suggested it.