Manuela Lenzen: How did you as a physicist end up at the Department of Humanities, Social and Political Sciences?
Dirk Helbing: I wanted to understand the universe on both a large and small scale. That’s why I wanted to study elementary particle physics and astrophysics in Göttingen. I also became interested in chaos theory and self-organization. But then something happened that had a strong impact on me. In Göttingen at the time there was both an extreme-leftist and an extreme-rightist political scene and they were always wrangling with each other. One time the police intervened and ended up mainly chasing the leftists. They used tear gas. A woman fled and, with the tear gas in her eyes, she ran into a car and was killed. That put the entire city in an uproar. There was a giant demonstration and public discussions about what should be done in such turbulent times. I had the feeling that everyone was speaking past each other. I thought that you should take a scholarly approach and first define the relevant terms. Then one could develop models and draw conclusions from them.
At the time I was just about to finish my physics studies and take up psychology and sociology, but then I decided to combine physics and the social sciences instead. That’s how I found myself contributing to the birth of a new research field called socio-physics. At first I worked on the dynamics of pedestrians and had the idea of analyzing pedestrian flows the way physicists analyze gases or fluids. The social-force model which emerged from that is today applied worldwide. Afterward I got involved with the modeling of opinion formation, migration and behavioral changes. So I came to the social sciences, or more precisely put: computational social science.
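The social-force model treats each pedestrian as a particle that is driven toward its goal and repelled by the people around it. A minimal sketch of the idea in Python (the parameter values v0, tau, A and B are illustrative placeholders, not Helbing’s calibrated ones):

```python
import numpy as np

def social_force_step(pos, vel, goals, dt=0.1, v0=1.3, tau=0.5, A=2.0, B=0.3):
    """One Euler step of a simplified social-force model.

    Each pedestrian accelerates toward a desired walking velocity
    (magnitude v0, pointing at its goal) with relaxation time tau,
    and is pushed away from every other pedestrian by an
    exponentially decaying repulsive force (strength A, range B).
    All parameter values are illustrative, not calibrated.
    """
    eps = 1e-9  # avoids division by zero at the goal or on contact
    # Driving term: relax toward the desired velocity within time tau.
    to_goal = goals - pos
    e_goal = to_goal / (np.linalg.norm(to_goal, axis=1, keepdims=True) + eps)
    acc = (v0 * e_goal - vel) / tau
    # Pairwise repulsion between pedestrians.
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i != j:
                d = pos[i] - pos[j]
                dist = np.linalg.norm(d) + eps
                acc[i] += A * np.exp(-dist / B) * d / dist
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel
```

Iterating this step for two pedestrians walking toward each other with a slight lateral offset makes them swerve around one another instead of colliding – the kind of self-organized evasion behavior the model became known for.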
ML: In your research you model and simulate processes to this day. How has the discipline developed in the age of data?
DH: Modeling was always a bit of an art. You have to be creative and find an appropriate mathematical framework in order to describe processes well. For this we frequently employed multi-agent models, in which individuals or cars were represented by computer “agents” with certain features. Then one could use the computer to simulate what happens when pedestrians, cars, or people with particular opinions interact. At the time it was still expensive and time-consuming to collect data and to develop and test models. But then came big data – with Google, Facebook and Amazon recording the behavior of hundreds of millions of people – and artificial intelligence, which detects patterns in that data.
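A classic example of such an agent model from the opinion-formation literature – the bounded-confidence model of Deffuant et al., used here purely to illustrate the multi-agent approach, not as one of Helbing’s own models – fits in a few lines:

```python
import random

def deffuant_step(opinions, epsilon=0.3, mu=0.5):
    """One interaction of a bounded-confidence opinion model.

    Two randomly chosen agents compare their opinions (numbers in
    [0, 1]); if the opinions differ by less than the confidence
    threshold epsilon, both agents move a fraction mu toward each
    other, otherwise nothing happens. Parameters are illustrative.
    """
    i, j = random.sample(range(len(opinions)), 2)
    if abs(opinions[i] - opinions[j]) < epsilon:
        shift = mu * (opinions[j] - opinions[i])
        opinions[i] += shift
        opinions[j] -= shift
    return opinions
```

With a large epsilon every encounter ends in compromise and the population converges to consensus; with a small epsilon it fragments into opinion clusters that no longer influence each other.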
ML: Does that make the research easier?
DH: Partly. But the problem is that science has no access to much of this data. Our task as scholars is to help improve the world and support innovations that enable economic prosperity and social progress; and moreover to judge what the potentials and risks of such technology are and if necessary to inform and also warn the public. Yet because companies and intelligence agencies are almost the only entities which have access to these amounts of data, it is difficult to judge what is being done with it, what dangers lurk, and what opportunities remain unutilized.
ML: What do you mean by unutilized opportunities?
DH: We are experiencing a sustainability crisis: we are overusing our resources. This is an existential problem, a matter of life and death. One might expect that we would now be doing everything to solve this problem – in particular, giving our cleverest scholars access to the data so they can work out what can be done to save the world. But that’s not what’s happening. This shows that it’s not humanity but particular interests which are center stage. There is something quite wrong here.
ML: What happens to the data?
DH: Many people are unaware of this, but basically for each of us a black box was created that is fed with mass surveillance data. This black box learns to behave in a way similar to us. These are effectively digital doubles. You can then do computer experiments with them. You can test what kind of information you should present to them so they will change their opinion in a certain way, vote for a certain party, or buy a certain product. And then you can apply this information to manipulate the person in question.
The collection of data also creates entirely new cyber threats. We have the problem that Trojan horses and computer viruses constantly try to infiltrate our computers and other systems. That’s why cyber-security centers were created, namely to assess sources of danger. The danger can emanate from states, from companies, institutions, or individuals. Everyone could have a hidden agenda – and consequently, everyone is under suspicion. In order to evaluate the potential hazards, an incredible amount of data is being assembled on each and every one of us.
ML: But if someone has nothing to hide . . .
DH: That’s extremely naïve. And it’s not primarily what collecting data is about. The more that is known about us, the easier it is to manipulate us – the opinions we hold, the prices we pay. You would need endless time to check every little bit of information for whether you are being deceived or not. The greater the influx of information, the more we are forced to simply do what is suggested to us. To an ever increasing extent we are being externally controlled.
ML: By corporations or government agencies?
DH: Since Edward Snowden’s revelations it has been known that the two go hand in hand. The information at the time was that over a million people had the same security clearance as Snowden. So there are a lot of people in intelligence agencies and firms who use our personal data. The state says it needs access to the data for security reasons. But evidently one hand washes the other. This is highly problematic, because companies and state institutions mutually empower one another and elevate themselves above the citizenry. Cyber-security centers are modern Stasi headquarters. There’s a zillion times more data being collected on us than the Stasi ever gathered on East Germans. I can’t imagine that this is being sufficiently controlled in a democratic way – or even can be. There’s gigantic potential for abuse here. We saw how, immediately after the attempted coup in Turkey, thousands of people were imprisoned – among them hundreds of scholars who certainly weren’t terrorists.
ML: And what about the cyber-criminals?
DH: Today’s big data approach is not reliable enough to identify criminals or terrorists. These days everyone who travels by plane undergoes a big-data-based background check. Yet 97 percent of the alarms triggered are false alarms. Despite the massive amount of data, this shows how problematic such big data applications are.
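The 97 percent figure is an instance of the base-rate problem: when genuine threats are extremely rare, even an accurate screen produces almost nothing but false alarms. A back-of-the-envelope illustration via Bayes’ rule (the prevalence and error rates below are invented for the example; real screening parameters are not public):

```python
def false_alarm_share(prevalence, sensitivity, false_positive_rate):
    """Fraction of triggered alarms that are false, via Bayes' rule."""
    true_alarms = prevalence * sensitivity
    false_alarms = (1 - prevalence) * false_positive_rate
    return false_alarms / (true_alarms + false_alarms)

# A screen that catches 99% of true threats with only a 1% false-positive
# rate, applied to a population where 1 in 10,000 is a genuine threat:
share = false_alarm_share(prevalence=1e-4, sensitivity=0.99,
                          false_positive_rate=0.01)
# share ≈ 0.99: about 99 out of 100 alarms point at innocent travelers
```

Making the test more accurate helps surprisingly little; it is the tiny base rate that dominates the result.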
ML: Why are these procedures no better?
DH: The idea that results would keep improving if one only had enough data is simply false. The more data there is, the more patterns will happen to appear in it, and many of them are spurious correlations. The danger is that AI applications used for social control treat such correlations as if they were causal. As a consequence, certain demographic groups will be discriminated against. For example, if certain parts of the city are monitored more frequently, more offenses will be detected there, and everyone who lives in such a “hotspot of crime” will be considered suspicious. Those people are then surveilled more intensely. They lose their trust in the police as a source of justice and security, and their belief in the rule of law may be ruined forever.
China shows how far this can go with its scoring system, which evaluates every aspect of an individual’s life. The videos you watch, the news you read, whether you care for your parents, criticize the government or run a red light – for everything you are assigned plus or minus points. Your final score determines whether or not you may obtain a certain job, whether you are allowed to use an express train, and how fast your Internet connection will be. That’s how citizens are digitally transformed into willing subjects.
ML: It sounds sinister. Do you also see promising opportunities in the world of digitalization and big data?
DH: Of course! It’s like with any kind of innovation. There are good and bad applications. We could boost technical and cultural progress. We could reinvent the way society is organized, the judicial system, money and financial markets. But the challenge is how to come to grips with it. There is presently a great deal which is going in the wrong direction – for example when algorithms instead of parliamentarians decide what is and isn’t possible, or when towns are transformed into “smart cities” without civic participation. Democracy and human rights are at stake.
ML: They say smart cities conserve energy and are sustainable.
DH: Exactly! Yet so far digitalization has not contributed to greater sustainability. On the contrary, energy consumption is still growing: by 2030, digital technology is expected to account for more than 20 percent of worldwide energy consumption.
And the idea of turning entire cities into business models of individual firms is a very delicate one. Everything is supposed to get better and faster, yet that expectation is mistaken. A company aims to maximize a certain target function. But a polity always has to take a great many interests into account and balance them out. This makes cities less efficient, from whatever perspective you look at them, but it also means that they survive longer than companies.
We should furthermore bear in mind that much of what is important to people – love, friendship, human dignity – cannot be adequately quantified, and can thus easily get lost in a data-driven, highly optimized world. That’s how you get a perfect society for robots, but one that is inhuman – a paradise, but freezing cold, so to speak. Earlier we had private spheres and spaces where the priority was not to maximize profit. They have now disappeared and we are increasingly feeling that human dignity is at stake.
ML: So what can be done?
DH: We have to leave these automation ideas behind and instead engage in empowerment, participation and coordination. And we have to infuse our values into the way cities are operated – this is called “design for values.” Moreover, we need multidimensional, local, real-time feedback, just as in nature. We can learn from nature’s extremely efficient material flow cycles, which are based on self-organization, multidimensionality, and participation. That’s the appropriate way of managing complex systems. Unfortunately it has to be said: either we learn this now, or many people will die. Everyone against everyone else is no longer a feasible approach, even if that is the basis of present-day capitalism and individualism.
Happily, human beings are not as egotistical as one might think. We have a social brain and together can accomplish things which individuals are incapable of. This has made us successful, but it is something that has not been sufficiently considered by economic theory. The danger is that we will become victims of simplifications and may even start believing that humans should resemble intelligent machines (or even fuse with them, as transhumanists suggest). In my opinion we need a new social contract.
ML: What exactly should this contract stipulate?
DH: The question is whether we want to live in a data-driven society where algorithms decide what we may or may not do and where just a few people hold the reins of power. Or whether there will be a new enlightenment that makes sure we will all profit from the digital opportunities in a fair way. We are in the midst of a power struggle, in a global information war for the future, which involves not just states but companies as well. They stand ready to control the societies of the future. But it is still unclear who will come out on top. It could be the Chinese system or a Google-cracy. Or something else, like a digital democracy.
ML: And how are the efforts to regulate the algorithms going?
DH: All of that is coming along at a very slow pace. The question is whether the constitutional state can muster the courage to prevail over particular economic and power interests, and whether it will finally regulate this heroized “destructive creativity” which doesn’t stop even at democracy and human rights. The motto was: move fast and break things. And now we already have the multibillion-dollar businesses which, it seems, politics no longer dares to touch.
ML: Here at the Wissenschaftskolleg you are working on a book – what is it about?
DH: Naturally it’s about the digital transformation and how we can use it to shape a positive digital future. My research group in Zurich has come up with a couple of ideas. For example, we have suggested a platform for informational self-determination and a new participatory format that we call a “City Olympics” or “City Cup.” Cities would engage in friendly competitions to come up with and apply innovations that promote sustainability, resilience, or energy efficiency. Research institutions, the media, politics and the citizenry would also be involved. In the end, everyone could pick the solution that fits best, develop it further, and combine the best of all solutions, which would be made open source. That’s how collective intelligence can thrive on a global level. Progressing together through networked participation from the bottom up – that is the goal.
Apart from that, one could combine the Internet of Things with feedback mechanisms. The external effects of our actions would be measured locally: noise, CO2, and reusable resources, but also positive things such as health, education etc. Then, with appropriate incentives, positive effects could be amplified and negative ones reduced. Thus, in a participative way, one could transform supply chains into a circular economy. This would also include a genuine sharing economy, such that the available resources would be sufficient for all of us, and a high quality of life could be reached with less consumption of resources.
From psychology we know what makes people happy: humans have a desire for autonomy, social embeddedness and creativity. This should guide the use of digital technologies. A better future would be absolutely feasible. But politics would have to keep the interests of the powerful in check. Democratic politics is primarily here to protect human dignity. That’s the foundation of everything.
ML: Where will we be in ten years?
DH: In the digital world, ten years is a long time. Things that we regard as science fiction today could become real. We have to be much more courageous in thinking about our future. If we can succeed in averting technological totalitarianism then, I believe, the world in ten years will be much better than today.