Ranga Yogeshwar: "The data you deliver will end up strangling you in the system" | Culture

Ranga Yogeshwar (Luxembourg, 1959) is something of a canary in the coal mine, able to analyze and warn of the challenges that innovation brings us. He believes that in the field of artificial intelligence we innovate without being able to understand the systems we create or the consequences they may have.

Yogeshwar receives El País at his home in Lanzenbach, a village in the countryside near Cologne. The natural habitat of this physicist and science popularizer, well known in Germany, is nonetheless the world. He travels it giving lectures and getting to know first-hand the innovations that will govern the future and change us as a society. Politicians and business leaders from Germany and around the world call on him to explain where we are heading. He does so in his book Next Stop: The Future (Arpa), which warns of the misuse of data and of how we may end up as slaves to the digitalization of our behavior. But Yogeshwar does not belong to the club of doomsayers, nor does he advocate turning back. What he believes is that politics can and should shape innovation, which must not be left to engineers and investors alone. "In the end it will be good, because we will ask ourselves the fundamental questions and we will find better answers than we have found so far."

Question. How is innovation changing us?

Answer. There are essential things that will remain. We do not want to live in a state of permanent fear, or die of hunger. But we should look at how communication has changed us; consider, for example, what happens now in families. When I was young and went on a trip, it was a time of separation from my parents, of growing up, but now when kids leave, they are connected with their family all the time. How will that affect their personality? Or take echo chambers: if you surround yourself all the time with people who think like you, you lock yourself in a bubble, and that also shapes the personality. I wonder if we will end up in a very different society, one that does not resemble the idea of a cohesive society we have had until now, but one made up of islands with a few elementary rules for relating to one another.

Q. Philosophy has spent a long time analyzing the difference between humans and animals. How will we differentiate ourselves from intelligent machines?

Our relationship with progress, for me, is like children at Christmas

A. The development of artificial intelligence forces us to rethink what we are and what makes us human. We can forgive; we love exceptions. That is, we aspire to treat everyone equally, but if someone is your friend, you do something special for them. Sometimes you are irrational: you smoke even though it makes no sense. Machines do not do that. Maybe thanks to innovation we will end up rethinking essential things, such as our focus on productivity and economic performance; there may be transformations we are not even aware of now. We may, for example, move into a post-textual era in which we tell and listen to stories instead of reading and writing, which is a very complex process. Nowadays you do not read a device's instructions; you watch a tutorial. Perhaps our intelligence will diminish as that of the machines grows.

Q. That does not sound like a very promising future.

A. I am optimistic. We see, for example, that we are starting to want to spend time disconnected, what the British call a digital detox. In the end, we will learn how to use innovation well. Our relationship with progress, for me, is like children at Christmas: they open the packages and try to make the toy work. Only when they cannot manage it do they look at the instructions. We try to go very fast, but now, little by little, we are beginning to study the instructions of the digital revolution. The conquistadors arrived in America, took over the place and killed the natives, and that same cowboy mentality is what we see in the digital age. Google and Facebook are the conquerors that take over the content, but we are entering a phase in which we begin to civilize it, and I firmly believe that we will manage to do so. We will establish rules, we will understand trends and, perhaps, in the coming years we will realize that we have to break up the Googles and the Amazons because they are too big. We will establish rules by which we will not use technologies that we do not fully understand, because there will be negative reactions. History shows us that there is always a process of civilization. In the end it will be good, because we will ask ourselves the fundamental questions and find better answers than we have found so far.

In the end, we will learn to use innovation well

Q. You explain that our future is not linear, that there is no continuity with the past, that there are disruptions. What are the consequences of this kind of progress?

A. Speed is a new quality. Electricity, for example, was introduced little by little, as was the telephone, which took 75 years to reach 100 million users. Before, there was time to adapt; it was an organic process. Sometimes it even took a generation or two, which meant it was not an intrusion into your personal life, because it was the next generation that had to deal with it. Now, if we look at Facebook or other social networks, or at smartphones: in just 11 years, everyone has one. That is, things now happen so quickly that there is no time to adapt in a civilized way. The innovations take hold, and suddenly we realize that we need laws. We have started to regulate privacy, but it is a very slow process compared to the speed of innovation. In artificial intelligence, a multitude of ethical dilemmas arise.

Q. How can algorithms and ethics coexist?

A. It is a subject that will be very present, but it cannot be treated in a traditional way, because ethics is changeable. Fifty years ago, for example, it was accepted that children could be beaten, but we changed. We need to work with an ethics that takes into account how technology will change us. For example, our concept of privacy will not be the same in a few years. It is a very dynamic situation, and since we cannot anticipate the changes, maybe we have to design an ethical process that continually questions and adapts things.

Q. Is it possible to prevent morals and the law from always lagging behind?

A. We need a culture in which progress is the result of a process of reflection by society, and not the exclusive product of engineers and investors. Do we want machines that make crucial decisions? And if we have them, how transparent should those decisions be? I am referring, for example, to the bias of the data fed into these systems, because the problem is that nobody really understands how they work. We do not know what happens in the deeper layers of these systems, and therefore we cannot know whether the right decisions are being made. For example, Google started to tag photos, but suddenly black people were being labeled as gorillas, because the data entered into the system was biased: they probably had not supplied enough images of black people. In some applications, it does not matter much if there are mistakes, but in those intimately linked to our democracy, it is not an option. You cannot detain someone without telling them exactly why; I do not want to be detained because an algorithm orders it. In the end, it comes down to the question of whether we opt for correlation or for causality, which is the basis of our democracy.

Q. What do you mean?

A. We deploy and market systems even though we do not understand them. We do not know whether they are stable or skewed. Everything is going very fast. I was at a conference with heads of human resources, and they explained to me that they used artificial intelligence for job interviews. They have a system that analyzes which words applicants use in order to build a profile. If an applicant uses certain words, the system concludes that he is an optimistic and proactive type, or the opposite. In the United States, for example, the COMPAS system uses algorithms to predict whether a detainee will commit a new crime. It turned out to have a significant bias against black detainees, because it had been fed biased data. I will not tire of saying it: data is a problem.

Robots out of the closet

Yogeshwar pauses and, on the computer in his study, shows an invention that gives an idea of how advanced voice processing has become: a machine that calls a hairdresser to book an appointment while posing as a person, and is able to hold a conversation, answering the receptionist's questions. "It's immoral. You cannot have machines that pretend to be human. There is a lack of ethical reflection. The robots of the future have to come out of the closet and say: 'Hi, I'm an artificial intelligence assistant.'"

Q. You talk about speed and complexity. Many people feel alienated and reject a progress they do not understand. There is a reactionary movement.

A. We have to ask ourselves what our objective is. To have a digital society that makes our lives easier? Or to be happier? Because if that is what we want, the two may have little to do with each other, because little by little the human being is entering the category of machines. We may end up subjected to a dictatorship of behavior. Your phone, for example, tracks the exercise you do. There are already insurers that ask for your data in order to calculate your premium. In five years we will see a man running through a park, and when we ask him whether he likes running he will answer no, but that he has to do it so that it shows up in his data. The freedom that was promised may end in just the opposite: a dictatorship of your behavior. The freedom to behave as you please, to drink a glass of wine at night, to smoke a cigarette, will disappear, because all the data you deliver will end up strangling you in the system. It starts with medical insurance and quickly spreads to many other areas. And that generates a certain distrust of the future; it is the second phase of the digital age.

Q. You are also concerned about the evolution of the media.

A. The media are a pillar of democracy, the place where it is enacted, and the media have become populist because they are governed by clicks, by audiences measured with algorithms that amplify results. Now the fight is for attention, not for ideas. In a very short period of time we have seen a change in the press that baffles many people. We see many changes in a very short time, and it makes us wonder where we are going, who is controlling the change, or whether anyone is controlling it at all. For many people, the answer is that the change manages itself, and they feel they have nothing to contribute. The elites are not able to communicate the essence of the changes, and people feel they live in a democracy in which they cannot participate. So even though people are doing well, they feel a certain unease and fear as a result of the transformation, which also affects their businesses: they have worked for many years to reach a position, and suddenly that is worth nothing. They distrust innovation and give themselves over to populism. And they blame foreigners; they think the solution is to lock themselves in, as if a hurricane were coming, when in reality the hurricane is innovation. It is pure fear.

Q. Can politics do something?

A. Politics can do a lot, but unfortunately European and German politicians have done very little. Most politicians do not understand what is happening; they are literally ignorant. When I explain, for example, that the physiognomy of cities is going to change because of Amazon's monopoly and that bookstores are going to disappear, it is something they have not even thought about. There is a huge ignorance about the most crucial change under way.

Q. And the industry?

A. They are in love with their past and do not think about the future. In the United States, 70% of companies have a head of digital transformation, someone who studies how the company will change with digitalization, or whether it will continue to exist at all. In Germany, the figure does not reach 20%. Foreign companies come here and take the most talented young people. If we look at the United States and China, we realize that Europe is going to die in the race for artificial intelligence innovation. Why, with few exceptions, is nothing born in Europe? We hand our data over to the United States, and since Snowden we know that they treat us differently. The internet, the Magna Carta of the digital era, was created to be used by everyone, and now all we see is business.

Q. What geopolitical changes will innovation bring?

A. If we analyze where scientists will come from in 2030, we see that 37% will come from China and 1.4% from Germany. Let us stop believing that the Chinese are stupid. They are very innovative and ambitious.
