The Big Data collected about us turns out to be quite double-edged. In the hands of a doctor, it can help prolong our life, and in the hands of a swindler or manipulator, it can make it hell, writes Andrey Bystritskiy, Chairman of the Board of the Foundation for Development and Support of the Valdai Discussion Club. The article is published as part of the Valdai Club’s Think Tank project, continuing the collaboration between Valdai and the Argentine Council of International Relations (CARI).
Generally speaking, the St. Petersburg Forum is essentially an economic one, but the economy is about people, and people live in a communications environment. Communications, in turn, are undergoing radical changes. Everything that people obtain through the economy is inevitably reflected in communications. We therefore need to rethink the position of the main subject of the economy, the person, in the new information and communication environment, in the world of Big Data.
In general, the discussion revolves around human rights in this “new world”: primarily the right to privacy, the right to independent action, and sovereignty of action and judgment.
The heart of the problem is clear enough: Big Data, and the communications in which that data is immersed, create fantastic new opportunities. They offer speed and ease in the circulation of money and documents. There are new opportunities for business, which is gaining hitherto unthinkable knowledge about its real and potential customers. There are, at the very least, elements of artificial intelligence that make it possible to create completely new and effective market strategies, and to shape more than just market behaviour. And much, much more. The result looks like some kind of splendid new world, powerful and bright.
The main question is how to balance the obvious benefits of using Big Data with the natural human desire to preserve a private space and to act independently.
We can, of course, argue that the right to privacy is a relatively recent invention, and that in a 19th-century village, for example, this right was severely limited by living conditions. One might add that the desire to protect privacy is a consequence of man's sinful nature. What would a perfectly honest person have to hide? Their income is legal, and their behaviour is moral.
Alas, the trouble is that people are what they are: characterised by aggression, envy, intolerance and eccentricity, and sometimes by more dubious traits. History shows that certain features of behaviour, origin and religion have often led to persecution, destroying a huge number of lives in both the literal and figurative sense. If there is any vector in the humanitarian development of the world, it is the vector of gradual emancipation: individuals becoming able to acquire their own protected space.
Today, technological advances, designed to make our life easier, more comfortable and more interesting, challenge our right to be ourselves without dangerous consequences. This, if you like, is a question of human freedom, which, as you know, can be limited only by the equal freedom of others.
Unfortunately, there seem to be no simple answers to these questions, if only because there are no general rules of regulation in the new information and communication environment. The circulation of Big Data is international in nature; there are no borders here. But there are no global rules either: laws are predominantly national. The technologies used in the new environment are only partially manageable.
The Covid-19 pandemic showed that general safety concerns necessitate the disclosure of some medical data, for example, information about vaccinations. But the pandemic has also shown that it is extremely difficult to reach agreement on how to handle this data, and how to create and control these databases.
The problem is further aggravated by the extreme heterogeneity of those who collect and store data: private companies, states, public organisations of all kinds, and private individuals. And that is to say nothing of outright criminals. It is no accident that there is a rather serious struggle over where people's data should be physically stored. Court cases on this topic are being heard in virtually every part of the world.
In 1976, the movie “Non-Transferable Key” was released in the USSR. At bottom, it is about data, even Big Data. But above all it is about trust, and about the right to privacy. In the film, a young and progressive teacher (she would now be approaching seventy) communicates actively and very frankly with her students, engaging them in discussing all the problems of the world. The students record these conversations on a tape recorder. The recordings fall into the hands of one of the parents, who passes them to the school director. A terrible situation arises: statements made in private conversations become public, creating serious problems for everyone. And the key that can neither be betrayed nor transferred is trust. A kind of blockchain, if you will.
Therefore, no matter what technical or legal means of data protection we create, the problem of protecting our privacy cannot be solved without trust.
In general, the issue of privacy in the world of Big Data is a complex one, and a necessary element of its solution is discussion. It will, in fact, be discussed at the Valdai Club session at SPIEF-21.