Even if we wanted to, we could not stop giving away data and clues about who we are, what we do, or what we want. Every step we take, every connection, leaves a trace in the cloud. And that data goes, in most cases, to the private companies that run the applications we use. We know it, and we consent because we get something in return. That is the hook. As our daily lives become increasingly dependent on digital applications that supposedly make things easier, the spider web surrounding us grows ever denser, and it is no longer far-fetched to think that its sticky threads could one day become the bars of our jail.
Very sensitive data, such as our diseases or our genetic predispositions, can end up thickening that spider web without our knowledge and, of course, without our consent. We have just learned that Google has reached an agreement with Ascension, the second-largest healthcare provider in the United States, under which it can access the complete medical histories of 50 million citizens. It has already obtained 10 million patient files, and the serious part is that it received them with full names, within the framework of a project developed in maximum secrecy. Medicine is among Google Cloud's priorities for developing new lines of business; that includes operations such as the recent purchase of Fitbit for almost 1.9 billion euros, which gives it access to the huge amount of data on physical and medical parameters collected from that application's users.
Google Cloud insists that the data collected will serve to develop artificial intelligence programs aimed at improving the diagnosis and treatment of patients. But what guarantee do we have that this is really so? Medical data is sensitive material highly coveted by other agents and for other purposes: insurance companies, for example, could use it to screen out certain patient profiles.
And where will it all stop with the recordings made automatically by voice assistants from Google or Amazon? The question has gained relevance because of a double-murder case in the US state of New Hampshire. Two women were stabbed to death, and there are two suspects. In the house where the crime occurred was Amazon's intelligent assistant, Alexa, which could become the key witness in the case. For Alexa to activate when it hears a certain phrase, it must presumably be listening all the time, and in fact assistants of this kind record even without a network connection. A Dutch publication recently revealed conversations recorded by these assistants in which it was not difficult to identify several of the people involved. Google acknowledged that it transcribes conversations to improve performance, but only 0.2% of what it records.
Geolocation and voice-recognition mechanisms can monitor our activity and follow our lives. Until now we assumed this was done through algorithms designed to serve us personalized advertising. The companies insist that they guarantee the confidentiality of the data they collect. But what guarantee do we have that they will, once that data becomes a virtual gold mine?