If we can’t believe what we see, then what shall we believe?

Article by Nerea Balenciaga, Diana Feo, Alejandro Mosteiro, Mario Reyero, Iziar Sánchez and Ángel Valverde

The so-called deepfake technique consists of recreating video images of anonymous or well-known people, authorities, or politicians from multimedia content, while modifying their audio, physical traits, and mouth movements.

The word deepfake comes from the combination of two terms: deep learning and fake. Deep learning is one of the main branches of artificial intelligence, and fake simply means false. The name is, without a doubt, perfect, because it refers to a technology based on the falsification of images, and not always in a positive sense.
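To give a rough idea of how deep learning produces these falsified faces, the classic deepfake setup trains one shared encoder with a separate decoder per person, then swaps decoders. The following is only a minimal, toy sketch of that data flow using random NumPy weights (the dimensions, weight names, and functions here are our own illustrative assumptions, not any real tool's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a flattened 8x8 grayscale "face" compressed to 16 latents.
FACE_DIM, LATENT_DIM = 64, 16

# Shared encoder weights and one decoder per identity. These are random
# here; a real deepfake trains them on many hours of footage of each face.
W_enc = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.1
W_dec_a = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.1
W_dec_b = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.1

def encode(face):
    """Shared encoder: maps any face to a compact latent code
    capturing pose and expression."""
    return np.tanh(face @ W_enc)

def decode(latent, W_dec):
    """Identity-specific decoder: renders a face from the latent code."""
    return latent @ W_dec

# The swap: encode a frame of person A, but decode it with B's decoder,
# so B's appearance is rendered with A's pose and expression.
frame_of_a = rng.standard_normal(FACE_DIM)
fake_frame = decode(encode(frame_of_a), W_dec_b)
print(fake_frame.shape)  # (64,)
```

Real systems use deep convolutional networks and hours of training video, but the trick is the same: one network learns what faces have in common, and the identity-specific decoder paints the target's face on top.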

Economic savings, huge time savings and many personal benefits.

As with every new technological advance, when speaking about the customization of videos known as deepfakes, we have to analyze their pros and cons, because deepfakes have gained importance and their risks are more visible now than ever before.

But taking into account the pros of deepfakes, it is a great invention that will surely benefit many industries around the world. Imagine that, in movies, you no longer have to film a scene many times when the actor does not deliver the script properly: you just change the voice and the mouth in post-production, saving a lot of time and money. Another example in this field is the film “Rogue One: A Star Wars Story”, in which, thanks to this technique, Peter Cushing was brought back to life. But what would happen if other figures, such as painters, could be “brought back to life”? Imagine Van Gogh or Dalí taking you on a tour of their own exhibitions and explaining their paintings to you. Another example from the film industry is the use of this technique in the Fast and Furious movies after Paul Walker died in 2013.

There are also uses outside the film industry, thanks to the development of personal avatars. Through this feature, we could benefit by, for example, trying on new clothes, or even a new haircut, without leaving home. It would be very useful for us, but it would also be very helpful for research. By creating fake images of brains, lungs or any other organ based on real patients, researchers will be able to study different diseases.

Stealing people’s identities and playing with their opinions: cheap fakes and political weapons.

Using someone else’s identity (the face, in this case) without their consent can be a serious concern. That is what happened to several actresses whose faces were used in pornography without their approval. So far this has happened mainly to famous people, because many hours of video footage are required to create this kind of content.

One example occurred on Twitter during the 2020 US presidential election, when deepfakes were used to play with people’s opinions in order to change their vote.

But are deepfakes the biggest threat nowadays? We seem to focus so much on this technology that we forget about other forms of deception that pose a greater menace: less sophisticated, but just as successful. This is the case of the so-called cheap fakes and their use as a political weapon. Making a deepfake requires time and expertise in video editing, while making a cheap fake is much easier. Cheap fakes are videos that are edited, decontextualized, or otherwise modified to spread hoaxes. And they work. Some of the most famous cheap fakes show politicians who seem to be speaking under the effects of drugs or alcohol, when in fact the footage has simply been slowed down in post-production.
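The slowed-down cheap fakes just described involve no artificial intelligence at all: the editor only stretches the timeline. As a minimal sketch in plain Python (the function name and values are our own illustration, not a real tool), slowing a clip amounts to remapping each frame’s timestamp:

```python
def slow_down(timestamps_s, speed):
    """Remap frame timestamps so the clip plays at `speed` x real time.
    A speed below 1 stretches the video: 0.75 makes speech sound slurred."""
    if speed <= 0:
        raise ValueError("speed must be positive")
    return [t / speed for t in timestamps_s]

# Four frames of a 30 fps clip, slowed to 75% speed: the clip now lasts
# a third longer, and the speaker appears drowsy -- a classic cheap fake.
original = [0.0, 1 / 30, 2 / 30, 3 / 30]
slowed = slow_down(original, 0.75)
print(slowed)
```

Any video editor can apply this kind of change in seconds, which is precisely why cheap fakes spread so easily compared with true deepfakes.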

Spanish legislation and final assessments.

Furthermore, using someone else’s identity (the face, in this case) without their consent can be a serious concern. And what does Spanish legislation say about it? “In Spain, crimes are never classified by the technology used, but by the intention with which they are carried out and the legal right against which they attack,” affirms Borja Adsuara, professor and expert lawyer in Digital Law. In this respect, our belief is that if material of this nature disseminated in Spain is obviously misleading, or carries a warning that it is fake, it is not meant to confuse anybody, and we face a simple case of freedom of speech. On the contrary, if a person’s likeness is used credibly for political coercion, for instance, we could be facing electoral fraud or even crimes of libel and slander.

While doing this research, we also found some provisions in Spanish legislation that directly challenge deepfakes and that open another, parallel debate. In this case, the rights violated are the rights to honor and to one’s own image, which are defined as fundamental rights in Spain, as laid down in Article 18 of the Spanish Constitution.

Organic Law 1/1982 expressly states in its Article 7 that illegitimate interference includes “the capture, reproduction or publication by photography, film or any other procedure, of the image of a person in places or moments of his private life or outside of them” (Art. 7.5) and “the use of a person’s name, voice or image for advertising, commercial or similar purposes” (Art. 7.6). So, does using the image of Lola Flores for a beer commercial violate her fundamental rights? According to current Spanish legislation, the answer would be yes; however, as the matter is not fully regulated, her family’s approval was enough for the commercial to be made. On this basis, we can also agree with the idea that these videos violate the image and honor rights of the people who appear in them.


So, the real question is: if we cannot even believe what we see, then what shall we believe?

If you want to try it for yourself, we recommend the app Deep Nostalgia, which brings old photographs to life, or Reface, which puts your face on the bodies of many celebrities.
