Information Systems Integration – Messina

The Growing Issue of Deep Fake Videos

Deep fake videos appear to be real, unedited footage but are in fact fabricated, often with politicians or celebrities as subjects. Advances in computing power have made deep learning techniques practical, and these techniques are what make such altered video and audio possible. Deep fakes are far more convincing than traditional Photoshop-style edits.

Politicians fear that this form of media will play a major role in foreign meddling in politics. The topic has grown in importance since the 2016 election, and lawmakers are attempting to pass laws addressing such videos. They fear people will be able to make videos of “state leaders seeming to make inflammatory comments they never actually made” to undermine the country. Researchers are currently working on countermeasures that use the same technology to identify fake video and audio.

5 Responses to The Growing Issue of Deep Fake Videos

  • Wow, this is scary. I think it will make it even more difficult to get people to care about their own political footprint because they will have no idea what to believe. I think it is important to have freedom of speech, but it is also important to hold people accountable for this sort of defamation and to keep the integrity of the news.

  • Francis,
    This is probably the creepiest thing I have ever read on the community site. We always hear about “fake news” in our society today, especially with all of the “news sources” on Facebook that are constantly throwing their version of information in everyone’s face; but this truly takes the cake. Who knows how long this has even been going on? How many videos have we seen that are “deep fake” videos? This is an unsettling feeling that should not be taking place in the political spectrum.
    In my opinion, technology is always great until it crosses a line and in this case, I feel like the line has been crossed into a dangerous territory.
    Going off of what Maria said above about freedom of speech, there needs to be accountability for what we all say and do, especially in the wake of these “deep fake” videos. If we want the right to freedom of speech, we need to be accountable for ensuring that what is being said is accurate and true.

  • This is creepy… How do we know the videos we are looking at are real? When it came to fake news, much of it was written, so there was always a level of skepticism about the source. However, edited video footage of a person is scary. It makes me worried that ordinary people will not be able to tell the difference. This also brings the question of propaganda to the table. Yes, this could be used for political videos, but it could also be used in the media world. A government could adjust what celebrities and media stars are saying to fit what it wants the public to think. This is scary not just on a political level, but on a worldwide scale.

  • Hello Francis,
    I have not heard of this before, so I did some research, and looking at it from the other side, it can be used for positive things. For example, one article I was reading talked about how dancers request these videos to help improve their dancing. Another article talked about how you can detect these fake videos by looking at how the person in the video blinks. Humans usually blink a certain number of times per minute, but in these “fake” videos that often does not happen. Obviously, the people making these videos will figure out ways to correct this and make them even more realistic. This is also pushing us to think more critically and to remember not to believe everything we see and hear.

  • This is very interesting, and it is weird how people create these things with the technology we have. I think this can be very dangerous, especially with the ease of access to social media; nowadays people are so easily convinced by what they see online. I just can’t think of a plausible situation where this kind of technology benefits anyone, except maybe fixing presentation videos where people want to cover up minor mistakes.
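Editor’s note: one commenter above mentions the blink-rate heuristic researchers have used to flag early deep fakes. The idea can be sketched in a few lines of Python. This is an illustrative toy only: a real detector would estimate eye openness per frame with a facial-landmark model, whereas here we assume that per-frame open/closed signal is already given, and the threshold value is a made-up example, not a published figure.

```python
def count_blinks(eyes_open):
    """Count open-to-closed transitions; each closure counts as one blink."""
    blinks = 0
    for prev, curr in zip(eyes_open, eyes_open[1:]):
        if prev and not curr:
            blinks += 1
    return blinks

def blinks_per_minute(eyes_open, fps):
    """Convert a per-frame eyes-open signal into a blink rate."""
    minutes = len(eyes_open) / fps / 60
    return count_blinks(eyes_open) / minutes if minutes > 0 else 0.0

def looks_suspicious(eyes_open, fps, min_rate=8.0):
    # People typically blink on the order of 15-20 times per minute;
    # early deep fake generators produced far fewer blinks, so an
    # unusually low rate is one (weak) red flag. The 8.0 cutoff here
    # is an arbitrary illustrative threshold.
    return blinks_per_minute(eyes_open, fps) < min_rate
```

As the commenter notes, this kind of check is fragile: once generators are trained on footage that includes blinking, the signal disappears, which is why detection is an ongoing arms race rather than a solved problem.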
