Deepfakes and National Security: How KBR is Calming the Worry of Deception


How do you know if something is really… real?

Each time we sit down to watch a non-animated video, we expect to see a recording that is grounded in reality.

Sure, the lighting could be tweaked and a green screen could alter the background. Even the voices could be changed. But what a person says and how they look – including their expressions – those can’t be digitally modified, right?

Thanks to the capabilities of modern technology, false video and audio content is more prevalent than ever. As a key provider of technical solutions to the U.S. government and its allies, KBR values its ability to detect fraudulent content, a capability pertinent to intelligence and national security.

This form of digital media, known by the term “deepfake,” refers to computer imagery created with sophisticated technologies like artificial intelligence (AI) and machine learning (ML) to replace one person’s likeness with another’s in a recorded video.

“Technology is advancing at a tremendous pace and it is harder to identify an original video,” said Derrick Nixon, Defense and Intel Vice President for KBR’s Government Solutions. “That means there is endless potential to influence a large audience. The biggest goal of deepfakes is to create mistrust.”

Deepfakes initially began innocently as a way to swap one person’s face onto another, such as on social media or in film. Viewers may recognize a few popular examples from the big screen – for instance, Tom Hanks’ character in Forrest Gump receiving the Medal of Honor from President Lyndon B. Johnson, or the digital recreation of a young Carrie Fisher in 2016’s Rogue One: A Star Wars Story.

More recently, deepfake technology has been used to create fake news and to make it look like high-level officials said things they never actually said.

“For a defense contractor, this is a huge concern. It sets people up to be exploited or blackmailed,” Nixon continued. “If you were kidnapped, they could create a fake video of you as ‘proof’ to demand a ransom. It can also be used in political campaigns. It’s essentially a powerful tool that can cause a lot of harm.”

Some falsified videos are created by AI using Deep Neural Networks (DNNs). These are sets of algorithms that take in pixel colors, backgrounds, and movements to create the variables needed for fabricated images and sounds.

This idea is similar to how Photoshop works – you can use the eyedropper tool to pick up a color from one part of a photo and place it somewhere it did not originate. Deepfakes, however, are a much more advanced version of this concept, and their use continues to grow each day.
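
To make that mechanism concrete, many early face-swap deepfakes relied on a shared-encoder, dual-decoder autoencoder: a single encoder learns the facial structure common to two people, while each decoder learns to render one specific identity. The sketch below, written in PyTorch with illustrative layer sizes that are assumptions rather than any particular production architecture, shows how the swap falls out of that design.

```python
# Minimal sketch of the shared-encoder / dual-decoder autoencoder
# behind early face-swap deepfakes. Layer sizes are illustrative
# assumptions, not a production architecture.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face crop from the shared latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

# One shared encoder learns pose and expression from both people;
# each decoder learns to render one identity. Swapping is just
# encoding person A and decoding with person B's decoder.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

face_a = torch.rand(1, 3, 64, 64)       # stand-in for a real face crop
swapped = decoder_b(encoder(face_a))    # A's expression, B's face
print(swapped.shape)                    # torch.Size([1, 3, 64, 64])
```

Because both faces pass through the same encoder, the latent vector captures pose and expression; decoding it with the other person’s decoder re-renders those expressions onto a different face.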

“Deepfakes match world events,” Nixon said. “If there was a major event or a protest, all of a sudden there are leaked videos that show gunfire on citizens to deter protests. It is the power of persuasion. Whether it’s true or not, the perception is out there.”

This form of deception affects current operations related to IT, the military, and domestic safety on various levels. It also has the potential to destroy reputations and to disrupt economies and education.

“A deepfake could ruin a business with a false video review or by showcasing fake problems,” Nixon said. “We as humans are visual learners, and in psychological war a video often becomes reality – it’s different from dropping propaganda leaflets in the past. Deepfakes broadcast across major news outlets could change the minds of millions of civilians without firing one shot.”

But as manipulated videos get more sophisticated, so do the techniques to identify them.

KBR works on deepfake detection technology to protect against artificial material at every level of the business, Nixon said. “We are always committed to ensuring our supply chains, data, and information are trustworthy.”

KBR has expertise in identifying counterfeit parts and videos, as well as in intelligence collection efforts.

“We have professionals who specialize in imagery and understand to the umpteenth degree what every pixel contains. If there are things that don’t ‘smell right,’ they can quickly identify those as manipulated,” Nixon said.
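
One widely used pixel-level check in image forensics is error level analysis (ELA): regions edited after an image’s original JPEG save tend to recompress differently, so their error level stands out. The snippet below is a minimal sketch of that generic technique using Pillow – an illustration only, not a description of KBR’s proprietary tooling, and the file name is hypothetical.

```python
# Minimal error level analysis (ELA) sketch using Pillow.
# A generic forensic heuristic, not KBR's actual method.
from PIL import Image, ImageChops

def error_level(path, quality=90, tmp="_ela_tmp.jpg"):
    """Return the recompression-difference image and its peak value."""
    original = Image.open(path).convert("RGB")
    original.save(tmp, "JPEG", quality=quality)  # recompress once
    recompressed = Image.open(tmp).convert("RGB")
    # Pixels that recompress unusually show up bright in the difference.
    diff = ImageChops.difference(original, recompressed)
    extrema = diff.getextrema()                  # per-channel (min, max)
    return diff, max(high for _, high in extrema)

# diff_image, score = error_level("suspect_frame.jpg")  # hypothetical file
# A high score concentrated in one region warrants closer inspection.
```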

“As time goes on, KBR will continue to advance its algorithms to scan data faster and detect manipulations more easily,” Nixon continued. “Bring in more data and you can crunch it in a quicker fashion.”

What legacy this technology will leave in the long run is uncertain, but KBR remains committed to countering any threats deepfakes may present.

“Deceiving the enemy to gain a tactical advantage has been around forever, and deepfakes are just a newer tool in the toolkit,” Nixon said. “Fortunately, KBR stays at the forefront of new innovations and solutions for its customers.”

Since the average user does not have the backing of an international tech company like KBR, Nixon stressed the importance of protecting your own personal information on a daily basis.

“So many people download the latest and greatest cool app – some of that information may be going outside of the United States,” he said. “It could be used for devious activities. Protect your identity, which includes your face and voice.”
