Deep Thoughts on Deep Fakes
I recently attended a conference for agency owners that included a great session on ethical considerations for today’s communicators. The session made me think about “deep fakes” in a new light, and I’ve become concerned about the risks they pose to corporations, leaders, brands and individuals.
A deep fake is manipulated media (video or audio) crafted to look and sound real, making it appear that a person said or did something they never actually did. Think “bad lip syncing,” but executed with such perfection that the casual observer truly believes the president said he’s going to run away with Britney Spears tonight at midnight. While deep fakes can be used for harmless purposes like entertainment or humor, they can also be misused to spread misinformation, commit fraud and defamation, disrupt the economy and manipulate public opinion. Yikes!
Companies and brands should be concerned about deep fakes and consider them in crisis communication planning for a few reasons:
- Reputational damage: Deep fakes have the potential to cause significant harm to corporations and their brands. Bad actors could fabricate footage of corporate executives or spokespersons spreading false information or making damaging statements, undermining the company’s brand, trustworthiness and stakeholder relationships.
- Misinformation and trust erosion: Deep fakes can be used to disseminate misinformation or fake news, sowing confusion among the public, customers, voters or investors. When the authenticity of audio or video content is compromised, it becomes challenging for people to discern what is real and what is manipulated, and that erosion of trust chips away at credibility.
- Financial fraud and scams: Deep fakes can be used to perpetrate financial fraud or scams targeting corporations. For example, fraudsters can create convincing deep fake videos or audio recordings that impersonate high-level executives to deceive employees, customers or partners into authorizing fraudulent transactions or sharing sensitive information. I have experienced this firsthand: phishing emails spoofing my commission email address were sent to employees, urging them to take action.
- Legal and regulatory implications: The rise of deep fakes raises legal and regulatory concerns that could lead to lawsuits, damage claims or regulatory investigations. Leadership, legal and communications teams should have these conversations now. Don’t wait until you’re in the heat of a crisis to decide how this would be handled!
- Customer and investor confidence: If shareholders perceive that a company is unable to protect its brand, executives or sensitive information from manipulation, they may hesitate to engage or invest, or they may pull their investments in a hurry, causing the stock to plummet. Either scenario carries serious financial repercussions.
In the event of a deep fake targeting your organization or key figures, effective crisis communication becomes crucial: your organization must be prepared to address and counteract the fake promptly to mitigate the damage and safeguard your brand reputation.
Here are a few ways we recommend beginning to address these threats:
- Update your crisis communication plan. Identify new vulnerabilities posed by today’s digital threats. (Think about cybersecurity issues, too.)
- Discuss these newer vulnerabilities with decision makers, first responders and spokespersons during “normal times” so you can respond appropriately when stress is heightened.
- Train your team on your updated crisis communication plan so that everyone is on the same page.
- Involve leaders, department heads, operations and communicators in these conversations now so you can minimize surprises later.
If you need help guiding this conversation, updating your plan or coaching your leaders on how to respond, I’m happy to help. Even if you’re equipped to handle it in-house, I hope this article gave you the push to start prioritizing and facilitating these crucial conversations!