In addition to covert cyberattacks, the Russian-backed hacker groups Fancy Bear and Cozy Bear engage in disinformation campaigns, otherwise known as "fake news," according to international cybersecurity authority Jeremy Samide, CEO of Stealthcare, the company that changed cybersecurity from defense to offense with its Zero Day Live threat assessment platform.
According to Samide, “Not only do these Russian state-supported hacker groups engage in traditional hacking at the highest levels, their disinformation campaigns span numerous news and social media websites. They will often use older domains with legacy staying power to give them reputable ratings needed to permeate the Internet with disinformation, and automated bots for distribution.”
Emmy Award-winning journalist Cameron Hickey, whose work has appeared on NOVA and PBS, has been researching fake news at Harvard University under a grant from the Knight Foundation. Hickey began tracking sources and found "almost immediately that the scale and diversity of misinformation was much larger than we had expected."
At a recent Boston Global Forum symposium on cybersecurity, Hickey drew the distinction between misinformation, which is plainly false or misleading, and disinformation, which is false and spread with a sinister purpose.
To combat fake news, Stealthcare collects massive amounts of structured and unstructured data from the dark corners of the internet where much disinformation is born. That content then mutates into different versions across dark- and deep-web sites before surfacing on news sites, social media, and propaganda websites. Samide says, "It's like a really bad, amplified game of 'telephone,' where the stories gain momentum and hype and are manipulated as they surface on the web."
Samide notes, "Zero Day Live identifies and tracks state-supported websites through correlation analytics and machine learning to identify disinformation outlets such as fake popup media, and provides clients with timely, contextual and actionable intelligence to defend against these campaigns."
Hickey warns people who are surfing the web or on Facebook to be wary of such propaganda tricks as false connections, imposter content, manipulated content and even satire, which can be taken at face value. "Other signals that content may be junk include clickbait headlines, fake or missing bylines, low-quality ads, an inflammatory tone, and hate speech."
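Some of the signals Hickey lists can be checked mechanically. The following is a toy illustration only, not any tool Hickey or Stealthcare actually uses; the pattern lists, word lists, and function names are hypothetical and far cruder than real fact-checking systems.

```python
import re

# Hypothetical clickbait phrases and inflammatory words for illustration.
CLICKBAIT_PATTERNS = [
    r"you won'?t believe",
    r"what happens next",
    r"\bshocking\b",
    r"\bthis one trick\b",
]
INFLAMMATORY_WORDS = {"outrage", "destroyed", "slams", "traitor"}

def junk_signals(headline: str, byline: str) -> list[str]:
    """Return heuristic warning signals for an article's headline and byline."""
    signals = []
    text = headline.lower()
    # Signal 1: clickbait phrasing in the headline.
    if any(re.search(p, text) for p in CLICKBAIT_PATTERNS):
        signals.append("clickbait headline")
    # Signal 2: missing byline.
    if not byline.strip():
        signals.append("missing byline")
    # Signal 3: inflammatory tone.
    if INFLAMMATORY_WORDS & set(re.findall(r"[a-z']+", text)):
        signals.append("inflammatory tone")
    return signals

print(junk_signals("You won't believe what happens next", ""))
# → ['clickbait headline', 'missing byline']
```

A checker like this only flags surface features; it says nothing about whether the underlying claims are true, which is why Hickey's other signals (imposter content, manipulated content) still require human judgment.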
Says Samide, “With the exponential proliferation of fake news and fake media outlets, Facebook and Google have an obligation to be more aggressive on this issue and not use fake news outlets or erroneous information as a way to profit from clicks and advertising.”
Like most individuals, Samide finds the process of removing disparaging remarks and abusive, derogatory content from these sites extremely difficult. "In many cases, Google and Facebook pass the buck and deflect the majority of requests to remove this content by citing the First Amendment. These organizations have been negligent in protecting consumer privacy and have an obligation to combat the disinformation epidemic."
Meanwhile, emerging technologies in machine learning are helping to identify patterns, correlations and other language nuances to pinpoint specific illicit actors. "Stealthcare's Zero Day Live, for example, uses its proprietary machine learning capability and analytics along with natural language processing (NLP) to observe trends and identify fake news perpetrators. This technology looks beyond the horizon to identify these threats before they are born and used as propaganda warfare," Samide concludes.
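To make the general idea concrete: one of the simplest NLP techniques for sorting text into categories is a bag-of-words Naive Bayes classifier. The sketch below is a minimal, generic illustration of that technique, with invented training examples and labels; it is not Stealthcare's proprietary method, which the article does not describe in detail.

```python
import math
from collections import Counter

# Invented toy training data: label short texts as "reliable" or "junk".
TRAIN = [
    ("scientists publish peer reviewed study on climate data", "reliable"),
    ("officials release quarterly economic report", "reliable"),
    ("shocking secret they do not want you to know", "junk"),
    ("miracle cure doctors hate revealed", "junk"),
]

def train(examples):
    """Count words per label and examples per label."""
    word_counts = {"reliable": Counter(), "junk": Counter()}
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Pick the label with the highest log-probability score."""
    vocab = {w for counts in word_counts.values() for w in counts}
    best_label, best_score = None, float("-inf")
    for label, counts in word_counts.items():
        total = sum(counts.values())
        # Log prior plus log likelihoods with add-one smoothing.
        score = math.log(label_counts[label] / sum(label_counts.values()))
        for word in text.split():
            score += math.log((counts[word] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

wc, lc = train(TRAIN)
print(classify("shocking miracle revealed", wc, lc))
# → junk
```

With four training sentences this is a toy; production systems of the kind Samide describes train on vastly larger corpora and combine such text features with network signals like domain age and bot-driven distribution.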