Cynthia Cheng Correia was the presenter of this session. She is an instructor in the SLA Competitive & Decision Intelligence Certificates Program as well as president of the Council of Competitive Intelligence Fellows.
There was a special focus on disinformation.
The session started with a poll asking whether we have encountered disinformation in our research at work, followed by a second question about whether we are concerned about disinformation in our research. I am not sure why we wouldn’t be concerned?
Misinformation can have unintentional yet harmful effects. Disinformation, by contrast, is made up of fabrications and distortions: it is deliberate, often intended to cause harm, frequently carries commercial intent, and it is growing. Manipulated media is a related category pulled out from disinformation, with the same deliberate, often harmful, and commercially motivated character.
Manipulated media and disinformation intersect, and that intersection represents new approaches and growing threats. New players are employing strategic, systematic, intelligent, tech-enabled targeting and manipulation. These players include individuals, fringe and interest groups, some businesses, and state-sponsored actors. Dissemination and propagation are rapid; content may spread widely before anyone even notices something is happening.
Boosting the signal is very important to these players so that mis- and disinformation gets disseminated. A lot of fake accounts are created to amplify it.
Content farms are the first category of players that disseminate this information. Soul Publishing is a Russian company that operates out of Cyprus. It produces a huge volume of 5-Minute Crafts videos, but when you subscribe, you get recommendations for other Soul Publishing videos, which may be more impactful in terms of mis- and disinformation. Disinformation websites generate $235 million of ad revenue annually.
Pseudo news sites are the second type. They create specific sites for various audiences and hire freelance journalists so their mis- and disinformation is harder to detect.
Manipulated media has enabled successful audio attacks, such as deepfaked voices convincing CFOs to transfer money to fraudulent accounts. AI is also boosting the quality of deepfakes, which become increasingly difficult to detect as the technology gets more sophisticated.
Fake social media profiles are very common, and everyday people have probably been targeted. Photos are stock, co-opted, or GAN-generated. There are also telltale patterns and irregularities, such as writing style, content, and naming conventions.
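As a small illustration of the "naming conventions" signal, here is a toy heuristic of my own (not from the session) for flagging handles that look auto-generated, like a name followed by a long run of digits. Real detection combines many weak signals, so treat this as a sketch only:

```python
import re

# Toy heuristic: many bot networks auto-generate handles as
# lowercase letters followed by a long run of digits.
# Illustrative only; a match is a weak signal, not proof.
AUTOGEN_PATTERN = re.compile(r"^[a-z]+[0-9]{6,}$")

def looks_autogenerated(handle: str) -> bool:
    """Return True if a handle matches a common bot naming pattern."""
    return bool(AUTOGEN_PATTERN.match(handle.lower()))

handles = ["maria_lopez", "johnsmith48291034", "newsfan2020"]
flagged = [h for h in handles if looks_autogenerated(h)]
print(flagged)  # only the handle with a long digit run is flagged
```

In practice this would be one column in a wider profile review alongside photo checks and writing-style comparison.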
These tactics are escalating. Servers are moving to the US to circumvent those kinds of identification. Operators snare legitimate authors to bring readership to their sites, and the sites also target users for dissemination.
This is not just a US phenomenon, especially with elections, but is happening in Europe and Asia as well.
Compromised and taken-over accounts are another vector; established domain names, for example, are taken over when they expire.
Attackers are moving from a tactical approach to a strategic one, including cognitive and behavioral analysis, and those insights are then exploited.
Gray-zone or proxy conflict: commercial interests are easier targets, and economic espionage is one motive. For example, disinformation about the safety of cars can affect automobile makers.
There are three types of impact: the macro, operating, and micro environments. The micro environment targets reputation and brand, financial assets, knowledge and intellectual assets, and valuation within the industry. In the operating environment, competition, innovation, and industry structure can be affected. The macro environment is affected through social and market stability, national and regional competitiveness, and national and regional security.
For research and intelligence professionals and their departments, the impacts show up across the analyses they produce:
- Macro Environment
- issue analysis
- STEEP/PESTEL analysis
- Scenario analysis/Planning
- War Games
- Operating Environment
- Porter’s 5 Forces Analysis
- Customer/Market Analysis
- Supply Chain Analysis
- Benchmarking
- Micro Environment
- Competitor Analysis
- Management Analysis
- Technical Analysis
The key to all of the above is situational awareness.
Effects
Distrust is generated, analytical quality and capabilities are diminished, and crime and exploitation increase. These can lead to decision and planning errors because of distrust of information and analysis. In turn, researchers and analysts themselves can be distrusted, and distrust in institutions such as the media, government, and social institutions grows. Reputations are diminished or tarnished (this makes me think of when Tylenol was poisoned and how the company’s reputation was affected). Inefficiencies such as ‘noise’, overload, and obfuscation take more time, effort, resources, and skills to combat.
What to Do
We need to understand the sources of mis- and disinformation. They come from state-sponsored organizations as well as corporations, hive groups, general mischief makers, and the public (uninformed?).
- We need to systematically educate, but also create awareness. With the availability of search engines and the Internet, everyone is gathering their own information.
- Direct combat – calling out specific content and information as disinformation. This is difficult because there is so much of it. Software can address the issue at the point of research.
- Tracking bad sites and alerting people
- Pattern analysis
- Mapping and noting ‘hot spots’
- Point of access tools
- Cycle interruptions
- Profiling
- Relying on expert human intelligence – talk to people who are in the know on specific areas.
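One concrete form of the pattern analysis mentioned above is spotting near-duplicate text recycled across supposedly independent sites, a hallmark of content farms. A minimal sketch using Python's standard library (my own illustration, not a tool from the session):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough similarity ratio between two article texts (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Hypothetical article snippets for illustration.
original = "Scientists confirm the new model is safe in all crash tests."
recycled = "Scientists confirm the new model is safe in all crash tests!"
unrelated = "Local bakery wins award for its sourdough bread this weekend."

# A high ratio across different outlets is a red flag worth
# investigating, not proof of disinformation on its own.
print(round(similarity(original, recycled), 2))
print(round(similarity(original, unrelated), 2))
```

At scale, researchers would use shingling or perceptual hashing rather than pairwise comparison, but the underlying idea is the same.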
What you choose has to be based on the type of sources, content and subjects.
Level 1: Awareness and Training
- Types of disinformation actors & propagators
- understand the key players and sponsors
- understand the intentions and drivers
- understand the strategies
- understand the tactics
- understand the channels
- Information/media players, channels, cycles
- detect relationship, influences, anomalies
- understand potential vulnerabilities throughout cycles and pipelines
- track information origins and flow
- detect signs of ‘trickery’
Level 2: Sleuthing & Technology
- keep on top of developments (check out Graphika. Snopes can be used for stories shared on Facebook as well, but may not be suitable for everything)
- Compare content
- reverse text & image searches
- know platforms: Gab & Parler
- Check profile “likes” – agents want to create networks. The more a piece of information is liked, the more it becomes ‘true’
- Hijacked or repurposed websites
- Analysis tools & Services
- visual/audio analysis
- text content analysis
- metadata analysis
- mapping
- Experts
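To illustrate the "metadata analysis" idea at its simplest level, hashing downloaded image files reveals when the exact same photo is reused across many profiles. This is a sketch of my own with stand-in byte strings; real workflows add perceptual hashing and EXIF inspection to catch re-encoded copies:

```python
import hashlib
from collections import defaultdict

def file_digest(data: bytes) -> str:
    """SHA-256 digest identifies byte-identical files."""
    return hashlib.sha256(data).hexdigest()

# Stand-in byte strings for downloaded profile photos (hypothetical data).
profile_photos = {
    "user_a": b"...jpeg bytes A...",
    "user_b": b"...jpeg bytes A...",  # same photo reused
    "user_c": b"...jpeg bytes C...",
}

# Group accounts by the digest of their profile photo.
by_digest = defaultdict(list)
for user, photo in profile_photos.items():
    by_digest[file_digest(photo)].append(user)

# Any digest shared by multiple accounts is a reuse signal.
reused = {d: users for d, users in by_digest.items() if len(users) > 1}
print(reused)
```

Exact-byte matching only catches the laziest reuse; it is a starting point before the heavier visual-analysis tools and services the session mentioned.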
Level 3: Disinformation Protection Plan
- Assess vulnerabilities
- are there activities, functions or individuals who are especially vulnerable?
- are there processes and workflows which are vulnerable
- what about active/passive targeting?
- Can you find signals and indicators?
- Outline
- Map
- Prioritize
- it isn’t really possible to do everything.
Growing Activities >> Growing Research
Fortunately, academia is setting up centers and supporting researchers to start to understand this growing threat. The Global Disinformation Index is one such tool being created; it provides advertisers, ad tech companies, and platforms with trusted, non-partisan, and independent ratings to assess a site’s disinformation risk.
Commercial entities and government agencies are getting into the game as well.
Media organizations, such as the Poynter Institute, are helping with journalism and media literacy. The Poynter Institute for Media Studies is a non-profit journalism school and research organization in St. Petersburg, Florida; it owns the Tampa Bay Times newspaper and the International Fact-Checking Network. Think tanks like the RAND Corporation offer a variety of products and tools to combat disinformation, including its Countering Truth Decay initiative. RAND’s mission is to help improve policy and decision making through research and analysis, with quality and objectivity as its core values. R&D institutions such as MITRE and First Draft are also in the picture.
Maintain vigilance and work on this as a group within SLA.
One resource shared was COMPROP, the Computational Propaganda Research Project at Oxford.
I felt fairly hopeless after listening to this session. I don’t know how I can combat mis- and disinformation in a small information center. Cynthia was a good presenter and, clearly, there was a lot more to share and know. I would love to bring some of these tools and strategies to my clients, but KM is hard enough without adding this layer of tools to the mix.