Delta Stream Team Summaries

NAK Associates - William & Mary

Within the NATO Communications and Information Agency, a new task force called the Disinformation Task Force will be created. This group will develop a simple, user-friendly game that teaches social media users how to identify misinformation. The game will be incorporated into social media registration processes and made a requirement for existing users, and it will be periodically updated with special event prizes during peak periods of disinformation spread. As users grow more accustomed to identifying disinformation, the developers will create more difficult stages, and players will have the opportunity to appear on a network-wide leaderboard indicating their prowess in identifying disinformation. This leaderboard may be useful to NATO, as it provides a list of informed citizens. The approach also appeases private companies by giving them a visible way to show their concern while avoiding changes to the algorithms that made these sites popular.
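The leaderboard mechanic described above could be sketched roughly as follows; the class, the scoring rule (harder stages award more points), and all names are illustrative assumptions, not part of the team's specification.

```python
from dataclasses import dataclass, field

@dataclass
class Leaderboard:
    """Tracks per-user scores for correctly identified disinformation posts."""
    scores: dict = field(default_factory=dict)

    def record_identification(self, user: str, stage_difficulty: int) -> None:
        # Harder stages award more points, encouraging progression.
        self.scores[user] = self.scores.get(user, 0) + stage_difficulty

    def top(self, n: int = 10) -> list:
        # Highest-scoring "informed citizens" first.
        return sorted(self.scores.items(), key=lambda kv: -kv[1])[:n]

board = Leaderboard()
board.record_identification("alice", 3)
board.record_identification("bob", 1)
board.record_identification("alice", 2)
print(board.top())  # alice ranks first with 5 points
```

A ranking like this is what would surface the "informed citizens" the summary mentions.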
Keele Squirrels - Keele University |
|
This proposal outlines a comprehensive media programme that raises awareness and provides information to combat disinformation, distrust and disconnection by encouraging legitimate participation. By publishing legitimate information and collaborating with social media industry leaders, domestic governments and local leaders, the programme raises awareness and consciousness of disinformation and cybersecurity, safeguarding truth, democracy and freedom. Through the publication and spread of legitimate information to vulnerable communities across NATO and beyond, it fosters safe social spaces for political discourse free from cyber threats and disinformation. Through advertisement, local ambassadors and a media presence, NATOADAC strives against anti-democratic forces across cyberspace.
|
Canadian Computer Scientists - University of Calgary |
|
Our solution is to create a NATO Media Analysis Centre that scrapes publicly available social media posts for storage and analysis. The centre will run an analysis program that evaluates influence, engagement and shares to identify the most influential posts and users. Human analysts will then sort through these posts to identify misinformation and disinformation. Their findings will be fed back into the system to track the offending users and trigger an alert when their posts reach a sufficient level of shares or influence, at which point NATO will craft messages to counter the disinformation. NATO must then make this analysis publicly available and work closely with local media sources in member countries to share the story of the disinformation alongside the corrected information. NATO's media relations office must be more than an organization that responds to the media; it must proactively engage them, specifically on disinformation.
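The alert step described above could be sketched as follows. The field names, scoring weights and threshold are illustrative assumptions; the summary does not specify how engagement is scored.

```python
# Hypothetical alert logic: posts from previously flagged users trigger an
# alert once their engagement score crosses a threshold.

FLAGGED_USERS = {"user42"}   # users human analysts previously tied to disinformation
ALERT_THRESHOLD = 1000       # engagement score that triggers a counter-message

def engagement_score(post: dict) -> int:
    # Shares spread content furthest, so they are weighted most heavily here.
    return post["shares"] * 3 + post["likes"] + post["comments"] * 2

def needs_alert(post: dict) -> bool:
    return (post["author"] in FLAGGED_USERS
            and engagement_score(post) >= ALERT_THRESHOLD)

post = {"author": "user42", "shares": 300, "likes": 150, "comments": 40}
print(needs_alert(post))  # True: 300*3 + 150 + 40*2 = 1130 >= 1000
```

Note that a highly engaged post from an unflagged author would not trigger an alert; the human-review step feeds the flagged list.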
|
Hamilton College Continentals - Hamilton College |
|
A lack of digital literacy makes people particularly vulnerable to disinformation. Fact-checking, content removal and damage control cannot address the full volume of disinformation in the information ecosystem. A two-pronged, grassroots-driven approach consisting of disinformation training and a user-generated ad repository can mitigate the problems posed by disinformation at low cost. Disinformation training will increase users' knowledge and awareness of disinformation so they are better able to identify it and less likely to fall victim to it. Once they have completed training, users can submit screenshots and other relevant information about advertisements they see on social media. This creates a repository of advertisements, bursting the filter bubble and providing NATO and social media companies with situational awareness of the size, scale and scope of disinformation. User participation is incentivized through a token-based reward system.
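The repository-plus-token mechanic could be sketched as below; the record fields and the flat per-submission reward are illustrative assumptions.

```python
# Minimal sketch of a user-generated ad repository with token rewards.

TOKENS_PER_SUBMISSION = 5  # assumed flat reward per submitted ad

class AdRepository:
    def __init__(self):
        self.ads = []      # submitted ad records
        self.tokens = {}   # user -> token balance

    def submit(self, user: str, screenshot_ref: str, platform: str) -> int:
        # Store the ad record for NATO / platform situational awareness.
        self.ads.append({"user": user, "screenshot": screenshot_ref,
                         "platform": platform})
        # Incentivize participation with a token reward per submission.
        self.tokens[user] = self.tokens.get(user, 0) + TOKENS_PER_SUBMISSION
        return self.tokens[user]

repo = AdRepository()
repo.submit("carol", "img_001.png", "ExampleSocial")
repo.submit("carol", "img_002.png", "ExampleSocial")
print(repo.tokens["carol"])  # 10
```

A production version would also deduplicate submissions and verify that the submitter completed training first.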
|
Baylor Cybears - Baylor University |
|
Private social media companies such as Facebook and Twitter have a political disinformation problem and are looking for a solution, yet may be reluctant to allow governments to manage their platforms. NATO can act as a third-party authority with the human resources necessary to combat social media disinformation. NATO could independently fact-check posts and accounts suspected of spreading disinformation. This would allow information to be verified by an unbiased third party that users can trust, reducing the perception of biased fact-checking. NATO could then label posts as "Suspected Disinformation", "Verified by NATO", or "Unchecked by NATO." Labeling disinformation without censoring it is important for transparency. Furthermore, NATO could contact individuals who engage with a certain threshold of content labeled as disinformation, providing background on the political misinformation in question. Finally, NATO could introduce anti-disinformation bots that flood channels of disinformation with information about the background of the political disinformation.
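The three-way labeling scheme and the engagement-threshold outreach rule could be sketched as follows; the label strings come from the summary, while the threshold value and function names are illustrative assumptions.

```python
from enum import Enum

class Label(Enum):
    SUSPECTED_DISINFORMATION = "Suspected Disinformation"
    VERIFIED_BY_NATO = "Verified by NATO"
    UNCHECKED = "Unchecked by NATO"

OUTREACH_THRESHOLD = 3  # assumed: labeled-disinformation engagements before contact

def should_contact(engagements_with_disinfo: int) -> bool:
    # Users who repeatedly engage with labeled disinformation receive
    # background information rather than censorship.
    return engagements_with_disinfo >= OUTREACH_THRESHOLD

print(Label.SUSPECTED_DISINFORMATION.value)  # Suspected Disinformation
print(should_contact(4))                     # True
```

Keeping "Unchecked by NATO" as an explicit state, rather than an absent label, is what makes the non-censorship stance visible to users.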
|
LPR - George Mason University |
|
Team LPR has designed a versatile solution that uses Artificial Intelligence algorithms and blockchain to enhance a safe cyberspace for everyone. We will implement our machine learning fact- and opinion-based classification algorithm, computer vision using Optical Character Recognition, and a blockchain for re-verification of the data. By keeping the model agile and building in scalability, the project can be delivered within a twelve-month timeframe. Working with NATO ACT, we would engage the government sector directly and pitch these ideas to the companies. Our outputs and benefits are framed around the cost of inaction: we are raising awareness, re-establishing trust in social media platforms, and helping people reflect on their opinions, which will improve people's mental health, to the advantage of both society and government.
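One plausible reading of "blockchain for re-verification of the data" is a tamper-evident hash chain over fact-check results, sketched below. The record structure and all names are illustrative assumptions, not the team's actual design.

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    # Hash the record together with the previous hash, chaining the entries.
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class VerificationChain:
    def __init__(self):
        self.chain = []  # list of (record, hash) pairs

    def append(self, record: dict) -> str:
        prev = self.chain[-1][1] if self.chain else "genesis"
        h = block_hash(record, prev)
        self.chain.append((record, h))
        return h

    def verify(self) -> bool:
        # Re-derive every hash; any altered record breaks the chain.
        prev = "genesis"
        for record, h in self.chain:
            if block_hash(record, prev) != h:
                return False
            prev = h
        return True

chain = VerificationChain()
chain.append({"claim": "example claim", "verdict": "false"})
chain.append({"claim": "another claim", "verdict": "true"})
print(chain.verify())  # True
```

The point of the chaining is that a verdict cannot be quietly rewritten later, which supports the trust-rebuilding goal the summary emphasizes.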
|
St Andrews Office of External Affairs - University of St Andrews |
|
The action plan aims to enhance the situational awareness and readiness of NATO's efforts to combat disinformation and election interference. Although cyberspace has been recognized as one of the key domains of NATO's structure, it is currently not treated as equally important to the other three: land, maritime and air. At the moment, it is the responsibility of the targeted state itself to combat any cyber threat, rather than the threat being treated as a collective security issue, despite being classified as such. The NATO Industry Cyber Partnership is an existing framework that could be used to incorporate key companies such as Meta by incentivising their cooperation. It could also be used to extend the functions of the Cyber Rapid Reaction teams so that they act on a regular basis rather than only on request. Ethnically targeted disinformation in the run-up to key events could thus be combated through targeted information campaigns.