
2020 has been a year marked by worldwide crises and changes, including a global pandemic, movements for social justice, nation-shaping elections, and massive wildfires and other environmental events. These crises and changes, and in particular their social dimensions, have been directly shaped by disinformation, influence campaigns, and other efforts to undermine people’s understanding and autonomy.
We will hold a special virtual conference on Social Cybersecurity in Times of Crisis and Change in November 2020. The conference aims to advance the science of social cybersecurity through research and applications that address two questions: How does the unprecedented scale of false information and its spread affect human activity in times of crisis and social change? And what mechanisms enable or contain the spread of false information during these times?
The institute and conference are free of charge, but registration is required.
Conference Highlights:
- The virtual institute will include invited panels, virtual posters, regular talks, and tutorials.
- "Selling Lies" with director Leslie Iwerks
First we will screen the film together, then open the discussion with a Q&A session with director Leslie Iwerks, moderated by IDeaS Co-Director David Danks.
- Special Double Panel on Disinformation, Social Cybersecurity: The Path Ahead
Social cybersecurity is an emerging scientific discipline critical for the future. Moderated by IDeaS Co-Director Kathleen M. Carley, panelists David A. Broniatowski, Conrad Tucker, David Mussington, and Filippo Menczer will discuss current forms of disinformation in our public discourse, survey the area, and illustrate the types of tools and technologies that exist and that are needed in this emergent field. Examples are drawn from diverse events including COVID-19 and the US elections.
- The challenge of deepfakes and manipulated content - Panel and Q&A
This panel moderated by IDeaS Co-Director David Danks will discuss the challenges of manipulated content.
These talks by the CMU IDeaS Knight Fellows demonstrate emerging science in social cybersecurity.
View the videos here (guest password: ideas2020).
Institute & Conference Agenda
**tentative, all times US Eastern
last update Tuesday November 17th 2:05pm
Live and Taped Tutorials are offered
This is a free event
Taped tutorials will be available from 9 am on November 18th through November 21st.
The schedule for live tutorials follows. Additional information on the tutorials is below.
10:20am: Critical Thinking and Resilience - Dr. David Danks
1:00pm: Social Influence Campaigns - Dr. Kathleen M. Carley
3:00pm: Deepfakes and synthetic video - Dr. David Danks
4:30pm: Introduction to Network Analytics for Social Media - Jeff Reminga
- Social Influence Campaigns (live on November 18th)
This tutorial will discuss how social influence campaigns are conducted in social media. The nature of social media and disinformation is reviewed, information maneuvers used to conduct influence campaigns are presented, and technology pipelines for tracking and assessing such campaigns are demonstrated, along with key tools. Examples are drawn from the EuroMaidan Revolution, COVID-19, various elections, and natural disasters. Key topics covered are social cybersecurity, information warfare, social influence, metrics for assessing influence, and misleading indicators.
- Facebook and Reddit Data in ORA (pre-recorded)
This tutorial will discuss how to upload and visualize Reddit data in ORA for analysis. Examples will be shown using Reddit data from current events.
- Networked Time Series Analysis and Clustering (pre-recorded)
In network science, a trail is defined as a time-series of categorical states, and can be used to model many interesting phenomena. Trail clustering, then, is the problem of discovering similar trails in a trail dataset. For example, a trail dataset containing trails of url shares by users on social media may be clustered to find similar users. First, we will cover the necessary background in time series and network analysis. Then, we will cover trail analysis basics and trails comparison methods. From there, we will expand on different challenges faced in trail analysis, including heterogeneous state-types, higher-order states, and highly dynamic data. Finally, we will consider the scalability of the approaches discussed. A case study of social media analysis will be used to demonstrate the discussed trail clustering methods in practice.
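As a rough illustration of the ideas above (not the tutorial's actual methods), the sketch below represents trails as lists of categorical states, compares them with edit distance, and groups them by single-linkage clustering; the url-share data and the distance threshold are made-up assumptions:

```python
from itertools import combinations

def edit_distance(a, b):
    """Levenshtein distance between two trails (sequences of categorical states)."""
    prev = list(range(len(b) + 1))
    for i, sa in enumerate(a, 1):
        curr = [i]
        for j, sb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (sa != sb)))   # substitution
        prev = curr
    return prev[-1]

def cluster_trails(trails, max_dist):
    """Single-linkage clustering via union-find: trails within max_dist edits merge."""
    parent = list(range(len(trails)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j in combinations(range(len(trails)), 2):
        if edit_distance(trails[i], trails[j]) <= max_dist:
            parent[find(i)] = find(j)
    groups = {}
    for i in range(len(trails)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Hypothetical url-share trails for four social media users
trails = [
    ["news.com", "blog.net", "video.tv"],
    ["news.com", "blog.net", "forum.org"],   # one edit away from user 0
    ["memes.io", "memes.io", "memes.io"],
    ["memes.io", "memes.io", "video.tv"],    # one edit away from user 2
]
print(cluster_trails(trails, max_dist=1))    # users 0,1 group together; so do 2,3
```

Real trail clustering must also handle the heterogeneous state-types, higher-order states, and scalability concerns the tutorial raises; this sketch shows only the basic compare-then-group structure.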
- Introduction to ORA (pre-recorded and live session on November 18th)
A lecture and hands-on workshop in which attendees learn about network science and the ORA toolkit. Using ORA, attendees will learn how to import, export, visualize, and assess data, with a focus on processing Twitter, blog, and YouTube data, network analytics for content, and topic group detection. Participants will be presented with a thorough demonstration of software features used to create a sample network and analyze it. Sample data sets will be available.
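ORA itself is a standalone toolkit, but the kind of metric it reports can be sketched in a few lines. The pure-Python example below (the retweet edge list and the choice of degree centrality are illustrative assumptions, not ORA's API) computes a basic "who is most connected" score of the sort such network-analytics tools surface:

```python
from collections import defaultdict

# Hypothetical retweet edges: (retweeter, original_author)
edges = [
    ("alice", "bob"), ("carol", "bob"), ("dave", "bob"),
    ("alice", "carol"), ("bob", "dave"),
]

# Build an undirected adjacency structure from the edge list
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

# Degree centrality: a node's degree divided by (n - 1)
n = len(adj)
centrality = {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}

for node, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {c:.2f}")   # "bob" scores highest: everyone links to him
```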
- Critical Thinking and Resilience (live on November 18th)
As information — true, false, and misleading — spreads at ever-faster rates, it is increasingly important for individuals and communities to be able to think and reason critically about the world around them. This session will examine the cognitive bases of critical thinking and informational resilience, as well as ways to improve both in real-world settings.
- Deepfakes and synthetic video (live on November 18th)
Technological systems can now create highly realistic synthetic images and videos — pictures and movies that are indistinguishable from reality to the human eye. This session will examine current capabilities for both generation and detection of synthetic video, as well as potential responses to misleading or malicious synthetic content.
8:45am - 10:45am: Polarization and Hate Speech
- Social Media, News, Polarization, and Disinformation In Times Of Crisis: A Case Study On Turkey (Baris Kirdemir, Nitin Agarwal)
- War on “Fact Check” -- the Path to Magic 270 (Rodrigue Rizk, Dominick Rizk, Vijay Srinivas Tida, and Sonya Hsu)
- Visualizing Vitriol: Hate Speech and Image Sharing in the 2020 Singaporean Elections (Joshua Uyheng, Lynnette Hui Xian Ng, and Kathleen M. Carley)
- From Xenophobia to Political Confrontation: Shifting Online Discussions of Racism During the COVID-19 Pandemic (Joshua Uyheng, Daniele Bellutta, and Kathleen M. Carley)
- Utilizing Topic Modeling and Social Network Analysis to Identify and Regulate Toxic COVID-19 Behaviors on YouTube (Adewale Obadimu, Tuja Khaund, MaryEtta Morris, Esther Mead, and Nitin Agarwal)
10:45am - 11:00am: Break
11:00am - 12:10pm: Emerging Technologies and Techniques 1
12:10pm - 12:40pm: Lunch Break
12:40pm - 1:50pm: Emerging Technologies and Techniques 2
1:50pm - 2:00pm: Break
2:00pm - 3:00pm: "Selling Lies" Panel
In this session, moderated by IDeaS Co-Director David Danks, we will screen the 30-minute feature "Selling Lies" and hold a Q&A with director Leslie Iwerks.
3:00pm - 3:15pm: Break
3:15pm - 5:00pm: Special Double Panel on Disinformation - Part 1
- David A. Broniatowski (George Washington University)
Can Communicating the Gist Combat Systematic Manipulation of the Online Vaccine Discourse?
In this talk, I will provide an overview of the different ways in which malicious actors manipulate the vaccine debate online in order to advance a range of hidden agendas. Specific attention will be given to the role of domestic organizations framing vaccine refusal as a civil right, and of foreign organizations that have promoted vaccine and COVID-19 disinformation for geopolitical ends. These manipulations occur in an environment of deep scientific uncertainty fueled by the ongoing COVID-19 pandemic. I conclude with evidence testing “Fuzzy-Trace Theory”, a leading psychological theory of decision under deep uncertainty that provides guidance for public health communicators who wish to combat these information operations and promote healthy behaviors.
- Conrad Tucker (Carnegie Mellon University)
From Generative Neural Networks to Social Media Networks: Ascertaining the Veracity of Data in the Information Age
Ascertaining the veracity of data in the information age is a challenge both for humans (e.g., communicating within social media networks) and machines (e.g., training data for artificial neural networks). A lack of data veracity has the potential to “fool” both machines and humans into producing different outcomes/output. From a machine learning perspective, “fooling” a machine has had a positive impact on the development of algorithms such as generative adversarial networks (GANs), and has resulted in the ability of machines to generate hyper-realistic data such as images, 3D geometries, and text. However, adverse effects can be observed in large-scale social media networks, where the veracity of data cannot be quickly ascertained. Misinformation that is spread via social media networks can result in echo chambers: insular communities that facilitate selective content diffusion as a result of user polarization. Ironically, this misinformation can now be reliably generated using machine learning algorithms such as GANs. Our research focuses on developing methods to both generate high quality data and safeguard against data exploitation. Several application domains are explored, including product design and development, healthcare physiology state estimation, and STEM education.
Examining the mechanics of the online QAnon movement and its evolution
MITRE mapped the QAnon online and social media space to detect related conspiracy theories and changing tradecraft. Omelas tracked QAnon propagation on social networks after the so-called bans on Twitter, YouTube, and Facebook, and found that these networks still play key roles in the dissemination of Q content.
5:00pm - 5:15pm: Break
5:15pm - 6:25pm: ReOpen America
8:45am - 9:55am: COVID Conspiracies and Disinformation 1
9:55am - 10:05am: Break
10:05am - 11:35am: Interventions
11:35am - 12:15pm: Lunch Break
12:15pm - 2:00pm: Deepfakes Panel “The challenge of deepfakes and manipulated content”
- Sam Gregory (WITNESS)
- Claire Leibowicz (Partnership on AI)
- Steven Tiell (Accenture Labs)
2:00pm - 2:15pm: Break
2:15pm - 4:00pm: Special Double Panel on Disinformation - Part 2
- David A. Mussington (University of Maryland)
A Framework for Evaluating Disinformation Harms from Social Media
Policy for managing harms from dis- and mis-information on social media must focus on the potential for deleterious effects rather than on the platforms themselves. This "platform agnostic" approach means that the policies permitting "current" platform operations, not the business operations of the social media providers, should be the focus of possible remedial measures. Because these platforms operate with the advantages of private businesses yet enjoy broad regulatory exemptions justified by business-formation rather than social-value rationales, a new regime is needed for interacting with proto-monopolists occupying the space between "bare communications connectivity" and digital services providers. This presentation offers such a framework and lists some potential metrics requirements that flow from it.
- Filippo Menczer (Indiana University)
4 Reasons Why Social Media Make Us Vulnerable to Manipulation
As social media become major channels for the diffusion of news and information, it becomes critical to understand how the complex interplay between cognitive, social, and algorithmic biases triggered by our reliance on online social networks makes us vulnerable to manipulation and disinformation. This talk overviews ongoing network analytics, modeling, and machine learning efforts to study the viral spread of misinformation and to develop tools for countering the online manipulation of opinions.
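A toy version of the kind of spread model referenced above can be sketched as an independent-cascade simulation; the follower graph, spread probability, and seed choice below are illustrative assumptions, not the speaker's actual models:

```python
import random

def simulate_cascade(adj, seeds, p, rng):
    """Independent-cascade model: each newly 'infected' node gets one
    chance to pass a piece of (mis)information to each follower with
    probability p."""
    infected = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj.get(u, []):
                if v not in infected and rng.random() < p:
                    infected.add(v)
                    nxt.append(v)
        frontier = nxt
    return infected

# Hypothetical follower graph: edges point from poster to audience
adj = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [5]}

rng = random.Random(42)
sizes = [len(simulate_cascade(adj, {0}, 0.5, rng)) for _ in range(1000)]
print(sum(sizes) / len(sizes))  # average cascade size seeded at node 0
```

Varying p, the seed set, or the graph topology in such a simulation is one simple way to explore which structural conditions let misinformation go viral.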
- Kathleen M. Carley (IDeaS Center, Carnegie Mellon University)
Social cybersecurity is an emerging scientific area focused on the science to characterize, understand, and forecast cyber-mediated changes in human behavior and in social, cultural, and political outcomes, and to build the cyber-infrastructure needed for society to persist in its essential character in a cyber-mediated information environment under changing conditions and actual or imminent social cyber-threats. An example is the technology and theory needed to assess, predict, and mitigate social influence manipulation and the spread of disinformation by inauthentic actors such as bots, cyborgs, and trolls. Social cybersecurity is a computational social science in which the socio-political context is taken into account, advanced smart technologies operate alongside humans, and the operational utility of theory and methods is prized. Given the massive and ongoing changes in human communication and the changing affordances of available technologies, this is an area where it is critical to think beyond the boundaries of individual disciplines and to move to transdisciplinary theories and empirical research.
4:00pm - 4:10pm Break
4:10pm - 5:15pm COVID Conspiracies and Disinformation 2
5:15pm - 5:25pm Break
5:25pm - 6:35pm: Protests