Critical Analysis of News Propagation Online: TwitterTrails Classroom Activity

We developed an educational activity for our Introduction to Media Arts and Sciences course “Computing for the Socio-Techno Web” to get students to think critically about the truthfulness of news propagated on social media. The activity uses TwitterTrails, a visual tool for analyzing claims, events, and memes on Twitter. The tool provides several views of a story, such as a propagation graph of its bursting activity and a co-retweeted network. Using a response and reflection form, students are guided through these different facets of a story.

We hope that other educators will further improve and use this activity with their own students.

Learning Goals

We defined the following learning goals for our in-class activity. In particular, we expect that following participation in the activity, students will be able to:

  1.  Understand the concepts of rumor spreading, including the extent to which and the mechanisms by which stories propagate over social media.
  2.  Read and interpret visualizations that describe the propagation of information over time.
  3.  Conduct an evidence-based inquiry into the reliability of online information, employing a set of questions to examine who is spreading a story and when and how it was propagated.
  4.  Identify indicators and characteristics that affect reliability, including polarization, echo chambers, timing, and Twitter bots.

Activity materials

  • We presented the students with a short 11-minute demo of how to use TwitterTrails, and we recorded the presentation in a video posted on YouTube.
  • Groups of 3-4 students randomly selected two of 12 stories and completed the Activity form.
  • At the end, each student completed a Reflection form on their experience investigating the stories.

Feel free to contact us; we would love to hear from you if you want to use our activity materials!

 

Evidence of pizzagate conspiracy theory on TwitterTrails

Many things have been written about the infamous #pizzagate conspiracy theory (“scandal” for those who believed in it) and, we are sure, much more will be written in the future. The fact that the outrageous and sick imagination of a few online trolls was able to persuade thousands of people that it was real, and to motivate one of them to walk into the Comet Ping Pong pizzeria with a loaded gun, will be a subject of study for many psychologists, sociologists, and political scientists. Given that a workshop on Digital Misinformation will be held in Montreal in a few days, we thought this would be a good time to share some notes from a TwitterTrails story we did.

On December 2, 2016, we ran a TwitterTrails investigation collecting Twitter data that contained the hashtag #pizzagate, and we present here a few interesting observations about how it spread, who used it first, and what the shape of the community engaged in spreading it looked like up to that time. Below are some of our findings. You can always explore the TwitterTrails story on your own.

Who used the hashtag #pizzagate first?

According to our data, the first mention of #pizzagate was at 8:34 AM GMT on Nov. 6, 2016, two days before the US elections. While the vast majority of people in the US were sleeping, the tweet was sent by a troll account that had promoted tens of thousands of provocative lies to its roughly 2,000 followers. Most of those followers are almost certainly bots designed to infiltrate online groups willing to believe them, in this case Trump supporters.
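For readers who want to try a similar check on their own data, here is a minimal sketch of how one might find the earliest mention of a hashtag in a collection of tweets. It assumes the tweets are stored locally as Twitter v1.1-style JSON objects with created_at, text, and user fields; the file name pizzagate_tweets.json is hypothetical.

```python
import json
from datetime import datetime

def first_mention(tweets, hashtag):
    """Return the earliest tweet whose text contains the hashtag (case-insensitive)."""
    matching = [t for t in tweets if hashtag.lower() in t["text"].lower()]
    # created_at looks like "Sun Nov 06 08:34:00 +0000 2016" in v1.1-style JSON
    return min(
        matching,
        key=lambda t: datetime.strptime(t["created_at"], "%a %b %d %H:%M:%S %z %Y"),
        default=None,
    )

if __name__ == "__main__":
    with open("pizzagate_tweets.json") as f:   # hypothetical local collection
        tweets = json.load(f)
    earliest = first_mention(tweets, "#pizzagate")
    if earliest:
        print(earliest["created_at"],
              "@" + earliest["user"]["screen_name"],
              "followers:", earliest["user"]["followers_count"])
```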

If you want to know more about how trolls and spammers are successful in promoting lies, take a look at The Real “Fake News” post by Prof. Eni Mustafaraj.

 

Who made #pizzagate widely known?

The propagation graph below shows who the main propagators of the rumor were when its activity showed its first “burst”.
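The sketch below illustrates the idea of locating the first burst and its main propagators, not TwitterTrails’ actual algorithm: bin the tweets by hour, find the first hour whose volume exceeds a purely illustrative threshold, and rank the original tweets in that window by retweet count. Field names again assume Twitter v1.1-style JSON.

```python
from collections import Counter
from datetime import datetime

def parse(ts):
    # created_at in Twitter v1.1-style JSON, e.g. "Sun Nov 06 08:34:00 +0000 2016"
    return datetime.strptime(ts, "%a %b %d %H:%M:%S %z %Y")

def hour_of(tweet):
    return parse(tweet["created_at"]).replace(minute=0, second=0)

def first_burst(tweets, per_hour_threshold=200):
    """Return the tweets falling in the first hour whose volume exceeds the threshold."""
    by_hour = Counter(hour_of(t) for t in tweets)
    for hour in sorted(by_hour):
        if by_hour[hour] >= per_hour_threshold:
            return [t for t in tweets if hour_of(t) == hour]
    return []

def top_propagators(burst_tweets, n=5):
    """Most-retweeted original (non-retweet) tweets inside the burst window."""
    originals = [t for t in burst_tweets if not t.get("retweeted_status")]
    return sorted(originals, key=lambda t: t["retweet_count"], reverse=True)[:n]
```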

Clicking on the (partially covered) purple data point in the upper right corner, we find that, surprisingly, the first tweet to receive over 3,000 retweets belongs to a pro-Erdogan Turkish journalist! According to The Daily Dot columnist Efe Sozeri, Turkey at that time was outraged by a child abuse scandal and by controversial pending legislation on child marriage, and government sources were trying to show that their scandal was minor compared to the one in the US.

But who informed the Turkish journalist about pizzagate? The propagation graph offers some evidence that he was alerted by a barrage of tweets that occurred a few hours before his posting. The colorful column of data points just before his tweet comes from a troll sending dozens of tweets in Turkish. Here are a few of them as recorded in TwitterTrails:

 

A deafening echo chamber

The co-retweeted graph of the pizzagate Twitter exchange shows a dense echo chamber that keeps confirming to its participants the validity of the conspiracy and allows no doubt to emerge:

This is the densest echo chamber we have observed on TwitterTrails. Among the 22,000 accounts posting about pizzagate, 4,528 rose to prominence, each retweeted by at least two other accounts, for over a million retweets in total! Looking at the word cloud that characterizes the cyan group of 4,474 participants, we see that the most common words in their profiles are #maga, trump, truth [sic], love, god, conservative.
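The intuition behind a co-retweeted network can be sketched as follows: link two accounts whenever the same user has retweeted both of them, so that a dense cluster of mutually linked accounts indicates an echo chamber. The code below is an illustration of that idea, assuming Twitter v1.1-style JSON and using networkx; it is not TwitterTrails’ exact construction.

```python
from collections import Counter, defaultdict
from itertools import combinations
import networkx as nx

def co_retweeted_graph(tweets, min_common=2):
    """Link two accounts whenever the same user retweeted both of them."""
    retweeted_by = defaultdict(set)   # retweeter -> accounts they retweeted
    for t in tweets:
        rt = t.get("retweeted_status")
        if rt:
            retweeted_by[t["user"]["screen_name"]].add(rt["user"]["screen_name"])
    weights = Counter()
    for accounts in retweeted_by.values():
        for a, b in combinations(sorted(accounts), 2):
            weights[(a, b)] += 1          # number of common retweeters
    g = nx.Graph()
    g.add_weighted_edges_from(
        (a, b, w) for (a, b), w in weights.items() if w >= min_common)
    return g

def profile_words(tweets, accounts, top=10):
    """Most common words in the profile descriptions of a group of accounts."""
    words, seen = Counter(), set()
    for t in tweets:
        u = t["user"]
        if u["screen_name"] in accounts and u["screen_name"] not in seen:
            seen.add(u["screen_name"])
            words.update((u.get("description") or "").lower().split())
    return words.most_common(top)

# g = co_retweeted_graph(tweets)
# print(nx.density(g))   # a rough sense of how dense the echo chamber is
```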

How different is this graph from other graphs of political discourse? For comparison, we show what a typical co-retweeted network looks like when discussing political issues. Below is the graph related to the 2016 vice-presidential debate:

In this graph, you can see the two communities, their polarization, and the partial overlap as people read both sides but prefer one of them.

What else can we find?

These are just some of the insights that TwitterTrails can offer to a journalist or anyone who might want to study the propagation of a rumor. If you want to study it further, use the TwitterTrails story of the hashtag #pizzagate and send us a comment!

 

 

The Real “Fake News”

The following is a blog post that Eni Mustafaraj has recently published in The Spoke. We reproduce it here with permission.

Fake news has always been with us, starting with The Great Moon Hoax in 1835. What is different now is the existence of a mass medium, the Web, that allows anyone to financially benefit from it.

Etymologists typically track the change of a word’s meaning over decades, sometimes even over centuries. Currently, however, they find themselves observing a new president and his administration redefine words and phrases on a daily basis. Case in point: “fake news.” One would have to look hard to find an American who hasn’t heard this phrase in recent months. The president loves to apply it as a label to news organizations that he doesn’t agree with.

But right before its most recent incarnation, the phrase “fake news” had a different meaning. It referred to factually incorrect stories appearing on websites with names such as DenverGuardian.com or TrumpVision365.com that mushroomed in the weeks leading up to the 2016 U.S. Presidential Election. One such story—”FBI agent suspected in Hillary email leaks found dead in apparent murder-suicide”—was shared more than a half million times on Facebook, despite being entirely false. The website that published it, DenverGuardian.com, was operated by a man named Jestin Coler, who, when tracked down by persistent NPR reporters after the election, admitted to being a liberal who “enjoyed making a mess of the people that share the content”. He didn’t have any regrets.

Why did fake news flourish before the election? There are too many hypotheses to settle on a single explanation. Economists would explain it in terms of supply and demand. Initially, there were only a few such websites, but their creators noticed that sharing fake news stories on Facebook generated considerable pageviews (the number of visits to a page) for them. Their obvious conclusion: there was a demand for sensational political news from a sizeable portion of the web-browsing public. Because pageviews can be monetized by running Google ads alongside the fake stories, the response was swift: an industry of fake news websites grew quickly to supply fake content and feed the public’s demand. The creators of this content were scattered all over the world. As BuzzFeed reported, a cluster of more than 100 fake news websites was run by individuals in the town of Veles, in the Former Yugoslav Republic of Macedonia.

How did the people in Macedonia manage to spread their fake stories on Facebook and earn thousands of dollars in the process? In addition to creating a cluster of fake news websites, they also created fake Facebook accounts that looked like real people and then had these accounts subscribe to real Facebook groups, such as “Hispanics for Trump” or “San Diego Berniecrats”, where conversations about the election were taking place. Every time the fake news websites published a new story, the fictitious accounts would share it in the Facebook groups they had joined. The real people in the groups would then start spreading the fake news article among their Facebook followers, successfully completing the misinformation cycle. These misinformation-spreading techniques were already known to researchers, but not to the public at large. My colleague Takis Metaxas and I discovered and documented one such technique used on Twitter all the way back in the 2010 Massachusetts Senate election between Martha Coakley and Scott Brown.

There is an important takeaway here for all of us: fake news doesn’t become dangerous because it is created or because it is published; it becomes dangerous when members of the public decide that the news is worth spreading. The most ingenious part of spreading fake news is the step of “infiltrating” groups of people who are most susceptible to the story and will fall for it. As explained in this news article, the Macedonians tried different political Facebook groups before finally settling on pro-Trump supporters.

Once “fake news” entered Facebook’s ecosystem, it was easy for people who agreed with the story and were compelled by the clickbait nature of the headlines to spread it organically. Often these stories made it to Facebook’s Trending News list. The top 20 fake news stories about the election received approximately 8.7 million views on Facebook, 1.4 million more views than the top 20 real news stories from 19 of the major news websites (CNN, New York Times, etc.), as an analysis by BuzzFeed News demonstrated. Facebook initially resisted the accusation that its platform had enabled fake news to flourish. However, after weeks of intense pressure from the media and its user base, it introduced a series of changes to its interface to mitigate the impact of fake news. These include involving third-party fact-checkers to assign a “Disputed” label to posts with untrue claims, suppressing posts with such a label (making them less visible and less spreadable), and allowing users to flag stories as fake news.

It’s too early to assess the effect these changes will have on the sharing behavior of Facebook users. In the meantime, the fake news industry is targeting a new audience: liberal voters. In March, the fake quote “It’s better for our budget if a cancer patient dies more quickly,” attributed to Tom Price, the Secretary of Health and Human Services, appeared on a website titled US Political News, operated by an individual in Kosovo. The story was shared over 80,000 times on Facebook.

Fake news has always been with us, starting with The Great Moon Hoax in 1835. What is different now is the existence of a mass medium, the Web, that allows anyone to monetize content through advertising. Since the cost of producing fake news is negligible, and the monetary rewards substantial, fake news is likely to persist. The journey that fake news takes only begins with its publication. We, the reading public who share these stories, triggered by headlines engineered to make us feel outraged or elated, are the ones who take the news on its journey. Let us all learn to resist such sharing impulses.

False and True rumors look very different on TwitterTrails

News of a Turkish airplane shooting down a Russian one over the Turkish-Syrian border has dominated the news and social media lately. We investigated the rumor within hours of its appearance (24 Nov. 2015), and you can see the results of the analysis here: http://twittertrails.wellesley.edu/~trails/stories/investigate.php?id=462776628

This was not the first time a rumor of this kind emerged. About a month and a half earlier (10 Oct. 2015), an identical rumor had emerged. We investigated that rumor too, and you can see the results of our analysis here: http://twittertrails.wellesley.edu/~trails/stories/investigate.php?id=134661966

Russian jet downing rumors

As you can see, based on the crowd’s reaction to the rumors, TwitterTrails was able to determine that the October rumor was false while the November one was true. The false rumor did not spread much and attracted many skeptical tweets questioning its validity. On the other hand, the true rumor spread much more widely and was essentially undisputed.

Our understanding of the way the “wisdom of the crowd” works is that, when unbiased, emotionally cool observers see a rumor that seems suspicious, they usually react in one of two ways: they either do not retweet it, reducing its spread, or they respond questioning its validity, resulting in higher skepticism.
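A crude way to turn this intuition into numbers, shown below, is to use the fraction of retweets as a proxy for spread and the fraction of original tweets containing doubt markers as a proxy for skepticism. TwitterTrails computes its spread and skepticism levels differently; this sketch only illustrates the reasoning, and the doubt-marker list is an assumption.

```python
DOUBT_MARKERS = ("?", "fake", "hoax", "false", "not true", "unconfirmed", "debunk")

def spread(tweets):
    """Proxy for spread: the fraction of collected tweets that are retweets."""
    retweets = sum(1 for t in tweets if t.get("retweeted_status"))
    return retweets / len(tweets) if tweets else 0.0

def skepticism(tweets):
    """Proxy for skepticism: the fraction of original tweets containing a doubt marker."""
    originals = [t for t in tweets if not t.get("retweeted_status")]
    doubting = sum(1 for t in originals
                   if any(m in t["text"].lower() for m in DOUBT_MARKERS))
    return doubting / len(originals) if originals else 0.0
```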


Twitter’s increasing polarization about the refugee crisis via #RefugeesWelcome

Since the Paris attacks in November 2015, social media has become increasingly polarized and emotional when it comes to discussing the refugee crisis. On November 18th, the hashtag #RefugeesWelcome was trending on Twitter in the US. The co-retweeted network visualizes just how polarized and international the debate is.

The co-retweeted network of the #RefugeesWelcome data, collected on 11/18. The largest groups are noted on the graph. Users are highly polarized and also group by language and location. The word clouds show the most common user-profile words for each group, with colors matching the graph (English stop words are filtered out).

On the left are three pro-refugee groups, differentiated by language and location.  The purple group is largely German; the green are Scottish, many identifying with the Scottish National Party; and the blue are American liberals.  On the right are four groups spreading anti-refugee messages.  In yellow, another group of Germans; the red, orange and pink groups are all English speaking, mostly American, with similar messages and identifying themselves with terms like #tcot and “christian.” 

The co-retweeted network graph is interactive on TwitterTrails, and includes a widget below the graph where you can view aggregated statistics on each group, including user languages, most used words in descriptions and most used hashtags.
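A rough sketch of this kind of per-group aggregation: split the co-retweeted graph into communities and summarize the profile descriptions of each one. Community detection here uses networkx’s greedy modularity method as a stand-in for whatever clustering TwitterTrails applies, and user_profiles is an assumed mapping from screen names to profile descriptions.

```python
from collections import Counter
from networkx.algorithms.community import greedy_modularity_communities

def summarize_groups(graph, user_profiles, top=5):
    """Summarize each community of a co-retweeted graph by its most common profile words."""
    for i, community in enumerate(greedy_modularity_communities(graph)):
        words = Counter()
        for name in community:
            desc = user_profiles.get(name, "")
            # crude stop-word filter: keep only words longer than three characters
            words.update(w for w in desc.lower().split() if len(w) > 3)
        print(f"group {i}: {len(community)} accounts,",
              "top profile words:", words.most_common(top))
```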

You can explore this network on TwitterTrails:
http://twittertrails.wellesley.edu/~trails/stories/investigate.php?id=984767780

Comparison with past usage

#RefugeesWelcome was also trending on September 3rd, in reaction to the images of the body of a drowned Syrian toddler, Aylan Kurdi, washed up on a Turkish beach. At that time, the network was not polarized; although different groups are shown, they are mainly spreading the pro-refugee message.


Strong emotions on Twitter let the false claim that 10,000 refugees arrived in New Orleans spread unchecked

Various rumors have been spreading on social media following the Paris attacks on November 13th, 2015, even on Twitter, which usually moderates false claims. We reported earlier on Trump’s misattributed tweet, and now we highlight another claim spreading in this emotionally charged environment: that 10,000 Syrian refugees (that’s apparently them in the image below, notably all young men) have recently arrived in New Orleans.

A photo posted with the claim, showing the supposed refugees who arrived in New Orleans

10,000 Syrian refugees have not arrived in New Orleans. The image, according to Snopes, was taken in Hungary in September 2015. This claim, fueled by the emotional climate on social media after the Paris attacks, a combination of anger, fear, and increasing xenophobia, vastly inflated the reality: only two Syrian refugee families had arrived in New Orleans. (Their arrival has nevertheless caused backlash from Louisiana Republicans, including presidential candidate Bobby Jindal, adding to the momentum of the false claims.)

Like the Trump tweet claim, this story has moderate spread for a false claim (as observed by TwitterTrails) and very low skepticism. Very little fact-checking has affected the emotional spread of this claim.


If you are familiar with the Twittertrails.com system you can explore the story on TwitterTrails, or keep reading to see how strong emotions manifest in the spreading of rumors on Twitter.
http://twittertrails.wellesley.edu/~trails/stories/investigate.php?id=647751846


Did Donald Trump write an insensitive tweet about the Nov 13th attacks in Paris?

The tweet on the right caught my attention this morning. It contains a screenshot of Gérard Araud, the French ambassador to the US, replying to a tweet written by Donald Trump. Trump calls the “tragedy in Paris” “interesting,” given that France has some of the strictest gun control laws in the world, and Araud calls Trump “repugnant” and lacking “human decency”.

My immediate reaction was skepticism: even Trump, now running for president, wouldn’t write such an insensitive tweet following November 13th’s tragic attacks. So, after getting over the shock value of what I read, I went into fact-checking mode. It didn’t take much research to poke holes in this claim. The screenshot posted in the tweet shows that Trump’s tweet was sent on January 7th, obviously not after the attacks. I also checked Donald Trump’s Twitter account and quickly observed that the tweet was not in his timeline, and that he had in fact tweeted prayers for Paris.

However, the tweet by Trump is real, and a quick Google search found it (at the time of writing, it has not been deleted):
https://twitter.com/realdonaldtrump/status/552955167533174785

Rather than being about the attacks on November 13th, the tweet was written in response to the Charlie Hebdo attacks, which happened on January 7th, 2015. (The tweet by Araud appears to be legitimate and has been mentioned by various sources, but it has been deleted.)
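A handy trick for this kind of fact check, sketched below: a tweet’s creation time can be decoded directly from its ID, because Twitter’s “snowflake” IDs embed a millisecond timestamp. Applied to the ID in the URL above, it resolves to January 2015, around the Charlie Hebdo attacks, not November 13th.

```python
from datetime import datetime, timezone

TWITTER_EPOCH_MS = 1288834974657   # millisecond offset used by Twitter "snowflake" IDs

def tweet_created_at(tweet_id: int) -> datetime:
    """Decode the creation time embedded in a tweet's snowflake ID."""
    ms = (tweet_id >> 22) + TWITTER_EPOCH_MS
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

# The ID from the URL above; the result falls in January 2015, not November.
print(tweet_created_at(552955167533174785))
```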

So how did so many people miss this fact?  


Global warming denialists make lousy spammers

TwitterTrails is a system focused on studying how rumors and memes spread online, but it can also track efforts by spammers to promote their messages on Twitter. In one such case, which we observed recently, global warming denialists misrepresented an article posted by NASA, claiming that it effectively denies the existence of global warming. TwitterTrails discovered their effort and identified 115 of these denialists. Here’s what happened:

On October 30th, 2015, NASA posted an article about a study showing that the amount of ice in the Antarctic ice sheet has increased overall.  They tweeted about the article, summarizing it with: “Antarctica is overall accumulating ice, but parts have increased ice loss in last decades.”

Several news websites posted articles about the study, which were widely shared on Twitter and expressed overall skepticism about both the findings and their implications for global warming:

However, the link shared by far the most on Twitter was to a website called “The Real Strategy,” which puts a very different spin on the NASA study:

NASA Debunked Global Warming Hoax in Study

You will not hear this on your mainstream “News” channels… A new study by NASA has proven that the global warming hoax is a myth, once and for all.

A study released by NASA on October 30 says that the amount of ice in the poles has increased steadily over the last several decades. What happened to Al Gore’s claim that the poles would be melted completely by this year? Obama was just filmed claiming that “Global Warming” is proven… How stupid are their audiences really? Has everyone been programmed to just accept what these liars tell them as gospel? This study proves that “Global Warming” is so bogus that the world is actually COOLING!

The story continues on to quote the NASA article at length to support its claim that climate change is a “hoax”. This story received widespread air time on Twitter; nearly 4,000 of the 8,700 tweets we collected in reference to the NASA article included a link to “The Real Strategy,” compared to just a few hundred links to other relevant articles (including NASA’s original).
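Measuring which link dominated the conversation is straightforward once the tweets are collected: extract the expanded URLs from each tweet’s entities and count their domains. The sketch below assumes Twitter v1.1-style JSON; it is an illustration, not TwitterTrails’ implementation.

```python
from collections import Counter
from urllib.parse import urlparse

def domain_counts(tweets):
    """Count the domains of expanded URLs shared in a collection of tweets."""
    counts = Counter()
    for t in tweets:
        for url in t.get("entities", {}).get("urls", []):
            expanded = url.get("expanded_url")
            if expanded:
                counts[urlparse(expanded).netloc.lower()] += 1
    return counts

# domain_counts(tweets).most_common(10) would show which site, e.g.
# "The Real Strategy", dominated the links in the collection.
```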

So, does the Twitter audience believe that NASA’s own study disproves global warming? Not quite.


#Pinkout vs. #Blackout

On September 29th, 2015, Planned Parenthood organized a #PinkOut hashtag, to give Twitter users a forum in which to show support for the group.  However, data collected with the #PinkOut hashtag shows one of the most polarized co-retweeted networks we have collected on TwitterTrails, far more polarized than even US electoral debates!
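One simple way to put a number on “how polarized” such a network is, sketched below, is to split the co-retweeted graph into communities and compute its modularity: values close to 1 mean the groups barely retweet across the divide. This is an illustrative metric, not the measure TwitterTrails reports.

```python
from networkx.algorithms.community import greedy_modularity_communities, modularity

def polarization_score(graph):
    """Modularity of the detected communities: near 1 = almost no cross-group retweeting."""
    communities = greedy_modularity_communities(graph)
    return modularity(graph, communities)
```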

The co-retweeted network of #PinkOut investigated on Sept. 29, 2015 at 10AM ET.

Attempting to undermine #PinkOut, some Twitter users opposed to Planned Parenthood suggested their own hashtag: #BlackOut. On the 28th, @PolitiBunny tweeted: “Stand w/the babies on Sept 29, make your avi black & white, #BLACKout ur banner photo to protest @PPFA’s #PinkOut. We need you! #Life #ccot”

Thus, a Twitter “war” for attention was waged; or rather, another “battle” in a long-standing social media feud. This blog post explores the polarization in the network and the two opposing hashtags.
