In the run-up to the 2020 presidential election, some of the most popular social media platforms have introduced new measures to curb misinformation, increase transparency and try to bolster the integrity of the democratic process this time around.
These updates — which some experts say still may not go far enough — come after an unprecedented election saga in 2016, when Russian operatives exploited social media to try to influence the American electorate in favor of Donald Trump ahead of the presidential vote, according to multiple investigations.
Intelligence officials warned in 2018 that Russia was at it again, along with other state actors. Now in this election cycle, cybersecurity experts have also raised alarm over the increasing threat of domestic actors sowing misinformation online.
The task of policing content while avoiding the appearance of bias has been a tripwire for many of these social media giants, which have faced attacks from both sides of the political aisle over decisions to remove certain content, drawing allegations of censorship.
ABC News has compiled this explainer to provide readers with a guide to comparing and contrasting policy measures from some of the most-used social media platforms in the U.S., including Facebook (and Facebook-owned Instagram), Twitter, Reddit, TikTok and YouTube.
Facebook
Facebook, the most-used social media platform in the U.S., took the most heat for the 2016 controversy.
In the years since 2016, Facebook’s core efforts to maintain election integrity have fallen into three major categories: taking down inauthentic accounts and networks, tightening policies on content moderation, and unveiling an ad database with the goal of increased transparency.
Facebook also launched an Elections Operations Center in 2018, a team that it says will monitor potential abuses of the democratic process on the network in real time. The company said that so far it has removed more than 120,000 pieces of content from Facebook and Instagram in the U.S. for violating the voter-interference policies it has set, and displayed warnings on more than 150 pieces of content. Moreover, the company said it removed 30 networks engaged in coordinated inauthentic behavior targeting the U.S.
In August 2020, the company unveiled a campaign to encourage people to vote and pledged to remove any content that discourages voting, such as posts falsely stating that voting requires a passport or driver’s license.
In the weeks ahead of the 2020 vote, the company also announced a series of last-minute changes, including banning all new political advertisements a week before the election, removing new posts that use militarized language, such as “army” or “battle,” to suppress voters, and temporarily pausing all political ads on the site for an undisclosed period after the polls close on Nov. 3.
Facebook also said it will label content that seeks to delegitimize the outcome of the election, and label content from candidates or campaigns that try to declare victory before results are in — instead directing users to official results from Reuters and the National Election Pool.
Moreover, Facebook said it would start labeling some content that it doesn’t remove because it is deemed newsworthy, such as speeches from politicians.
“We’ll allow people to share this content to condemn it, just like we do with other problematic content, because this is an important part of how we discuss what’s acceptable in our society — but we’ll add a prompt to tell people that the content they’re sharing may violate our policies,” Zuckerberg said in a Facebook post at the time.
In addition, Facebook said it would remove all accounts representing QAnon, a baseless conspiracy theory that purports, without evidence, that Donald Trump is working in secret against a global Satanic pedophile ring. The unfounded theory emerged online shortly after the 2016 election and has made its way into the political discourse.
While these are major changes at the company compared to 2016, when many, including CEO Mark Zuckerberg, say it was caught flat-footed, some advocates have criticized what they say are the narrow limits of its actions surrounding political ads.
“The policies Facebook has taken are extremely reactive,” Ben Decker, the founder of Memetica, a digital investigations consultancy firm, told ABC News. “I don’t think the measures they have taken to curb political ads are going to be particularly effective, because they have these exact stipulations.”
Dipayan Ghosh, the co-director of the Harvard Kennedy School’s digital platforms and democracy project, told ABC News that the ban on new political advertisements one week ahead of the election, “is a ban on new submittals, not on political advertising entirely” and questioned the impact of the ban when record numbers of people are voting early.
Banning political ads after polls close is also “not necessarily going to have a result on the election itself,” Ghosh added.
Facebook said this move was aimed “to reduce opportunities for confusion or abuse.”
“I think many advocates would have liked to see Facebook extend a full ban on political advertising for a lengthy period ahead of the election, say, a month or even longer than that,” Ghosh said. “What many of us wanted to see from Facebook is a full ban, a commitment to put the democratic process over revenues.”
Facebook made nearly $70 billion in advertising revenue alone last year, according to financial disclosures.
Ghosh also expressed concerns over the way misinformation spreads on private Facebook groups, which in many cases remain largely unregulated unless they contain active calls for violence — and even then Facebook has been accused of reacting too late.
“I myself have joined groups which have amassed a big following on different sorts of issues, mostly sports related, which then all of a sudden change one day in theme from something about the New York Giants, to ‘Justice for Justice Kavanaugh,'” Ghosh said. “And you can clearly see that what’s happening here is that organizers are trying to get people into these groups, and then all of a sudden, turn a switch to try to influence the members of that group toward these kinds of political themes.”
Also concerning to many: new research from the German Marshall Fund think tank, published earlier this week, found that more people are now engaging with outlets on Facebook that repeatedly publish verifiably false content than in the lead-up to the 2016 election.
Twitter
Twitter banned all political ads worldwide in October 2019, a move that put it in stark contrast to Facebook, which at the time had recently ruled out banning political ads. Jack Dorsey, Twitter’s CEO, tweeted “while internet advertising is incredibly powerful and very effective for commercial advertisers, that power brings significant risks to politics.”
Ghosh noted that Twitter originally did not make “a lot of money off of political advertising, which likely made it an easier decision for Jack Dorsey than it would be for Mark Zuckerberg.” Twitter reported nearly $3 billion in ad revenue in fiscal year 2019, according to financial disclosures.
Political ads on Twitter never existed on the same scale as they do on Facebook, but the company has taken a number of additional measures in recent months to show it is taking action ahead of the 2020 election. Most recently, it launched what it calls a “2020 U.S. election hub,” which includes a curated list of news articles as well as live streams of debates.
As part of a suite of measures to combat misinformation, Twitter also introduced a new labeling system in May 2020, which allows the platform to flag tweets with what it determines to be misleading content.
In the last few months, the social media platform found itself embroiled in controversy after it labeled a number of Donald Trump’s tweets, including those containing claims about mail-in voting, as potentially misleading. It has also put labels on Trump’s tweets for violating its policies on abusive behavior and manipulated media; in those cases, the tweets are hidden from view, but users can easily click through to see the content. Trump has accused Twitter of trying to silence conservative voices.
Critics have questioned the efficacy of the labels Twitter (and Facebook) use in actually stopping misinformation or false claims from spreading or being amplified on the platforms.
Decker noted that more research needs to be done here, but said “it’s unclear often how many of those who read the disinformation are actually reading the fact check, or the intervention response.”
Ghosh said he thinks that “these kinds of labels have a very limited, marginal impact on influencing the opinion of the people who consume that content.”
“I can’t say that these labels really resolve the core issue, which is that you’ve got a person in certain cases with a massive following, who is pushing misinformation intentionally and pushing disinformation, and trying to do so for his own political gain,” Ghosh said. “Having this sort of label does not really change the mind of anyone who’s consuming it.”
A study published in March by researchers at the Massachusetts Institute of Technology suggested that selective labeling of false news can actually have a detrimental effect, dubbed the “implied-truth effect,” in which unmarked and unchecked, yet still demonstrably false, content appears more legitimate.
The strongest weapon Twitter has against the spread of political misinformation is the removal of tweets and the restriction of accounts, but the platform uses these sparingly, likely to avoid accusations of censorship. The highest-profile example came in late July, when it restricted Donald Trump Jr.’s account after he shared a video featuring doctors making false claims about the coronavirus, including that masks are unnecessary. Trump Jr.’s account was suspended for 12 hours, meaning he was unable to tweet, and Twitter removed the video from public view.
Last week, Twitter also unveiled a slew of new updates aimed specifically at curbing the spread of misinformation on the platform ahead of the election, including efforts to stop tweets with misleading information from going viral and a policy barring anyone, including candidates for office, from claiming an election win before it is authoritatively called.
Significantly, users will not be able to retweet or reply to tweets “with a misleading information label from U.S. political figures (including candidates and campaign accounts), U.S.-based accounts with more than 100,000 followers, or that obtain significant engagement.” Users will, however, be able to quote-tweet the messages, although they will have to click through a warning in order to see these labeled tweets in the first place.
When users attempt to retweet, they will be prompted to Quote Tweet (add their own commentary) instead.
“Though this adds some extra friction for those who simply want to Retweet, we hope it will encourage everyone to not only consider why they are amplifying a Tweet, but also increase the likelihood that people add their own thoughts, reactions and perspectives to the conversation,” the company said in a blog post.
YouTube
The video-sharing giant announced earlier this year some updates to how it was preparing for the election, saying it would remove election-related content that violated its Community Guidelines.
“These policies prohibit hate speech, harassment, and deceptive practices, including content that aims to mislead people about voting or videos that are technically manipulated or doctored in a way that misleads users (beyond clips taken out of context) and may pose a serious risk of egregious harm,” the company said.
The company also said it would remove content that contains hacked information, stating, “For example, videos that contain hacked information about a political candidate shared with the intent to interfere in an election.”
Similar to other platforms, YouTube also pledged to remove content encouraging users to interfere with the democratic process, citing as an example content “telling viewers to create long voting lines with the purpose of making it harder for others to vote.”
Some have expressed concern that YouTube (similar to Reddit) has not yet published a clear policy on how it will handle candidates claiming victory before the election is officially called.
Decker called YouTube’s policies “extremely reactive” overall.
“Oftentimes, they will apply keyword filters to prevent content from being found in search. YouTube’s biggest claim is they incorporate Wikipedia pages into knowledge panels, so if it’s a video about COVID-19, regardless of where it’s from, there would also be this knowledge panel or fact check above the description that points you toward accurate sources,” Decker said. The Wikipedia articles, while volunteer-edited, provide at least some context to content that would otherwise not have any.
“While the problem on YouTube is still bad, it’s now much less worse,” Decker said.
He also noted that YouTube’s three-strikes policy has been effective in booting a number of content creators off the platform, but an unintended consequence is that this has “led to the rise of fringe platforms.”
“It’s tricky because in one sense it does clean up the stream in the short term, so on the one hand it creates healthier conversations, but it moves them to another area of the internet, which is even more unregulated but there are even less dissenting views, so it’s a space where people can be radicalized,” he said.
Notably, YouTube announced in a company blog post earlier this week that it was taking new steps to curb hate by “removing more conspiracy content used to justify real-world violence.”
Specifically, YouTube cited QAnon as an example of an entity “that targets an individual or group with conspiracy theories that have been used to justify real-world violence.”
“As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up,” the blog post added. “We will begin enforcing this updated policy today, and will ramp up in the weeks to come.”
Reddit
In April 2020, Reddit announced that it was launching a subreddit dedicated to political transparency, which would list all political ad campaigns running on Reddit dating back to January of 2019. The company said this subreddit would give information on the individual advertiser, their targeting, impressions, and spend on a per-campaign basis. As an additional transparency measure, Reddit said it would require political advertisers to leave comments “on” for the first 24 hours of a campaign to enable them to “engage directly with users in the comments.”
While the political transparency subreddit contains significant details about political ads, it has limited reach, with around 3,000 members in the five months since it launched. It is also worth noting that Reddit does not allow political ads outside the U.S.
In June 2020, Reddit also announced that it was banning a number of subreddits that it said violated company policies on hate speech. Among them was r/The_Donald, a pro-Trump subreddit that was popular in the run-up to the 2016 election but had been largely inactive for months despite its nearly 800,000 members. Members of the subreddit had already migrated to another platform the year before, in response to stricter content rules and increased moderation. The banning of this subreddit and others was indicative of a problem facing social media platforms: measures to combat hate speech or misinformation often do not keep pace with the dissemination of such material.
Reddit, however, noticeably has no stated policy on candidates claiming victory in the election before it is authoritatively called.
TikTok
While the Chinese-owned video-sharing app has avoided the level of misinformation scrutiny leveled at platforms like Facebook and Twitter, it has taken a number of actions in recent months to show that it is taking a stand before the election.
Along with banning political ads, in August TikTok also announced a suite of new measures to combat misinformation ahead of the 2020 presidential election. Crucially, it banned manipulated media that it said “misleads users by distorting the truth of events in a way that could cause harm.” This includes deepfakes, synthetic media produced by artificial intelligence that has the appearance of being real.
Despite these measures, political content on TikTok, as on all social media platforms, is extremely popular. Videos containing the hashtag #Trump2020 had been viewed 10.3 billion times as of September 2020, according to data on the app. A report from the Wall Street Journal late last year claimed that the Trump campaign had reached out to TikTok accounts with large supportive followings, including some featuring the Trump 2020 flag in their videos.
Sutherland House Experts is Empowering Quiet Experts through Compelling Nonfiction in a Changing Ideas Landscape
TORONTO, ON — Almost one year after its launch, Sutherland House Experts is reshaping the publishing industry with its innovative co-publishing model for “quiet experts.” This approach, where expert authors share both costs and profits with the publisher, is bridging the gap between expertise and public discourse. Helping to drive this transformation is Neil Seeman, a renowned author, educator, and entrepreneur.
“The book publishing world is evolving rapidly,” publisher Neil Seeman explains. “There’s a growing hunger for expert voices in public dialogue, but traditional channels often fall short. Sutherland House Experts provides a platform for ‘quiet experts’ to share their knowledge with the broader book-reading audience.”
The company’s roster boasts respected thought leaders whose books are already gaining major traction:
• V. Kumar Murty, a world-renowned mathematician and past Fields Institute director, just published “The Science of Human Possibilities” under the new press. The book has been declared a 2024 “must-read” by The Next Big Ideas Club and is receiving widespread media attention across North America.
• Eldon Sprickerhoff, co-founder of cybersecurity firm eSentire, is seeing strong pre-orders for his upcoming book, “Committed: Startup Survival Tips and Uncommon Sense for First-Time Tech Founders.”
• Dr. Tony Sanfilippo, a respected cardiologist and professor of medicine at Queen’s University, is generating significant media interest with his forthcoming book, “The Doctors We Need: Imagining a New Path for Physician Recruitment, Training, and Support.”
Seeman, whose recent and acclaimed book, “Accelerated Minds,” explores the entrepreneurial mindset, brings a unique perspective to publishing. His experience as a Senior Fellow at the University of Toronto’s Institute of Health Policy, Management and Evaluation, and academic affiliations with The Fields Institute and Massey College, give him deep insight into the challenges faced by people he calls “quiet experts.”
“Our goal is to empower quiet, expert authors to become entrepreneurs of actionable ideas the world needs to hear,” Seeman states. “We are blending scholarly insight with market savvy to create accessible, impactful narratives for a global readership. Quiet experts are people with decades of experience in one or more fields who seek to translate their insights into compelling non-fiction for the world.”
This fall, Seeman is taking his insights to the classroom. He will teach the new course, “The Writer as Entrepreneur,” at the University of Toronto, offering aspiring authors practical tools to navigate the evolving book publishing landscape. To enroll in this new weekly night course starting Tuesday, October 1st, visit: https://learn.utoronto.ca/programs-courses/courses/4121-writer-entrepreneur
“The entrepreneurial ideas industry is changing rapidly,” Seeman notes. “Authors need new skills to thrive in this dynamic environment. My course and our publishing model provide those tools.”
About Neil Seeman: Neil Seeman is co-founder and publisher of Sutherland House Experts, an author, educator, entrepreneur, and mental health advocate. He holds appointments at the University of Toronto, The Fields Institute, and Massey College. His work spans entrepreneurship, public health, and innovative publishing models.
Hallmark launching a streaming service with two new original series, and Bill Skarsgård out for revenge in “Boy Kills World” are some of the new television, films, music and games headed to a device near you.
Also among the streaming offerings worth your time as selected by The Associated Press’ entertainment journalists: Alex Garland’s “Civil War” starring Kirsten Dunst, Natasha Rothwell’s heartfelt comedy for Hulu called “How to Die Alone” and the second season of Sylvester Stallone’s “Tulsa King.”
NEW MOVIES TO STREAM SEPT. 9-15
— Alex Garland’s “Civil War” is finally making its debut on Max on Friday. The film stars Kirsten Dunst as a veteran photojournalist covering a violent war that’s divided America; she reluctantly allows an aspiring photographer, played by Cailee Spaeny, to tag along as she, an editor (Stephen McKinley Henderson) and a reporter (Wagner Moura) make the dangerous journey to Washington, D.C., to interview the president (Nick Offerman), a blustery, rising despot who has given himself a third term, taken to attacking his citizens and shut himself off from the press. In my review, I called it a bellowing and haunting experience: smart and thought-provoking, with great performances. It’s well worth a watch.
— Joey King stars in Netflix’s adaptation of Scott Westerfeld’s “Uglies,” about a future society in which everyone is required to have beautifying cosmetic surgery at age 16. Streaming on Friday, McG directed the film, in which King’s character inadvertently finds herself in the midst of an uprising against the status quo. “Outer Banks” star Chase Stokes plays King’s best friend.
— Bill Skarsgård is out for revenge against the woman (Famke Janssen) who killed his family in “Boy Kills World,” coming to Hulu on Friday. Moritz Mohr directed the ultra-violent film, of which Variety critic Owen Gleiberman wrote: “It’s a depraved vision, yet I got caught up in its kick-ass revenge-horror pizzazz, its disreputable commitment to what it was doing.”
— The year was 2006. Snow Patrol, the Northern Irish-Scottish alternative rock band, released an album, “Eyes Open,” producing the biggest hit of their career: “Chasing Cars.” A lot has happened in the time since — three, soon to be four quality full-length albums, to be exact. On Friday, the band will release “The Forest Is the Path,” their first new album in seven years. Anthemic pop-rock is the name of the game across songs of love and loss, like “All,” “The Beginning” and “This Is the Sound Of Your Voice.”
— For fans of raucous guitar music, Jordan Peele’s 2022 sci-fi thriller, “NOPE,” provided a surprising, if tiny, thrill. One of the leads, Emerald “Em” Haywood, portrayed by Keke Palmer, rocks a Jesus Lizard shirt. (Also featured throughout the film: Rage Against the Machine, Wipers, Mr. Bungle, Butthole Surfers and Earth band shirts.) The Austin noise rock band is a less-than-obvious pick, having been signed to the legendary Touch and Go Records and having stopped releasing new albums in 1998. That changes on Friday the 13th, when “Rack” arrives. And for those curious: The Jesus Lizard’s intensity never went away.
— Hallmark launched a streaming service called Hallmark+ on Tuesday with two new original series, the scripted drama “The Chicken Sisters” and unscripted series “Celebrations with Lacey Chabert.” If you’re a Hallmark holiday movies fan, you know Chabert. She’s starred in more than 30 of their films and many are holiday themed. Off camera, Chabert has a passion for throwing parties and entertaining. In “Celebrations,” deserving people are surprised with a bash in their honor — planned with Chabert’s help. “The Chicken Sisters” stars Schuyler Fisk, Wendie Malick and Lea Thompson in a show about employees at rival chicken restaurants in a small town. The eight-episode series is based on a novel of the same name.
— Natasha Rothwell of “Insecure” and “The White Lotus” fame created and stars in a new heartfelt comedy for Hulu called “How to Die Alone.” She plays Mel, a broke, go-along-to-get-along, single airport employee who, after a near-death experience, makes the conscious decision to take risks and pursue her dreams. Rothwell has been working on the series for the past eight years and described it to The AP as “the most vulnerable piece of art I’ve ever put into the world.” Like Mel, Rothwell had to learn to bet on herself to make the show she wanted to make. “In the Venn diagram of me and Mel, there’s significant overlap,” said Rothwell. It premieres Friday on Hulu.
— Shailene Woodley, DeWanda Wise and Betty Gilpin star in a new drama for Starz called “Three Women,” about entrepreneur Sloane, homemaker Lina and student Maggie, who are each stepping into their power and making life-changing decisions. They’re interviewed by a writer named Gia (Woodley). The series is based on a 2019 best-selling book of the same name by Lisa Taddeo. “Three Women” premieres Friday on Starz.
— Sylvester Stallone’s second season of “Tulsa King” debuts Sunday on Paramount+. Stallone plays Dwight Manfredi, a mafia boss who was recently released from prison after serving 25 years. He’s sent to Tulsa to set up a new crime syndicate. The series is created by Taylor Sheridan of “Yellowstone” fame.
— One thing about the title of Focus Entertainment’s Warhammer 40,000: Space Marine 2 — you know exactly what you’re in for. You are Demetrian Titus, a genetically enhanced brute sent into battle against the Tyranids, an insectoid species with an insatiable craving for human flesh. You have a rocket-powered suit of armor and an arsenal of ridiculous weapons like the “Chainsword,” the “Thunderhammer” and the “Melta Rifle,” so what could go wrong? Besides the squishy single-player mode, there are cooperative missions and six-vs.-six free-for-alls. You can suit up now on PlayStation 5, Xbox X/S or PC.
— Likewise, Wild Bastards isn’t exactly the kind of title that’s going to attract fans of, say, Animal Crossing. It’s another sci-fi shooter, but the protagonists are a gang of 13 varmints — aliens and androids included — who are on the run from the law. Each outlaw has a distinctive set of weapons and special powers: Sarge, for example, is a robot with horse genes, while Billy the Squid is … well, you get the idea. Australian studio Blue Manchu developed the 2019 cult hit Void Bastards, and this Wild-West-in-space spinoff has the same snarky humor and vibrant, neon-drenched cartoon look. Saddle up on PlayStation 5, Xbox X/S, Nintendo Switch or PC.
Former President Donald Trump is on the brink of a significant financial decision that could have far-reaching implications for both his personal wealth and the future of his fledgling social media company, Trump Media & Technology Group (TMTG). As the lockup period on his shares in TMTG, which owns Truth Social, nears its end, Trump could soon be free to sell his substantial stake in the company. However, the potential payday, which makes up a large portion of his net worth, comes with considerable risks for Trump and his supporters.
Trump’s stake in TMTG comprises nearly 59% of the company, amounting to 114,750,000 shares. As of now, this holding is valued at approximately $2.6 billion. These shares are currently under a lockup agreement, a common feature of initial public offerings (IPOs), designed to prevent company insiders from immediately selling their shares and potentially destabilizing the stock. The lockup, which began after TMTG’s merger with a special purpose acquisition company (SPAC), is set to expire on September 25, though it could end earlier if certain conditions are met.
Should Trump decide to sell his shares after the lockup expires, the market could respond in unpredictable ways. The sale of a substantial number of shares by a major stakeholder like Trump could flood the market, potentially driving down the stock price. Daniel Bradley, a finance professor at the University of South Florida, suggests that the market might react negatively to such a large sale, particularly if there aren’t enough buyers to absorb the supply. This could lead to a sharp decline in the stock’s value, impacting both Trump’s personal wealth and the company’s market standing.
Moreover, Trump’s involvement in Truth Social has been a key driver of investor interest. The platform, marketed as a free speech alternative to mainstream social media, has attracted a loyal user base largely due to Trump’s presence. If Trump were to sell his stake, it might signal a lack of confidence in the company, potentially shaking investor confidence and further depressing the stock price.
Trump’s decision is also influenced by his ongoing legal battles, which have already cost him over $100 million in legal fees. Selling his shares could provide a significant financial boost, helping him cover these mounting expenses. However, this move could also have political ramifications, especially as he continues his bid for the Republican nomination in the 2024 presidential race.
Trump Media’s success is closely tied to Trump’s political fortunes. The company’s stock has shown volatility in response to developments in the presidential race, with Trump’s chances of winning having a direct impact on the stock’s value. If Trump sells his stake, it could be interpreted as a lack of confidence in his own political future, potentially undermining both his campaign and the company’s prospects.
Truth Social, the flagship product of TMTG, has faced challenges in generating traffic and advertising revenue, especially compared to established social media giants like X (formerly Twitter) and Facebook. Despite this, the company’s valuation has remained high, fueled by investor speculation on Trump’s political future. If Trump remains in the race and manages to secure the presidency, the value of his shares could increase. Conversely, any missteps on the campaign trail could have the opposite effect, further destabilizing the stock.
As the lockup period comes to an end, Trump faces a critical decision that could shape the future of both his personal finances and Truth Social. Whether he chooses to hold onto his shares or cash out, the outcome will likely have significant consequences for the company, its investors, and Trump’s political aspirations.