
Media

What social media platforms are doing to counter misinformation in the US election


Intelligence officials warned in 2018 that Russia was at it again, along with other state actors. Now in this election cycle, cybersecurity experts have also raised alarm over the increasing threat of domestic actors sowing misinformation online.

The task of policing content while avoiding the appearance of bias has been a tripwire for many of these social media giants, who have faced attacks from both sides of the political aisle for decisions to remove certain content, including allegations of censorship.

ABC News has compiled this explainer to provide readers with a guide to comparing and contrasting policy measures from some of the most-used social media platforms in the U.S., including Facebook (and Facebook-owned Instagram), Twitter, Reddit, TikTok and YouTube.

Facebook

Facebook, the most-used social media platform in the U.S., took the most heat for the 2016 election-interference controversy.

In the years since 2016, Facebook’s core efforts to maintain election integrity have fallen into three major categories: Taking down inauthentic accounts and networks, tightening policies on content moderation, and unveiling an ad database with the goal of increased transparency.

Facebook also launched an Elections Operations Center in 2018, a team that it says will monitor potential democratic process abuses on the network in real-time. The company said that so far it has removed more than 120,000 pieces of content from Facebook and Instagram in the U.S. for violating voter-interference policies it has set, and displayed warnings on more than 150 pieces of content. Moreover, the company said it removed 30 networks engaged in coordinated inauthentic behavior targeting the U.S.

In August 2020, the company unveiled a campaign to encourage people to vote and pledged to remove any content that discourages voting, such as posts claiming that voting requires a passport or driver's license.

In the weeks ahead of the 2020 vote, the company also announced a series of last-minute changes, including banning all new political advertisements a week before the election, removing new posts with militarized language, such as “army” or “battle,” that aims to suppress voters and temporarily pausing all political ads on the site for an undisclosed period of time after the polls close on Nov. 3.

Facebook also said it will label content that seeks to delegitimize the outcome of the election, and label content from candidates or campaigns that try to declare victory before results are in — instead directing users to official results from Reuters and the National Election Pool.

Moreover, Facebook said it would start labeling some content that it doesn’t remove because it is deemed newsworthy, such as speeches from politicians.

“We’ll allow people to share this content to condemn it, just like we do with other problematic content, because this is an important part of how we discuss what’s acceptable in our society — but we’ll add a prompt to tell people that the content they’re sharing may violate our policies,” Zuckerberg said in a Facebook post at the time.

In addition, Facebook said it would remove all accounts representing QAnon, a baseless conspiracy theory that claims, without evidence, that Donald Trump is working in secret against a global Satanic pedophile ring. The unfounded theory emerged online shortly after the 2016 election and has since made its way into mainstream political discourse.

While these are major changes compared to 2016, when many, including CEO Mark Zuckerberg, say the company was caught flat-footed, some advocates have criticized what they say are overly narrow actions surrounding political ads.

“The policies Facebook has taken are extremely reactive,” Ben Decker, the founder of Memetica, a digital investigations consultancy firm, told ABC News. “I don’t think the measures they have taken to curb political ads are going to be particularly effective, because they have these exact stipulations.”

Dipayan Ghosh, the co-director of the Harvard Kennedy School’s digital platforms and democracy project, told ABC News that the ban on new political advertisements one week ahead of the election, “is a ban on new submittals, not on political advertising entirely” and questioned the impact of the ban when record numbers of people are voting early.

Banning political ads after polls close is also “not necessarily going to have a result on the election itself,” Ghosh added.

Facebook said this move was aimed “to reduce opportunities for confusion or abuse.”

“I think what many advocates would have liked to see is Facebook extend a full ban on political advertising for a lengthy period ahead of the election, say, a month or even longer than that,” Ghosh said. “What many of us wanted to see from Facebook is a full ban, a commitment to put the democratic process over revenues.”

Facebook made nearly $70 billion in advertising revenue alone last year, according to financial disclosures.

Ghosh also expressed concerns over the way misinformation spreads on private Facebook groups, which in many cases remain largely unregulated unless they contain active calls for violence — and even then Facebook has been accused of reacting too late.

“I myself have joined groups which have amassed a big following on different sorts of issues, mostly sports related, which then all of a sudden change one day in theme from something about the New York Giants, to ‘Justice for Justice Kavanaugh,'” Ghosh said. “And you can clearly see that what’s happening here is that organizers are trying to get people into these groups, and then all of a sudden, turn a switch to try to influence the members of that group toward these kinds of political themes.”

Also concerning to many, new research from the German Marshall Fund think tank published earlier this week found that more people are now engaging with outlets on Facebook that repeatedly publish verifiably false content than in the lead-up to the 2016 election.

Twitter

Twitter banned all political ads worldwide in October 2019, a move that put it in stark contrast to Facebook, which at the time had recently ruled out banning political ads. Jack Dorsey, Twitter’s CEO, tweeted “while internet advertising is incredibly powerful and very effective for commercial advertisers, that power brings significant risks to politics.”

Ghosh noted that Twitter originally did not make “a lot of money off of political advertising, which likely made it an easier decision for Jack Dorsey than it would be for Mark Zuckerberg.” Twitter reported making nearly $3 billion in ad revenue in Fiscal Year 2019, according to financial disclosures.

Political ads on Twitter did not exist on the same scale as they do on Facebook, but the company has also taken a number of additional measures in recent months to show it is taking action ahead of the 2020 election. Most recently it launched what it calls a “2020 U.S. election hub,” which will include a curated list of news articles as well as live streams of debates.

As part of a suite of measures to combat misinformation, Twitter also introduced a new labeling system in May 2020 that allowed the platform to flag tweets with what it determined to be misleading content.

In the last few months, the social media platform found itself embroiled in controversy after it labeled a number of Donald Trump’s tweets, including those containing claims about mail-in voting, as potentially misleading. It has also put labels on Trump’s tweets for violating its policies on abusive behavior as well as those regarding manipulated media. In these cases, the tweets are hidden from view but users can easily click in to see the content. Trump has accused Twitter of trying to silence conservative voices.

Critics have questioned the efficacy of the labels Twitter (and Facebook) use in actually stopping misinformation or false claims from spreading or being amplified on the platforms.

Decker noted that more research needs to be done here, but said “it’s unclear often how many of those who read the disinformation are actually reading the fact check, or the intervention response.”

Ghosh said he thinks that “these kinds of labels have a very limited, marginal impact on influencing the opinion of the people who consume that content.”

“I can’t say that these labels really resolve the core issue, which is that you’ve got a person in certain cases with a massive following, who is pushing misinformation intentionally and pushing disinformation, and trying to do so for his own political gain,” Ghosh said. “Having this sort of label does not really change the mind of anyone who’s consuming it.”

A study published in March by researchers at the Massachusetts Institute of Technology suggested that selective labeling of false news can actually have a detrimental effect, dubbed the “implied-truth effect,” where unmarked and unchecked, yet still demonstrably false, content appears more legitimate.

The strongest weapon Twitter has to prevent the spread of political misinformation is the removal of tweets and the restriction of accounts, but the platform utilizes these sparingly, likely to avoid being accused of censorship. The most high profile example of this was when it restricted Donald Trump Jr.’s account in late July after he shared a video featuring doctors making false claims about the coronavirus, including that masks are unnecessary. Trump Jr.’s account was suspended for 12 hours, meaning he was unable to tweet, and it removed the video from public view.

Last week, Twitter also unveiled a slew of new updates aimed specifically at curbing the spread of misinformation on the platform ahead of the election, including efforts to stop tweets with misleading information from going viral and a policy that will not allow any person, including candidates for office, to claim an election win before it has been authoritatively called.

Significantly, users will not be able to retweet or reply to tweets “with a misleading information label from U.S. political figures (including candidates and campaign accounts), U.S.-based accounts with more than 100,000 followers, or that obtain significant engagement.” Users will, however, be able to quote-tweet the messages, although they will have to click through a warning in order to see these labeled tweets in the first place.

When users attempt to retweet, they will be prompted to Quote Tweet (add their own commentary) instead.

“Though this adds some extra friction for those who simply want to Retweet, we hope it will encourage everyone to not only consider why they are amplifying a Tweet, but also increase the likelihood that people add their own thoughts, reactions and perspectives to the conversation,” the company said in a blogpost.

YouTube

The video-sharing giant announced earlier this year some updates to how it was preparing for the election, saying it would remove election-related content that violated its Community Guidelines.

“These policies prohibit hate speech, harassment, and deceptive practices, including content that aims to mislead people about voting or videos that are technically manipulated or doctored in a way that misleads users (beyond clips taken out of context) and may pose a serious risk of egregious harm,” the company said.

The company also said it would remove content that contains hacked information, stating, “For example, videos that contain hacked information about a political candidate shared with the intent to interfere in an election.”

Similar to other platforms, YouTube also pledged to remove content encouraging users to interfere with the democratic process, citing an example as content “telling viewers to create long voting lines with the purpose of making it harder for others to vote.”

Some have expressed concern that YouTube (similar to Reddit) has not yet published a clear policy on how it will handle candidates claiming victory before the election is officially called.

Decker called YouTube’s policies “extremely reactive” overall.

“Oftentimes, they will apply keyword filters to prevent content from being found in search. YouTube’s biggest claim is they incorporate Wikipedia pages into knowledge panels, so if it’s a video about COVID-19, regardless of where it’s from, there would also be this knowledge panel or fact check above the description that points you toward accurate sources,” Decker said. The Wikipedia articles, while volunteer-edited, provide at least some context to content that would otherwise not have any.

“While the problem on YouTube is still bad, it’s now much less worse,” Decker said.

He also noted that YouTube’s three-strikes policy has been effective in booting a number of content creators off the platform, but an unintended consequence is that this has “led to the rise of fringe platforms.”

“It’s tricky because in one sense it does clean up the stream in the short term, so on the one hand it creates healthier conversations, but it moves them to another area of the internet, which is even more unregulated but there are even less dissenting views, so it’s a space where people can be radicalized,” he said.

Notably, YouTube announced in a company blogpost earlier this week it was taking new steps to curb hate by “removing more conspiracy content used to justify real-world violence.”

Specifically, YouTube cited QAnon as an example of an entity “that targets an individual or group with conspiracy theories that have been used to justify real-world violence.”

“As always, context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups may stay up,” the blogpost added. “We will begin enforcing this updated policy today, and will ramp up in the weeks to come.”

Reddit

In April 2020, Reddit announced that it was launching a subreddit dedicated to political transparency, which would list all political ad campaigns running on Reddit dating back to January of 2019. The company said this subreddit would give information on the individual advertiser, their targeting, impressions, and spend on a per-campaign basis. As an additional transparency measure, Reddit said it would require political advertisers to leave comments “on” for the first 24 hours of a campaign to enable them to “engage directly with users in the comments.”

While the political transparency subreddit contains significant details about political ads, it has a limited reach, with around 3,000 members since it was launched five months ago. It is also worth noting that Reddit doesn’t allow political ads outside the U.S.

In June 2020, Reddit also announced that it was banning a number of subreddits which it said violated company policies on hate speech. Included in these was r/The_Donald, a pro-Trump subreddit which was popular in the run-up to the 2016 election but which had been largely inactive for months despite its nearly 800,000 members. Members of this subreddit had already migrated to another platform the year before, in response to stricter content rules and increased moderation. The banning of this subreddit and others was indicative of the problems facing social media platforms, where measures to combat hate speech or misinformation do not keep pace with the dissemination of such material.

Reddit, however, noticeably has no stated policy on candidates claiming victory in the election before it is authoritatively called.

TikTok

While the Chinese-owned video-sharing app has avoided the level of misinformation scrutiny leveled at platforms like Facebook and Twitter, it has taken a number of actions in recent months to show that it is taking a stand ahead of the election.

Along with banning political ads, in August TikTok also announced a suite of new measures to combat misinformation ahead of the 2020 presidential election. Crucially, it banned manipulated media which it said “misleads users by distorting the truth of events in a way that could cause harm.” This included deepfakes, synthetic media produced by artificial intelligence that has the appearance of being real.

Despite these measures, political content on TikTok, as on all social media platforms, is extremely popular. Videos containing the hashtag #Trump2020 had been viewed 10.3 billion times as of September 2020, according to data on the app. A report from the Wall Street Journal late last year claimed that the Trump campaign had reached out to TikTok accounts with large supportive followings, including some with the Trump 2020 flag in their videos.

Source: ABC News



Trump could cash out his DJT stock within weeks. Here’s what happens if he sells


Former President Donald Trump is on the brink of a significant financial decision that could have far-reaching implications for both his personal wealth and the future of his fledgling social media company, Trump Media & Technology Group (TMTG). As the lockup period on his shares in TMTG, which owns Truth Social, nears its end, Trump could soon be free to sell his substantial stake in the company. However, the potential payday, which makes up a large portion of his net worth, comes with considerable risks for Trump and his supporters.

Trump’s stake in TMTG comprises nearly 59% of the company, amounting to 114,750,000 shares. As of now, this holding is valued at approximately $2.6 billion. These shares are currently under a lockup agreement, a common feature of initial public offerings (IPOs), designed to prevent company insiders from immediately selling their shares and potentially destabilizing the stock. The lockup, which began after TMTG’s merger with a special purpose acquisition company (SPAC), is set to expire on September 25, though it could end earlier if certain conditions are met.

Should Trump decide to sell his shares after the lockup expires, the market could respond in unpredictable ways. The sale of a substantial number of shares by a major stakeholder like Trump could flood the market, potentially driving down the stock price. Daniel Bradley, a finance professor at the University of South Florida, suggests that the market might react negatively to such a large sale, particularly if there aren’t enough buyers to absorb the supply. This could lead to a sharp decline in the stock’s value, impacting both Trump’s personal wealth and the company’s market standing.

Moreover, Trump’s involvement in Truth Social has been a key driver of investor interest. The platform, marketed as a free speech alternative to mainstream social media, has attracted a loyal user base largely due to Trump’s presence. If Trump were to sell his stake, it might signal a lack of confidence in the company, potentially shaking investor confidence and further depressing the stock price.

Trump’s decision is also influenced by his ongoing legal battles, which have already cost him over $100 million in legal fees. Selling his shares could provide a significant financial boost, helping him cover these mounting expenses. However, this move could also have political ramifications, especially as he continues his bid for the Republican nomination in the 2024 presidential race.

Trump Media’s success is closely tied to Trump’s political fortunes. The company’s stock has shown volatility in response to developments in the presidential race, with Trump’s chances of winning having a direct impact on the stock’s value. If Trump sells his stake, it could be interpreted as a lack of confidence in his own political future, potentially undermining both his campaign and the company’s prospects.

Truth Social, the flagship product of TMTG, has faced challenges in generating traffic and advertising revenue, especially compared to established social media giants like X (formerly Twitter) and Facebook. Despite this, the company’s valuation has remained high, fueled by investor speculation on Trump’s political future. If Trump remains in the race and manages to secure the presidency, the value of his shares could increase. Conversely, any missteps on the campaign trail could have the opposite effect, further destabilizing the stock.

As the lockup period comes to an end, Trump faces a critical decision that could shape the future of both his personal finances and Truth Social. Whether he chooses to hold onto his shares or cash out, the outcome will likely have significant consequences for the company, its investors, and Trump’s political aspirations.



Arizona man accused of social media threats to Trump is arrested


Cochise County, AZ — Law enforcement officials in Arizona have apprehended Ronald Lee Syvrud, a 66-year-old resident of Cochise County, after a manhunt was launched following alleged death threats he made against former President Donald Trump. The threats reportedly surfaced in social media posts over the past two weeks, as Trump visited the US-Mexico border in Cochise County on Thursday.

Syvrud, who hails from Benson, Arizona, located about 50 miles southeast of Tucson, was captured by the Cochise County Sheriff’s Office on Thursday afternoon. The Sheriff’s Office confirmed his arrest, stating, “This subject has been taken into custody without incident.”

In addition to the alleged threats against Trump, Syvrud is wanted for multiple offenses, including failure to register as a sex offender. He also faces several warrants in both Wisconsin and Arizona, including charges for driving under the influence and a felony hit-and-run.

The timing of the arrest coincided with Trump’s visit to Cochise County, where he toured the US-Mexico border. During his visit, Trump addressed the ongoing border issues and criticized his political rival, Democratic presidential nominee Kamala Harris, for what he described as lax immigration policies. When asked by reporters about the ongoing manhunt for Syvrud, Trump responded, “No, I have not heard that, but I am not that surprised and the reason is because I want to do things that are very bad for the bad guys.”

This incident marks the latest in a series of threats against political figures during the current election cycle. Just earlier this month, a 66-year-old Virginia man was arrested on suspicion of making death threats against Vice President Kamala Harris and other public officials.


Trump Media & Technology Group Faces Declining Stock Amid Financial Struggles and Increased Competition



Trump Media & Technology Group’s stock has taken a significant hit, dropping more than 11% this week following a disappointing earnings report and the return of former U.S. President Donald Trump to the rival social media platform X, formerly known as Twitter. This decline is part of a broader downward trend for the parent company of Truth Social, with the stock plummeting nearly 43% since mid-July. Despite the sharp decline, some investors remain unfazed, expressing continued optimism for the company’s financial future or standing by their investment as a show of political support for Trump.

One such investor, Todd Schlanger, an interior designer from West Palm Beach, explained his commitment to the stock, stating, “I’m a Republican, so I supported him. When I found out about the stock, I got involved because I support the company and believe in free speech.” Schlanger, who owns around 1,000 shares, is a regular user of Truth Social and is excited about the company’s future, particularly its plans to expand its streaming services. He believes Truth Social has the potential to be as strong as Facebook or X, despite the stock’s recent struggles.

However, Truth Social’s stock performance is deeply tied to Trump’s political influence and the company’s ability to generate sustainable revenue, which has proven challenging. An earnings report released last Friday showed the company lost over $16 million in the three-month period ending in June. Revenue dropped by 30%, down to approximately $836,000 compared to $1.2 million during the same period last year.

In response to the earnings report, Truth Social CEO Devin Nunes emphasized the company’s strong cash position, highlighting $344 million in cash reserves and no debt. He also reiterated the company’s commitment to free speech, stating, “From the beginning, it was our intention to make Truth Social an impenetrable beachhead of free speech, and by taking extraordinary steps to minimize our reliance on Big Tech, that is exactly what we are doing.”

Despite these assurances, investors reacted negatively to the quarterly report, leading to a steep drop in stock price. The situation was further complicated by Trump’s return to X, where he posted for the first time in a year. Trump’s exclusivity agreement with Trump Media & Technology Group mandates that he posts personal content first on Truth Social. However, he is allowed to make politically related posts on other social media platforms, which he did earlier this week, potentially drawing users away from Truth Social.

For investors like Teri Lynn Roberson, who purchased shares near the company’s peak after it went public in March, the decline in stock value has been disheartening. However, Roberson remains unbothered by the poor performance, saying her investment was more about supporting Trump than making money. “I’m way at a loss, but I am OK with that. I am just watching it for fun,” Roberson said, adding that she sees Trump’s return to X as a positive move that could expand his reach beyond Truth Social’s “echo chamber.”

The stock’s performance holds significant financial implications for Trump himself, as he owns a 65% stake in Trump Media & Technology Group. According to Fortune, this stake represents a substantial portion of his net worth, which could be vulnerable if the company continues to struggle financially.

Analysts have described Truth Social as a “meme stock,” similar to companies like GameStop and AMC that saw their stock prices driven by ideological investments rather than business fundamentals. Tyler Richey, an analyst at Sevens Report Research, noted that the stock has ebbed and flowed based on sentiment toward Trump. He pointed out that the recent decline coincided with the rise of U.S. Vice President Kamala Harris as the Democratic presidential nominee, which may have dampened perceptions of Trump’s 2024 election prospects.

Jay Ritter, a finance professor at the University of Florida, offered a grim long-term outlook for Truth Social, suggesting that the stock would likely remain volatile, but with an overall downward trend. “What’s lacking for the true believer in the company story is, ‘OK, where is the business strategy that will be generating revenue?'” Ritter said, highlighting the company’s struggle to produce a sustainable business model.

Still, for some investors, like Michael Rogers, a masonry company owner in North Carolina, their support for Trump Media & Technology Group is unwavering. Rogers, who owns over 10,000 shares, said he invested in the company both as a show of support for Trump and because of his belief in the company’s financial future. Despite concerns about the company’s revenue challenges, Rogers expressed confidence in the business, stating, “I’m in it for the long haul.”

Not all investors are as confident. Mitchell Standley, who made a significant return on his investment earlier this year by capitalizing on the hype surrounding Trump Media’s planned merger with Digital World Acquisition Corporation, has since moved on. “It was basically just a pump and dump,” Standley told ABC News. “I knew that once they merged, all of his supporters were going to dump a bunch of money into it and buy it up.” Now, Standley is staying away from the company, citing the lack of business fundamentals as the reason for his exit.

Truth Social’s future remains uncertain as it continues to struggle with financial losses and faces stiff competition from established social media platforms. While its user base and investor sentiment are bolstered by Trump’s political following, the company’s long-term viability will depend on its ability to create a sustainable revenue stream and maintain relevance in a crowded digital landscape.

As the company seeks to stabilize, the question remains whether its appeal to Trump’s supporters can translate into financial success or whether it will remain a volatile stock driven more by ideology than business fundamentals.
