
It’s Time to Give Up on Ending Social Media’s Misinformation Problem


If you don’t trust social media, you should know you’re not alone. Most people surveyed around the world feel the same—in fact, they’ve been saying so for a decade. There is clearly a problem with misinformation and hazardous speech on platforms such as Facebook and X. And before the end of its term this year, the Supreme Court may redefine how that problem is treated.

Over the past few weeks, the Court has heard arguments in three cases that deal with controlling political speech and misinformation online. In the first two, heard last month, lawmakers in Texas and Florida claim that platforms such as Facebook are selectively removing political content that their moderators deem harmful or otherwise against their terms of service; tech companies have argued that they have the right to curate what their users see. Meanwhile, some policy makers believe that content moderation hasn’t gone far enough, and that misinformation still flows too easily through social networks; whether (and how) government officials can directly communicate with tech platforms about removing such content is at issue in the third case, which was put before the Court this week.

We’re Harvard economists who study social media and platform design. (One of us, Scott Duke Kominers, is also a research partner at the crypto arm of a16z, a venture-capital firm with investments in social platforms, and an adviser to Quora.) Our research offers a perhaps counterintuitive solution to disagreements about moderation: Platforms should give up on trying to prevent the spread of information that is simply false, and focus instead on preventing the spread of information that can be used to cause harm. These are related issues, but they’re not the same.

As the presidential election approaches, tech platforms are gearing up for a deluge of misinformation. Civil-society organizations say that platforms need a better plan to combat election misinformation, which some academics expect to reach new heights this year. Platforms say they have plans for keeping sites secure, yet despite the resources devoted to content moderation, fact-checking, and the like, it’s hard to escape the feeling that the tech titans are losing the fight.

Here is the issue: Platforms have the power to block, flag, or mute content that they judge to be false. But blocking or flagging something as false doesn’t necessarily stop users from believing it. Indeed, because many of the most pernicious lies are believed by those inclined to distrust the “establishment,” blocking or flagging false claims can even make things worse.

On December 19, 2020, then-President Donald Trump posted a now-infamous message about election fraud, telling readers to “be there,” in Washington, D.C., on January 6. If you visit that post on Facebook today, you’ll see a sober annotation from the platform itself that “the US has laws, procedures, and established institutions to ensure the integrity of our elections.” That disclaimer is sourced from the Bipartisan Policy Center. But does anyone seriously believe that the people storming the Capitol on January 6, and the many others who cheered them on, would be convinced that Joe Biden won just because the Bipartisan Policy Center told Facebook that everything was okay?

Our research shows that this problem is intrinsic: Unless a platform’s users trust the platform’s motivations and its process, any action by the platform can look like evidence of something it is not. To reach this conclusion, we built a mathematical model. In the model, one user (a “sender”) tries to make a claim to another user (a “receiver”). The claim might be true or false, harmful or not. Between the two users is a platform—or maybe an algorithm acting on its behalf—that can block the sender’s content if it wants to.

We wanted to find out when blocking content can improve outcomes, without a risk of making them worse. Our model, like all models, is an abstraction—and thus imperfectly captures the complexity of actual interactions. But because we wanted to consider all possible policies, not just those that have been tried in practice, our question couldn’t be answered by data alone. So we instead approached it using mathematical logic, treating the model as a kind of wind tunnel to test the effectiveness of different policies.

Our analysis shows that if users trust the platform to both know what’s right and do what’s right (and the platform truly does know what’s true and what isn’t), then the platform can successfully eliminate misinformation. The logic is simple: If users believe the platform is benevolent and all-knowing, then if something is blocked or flagged, it must be false, and if it is let through, it must be true.
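This blocked-implies-false logic can be sketched in a few lines of code. The simulation below is a deliberately simplified toy illustration, not the authors' actual model: a platform that reliably blocks false claims makes every claim that gets through fully credible, while an unmoderated feed leaves the receiver no better informed than their prior.

```python
import random

def simulate(trials=10_000, prior_true=0.5, platform_blocks_false=True, seed=0):
    """Estimate a receiver's posterior belief that a visible claim is true.

    Each trial, a sender posts a claim that is true with probability
    `prior_true`. A benevolent, all-knowing platform blocks every false
    claim; the receiver only updates on claims that get through.
    """
    rng = random.Random(seed)
    passed = passed_and_true = 0
    for _ in range(trials):
        is_true = rng.random() < prior_true
        blocked = platform_blocks_false and not is_true
        if not blocked:
            passed += 1
            passed_and_true += is_true
    # Fraction of visible claims that are true = receiver's posterior.
    return passed_and_true / passed

print(simulate())                              # trusted, all-knowing moderator
print(simulate(platform_blocks_false=False))   # no moderation: posterior stays near the prior
```

Under full trust, everything visible is true (posterior 1.0); with no moderation, seeing a claim conveys no information beyond the 50 percent prior. The interesting cases in the research — where users distrust the platform's motives or doubt its knowledge — fall between these extremes.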

You can see the problem, though: Many users don’t trust Big Tech platforms, as those previously mentioned surveys demonstrate. When users don’t trust a platform, even well-meaning attempts to make things better can make things worse. And when the platforms seem to be taking sides, that can add fuel to the very fire they are trying to put out.

Does this mean that content moderation is always counterproductive? Far from it. Our analysis also shows that moderation can be very effective when it blocks information that can be used to do something harmful.

Going back to Trump’s December 2020 post about election fraud, imagine that, instead of alerting users to the sober conclusions of the Bipartisan Policy Center, the platform had simply made it much harder for Trump to communicate the date (January 6) and place (Washington, D.C.) for supporters to gather. Blocking that information wouldn’t have prevented users from believing that the election was stolen—to the contrary, it might have fed claims that tech-sector elites were trying to influence the outcome. Nevertheless, making it harder to coordinate where and when to go might have helped slow the momentum of the eventual insurrection, thus limiting the post’s real-world harms.

Unlike removing misinformation per se, removing information that enables harm can work even if users don’t trust the platform’s motives at all. When it is the information itself that enables the harm, blocking that information blocks the harm as well. A similar logic extends to other kinds of harmful content, such as doxxing and hate speech. There, the content itself—not the beliefs it encourages—is the root of the harm, and platforms do indeed successfully moderate these types of content.

Do we want tech companies to decide what is and is not harmful? Maybe not; the challenges and downsides are clear. But platforms already routinely make judgments about harm—is a post calling for a gathering at a particular place and time that includes the word “violent” an incitement to violence, or an announcement of an outdoor concert? Clearly the latter if you’re planning to see the Violent Femmes. Often context and language make these judgments apparent enough that an algorithm can determine them. When that doesn’t happen, platforms can rely on internal experts or even independent bodies, such as Meta’s Oversight Board, which handles tricky cases related to the company’s content policies.

And if platforms accept our reasoning, they can divert resources from the misguided task of deciding what is true toward the still hard, but more pragmatic, task of determining what enables harm. Even though misinformation is a huge problem, it’s not one that platforms can solve. Platforms can help keep us safer by focusing on what content moderation can do, and giving up on what it can’t.

Scott Duke Kominers is the Sarofim-Rock Professor of Business Administration at Harvard Business School, a faculty affiliate in the Harvard Department of Economics, and a research partner at a16z crypto.
Jesse Shapiro is the George Gund Professor of Economics and Business Administration at Harvard and a research associate at the National Bureau of Economic Research.


What to stream this weekend: ‘Civil War,’ Snow Patrol, ‘How to Die Alone,’ ‘Tulsa King’ and ‘Uglies’


Hallmark launching a streaming service with two new original series, and Bill Skarsgård out for revenge in “Boy Kills World” are some of the new television, films, music and games headed to a device near you.

Also among the streaming offerings worth your time as selected by The Associated Press’ entertainment journalists: Alex Garland’s “Civil War” starring Kirsten Dunst, Natasha Rothwell’s heartfelt comedy for Hulu called “How to Die Alone” and the second season of Sylvester Stallone’s “Tulsa King.”

NEW MOVIES TO STREAM SEPT. 9-15

— Alex Garland’s “Civil War” is finally making its debut on MAX on Friday. The film stars Kirsten Dunst as a veteran photojournalist covering a violent war that’s divided America; she reluctantly allows an aspiring photographer, played by Cailee Spaeny, to tag along as she, an editor (Stephen McKinley Henderson) and a reporter (Wagner Moura) make the dangerous journey to Washington, D.C., to interview the president (Nick Offerman), a blustery, rising despot who has given himself a third term, taken to attacking his citizens and shut himself off from the press. In my review, I called it a bellowing and haunting experience, smart and thought-provoking, with great performances. It’s well worth a watch.

— Joey King stars in Netflix’s adaptation of Scott Westerfeld’s “Uglies,” about a future society in which everyone is required to have beautifying cosmetic surgery at age 16. In the film, directed by McG and streaming on Friday, King’s character inadvertently finds herself in the midst of an uprising against the status quo. “Outer Banks” star Chase Stokes plays King’s best friend.

— Bill Skarsgård is out for revenge against the woman (Famke Janssen) who killed his family in “Boy Kills World,” coming to Hulu on Friday. Moritz Mohr directed the ultra-violent film, of which Variety critic Owen Gleiberman wrote: “It’s a depraved vision, yet I got caught up in its kick-ass revenge-horror pizzazz, its disreputable commitment to what it was doing.”

AP Film Writer Lindsey Bahr

NEW MUSIC TO STREAM SEPT. 9-15

— The year was 2006. Snow Patrol, the Northern Irish-Scottish alternative rock band, released the album “Eyes Open,” producing the biggest hit of their career: “Chasing Cars.” A lot has happened in the time since — three, soon to be four, quality full-length albums, to be exact. On Friday, the band will release “The Forest Is the Path,” their first new album in seven years. Anthemic pop-rock is the name of the game across songs of love and loss, like “All,” “The Beginning” and “This Is the Sound Of Your Voice.”

— For fans of raucous guitar music, Jordan Peele’s 2022 sci-fi thriller, “NOPE,” provided a surprising, if tiny, thrill. One of the leads, Emerald “Em” Haywood, portrayed by Keke Palmer, rocks a Jesus Lizard shirt. (Also featured throughout the film: Rage Against the Machine, Wipers, Mr Bungle, Butthole Surfers and Earth band shirts.) The Austin noise-rock band is a less-than-obvious pick, having been signed to the legendary Touch and Go Records and having stopped releasing new albums in 1998. That changes on Friday the 13th, when “Rack” arrives. And for those curious: The Jesus Lizard’s intensity never went away.

AP Music Writer Maria Sherman

NEW SHOWS TO STREAM SEPT. 9-15

— Hallmark launched a streaming service called Hallmark+ on Tuesday with two new original series, the scripted drama “The Chicken Sisters” and unscripted series “Celebrations with Lacey Chabert.” If you’re a Hallmark holiday movies fan, you know Chabert. She’s starred in more than 30 of their films and many are holiday themed. Off camera, Chabert has a passion for throwing parties and entertaining. In “Celebrations,” deserving people are surprised with a bash in their honor — planned with Chabert’s help. “The Chicken Sisters” stars Schuyler Fisk, Wendie Malick and Lea Thompson in a show about employees at rival chicken restaurants in a small town. The eight-episode series is based on a novel of the same name.

— Natasha Rothwell of “Insecure” and “The White Lotus” fame created and stars in a new heartfelt comedy for Hulu called “How to Die Alone.” She plays Mel, a broke, go-along-to-get-along single airport employee who, after a near-death experience, makes the conscious decision to take risks and pursue her dreams. Rothwell has been working on the series for the past eight years and described it to The AP as “the most vulnerable piece of art I’ve ever put into the world.” Like Mel, Rothwell had to learn to bet on herself to make the show she wanted to make. “In the Venn diagram of me and Mel, there’s significant overlap,” said Rothwell. It premieres Friday on Hulu.

— Shailene Woodley, DeWanda Wise and Betty Gilpin star in a new drama for Starz called “Three Women,” about entrepreneur Sloane, homemaker Lina and student Maggie, who are each stepping into their power and making life-changing decisions. They’re interviewed by a writer named Gia (Woodley). The series is based on a 2019 best-selling book of the same name by Lisa Taddeo. “Three Women” premieres Friday on Starz.

— Sylvester Stallone’s second season of “Tulsa King” debuts Sunday on Paramount+. Stallone plays Dwight Manfredi, a mafia boss who was recently released from prison after serving 25 years. He’s sent to Tulsa to set up a new crime syndicate. The series is created by Taylor Sheridan of “Yellowstone” fame.

Alicia Rancilio

NEW VIDEO GAMES TO PLAY

— One thing about the title of Focus Entertainment’s Warhammer 40,000: Space Marine 2 — you know exactly what you’re in for. You are Demetrian Titus, a genetically enhanced brute sent into battle against the Tyranids, an insectoid species with an insatiable craving for human flesh. You have a rocket-powered suit of armor and an arsenal of ridiculous weapons like the “Chainsword,” the “Thunderhammer” and the “Melta Rifle,” so what could go wrong? Besides the squishy single-player mode, there are cooperative missions and six-vs.-six free-for-alls. You can suit up now on PlayStation 5, Xbox X/S or PC.

— Likewise, Wild Bastards isn’t exactly the kind of title that’s going to attract fans of, say, Animal Crossing. It’s another sci-fi shooter, but the protagonists are a gang of 13 varmints — aliens and androids included — who are on the run from the law. Each outlaw has a distinctive set of weapons and special powers: Sarge, for example, is a robot with horse genes, while Billy the Squid is … well, you get the idea. Australian studio Blue Manchu developed the 2019 cult hit Void Bastards, and this Wild-West-in-space spinoff has the same snarky humor and vibrant, neon-drenched cartoon look. Saddle up on PlayStation 5, Xbox X/S, Nintendo Switch or PC.

Lou Kesten


Trump could cash out his DJT stock within weeks. Here’s what happens if he sells


Former President Donald Trump is on the brink of a significant financial decision that could have far-reaching implications for both his personal wealth and the future of his fledgling social media company, Trump Media & Technology Group (TMTG). As the lockup period on his shares in TMTG, which owns Truth Social, nears its end, Trump could soon be free to sell his substantial stake in the company. However, the potential payday, which makes up a large portion of his net worth, comes with considerable risks for Trump and his supporters.

Trump’s stake in TMTG comprises nearly 59% of the company, amounting to 114,750,000 shares. As of now, this holding is valued at approximately $2.6 billion. These shares are currently under a lockup agreement, a common feature of initial public offerings (IPOs), designed to prevent company insiders from immediately selling their shares and potentially destabilizing the stock. The lockup, which began after TMTG’s merger with a special purpose acquisition company (SPAC), is set to expire on September 25, though it could end earlier if certain conditions are met.
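A quick back-of-the-envelope check of the figures above — using only the numbers reported in this article, which fluctuate with the share price — implies a per-share price in the low $20s and roughly 195 million shares outstanding:

```python
# Back-of-the-envelope check of the reported figures (illustrative only;
# market values change daily and these inputs are the article's, not live data).
trump_shares = 114_750_000   # Trump's reported stake, in shares
stake_value = 2.6e9          # reported value of that stake, in dollars
stake_fraction = 0.59        # the stake is reported as nearly 59% of the company

implied_price = stake_value / trump_shares           # dollars per share
implied_total_shares = trump_shares / stake_fraction # company-wide share count

print(f"Implied share price: ${implied_price:.2f}")
print(f"Implied shares outstanding: {implied_total_shares:,.0f}")
```

The exercise also shows why a sale matters so much: unloading a block this large relative to the total float is what raises the oversupply concern described below.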

Should Trump decide to sell his shares after the lockup expires, the market could respond in unpredictable ways. The sale of a substantial number of shares by a major stakeholder like Trump could flood the market, potentially driving down the stock price. Daniel Bradley, a finance professor at the University of South Florida, suggests that the market might react negatively to such a large sale, particularly if there aren’t enough buyers to absorb the supply. This could lead to a sharp decline in the stock’s value, impacting both Trump’s personal wealth and the company’s market standing.

Moreover, Trump’s involvement in Truth Social has been a key driver of investor interest. The platform, marketed as a free speech alternative to mainstream social media, has attracted a loyal user base largely due to Trump’s presence. If Trump were to sell his stake, it might signal a lack of confidence in the company, potentially shaking investor confidence and further depressing the stock price.

Trump’s decision is also influenced by his ongoing legal battles, which have already cost him over $100 million in legal fees. Selling his shares could provide a significant financial boost, helping him cover these mounting expenses. However, this move could also have political ramifications, especially as he continues his campaign in the 2024 presidential race.

Trump Media’s success is closely tied to Trump’s political fortunes. The company’s stock has shown volatility in response to developments in the presidential race, with Trump’s chances of winning having a direct impact on the stock’s value. If Trump sells his stake, it could be interpreted as a lack of confidence in his own political future, potentially undermining both his campaign and the company’s prospects.

Truth Social, the flagship product of TMTG, has faced challenges in generating traffic and advertising revenue, especially compared to established social media giants like X (formerly Twitter) and Facebook. Despite this, the company’s valuation has remained high, fueled by investor speculation on Trump’s political future. If Trump remains in the race and manages to secure the presidency, the value of his shares could increase. Conversely, any missteps on the campaign trail could have the opposite effect, further destabilizing the stock.

As the lockup period comes to an end, Trump faces a critical decision that could shape the future of both his personal finances and Truth Social. Whether he chooses to hold onto his shares or cash out, the outcome will likely have significant consequences for the company, its investors, and Trump’s political aspirations.


Arizona man accused of social media threats to Trump is arrested


Cochise County, AZ — Law enforcement officials in Arizona have apprehended Ronald Lee Syvrud, a 66-year-old resident of Cochise County, after a manhunt was launched following alleged death threats he made against former President Donald Trump. The threats reportedly surfaced in social media posts over the past two weeks; Trump visited the US-Mexico border in Cochise County on Thursday.

Syvrud, who hails from Benson, Arizona, located about 50 miles southeast of Tucson, was captured by the Cochise County Sheriff’s Office on Thursday afternoon. The Sheriff’s Office confirmed his arrest, stating, “This subject has been taken into custody without incident.”

In addition to the alleged threats against Trump, Syvrud is wanted for multiple offenses, including failure to register as a sex offender. He also faces several warrants in both Wisconsin and Arizona, including charges for driving under the influence and a felony hit-and-run.

The timing of the arrest coincided with Trump’s visit to Cochise County, where he toured the US-Mexico border. During his visit, Trump addressed the ongoing border issues and criticized his political rival, Democratic presidential nominee Kamala Harris, for what he described as lax immigration policies. When asked by reporters about the ongoing manhunt for Syvrud, Trump responded, “No, I have not heard that, but I am not that surprised and the reason is because I want to do things that are very bad for the bad guys.”

This incident marks the latest in a series of threats against political figures during the current election cycle. Just earlier this month, a 66-year-old Virginia man was arrested on suspicion of making death threats against Vice President Kamala Harris and other public officials.
