
Social Media Restrictions Cannot Keep Up with Hidden Codes and Symbols – Scientific American

Governments and social media platforms may try to restrict speech, but people still find crafty ways to communicate their ideas. Credit: Getty Images

On the same day that President Donald Trump announced his COVID-19 diagnosis, Twitter reminded users of its policy that “tweets that wish or hope for death, serious bodily harm or fatal disease against *anyone* are not allowed and will need to be removed.” The social media platform soon filled with posts accusing it of hypocrisy: threats targeting women and people of color have accumulated for years without removal, users said. But even as Twitter attempted to enforce its rules more stringently, thinly veiled posts slipped through the cracks.

By referencing schadenfreude, karma or the old adage “you reap what you sow” in connection with Trump’s illness, many Twitter users avoided explicitly violating the site’s abuse policy while leaving no doubt as to their intended sentiment. Scholars who study social media discourse say this incident reflects a broader phenomenon: Whenever online authorities (whether social media platforms or governments) attempt to restrict speech on the Internet, people will find creative ways to subvert the rules. These strategies can be used to spread abuse—or to preserve freedom of expression.

China’s government, for instance, prohibits social media posts referencing June 4, the anniversary of the 1989 Tiananmen Square crackdown that killed an unknown number of Chinese pro-democracy protesters and their sympathizers. In the early days of Chinese microblogging platform Sina Weibo, users sidestepped the ban by referring to “May 35” instead. “But eventually [the censors] caught up with that, and they banned that, too,” says Susan Herring, a linguist at Indiana University Bloomington who studies technology-mediated communication. “It’s a race to try to fool the censors…. The people are constantly coming up with creative new symbols.”

A woman wears a helmet adorned with an image of Pepe the frog that she hand-painted and a Trump/Pence sticker at a rally organized by the right-wing group Patriot Prayer in Vancouver, Washington, U.S. September 10, 2017. Credit: Sylvia Buchholz, Elijah Nouvelage Alamy

The use of symbolism to avoid censorship is as old as language itself. In the antebellum U.S., Harriet Tubman communicated with fugitives escaping slavery by singing songs with hidden meanings that their pursuers would not understand. When homosexuality was illegal in the 20th-century U.K., members of the gay subculture used a secret slang called Polari. Herring says the cycle of code making and code breaking “is a major driver of language change.” In French, for instance, slang words known as verlan are created by transposing the syllables of an existing word. Because many verlan terms originated as a secret code to discuss illicit behavior, the process was often repeated when a new form became too recognizable. In this way, femme (the standard word for “woman”) gave rise to meuf, which in turn became feumeu. Analogously, online coded symbols evolve as their previously secret meanings become well-known.

On the Internet, such symbols can manifest as words or visual motifs, including emoji, memes or other images. The specific form of a symbol ultimately matters less than the idea it represents, says Ryan Milner, who studies Internet culture at the College of Charleston. But he notes that visual symbols have proved particularly effective at evading censorship, thanks to their inherent ambiguity. And their power to spread rapidly online derives from their ability to establish an in-group and out-group. “As they get more esoteric, as they get more inside jokey, then there’s more and more of a signal that ‘if you get this, if you’re part of the joke—then you are one of us,’” Milner says, “and ‘if you don’t get it…, then you’re not one of us.’”

Pepe the Frog exemplifies the capacity of memes to both foment hate and defy oppression. The meme began life as an innocent cartoon amphibian, but as it grew in popularity, users of the online message boards 4chan, 8chan and Reddit began making anti-Semitic and racist versions of it. By 2016 Pepe the Frog had gained such widespread use among white supremacists in the U.S. that the Anti-Defamation League declared it a hate symbol. In 2019, however, pro-democracy protesters in Hong Kong—mostly unaware of Pepe’s associations with white supremacy—adopted the image to symbolize their movement. In both cases the meme’s coded meaning allowed it to spread uncensored, eventually garnering widespread recognition that elevated the public profile of the groups using it.

For many visual symbols, “there is this consistent push-pull” between recognizability and secrecy, Milner says. For groups aiming to persuade, “publicity is going to be good,” he adds. But for those seeking a clandestine “knowing wink,” too much attention can undercut a symbol’s effectiveness.

A protester in Hong Kong waves a Pepe the Frog flag. Credit: Miguel Candela Getty Images

The latter fate befell the “grass mud horse,” a fictional alpacalike creature whose name in Mandarin is a pun for an obscene and insulting phrase that sounds similar if the spoken tones are minimally altered. The grass mud horse originated in 2009 as an icon of resistance to China’s Internet censorship. At first people using it had plausible deniability—they could claim they were merely sharing a funny picture of an animal, says Sulafa Zidani, a global media studies scholar at the University of Southern California. But as the meme’s popularity exploded, its underlying meaning became common knowledge, thus diluting its power. In 2014 digital media scholar An Xiao Mina noted that the grass mud horse had been included among Sina Weibo’s official emoji—effectively transforming it from a symbol of subversion to a general-purpose obscenity.

In the U.S., 2020 has seen many “ideological contestations” over the meanings of coded symbols and phrases, Milner says. The extremist antigovernment “boogaloo” movement hijacked Hawaiian shirts. Followers of the QAnon conspiracy theory co-opted the phrase “save the children” from a legitimate antitrafficking campaign. The ambiguity of these symbols “absolutely” helps such groups evade censorship on social media, says Howard Graves, a senior research analyst at the Southern Poverty Law Center. Graves emphasizes that “context is crucial” when interpreting a symbol. For instance, in June fans of South Korean pop music flooded the hashtag #WhiteLivesMatter with memes and videos of K-pop stars to drown out racist posts using it. In October they repeated the tactic with the hashtag #ArmyForTrump.

Although coded language and symbols are nothing new, social media has transformed their impact on culture. “The ability to make a coordinated attempt … to claim a symbol as your own is something that you can do in a wide-scale way because of social media tools,” Milner says.

And whether or not those symbols incite violence or protest injustice, Zidani says, “we as individuals have a lot of power to amplify certain messages—and not amplify other messages.”



B.C. online harms bill on hold after deal with social media firms


The British Columbia government is putting its proposed online harms legislation on hold after reaching an agreement with some of the largest social media platforms to increase safety online.

Premier David Eby says in a joint statement with representatives of the firms Meta, TikTok, X and Snapchat that they will form an online safety action table, where they’ll discuss “tangible steps” toward protecting people from online harms.

Eby added that the proposed legislation has not been withdrawn and that the province will pass it into law if necessary.

“The agreement that we’ve struck with these companies is that we’re going to move quickly and effectively, and that we need meaningful results before the end of the term of this government, so that if it’s necessary for us to bring the bill back then we will,” Eby said Tuesday.


The province says the social media companies have agreed to work collaboratively with the province on preventing harm, while Meta will also commit to working with B.C.’s emergency management officials to help amplify official information during natural disasters and other events.

The announcement to put Bill 12, also known as the Public Health Accountability and Cost Recovery Act, on hold is a sharp turn for the government, after Eby announced in March that social media companies were among the “wrongdoers” that would pay for health-related costs linked to their platforms.

At the time, Eby compared social media harms to those caused by tobacco and opioids, saying the legislation was similar to previous laws that allowed the province to sue companies selling those products.

Premier David Eby is pictured with Ryan Cleland and Nicola Smith, parents of Carson Cleland, during a news conference announcing Bill 12. (Ben Nelms/CBC)

Eby said one of the key drivers for legislation targeting online harm was the death of Carson Cleland, the 12-year-old Prince George, B.C., boy who died by suicide last October after falling victim to online sextortion.

“In the real world we would never allow a company to set up a space for kids where grown adults could be invited in to contact them, encourage them to share photographs and then threaten to distribute those photographs to their family and friends,” Eby said when announcing the legislation.

The premier said previously that companies would be shut down and their owners would face jail terms if their products were connected to harms to young people.

In announcing the pause, the province says that bringing social media companies to the table for discussion achieves the same purpose of protecting youth from online harm.

“Our commitment to every parent is that we will do everything we can to keep their families safe online and in our communities,” said Eby.

Ryan Cleland, Carson’s father, said in a statement on Tuesday that he “has faith” in Eby and the decision to suspend the legislation.

“I don’t think he is looking at it from a political standpoint as much as he is looking at it as a dad,” he said of Eby. “I think getting the social media giants together to come up with a solution is a step in the right direction.”

Business groups were opposed

On Monday, the opposition B.C. United called for a pause to Bill 12, citing potential “serious legal and economic consequences for local businesses.”

Opposition Leader Kevin Falcon said in a statement that his party pushed Eby’s government to change course, noting the legislation’s vague language on who the province can sue “would have had severe unintended consequences” for local businesses and the economy.

“The government’s latest retreat is not only a win for the business community but for every British Columbian who values fairness and clarity in the law,” Falcon said.

B.C. United Leader Kevin Falcon says that Bill 12 could have had unintended consequences. (Chad Hipolito/The Canadian Press)

The Greater Vancouver Board of Trade said it is pleased to see the legislation put on hold, given the “potential ramifications” of the proposal’s “expansive interpretation.”

“We hope that the government chooses not to pursue Bill 12 in the future,” said board president and CEO Bridgitte Anderson in a statement. “Instead, we would welcome the opportunity to work with the government to develop measures that are well-targeted and effective, ensuring they protect British Columbians without causing unintended consequences.”


Trump poised to clinch US$1.3-billion social media company stock award


Donald Trump is set to secure on Tuesday a stock bonus worth US$1.3-billion from the company that operates his social media app Truth Social (DJT-Q), equivalent to about half the majority stake he already owns in it, thanks to the wild rally in its shares.

The award will take the former U.S. president’s overall stake in the company, Trump Media & Technology Group (TMTG), to US$4.1-billion.

While Mr. Trump has agreed not to sell any of his TMTG shares before September, the windfall represents a significant boost to his wealth, which Forbes pegs at US$4.7-billion.

Unlike much of his real estate empire, shares are easy to divest in the stock market and could come in handy as Mr. Trump’s legal fees and fines pile up, including a US$454.2-million judgment in his New York civil fraud case he is appealing.


The bonus also reflects the exuberant trading in TMTG’s shares, which have been on a roller coaster ride since the company listed on Nasdaq last month through a merger with a special purpose acquisition company (SPAC) and was snapped up by Trump supporters and speculators.

Mr. Trump will be entitled to the stock bonus under the terms of the SPAC deal once TMTG’s shares stay above US$17.50 for 20 trading days after the company’s March 26 listing. They ended trading on Monday at US$35.50, and they would have to lose more than half their value on Tuesday for Mr. Trump to miss out.

TMTG’s current valuation of approximately US$5-billion is equivalent to about 1,220 times the loss-making company’s revenue in 2023 of US$4.1-million.

No other U.S. company of similar market capitalization has such a high valuation multiple, LSEG data shows. This is despite TMTG warning investors in regulatory filings that its operational losses raise “substantial doubt” about its ability to remain in business.

A TMTG spokesperson declined to comment on the stock award to Mr. Trump. “With more than $200 million in the bank and zero debt, Trump Media is fulfilling all its obligations related to the merger and rapidly moving forward with its business plan,” the spokesperson said.

While Mr. Trump’s windfall is rich for a small, loss-making company like TMTG, the earnout structure that allows it is common. According to a report from law firm Freshfields Bruckhaus Deringer, stock earnouts for management were seen in more than half the SPAC mergers completed in 2022.

However, few executives clinch these earnout bonuses because many SPAC deals end up performing poorly in the stock market, said Freshfields securities lawyer Michael Levitt. TMTG’s case is rare because its shares are trading at levels decoupled from its business prospects.

“Many earnouts in SPACs are never satisfied because many SPAC prices fall significantly after the merger is completed,” Mr. Levitt said.

To be sure, TMTG made it easier for Mr. Trump to meet the earnout threshold. When TMTG agreed to merge with the SPAC in October, 2021, the deal envisioned that TMTG shares had to trade above US$30 for Mr. Trump to get the full earnout bonus. The two sides amended the deal in August, 2023 to lower that threshold to US$17.50, regulatory filings show.

Had that not happened, Mr. Trump would not have yet earned the full bonus because TMTG’s shares traded below US$30 last week. The terms of the deal, however, give Mr. Trump three years from the listing to win the full earnout, so he could have still earned it if the shares traded above the threshold for 20 days in any 30-day period during this time.


B.C. puts online harms bill on hold after agreement with social media companies


The B.C. government is putting its proposed online harms legislation on hold after reaching an agreement with some of the largest social media platforms to make people safer online.

Premier David Eby says in a joint statement with representatives of the firms Meta, TikTok, X and Snap that they will form an online safety action table, where they’ll discuss “tangible steps” towards protecting people from online harms.

Eby says the social media companies have “agreed to work collaboratively” with the province on preventing harm, while Meta will also commit to working with B.C.’s emergency management officials to help amplify official information during natural disasters and other events.


“We have had assurance from Facebook on a couple of things. First, that they will work with us to deliver emergency information to British Columbia in this wildfire season that (people) can rely on, they can find easily, and that will link into official government channels to distribute information quickly and effectively,” Eby said at a Tuesday press conference.

“This is a major step and I’m very appreciative that we are in this place now.”


The announcement to put the bill on hold is a sharp turn for the government, after Eby announced in March that social media companies were among the “wrongdoers” that would pay for health-related costs linked to their platforms.



At the time, Eby compared social media harms to those caused by tobacco and opioids, saying the legislation was similar to previous laws that allowed the province to sue companies selling those products.


Last August, Eby criticized Meta over its continued blackout of Canadian news outlets as wildfires forced thousands from their homes. He said it was “unacceptable” for the tech giant to cut off access to news on its platforms at a time when people needed timely, potentially life-saving information.

“I think it’s fair to say that I was very skeptical, following the initial contact (with Meta),” Eby said Tuesday.

Eby said one of the key drivers for legislation targeting online harm was the death of Carson Cleland, the 12-year-old Prince George, B.C., boy who died by suicide last October after falling victim to online sextortion.

The premier says in announcing the pause that bringing social media companies to the table for discussion achieves the same purpose of protecting youth from online harm.

“Our commitment to every parent is that we will do everything we can to keep their families safe online and in our communities,” the premier said in his statement.

 
