
Chinese social media censors hashtags after coronavirus whistleblower doctor dies – Global News

The death of one of China’s first coronavirus whistleblowers has prompted grief, fear and anger from many people in the country.

The virus, which was first detected in the city of Wuhan, has claimed the lives of hundreds of people and spread to more than 20 countries around the world.

People in China are angry with the government’s handling of the outbreak, and many took to social media with the hashtag #IWantFreedomOfSpeech, which has since been censored by Weibo, the Chinese equivalent of Twitter, CNBC reports.

According to Reuters, the hashtag returned many search hits on Thursday night but none by Friday, after it was censored.



The late Li Wenliang first alerted a group of doctors on social media after seeing seven cases of the virus in December. He and seven other doctors, the publication says, were taken into custody by Wuhan police in January.


They were accused of spreading “illegal and false information.”

Li later contracted the virus himself and died at 2:58 a.m. at Wuhan Central Hospital, the hospital announced on Weibo.

He was 34 years old.

China’s surveillance apparatus appears to have ramped up since the outbreak of the virus, which has killed around 600 people and infected more than 30,000, according to CBS News.






Video: Plane carrying Canadians evacuating China’s coronavirus outbreak lands at CFB Trenton (2:20)

A man from Hangzhou returned home after a business trip and was immediately contacted by the police, who had been tracking his whereabouts via his licence plate and noticed he had visited Wenzhou, which has seen a spate of virus cases.

After being told to stay indoors for two weeks, he left his home, and the police and his boss contacted him when facial recognition technology spotted him by Hangzhou’s West Lake.

“I was a bit shocked by the ability and efficiency of the mass surveillance network. They can basically trace our movements with the AI technology and big data at any time and any place,” the man, who asked not to be identified, told Reuters.

The outbreak has prompted the Chinese government to put its sophisticated electronic surveillance system to use.




Mobile apps can tell users if they’ve been on a flight or train with a known coronavirus carrier. Maps even show locations of buildings where infected people live, Reuters reports.

For now, most people seem willing to accept the intrusion if it helps combat the international health emergency.

Some are concerned that the government is suppressing the full extent of the outbreak, as it was accused of doing during the Severe Acute Respiratory Syndrome (SARS) outbreak in 2003.

— With files from Reuters

meaghan.wray@globalnews.ca



Alex Jones Begs The Question, What's More Expensive For Media: Lies, Or The Truth? – Forbes

We are having a Goldilocks moment in American media. We simply can’t decide how much truth we really want. Too many lies can lead to expensive lawsuits, as Alex Jones proved this week with a $49M verdict for spreading misinformation about Sandy Hook. Too much truth, on the other hand, can lead to lawsuits by special interests afraid to have infamous stories made famous, as hundreds of writers have learned (including this author).

Is a healthy midpoint actually possible, or would the news simply lose all meaning? Or can we as consumers bend the news back towards the truth? Jones’ story should be a clarion call for both the right and the left to demand more truth from their storytellers. The future of democracy depends on it.

Too Many Lies

Last week, Alex Jones was caught lying about the Sandy Hook massacre. On December 14, 2012, 26 people were killed by a mass shooter, including 20 children between six and seven years old. Before families could even take a moment to grieve their profound loss, Jones had already gone on air to deny the mass shooting, asking “why does government stage these things, to get our guns,” and referring to grieving parents as “crisis actors.”

It’s hard to know whether his motivation was earning money, but that was certainly an outcome. InfoWars was already a relatively successful media business, with 4 million unique views a month in 2010 and estimated revenues of $10M a year in 2013. By 2018 he had 10 million unique views a month, more than mainstream publications like Newsweek and The Economist. During the trial, Jones’ businesses were estimated to be collectively worth somewhere between $135M and $270M.

His bread and butter is what is sometimes politely referred to as “conspiracy theories,” a term that implies that stories like PizzaGate could actually be true. But a theory can ultimately be scientifically tested, and those peddled by Alex Jones have come up false. He is often described as spreading “misinformation,” a sanitized way of saying “lying.” His site is aptly named InfoWars—it provides information at war with the truth.

Given the strength of defamation law in this country protecting people from damaging speech, the parents were ultimately awarded $49M by a Texas court (an amount they may not fully receive, given Texas’s state limits on damages). With more rulings likely to come this year from states like Connecticut, which have no such cap, that figure is likely to rise substantially and send a very strong message to those who aim to manipulate people’s understanding of the truth for political gain.

Many have suggested that his motivation in the case of Sandy Hook was not in fact money, but a desire to push back against gun control efforts. It’s reasonable that in the wake of mass shootings, communities start to think long and hard about greater gun control, and that those who believe more guns are good for America (like some Republicans proposing to further arm teachers) are troubled to see what they view as an infringement on their rights. But even that debate can happen on the basis of truth—that mass shooters exist, that they are 98% men, and that children have died in these shootings, causing immeasurable devastation to their families.

The Truth Hurts, but Its Negation Hurts More

Most parents at some point will suffer their very own version of an info war, as their children deliberately twist facts for personal gain. Jones’s actions are not unlike those of a child who breaks a beloved porcelain vase and quickly blames their younger sibling. The parents are not mad necessarily about the vase—they are mad about being lied to, and the lack of empathy that implies.

But children generally grow out of such deflection and blame games, whereas Jones apparently did not. It wasn’t just a vase that broke—the lives and hearts of parents were cracked open not just once in the initial mass shooting, but countless times as Jones’s followers harassed them and negated their truth.

I believe Jones knew he was inflicting real harm—like watching his little sibling get spanked for that broken vase—and took no action to stop it. What’s upsetting about his actions is that in conceding on the stand that Sandy Hook was “100% real,” he effectively acknowledged that he had been lying to his listeners, episode after episode.

InfoWars viewers should be outraged. Jones treated them as ignorant pawns ripe for his political objectives. The right deserves to hear conservative perspectives based on the truth. And so does the left. Fighting fair means starting from the same playing field, which in the court of ideas has to be objective fact. As Scarlett Lewis, a mourning parent who lost her son at Sandy Hook, noted from the stand in her testimony, “Truth — truth is so vital to our world. Truth is what we base our reality on, and we have to agree on that to have a civil society.”

When the Truth Hurts Someone with Power, It’s Extra Costly

On the other hand, the truth can be costly, too. Corporations have increasingly learned that suing people in media for sharing the truth about the impact of their business practices can be an incredibly effective way of getting such whistleblowers to stop—simply because the targets cannot keep up with the legal costs, regardless of whether their statements are true.

In 2019 I was personally sued by private prison company CoreCivic, in the wake of the family separation crisis, for saying that prisons and immigrant detention centers separate families. On a simply mechanical basis, when one family member goes behind bars for any reason and their child or mother or husband is no longer with them, it seems like a prudent use of the English language to refer to this family as “separated.” Claiming otherwise negates the suffering of these detained parents who deeply missed their children, in the same way that Jones attempted to negate the suffering of the Sandy Hook parents.

The Business and Human Rights Centre has referred to this CoreCivic lawsuit as a SLAPP suit, a Strategic Lawsuit Against Public Participation. It further defines SLAPPs as “one tactic used by unscrupulous business actors to stop people raising concerns about their practices. SLAPPs can take the form of criminal or civil lawsuits brought to intimidate, bankrupt and silence critics.” This is just one of 355 lawsuits the group identified globally in a 2021 report, including suits by companies like Chevron, Unilever and Walmart, targeting both writers and activists. And as newsrooms shrink internationally, it becomes harder for investigative journalism not only to thrive but to afford the level of legal protection required to tell hard truths. And yet, if those truths go untold, we lose our ability to shape the world we all want to live in.

Do we need $150M lawsuits to determine the truth? Or can we simply ask for more from media?

Let’s face it—no one likes a lawsuit. Certainly not grieving parents. “It seems so incredible to me that we have to do this — that we have to implore you, to punish you — to get you to stop lying,” Lewis told Jones from the stand. “You don’t understand, and you won’t understand unless there is some form of punishment that would make you understand.”

There is hope that the hefty price Jones will pay will discourage others who seek to benefit, whether monetarily or politically, from lies. But it’s a cautionary tale that simply shouldn’t be necessary. All of us, whether on the right or the left, can become more conscious consumers before spreading information. We can let conspiracy theories die on the vine rather than fueling them with likes and shares. The average person is unlikely to sue, but we can still take responsibility for the information we spread. We can punish such liars by marginalizing them. And we can commit to protecting those who dare to tell the truth.

Full disclosures related to my work available here. This post does not constitute investment, tax, or legal advice, and the author is not responsible for any actions taken based on the information provided herein. Certain information referenced in this article is provided via third-party sources and while such information is believed to be reliable, the author and Candide Group assume no responsibility for such information.

CoreCivic filed a lawsuit in March of 2020 against author Morgan Simon and her firm Candide Group, claiming that certain of her prior statements on Forbes.com regarding their involvement in family detention and lobbying activities are “defamatory.” While we won dismissal of the case in November of 2020, CoreCivic has appealed such that the lawsuit is still active.



Social media is polluting society. Moderation alone won’t fix the problem – MIT Technology Review

We all want to be able to speak our minds online—to be heard by our friends and talk (back) to our opponents. At the same time, we don’t want to be exposed to speech that is inappropriate or crosses a line. Technology companies address this conundrum by setting standards for free speech, a practice protected under federal law. They hire in-house moderators to examine individual pieces of content and remove them if posts violate predefined rules set by the platforms.

The approach clearly has problems: harassment, misinformation about topics like public health, and false descriptions of legitimate elections run rampant. But even if content moderation were implemented perfectly, it would still miss a whole host of issues that are often portrayed  as moderation problems but really are not. To address those non-speech issues, we need a new strategy: treat social media companies as potential polluters of the social fabric, and directly measure and mitigate the effects their choices have on human populations. That means establishing a policy framework—perhaps through something akin to an Environmental Protection Agency or Food and Drug Administration for social media—that can be used to identify and evaluate the societal harms generated by these platforms. If those harms persist, that group could be endowed with the ability to enforce those policies. But to transcend the limitations of content moderation, such regulation would have to be motivated by clear evidence and be able to have a demonstrable impact on the problems it purports to solve.

Moderation (whether automated or human) can potentially work for what we call “acute” harms: those caused directly by individual pieces of content. But we need this new approach because there are also a host of “structural” problems—issues such as discrimination, reductions in mental health, and declining civic trust—that manifest in broad ways across the product rather than through any individual piece of content. A famous example of this kind of structural issue is Facebook’s 2012 “emotional contagion” experiment, which showed that users’ affect (their mood as measured by their behavior on the platform) shifted measurably depending on which version of the product they were exposed to. 

In the blowback that ensued after the results became public, Facebook (now Meta) ended this type of deliberate experimentation. But just because they stopped measuring such effects does not mean product decisions don’t continue to have them.

Structural problems are direct outcomes of product choices. Product managers at technology companies like Facebook, YouTube, and TikTok are incentivized to focus overwhelmingly on maximizing time and engagement on the platforms. And experimentation is still very much alive there: almost every product change is deployed to small test audiences via randomized controlled trials. To assess progress, companies implement rigorous management processes to foster their central missions (known as Objectives and Key Results, or OKRs), even using these outcomes to determine bonuses and promotions. The responsibility for addressing the consequences of product decisions is often placed on other teams that are usually downstream and have less authority to address root causes. Those teams are generally capable of responding to acute harms—but often cannot address problems caused by the products themselves.

With attention and focus, this same product development structure could be turned to the question of societal harms. Consider Frances Haugen’s congressional testimony last year, along with media revelations about Facebook’s alleged impact on the mental health of teens. Facebook responded to criticism by explaining that it had studied whether teens felt that the product had a negative effect on their mental health and whether that perception caused them to use the product less, and not whether the product actually had a detrimental effect. While the response may have addressed that particular controversy, it illustrated that a study aiming directly at the question of mental health—rather than its impact on user engagement—would not be a big stretch. 

Incorporating evaluations of systemic harm won’t be easy. We would have to sort out what we can actually measure rigorously and systematically, what we would require of companies, and what issues to prioritize in any such assessments. 

Companies could implement protocols themselves, but their financial interests too often run counter to meaningful limitations on product development and growth. That reality is a standard case for regulation that operates on behalf of the public. Whether through a new legal mandate from the Federal Trade Commission or harm mitigation guidelines from a new governmental agency, the regulator’s job would be to work with technology companies’ product development teams to design implementable protocols measurable during the course of product development to assess meaningful signals of harm. 

That approach may sound cumbersome, but adding these types of protocols should be straightforward for the largest companies (the only ones to which regulation should apply), because they have already built randomized controlled trials into their development process to measure their efficacy. The more time-consuming and complex part would be defining the standards; the actual execution of the testing would not require regulatory participation at all. It would only require asking diagnostic questions alongside normal growth-related questions and then making that data accessible to external reviewers. Our forthcoming paper at the 2022 ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization will explain this procedure in more detail and outline how it could effectively be established.

When products that reach tens of millions are tested for their ability to boost engagement, companies would need to ensure that those products—at least in aggregate—also abide by a “don’t make the problem worse” principle. Over time, more aggressive standards could be established to roll back existing effects of already-approved products.
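
To make that principle concrete, here is a minimal sketch, in Python, of what such a check could look like inside an A/B-testing pipeline of the kind these companies already run. Everything here is illustrative rather than any platform’s actual protocol: the `ArmResults` structure, the `harm_check` function, the margin, and the sample scores are all hypothetical. The idea is simply that, alongside the usual engagement metrics, each test arm also reports a pre-registered well-being measure, and a treatment ships only if that measure is not meaningfully worse than the control’s.

```python
# Illustrative sketch (hypothetical, not any platform's real pipeline):
# a "don't make the problem worse" gate for an A/B test. Each arm collects
# a diagnostic well-being score in addition to its engagement metrics.
from dataclasses import dataclass
from statistics import NormalDist, mean, stdev


@dataclass
class ArmResults:
    name: str
    wellbeing_scores: list[float]  # per-user diagnostic scores (higher = better)


def harm_check(control: ArmResults, treatment: ArmResults,
               margin: float = 0.05, alpha: float = 0.05) -> bool:
    """Non-inferiority check: the treatment's mean well-being score must not
    fall more than `margin` below the control's, with one-sided (1 - alpha)
    confidence."""
    n_c = len(control.wellbeing_scores)
    n_t = len(treatment.wellbeing_scores)
    diff = mean(treatment.wellbeing_scores) - mean(control.wellbeing_scores)
    # Welch-style standard error of the difference in means
    se = (stdev(control.wellbeing_scores) ** 2 / n_c
          + stdev(treatment.wellbeing_scores) ** 2 / n_t) ** 0.5
    # One-sided lower confidence bound on (treatment - control)
    lower_bound = diff - NormalDist().inv_cdf(1 - alpha) * se
    return lower_bound > -margin  # True means "does not make the problem worse"


# Hypothetical usage: scores might come from a short mood survey shown to each arm.
control = ArmResults("current_feed", [3.8, 4.1, 3.9, 4.0, 3.7, 4.2, 3.9, 4.0])
treatment = ArmResults("new_ranking", [3.9, 3.8, 4.0, 4.1, 3.6, 4.0, 3.9, 4.1])
print("ships?", harm_check(control, treatment, margin=0.2))
```

A real protocol would of course require agreed measurement instruments (such as the survey methods discussed below), pre-registered margins, and access to the resulting data for external reviewers.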

There are many methods that might be suitable for this type of process. These include protocols like the photographic affect meter, which has been used diagnostically to assess how exposure to products and services affects mood. Technology platforms are already using surveys to assess product changes; according to reporters Cecilia Kang and Sheera Frenkel, Mark Zuckerberg looks at survey-based growth metrics for almost every product decision, and those results were part of his choice to roll back the “nicer” version of Facebook’s news feed algorithm after the 2020 election.

It would be reasonable to ask whether the technology industry sees this approach as feasible and whether companies would fight against it. While any potential regulation might engender such a response, we have received positive feedback from early conversations about this framework—perhaps because under our approach, most product decisions would pass muster. (Causing measurable harms of the sort described here is a very high bar, one that most product choices would clear.) And unlike other proposals, this strategy sidesteps direct regulation of speech, at least outside the most extreme cases.

At the same time, we don’t have to wait for regulators to take action. Companies could readily implement these procedures on their own. Establishing the case for change, however, is difficult without first starting to collect the sort of high-quality data we’re describing here. That is because one cannot prove the existence of these types of harms without real-time measurement, creating a chicken-and-egg challenge. Proactively monitoring structural harms won’t resolve platforms’ content issues. But it could allow us to meaningfully and continuously verify whether the public interest is being subverted. 

The US Environmental Protection Agency is an apt analogy. The original purpose of the agency was not to legislate environmental policy, but to enact standards and protocols so that policies with actionable outcomes could be made. From that point of view, the EPA’s lasting impact was not to resolve environmental policy debates (it hasn’t), but to make them possible. Likewise, the first step for fixing social media is to create the infrastructure that we’ll need in order to examine outcomes in speech, mental well-being, and civic trust in real time. Without that, we will be prevented from addressing many of the most pressing problems these platforms create.

Nathaniel Lubin is a fellow at the Digital Life Initiative at Cornell Tech and the former director of the Office of Digital Strategy at the White House under President Barack Obama. Thomas Krendl Gilbert is a postdoctoral fellow at Cornell Tech and received an interdisciplinary PhD in machine ethics and epistemology at UC Berkeley.


Cineplex Digital Media Selected by Primaris REIT for New In-Mall Digital Media and Directory Signage Network – Canada NewsWire

Digital Signage Solutions Planned for 19 Shopping Centres Across Canada including Dufferin Mall in Toronto and Orchard Park in Kelowna

TORONTO, Aug. 9, 2022 /CNW/ – (TSX: CGX) – Today, Cineplex Digital Media (CDM), a division of Cineplex, announced that it has been selected to develop, install, and maintain a state-of-the-art digital signage network in Primaris REIT (Primaris) managed shopping centres in markets across Canada, including Toronto, Calgary, and Kelowna. CDM was selected for its extensive experience in the creation and management of innovative digital networks as well as its ability to offer a solution that includes revenue generation, content development, and advertising media sales through Cineplex Media.

As part of the partnership, CDM will operate a network of nearly 70 digital displays at 19 Primaris owned and managed retail properties in British Columbia, Alberta, Manitoba, Ontario, Quebec, and New Brunswick. Each property will receive a custom display solution consisting of large double-sided portrait screens for media advertising, mall directories, and maps. The new network of digital displays is expected to be fully deployed nationally this fall.

“We are thrilled that Primaris selected CDM for this exciting project. Our experience-first approach and data-driven audience targeting will enable Primaris to engage shoppers during their mall visits, as well as provide our media partners with the ability to reach even more of Canada’s mall consumers in additional key markets,” said Fab Stanghieri, Executive Vice President and Managing Director, Media, Cineplex. “Our shopping network that includes 69 centres with over 700 million visitors yearly, combined with our Primaris partnership, will now allow us to reach 13 new population centres with more than 1.4 million local residents.”

“Primaris’ ongoing commitment to our consumers and retail partners is exemplified through our continuous efforts to increase traffic to our shopping centres with memorable and meaningful experiences,” said Jasleen Bhinder, Director, Marketing, Primaris REIT. “We are excited to work with CDM’s strategic customer-centric and robust in-house team of experts and look forward to CDM’s strategy-focused programs including creative playlists, optimization, specialty tenant branding opportunities, and innovative technology solutions.”

As a one-stop digital signage solution provider, CDM offers end-to-end services that drive results. Making a name for itself in the Digital Out-of-Home, Retail, Financial, Grocery, and Quick Service Restaurant industries, CDM provides innovative, data-led digital signage network solutions for clients, including Primaris. CDM is not only about hanging screens; its industry leadership also stems from its expertise in creative and experience design, data & analytics services, installation and operational excellence, and media sales. CDM makes impressions that are worth more, do more, and deliver more. For more information on Cineplex Digital Media, visit CDMExperiences.com.

About Cineplex

Cineplex (TSX: CGX) is a top-tier Canadian brand that operates in the Film Entertainment and Content, Amusement and Leisure, and Media sectors. Cineplex offers a unique escape from the everyday to millions of guests through its circuit of over 170 movie theatres and location-based entertainment venues. In addition to being Canada’s largest and most innovative film exhibitor, the company operates Canada’s favourite destination for ‘Eats & Entertainment’ (The Rec Room) and complexes specially designed for teens and families (Playdium). It also operates successful businesses in digital commerce (CineplexStore.com), alternative programming (Cineplex Events), cinema media (Cineplex Media), digital place-based media (Cineplex Digital Media) and amusement solutions (Player One Amusement Group). Providing even more value for its guests, Cineplex is a joint venture partner in Scene+, Canada’s largest entertainment loyalty program.

Proudly recognized as having one of the country’s Most Admired Corporate Cultures, Cineplex employs over 10,000 people in its offices and venues across Canada and the United States. To learn more, visit Cineplex.com.

About Primaris

Primaris REIT (TSX: PMZ.UN) owns and manages 35 retail properties aggregating approximately 11.4 million square feet, at Primaris REIT’s ownership share valued at approximately $3.3B, including 22 enclosed shopping centres totaling approximately 9.8 million square feet and 13 unenclosed shopping centre and mixed-use properties aggregating approximately 1.6 million square feet. Primaris REIT is one of the largest owners and managers of enclosed shopping centres in Canada, and one of the largest owners and operators of retail property of all formats across Canada.

SOURCE Cineplex

For further information: Cineplex Media Relations: Judy Lung, Director, Communications, Cineplex, [email protected]; Primaris Media Relations: Jasleen Bhinder, Director, Marketing, Primaris REIT, [email protected]
