Social media and telecom companies vague about their response to January 6 committee – CNN

(CNN) A wide range of telecommunications and social media companies are still grappling with how to respond, if at all, to a request from the House Select Committee investigating the January 6 attack on the Capitol to preserve the records of several hundred people who could play a role in its investigation.

The uncertainty around how they will respond comes against the backdrop of what is expected to be a protracted legal battle once the committee begins the process of formally requesting that records be turned over as part of its investigation. The likelihood of litigation increased when House Minority Leader Kevin McCarthy and several of his fellow Republican members cried foul over the committee’s request.
An official at one of the companies that received a request from the committee told CNN that McCarthy’s warning last week was interpreted as a “shot across the bow” for phone providers in particular. Still, many of the companies have indicated they intend to work with the committee, but their responses were overwhelmingly vague about what that would entail.
In addition to the records-preservation requests, Thursday was the deadline for 15 social media companies, many of which were also on the preservation list, to turn over a range of records related to company policies on extremism, misinformation and foreign influence. Thursday also marked the deadline for various government agencies to comply with the committee’s requests.
A spokesperson for the select panel said in a statement Thursday night that “with several hours to go before today’s deadline, the Select Committee has received thousands of pages of documents in response to our first set of requests and our investigative team is actively engaged to keep that flow of information going.”
“These records supplement the material we’ve received from other House Committees related to their earlier probes of January 6th. The Select Committee is also aware that the National Archives has undertaken the process required by law for review of presidential records,” the statement continued.
A spokesperson for the National Archives told CNN it had received the request from the committee and “will respond to it in accordance with the Presidential Records Act.”
The preservation requests sent to 35 companies did not ask the companies to turn over any records, only to preserve them in the event the committee’s investigation leads it to request that they be handed over. In its letters to the companies, the committee went to great pains to point out that the request should not be interpreted to mean the named individuals are targets of the investigation or accused of any wrongdoing.
“Inclusion of any individual name on the list should not be viewed as indicative of any wrongdoing by that person or others,” the letters read. “The document identifies individuals who may have relevant information to aid the factfinding of the Select Committee.”
The committee’s request to the 35 companies specifically asked them to contact the panel if, for any reason, they were unable to comply.
The complex and extensive requests, coupled with the unique nature of the committee’s work, seem to have left many of these companies in a difficult position. CNN reached out to all 35 companies to see how they plan to respond. Most did not respond at all, and the ones that did offered diplomatic answers that gave little insight into how they plan to comply.
“We strongly condemn the violence that took place on Jan 6 at the U.S. Capitol,” said Clint Smith, Chief Legal Officer for Discord, in a statement. “We have been contacted by the House Select Committee and intend to cooperate fully as appropriate.”
While Smith made clear that Discord, an instant messaging and digital distribution platform, wants to cooperate with the investigation, the company would not say to what extent it plans to comply.
Discord was not alone. Far larger companies like Facebook and Google also chose not to go into detail about their work with the select committee.
“We have received the Select Committee’s letter and are committed to working with Congress on this. The events of January 6 were unprecedented and tragic, and Google and YouTube strongly condemn them,” said a spokesperson for the company. “We’re committed to protecting our platforms from abuse, including by rigorously enforcing our policies for content related to the events of January 6.”
Meanwhile, Facebook acknowledged only that it had received the committee’s request, not how it planned to act. “We have received the request and look forward to continuing to work with the committee,” said a company spokesperson. The spokesperson referred CNN to the committee when asked what specifically the company turned over.
The same can be said for Zoho, an online office suite provider, which told CNN it had no comment beyond confirming it received the committee’s request.
While the companies that responded seemed reluctant to provide many specifics about their plans, few went out of their way to challenge the committee’s authority. Rocket.Chat, an integrated messaging platform, told CNN it planned to do all it could to help the committee’s work.
“Rocket.Chat has always complied with such requests and has kept a close relationship with the authorities to communicate/share anything possible to help in these types of cases,” said Sana Javid, a spokesperson for Rocket.Chat.
But some companies took a more defiant tone, in part because their businesses are located overseas and because the services they provide make it impossible for them to supply everything the committee is requesting. Proton, a Switzerland-based encrypted email provider, told CNN it won’t comply unless compelled by the Swiss government.
“Our use of zero-access encryption means that we do not have access to the message content being requested,” said a company spokesperson. “Proton only complies with legally binding orders that are initiated or approved by Swiss authorities and therefore meet Swiss legal standards.”
The social network Gab, known as a platform widely used by the alt-right and white supremacists, publicly posted a point-by-point response to the committee’s request for information. The company claimed it did not have much of the information the committee had requested. Furthermore, Gab told the committee it responds only to requests from law enforcement, and only when compelled by subpoena, arguing that the Stored Communications Act prevented it from providing what the committee was asking for.
The major phone providers, like AT&T, Verizon and T-Mobile, which some consider to be among the most consequential for lawmakers who were named as part of the request, overwhelmingly did not respond to CNN’s questions about their plans to comply with the committee.
Part of why the companies seem unwilling to publicly reveal how they plan to respond is likely because the issue is almost certainly headed to court.
“I think it’s very unlikely that any of the companies are just going to produce the documents without somebody going to court,” said Justin Antonipillai, an expert on data privacy and the former Acting Undersecretary for Economic Affairs at the Commerce Department during the Obama Administration.
Republicans have already gone out of their way to suggest the requests are inappropriate. House Minority Leader Kevin McCarthy, whose name CNN has reported is on the list of people the committee is interested in, suggested that companies that comply could be in violation of the law.
Antonipillai said that he expects most companies will preserve the records to be safe, but won’t turn those records over until the matter is settled in a court of law.
“You can already see that the congressional Republicans are laying the foundation to say that the congressional committee has no authority to issue the subpoenas, and then they will argue that it’s overbroad and that the amount of data that’s being collected is unnecessary,” Antonipillai said. However, he said courts have historically given committees like these wide latitude to exercise their subpoena power.
“I think it’s going to have broad leeway and if history holds true, courts are going to give this congressional committee a pretty wide berth to go in and pull in the records that they’re asking for,” he added.
The committee has accused McCarthy of attempting to intimidate the companies so they will slow-walk their compliance because of fears of legal repercussions.
“The Select Committee is investigating the violent attack on the Capitol and attempt to overturn the results of last year’s election. We’ve asked companies not to destroy records that may help answer questions for the American people,” committee spokesman Tim Mulvey said in a statement to CNN. “The committee’s efforts won’t be deterred by those who want to whitewash or cover up the events of January 6th, or obstruct our investigation.”
Despite McCarthy’s interference, Antonipillai said, it is unlikely the minority leader will be charged with obstruction of any kind.
“I think it’s really unlikely that it would rise to the level of an obstruction of justice or obstruction of an investigation, just to send a letter like this,” he said referring to McCarthy’s statement about the committee’s request to preserve records. “I think if it escalated or they did something outside of the normal channel maybe, but I don’t see this rising to the level of that.”
This story has been updated with a statement from the House select committee.


Social Media Has the Same Downsides As Alcohol – The Atlantic

Last year, researchers at Instagram published disturbing findings from an internal study on the app’s effect on young women. “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” the authors wrote in a presentation obtained by The Wall Street Journal. “They often feel ‘addicted’ and know that what they’re seeing is bad for their mental health but feel unable to stop themselves.”

This was not a new revelation. For years, Facebook, which owns Instagram, has investigated the app’s effects on its users, and it kept getting the same result. “We make body image issues worse for one in three teen girls,” said one slide from a 2019 presentation. “Teens who struggle with mental health say Instagram makes it worse.”

The findings weren’t all negative. Although many teenagers reported that Instagram was compulsive and depressing, most teenagers who acknowledged this dark side said they still thought the app was enjoyable and useful.

So a fair summary of Instagram according to Instagram might go like this: Here is a fun product that millions of people seem to love; that is unwholesome in large doses; that makes a sizable minority feel more anxious, more depressed, and worse about their bodies; and that many people struggle to use in moderation.

What does that sound like to you? To me, it sounds like alcohol—a social lubricant that can be delightful but also depressing, a popular experience that blends short-term euphoria with long-term regret, a product that leads to painful and even addictive behavior among a significant minority. Like booze, social media seems to offer an intoxicating cocktail of dopamine, disorientation, and, for some, dependency. Call it “attention alcohol.”

I personally don’t spend much time on Instagram, but on reflection I love Twitter in much the way I love wine and whiskey. Other analogies fall short; some people liken social media to junk food, but ultra-processed snacks have few redeeming qualities compared with just about every natural alternative. I have a more complicated relationship with Twitter. It makes my life better and more interesting. It connects me with writers and thinkers whom I would never otherwise reach. But some days, my attention will get caught in the slipstream of gotchas, dunks, and nonsense controversies, and I’ll feel deeply regretful about the way I spent my time … only to open the app again, several minutes later, when the pinch of regret has relaxed and my thumb reaches, without thought, toward a familiar blue icon on my phone.

For the past decade, writers have been trying to jam Facebook into various analogical boxes. Facebook is like a global railroad; or, no, it’s like a town square; or, perhaps, it’s like a transnational government; or, rather, it’s an electric grid, or a newspaper, or cable TV.

Each of these gets at something real. Facebook’s ability to connect previously unconnected groups of people to information and commerce really does make it like a 21st-century railroad. The fact that hundreds of millions of people get their news from Facebook makes it very much like a global newspaper. But none of these metaphors completely captures the full berserk mosaic of Facebook or other social-media platforms. In particular, none of them touches on what social media does to the minds of the young people who use it the most.

“People compare social media to nicotine,” Andrew Bosworth, a longtime Facebook executive, wrote in an extensive 2019 memo on the company’s internal network. “I find that wildly offensive, not to me but to addicts.” He went on:

I have seen family members struggle with alcoholism and classmates struggle with opioids. I know there is a battle for the terminology of addiction but I side firmly with the neuroscientists. Still, while Facebook may not be nicotine I think it is probably like sugar. Sugar is delicious and for most of us there is a special place for it in our lives. But like all things it benefits from moderation.

But in 2020, Facebook critics weren’t the ones comparing its offerings to addiction-forming chemicals. The company’s own users told its research team that its products were akin to a mildly addictive depressant.

If you disbelieve these self-reports, perhaps you’ll be persuaded by the prodigious amounts of outside research suggesting the same conclusion. In June, researchers from NYU, Stanford, and Microsoft published a paper with a title that made their position on the matter unambiguous: “Digital Addiction.” In closing, they reported that “self-control problems cause 31 percent of social media use.” Think about that: About one in three minutes spent on social media is time we neither hoped to use beforehand nor feel good about in retrospect.

Facebook acknowledges these problems. In a response to the Wall Street Journal exposé published on Tuesday, Karina Newton, the head of public policy at Instagram, stood by the company’s research. “Many find it helpful one day, and problematic the next,” she wrote. “Many said Instagram makes things better or has no effect, but some, particularly those who were already feeling down, said Instagram may make things worse.” But this self-knowledge hasn’t translated into sufficient reform.

Thinking of social media as attention alcohol can guide reform efforts. We have a kind of social infrastructure around alcohol, which we don’t have yet for social media. The need to limit consumption is evident in our marketing: Beer ads encourage people to drink responsibly. It’s in our institutions: Established organizations such as Alcoholics Anonymous are devoted to fighting addiction and abuse. It’s in our regulatory and economic policy: Alcohol is taxed at higher rates than other food and drink, and its interstate distribution has separate rules. There is also a legal age limit. (Instagram requires its users to be 13 years old, although, as it goes with buying alcohol, many users of the photo-sharing app are surely lying about their age.)

Perhaps most important, people have developed a common vocabulary around alcohol use: “Who’s driving tonight?”; “He needs to be cut off”; “She needs some water”; “I went too hard this weekend”; “I might need help.” These phrases are so familiar that it can take a second to recognize that they communicate actual knowledge about what alcohol is and what it does to our bodies. We’ve been consuming booze for several thousand years and have studied the compound’s specific chemical effects on the liver and bloodstream. Social media, by contrast, has been around for less than two decades, and we’re still trying to understand exactly what it’s doing, to whom, and by what mechanism.

We might be getting closer to an answer. A 124-page literature review compiled by Jonathan Haidt, an NYU professor, and Jean Twenge, a San Diego State University professor, finds that the negative effects of social media are highly concentrated among young people, and teen girls in particular. Developmental research tells us that teenagers are exquisitely sensitive to social influence, or to the opinions of other teens. One thing that social media might do is hijack this keen peer sensitivity and drive obsessive thinking about body image, status, and popularity. Instagram seems to create, for some teenage girls, a suffocating prestige economy that pays people in kudos for their appearance and presentation. The negative externality is dangerously high rates of anxiety.

How do we fix it? We should learn from alcohol, which is studied, labeled, taxed, and restricted. Similar strictures would discourage social-media abuse among teenagers. We should continue to study exactly how and for whom these apps are psychologically ruinous and respond directly to the consensus reached by that research. Governments should urge or require companies to build more in-app tools to discourage overuse. Instagram and other app makers should strongly consider raising their minimum age for getting an account and preventing young users from presenting fake birthdates. Finally, and most broadly, parents, teens, and the press should continue to build a common vocabulary and set of rules around the dangers of excess social media for its most vulnerable users.

Digital sabbaths are currently the subject of columns and confessionals. That’s a good start, but this stuff should be sewn into our everyday language: “No apps this weekend”; “I need to be cut off”; “I love you, but I think you need to take a break”; “Can you help me stay offline?” These reforms should begin with Facebook. But with social media, as with every other legal, compulsive product, the responsibility of moderation ends with the users.


Media Availability: Minister Haggie Available to Media to Discuss Emergency Services – News Releases – Government of Newfoundland and Labrador


The Honourable John Haggie, Minister of Health and Community Services, will hold a media availability today (Thursday, September 16) to discuss emergency services following a meeting with NAPE.

The availability will take place in the Media Centre, East Block, Confederation Building, at 2:15 p.m. Media covering the availability are asked to attend in-person.

The availability will be live-streamed on the Government of Newfoundland and Labrador’s Facebook and Twitter accounts and on YouTube.

-30-

Media contacts
Nancy Hollett
Health and Community Services
709-729-6554/327-7878
nancyhollett@gov.nl.ca

2021 09 16
12:45 pm


The Growing Tensions Between Digital Media Platforms and Copyright Enforcement – AAF – American Action Forum

Executive Summary

  • Tensions over copyright infringement between digital “new media” platforms and traditional media are at an all-time high.
  • Pressure from copyright holders combined with aggressive infringement-flagging algorithms and significant penalties under current regulations push platforms to take down content—often before infringement has been proven.
  • While there are legitimate concerns regarding copyright infringement online, current regulation incentivizes over-blocking content in order to avoid fines; this tactic is alienating content creators and limiting free speech and innovation.
  • Moreover, recent reform proposals aim to increase platform liability; this will make platforms even more cautious, exacerbating current problems and seriously limiting the content that has made these platforms a novel means of entertainment.

Introduction

Digital media or “new media” platforms that host user-generated videos, such as YouTube or Vimeo, and livestreams, such as Twitch, YouTube Gaming, and Facebook Gaming, are gaining a bigger role in the entertainment industry. This trend accelerated during the coronavirus pandemic, with viewership climbing to 27.9 billion hours watched in 2020, an all-time high. While most of the livestreaming platforms initially focused on gaming content, their offerings have expanded to include podcasters, DJs, musicians, and traditional sports. For example, Twitch is now the official streaming partner of USA Basketball and hosted the Spain broadcast of the biggest South American soccer tournament.

As these platforms grow, the attention and level of scrutiny grows as well. One of the most prominent criticisms is that the platforms are failing to properly address copyright infringement on their websites. Record labels and movie studios complain that these platforms are not doing a good enough job protecting their intellectual property rights. Yet on the other side, content creators and their fans complain that overly restrictive application of copyright regulations severely limits content that should constitute “fair use” of copyrighted material.

The “fair use” doctrine and the Digital Millennium Copyright Act (DMCA) are at the center of this debate. The DMCA, the most important law governing copyrighted work on the internet, aims to prevent the unauthorized access and copying of copyrighted works, which usually require authorization from the copyright holder. The exception is “fair use,” the reproduction of copyrighted materials for “criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research.” Fair use is key to the development of an online entertainment industry, as it allows content creators on these platforms to reproduce materials to create original content such as parodies, commentary, reviews, or live reactions.

Copyright Enforcement Is Increasingly Burdensome for Platforms and Creators

Platforms bear the responsibility of enforcing the fair use doctrine. Under the DMCA, they can face fines of over $150,000 per instance of copyright infringement. According to a public statement by Twitch, the number of copyright claims on its platform increased from fewer than 50 per year to more than 1,000 a week. That volume can translate into multimillion-dollar fines if a platform’s moderation is deemed inadequate.

This has pushed platforms toward preemptively taking down content or sanctioning streamers as soon as they receive a DMCA claim, and letting content creators appeal after the sanction. For platforms, it is more cost-effective to review appeals carefully over a longer period, since they are not bound to any specific timeline for appeals as they are with DMCA claims. The number of appeals will certainly be lower than the number of claims, and in the case of a mistake, the potential revenue loss for the platform will almost certainly be lower than a potential fine for a DMCA violation.

Platforms have also moved toward automation as a mechanism to respond to DMCA claims in a timely and cost-effective manner. By using automated systems and algorithms, platforms forgo the need for human review, which tends to be costly and slow. While on-demand platforms such as YouTube have used algorithmic systems for around 14 years, livestreaming platforms have increasingly begun to implement similar systems in order to quickly remove or mute a potentially copyright-infringing livestream.

While automation has been beneficial in terms of response time, its increased application has presented multiple issues. Chief among them is a lack of accuracy: fair use content or original material can be incorrectly flagged. This is a common problem, as automated systems lack any comprehension of context and can be triggered by as little as three seconds of reproduced audio or video. This lack of context has also led to the sanctioning of content where copyrighted music was played unintentionally, such as a video capturing loud music from a passing car or a store speaker.
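
To make the over-blocking mechanics concrete, the sketch below shows roughly how a context-blind duration threshold behaves. Everything in it (the function name, the segment representation, the hard three-second cutoff) is an illustrative assumption rather than any platform's documented system; production matchers work on perceptual audio fingerprints, not raw durations, but the structural weakness is the same: the rule never asks why the audio matched.

```python
# Minimal sketch of a duration-threshold flagging rule. All names and the
# three-second cutoff are illustrative assumptions, not any platform's
# actual system.

MATCH_THRESHOLD_SECONDS = 3.0  # assumed trigger length, per the article

def should_flag(matched_segments):
    """Flag an upload if the total audio matched against a copyrighted
    reference meets the threshold, regardless of context.

    matched_segments: list of (start, end) times in seconds where the
    upload's audio matched a reference track.
    """
    matched_total = sum(end - start for start, end in matched_segments)
    # Note what is missing: nothing here asks *why* the audio matched.
    # A car stereo in the background trips the rule as easily as a re-upload.
    return matched_total >= MATCH_THRESHOLD_SECONDS

# Four seconds of a passing car's stereo is enough to flag the whole video:
print(should_flag([(12.0, 16.0)]))  # True
```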

Another common problem with automated systems is that they are vulnerable to exploitation. For example, there are cases of law enforcement officers playing copyrighted music to prevent civilians’ recordings from being uploaded to these platforms. Another example is the weaponization of DMCA claims, where a user flags content as a violation of copyright with the intention of censoring or negatively impacting a content creator, rather than as a legitimate claim over copyright infringement. In fact, it has become common for content creators to be extorted by ill-intentioned individuals who threaten a copyright claim unless they are paid a certain amount of money.

The combination of caution, automation, and preemptive takedowns reflects the rising burden of moderating copyright infringement. An example of this is the introduction of the three-strike system, in which content creators are banned from posting after receiving three copyright claims. Beyond threatening content creators, this practice threatens the platforms themselves, as they risk alienating the creators whose content makes them appealing to the viewers and advertisers that supply their revenue.
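
Part of what makes a three-strike rule so blunt is how little machinery it requires. The sketch below is a hypothetical illustration, not any platform's implementation; real systems add strike expiration and appeal states, but the core bookkeeping is about this simple:

```python
from collections import defaultdict

class StrikeTracker:
    """Hypothetical three-strike bookkeeping: three claims, appealed or
    not, and the account is banned."""

    MAX_STRIKES = 3

    def __init__(self):
        self.strikes = defaultdict(int)  # creator id -> claim count

    def record_claim(self, creator_id: str) -> bool:
        """Record one copyright claim; return True if the creator is now banned."""
        self.strikes[creator_id] += 1
        return self.strikes[creator_id] >= self.MAX_STRIKES

tracker = StrikeTracker()
results = [tracker.record_claim("creator_42") for _ in range(3)]
print(results)  # [False, False, True]: the third claim bans the account
```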

Proposed Changes to the DMCA Will Make the Issue Worse

Current proposals to update the DMCA and copyright enforcement regulations seek to increase platforms’ legal liability, which could make this situation worse. Senator Thom Tillis has led efforts to pass legislation for more stringent copyright enforcement, both reforming the “notice and takedown” system in the DMCA and increasing the legal consequences of copyright infringement. The Protecting Lawful Streaming Act and the Copyright Alternative in Small-Claims Enforcement (CASE) Act, both included in last year’s appropriations bill, introduced major tweaks to copyright enforcement. The CASE Act created a small-claims copyright tribunal with the objective of speeding up the dispute process for copyright cases under $30,000. The Protecting Lawful Streaming Act, meanwhile, targets commercial websites dedicated exclusively to illegally streaming copyrighted content, making that conduct a felony rather than a misdemeanor.

Sen. Tillis has also said he hopes to introduce legislation that would increase platform liability as moderators; this would require the platforms to establish a system that prevents the re-upload of copyrighted content previously taken down. This change would replace the current “notice and takedown,” where platforms are bound to remove content after it has been flagged as a copyright violation, with a “notice and stay-down” system. Such a system would compel platforms to take a more proactive and strict approach, in which they must review and approve content before it is posted, rather than after the fact. Advocates of this system claim it is the best mechanism to prevent the reposting of infringing content, as platforms will be forced to moderate at an earlier stage, allowing them to prevent rather than react.
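
The practical difference between the two regimes is where the check sits in the pipeline. The sketch below is a simplified illustration under a strong assumption, namely that "the same content" can be recognized by an exact hash; a real stay-down system would need perceptual fingerprinting, since trivial re-encoding changes a file's bytes. Under notice and takedown, content goes live and is removed on complaint; under notice and stay-down, every upload is screened against past takedowns before publication:

```python
import hashlib

# Hedged sketch contrasting the two regimes. The exact-hash "fingerprint"
# is a simplifying assumption: real stay-down systems need perceptual
# fingerprints, since re-encoding a video changes its bytes.

takedown_registry = set()  # fingerprints of content already removed once

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def handle_dmca_notice(content: bytes) -> None:
    """Notice and takedown: the content went live first; a complaint
    triggers removal and records the fingerprint."""
    takedown_registry.add(fingerprint(content))
    # ...remove the live copy, notify the uploader, open the appeal window...

def can_publish(content: bytes) -> bool:
    """Notice and stay-down: every new upload is screened against the
    registry *before* it is published."""
    return fingerprint(content) not in takedown_registry

clip = b"...video bytes..."
handle_dmca_notice(clip)   # the first upload is removed on complaint
print(can_publish(clip))   # False: an identical re-upload never goes live
```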

Yet this approach could further stifle creativity and innovation on these platforms. Increasing platforms’ potential liability would push them toward an even more precautionary approach, in which they would likely over-block content to reduce potential legal exposure. Under this higher burden, platforms would have to review and approve all content before it is published. To do so in a timely manner, they would need to rely even more heavily on automation, so that creators could still post content while platforms remained compliant with the regulation. While this could prevent some cases of copyright infringement, it would do so at a cost to consumers, content creators, and platforms. Consumers would be further deprived of content, and content creators would face higher barriers to entering a booming market, potentially pushing them out of it. This would severely hinder the platforms’ value proposition and content diversity, stunting their growth.

Better Principles for Potential DMCA Reform

To maintain the growth of the new-media platform industry, policymakers should focus on updating and expanding the definition of fair use so that its application on these platforms is clearer. With clearer fair use guidelines, creators and platforms can more easily moderate potentially infringing content. More important, the definition of fair use must be broadened to include newer uses, such as video game streaming or movie and music reviews. Adopting a broad, technology-neutral definition of fair use is vital for promoting an open internet, which hosts these novel forms of entertainment. This would give platforms a clearer roadmap to focus on combating piracy and meaningful copyright violations.

While some platforms, such as Facebook Gaming, have been able to strike licensing deals with major record labels to use their music in streams, such agreements usually require hefty fees that only a few platforms can afford. Under the DMCA, copyright holders hold the greater leverage in these negotiations, since any licensing fee has to offset what they could expect to earn by pursuing compensation under the DMCA.

Policymakers and regulators also ought to understand the nuances of content moderation. When formulating moderation strategies, platforms face continuous tradeoffs: relying on human review tends to increase accuracy but sacrifices timeliness and raises costs, while relying on automated systems increases timeliness and reduces costs at the expense of over-blocking, misreporting, and vulnerability to exploitation. While adding a human backstop could help remedy this, the pressure of fines and time-to-takedown constraints pushes platforms to prioritize timeliness over accuracy.

These challenges are magnified in livestreaming platforms, where responding to copyright infringement should ideally happen in real-time. Yet such immediate responses require significant additional resources to detect, analyze, take down, and notify the streamer of the infringement. This can be an extremely difficult task for platforms, considering livestreams can last for multiple hours and the threshold for what is considered infringement can be as low as three seconds.

Conclusion

New media platforms, or platforms that host livestreaming and video content, have shown tremendous growth as a new form of entertainment, evolving from a niche audience to mainstream users. Nonetheless, this growth could be severely hindered by the platforms’ growing conflicts with current copyright regulation. Increasing pressure from copyright holders and the threat of onerous fines under the DMCA have pushed platforms to implement automated systems that take down materials flagged as infringing. The technical limitations of algorithmic systems have created a problem of over-blocking, in which creativity and innovation are stifled and content creators’ right to fair use can be trampled, pushing them off the platforms. Reform must be fair to copyright holders, content creators, and new media platforms alike. Rather than simply piling on more regulation, policymakers and regulators should strive to make fair use policies clearer and more workable, and shift the burden of proof to copyright holders claiming harm, instead of forcing content creators to prove themselves innocent.
