Canadians’ social media data not stored in country, study finds


By Pooja Rambaran

Social media platforms such as Facebook and Twitter transfer and store user data in a variety of jurisdictions outside of Canada, according to a recent discussion paper by the Cybersecure Policy Exchange (CPX) at Ryerson.

The study found that most social media privacy policies do not explicitly state the jurisdictions in which the personal data of their users are stored, processed and transferred. This means that “social media platforms can easily transfer personal data between various countries with little oversight or transparency,” the paper reads.

Yuan Stevens, co-author of the study, said the core belief of the paper is that “people in Canada deserve to have control and autonomy over their personal data as a critical aspect of cybersecurity.”

Stevens described personal data as anything that relates to someone as a specific, identifiable person.

Almost every major social media platform—including Facebook, Instagram, LinkedIn, Snapchat, Twitter and TikTok—has faced major security breaches in the last five years, according to the CPX report written by Stevens, Mohammed Masoodi and Sam Andrey.

In 2018, Cambridge Analytica, a data analytics company, was found to have improperly collected the personal data of millions of Facebook users. The paper states that of the 87 million affected users, more than 600,000 were Canadians.

As technology companies routinely undergo buy-outs, mergers and bankruptcies, the storage and protection of personal data may shift beyond the reach of Canadian regulation. “Malicious hackers can also take advantage of data stored in locations where the data are subject to weak data protection safeguards,” the paper states.

“Our data protection laws have historically given ample freedom to corporations to treat our personal data as they please with little legal oversight,” said Stevens.

The Personal Information Protection and Electronic Documents Act (PIPEDA) is responsible for protecting the personal data of social media users in Canada.

However, it does not prohibit companies from transferring data to third parties or other jurisdictions. When data is transferred to third parties, PIPEDA states that organizations should provide a level of protection comparable to what the data would have received had it remained within the company.

Yet the act does not define the term “comparable level of protection,” leaving its interpretation to the discretion of individual companies.

“The self-regulatory approach of PIPEDA fundamentally jeopardizes the security, privacy and protection of personal data for users of social media platforms,” the paper reads, adding that this data can be transferred to a variety of jurisdictions without the knowledge of Canadian social media users and with few restrictions under Canadian privacy law.

“People in Canada deserve to have control and autonomy over their personal data as a critical aspect of cybersecurity”

By contrast, the European Union’s (EU) General Data Protection Regulation (GDPR) requires organizations that collect the personal data of EU residents to comply with strict obligations, including legally binding corporate rules or clear consent for the transfer of data, the paper states.

Those who violate the privacy and security standards set by the GDPR are subject to harsh fines of up to 20 million euros, according to the GDPR website.

“In Europe, data protection is an extension of human rights, where the right to control your personal data…is a part of informational self-determination,” said Stevens. “But in Canada, our data protection laws ensure no such protection to people.”

The researchers found that some Canadians were mainly concerned with foreign government surveillance, primarily by China and the U.S. Others indicated a lack of trust in current Canadian institutions, believing that data stored in Canada could still be improperly surveilled or used, the paper states.

The authors of the paper suggested three policy changes the Canadian government could adopt to improve its current data protection laws: comparable protection, consent and special protections for sensitive personal data.

A recent survey by CPX found that 86 per cent of Canadians support policies to keep Canadians’ data within Canada.

Siya Joshi, a first-year computer science student, was previously unaware that Canadian laws allow companies to release user information across borders.

“I would like to know where any personal information I store on my accounts or anything I post is being used, whether that is worldwide or national,” said Joshi, adding that she agrees with the policy suggestions made in the paper.

“[Those] would ensure that I know what [information] is being sent, why and if I agree for it to be sent,” said Joshi.

The paper stated that there needs to be a more rigorous definition of the term “comparable level of protection” in PIPEDA.

When social media companies transfer the personal data of their users outside of Canada, there should be explicit and proactive oversight mechanisms for their privacy, according to Stevens.

“Like the EU, Canada could maintain a list of countries whose data protection laws are deemed sufficient for transfer,” said Stevens.

She added that companies could otherwise provide transfer agreements that demonstrate that the transfer location of the data is sufficient under Canada’s data protection laws.

In cases where the transfer location is not pre-approved and no transfer agreement exists, Stevens suggested that the data protection law should require social media companies to obtain explicit consent for the transfer of data.

This option would also require disclosure of the specific personal data to be transferred, the countries where the data could be stored and the other organizations involved in the process.

The final policy suggestion involves better protection of sensitive personal data such as individuals’ racial or ethnic origins, sex life, sexual orientation, political opinions, religious beliefs, as well as genetic and biometric data.

“[Canadian] laws merely say that more protection is needed when data is more sensitive, allowing social media companies to decide themselves whether highly-revealing personal data deserves certain treatments that better protect our privacy,” said Stevens.

Drawing on thoughts from Shoshana Zuboff, the Charles Edward Wilson Professor Emerita at Harvard Business School and author of The Age of Surveillance Capitalism, Stevens said that companies can collect, analyze and optimize users’ personal data as a form of raw material to predict and shape their behaviours in the name of economic freedom.

“A data protection law that explicitly seeks to enhance economic development will never sufficiently protect our individual and collective rights to informational self-determination as an extension of privacy, one of our fundamental freedoms in Canada,” said Stevens.


InvestorChannel's Media Watchlist Update for Wednesday, January 27, 2021, 16:00 EST – InvestorIntel


InvestorChannel’s Media Stocks Watchlist Update video includes the Top 5 Performers of the Day, and a performance review of the companies InvestorChannel is following in the sector.
Sources include: Yahoo Finance, AlphaVantage, FinnHub & CSE.
For more information, visit us at InvestorIntel.com or email us at info@investorintel.com

Watchlist Companies:
– Moovly Media Inc (MVY.V) CAD 0.23 (21.05%)
– Glacier Media Inc. (GVC.TO) CAD 0.43 (4.94%)
– ZoomerMedia Limited (ZUM.V) CAD 0.11 (0.00%)
– GVIC Communications Corp. (GCT.TO) CAD 0.27 (0.00%)
– WOW! Unlimited Media Inc (WOW.V) CAD 0.57 (0.00%)
– Media Central Corp Inc (FLYY.CN) 0.02 (0.00%)
– Quizam Media Corp (QQ.CN) 0.37 (0.00%)
– Thunderbird Entertainment Group Inc (TBRD.V) CAD 3.00 (0.00%)
– Wix.com Ltd (WIX) USD 244.50 (-0.86%)
– Zoom Video Communications Inc (ZM) USD 370.74 (-0.97%)
– Slack Technologies Inc (WORK) USD 42.05 (-1.71%)
– Corus Entertainment Inc. (CJR-B.TO) CAD 4.84 (-2.42%)
– MediaValet Inc (MVP.V) CAD 2.70 (-2.88%)
– Postmedia Network Canada Corp (PNC-A.TO) CAD 1.50 (-3.23%)
– Stingray Group Inc (RAY-A.TO) CAD 7.68 (-3.40%)
– Adobe Inc. (ADBE) USD 460.00 (-3.42%)
– HubSpot Inc (HUBS) USD 349.99 (-6.01%)
– Network Media Group Inc (NTE.V) CAD 0.16 (-6.06%)
– Lingo Media Corp (LM.V) CAD 0.08 (-6.25%)
– QYOU Media Inc (QYOU.V) CAD 0.22 (-8.51%)


Social media companies should face new legal duty to 'act responsibly,' expert panel finds – The Tri-City News

Social media companies can’t be trusted to moderate themselves, so it falls to the government to enforce new restrictions to protect Canadians from harmful content online, according to a report currently under review by the federal heritage minister.

The Canadian Commission on Democratic Expression, an expert panel of seven members, including former chief justice Beverley McLachlin, said it had become difficult to ignore the fact that too many real-world manifestations of online interactions are turning violent, destructive or hateful, despite social media’s parallel role in empowering positive social movements.

The panellists were particularly struck by the role they saw social media play last fall in “sowing distrust” in the aftermath of the U.S. presidential election, culminating in the lethal invasion of the U.S. Capitol. And they found, with the Quebec mosque shooting, the Toronto van attack and the armed invasion of Rideau Hall, that “Canada is not immune.”

“We recognize the charter, we recognize the ability of people to express themselves freely,” said Jean La Rose, former chief executive officer of the Aboriginal Peoples Television Network (APTN) and one of the seven commissioners, in an interview.

“But there must be limits at one point. There has to be limits as to where free speech becomes a racist discourse, or a hurtful discourse, or a hateful discourse.”

‘We have been at the receiving end of racist threats’

These limits would come in the form of a new law passed by Parliament, the commission recommended, that would force social media platforms like Twitter and Facebook, search engines like Google and its video-sharing site YouTube and others to adhere to a new “duty to act responsibly.”

The panel purposefully did not spell out what responsible behaviour should look like. Instead, it said this determination should be left to the government — as well as a new regulator that would oversee a code of conduct for the industry and a new “social media council” that would bring together the platforms with civil society and other groups.

La Rose said his experience in the journalism world demonstrated how there needed to be reasonable limits on what people can freely express so they are not permitted to call for the killings of Muslims, for example, or encourage violence against an individual by posting their home address or other personal details online.

“Having worked in media, having worked at APTN, for example, we have been at the receiving end of racist threats, of severe injury to our people, our reporters and others because of the view we present of the situation of the Indigenous community in Canada,” he said.

“Literally, we’ve had some reporters run off the road when they were covering a story because people were trying to block the telling of that story. So as a news entity, we have seen how far sometimes misinformation, hate and hurtful comments can go.”

Rules must reflect issue’s ‘inherent complexity’: Google

Canadian Heritage Minister Steven Guilbeault has himself recently indicated that legislation to address “online hate” will be introduced “very soon.”

The minister has pointed to the popularity of such a move: a recent survey by the Canadian Race Relations Foundation (CRRF), for example, found that fully four-fifths of Canadians are on board with forcing social media companies to rapidly take down hateful content.

“Canadians are now asking their government to hold social media companies accountable for the content that appears on their platforms,” Guilbeault said after the CRRF survey was published.

“This is exactly what we intend to do, by introducing new regulations that will require online platforms to remove illegal and hateful content before they cause more harm and damage.”

Guilbeault has met with the commission to discuss their recommendations and is currently reviewing their report, press secretary Camille Gagné-Raynauld confirmed.

Representatives from Facebook Canada and Twitter Canada were among several people who provided witness testimony and participated in commission deliberations, the report said. Twitter declined comment to Canada’s National Observer.

“We haven’t reviewed the full report yet, so we can’t comment on the specific recommendations,” said Kevin Chan, global director and head of public policy for Facebook Canada. “We have community standards that govern what is and isn’t allowed on our platform, and in most cases those standards go well beyond what’s required by law.”

Chan also said Facebook agreed regulators should make “clear rules for the internet” so private companies aren’t left to make decisions themselves.

Google spokesperson Lauren Skelly said the company shares Canadians’ concerns about harmful content online and said YouTube takes its responsibility to remove content that violates its policies “extremely seriously.” She said the company has significantly ramped up daily removals of hate speech and removed millions of videos last quarter for violations.

“Any regulation needs to reflect the inherent complexity of the issue and the scale at which online platforms operate,” said Skelly. “We look forward to continuing our work with the government and local partners on addressing the spread of online hate to ensure a safer and open internet that works for all Canadians.”

Incentives ‘not aligned with the public interest’: Jaffer

The nine-month study by the commission, an initiative led by the Public Policy Forum, found that toxic content — everything from disinformation campaigns and conspiracy theories to hate speech and the targeting of people for harm — was being “amplified” by the actions of social media companies.

The study rejected the notion that social media platforms are “neutral disseminators of information,” finding instead that they curate content to serve their own commercial interests.

“The business model of some of the major social media companies involves keeping people engaged with their platforms as much as possible. And it turns out that keeping people engaged means feeding them sensational content because that’s what keeps people clicking,” said Jameel Jaffer, executive director of the Knight First Amendment Institute at Columbia University and another commissioner.

“The incentives for social media companies are not aligned with the public interest. These are private companies whose obligation is to make money for their shareholders.”

The commission also proposed a tribunal to resolve disputes quickly, as well as a “transparency regime” that would require social media companies to make certain information available to the regulator, including the “algorithmic architecture used to identify problematic content.”

Jaffer wrote a “concurring statement” in the report, where he confessed it was difficult to endorse the commission’s proposed “duty to act responsibly” without going further to define how that duty will work in reality. He said defining it will require “difficult tradeoffs” between free speech, privacy and other issues.

Carl Meyer / Local Journalism Initiative / Canada’s National Observer
