For privacy and data protection officers across Canada, COVID-19 was a dominant presence in 2020. Protecting personal data with many employees working from home while using new video, audio and text collaboration tools was a challenge. In some organizations, new e-commerce services were adopted in record time.
COVID will cast a long shadow in 2021, raising two prime questions: Can employees be required to show proof of vaccination for on-premises work? And what sort of proof will be accepted: paper, or a digital equivalent?
So far, no federal, provincial or territorial jurisdiction has said how governments will address this, temporarily leaving the private sector to work it out.
It’s assumed that early in the new year as the pace of vaccinations picks up, provinces and territories will have answers.
New legislation
The other dominant issue in 2021 will be the federal government’s proposed new private-sector data privacy law, the Consumer Privacy Protection Act (CPPA).
Officially known as Bill C-11, it’s a sweeping overhaul of the existing Personal Information Protection and Electronic Documents Act (PIPEDA), changing the federal Office of the Privacy Commissioner from being an ombudsman to a regulator, with the power to recommend multi-million dollar fines to a new Personal Information and Data Protection Tribunal.
But for planning purposes, data protection officials and lawyers wonder if it will become law in the current session of Parliament. Minority governments can fall at any time. The last federal election was in October 2019. There is speculation the Liberal government will go to the polls as soon as it can to take advantage of the goodwill it has built up during the pandemic. Prime Minister Justin Trudeau told the CBC that he has no plans to call an election, but he’s ready for a campaign.
Since the government introduced C-11 and held the first reading debate, it hasn’t scheduled committee meetings, which is where the details of the act would be scrutinized and witnesses from the private sector called.
It isn’t known yet how vigorously the opposition and companies will fight to change C-11. Some business groups have said they aren’t enthused about the proposal to give a privacy regulator the power to levy hefty fines.
On the other hand, there will be pressure to pass the bill because the European Union is demanding countries have privacy laws similar to the General Data Protection Regulation (GDPR). PIPEDA is unlikely to make the cut.
“[C-11] may be the big story of the year because we’ve been waiting so long,” said Teresa Scassa, Canada Research Chair in Information Law and Policy at the University of Ottawa’s Faculty of Law. “It’s such an important bill in terms of private sector data protection. It’s a complete reworking of (PIPEDA), and I think the framework is going to be with us for a long time, it’s really important to get it right.”
But it’s not going to be easy, she says. “It’s really hard to be on top of all of it. Unpacking it and trying to figure out what’s changed and whether it’s for the better will take up a lot of energy in 2021.”
Remember, she added, the government has also promised a reformed Privacy Act, which covers the federal government’s duties to protect personal information. The Justice Department is accepting submissions up to Jan. 17.
British Columbia is also consulting on updating its private sector privacy law, while Quebec’s legislature is debating proposed amendments to its privacy legislation. Ontario is consulting on whether it should have its own private-sector privacy law. Its position may change now that C-11 has been introduced.
Scassa said a “sleeper issue” in 2021 may be worker and student surveillance online. With more employees working from home, some employers want to keep tabs in some way on how productive their staff is. It’s particularly an issue in the financial sector where regulations demand management keep an eye on employees handling large sums of money.
Facial recognition woes
Meanwhile, with students forced to take classes online from home, universities and colleges are grappling with how to prevent cheating on tests. Some have turned to so-called proctoring applications, which may make students show an image of their room to ensure no texts are open or notes are tacked to a wall during an exam. The application may also use facial recognition technology to identify students.
The Globe and Mail recently ran a story on the issue, with one student of colour complaining the application refused to recognize her. This is in line with many studies that show facial recognition is less accurate with non-white faces.
There was enough controversy in 2020 that IBM withdrew its facial recognition solution. Clearview AI agreed in July to stop marketing its product to police here, but that came after federal Privacy Commissioner Daniel Therrien and three provincial commissioners announced an investigation into how Clearview collects the baseline images from the internet that its application uses. Therrien is also investigating the RCMP’s use of Clearview. Both reports may be released in 2021.
Therrien started investigations this year into the August cyberattacks on the Canada Revenue Agency and the GCKey credential service used by many federal departments, after hackers got into the accounts of 11,000 users. With several provincial privacy commissioners, Therrien also launched an investigation into the data collection capabilities of the Tim Hortons mobile app.
The private sector is interested in the possibilities of merging facial recognition with other data it collects. Privacy Commissioner Therrien set some guardrails with the release in October of an investigation into how real estate developer Cadillac Fairview collected and analyzed five million images of shoppers in a mall without their knowledge. The images were captured from cameras hidden in information kiosks. The developer said the purpose wasn’t to identify people but to analyze shoppers by age and gender. It had placed decals on mall entrances explaining its privacy policy.
But Therrien said there was no meaningful consent. Cadillac Fairview abandoned the project and said it has no plans to revive it.
More stories about privacy snafus
Among the more searing reports issued this year by Therrien was his investigation into the theft by an employee of data on 9.7 million customers of the Quebec-based Desjardins credit union over a two-year period. Data protection pros must have winced as the report pointed out that:
Data stolen on some 4 million people belonged to former customers. It wasn’t clear why Desjardins was holding on to this data; PIPEDA says firms can only retain personal information needed for commercial reasons. Therrien called the discovery that this data was still sitting around “startling”;
Desjardins had 13 directives, policies and procedures for protecting personal information. But some policies and procedures were incomplete or had not been implemented;
One of them forbade copying data onto USB keys. That’s what the insider did, in contravention of the confidentiality agreement he signed;
While Desjardins’ information system restricted access to customer data to authorized users, it allowed movement of restricted data to unprotected directories and storage media without any controls;
Desjardins could have reduced the exposure of personal data to theft by using data masking techniques to hide identifying information, as recommended by its own data protection security standards;
Desjardins knew there were security problems and had started implementing data loss prevention technology, but slowly. Too slowly, as it turned out.
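The data masking the report refers to replaces identifying values with non-reversible tokens so analysts can still work with records without seeing who they describe. A minimal sketch of the idea, using hypothetical field names and a made-up customer record (not Desjardins’ actual schema or standards):

```python
import hashlib

def mask_record(record, identifying_fields=("name", "sin", "email")):
    """Return a copy of a customer record with identifying fields masked.

    Identifying values are replaced with a truncated one-way hash, so the
    same person always maps to the same token (records remain joinable for
    analysis) while the raw value never leaves the protected system.
    """
    masked = dict(record)
    for field in identifying_fields:
        if field in masked and masked[field] is not None:
            digest = hashlib.sha256(str(masked[field]).encode("utf-8")).hexdigest()
            masked[field] = digest[:12]  # pseudonymous token, not the raw value
    return masked

# Hypothetical record; the SIN shown is a well-known test number, not real.
customer = {"name": "Jane Doe", "sin": "046-454-286", "balance": 1250.00}
print(mask_record(customer))
```

In practice a keyed hash or tokenization service would be used rather than a bare hash, since short identifiers like a SIN can be brute-forced; the point here is only that masked copies, not raw identifiers, are what should reach shared directories and removable media.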
Another insider-related report issued this year dealt with the selling of customer information by two employees at the India branch of a call centre company hired by Dell for third-party support. Several Canadians complained to the privacy commissioner after getting phony tech support calls from someone who knew a lot about them, including their names and the Dell products they owned. Dell discovered that two India-based employees of the call centre provider had sold customer lists covering more than 7,800 Canadians to others who apparently made the fake calls.
The privacy commissioner’s office found Dell is responsible for personal information transferred to third parties and is obligated to ensure those firms properly protect it. However, it found that certain safeguards related to access controls, logging and monitoring, and technical controls were insufficient. It also found that Dell failed to adequately investigate the circumstances of the June 2017 breach and failed to adequately respond to customer complaints.
The investigation was satisfied that Dell had since improved its safeguards and oversight.
Bonus round
Scassa pointed out a number of other interesting privacy-related rulings this year:
The Supreme Court of Canada upheld the constitutionality of a bill (the Genetic Non-discrimination Act) to protect people from being compelled by insurers and employers to take a genetic test or show its results. Briefly, the law in part makes such compulsion a criminal offence. Some argued this was a provincial matter because it touched on health. But, Scassa said, three of the judges in the majority ruled that criminal law can be used to support the protection of privacy. That could expand federal power. For example, Scassa said, it might be used to criminalize certain uses of artificial intelligence applications;
A British Columbia appeal court decision allowed a class-action lawsuit involving a 2013 data breach at a credit union to go ahead. What was interesting, Scassa said, is what wasn’t discussed in the arguments: whether the civil wrong of “breach of privacy and intrusion upon seclusion,” a relatively new concept first approved of by an Ontario court, exists in B.C. The appeal court hinted that it would really like someone to step forward and make the case;
The uproar in Ontario in April when the province issued an emergency order allowing police, firefighters and paramedics to access health authority databases listing the names of those who had tested positive for COVID-19 and those they might have come into contact with. “There were completely insufficient guardrails on that,” Scassa said, citing allegations that at least two police departments had abused their access. It’s an example, she said, of “maybe not thinking it through” during a crisis. First responder access to those databases was revoked in July.
The federal government is ordering the dissolution of TikTok’s Canadian business after a national security review of the Chinese company behind the social media platform, but stopped short of ordering people to stay off the app.
Industry Minister François-Philippe Champagne announced the government’s “wind up” demand Wednesday, saying it is meant to address “risks” related to ByteDance Ltd.’s establishment of TikTok Technology Canada Inc.
“The decision was based on the information and evidence collected over the course of the review and on the advice of Canada’s security and intelligence community and other government partners,” he said in a statement.
The announcement added that the government is not blocking Canadians’ access to the TikTok application or their ability to create content.
However, it urged people to “adopt good cybersecurity practices and assess the possible risks of using social media platforms and applications, including how their information is likely to be protected, managed, used and shared by foreign actors, as well as to be aware of which country’s laws apply.”
Champagne’s office did not immediately respond to a request for comment seeking details about what evidence led to the government’s dissolution demand, how long ByteDance has to comply and why the app is not being banned.
A TikTok spokesperson said in a statement that the shutdown of its Canadian offices will mean the loss of hundreds of well-paying local jobs.
“We will challenge this order in court,” the spokesperson said.
“The TikTok platform will remain available for creators to find an audience, explore new interests and for businesses to thrive.”
The federal Liberals ordered a national security review of TikTok in September 2023, but it was not public knowledge until The Canadian Press reported in March that it was investigating the company.
At the time, it said the review was based on the expansion of a business, which it said constituted the establishment of a new Canadian entity. It declined to provide any further details about what expansion it was reviewing.
A government database showed a notification of new business from TikTok in June 2023. It said Network Sense Ventures Ltd. in Toronto and Vancouver would engage in “marketing, advertising, and content/creator development activities in relation to the use of the TikTok app in Canada.”
Even before the review, ByteDance and TikTok were lightning rods for privacy and safety concerns because Chinese national security laws compel organizations in the country to assist with intelligence gathering.
Such concerns led the U.S. House of Representatives to pass a bill in March designed to ban TikTok unless its China-based owner sells its stake in the business.
Champagne’s office has maintained Canada’s review was not related to the U.S. bill, which has yet to pass.
Canada’s review was carried out under the Investment Canada Act, which allows the government to investigate any foreign investment with the potential to harm national security.
While cabinet can make investors sell parts of the business or shares, Champagne has said the act doesn’t allow him to disclose details of the review.
Wednesday’s dissolution order was made in accordance with the act.
The federal government banned TikTok from its mobile devices in February 2023 following the launch of an investigation into the company by federal and provincial privacy commissioners.
— With files from Anja Karadeglija in Ottawa
This report by The Canadian Press was first published Nov. 6, 2024.
LONDON (AP) — Most people have accumulated a pile of data — selfies, emails, videos and more — on their social media and digital accounts over their lifetimes. What happens to it when we die?
It’s wise to draft a will spelling out who inherits your physical assets after you’re gone, but don’t forget to take care of your digital estate too. Friends and family might treasure files and posts you’ve left behind, but they could get lost in digital purgatory after you pass away unless you take some simple steps.
Here’s how you can prepare your digital life for your survivors:
Apple
The iPhone maker lets you nominate a “legacy contact” who can access your Apple account’s data after you die. The company says it’s a secure way to give trusted people access to photos, files and messages. To set it up you’ll need an Apple device with a fairly recent operating system — iPhones and iPads need iOS or iPadOS 15.2, and Macs need macOS Monterey 12.1.
For iPhones, go to settings, tap Sign-in & Security and then Legacy Contact. You can name one or more people, and they don’t need an Apple ID or device.
You’ll have to share an access key with your contact. It can be a digital version sent electronically, or you can print a copy or save it as a screenshot or PDF.
Take note that there are some types of files you won’t be able to pass on — including digital rights-protected music, movies and passwords stored in Apple’s password manager. Legacy contacts can only access a deceased user’s account for three years before Apple deletes the account.
Google
Google takes a different approach with its Inactive Account Manager, which allows you to share your data with someone if it notices that you’ve stopped using your account.
When setting it up, you need to decide how long Google should wait — from three to 18 months — before considering your account inactive. Once that time is up, Google can notify up to 10 people.
You can write a message informing them you’ve stopped using the account, and, optionally, include a link to download your data. You can choose what types of data they can access — including emails, photos, calendar entries and YouTube videos.
There’s also an option to automatically delete your account after three months of inactivity, so your contacts will have to download any data before that deadline.
Facebook and Instagram
Some social media platforms can preserve accounts for people who have died so that friends and family can honor their memories.
When users of Facebook or Instagram die, parent company Meta says it can memorialize the account if it gets a “valid request” from a friend or family member. Requests can be submitted through an online form.
The social media company strongly recommends Facebook users add a legacy contact to look after their memorial accounts. Legacy contacts can do things like respond to new friend requests and update pinned posts, but they can’t read private messages or remove or alter previous posts. You can only choose one person, who also has to have a Facebook account.
You can also ask Facebook or Instagram to delete a deceased user’s account if you’re a close family member or an executor. You’ll need to send in documents like a death certificate.
TikTok
The video-sharing platform says that if a user has died, people can submit a request to memorialize the account through the settings menu. Go to the Report a Problem section, then Account and profile, then Manage account, where you can report a deceased user.
Once an account has been memorialized, it will be labeled “Remembering.” No one will be able to log into the account, which prevents anyone from editing the profile or using the account to post new content or send messages.
X
It’s not possible to nominate a legacy contact on Elon Musk’s social media site. But family members or an authorized person can submit a request to deactivate a deceased user’s account.
Passwords
Besides the major online services, you’ll probably have dozens if not hundreds of other digital accounts that your survivors might need to access. You could just write all your login credentials down in a notebook and put it somewhere safe. But making a physical copy presents its own vulnerabilities. What if you lose track of it? What if someone finds it?
Instead, consider a password manager that has an emergency access feature. Password managers are digital vaults that you can use to store all your credentials. Some, like Keeper, Bitwarden and NordPass, allow users to nominate one or more trusted contacts who can access their keys in case of an emergency such as a death.
But there are a few catches: Those contacts also need to use the same password manager and you might have to pay for the service.
___
Is there a tech challenge you need help figuring out? Write to us at onetechtip@ap.org with your questions.
LONDON (AP) — Britain’s competition watchdog said Thursday it’s opening a formal investigation into Google’s partnership with artificial intelligence startup Anthropic.
The Competition and Markets Authority said it has “sufficient information” to launch an initial probe after it sought input earlier this year on whether the deal would stifle competition.
The CMA has until Dec. 19 to decide whether to approve the deal or escalate its investigation.
“Google is committed to building the most open and innovative AI ecosystem in the world,” the company said. “Anthropic is free to use multiple cloud providers and does, and we don’t demand exclusive tech rights.”
San Francisco-based Anthropic was founded in 2021 by siblings Dario and Daniela Amodei, who previously worked at ChatGPT maker OpenAI. The company has focused on increasing the safety and reliability of AI models. Google reportedly agreed last year to make a multibillion-dollar investment in Anthropic, which has a popular chatbot named Claude.
Anthropic said it’s cooperating with the regulator and will provide “the complete picture about Google’s investment and our commercial collaboration.”
“We are an independent company and none of our strategic partnerships or investor relationships diminish the independence of our corporate governance or our freedom to partner with others,” it said in a statement.
The U.K. regulator has been scrutinizing a raft of AI deals as investment money floods into the industry to capitalize on the artificial intelligence boom. Last month it cleared Anthropic’s $4 billion deal with Amazon and it has also signed off on Microsoft’s deals with two other AI startups, Inflection and Mistral.