This story is part of Focal Point iPhone 2022, CNET’s collection of news, tips and advice around Apple’s most popular product.
What’s happening
Apple will be offering a new “Lockdown Mode” for its iPhones, iPads and Mac computers this fall. It’s designed to fight advanced hacking and targeted spyware like the NSO Group’s Pegasus.
Why it matters
The move is a tacit acknowledgment from Apple that the threat is serious and growing. Pegasus was used by repressive governments to spy on human rights activists, lawyers, politicians and journalists.
What’s next
Cybersecurity watchers believe Apple may push customers and competitors to take stronger security postures. Ultimately, the way we all use technology may have to change.
Three years ago, Apple put up an ad in Las Vegas, showing the backside of one of its devices, with the phrase “What happens on your iPhone, stays on your iPhone.” It was a bold, if cheeky, claim. But Apple is increasingly living up to it.
The tech giant has been ramping up its commitment to privacy and security with a string of new features that, cybersecurity experts say, amount to more than a marketing bullet point differentiating its products from Samsung gadgets and other devices powered by Google's Android OS. Apple's moves have sent ripples through the advertising world and upset government officials, signs, tech watchers say, that Apple is following through on its promises.
That’s why many cybersecurity experts took notice of Apple’s Lockdown Mode when it was unveiled last Wednesday. The feature is designed to activate “extreme” protections for the company’s iPhones, iPads and Mac computers. Among them, Lockdown Mode blocks link previews in the Messages app, turns off potentially hackable web browsing technologies, and halts any incoming FaceTime calls from unknown numbers. Apple’s devices also won’t accept accessory connections unless the device is unlocked. (Here’s how to use Apple’s Lockdown Mode on an iPhone.)
Apple said that of the people using its roughly 2 billion active devices around the world, few would actually need to turn the feature on. But cybersecurity experts say these types of extreme measures may need to become more commonplace as governments around the world broaden who they target while stepping up the frequency of their attacks.
In just the last week, the FBI and Britain’s MI5 intelligence organization took the rare step of issuing a joint warning of the “immense” threat Chinese spies pose to “our economic and national security,” saying China’s hacking program is “bigger than that of every other major country combined.” Other government agencies have made similar warnings about hacking from other adversaries, including Russia, which the US Office of the Director of National Intelligence said in 2017 has targeted think tanks and lobbying groups in addition to the government and political parties.
And unlike widespread ransomware or virus campaigns, which are often designed to spread as quickly as possible, targeted attacks are often designed for quiet intelligence gathering, which could lead to stolen technology, exposed state secrets and more.
Apple itself said last week that it’s tracked targeted hacking efforts toward people in nearly 150 countries over the past eight months. Apple has already begun a program of warning people when they may be targeted. When Lockdown Mode is released in the fall, cybersecurity experts say, it’ll represent an escalation on Apple’s part, particularly because the feature will be available to anyone who wants to turn it on.
“There were a number of attempts over the years to make highly secure devices, and it’s great to have those things and having them put out there, but we haven’t seen widespread adoption,” said Kurt Opsahl, deputy executive director and general counsel at the Electronic Frontier Foundation, which advocates for privacy and other civil liberties in the digital world. And though Opsahl believes an up-to-date phone is probably good enough for the average person, he said that any way Apple can raise the cost of hacking a phone helps protect the devices.
“Make no mistake about it, Lockdown Mode will be a major blow,” said Ron Deibert, a professor of political science and director of the Citizen Lab for cybersecurity researchers at the University of Toronto.
Coming change
Much of Apple’s approach to cybersecurity can be traced back to 2010, when company co-founder Steve Jobs discussed his view of privacy onstage at the D8 conference.
“Privacy means people know what they’re signing up for, in plain English, and repeatedly,” Jobs said. “Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you’re going to do.”
It was a departure from other internet giants, such as Facebook, whose co-founder, Mark Zuckerberg, was listening in the audience. Google, Facebook and Amazon largely make their money through targeted advertisements, which are often at odds with user privacy. After all, the more targeted the ad, the more relevant and effective it likely is.
Apple, by comparison, makes little of its money from advertisements. Instead, the iPhone, iPad and Mac computers made up more than 70% of its sales last year, adding up to over $259 billion combined.
Accordingly, Apple offers security features by default across the board to all its users. When people download Facebook for the first time and start using it on their phone, they’re quickly greeted with popups asking whether they want to give the app access to their microphone or camera.
Last year, Apple took it a step further, asking if people wanted to stop companies from tracking them across websites and apps, a feature Apple calls App Tracking Transparency. Research surveys suggest nearly all people answer that they don’t want to be tracked. Facebook owner Meta said the feature has meaningfully hurt its finances, costing as much as $10 billion in lost sales this year. “It’s a substantial headwind to work our way through,” Meta CFO David Wehner said in February.
But offering what is effectively a new mode on iPhones is an entirely new approach. When people activate Lockdown Mode by flipping a switch in the Settings app, the device needs to restart, effectively loading a new set of code and rules under Apple’s “extreme” security measures.
“Apple is ultimately making it as easy as possible to make choices about security and privacy,” said Jeff Pollard, a Forrester analyst who focuses on cybersecurity and risk. Pollard said this approach offers an opportunity for Apple to test the waters between usability and security, while following through on its promise to continually improve on Lockdown Mode over time. “We have to make it easier to do, so our adversaries have to try harder.”
Future security
Lockdown Mode may be one of Apple’s most significant security moves to date, but the company still has more it needs to do. Craig Federighi, Apple SVP and head of software, testified in court last year that his company’s Mac computers face a “significantly larger malware problem” than its iPhones, iPads and other devices.
“Today, we have a level of malware on the Mac that we don’t find acceptable,” Federighi said during testimony defending Apple in a lawsuit with Fortnite maker Epic Games. Each week, Apple identifies a couple of pieces of malware on its own or with the help of third parties, he said back then, and it uses built-in systems to automatically remove malicious software from customers’ computers. The nasty programs still proliferate, though. In the year ended last May, Federighi said, Apple had fought 130 types of Mac malware, and one program alone infected 300,000 systems.
Lockdown Mode doesn’t directly address widespread malware issues, but it could end up forcing hackers to put even more time and resources toward finding security flaws they can exploit.
“Something has to be done,” said Betsy Sigman, a distinguished teaching professor emeritus at Georgetown University’s McDonough School of Business.
An alarming problem to Sigman is that malware developers stand to make hundreds of millions of dollars from targeted hacks like Pegasus. The groups that have sprung up to fight them, meanwhile, are much smaller and need funding both to fight the threat and to help protect and educate potential victims.
“It’s going to cost a lot of money,” Sigman said. Apple pledged a grant of at least $10 million to the Dignity and Justice Fund, which was established by the Ford Foundation, to help support human rights and fight social repression. Sigman said much more investment will be needed. “I hope Apple will get together with other high-tech companies and work together on this.”
Meanwhile, many cybersecurity experts, including Susan Landau, are looking forward to trying out Lockdown Mode when Apple releases it in the fall, along with its annual set of major software upgrades. A cybersecurity and policy professor at Tufts University, and a former employee at Google and Sun Microsystems, Landau is already careful about what websites she visits and what devices she uses. She keeps a separate Google Chromebook for handling her finances, and she refuses to download most apps to her phone unless she knows she can trust the company that made them.
“It’s convenience versus security,” she said. Landau follows these protocols out of principle, because she — like nearly all of us — doesn’t have the time or capability to validate every app or website’s safety. Apple and Google both have established security tests for their respective app stores, but Landau said the new apps, capabilities and upgrades that arrive each year can make them more vulnerable. “Complexity is the bane of security.”
To her, Lockdown Mode may help us all begin to understand the balance between gee-whiz features and security, particularly as state-sponsored hackers step up their attacks. “People have gotten used to the convenience without understanding the problems,” Landau said. “The convenience we’ve all grown accustomed to has got to change.”
The federal government is ordering the dissolution of TikTok’s Canadian business after a national security review of the Chinese company behind the social media platform, but stopped short of ordering people to stay off the app.
Industry Minister François-Philippe Champagne announced the government’s “wind up” demand Wednesday, saying it is meant to address “risks” related to ByteDance Ltd.’s establishment of TikTok Technology Canada Inc.
“The decision was based on the information and evidence collected over the course of the review and on the advice of Canada’s security and intelligence community and other government partners,” he said in a statement.
The announcement added that the government is not blocking Canadians’ access to the TikTok application or their ability to create content.
However, it urged people to “adopt good cybersecurity practices and assess the possible risks of using social media platforms and applications, including how their information is likely to be protected, managed, used and shared by foreign actors, as well as to be aware of which country’s laws apply.”
Champagne’s office did not immediately respond to a request for comment seeking details about what evidence led to the government’s dissolution demand, how long ByteDance has to comply and why the app is not being banned.
A TikTok spokesperson said in a statement that the shutdown of its Canadian offices will mean the loss of hundreds of well-paying local jobs.
“We will challenge this order in court,” the spokesperson said.
“The TikTok platform will remain available for creators to find an audience, explore new interests and for businesses to thrive.”
The federal Liberals ordered a national security review of TikTok in September 2023, but it was not public knowledge until The Canadian Press reported in March that the government was investigating the company.
At the time, the government said the review was based on the expansion of a business, which it said constituted the establishment of a new Canadian entity. It declined to provide any further details about what expansion it was reviewing.
A government database showed a notification of new business from TikTok in June 2023. It said Network Sense Ventures Ltd. in Toronto and Vancouver would engage in “marketing, advertising, and content/creator development activities in relation to the use of the TikTok app in Canada.”
Even before the review, ByteDance and TikTok were a lightning rod for privacy and safety concerns because Chinese national security laws compel organizations in the country to assist with intelligence gathering.
Such concerns led the U.S. House of Representatives to pass a bill in March designed to ban TikTok unless its China-based owner sells its stake in the business.
Champagne’s office has maintained Canada’s review was not related to the U.S. bill, which has yet to pass.
Canada’s review was carried out under the Investment Canada Act, which allows the government to investigate any foreign investment that might harm national security.
While cabinet can make investors sell parts of the business or shares, Champagne has said the act doesn’t allow him to disclose details of the review.
Wednesday’s dissolution order was made in accordance with the act.
The federal government banned TikTok from its mobile devices in February 2023 following the launch of an investigation into the company by federal and provincial privacy commissioners.
— With files from Anja Karadeglija in Ottawa
This report by The Canadian Press was first published Nov. 6, 2024.
LONDON (AP) — Most people have accumulated a pile of data — selfies, emails, videos and more — on their social media and digital accounts over their lifetimes. What happens to it when we die?
It’s wise to draft a will spelling out who inherits your physical assets after you’re gone, but don’t forget to take care of your digital estate too. Friends and family might treasure files and posts you’ve left behind, but they could get lost in digital purgatory after you pass away unless you take some simple steps.
Here’s how you can prepare your digital life for your survivors:
Apple
The iPhone maker lets you nominate a “legacy contact” who can access your Apple account’s data after you die. The company says it’s a secure way to give trusted people access to photos, files and messages. To set it up you’ll need an Apple device with a fairly recent operating system — iPhones and iPads need iOS or iPadOS 15.2 and MacBooks need macOS Monterey 12.1.
For iPhones, go to Settings, tap Sign-in & Security and then Legacy Contact. You can name one or more people, and they don’t need an Apple ID or device.
You’ll have to share an access key with your contact. It can be a digital version sent electronically, or you can print a copy or save it as a screenshot or PDF.
Take note that there are some types of files you won’t be able to pass on — including digital rights-protected music, movies and passwords stored in Apple’s password manager. Legacy contacts can only access a deceased user’s account for three years before Apple deletes the account.
Google
Google takes a different approach with its Inactive Account Manager, which allows you to share your data with someone if it notices that you’ve stopped using your account.
When setting it up, you need to decide how long Google should wait — from three to 18 months — before considering your account inactive. Once that time is up, Google can notify up to 10 people.
You can write a message informing them you’ve stopped using the account, and, optionally, include a link to download your data. You can choose what types of data they can access — including emails, photos, calendar entries and YouTube videos.
There’s also an option to automatically delete your account after three months of inactivity, so your contacts will have to download any data before that deadline.
Facebook and Instagram
Some social media platforms can preserve accounts for people who have died so that friends and family can honor their memories.
When users of Facebook or Instagram die, parent company Meta says it can memorialize the account if it gets a “valid request” from a friend or family member. Requests can be submitted through an online form.
The social media company strongly recommends Facebook users add a legacy contact to look after their memorial accounts. Legacy contacts can do things like respond to new friend requests and update pinned posts, but they can’t read private messages or remove or alter previous posts. You can only choose one person, who also has to have a Facebook account.
You can also ask Facebook or Instagram to delete a deceased user’s account if you’re a close family member or an executor. You’ll need to send in documents like a death certificate.
TikTok
The video-sharing platform says that if a user has died, people can submit a request to memorialize the account through the settings menu. Go to the Report a Problem section, then Account and profile, then Manage account, where you can report a deceased user.
Once an account has been memorialized, it will be labeled “Remembering.” No one will be able to log into the account, which prevents anyone from editing the profile or using the account to post new content or send messages.
X
It’s not possible to nominate a legacy contact on Elon Musk’s social media site. But family members or an authorized person can submit a request to deactivate a deceased user’s account.
Passwords
Besides the major online services, you’ll probably have dozens if not hundreds of other digital accounts that your survivors might need to access. You could just write all your login credentials down in a notebook and put it somewhere safe. But making a physical copy presents its own vulnerabilities. What if you lose track of it? What if someone finds it?
Instead, consider a password manager that has an emergency access feature. Password managers are digital vaults that you can use to store all your credentials. Some, like Keeper, Bitwarden and NordPass, allow users to nominate one or more trusted contacts who can access their keys in case of an emergency such as a death.
But there are a few catches: Those contacts also need to use the same password manager and you might have to pay for the service.
___
Is there a tech challenge you need help figuring out? Write to us at onetechtip@ap.org with your questions.
LONDON (AP) — Britain’s competition watchdog said Thursday it’s opening a formal investigation into Google’s partnership with artificial intelligence startup Anthropic.
The Competition and Markets Authority said it has “sufficient information” to launch an initial probe after it sought input earlier this year on whether the deal would stifle competition.
The CMA has until Dec. 19 to decide whether to approve the deal or escalate its investigation.
“Google is committed to building the most open and innovative AI ecosystem in the world,” the company said. “Anthropic is free to use multiple cloud providers and does, and we don’t demand exclusive tech rights.”
San Francisco-based Anthropic was founded in 2021 by siblings Dario and Daniela Amodei, who previously worked at ChatGPT maker OpenAI. The company has focused on increasing the safety and reliability of AI models. Google reportedly agreed last year to make a multibillion-dollar investment in Anthropic, which has a popular chatbot named Claude.
Anthropic said it’s cooperating with the regulator and will provide “the complete picture about Google’s investment and our commercial collaboration.”
“We are an independent company and none of our strategic partnerships or investor relationships diminish the independence of our corporate governance or our freedom to partner with others,” it said in a statement.
The U.K. regulator has been scrutinizing a raft of AI deals as investment money floods into the industry to capitalize on the artificial intelligence boom. Last month it cleared Anthropic’s $4 billion deal with Amazon and it has also signed off on Microsoft’s deals with two other AI startups, Inflection and Mistral.