Last week saw the U.S. Senate join the ever-growing chorus of federal officials advising staff against using Zoom, with one top official calling the video software a “privacy and security concern.” And while there are myriad reasons to be concerned about the video-call platform—from the potential for foreign snooping to its issues with encryption, to, well, everything else—it looks like the turning point for some federal officials boils down to one thing: shitty teens.
But what, exactly, is allowing these shitty teens to troll members of Congress and others around the country? Turns out, in many cases at least, it’s just a bit of clever googling. More worrisome: The same search tactics for finding Zoom calls can apply to the company’s product specifically built for government use.
The “Zoom-bombing” problem hit a new apex last week when Ohio Republican Rep. Jim Jordan sent a memo to the House Oversight Committee, asking Chairwoman Carolyn Maloney, a New York Democrat, to shut down the committee’s ties to Zoom. Jordan’s letter came less than a day after the Senate’s sergeant at arms warned the chamber’s members and staff not to use the service. The reason? Pranksters on the platform interrupting a congressional meeting. As Jordan wrote:
“[I]n spite of the warnings by the FBI and media outlets, on April 3, 2020, you held a Zoom-hosted Member briefing on women’s rights in Afghanistan with the Special Inspector General for Afghanistan Reconstruction (SIGAR). During this important briefing, the session was ‘Zoom-bombed’ at least three times.”
Jordan added that the impact of potential “hacking and malware” on the devices of meeting attendees is “still being determined.” But as most of the kiddie culprits behind these sorts of attacks will tell you, it’s ridiculously easy to find Zoom meetings to trash. And from a little bit of analysis, I found that this doesn’t just go for the calls held in classrooms and among recovering alcoholics, but also for those held on Capitol Hill.
The case Jordan complained about is the first publicized instance of these attacks reaching the federal level, but attacks on local government have been happening for weeks. Trolls have reportedly descended on city council meetings happening across pretty much every state you can name, to shitpost porn, Nazi memorabilia, and presumably, Nazi-themed porn.
In response to these attacks (and others), Zoom beefed up its security practices, giving each meeting a virtual waiting room by default, allowing hosts to pre-screen potential participants and boot out any obvious trolls with names like “Ben Dover” and “Hugh Jass.” Though the threat of these pre-screenings—not to mention potential jail time—deterred a chunk of these trolls, just as many went and… adopted benign names to sneak into these rooms the same way that they always had.
I’m not pretending to be a Zoom-bombing scholar, but after kicking it with Zoom bombers for about a day, I was able to figure out how a lot of these kids were finding these codes to begin with. The overwhelming majority come from tween and teenage students just passing their own class’s codes among themselves to screw with their teachers. A few of the more enterprising types created scrapers or bots to mine any invites to Zoom meetings off of major social platforms.
Another popular method, as it turns out, is “google dorking”—essentially using certain keywords in Google’s search bar to dredge up vulnerable intel from the web. Dorks (as these keywords are called) aren’t just the bread and butter of hacking aficionados or cybercriminals, but of certain investigative journalists, including myself—which means that I could theoretically “hack” into the same congressional meetings these teens were a part of.
So I decided to give it a try—not to bomb any meetings myself, but just to see if I could find where these asshats were digging them up.
Unless you take a few extra steps to bury stuff from search engines like Google, just about everything you post online is indexed and stored in a searchable, digital record. When it comes to sites run by folks who are a touch less tech-savvy—like, say, the website of a given local government—knowing the right words to search can turn up just about anything from the site’s history, even material hidden behind some sort of password protection. And as I’d found previously, all public-facing Zoom links share a similar searchable string—making it easy to find an endless buffet of upcoming meetings that have been posted somewhere online.
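To see why that shared string matters, here’s a minimal sketch of the kind of pattern a scraper—or a search-engine query—can key on. It assumes the standard public Zoom invite format (a zoom.us host, optionally with a subdomain, followed by /j/ and a numeric meeting ID); the exact ID length and the example URL are illustrative, not taken from any real meeting.

```python
import re

# Public Zoom invite links follow one recognizable shape: an optional
# subdomain, the zoom.us host, then /j/ and a numeric meeting ID
# (typically 9-11 digits). Anything matching this shape on an indexed
# page is a joinable meeting link.
ZOOM_INVITE = re.compile(r"https?://(?:[\w.-]+\.)?zoom\.us/j/\d{9,11}")

page_text = "Council meeting Friday: https://us02web.zoom.us/j/1234567890"
print(ZOOM_INVITE.findall(page_text))
# → ['https://us02web.zoom.us/j/1234567890']
```

Because every public invite shares this one shape, a single pattern is enough to sift meeting links out of any page a search engine has already indexed—which is all “dorking” really is.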
Gizmodo first reached out to Zoom about its Google dorking problem two weeks ago, but those inquiries went unanswered until today, when we tipped them off to the fact that the same issue applies to Zoom For Government meetings.
“Zoom takes security extremely seriously,” a spokesperson for the company said. “Zoom is aware that in some instances where users have shared links to meetings publicly, they may be indexed by search engines—and we are working hard to de-index those links and have the results taken down.”
The spokesperson added, “We strongly encourage all users to not post links to sensitive meetings on public websites, and we recommend the use of password protection and virtual waiting rooms to ensure uninvited users are not able to join.”
To be fair, the reason that more than a few municipalities had their meetings crop up in my search results was that they were using plain, vanilla Zoom. For the folks who have a bit more cash to burn—or a few more state secrets to keep under wraps—it’s more likely they’re using Zoom For Government, the elite offshoot that was endorsed by the Department of Homeland Security last year as a “secure cloud solution.” According to publicly available documentation, this branch of Zoom also counts other notable partners like the Centers for Disease Control and Prevention, Customs and Border Protection, and the Department of Agriculture.
Naturally, I assumed that the teleconferencing software of choice for the Pentagon and ICE would make its meetings a bit more difficult to find, but just like before, these meetings were only a few clicks away. Five minutes in, I’d found a few links for meetings held at the USDA, the NSF, and a handful of coronavirus conference calls hosted by the CDC.
It’s worth noting that none of these links were particularly juicy, so to speak—you’re (probably, hopefully) not going to be finding any internal meetings between the top brass in the U.S. military by poking around at Google search. But you will find calls aimed at the public: think USDA calls with local farmers, CDC calls with local hospitals, or NSF calls with local universities. In cases like these, the waiting room feature doesn’t do jack shit—if a Zoom bomber can fudge their name to sneak into AA meetings under the guise of a fake alcoholic, they can damn well do the same to sneak into a CDC meeting under the guise of a fake hospital employee, or a fake federal contractor.
In Zoom’s defense, a lot of this is out of its hands, since it’s the federal agencies themselves that are putting these links out into the world for all search engines to see. But if the company can completely revamp its data center structure in the name of national security, the least it can do is tip off its clientele about what they might be accidentally airing on the open web.
The federal government is ordering the dissolution of TikTok’s Canadian business after a national security review of the Chinese company behind the social media platform, but stopped short of ordering people to stay off the app.
Industry Minister François-Philippe Champagne announced the government’s “wind up” demand Wednesday, saying it is meant to address “risks” related to ByteDance Ltd.’s establishment of TikTok Technology Canada Inc.
“The decision was based on the information and evidence collected over the course of the review and on the advice of Canada’s security and intelligence community and other government partners,” he said in a statement.
The announcement added that the government is not blocking Canadians’ access to the TikTok application or their ability to create content.
However, it urged people to “adopt good cybersecurity practices and assess the possible risks of using social media platforms and applications, including how their information is likely to be protected, managed, used and shared by foreign actors, as well as to be aware of which country’s laws apply.”
Champagne’s office did not immediately respond to a request for comment seeking details about what evidence led to the government’s dissolution demand, how long ByteDance has to comply and why the app is not being banned.
A TikTok spokesperson said in a statement that the shutdown of its Canadian offices will mean the loss of hundreds of well-paying local jobs.
“We will challenge this order in court,” the spokesperson said.
“The TikTok platform will remain available for creators to find an audience, explore new interests and for businesses to thrive.”
The federal Liberals ordered a national security review of TikTok in September 2023, but it was not public knowledge until The Canadian Press reported in March that it was investigating the company.
At the time, it said the review was based on the expansion of a business, which it said constituted the establishment of a new Canadian entity. It declined to provide any further details about what expansion it was reviewing.
A government database showed a notification of new business from TikTok in June 2023. It said Network Sense Ventures Ltd. in Toronto and Vancouver would engage in “marketing, advertising, and content/creator development activities in relation to the use of the TikTok app in Canada.”
Even before the review, ByteDance and TikTok were lightning rods for privacy and safety concerns because Chinese national security laws compel organizations in the country to assist with intelligence gathering.
Such concerns led the U.S. House of Representatives to pass a bill in March designed to ban TikTok unless its China-based owner sells its stake in the business.
Champagne’s office has maintained Canada’s review was not related to the U.S. legislation.
Canada’s review was carried out under the Investment Canada Act, which allows the government to investigate any foreign investment that could harm national security.
While cabinet can make investors sell parts of the business or shares, Champagne has said the act doesn’t allow him to disclose details of the review.
Wednesday’s dissolution order was made in accordance with the act.
The federal government banned TikTok from its mobile devices in February 2023 following the launch of an investigation into the company by federal and provincial privacy commissioners.
— With files from Anja Karadeglija in Ottawa
This report by The Canadian Press was first published Nov. 6, 2024.
LONDON (AP) — Most people have accumulated a pile of data — selfies, emails, videos and more — on their social media and digital accounts over their lifetimes. What happens to it when we die?
It’s wise to draft a will spelling out who inherits your physical assets after you’re gone, but don’t forget to take care of your digital estate too. Friends and family might treasure files and posts you’ve left behind, but they could get lost in digital purgatory after you pass away unless you take some simple steps.
Here’s how you can prepare your digital life for your survivors:
Apple
The iPhone maker lets you nominate a “legacy contact” who can access your Apple account’s data after you die. The company says it’s a secure way to give trusted people access to photos, files and messages. To set it up you’ll need an Apple device with a fairly recent operating system — iPhones and iPads need iOS or iPadOS 15.2, and Macs need macOS Monterey 12.1.
For iPhones, go to settings, tap Sign-in & Security and then Legacy Contact. You can name one or more people, and they don’t need an Apple ID or device.
You’ll have to share an access key with your contact. It can be a digital version sent electronically, or you can print a copy or save it as a screenshot or PDF.
Take note that there are some types of files you won’t be able to pass on — including digital rights-protected music, movies and passwords stored in Apple’s password manager. Legacy contacts can only access a deceased user’s account for three years before Apple deletes the account.
Google
Google takes a different approach with its Inactive Account Manager, which allows you to share your data with someone if it notices that you’ve stopped using your account.
When setting it up, you need to decide how long Google should wait — from three to 18 months — before considering your account inactive. Once that time is up, Google can notify up to 10 people.
You can write a message informing them you’ve stopped using the account, and, optionally, include a link to download your data. You can choose what types of data they can access — including emails, photos, calendar entries and YouTube videos.
There’s also an option to automatically delete your account after three months of inactivity, so your contacts will have to download any data before that deadline.
Facebook and Instagram
Some social media platforms can preserve accounts for people who have died so that friends and family can honor their memories.
When users of Facebook or Instagram die, parent company Meta says it can memorialize the account if it gets a “valid request” from a friend or family member. Requests can be submitted through an online form.
The social media company strongly recommends Facebook users add a legacy contact to look after their memorial accounts. Legacy contacts can do things like respond to new friend requests and update pinned posts, but they can’t read private messages or remove or alter previous posts. You can only choose one person, who also has to have a Facebook account.
You can also ask Facebook or Instagram to delete a deceased user’s account if you’re a close family member or an executor. You’ll need to send in documents like a death certificate.
TikTok
The video-sharing platform says that if a user has died, people can submit a request to memorialize the account through the settings menu. Go to the Report a Problem section, then Account and profile, then Manage account, where you can report a deceased user.
Once an account has been memorialized, it will be labeled “Remembering.” No one will be able to log into the account, which prevents anyone from editing the profile or using the account to post new content or send messages.
X
It’s not possible to nominate a legacy contact on Elon Musk’s social media site. But family members or an authorized person can submit a request to deactivate a deceased user’s account.
Passwords
Besides the major online services, you’ll probably have dozens if not hundreds of other digital accounts that your survivors might need to access. You could just write all your login credentials down in a notebook and put it somewhere safe. But making a physical copy presents its own vulnerabilities. What if you lose track of it? What if someone finds it?
Instead, consider a password manager that has an emergency access feature. Password managers are digital vaults that you can use to store all your credentials. Some, like Keeper, Bitwarden and NordPass, allow users to nominate one or more trusted contacts who can access their keys in case of an emergency such as a death.
But there are a few catches: Those contacts also need to use the same password manager and you might have to pay for the service.
___
Is there a tech challenge you need help figuring out? Write to us at onetechtip@ap.org with your questions.
LONDON (AP) — Britain’s competition watchdog said Thursday it’s opening a formal investigation into Google’s partnership with artificial intelligence startup Anthropic.
The Competition and Markets Authority said it has “sufficient information” to launch an initial probe after it sought input earlier this year on whether the deal would stifle competition.
The CMA has until Dec. 19 to decide whether to approve the deal or escalate its investigation.
“Google is committed to building the most open and innovative AI ecosystem in the world,” the company said. “Anthropic is free to use multiple cloud providers and does, and we don’t demand exclusive tech rights.”
San Francisco-based Anthropic was founded in 2021 by siblings Dario and Daniela Amodei, who previously worked at ChatGPT maker OpenAI. The company has focused on increasing the safety and reliability of AI models. Google reportedly agreed last year to make a multibillion-dollar investment in Anthropic, which has a popular chatbot named Claude.
Anthropic said it’s cooperating with the regulator and will provide “the complete picture about Google’s investment and our commercial collaboration.”
“We are an independent company and none of our strategic partnerships or investor relationships diminish the independence of our corporate governance or our freedom to partner with others,” it said in a statement.
The U.K. regulator has been scrutinizing a raft of AI deals as investment money floods into the industry to capitalize on the artificial intelligence boom. Last month it cleared Anthropic’s $4 billion deal with Amazon and it has also signed off on Microsoft’s deals with two other AI startups, Inflection and Mistral.