Did the FBI get Apple to kill iCloud backup encryption? | Canada News Media
Reuters detonated a huge controversy in the Apple and security spaces yesterday with this headline: “Apple dropped plan for encrypting backups after FBI complained – sources.”

First, what Reuters is referring to as “backups” here is specifically iCloud backups. There are still two ways to back up Apple devices — to iCloud, or locally to a Mac or PC. For a very long time now, and still to this day, you have been able to back up locally to your Mac or PC and opt in to having those backups password-encrypted. It’s not a plan, it’s a checkbox. And none of that has changed.

Second. Based on reactions to the headline, many people, including very tech-savvy people, didn’t seem to realize or remember that iCloud backups weren’t quote-unquote encrypted.


I say quote-unquote because the backups actually are encrypted. But, Apple has their own set of keys and can access them.

Which… is not unusual. There are a couple of reasons for that: Features and fail-safes.

Features and fail-safes

I’ll get to fail-safes in a bit, but, online storage not being end-to-end encrypted at the container level allows for additional features that can be highly convenient for the customer. Web-based access to single files within a backup, for example, including and especially photos, calendars, and contacts, like you get at iCloud.com.

There are a few end-to-end encrypted by design storage services, and services that provide options or local tools for end-to-end encryption, and you can always upload locally encrypted files.

But iCloud is by no means unique in how it treats online backups… which is probably why some people reacted so strongly to the headline — given how much Apple talks about privacy and security, some just assume it applies to everything.

Now, Apple does treat some data differently. For example, health data and keychain password data are end-to-end encrypted and Apple locks out everyone but you, including Apple themselves.
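To make that distinction concrete, here’s a toy sketch (not real cryptography, and not Apple’s actual design) contrasting the two models: storage encrypted with a key the provider also holds, versus end-to-end encryption where the key is derived from a secret only the user knows.

```python
# Toy sketch (NOT real cryptography, and not Apple's actual design) contrasting
# the two models: encrypted storage where the provider also holds a key, vs
# end-to-end encryption keyed only by a secret the user knows.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data against a SHA-256-derived keystream."""
    stream, counter = bytearray(), 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

backup = b"photos, contacts, messages"

# Model 1: encrypted at rest, but the provider keeps a copy of the key.
# The provider can decrypt on demand, enabling iCloud.com-style web access,
# account recovery, and responses to legal requests.
provider_key = secrets.token_bytes(32)
stored = keystream_xor(provider_key, backup)
assert keystream_xor(provider_key, stored) == backup

# Model 2: end-to-end. The key is derived from the user's passphrase, so the
# provider holds only ciphertext it cannot read.
user_key = hashlib.pbkdf2_hmac("sha256", b"user passphrase", b"salt", 100_000)
stored_e2e = keystream_xor(user_key, backup)
```

In the first model the provider can always decrypt, which is what makes features and recovery possible; in the second, losing the passphrase means losing the data.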

To explain the difference and maybe the dissonance, I’m going to flashback for a bit.

Two Steps Forward

Many years ago, there was a scandal involving celebrities and their private photos being leaked online. Many but not all of them came from iCloud backups. iCloud was never hacked, but if someone famous re-used the same password as another service that was hacked, or used security questions that could be looked up on Wikipedia, for example, attackers could and did get into them.

And Apple had to put a stop to it and fast. No one but you should ever be gaining access to your iCloud account.

So, Apple implemented two-step authentication. And the way they implemented it meant anyone using it had to write down or print out a long pseudo-random recovery key and keep it safe in case they ever forgot their iCloud passwords or couldn’t supply the second step for authentication. Otherwise, they’d get locked out of their own accounts.
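Generating that kind of pseudo-random recovery key is straightforward. Here’s a sketch using Python’s `secrets` module; the grouped format and alphabet are hypothetical, not Apple’s actual scheme.

```python
# Generating a pseudo-random recovery key, in the spirit of the one Apple's
# old two-step authentication handed users (format here is hypothetical).
import secrets
import string

ALPHABET = string.ascii_uppercase + string.digits

def make_recovery_key(groups: int = 4, group_len: int = 4) -> str:
    """Return something like 'K3QF-9ZD2-77MB-XQ4L'."""
    return "-".join(
        "".join(secrets.choice(ALPHABET) for _ in range(group_len))
        for _ in range(groups)
    )

print(make_recovery_key())
```

The whole point of such a key is that it exists only where the user writes it down; as the article notes, that is also exactly why losing it locks the user out for good.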

And, of course… people being people… they promptly turned on 2SA, lost their recovery key, and got locked out of their own iCloud accounts. Including and especially the irreplaceable data stored in those accounts like baby and wedding pictures.

Apple was flooded with requests to help people get back into their accounts, but without the recovery key there was nothing Apple could do. The data was lost. For all intents and purposes, destroyed.

I’ve done a few videos about this already but it’s worth repeating again: For most people, most of the time, encrypted backups are a bad idea precisely because if anything goes wrong — and things often go wrong — they can’t be recovered.

The same goes for a physical hard drive: if it’s encrypted and it gets damaged, no amount of recovery in the world can get your data back.
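The underlying reason encrypted backups are all-or-nothing can be shown in a few lines: the decryption key is derived from the passphrase, and even a near-miss passphrase produces a completely unrelated key. This is a sketch with illustrative parameters, not any real backup format.

```python
# Why an encrypted backup is all-or-nothing: the key is derived from the
# passphrase, and a near-miss passphrase yields a completely unrelated key.
# (Sketch only; salt and iteration count are illustrative.)
import hashlib

def derive_key(passphrase: str, salt: bytes = b"backup-salt") -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

right = derive_key("correct horse battery staple")
close = derive_key("correct horse battery stapl")  # one character off

# Avalanche effect: roughly half of the 256 key bits differ, so there is no
# "almost right" key to work from. Forget the passphrase, lose the data.
diff_bits = sum(bin(a ^ b).count("1") for a, b in zip(right, close))
print(f"{diff_bits} of 256 bits differ")
```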

For Infosec — information security — people, that’s the whole point. And they will always say to encrypt everything, because it’s better to lose than to leak.

Data retention experts, though, will tell you never to encrypt those backups, precisely because they’ve seen so many people lose so much of their own data.

Unfortunately, given the times we live in, we’ve heard far more from the safe-everything crowd than the save-everything crowd, and it’s critically important to always consider both sides.

Apple totally deserves some of the blame for this as well, given how much publicity they, like everyone else, give to security and how little they, like everyone else, give to education around what should and shouldn’t be kept secure.

Internally, though, Apple learned that being extremist about anything, including security, is not only inconsiderate but also harmful.

Re-Factored

Apple had to put a stop to people losing access to their own data as well, and just as fast.

To do that, Apple deprecated the old two-step authentication system and rolled out a new two-factor authentication system that was not only easier for most people to manage but would also allow Apple to recover iCloud backups for people if and when they locked themselves out but could still prove ownership.

The downside to this was that, because Apple could access the backups, they were then legally obligated to hand over those backups in the event of a subpoena.

Why would Apple make that tradeoff? For the exact reasons I just explained.

For the vast majority of people, the risk of data loss is significantly — significantly — higher than the risk of data theft or subpoena.

In other words, for most people, most of the time, the biggest danger isn’t someone else, including a law enforcement agency, getting access to your data — it’s you losing access to it.

Apple, the company that would rather just lock up absolutely everything, decided it would actually be in the better interests of their customers to be a little less extremist in this specific case.

This is why, for the last five years or so, iCloud Backups have been encrypted but not end-to-end encrypted — unlike almost everything else on iCloud, Apple doesn’t lock themselves out so they can help you if you happen to lock yourself out.

And, of course, for anyone not comfortable with that there was and continues to be the option for fully encrypted backups available via Mac or PC.

Encrypting the iCloud

Apple dropped plans to let iPhone users fully encrypt backups of their devices in the company’s iCloud service after the FBI complained that the move would harm investigations. This according to six sources familiar with the matter.

Reuters probably means Apple devices rather than just the iPhone, because there’s very little chance the iPhone would be treated differently than, say, the iPad when it comes to iCloud backups.

Now, yes, local backups to a Mac or PC can still be fully encrypted, but those aren’t as convenient or consistent as iCloud backups.

That’s why, after Apple made the change from 2SA to 2FA several people, myself included, understanding the risk of data loss, still asked for the option to turn on end-to-end encryption for iCloud backups as well.

The question even came up in a Spiegel interview with Apple’s CEO, Tim Cook, translated by Google:

Our users have a key there, and we have one. We do this because some users lose or forget their key and then expect help from us to get their data back. It is difficult to estimate when we will change this practice. But I think that in the future it will be regulated like the devices. We will therefore no longer have a key for this in the future.

Regulated as in handled, not as in mandated by law.

Now, Reuters isn’t citing that interview as the source, but, they are saying…

The tech giant’s reversal, about two years ago, has not previously been reported.

And that makes Reuters’ report problematic. Regardless of the source, it hasn’t been “about two years” since Tim Cook’s interview where he clearly says Apple is working on end-to-end encrypted iCloud backups. It’s been barely more than a year. Not even 15 months.

If something as simple, as checkable, as the timeline is wrong, what else may be wrong?

Apple and Law Enforcement

It shows how much Apple has been willing to help U.S. law enforcement and intelligence agencies, despite taking a harder line in high-profile legal disputes with the government and casting itself as a defender of its customers’ information.

But does it though?

Apple’s willingness to help law enforcement has never been in question. Through words and actions, Apple has repeatedly demonstrated a commitment to obeying local laws, including a willingness to help law enforcement as required by those laws. This doesn’t show how much, because how much has always been shown to be… much.

The hard-line Apple took with San Bernardino and is taking with Pensacola is completely different in kind — because data on servers and on devices is different in kind, and legal requests and extra-legal requests are different in kind.

Apple’s argument about devices has been and continues to be that they are far more likely to be lost or stolen and therefore require much stronger protection – silicon-level encryption. And that Apple can’t grant access to law enforcement because that would also grant access to anyone who finds, steals, or otherwise gains possession of someone else’s device.

In other words, Apple isn’t locking down iPhones to keep law enforcement organizations with valid warrants out, they’re doing it to keep criminals out. The downside, for law enforcement, is that they get locked out as well.

It’s fail secure instead of fail safe, the exact inverse of the iCloud backup situation, and the exact opposite trade-off.

Now, I personally believe, and have made a couple of videos advocating, that our devices contain so much deeply personal data they’re de facto extensions of our persons and as such deserve Fifth Amendment-like protections and exemptions under the law. But that’s me. As far as I know, Apple hasn’t advocated for anything that extreme.

What Apple has done, though, is state that they shouldn’t be compelled to provide access to data beyond the scope of existing laws. That there should be no extra-legal requests.

That’s why, when Attorneys General and FBI Directors demanded Apple create back doors and break encryption on the iPhones in San Bernardino and Pensacola, Apple said no.

But, even before it got to those pay grades, when Apple was legally asked for the iCloud backups, Apple provided them. Both of those facts were widely and jointly reported. There’s no disconnect there, no surprise, no gotcha, not unless someone deliberately fabricates one.

Apple’s plans

More than two years ago, Apple told the FBI that it planned to offer users end-to-end encryption when storing their phone data on iCloud, according to one current and three former FBI officials and one current and one former Apple employee.

OK, so, this is where the Reuters report becomes highly problematic. Again, Apple is on record saying they will obey local laws. But encryption is not illegal, and the idea that Apple would involve the FBI in that type of process would be seen as an incredible violation of trust given the relationship Apple has crafted with their customers. It would be, to many, a dealbreaker.

Apple’s entire reputation is based on the commitment to product and customer, and whether that helps or hurts the FBI’s or anyone else’s extra-legal agenda shouldn’t and can’t matter. Product and customer have to come first.

Now, there’s nothing on the record that I’m aware of that I can point to either prove or disprove this allegation.

It does seem to go against everything we know about how and why Apple operates to the point where, if I were in a Dungeons and Dragons campaign right now, I’d be leaping up and yelling “disbelieve!”.

But that’s a subjective opinion, not an objective set of facts.

I have had numerous off-the-record conversations over the years since this iCloud Backup system was implemented and I’ve only ever heard that it was done this way, 100%, to help customers who had previously been locking themselves out of their accounts. Any benefit to law enforcement was unintended but also unavoidable — the cost of prioritizing and preserving customer access.

John Gruber of Daring Fireball, who has as good sources inside Apple as anyone, wrote:

my own private conversations over the last several years, with rank-and-file Apple sources who’ve been directly involved with the company’s security engineering, is that Menn’s sources for the “Apple told the FBI that it planned to offer users end-to-end encryption when storing their phone data on iCloud” bit were the FBI sources, not the Apple sources, and that it is not accurate.

Since anyone who wants to keep everyone, including Apple, locked out of their backups simply has to toggle iCloud Backup off or, like I said before, use a Mac or PC to create fully encrypted backups, the damage to Apple’s reputation from backroom dealing like that wouldn’t really be worth it.

Especially since there’s no indication Apple has done anything to prevent iCloud Backups from being turned off, or to remove encrypted backup capabilities from the Mac or PC. Something that could easily have been done under the guise of dropping support for a legacy system.

Who’s thwarting who?

Under that plan, primarily designed to thwart hackers, Apple would no longer have a key to unlock the encrypted data, meaning it would not be able to turn material over to authorities in a readable form even under court order.

This also rings untrue to me. I can’t think of a single case where hackers have successfully gained access to Apple’s iCloud backup keys.

In every case, access to data has been achieved by gaining physical access to a device that has the keys, or socially engineering or otherwise gaining credentials to access and restore the iCloud backup from another remote device.

An Apple plan to enable end-to-end encrypted backups would only really thwart two groups of people: The users who lose access to their own accounts, as has happened in the past, and law enforcement agencies who want to subpoena the iCloud backups.

In private talks with Apple soon after, representatives of the FBI’s cyber crime agents and its operational technology division objected to the plan, arguing it would deny them the most effective means for gaining evidence against iPhone-using suspects, the government sources said.

I have no way to verify whether or not these private talks really happened — I suspect but cannot prove that there’s a ton of broken telephone going on here — but if this sounds like something the FBI would argue it’s because it’s something the FBI does in fact argue.

It’s also not accurate. Governments now have access to unprecedented amounts of data about all of us, almost all the time. In some cases that includes cameras and other forms of physical surveillance. In almost all cases, metadata about who we contact, when, where, and how.

When Apple spoke privately to the FBI about its work on phone security the following year, the end-to-end encryption plan had been dropped, according to the six sources.

So, two years ago, Apple had this plan and spoke to the FBI about it. A year ago, they spoke to the FBI about it again and said it had been dropped. But that’s also about when Tim Cook first mentioned Apple was working on exactly this plan. Which means, again, the timeline doesn’t really make sense.

Reuters could not determine why exactly Apple dropped the plan.

Which is a really interesting thing to say right after quoting six sources about the plan being dropped?

“Legal killed it, for reasons you can imagine,” another former Apple employee said he was told, without any specific mention of why the plan was dropped or if the FBI was a factor in the decision.

I can imagine many things, including Apple legal being worried about lawsuits from customers locked out of their data, even when it’s their own fault.

That person told Reuters the company did not want to risk being attacked by public officials for protecting criminals, sued for moving previously accessible data out of reach of government agencies or used as an excuse for new legislation against encryption.

Why not? Apple already gets attacked by public officials, the highest of officials and very publicly on Twitter, for exactly those reasons. It’s not a risk if it’s already happening.

End-to-end encrypting backups is also currently legal, and Apple already does it for PC backups. They can’t be sued for that, at least not successfully.

“They decided they weren’t going to poke the bear anymore,” the person said, referring to Apple’s court battle with the FBI in 2016 over access to an iPhone used by one of the suspects in a mass shooting in San Bernardino, California.

Also why not? Pressuring Apple over encryption isn’t just a risk for Apple, it’s a risk for the government as well because, like we saw when they withdrew from the San Bernardino case, they’re actually scared precedent wouldn’t go their way.

As to new legislation against encryption, Apple has said they believe it should be a legislative decision. They’ll fight it, of course, because it’s in the best interests of their customers to fight it, but as we discussed before, Apple will ultimately follow the law. And, there’s also no telling whether that law can or would pass. Overreaching information laws have been successfully defeated in the past.

Reuters then quotes two former FBI officials who weren’t present for the talks, which is just the opposite of informative.

However, a former Apple employee said it was possible the encryption project was dropped for other reasons, such as concern that more customers would find themselves locked out of their data more often.

Which, like I said before, has been only and exactly the rationale I’ve heard from people at Apple over the last few years.

Once the decision was made, the 10 or so experts on the Apple encryption project – variously code-named Plesio and KeyDrop – were told to stop working on the effort, three people familiar with the matter told Reuters.

This I really wonder about. It’s possible those specific projects were canned, but my understanding is that this is a discussion that is still ongoing at Apple.

And, it’s not atypical for several similar projects to be canceled in favor of better projects that ultimately achieve the same thing. That happens all the time.

Implementing optional encryption

Implementing a new architecture that keeps out bad actors, doesn’t lock out account owners, and still allows for end-to-end encryption that’s as considerate and forgiving as possible is the definition of non-trivial, and it absolutely has to be done right.

Look, all the arguments we have on Twitter and in comments about what Apple can and should do, those same arguments happen inside Apple. They’re not a monoculture or hive-mind, they’re a diverse group of passionate, over-achieving, type-A personalities with a lot of strong opinions about what should and shouldn’t be done and how. Up to and including the highest levels of the company.

And everything from the articles that get written to the videos that get made to the radars that get filed to the off-the-record conversations that take place helps to inform and empower those arguments. Because everyone wants their opinion to win out and will take the best and brightest backup they can get to help make sure it wins and stays won.

Like with how health data and keychain password data is end-to-end encrypted, even when backed up.

That’s ultimately why I’m really happy Reuters published this.

Not because it set off some needless panic, especially from people sharing it without doing even basic due diligence or critical thinking before panicking people with ginned-up controversies, manufactured outrage, and conspiracy theories.

But because it’s an incredibly important topic and it just may help propel it once again to the top of Apple’s iCloud roadmap. Yes, even as they’re still struggling to fix up and finish everything from Messages on iCloud to the last round of iOS 13 server-side changes.

Which, by the way, if you turn iCloud Backup on and then off, will still let you sync messages between your devices but will move the key from the iCloud Backup to your local device. This stuff is complicated.

Personally, I think it’s critically important for Apple to provide opt-in end-to-end encryption for iCloud backups. Moreover, on a dataset-by-dataset basis.

Because, contrary to the hype, end-to-end encryption isn’t always for the best. In many cases, it can be for the worst. Maybe I want my messages backup totally secure but still want to access my Photos on iCloud.com? I should be able to do that.

Basically, anything that would be more damaging and harmful to you if it were leaked than lost, you should be able to encrypt it. Again, Apple already does that by default for things like passwords and health data, but you should get to choose other types of data, any types of data that concern you.

And, anything that would be more damaging and harmful to you if it were lost than leaked, you should absolutely not encrypt even if you have the option. That’s the way iCloud backup works now and should still be the default, because it’s in the best interests of 99% of people 99% of the time.
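That per-dataset choice could be modeled as a simple policy table. This is a purely hypothetical sketch of the idea being argued for, not any real Apple API.

```python
# Hypothetical sketch of per-dataset opt-in encryption: each data class
# defaults to "recoverable" unless its leak risk outweighs its loss risk.
from enum import Enum

class Protection(Enum):
    RECOVERABLE = "provider holds a key; they can help you get data back"
    END_TO_END = "only your devices hold the key; lost means gone"

# Defaults mirroring the article: leak-sensitive data end-to-end by default,
# everything else recoverable.
policy = {
    "health": Protection.END_TO_END,
    "keychain": Protection.END_TO_END,
    "messages": Protection.RECOVERABLE,
    "photos": Protection.RECOVERABLE,
}

def opt_in(dataset: str) -> None:
    """User explicitly accepts the loss risk for one dataset."""
    policy[dataset] = Protection.END_TO_END

# A leak of messages would hurt more than their loss, so opt them in,
# while photos stay recoverable (and viewable on the web).
opt_in("messages")
```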

Totally not an easy system to architect in a way that isn’t unnecessarily burdensome or error-prone for end-users, but totally Apple’s job to figure out.

And I hope Apple figures out and ships it, and soon, even if I personally would never turn it on, for exactly the reasons I’ve repeated here… repeatedly.

But for the benefit of every dissident, whistleblower, journalist, oppressed minority, person at risk, or privacy advocate who would.

 


The Internet is Littered in ‘Educated Guesses’ Without the ‘Education’


Although no one likes a know-it-all, they dominate the Internet.

The Internet began as a vast repository of information. It quickly became a breeding ground for self-proclaimed experts seeking what most people desire: recognition and money.

Today, anyone with an Internet connection and some typing skills can position themselves, regardless of their education or experience, as a subject matter expert (SME). From relationship advice, career coaching, and health and nutrition tips to citizen journalists practicing pseudo-journalism, the Internet is awash with individuals—Internet talking heads—sharing their “insights,” which are, in large part, essentially educated guesses without the education or experience.

The Internet has become a 24/7/365 sitcom where armchair experts think they’re the star.

Not long ago, years, sometimes decades, of dedicated work and education in one’s field were required to be recognized as an expert. The knowledge and opinions of doctors, scientists, historians, et al. were respected due to their education and experience. Today, a social media account and a knack for hyperbole are all it takes to present oneself as an “expert” and achieve Internet fame that can be monetized.

On the Internet, nearly every piece of content is self-serving in some way.

The line between actual expertise and self-professed knowledge has become as blurry as an out-of-focus selfie. Inadvertently, social media platforms have created an informal degree program where likes and shares are equivalent to degrees. After reading a few selectively chosen articles and watching some TikTok videos, a person can post a video claiming they’re an herbal medicine expert. Their new “knowledge,” which their followers will absorb, claims that Panda dung tea, one of the most expensive teas in the world and not what its name implies, cures everything from hypertension to existential crisis. Meanwhile, registered dietitians are shaking their heads, wondering how to compete against all the misinformation their clients are exposed to.

More disturbing are individuals obsessed with evangelizing their beliefs or conspiracy theories. These people write in-depth blog posts, such as Elvis Is Alive and the Moon Landings Were Staged, with links to obscure YouTube videos, websites, social media accounts, and blogs. Regardless of your beliefs, someone or some group on the Internet shares them, thus confirming them.

Misinformation is the Internet’s currency used to get likes, shares, and engagement; thus, it often spreads like a cosmic joke. Consider the prevalence of clickbait headlines:

  • You Won’t Believe What Taylor Swift Says About Climate Change!
  • This Bedtime Drink Melts Belly Fat While You Sleep!
  • In One Week, I Turned $10 Into $1 Million!

Titles that make outrageous claims are how the content creator gets reads and views, which generates revenue via affiliate marketing, product placement, and pay-per-click (PPC) ads. Clickbait headlines are how you end up watching a TikTok video by a purported nutrition expert adamantly asserting you can lose belly fat while you sleep by drinking, for 14 consecutive days, a concoction of raw eggs, cinnamon, and apple cider vinegar 15 minutes before going to bed.

Our constant search for answers that’ll explain our convoluted world and our desire for shortcuts to success is how Internet talking heads achieve influencer status. Because we tend to seek low-hanging fruits, we listen to those with little experience or knowledge of the topics they discuss yet are astute enough to know what most people want to hear.

There’s a trend, more disturbing than spreading misinformation, that needs to be called out: individuals who’ve never achieved significant wealth or traded stocks giving how-to-make-easy-money advice, the appeal of which is undeniable. Several people I know have lost substantial money by following the “advice” of Internet talking heads.

Anyone on social media claiming to have a foolproof money-making strategy is lying. They wouldn’t be peddling their money-making strategy if they could make easy money.

Successful people tend to be secretive.

Social media companies design their algorithms to serve the interests of their advertisers, their source of revenue; hence, content from Internet talking heads appears most prominently in your feeds. When a video by a self-professed expert goes viral, likely because it pressed an emotional button, the more people see it, the more engagement it receives, such as likes, shares and comments, creating a cycle akin to a tornado.

Imagine scrolling through your TikTok feed and stumbling upon a “scientist” who claims they can predict the weather using only aluminum foil, copper wire, sea salt and baking soda. You chuckle, but you notice his video got over 7,000 likes, has been shared over 600 times and received over 400 comments. You think to yourself, “Maybe this guy is onto something.” What started as a quest to achieve Internet fame evolved into an Internet-wide belief that weather forecasting can be as easy as DIY crafts.

Since anyone can call themselves “an expert,” you must cultivate critical thinking skills to distinguish genuine expertise from self-professed experts’ self-promoting nonsense. While the absurdity of the Internet can be entertaining, misinformation has serious consequences. The next time you read a headline that sounds too good to be true, it’s probably an Internet talking head making an educated guess, without the education, chasing Internet fame they can monetize.

______________________________________________________________

 

Nick Kossovan, a self-described connoisseur of human psychology, writes about what’s on his mind from Toronto. You can follow Nick on Twitter and Instagram @NKossovan.

 


Tight deadlines on software projects can put safety at risk: survey


TORONTO – A new survey says a majority of software engineers and developers feel tight project deadlines can put safety at risk.

Seventy-five per cent of the 1,000 global workers who responded to the survey released Tuesday say pressure to deliver projects on time and on budget could be compromising critical aspects like safety.

The concern is even higher among engineers and developers in North America, with 77 per cent of those surveyed on the continent reporting the urgency of projects could be straining safety.

The study was conducted between July and September by research agency Coleman Parkes and commissioned by BlackBerry Ltd.’s QNX division, which builds connected-car technology.

The results reflect a timeless tug of war engineers and developers grapple with as they balance the need to meet project deadlines with regulations and safety checks that can slow down the process.

Finding that balance is an issue that developers of even the simplest appliances face because of advancements in technology, said John Wall, a senior vice-president at BlackBerry and head of QNX.

“The software is getting more complicated and there is more software whether it’s in a vehicle, robotics, a toaster, you name it… so being able to patch vulnerabilities, to prevent bad actors from doing malicious acts is becoming more and more important,” he said.

The medical, industrial and automotive industries have standardized safety measures and anything they produce undergoes rigorous testing, but that work doesn’t happen overnight. It has to be carried out from the start and then at every step of the development process.

“What makes safety and security difficult is it’s an ongoing thing,” Wall said. “It’s not something where you’ve done it, and you are finished.”

The Waterloo, Ont.-based business found 90 per cent of its survey respondents reported that organizations are prioritizing safety.

However, when asked about why safety may not be a priority for their organization, 46 per cent of those surveyed answered cost pressures and 35 per cent said a lack of resources.

That doesn’t surprise Wall. Delays have become rampant in the development of tech, and in some cases, stand to push back the launch of vehicle lines by two years, he said.

“We have to make sure that people don’t compromise on safety and security to be able to get products out quicker,” he said.

“What we don’t want to see is people cutting corners and creating unsafe situations.”

The survey also took a peek at security breaches, which have hit major companies like London Drugs, Indigo Books & Music, Giant Tiger and Ticketmaster in recent years.

About 40 per cent of the survey’s respondents said they have encountered a security breach in their employer’s operating system. Those breaches resulted in major impacts for 27 per cent of respondents, moderate impacts for 42 per cent and minor impacts for 27 per cent.

“There are vulnerabilities all the time and this is what makes the job very difficult because when you ship the software, presumably the software has no security vulnerabilities, but things get discovered after the fact,” Wall said.

Security issues, he added, have really come to the forefront of the problems developers face, so “really without security, you have no safety.”

This report by The Canadian Press was first published Oct. 8, 2024.

Companies in this story: (TSX:BB)

The Canadian Press. All rights reserved.


Beware of scams during Amazon’s Prime Big Deal Days sales event: cybersecurity firm


As online shoppers hunt for bargains offered by Amazon during its annual fall sale this week, cybersecurity researchers are warning Canadians to beware of an influx of scammers posing as the tech giant.

In the 30 days leading up to Amazon’s Prime Big Deal Days, taking place Tuesday and Wednesday, there were more than 1,000 newly registered Amazon-related web domains, according to Check Point Software Technologies, a company that offers cybersecurity solutions.

The company said it deemed 88 per cent of those domains malicious or suspicious, suggesting they could have been set up by scammers to prey on vulnerable consumers. One in every 54 newly created Amazon-related domains included the phrase “Amazon Prime.”

“They’re almost indiscernible from the real Amazon domain,” said Robert Falzon, head of engineering at Check Point in Canada.

“With all these domains registered that look so similar, it’s tricking a lot of people. And that’s the whole intent here.”
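One crude way to sketch the kind of lookalike check involved, assuming nothing about Check Point’s actual methods, is string similarity to the real domain plus a brand-keyword heuristic. The threshold and rules here are purely illustrative.

```python
# Crude sketch of a lookalike-domain heuristic (illustrative only; real
# detection pipelines use many more signals than this).
from difflib import SequenceMatcher

REAL = "amazon.com"

def looks_suspicious(domain: str) -> bool:
    d = domain.lower()
    # High string similarity to the real domain, but not the real domain.
    similar = SequenceMatcher(None, d, REAL).ratio() > 0.6
    # Contains the brand name without actually being an official domain.
    brand_bait = "amazon" in d and not (d == REAL or d.endswith("." + REAL))
    return brand_bait or (similar and d != REAL)

print(looks_suspicious("amazon-prime-deals.com"))  # True: brand bait
print(looks_suspicious("amazon.com"))              # False: the real thing
```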

Falzon said Check Point Research sees an uptick in attempted scams around big online shopping days throughout the year, including Prime Days.

Scams often come in the form of phishing emails, which are deceptive messages that appear to be from a reputable source in an attempt to steal sensitive information.

In this case, he said scammers posing as Amazon commonly offer “outrageous” deals that appear to be associated with Prime Days, in order to trick recipients into clicking on a malicious link.

The cybersecurity firm said it has identified and blocked 100 unique Amazon Prime-themed scam emails targeting organizations and consumers over the past two weeks.

Scammers also target Prime members with unsolicited calls, claiming urgent account issues and requesting payment information.

“It’s like Christmas for them,” said Falzon.

“People expect there to be significant savings on Prime Day, so they’re not shocked that they see something of significant value. Usually, the old adage applies: If it seems too good to be true, it probably is.”

Amazon’s website lists a number of red flags that it recommends customers watch for to identify a potential impersonation scam.

Those include false urgency, requests for personal information, or indications that the sender prefers to complete the purchase outside of the Amazon website or mobile app.

Scammers may also request that customers exclusively pay with gift cards, a claim code or PIN. Any notifications about an order or delivery for an unexpected item should also raise alarm bells, the company says.

“During busy shopping moments, we tend to see a rise in impersonation scams reported by customers,” said Amazon spokeswoman Octavia Roufogalis in a statement.

“We will continue to invest in protecting consumers and educating the public on scam avoidance. We encourage consumers to report suspected scams to us so that we can protect their accounts and refer bad actors to law enforcement to help keep consumers safe.”

Falzon added that these scams are more successful than people might think.

As of June 30, the Canadian Anti-Fraud Centre said there had been $284 million lost to fraud so far this year, affecting 15,941 victims.

But Falzon said many incidents go unreported, as some Canadians who are targeted do not know how or where to flag a scam, or may choose not to out of embarrassment.

Check Point recommends Amazon customers take precautions while shopping on Prime Days, including checking URLs carefully, creating strong passwords on their accounts, and avoiding sharing personal information such as their birthday or social security number.

The cybersecurity company said consumers should also look for “https” at the beginning of a website URL, which indicates a secure connection, and use credit cards rather than debit cards for online shopping, which offer better protection and less liability if stolen.

This report by The Canadian Press was first published Oct. 8, 2024.
