When Robert Winters was single back in 2017, he used artificial intelligence to find more matches on Tinder. He downloaded AI software that automatically swiped on profiles and started conversations for him, leading to hundreds of dating prospects. Not long after, Tinder banned him.
“Before the AI-Tinder hack that I implemented, there was maybe a dozen matches [over] a few days,” he said. “The difference is day and night.”
That program is far more savvy than the AI tools most people would use to find a date. Still, the 39-year-old digital design strategist from Belgium says even simple AI aids can optimize online dating.
Last year, Winters started using AI to generate photos of himself for his dating profile.
During that same period, a wave of new AI-powered dating tools hit the market. They can help improve users’ dating profiles, assist with texting or even go on “first dates” for people. But their release has also opened up a debate on the ethics of their use.
“There’s this question of: to what extent should we be allowing people to use AI to represent themselves?” said Liesel Sharabi, a dating app researcher and Arizona State University associate professor.
“When does that become deceptive, and when is it helpful?”
Dating profile enhancement
Among other features, a slew of websites and phone apps say they can help people create a better-curated first impression on dating apps. These kinds of AI services will:
Write your bio
Write your prompt responses
Identify your best pictures
Create AI-generated images of you
Dmitri Mirakyan is the co-founder of YourMove AI, a website and app that offers an AI dating profile generator and reviewer, among other services. He says his company has written over 500,000 profiles and his website gets about 200,000 visitors every month.
About a third of users are young men, he estimated.
He says some people use his services because they’re introverted or are older and unfamiliar with dating apps.
“We help these folks get a leg up and dive into online dating because marketing yourself is hard,” Mirakyan said.
Kathryn Coduto, an online dating researcher and Boston University assistant professor, said while these tools may be useful, they can also make people appear less authentic.
“When AI is used to create a profile, it doesn’t really feel like you anymore. It feels like a computer trying to figure out who you are.”
Her research found that many people are hesitant to trust AI, even if there are benefits.
Pickup lines
Some apps offer users a Rolodex of AI-generated openers. One shown in a TikTok ad says, “I’m not sure how to put this. I usually go for sevens, but I guess I’ll settle for a 10.”
Often, users can request a certain tone, from something sweet to something spicy. But developers say that no matter the style, their programs boost users’ confidence by helping them take that crucial first step while educating them on ways to improve their communication skills.
Unlike using AI for profile enhancements, Coduto said AI-generated pickup lines reflect a long-established practice — as many people rely on friends for help with openers.
“Is AI really different from friends when it comes to opening lines?”
“Both guys and girls need help when opening up on dating apps. It’s an artificial environment that is not the same as in real life,” said Roman Khaves, co-founder of Rizz, an AI dating assistant. “Opening up is very nerve-wracking for a lot of users.”
Khaves says his app has had 3.5 million users since it launched last spring.
Coduto said men are often under far more pressure because they’re “still expected to send that first message or to have that really great opening line.”
Sometimes, the pickup lines can sound unnatural or silly, but that doesn’t mean people can’t change them.
“I think there’s an argument that you can learn from AI, particularly when we think about things like pickup lines.”
Messaging assistant
Apps including Rizz and YourMove AI also allow users to upload screenshots of their online conversations to an AI scanner that suggests how to respond.
Dating apps like Bumble already include prompts to help people chat more easily.
These kinds of texting aids can play a key role in getting people to meet up in person, which is often a major goal for online daters, said Jevan Huston, an AI and dating app researcher and Hintze Law associate.
Assistive technology can ease people’s anxieties and “allow them to engage when they otherwise wouldn’t.”
These texting aid apps are often subscription-based, and many people online say the costs are prohibitive. For example, Rizz offers a three-day free trial, but a week’s subscription costs $9.99, and a year’s costs $99.99.
Regardless, say a user finally lands the date they’ve been hoping for. Some might still find themselves in a Catch-22.
“AI is not going to help you have that real-life conversation,” Coduto said.
“If you are communicating solely via AI or you’re really being assisted by AI, I definitely think that could be a form of catfishing.”
In defence of these kinds of services, Huston said people often present themselves differently online, whether that’s on social media or dating apps.
Another point of debate involves disclosure. Users will have to decide if, how and when it’s necessary to tell a date about using AI in conversation.
“Starting off on an honest foot is really important,” Coduto said.
Mirakyan said people using his chat assistant should be “as transparent as people that have a prosthetic leg should be transparent about the fact that they’re using a prosthetic leg to walk.”
“I don’t feel like anybody should be obligated to disclose that they’re an introvert or that, like, they’re drinking a beer to overcome the fact that they’re introverted and want to be more sociable.”
Another consideration is that uploaded screenshots capture a two-way conversation, so the other person would likely be unaware that their chats are being captured, shared and possibly stored by a third party.
Mirakyan and Khaves said their technology doesn’t save people’s information and only extracts and analyzes the text in conversations.
AI goes on dates for you
Some companies are alleviating people’s dating fatigue by creating AI-simulated blind dates.
Volar Dating, which launched in the U.S. earlier this year, is one such company. In a brief onboarding process, users tell the chatbot about themselves — their age, location, hobbies.
Then, using AI, the bot simulates a first date between two people.
A CBC test found that the AI sometimes extrapolates information to make up new talking points that may not be true, such as saying an avid reader read a particular book when they didn’t.
Once matched, users can then decide whether they’d like to send a message request to actually talk to the other person.
“AI could be used to cut down on the amount that people are just swiping on dating apps,” said Sharabi, of Arizona State University. “It’s quite a bit different from how we engage with dating apps currently.”
Date an AI
Last year, Replika, one of the leading AI chatbot companion firms, launched Blush, an app exclusively for AI dating.
It works like a dating app, but the people aren’t real — they’re AI personalities, each one with its own backstory.
Omri Gillath, a social psychology professor at the University of Kansas, says these relationships, which he called parasocial, aren’t healthy in the long term.
But in the short term, he says, they could create a safe space for some people to express their attachment needs.
“That said, as a society, we need to ask ourselves, is that the solution?”
On its website, Blush says its app lets people practise dating in a controlled environment, and then apply those lessons in the real world, something Gillath is skeptical about.
He pointed out that Blush and other similar programs include anime and furry characters that people can engage with.
“So is that your practice for the real world, or is this just your way to fulfil your preferences?”
As with the other AI dating apps, there is still not much conclusive data about the effects of these products on people’s behaviour.
He said one thing is clear: “The further away that you get from face-to-face, in-person, human touch, the further you’re getting from what we evolved to do and evolved to be.”
Huston says that whatever stance people take on intimate relationships with AI, they should consider that there is an epidemic of loneliness in many societies.
“If it can aid loneliness and provide, whether it’s partnership, companionship, someone to listen … I think that’s a value and something that should not be disregarded.”
Do these apps work?
Success can be measured in different ways, but it’s hard to know what it looks like for AI dating apps, as there isn’t a lot of research and much of what exists isn’t public, Coduto said.
Anecdotally, Mirakyan of YourMove AI says he has heard success stories from users.
“Quite a few people have told me that they’ve found relationships through [using conversation tools],” he said.
“I’m going to be responsible for at least a couple of kids at this point.”
Meanwhile, Winters says the technology provides a kind of skill training in a world that has shifted online and away from spontaneous in-person interactions, especially for men.
He himself is no longer on the dating market. He met his girlfriend in person at an after-work event.
The federal government is ordering the dissolution of TikTok’s Canadian business after a national security review of the Chinese company behind the social media platform, but stopped short of ordering people to stay off the app.
Industry Minister François-Philippe Champagne announced the government’s “wind up” demand Wednesday, saying it is meant to address “risks” related to ByteDance Ltd.’s establishment of TikTok Technology Canada Inc.
“The decision was based on the information and evidence collected over the course of the review and on the advice of Canada’s security and intelligence community and other government partners,” he said in a statement.
The announcement added that the government is not blocking Canadians’ access to the TikTok application or their ability to create content.
However, it urged people to “adopt good cybersecurity practices and assess the possible risks of using social media platforms and applications, including how their information is likely to be protected, managed, used and shared by foreign actors, as well as to be aware of which country’s laws apply.”
Champagne’s office did not immediately respond to a request for comment seeking details about what evidence led to the government’s dissolution demand, how long ByteDance has to comply and why the app is not being banned.
A TikTok spokesperson said in a statement that the shutdown of its Canadian offices will mean the loss of hundreds of well-paying local jobs.
“We will challenge this order in court,” the spokesperson said.
“The TikTok platform will remain available for creators to find an audience, explore new interests and for businesses to thrive.”
The federal Liberals ordered a national security review of TikTok in September 2023, but it was not public knowledge until The Canadian Press reported in March that the government was investigating the company.
At the time, it said the review was based on the expansion of a business, which it said constituted the establishment of a new Canadian entity. It declined to provide any further details about what expansion it was reviewing.
A government database showed a notification of new business from TikTok in June 2023. It said Network Sense Ventures Ltd. in Toronto and Vancouver would engage in “marketing, advertising, and content/creator development activities in relation to the use of the TikTok app in Canada.”
Even before the review, ByteDance and TikTok were lightning rods for privacy and safety concerns because Chinese national security laws compel organizations in the country to assist with intelligence gathering.
Such concerns led the U.S. House of Representatives to pass a bill in March designed to ban TikTok unless its China-based owner sells its stake in the business.
Champagne’s office has maintained Canada’s review was not related to the U.S. bill, which has yet to pass.
Canada’s review was carried out through the Investment Canada Act, which allows the government to investigate any foreign investment that could harm national security.
While cabinet can make investors sell parts of the business or shares, Champagne has said the act doesn’t allow him to disclose details of the review.
Wednesday’s dissolution order was made in accordance with the act.
The federal government banned TikTok from its mobile devices in February 2023 following the launch of an investigation into the company by federal and provincial privacy commissioners.
— With files from Anja Karadeglija in Ottawa
This report by The Canadian Press was first published Nov. 6, 2024.
LONDON (AP) — Most people have accumulated a pile of data — selfies, emails, videos and more — on their social media and digital accounts over their lifetimes. What happens to it when we die?
It’s wise to draft a will spelling out who inherits your physical assets after you’re gone, but don’t forget to take care of your digital estate too. Friends and family might treasure files and posts you’ve left behind, but they could get lost in digital purgatory after you pass away unless you take some simple steps.
Here’s how you can prepare your digital life for your survivors:
Apple
The iPhone maker lets you nominate a “legacy contact” who can access your Apple account’s data after you die. The company says it’s a secure way to give trusted people access to photos, files and messages. To set it up you’ll need an Apple device with a fairly recent operating system — iPhones and iPads need iOS or iPadOS 15.2, and Macs need macOS Monterey 12.1.
For iPhones, go to Settings, tap Sign-in & Security and then Legacy Contact. You can name one or more people, and they don’t need an Apple ID or device.
You’ll have to share an access key with your contact. It can be a digital version sent electronically, or you can print a copy or save it as a screenshot or PDF.
Take note that there are some types of files you won’t be able to pass on — including digital rights-protected music, movies and passwords stored in Apple’s password manager. Legacy contacts can only access a deceased user’s account for three years before Apple deletes the account.
Google
Google takes a different approach with its Inactive Account Manager, which allows you to share your data with someone if it notices that you’ve stopped using your account.
When setting it up, you need to decide how long Google should wait — from three to 18 months — before considering your account inactive. Once that time is up, Google can notify up to 10 people.
You can write a message informing them you’ve stopped using the account, and, optionally, include a link to download your data. You can choose what types of data they can access — including emails, photos, calendar entries and YouTube videos.
There’s also an option to automatically delete your account after three months of inactivity, so your contacts will have to download any data before that deadline.
Facebook and Instagram
Some social media platforms can preserve accounts for people who have died so that friends and family can honor their memories.
When users of Facebook or Instagram die, parent company Meta says it can memorialize the account if it gets a “valid request” from a friend or family member. Requests can be submitted through an online form.
The social media company strongly recommends Facebook users add a legacy contact to look after their memorial accounts. Legacy contacts can do things like respond to new friend requests and update pinned posts, but they can’t read private messages or remove or alter previous posts. You can only choose one person, who also has to have a Facebook account.
You can also ask Facebook or Instagram to delete a deceased user’s account if you’re a close family member or an executor. You’ll need to send in documents like a death certificate.
TikTok
The video-sharing platform says that if a user has died, people can submit a request to memorialize the account through the settings menu. Go to the Report a Problem section, then Account and profile, then Manage account, where you can report a deceased user.
Once an account has been memorialized, it will be labeled “Remembering.” No one will be able to log into the account, which prevents anyone from editing the profile or using the account to post new content or send messages.
X
It’s not possible to nominate a legacy contact on Elon Musk’s social media site. But family members or an authorized person can submit a request to deactivate a deceased user’s account.
Passwords
Besides the major online services, you’ll probably have dozens if not hundreds of other digital accounts that your survivors might need to access. You could just write all your login credentials down in a notebook and put it somewhere safe. But making a physical copy presents its own vulnerabilities. What if you lose track of it? What if someone finds it?
Instead, consider a password manager that has an emergency access feature. Password managers are digital vaults that you can use to store all your credentials. Some, like Keeper, Bitwarden and NordPass, allow users to nominate one or more trusted contacts who can access their keys in case of an emergency such as a death.
But there are a few catches: Those contacts also need to use the same password manager and you might have to pay for the service.
LONDON (AP) — Britain’s competition watchdog said Thursday it’s opening a formal investigation into Google’s partnership with artificial intelligence startup Anthropic.
The Competition and Markets Authority said it has “sufficient information” to launch an initial probe after it sought input earlier this year on whether the deal would stifle competition.
The CMA has until Dec. 19 to decide whether to approve the deal or escalate its investigation.
“Google is committed to building the most open and innovative AI ecosystem in the world,” the company said. “Anthropic is free to use multiple cloud providers and does, and we don’t demand exclusive tech rights.”
San Francisco-based Anthropic was founded in 2021 by siblings Dario and Daniela Amodei, who previously worked at ChatGPT maker OpenAI. The company has focused on increasing the safety and reliability of AI models. Google reportedly agreed last year to make a multibillion-dollar investment in Anthropic, which has a popular chatbot named Claude.
Anthropic said it’s cooperating with the regulator and will provide “the complete picture about Google’s investment and our commercial collaboration.”
“We are an independent company and none of our strategic partnerships or investor relationships diminish the independence of our corporate governance or our freedom to partner with others,” it said in a statement.
The U.K. regulator has been scrutinizing a raft of AI deals as investment money floods into the industry to capitalize on the artificial intelligence boom. Last month it cleared Anthropic’s $4 billion deal with Amazon and it has also signed off on Microsoft’s deals with two other AI startups, Inflection and Mistral.