OTTAWA – How old are you? What is your gender? Are you Indigenous? Are you a Canadian citizen? Do you have a family?
Those are just a few of the data points that a new artificial intelligence system will use to determine whether somebody might be at risk of chronic homelessness in Ottawa, through a partnership with a Carleton University researcher.
The national capital is not the first municipality to use the emerging technology as a tool to mitigate a worsening crisis — London, Ont., previously pioneered a similar project, while in California, Los Angeles has an initiative that identifies individuals at risk of becoming homeless.
As cities increasingly turn to AI, some advocates are raising concerns about privacy and bias. But those behind the project insist it is just one tool to help determine who might need help.
The researcher developing the Ottawa project, Majid Komeili, said the system uses personal data such as age, gender, Indigenous status, citizenship status and whether the person has a family on record.
It also looks at factors such as how many times they may previously have been refused service at a shelter, and the reasons they received services.
The system will also use external data such as information about the weather and economic indicators like the consumer price index and unemployment rate. Komeili said the system will predict how many nights the individual will stay in a shelter in six months’ time.
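As described, the system combines person-level records with external indicators to predict a single number: how many nights a person will spend in a shelter six months out. A minimal sketch of that framing follows; the feature names, weights, and linear model are illustrative assumptions, since the article does not specify the algorithm the project actually trains.

```python
# Hypothetical illustration of the prediction framing described above:
# person-level and external features in, predicted shelter nights out.
# Feature names and the toy linear model are assumptions, not the real system.
from dataclasses import dataclass

@dataclass
class Features:
    age: float
    prior_refusals: int       # times previously refused service at a shelter
    has_family: bool          # whether the person has a family on record
    unemployment_rate: float  # external economic indicator
    avg_temp_c: float         # external weather indicator

def predict_shelter_nights(f: Features) -> float:
    # Toy linear score standing in for whatever model the project trains.
    score = (0.1 * f.age + 5.0 * f.prior_refusals
             - 10.0 * f.has_family
             + 2.0 * f.unemployment_rate - 0.5 * f.avg_temp_c)
    # Predicted nights can't be negative.
    return max(0.0, score)
```

The point of the sketch is only the shape of the problem: a regression over mixed personal and environmental inputs, whose output a human caseworker would then review.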
“This will be a tool in the service providers’ toolbox, ensuring that no one falls through the cracks because of (a) human mistake. The final decision-maker will remain a human,” he said in an email.
That information is available in the first place because people are already "highly tracked" in order to receive various benefits or treatments, argued Renee Sieber, an associate professor at McGill University.
“Homeless people, unfortunately, are incredibly surveilled, and the data is very intrusive,” Sieber said.
The data might include details about medical appointments, drug addictions, relapses and HIV status.
Sieber said it’s important to ask whether AI technology is really necessary. “Do you know any more about chronic homelessness with AI than you did with a spreadsheet?”
It was only a matter of time before AI got involved, suggested Tim Richter, president of the Canadian Alliance to End Homelessness.
Though they are not widespread, such tools “can to a degree probably anticipate who’s more likely to experience homelessness or chronic homelessness,” he said. “Using AI to do that could be very helpful in targeting interventions to people.”
Most places do not have good enough data to establish such systems, said Richter.
His organization is working with cities across the country, including London and Ottawa, to help collect better “real-time, person-specific” information — “in a way that protects their privacy.”
Chronic homelessness means an individual has been homeless for more than six months, or has experienced repeated episodes of homelessness over that time frame.
While 85 per cent of people are in and out of homelessness quickly, some 15 to 20 per cent “get stuck,” Richter said.
AI systems should be able to do their job and flag individuals who are at risk by looking at aggregate, community-level data, without knowing the specific identity of the individuals involved, said Richter.
That’s the approach the Ottawa project is taking. Identifiable information like names and contact information is replaced by codes.
“There is a master list that includes the linkages between the identifier codes and user identities. AI training and testing operate solely on the coded dataset. The master list is stored separately on a secure server with restricted access,” Komeili explained.
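The coded-dataset approach Komeili describes can be sketched in a few lines: identifying values are swapped for random codes, the model only ever sees the coded records, and the code-to-identity mapping lives elsewhere under restricted access. The field names and code format below are illustrative assumptions, not details of the Ottawa system.

```python
import secrets

def pseudonymize(records, id_field="name"):
    """Replace identifying values with random codes.

    Returns the coded records (what AI training and testing would see)
    and a master list mapping codes back to identities, which would be
    stored separately on an access-restricted server.
    Field names are illustrative, not the city's actual schema.
    """
    master_list = {}
    coded = []
    for rec in records:
        code = "ID-" + secrets.token_hex(4)  # unpredictable identifier code
        master_list[code] = rec[id_field]
        coded.append({**{k: v for k, v in rec.items() if k != id_field},
                      "code": code})
    return coded, master_list

# Model training would operate solely on `coded`; `master_list` stays elsewhere.
coded, master = pseudonymize([{"name": "Jane Doe", "age": 42}])
```

The separation matters: re-identification requires joining the coded dataset against the master list, so restricting who can access the list is what makes the coded data safe to work with.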
He noted the system uses data that has already been gathered in previous years and isn’t specifically being collected for use by AI.
Vinh Nguyen, the City of Ottawa’s manager of social policy, research and analytics, said in a statement that any sharing of data collected by the city “undergoes rigorous internal review and scrutiny.”
“Data we share is often aggregated and where that is not possible, all identifiable information is removed to ensure strict anonymity of users,” he said, adding that collaborations with academics must be reviewed by an ethics board before data work takes place.
Nguyen said the city is currently conducting “internal testing and validation” and plans to consult with the shelter sector and clients before implementing the model, with consultations planned for late fall.
Alina Turner, co-founder of HelpSeeker, a company that uses AI in products dealing with social issues, said the “superpowers” of AI can be useful when it comes to comprehensive analysis of the factors and trends that feed into homelessness.
But her company made a conscious choice to stay away from predicting individual-level risk, she said.
“You can just get into a lot of trouble with bias in that,” she said, noting that data vary between different communities and “the racial bias of that data is a major challenge as well.”
One long-acknowledged problem with AI is that its analysis is only as good as the data fed into it. When that data comes from a society with systemic racism built into its systems, AI predictions can perpetuate it.
For instance, due to systemic factors, Indigenous individuals are at a higher risk of homelessness.
If an AI system were to automatically give someone a higher score once they come into a shelter and identify as Indigenous, though, “there’s a lot of ethical issues with taking that approach,” Turner argued.
Komeili, the Ottawa researcher, said bias is a “known issue with similar AI-powered products.” He noted humans have biases too, and different individuals might make different recommendations.
“One advantage of an AI-based approach is that, when used as an assistive tool in the toolbox of human experts, it can help them converge on a standard approach. Such an assistive tool helps human experts avoid missing important details and may reduce the likelihood of human errors.”
Luke Stark, an assistant professor at Western University, is working on a project studying the use of data and AI for homelessness policy in Canada, including the existing AI initiative in London, Ont.
He said another problem that human decision-makers need to think about is how predictions can cause certain segments of the homeless population to be missed.
Women are more likely to avoid shelters for safety reasons, and are more likely to turn to options such as couch surfing, he noted.
An AI system using data from the shelter system will focus on “the kind of person who already uses the shelter system … and that leaves out a whole bunch of people.”
Stark described predictive systems as the latest technology that risks obscuring the root causes of homelessness.
“One concern that we have is that all this attention to these triage-based solutions then takes the pressure off of policy-makers to actually look at those structural causes of homelessness that are there in the first place,” he said.
As Richter put it: “Ultimately, the key to ending homelessness is housing.”
This report by The Canadian Press was first published Aug. 4, 2024.