Canada rejected the permanent residence application of a McMaster postdoc from the Sorbonne who works on the immunology of ageing
Source: Mastodon
Canada’s immigration agency has rejected the permanent‑residence application of a McMaster University postdoctoral researcher who earned her doctorate at the Sorbonne and studies the immunology of ageing. The denial, issued by Immigration, Refugees and Citizenship Canada (IRCC), cites an “incomplete” application – but the underlying cause was a generative‑AI system that hallucinated her academic credentials, mistakenly flagging her as lacking the required qualifications.
The incident shines a spotlight on the growing reliance on large language models to triage and evaluate immigration files. IRCC introduced the AI tool earlier this year to accelerate processing and reduce manual workload, but the technology’s propensity for fabricating or misinterpreting data has now produced a concrete, high‑stakes error. For a country that depends on skilled researchers to sustain its knowledge‑based economy, a false rejection threatens both individual careers and the broader talent pipeline.
Legal experts note that applicants can appeal IRCC decisions, yet the opacity of AI‑driven assessments complicates the evidentiary basis for a challenge. The case may prompt a review of the agency’s AI governance framework, including requirements for human verification, audit trails and bias mitigation. Advocacy groups are already calling for a pause on fully automated decision‑making until robust safeguards are in place.
Watch for IRCC’s official response, which is expected within the next two weeks, and for any court filings by the researcher or her legal counsel. Parallel developments – such as the federal government’s upcoming AI‑ethics legislation and other reported AI mishaps in public services – will indicate whether Canada will tighten oversight or double down on automation in its immigration system.