A nearly $1.6 million healthcare report commissioned by the government of Newfoundland and Labrador is under scrutiny for containing significant errors, including citations of non-existent academic papers. The consulting firm Deloitte Canada has acknowledged using artificial intelligence in the report's creation but maintains the core findings are sound.
The incident marks the second time this year a government-commissioned Deloitte report has been found to contain AI-related inaccuracies, following a similar case in Australia. This raises new questions about the reliability of AI tools in crafting critical public policy documents.
Key Takeaways
- A 526-page Deloitte report for a Canadian province cost nearly $1.6 million and contained fabricated academic citations.
- Researchers were listed as authors of papers they never wrote, and some cited journals appear not to exist.
- Deloitte Canada admitted to using AI for a "small number of research citations" but denies it was used to write the report.
- This follows a similar incident where Deloitte used AI in a flawed $290,000 report for the Australian government.
Report's Purpose and Flawed Foundation
The extensive 526-page document was intended to guide the Newfoundland and Labrador Department of Health and Community Services. It was commissioned to provide advice on pressing issues such as virtual care, staff retention incentives, and the lingering effects of the COVID-19 pandemic on the healthcare system.
The province is currently grappling with significant shortages of nurses and doctors, making the report's recommendations potentially influential for future policy. However, an investigation by The Independent, a local news outlet, uncovered serious problems with the report's sourcing.
The investigation found instances of what are commonly known as AI "hallucinations." These included references to academic papers that do not exist, incorrect author attributions on fabricated studies, and citations to journals with no record of the mentioned articles.
By the Numbers
- $1.6 Million: The approximate cost of the report to the Canadian province.
- 526 Pages: The total length of the healthcare strategy document.
- 2: The number of countries where Deloitte has faced questions this year over AI-generated errors in government reports.
Researchers Respond to False Citations
One of the academics incorrectly cited in the report was Gail Tomblin Murphy, an adjunct professor at Dalhousie University's School of Nursing. She was listed as a co-author on a paper that she confirmed "does not exist."
"It sounds like if you’re coming up with things like this, they may be pretty heavily using AI to generate work," Tomblin Murphy stated. "We have to be very careful to make sure that the evidence that’s informing reports [is] the best evidence, that it’s validated evidence."
She stressed the importance of accuracy in such documents, not only because of their cost to the public but because their purpose is to provide evidence-based guidance to solve critical problems.
Deloitte's Position and a Pattern of Errors
In response to the findings, Deloitte Canada issued a statement defending the overall integrity of its work. "Deloitte Canada firmly stands behind the recommendations put forward in our report," a spokesperson said.
The firm acknowledged the citation issues but downplayed their significance. "We are revising the report to make a small number of citation corrections, which do not impact the report findings. AI was not used to write the report; it was selectively used to support a small number of research citations."
A Similar Case in Australia
This incident in Canada mirrors a recent event in Australia. In July, Deloitte delivered a $290,000 report to the Australian government on welfare policy. Researchers quickly flagged that it also contained references to non-existent academic papers and even a fabricated quote from a court judgment. Deloitte later admitted to using Azure OpenAI in that report's creation and issued a partial refund to the government.
As of now, the flawed Canadian report remains available on the government's website. There has been no public announcement regarding a potential refund or a formal review of the procurement process. The provincial government has not yet publicly addressed the issue.
Implications for Public Trust and AI in Consulting
The repeated incidents involving a major global consulting firm highlight the growing pains of integrating generative AI into professional workflows. While these tools can accelerate research and writing, they are prone to fabricating information with a high degree of confidence.
For governments and public bodies that spend millions on expert analysis, these errors undermine the very foundation of evidence-based policymaking. The reliance on reports from firms like Deloitte is based on an assumption of rigorous fact-checking and academic integrity.
As AI becomes more accessible, this case serves as a critical reminder of the need for human oversight and verification, especially when the resulting reports influence decisions affecting public services like healthcare. The question now is how consulting firms and their government clients will adapt their processes to prevent such costly and misleading errors in the future.