A recent decision from the British Columbia Workers’ Compensation Appeal Tribunal (BC WCAT), A2501051, 2025 CanLII 97422, may interest the broader public sector adjudicator community on the question of how to handle artificial intelligence (AI) submissions that contain hallucinated legal citations.
In this case, the worker had filed submissions on whether exceptions existed to allow a prohibited action complaint to be filed late. In reviewing them, the Deputy Registrar noted problems, in particular: “The policy cited by the worker is not a current policy and does not say what the worker says it does. The cases he has cited either do not exist, or do not have anything to do with what he has cited them for.”
The Deputy Registrar found that it appeared the worker’s “submission was created, at least partly, with the use of artificial intelligence. It is widely known that large language-based artificial intelligence models can prepare lengthy submissions that sound like they were written by a person with expertise. However, these models can also 'hallucinate' legal cases, meaning they make them up.”