THE SNAPSHOT
Dalhousie is helping to prepare Canada’s defence community for AI-supported command and control, including fast-developing Arctic surveillance scenarios, by simulating how humans and intelligent systems make decisions together under pressure.
THE CHALLENGE: A tidal wave of data
On a computer screen in a human performance lab at Dalhousie, the Arctic is alive with movement. A myriad of colourful symbols representing vessels of all sorts and origins transit newly accessible waters. Signals blink. Data streams in from sensors scattered across the northern expanse.

Command-and-control simulation in Dalhousie’s Cognitive and Motor Performance Lab. (Cody Turner photo)
The map is a command-and-control simulation. No classified systems are running. No real-world decisions are being made. But the cognitive processes under scrutiny mirror a challenge Defence Research and Development Canada (DRDC) is grappling with.
The Arctic is rapidly emerging as an area of geopolitical interest. Melting sea ice is opening new shipping routes. Vessel traffic is increasing. The complexity of its defence is deepening. At the same time, sensing technologies are proliferating and becoming significantly more sophisticated, generating a tidal wave of data that must be integrated, interpreted, and acted on.
“We’re looking at a new space, a new future operating theatre. And regardless of whether that’s the Arctic or somewhere we’ve worked before, these operational areas are just going to be more complicated,” says Dr. Aren Hunter, head of the Maritime Science Experimentation and Analytics (M-SEA) Section at DRDC. “There’s just going to be a bunch of new data that’s going to end up in the hands and in the minds of operators. At some point, you hit a ceiling.”

Marine Technician, Master Seaman Mathieu Allard-Audet responds to engineering emergency drills on board HMCS HALIFAX. (Corporal Braden Trudeau, Trinity - Formation Imaging Services photo)
But a potential relief valve has emerged. Industry is developing AI-supported decision-support tools to help operators manage complexity. One of them is Cognitive Shadow, developed by Thales, a multinational defence, aerospace, and cybersecurity company.
The AI platform under consideration by DRDC “has the capacity to learn human decision-making processes and patterns to provide real-time support to operators,” says Dr. Daniel Lafond, a lead scientist at Thales’s AI accelerator. “This provides a security net for human decision making in contexts of high ambiguity, cognitive overload, and mental fatigue.”
The promise is significant. The work now lies in translating that promise into systems operators can trust, use, and rely on under real-world conditions.
THE SOLUTION: Getting into the head of the armed forces
Dr. Heather Neyedli, who leads the Cognitive and Motor Performance Lab, says that while the algorithms that drive Thales’s AI are undoubtedly important, successful integration into live command-and-control environments also requires a deep understanding of how the platform will be adopted and used by operators.

Dr. Neyedli fits a participant with an eye-tracking device that reveals how they process on-screen information.
Together with DRDC, Thales and Laval University, and supported by an NSERC Alliance grant, Dr. Neyedli is leading a research program that uses command-and-control simulations to examine how humans and intelligent systems interact under pressure. She explains that it’s work that would be difficult to conduct using classified platforms or the scarce time of senior operational personnel.
“We almost never deal directly with military subject-matter experts,” says the cognitive psychology scientist, who is a professor in Dal's School of Health and Human Performance. “They’re really hard to get hold of, and they may not be able to speak openly about specifically what they do.”
Instead, Dr. Neyedli’s team aims to reproduce something more fundamental.
“What we do is create simulations that replicate the task that the actual experts do, but in a way that can be understood by a more general population,” she says. “We’re not recreating the classified system. We’re recreating the thinking.”
THE WORK: Simulating tomorrow’s decisions
For Arctic surveillance, that means imagining a future environment that is not yet fully realized.
“We’re asking what the Arctic looks like twenty years from now,” Dr. Neyedli says. “What kinds of vessels are there? What kinds of sensors are feeding information into the system? What kinds of uncertainty does an operator have to deal with?”

A participant navigates a simulation.
Participants are placed into these simulated control-centre environments. Information arrives from multiple sources. Decisions must be made about when to act on the advice of the AI, when to wait, when to confer with fellow participants, and when to question the guidance of the system.
“There are a lot of things we want to understand deeply about how this technology affects human behaviour. But we just can’t do that work ourselves at scale,” says Dr. Hunter. “Being able to say, ‘Heather, can you go investigate this?’ and then take what she finds and validate it with the operational community – that’s incredibly valuable for us.”
Dr. Neyedli says the findings are taken back to the designers at Thales to help shape decisions in areas like what information operators need in a given moment, how displays can be refined to reduce workload, and where clearer cues can support more accurate judgment.

Dr. Heather Neyedli
Dr. Lafond says that researchers at Dalhousie and Laval are also helping Thales improve Cognitive Shadow by investigating new self-monitoring capabilities that enable the platform to recognize situations where it tends to be less reliable. The AI can then inform the human operator, to avoid overreliance that can result in errors in specific situations.
THE IMPACT: AI for the Canadian context
Beyond technical performance, the research is also helping to shape how Canada thinks about AI-supported defence systems that meet the specific needs of its personnel. For Dr. Hunter, working with Dalhousie to adapt Thales’s technology for the Canadian context helps to ensure the tools are grounded in Canadian operational realities.
Dr. Aren Hunter and LCdr Shawn Stacey discuss a DRDC‑developed underwater battlespace awareness tool. Photo provided.
“It makes me feel more confident that we are looking to ourselves for solutions in this area for the Canadian Armed Forces,” says Dr. Hunter. “It is so critical that we have control over this space.”
This emphasis extends beyond the technology itself, to the knowledge base the technology is trained on. Dr. Hunter says it’s essential that the platform be developed to serve and learn from Canadian operators in order to make it relevant and trustworthy for the people in uniform who will eventually use it.
“Canadian military situations are different. There’s something unique about the spaces we work in and might be working in the future, such as the Arctic,” says Dr. Hunter. “So, when we look at how we operate now and in the future, we do need our defence AI solutions to be based on Canadian knowledge and experiences.”