J Conroy, C Thierauf, P Rule, E Krause, H Akitaya, A Gonczi, M Korman, M Scheutz. 2021 IEEE International Conference on Robotics and Automation (ICRA).
How do we make a robot that can provably cover a complex region?
When COVID hit, labs everywhere started looking for ways to contribute. At Tufts, the Computational Geometry group and the Human-Robot Interaction Lab (where I was a grad student at the time) aimed to make autonomous ultraviolet disinfection both affordable and scientifically grounded. Hospitals had been using UV robots for years, but most were prohibitively expensive and not especially intelligent: they relied on fixed lamp positions or heuristic movement patterns, with no guarantee that every surface was actually disinfected.
The geometry group developed an algorithmic solution to that problem: they found a way to compute provably sufficient exposure coverage for every visible surface in a room. On the other side, my job was to turn that theory into a functioning robot platform and integrate it into our DIARC architecture. The goal was to show that a rigorous algorithm could actually run in the messy physical world.
The problem
UVC light kills pathogens by damaging their DNA, but the effect falls off sharply with distance and is blocked by anything that breaks line of sight. A lamp parked in one spot can therefore leave huge gaps in coverage. To disinfect properly, a robot has to figure out where to go, how long to stay there, and how to minimize total time while guaranteeing that every reachable surface receives a sufficient dose. That's a geometric optimization problem, which the Computational Geometry team formalized as a linear program: given a 2D map of a complex space, compute where the robot needs to bring the lamp and how long to dwell at each point, with a proof that the space will be disinfected.
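The dose-coverage core of that formulation can be sketched as a small linear program. Below is a minimal illustration, not the paper's exact model: `F[i, j]` is an assumed precomputed fluence-rate matrix (how much UV intensity surface sample `i` receives from candidate lamp position `j`, zero when occluded), and we minimize total dwell time subject to every sample receiving a required dose.

```python
import numpy as np
from scipy.optimize import linprog

def min_dwell_times(F, required_dose):
    """Minimize total dwell time subject to dose coverage.

    F[i, j]: fluence rate (W/m^2) that surface sample i receives
             from candidate lamp position j (0 if occluded).
    required_dose: minimum UV dose (J/m^2) each sample must get.
    Returns dwell time (s) per candidate position; positions with
    near-zero time can be dropped before planning the tour.
    """
    n_samples, n_positions = F.shape
    # minimize sum_j t_j  subject to  F @ t >= required_dose, t >= 0
    res = linprog(
        c=np.ones(n_positions),              # objective: total time
        A_ub=-F,                             # -F t <= -dose  <=>  F t >= dose
        b_ub=-np.full(n_samples, required_dose),
        bounds=[(0, None)] * n_positions,    # dwell times are nonnegative
    )
    assert res.success
    return res.x

# Toy example: 3 surface samples, 2 candidate lamp positions.
F = np.array([[2.0, 0.0],   # sample 0 only visible from position 0
              [1.0, 1.0],   # sample 1 visible from both
              [0.0, 2.0]])  # sample 2 only visible from position 1
t = min_dwell_times(F, required_dose=10.0)  # -> [5.0, 5.0]
```

The guarantee comes from the constraints themselves: any feasible solution, optimal or not, certifies that every modeled surface sample meets the dose threshold.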
Building the platform
While they worked on the algorithm, I built the hardware and systems stack. The robot used a Jetson Nano for onboard compute, a 17-watt germicidal lamp, and dual Hokuyo lidars for 360-degree mapping. I designed the chassis around standard aluminum extrusion, added brushless DC motors with differential drive, and ran everything through ROS. On top of that I integrated DIARC for coordination and fault handling, so the robot could plan, execute, and report back within our cognitive framework rather than running as a standalone ROS node.
It wasn’t fancy (lockdown conditions limited which parts we could get), but it was functional and reliable enough to run long unsupervised trials.
The algorithm in practice
The pipeline started with mapping. Lidar scans were converted into a clean polygonal floor plan using morphological filtering and simplification. From there, the algorithm computed visibility polygons for every candidate lamp position, modeled light falloff, and solved a linear program for the waypoints and dwell times that minimized total time while guaranteeing complete coverage. Finally, the robot planned a traversal path through those waypoints using a modified traveling-salesman heuristic.
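The traversal step can be illustrated with a generic nearest-neighbor tour over the LP-selected waypoints. The paper uses its own modified TSP heuristic, so treat this as a stand-in sketch rather than the actual planner:

```python
import math

def nearest_neighbor_tour(points, start=0):
    """Greedy TSP-style ordering: repeatedly visit the closest
    unvisited waypoint. points is a list of (x, y) tuples."""
    unvisited = set(range(len(points)))
    unvisited.remove(start)
    order = [start]
    while unvisited:
        last = points[order[-1]]
        nxt = min(unvisited, key=lambda j: math.dist(last, points[j]))
        unvisited.remove(nxt)
        order.append(nxt)
    return order

# Hypothetical waypoints surviving the LP (nonzero dwell time).
waypoints = [(0.0, 0.0), (5.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
order = nearest_neighbor_tour(waypoints)  # -> [0, 2, 3, 1]
```

Since dwell times dominate total disinfection time in small rooms, even a simple tour ordering like this tends to be good enough; the LP, not the tour, carries the coverage guarantee.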
My contribution was wiring all of this into a closed-loop system: the algorithm ran, DIARC generated the mission plan, and the robot executed it autonomously, tracking progress and handling errors through the cognitive layer.
Testing and results
We tested in simulation and then in real indoor environments. Compared to a stationary lamp, the robot achieved far better coverage in far less time—roughly half the disinfection duration with near-complete surface exposure. The movements were smooth, the system stable, and the DIARC integration allowed for supervision, logging, and easy recovery from faults.
Why it mattered
This collaboration worked because each group focused on what they were best at. The geometry team gave us a provable algorithm—an actual guarantee of complete disinfection. The HRILab provided the embodiment, perception, and reasoning infrastructure to make that guarantee meaningful in the real world.
It was also a useful reminder that theoretical soundness and practical autonomy don’t have to be separate. You can build a robot that reasons formally about its own coverage and still runs on off-the-shelf parts.
In the end, we didn’t just make another UV robot. We made one that could prove it was doing its job, and we showed that rigorous algorithms can survive contact with real hardware.
