🎙️ Podcast Link 🎙️
It’s late on a Friday afternoon here in Brisbane, so it’s time for a fun story about a prank my PhD labmates played on me two decades ago, and how it unexpectedly led to my most highly cited and awarded research work. 📚🤓
At the time, I was working on our RatSLAM system – a neurally-inspired robot mapping and navigation system that performed SLAM – Simultaneous Localization And Mapping. 🤖🗺️
SLAM is the process of a robot exploring and mapping a new environment while keeping track of its location. It’s a big topic in robotics and computer vision. 🔍🚀
One day, my labmates put tape over the camera on our Pioneer 3-DX mobile robot, blinding it, and waited for it to fail. 🎥🙈 EXCEPT, the RatSLAM system continued to work, and I didn’t initially notice any change. 😲
A SLAM system needs motion information and a way to recognize places it has visited before. The tape destroyed the camera’s ability to see any detail, but it still registered the “average brightness” of the scene, effectively becoming a single-pixel light intensity meter. 💡📷
In our stable office environment, the SLAM system continued to work because the light intensity readings were consistent and varied distinctively across locations. 🏢🔆
Years later, this discovery informed our SeqSLAM system and our “How Low Can You Go” research, pushing the boundaries of low-resolution, robust visual localization in specific use cases like path-like environments. The work won awards, led to millions in funded research and projects, and spawned much follow-on work. 🏆💡📈
So, while I’m not encouraging pranks, remember that when something goes wrong, it might lead to a breakthrough. You never know what you might discover! 🌟🤖🔬
📖 SeqSLAM: https://ieeexplore.ieee.org/abstract/document/6224623
📖 How low can you go? https://journals.sagepub.com/doi/10.1177/0278364913490323
📖 Follow-on work: https://scholar.google.com.au/scholar?hl=en&as_sdt=0%2C5&q=seqslam&btnG=
#ResearchStory #PhDLife #Robotics #SLAM #RatSLAM #SeqSLAM #RoboticsResearch #InnovativeDiscoveries #AcademicJourney #PranksInTheLab #RoboticsMapping #NeuralInspired #ComputerVision #ScientificDiscovery #Brisbane #RoboticsLab #BestPaperAward #ResearchBreakthrough #FridayFun #UnexpectedInsights #AI #MachineLearning #ResearchImpact
Full Video Notes
All right, it’s Friday afternoon. It’s been a long week, and I wanted to have a little fun sharing a story about an unexpected event from my early PhD days, which was about two decades ago. At the time, I was working on biologically or neurally inspired robot navigation with a number of colleagues and my supervisor, Gordon. We were working on a system called RatSLAM.
SLAM is an acronym in robotics that stands for Simultaneous Localization and Mapping. It’s the challenge of a robot moving through a new environment, mapping it while simultaneously keeping track of its location within that environment. SLAM has been one of the key challenges in robotics for decades. While much of the foundational work was done 25 years ago, it remains a vibrant and active area of research. There’s even an ongoing joke in the community: SLAM is solved, but everyone is still working on it.
Our system was called RatSLAM because it was inspired by the mapping and navigation systems in the brains of animals like rats. As has since been discovered, similar systems exist in the brains of many mammalian species, including humans. At the time, I was working with a Pioneer 3-DX robot, one of those black-and-red, medium-sized mobile robots you’d see in robotics labs worldwide. These robots were excellent research tools back then. They could carry computers, support various sensors, and were workhorses for SLAM research.
I was part of a mixed lab, sharing space with four or five other PhD candidates and maybe a postdoc. We were a well-knit team and often did things together outside of work. However, we also played pranks on each other—or at least, I was often on the receiving end of them.
The Prank: Taping Over the Camera
One particular prank, and I can’t remember who was responsible, involved someone secretly sticking black tape over the camera of my Pioneer robot. This camera was a pan-tilt-zoom model and the robot’s primary sensor for visual input. It was crucial for both the mapping and localization tasks in RatSLAM. Without the camera functioning, the system couldn’t compare visual data to detect if the robot was revisiting a location—a process called loop closure or visual place recognition.
From what I recall, my labmates were eagerly waiting for me to run an experiment and discover that nothing was working. But to their surprise—and later, my own—the RatSLAM system still worked almost as well as it had before. I didn’t immediately notice anything was wrong. In fact, I think they had to tell me about the prank because I didn’t figure it out on my own. Eventually, I would have realized it when I looked closely at the robot, but their excitement got the better of them.
The Unexpected Insight
What fascinated me was why the system had still worked. We dug into the issue and discovered that, while the tape obscured any discernible details in the camera’s images, it wasn’t a complete blackout. The tape allowed some average intensity of light to pass through, effectively turning the 640×480 resolution camera into a single-pixel sensor. This sensor couldn’t capture detailed images but could still measure the overall brightness of the environment.
In an indoor office setting, lighting tends to be relatively consistent over short periods. For example, at a specific location, the camera might record an intensity value of 133. Returning to the same spot with the same orientation would produce a similar value, like 132 or 134. This repeatable brightness measurement, combined with motion data from the robot’s wheel encoders, allowed the SLAM system to continue operating—at least for short periods—as if nothing had changed.
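To make that mechanism concrete, here is a minimal sketch of the idea, not the actual RatSLAM code: a camera frame is collapsed to one average-brightness value, and a new reading is matched against the brightness recorded at previously visited places. The BrightnessMap class, the tolerance value, and the example poses are purely illustrative assumptions.

```python
import numpy as np

def scene_brightness(frame: np.ndarray) -> float:
    """Collapse a camera frame (e.g. 640x480 grayscale) into one
    average-intensity value, i.e. a single-pixel light meter."""
    return float(frame.mean())

class BrightnessMap:
    """Toy map that remembers the brightness observed at each visited place."""

    def __init__(self, tolerance: float = 3.0):
        self.places = []            # list of (odometry_pose, brightness)
        self.tolerance = tolerance  # how close readings must be to match

    def add_place(self, pose, brightness):
        self.places.append((pose, brightness))

    def match(self, brightness):
        """Indices of stored places whose brightness is close to the reading."""
        return [i for i, (_, b) in enumerate(self.places)
                if abs(b - brightness) <= self.tolerance]

# A reading of 133 at a spot matches a later 132 at the same spot,
# while a noticeably brighter corridor reading (say 180) does not.
m = BrightnessMap()
m.add_place(pose=(0.0, 0.0), brightness=133.0)
m.add_place(pose=(5.0, 0.0), brightness=180.0)
print(m.match(132.0))   # -> [0]
```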
This unexpected behavior led me to explore how much resolution a camera truly needs for visual SLAM systems to perform tasks like place recognition. Inspired in part by this prank, we later demonstrated that impressive feats of visual place recognition could be achieved with extremely low-resolution sensors, even down to a single pixel. Such a sensor works less like a traditional camera and more like the ambient light sensor on your phone that detects whether it’s held against your ear.
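The key intuition behind the later sequence-based work is that a single low-resolution frame is ambiguous, but a short sequence of frames along a route is far more distinctive. The sketch below illustrates only that intuition; it is not the published SeqSLAM algorithm, which also applies local image normalization and searches over a range of relative velocities. The frame sizes, sequence length, and function names are assumptions for the example.

```python
import numpy as np

def frame_difference(a: np.ndarray, b: np.ndarray) -> float:
    """Sum of absolute differences between two tiny (e.g. 8x8) images."""
    return float(np.abs(a.astype(float) - b.astype(float)).sum())

def best_sequence_match(query, database, seq_len=5):
    """Find where in 'database' the recent 'query' frames fit best by
    summing frame differences along an aligned sequence. Assumes both
    traverses happened at roughly the same speed; the real SeqSLAM
    also normalizes images and searches over relative velocities."""
    best_start, best_cost = -1, float("inf")
    for start in range(len(database) - seq_len + 1):
        cost = sum(frame_difference(q, database[start + k])
                   for k, q in enumerate(query[:seq_len]))
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start

# Usage: 'database' holds heavily downsampled frames from a prior run,
# 'query' holds the last few frames from the current run.
rng = np.random.default_rng(0)
database = [rng.integers(0, 256, (8, 8)) for _ in range(50)]
query = database[20:25]                      # pretend we revisit frames 20-24
print(best_sequence_match(query, database))  # -> 20
```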
A Career-Defining Breakthrough
This line of research culminated in some of my most significant work, including our SeqSLAM (Sequence SLAM) method. The SeqSLAM paper became my most-cited work, winning awards and receiving over a thousand citations. It led to funded research projects and applications in various fields. While I wouldn’t attribute all of this to the prank, it was a pivotal moment that sparked new ideas and directions in my career.
The Value of the Unexpected
The prank taught me an important lesson: sometimes, unexpected results in experiments are worth paying attention to. Instead of dismissing them as wasted efforts, it’s valuable to dig deeper and try to understand what’s happening. Most of the time, these unexpected outcomes won’t lead anywhere, but occasionally, they can reveal critical insights or even career-defining breakthroughs. Those rare moments of discovery are incredibly exciting and rewarding, making the effort worthwhile.
Anyway, that’s my story for this Friday afternoon. I hope you found it enjoyable and maybe even relevant to how you approach your research. Have a great weekend!