A researcher from the Queensland University of Technology (QUT) has developed a visual recognition system for autonomous cars.
According to a recent press release, this system is capable of mimicking a human driver’s ability to recognise locations when approaching from a different direction and under radically different environmental conditions.
The Problem
Everything that moves, including vehicles and robots, needs to know where it is located. It usually does this by recognising its surroundings against some map of the world held in its head or in its computer.
With cars, there are all sorts of challenges, such as day turning into night and changing weather. A particular challenge is that cars do not always travel a route in the same direction.
The Solution
The researcher’s study outlines how navigation systems can analyse existing captured information in a new way so that they can recognise locations they have previously passed. This includes recognising locations despite approaching them from a different direction, when landmarks are mirrored and potentially partially obscured.
Humans do this organically and cannot explain how. One way for a machine to do it is to interpret the environment with human-like semantic scene understanding, enabled by deep machine learning.
Recognising places from either direction of travel and under different weather conditions had not been achieved before in research.
The researcher related that when revisiting a place from the opposite direction, he tries to recognise and match ‘meaningful’ visual landmarks.
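The paper itself combines semantic and geometric cues in ways not detailed in this article. Purely as a loose illustration of the idea, the sketch below compares two views by the mix of semantic classes they contain (e.g. from a segmentation network): that mix is unchanged when a street is seen mirrored from the opposite direction, unlike raw pixel appearance. All function names and data here are hypothetical, not the researcher’s actual method.

```python
from collections import Counter
from math import sqrt

def semantic_histogram(labels):
    """Count occurrences of each semantic class in a scene."""
    return Counter(labels)

def cosine_similarity(h1, h2):
    """Cosine similarity between two class-count histograms."""
    classes = set(h1) | set(h2)
    dot = sum(h1.get(c, 0) * h2.get(c, 0) for c in classes)
    n1 = sqrt(sum(v * v for v in h1.values()))
    n2 = sqrt(sum(v * v for v in h2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

# The same street seen from opposite directions: the landmark layout
# is mirrored, but the mix of semantic classes is preserved.
forward = ["building", "building", "tree", "road", "sign", "tree"]
reverse = ["tree", "sign", "road", "tree", "building", "building"]
# A different street, with a different mix of classes.
elsewhere = ["road", "road", "car", "car", "car", "fence"]

score_same = cosine_similarity(semantic_histogram(forward),
                               semantic_histogram(reverse))
score_diff = cosine_similarity(semantic_histogram(forward),
                               semantic_histogram(elsewhere))
```

In this toy example the forward and reverse views score as identical while the unrelated street scores much lower; a real system would also need geometric verification to distinguish places that merely share a similar class mix.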
With his thesis currently under examination, the PhD researcher was awarded first place in the SAGE Higher Degree Research Student Publication Prize for his article.
According to him, winning the award for “Semantic-geometric visual place recognition: a new perspective for reconciling opposing views” was particularly pleasing, as the paper was the culmination of his PhD research.
Other projects
He also served as one of the key researchers on a project that took an Artificial Intelligence (AI) system on a road trip of south-east Queensland. The aim of the road trip was to help ensure that the autonomous cars of the future will be smart enough to handle tough Australian road conditions.
OpenGov Asia earlier reported on the said project.
The project involved a driver taking an electric Renault fitted with high-tech sensors and computers on a 1,200-kilometre road trip that included a wide range of road and driving conditions. The AI served as the ultimate backseat driver, watching the team as they drove and determining whether it could perform as well as a human driver in all conditions.
Before undertaking his PhD, the researcher worked at a robotics research lab in India where he researched human and object tracking, including making a robot that could serve tea in an office environment.
The robot was programmed to carry a tray of teacups to workers, navigate an office space and avoid obstacles such as people crossing in its path.