As all urban drivers know, pedestrians are unpredictable. And they differ from city to city and from country to country. For example, people in the Indian city of Mumbai don’t look at the car when they are trying to cross a road, whereas people in London tend to make eye contact with a driver before deciding to cross.
That’s why one UK startup believes AVs should be tested in unpredictable situations in busy cities such as London and Tokyo. That startup, Humanising Autonomy, was set up by three co-founders: Leslie Nooteboom (CDO), Maya Pindeus (CEO) and Raunaq Bose (CTO). Its mission is to improve the safety and efficiency of autonomous mobility systems through a better understanding of human behaviour across cities.
The founders told Auto Futures: “Autonomous systems are unable to understand the complexities of human behaviour, which creates one of the primary obstacles in the development of automated vehicles in cities. Current solutions don’t consider the full range of human behaviour at street level. This lack of perceptive abilities and understanding makes vehicles unsafe around people, and slows down the technology’s adoption rate and efficiency in navigating urban environments.”
At present, much AV testing takes place in locations with few pedestrians, such as Mountain View, California and Phoenix, Arizona: places where people rarely walk.
“Autonomous vehicles without human understanding have to drive very carefully, stopping unnecessarily for people who are just waiting and leaning against a lamppost, or performing an emergency stop when it does not accurately predict that someone might run across a pedestrian crosswalk,” say the founders.
Humanising Autonomy uses what it calls ‘intention prediction’ technology. Current detection systems recognise humans as a ‘bounding box’, and the movement of that box is tracked on the streets. “Our research showed that human behaviour is much more complex than that, so we started looking inside the box.”
They add: “From a large range of features we can recognise what the person is doing, which allows us to create a prediction of the intent of that person. We can recognise specific actions, how distracted someone is, and many more behaviours that impact the future behaviour of a pedestrian.”
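The idea of scoring intent from within-box features can be sketched in a few lines of Python. This is a toy illustration, not Humanising Autonomy’s actual method: the feature names and weights below are invented for the example, and a real system would learn them from labelled camera footage rather than hand-code them.

```python
from dataclasses import dataclass

@dataclass
class PedestrianFeatures:
    """Hypothetical per-frame features extracted from inside a detection box."""
    looking_at_vehicle: bool   # head orientation / eye contact toward the vehicle
    is_distracted: bool        # e.g. looking down at a phone
    speed_toward_road: float   # motion component toward the kerb, in m/s
    at_kerb: bool              # standing at the road edge

def crossing_intent(f: PedestrianFeatures) -> float:
    """Toy score in [0, 1] for how likely the pedestrian is to cross.

    The weights are illustrative only; a deployed model would be
    trained on annotated street footage.
    """
    score = 0.1
    if f.at_kerb:
        score += 0.3
    if f.speed_toward_road > 0.5:
        score += 0.3
    if f.looking_at_vehicle:
        score += 0.2   # eye contact often precedes a crossing decision
    if f.is_distracted:
        score += 0.1   # distracted pedestrians are less predictable
    return min(score, 1.0)

# A pedestrian leaning against a lamppost, not looking, not moving:
idle = PedestrianFeatures(False, False, 0.0, False)
# A pedestrian at the kerb, making eye contact, stepping forward:
about_to_cross = PedestrianFeatures(True, False, 0.8, True)

print(crossing_intent(idle))            # low score: no need to stop
print(crossing_intent(about_to_cross))  # high score: yield
```

The point of the sketch is the contrast with pure bounding-box tracking: the idle pedestrian and the one about to cross may produce near-identical boxes, yet the within-box features separate them cleanly.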
The company’s software applies several novel artificial intelligence methods and behavioural psychology models to camera footage, allowing it to more fully understand pedestrian intent. Currently its algorithms recognise more than 150 human behaviours. The team is working with companies such as Daimler Mercedes-Benz, Kyocera, and Airbus, applying the technology in various use-cases.
The software is being implemented to increase the safety of Level 2 Advanced Driver Assistance Systems (ADAS), improve human interaction with Level 4+ autonomous vehicles, and enable integration into infrastructure.
Location-wise, they are training their behavioural models to recognise pedestrian behaviour from London, New York, Detroit, Stuttgart, Mumbai and Tokyo. “These projects are showing us where the need for understanding pedestrian intent is high. We are growing quickly and this allows us to build out more technology to predict human behaviour. There are several exciting new projects coming up.”
As for the future, the team believe that every machine should be able to interact with humans in a natural way. “The impact cars have made on our current-day cities has been huge. Autonomous cars have the capability to increase the efficiency of travel, offer a more sustainable way to move around, increase the safety of vehicles in dense urban environments, give back space to citizens and improve the experience of mobility in general. At the moment, people are driving these vehicles around, and people are extremely good at interacting with other people.”
The Humanising Autonomy team believe it is their challenge to make sure that the mobility of the future offers an even better experience than the current one. And that’s a very real challenge.