Driverless Cars Stumble Over Advertisements and Unpredictable Pedestrians
The rapid advancement of autonomous vehicle technology is facing a significant hurdle: driverless cars are reportedly being confused by static advertisements and struggling to navigate the unpredictable movements of pedestrians. This revelation comes as trials for self-driving taxis are set to commence in London, having already been conducted in various UK locations, including York. Despite the rollout of these computer-controlled vehicles, public confidence in their safety mechanisms remains a major concern.
Professor John McDermid, a government advisor on self-driving vehicles and a software expert from the University of York, recently highlighted these issues. Speaking at London’s Science Media Centre, he explained that these sophisticated AI systems can still be easily misled.
The Perils of a Life-Size Ad
One particularly concerning incident involved a driverless car mistaking a life-size advertisement on the side of a bus for a group of actual pedestrians. The advert, for the 2015 film The Man from U.N.C.L.E., featured several actors. The car’s artificial intelligence, unable to distinguish between the realistic depiction and real people, initiated an emergency stop. This sudden manoeuvre, while intended to avoid a perceived hazard, could have endangered vehicles following behind.
Professor McDermid elaborated on this anomaly, stating:
“One of the automated vehicle companies I work with had a situation where their vehicle did a sudden emergency stop because it’s all pedestrians in the road, except they weren’t. It was a life-size advert on the side of a bus, but to an AI, it was human beings. That seems very obvious [to us], but actually, to the AI, it’s not.”
Navigating Pedestrian Chaos
Beyond misleading advertisements, driverless cars in trials within Professor McDermid’s home city of York have also encountered difficulties with the unpredictable behaviour of pedestrians. This includes instances where individuals cross the road even after the traffic light has turned green and the pedestrian signal has begun flashing the ‘don’t walk’ symbol.
In contrast, in the United States, where self-driving car development is more advanced, ‘jaywalking’ is a criminal offence with potentially severe repercussions. This legal framework may encourage pedestrians to adhere more strictly to crossing signals. However, in Britain, the pedestrian often reigns supreme, and the current generation of autonomous vehicles appears ill-equipped to fully comprehend this dynamic.
Professor McDermid described the scenario in York:
“It’s seen that there’s a traffic light, so identified the hazard, because the light is red. It changes to green, the vehicle is about to move off. But this is York, so the tourists – although the lights change to green – still walk across the road. Computer vision doesn’t understand what it doesn’t have models for in the world. It doesn’t know what a roundabout is.”
Global Concerns and Near Misses
The upcoming launch of Waymo’s driverless taxi trials in London from Easter, with plans for Uber to integrate the service for paying customers, brings these safety concerns into sharper focus. Similar issues have already been reported in the US. Two years ago, in San Francisco, school crossing guards (often referred to as ‘lollipop ladies’ in the UK) reported numerous near misses with Waymo’s autonomous vehicles.
A survey of 30 crossing attendants revealed that approximately a quarter had experienced a “close call” with a self-driving car, with some being forced to dart out of the way to avoid a collision. Veteran crossing guard Theresa Dorn recounted three near misses with driverless cars in a single year. In one alarming incident, a parent had to quickly intervene to pull a child to safety. Dorn’s experience underscores a growing sentiment: “The parent grabbed the child, looked at the car – and there was nobody driving it. Why do they have these driverless cars? I think somebody should be driving them.”
The Standard of Safety
Current government guidance in Britain mandates that “self-driving vehicles should be held to the same high standard of behaviour as that expected of human drivers.” However, a significant portion of the public surveyed believes that the standards for autonomous vehicles should be even higher. This sentiment is likely fuelled by a deep-seated fear of an increase in road fatalities, which currently stand at around 1,600 deaths annually on UK roads.
Professor McDermid has issued a stark warning, urging that pedestrians should not be treated as a “moral crumple zone” for the development and deployment of robotic vehicles. As the technology progresses, ensuring it can reliably interpret and react to the complexities of the real world, including the unexpected, will be paramount to public acceptance and, more importantly, public safety.