Driverless Cars Stumble Over Advertisements and Unpredictable Pedestrians
Autonomous vehicle technology, for all its promise of effortless commutes, is running into significant hurdles, particularly in how these sophisticated systems interpret the nuances of the real world. Recent incidents show how easily driverless cars can be confused by everyday scenarios, raising questions about their safety readiness as trials loom in major cities.
One particularly striking example involved a driverless car mistaking a life-size advertisement on the side of a bus for a group of actual pedestrians. The vehicle’s artificial intelligence (AI), designed to identify and react to human presence on the road, interpreted the static image of actors from the film “The Man from U.N.C.L.E.” as a cluster of people. This misinterpretation triggered an abrupt emergency stop, a manoeuvre that could have posed a serious hazard to vehicles following closely behind.
Professor John McDermid, an advisor to the UK government on self-driving vehicles and a software expert at the University of York, elaborated on this anomaly. He explained that for the AI, the visual input of human figures, regardless of their context, was sufficient to warrant a safety response. “It seems very obvious [to us], but actually, to the AI, it’s not,” Professor McDermid noted, underscoring the gap between human perception and machine interpretation. The incident, which occurred during trials with an automated vehicle company, illustrates the challenge of programming AI to distinguish between real-world entities and their representations.

Beyond static images, the unpredictable nature of human behaviour also presents a significant challenge for autonomous systems. In trials conducted in Professor McDermid’s home city of York, driverless cars have reportedly been bewildered by pedestrians who do not strictly adhere to traffic signals. This includes instances where pedestrians begin to cross the road even when the light has changed to green for vehicular traffic, and the “no crossing” signal is still flashing.
In contrast to countries like the United States, where jaywalking is a penalised offence and pedestrians are more inclined to follow crossing instructions diligently, the United Kingdom appears to grant pedestrians greater autonomy. This cultural difference means that robot vehicles struggle to adapt to a pedestrian environment where individuals might proceed across a road based on their own assessment rather than solely on traffic light indications.
Professor McDermid described the confusion observed in York: “It’s seen that there’s a traffic light, so identified the hazard, because the light is red. It changes to green, the vehicle is about to move off. But this is York, so the tourists – although the lights change to green – still walk across the road.” He further explained that current computer vision systems lack the comprehensive models necessary to understand complex real-world phenomena, such as the concept of a roundabout or the less predictable behaviours of human pedestrians.
Waymo’s driverless taxi trials in London, due to begin at Easter, with plans to integrate with Uber and offer robotaxi services to the public, bring these safety concerns into sharper focus. The challenges are not unique to the UK, however. In San Francisco, where Waymo already operates its driverless vehicles, school crossing guards, often referred to as “lollipop ladies,” have reported numerous near-miss incidents with the autonomous cars.
A survey of 30 crossing attendants in San Francisco revealed that approximately a quarter had experienced a “close call” with a self-driving vehicle, with some having to quickly move out of the way to avoid an accident. Theresa Dorn, a veteran crossing guard, recounted three such near-misses within a year. In one harrowing incident, a parent had to swiftly grab a child to prevent them from being struck by a driverless car. Dorn expressed her apprehension, questioning the necessity of these vehicles without human oversight: “Why do they have these driverless cars? I think somebody should be driving them.”
The UK government’s current guidance stipulates that self-driving vehicles should meet the same behavioural standards expected of human drivers. However, many members of the public surveyed wanted even higher safety standards, driven by fears of increased road fatalities, which currently stand at around 1,600 deaths annually on UK roads. Professor McDermid has issued a stark warning, urging that pedestrians must not become a “moral crumple zone” for the advancement of robocars, and emphasising the need for robust safety measures and a deeper understanding of real-world complexities before widespread adoption.