You’re looking at what’s known in the autonomous-car industry as an “edge case”: an unusual situation that a vehicle’s software may handle unpredictably because it interprets the scene differently from the way a human would. In this example, image-recognition software running on data from an ordinary camera has been fooled into treating images of cyclists on the back of a van as real human cyclists.
Source: This image is why self-driving cars need many types of sensors
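To make the failure mode concrete, here is a minimal sketch, not the software in the photo but an assumed, typical camera-only pipeline: a COCO-pretrained detector from torchvision run on a single image. Such a detector reasons purely over pixels, so a convincing picture of a cyclist can be reported just like a real one; the image filename below is a placeholder.

```python
# Illustrative sketch of a camera-only 2D detector (an assumption, not the
# article's actual system). A pretrained Faster R-CNN sees only pixels, so a
# cyclist printed on a van's rear doors can be detected like a real cyclist.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Load a detector pretrained on COCO (its classes include "person" and "bicycle").
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Hypothetical image of a van with cyclists pictured on its rear doors.
image = Image.open("van_with_printed_cyclists.jpg").convert("RGB")

with torch.no_grad():
    detections = model([to_tensor(image)])[0]

# COCO label ids: 1 = person, 2 = bicycle.
for label, score in zip(detections["labels"], detections["scores"]):
    if label.item() in (1, 2) and score.item() > 0.5:
        # The detector reports "person"/"bicycle" whether the cyclist is real
        # or painted; nothing in the 2D output encodes depth or physicality.
        print(f"label={label.item()} score={score.item():.2f}")
```

This is exactly why the article argues for multiple sensor types: a depth-aware sensor such as lidar or radar would show a flat van door where the camera-only detector reports cyclists, letting the fusion stage discard the false positives.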