(Inside Science) -- The race to develop self-driving cars is on. We’ve already seen test-drive prototypes. And soon, some researchers argue, we could have robot cars acting as chauffeurs during our daily commutes -- letting us sit back and read, text, email or watch TV, while our car does all the driving.
But there are still challenges carmakers need to overcome before we see highways packed with autonomous vehicles. You know -- little things, like mistaking harmless puddles of water for potholes. Or big stuff, like misjudging the movements of a pedestrian and causing serious injury.
Now researchers are working to get self-driving cars to act as if there’s a real person behind the wheel. To do that they need to be equipped with “eyes” and “brains” that work more like humans’.
To many people, the idea of self-driving cars sounds like something out of a futuristic sci-fi movie. But the future is here, and autonomous cars have the potential to transform the way we get around. Before we see rows of self-driving cars at the dealer, though, the technology behind them still has obstacles to overcome.
Paul Banks, president and founder of TetraVue, said, “The car needs to be able to understand its environment, be able to recognize at a distance whether it’s a piece of paper or whether it’s a rock, whether it’s a pothole. And the thing that makes it easiest for the car to be able to tell that is depth -- distance.”
To recognize depth and distance, most self-driving cars use similar technologies, and one vital piece of equipment is a LIDAR sensor. It maps objects in 3-D by bouncing laser beams off the surroundings, providing the detailed maps the car needs to get around and identifying objects like pedestrians and other vehicles.
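The core idea behind LIDAR ranging is simple time-of-flight arithmetic: a laser pulse travels out, reflects, and returns, and the round-trip time multiplied by the speed of light (halved, since the light covers the distance twice) gives the range. A minimal sketch of that calculation, with an illustrative pulse time chosen for this example:

```python
# Time-of-flight principle behind LIDAR ranging:
# distance = (speed of light * round-trip time) / 2

C = 299_792_458.0  # speed of light in a vacuum, meters per second

def distance_from_round_trip(t_seconds: float) -> float:
    """Range to an object, given the laser pulse's round-trip travel time."""
    return C * t_seconds / 2.0

# Example: a pulse that returns after ~333 nanoseconds hit something ~50 m away.
d = distance_from_round_trip(333e-9)
print(f"{d:.1f} m")  # ~49.9 m
```

Note how short the timescales are: resolving objects a few centimeters apart means timing light to within a fraction of a nanosecond, which is part of what makes these sensors hard to build.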
But LIDAR isn’t perfect in conditions like poor lighting at night or bad weather. This is why 3-D cameras that can process details from the world around them, at high speeds and long distances, are key if the autonomous car industry is to properly take off.
“Depth is critical, and being able to see something smaller than a car requires millions of pixels. From an optics perspective, it’s great because we’re able to use optics in a new way that allows us to measure what has been a really difficult problem. And we’ve been successful in building prototypes and showing that we can really measure 2 million pixels at a time, out to 50 yards from the camera in snowstorms even -- and that’s never been done before,” said Banks.
“We create the distance measurement by using a short strobe that illuminates the whole area. The light comes back and then that’s where we’re able to measure the time and distance, and we do it all at once. So, it happens in a fraction of a microsecond. And you get 2 million points, and then 30 times a second, just like video rates, you get the video plus the distance information at the same time. Our objective is within 12 to 24 months [to] have engineering samples or product prototypes available for sale,” concluded Banks.
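The figures in the quote hang together under a simple time-of-flight model, where the strobe's light must travel to the farthest object and back before the frame can be read out. A back-of-the-envelope check (the 50-yard range, 2-million-pixel count, and 30 fps rate come from the quote; the model itself is an assumption for illustration):

```python
# Sanity-checking the quoted figures with basic time-of-flight arithmetic.

C = 299_792_458.0   # speed of light, meters per second
RANGE_M = 45.72     # 50 yards expressed in meters
PIXELS = 2_000_000  # depth points captured per flash, per the quote
FPS = 30            # frames per second, "just like video rates"

# Round-trip time for light to reach 50 yards and come back:
round_trip_s = 2 * RANGE_M / C
print(f"round trip: {round_trip_s * 1e9:.0f} ns")  # ~305 ns, a fraction of a microsecond

# Total depth measurements per second at video rates:
print(f"{PIXELS * FPS:,} depth points per second")  # 60,000,000
```

The ~305-nanosecond round trip matches Banks' "fraction of a microsecond," and capturing 2 million points 30 times a second works out to 60 million depth measurements every second.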