Saturday, December 28, 2019
System Helps Self-driving Cars See in Fog
Autonomous vehicles are already driving themselves in test mode down American streets. But their onboard navigation systems still can't help them maneuver safely through heavy, or even light, fog. Particles of light, it turns out, bounce among the water droplets before they can reach the cameras that guide the vehicles. That scattering of light poses major navigation challenges in heavy mist.

Researchers at the Massachusetts Institute of Technology are driving toward a solution. They've developed a system that can sense the depth and gauge the distance of hidden objects, to safely navigate driverless vehicles through fog.

The researchers announced their milestone two days after March 18, when an autonomous car operated by Uber, with an emergency backup driver behind the wheel, hit a woman on a street in Tempe, AZ. That accident happened at 10 p.m. in weather that was clear and dry. While fog is not the only issue for autonomous vehicle navigation, it definitely presents a problem.

[Image: Guy Satat checks the images returned to his group's system, which uses a time-of-flight camera. Photo: Melanie Gonick/MIT]

Part of the problem is that not all radar systems are the same. Those that guide airplanes down runways, for example, use radio waves, which have long wavelengths and low frequencies and don't return high-enough resolution for autonomous vehicle navigation. Like some other wavelengths in the electromagnetic spectrum, such as X-rays, they don't do a good job of distinguishing different types of materials. That ability is needed to differentiate something like a tree from a curb, says Guy Satat, a graduate student in the Camera Culture Group at the MIT Media Lab who led the research under group leader Ramesh Raskar.

Instead, today's autonomous navigation systems mostly rely on light detection and ranging (LiDAR) technology, which sends out millions of infrared laser beams every second and measures how long they take to bounce back to determine the distances to objects. But LiDAR, in its present state, "can't see through fog as if fog wasn't there," Satat says.
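That bounce-back measurement is simple arithmetic: the pulse travels out and back, so the distance to an object is the speed of light times the round-trip time, divided by two. A minimal sketch of that calculation, illustrative only and not the researchers' code (the function name is made up for this example):

```python
# Illustrative sketch: how a LiDAR-style system turns a measured
# round-trip time into a distance estimate.
C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to a reflector given the pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is c*t/2.
    """
    return C * t_seconds / 2.0

# Example: a return detected 200 nanoseconds after the pulse fired
# corresponds to an object roughly 30 meters away.
print(distance_from_round_trip(200e-9))  # ~29.98 m
```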
"We're dealing with realistic fog, which is dense, dynamic, and heterogeneous," Satat says. "It is constantly moving and changing, with patches of denser or less-dense fog."

Satat and his team sought a method that would use the shorter, more precise near-visible light rays that humans and animals rely upon to see.

"Why can't you see through fog?" Satat asks. "Because it refracts light rays and jumbles the information that arrives at the human eye, making it impossible to form a clear picture."

The MIT researchers' new system builds on existing LiDAR technology. It uses a time-of-flight camera, which fires short bursts of laser light through a scene clouded by forms. The forms, in this case the fog, scatter the light photons. Onboard software then measures the time it takes photons to return to a sensor on the camera.

The photons that traveled directly through the fog are the quickest to make it to the system, because they aren't scattered by the dense, cloud-like material. "The straight-line photons arrive first. Some arrive later, but the majority will scatter hundreds and thousands of times before they reach the sensor," Satat says. "Of course, they'll arrive much later."

The camera counts the photons that reach it every 56 trillionths of a second, and onboard algorithms calculate the distance light traveled to each of the sensor's 1,024 pixels. That lets the system handle the variations in fog density that foiled earlier systems. In other words, it can deal with circumstances in which each pixel sees a different type of fog, Satat says.

By doing so, the system creates a 3D image of the objects hidden among or behind the material that scatters the light. "We don't need any prior knowledge about the fog and its density, which helps it to work in a wide range of fog conditions," Satat says.
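To make that per-pixel processing concrete, here is a hedged sketch of how a depth might be pulled out of one pixel's photon arrival-time histogram. The actual system fits a statistical model to the scattered-light background; this sketch substitutes a simple moving average for that model, and the function name, the 400-bin histogram, and the synthetic data are all assumptions made for illustration:

```python
# Hedged sketch, not MIT's code: estimate one pixel's depth from its
# photon arrival-time histogram (counts per 56-picosecond time bin).
import numpy as np

BIN_SECONDS = 56e-12   # the camera counts photons every 56 trillionths of a second
C = 299_792_458.0      # speed of light, m/s

def pixel_depth(histogram: np.ndarray, window: int = 25) -> float:
    """Return a depth estimate for one pixel.

    Scattered fog photons form a smooth background; photons that flew
    straight to an object and back add a narrow peak on top of it. A
    moving average stands in here for the background model used in the
    real system; subtracting it isolates the straight-line peak.
    """
    kernel = np.ones(window) / window
    background = np.convolve(histogram, kernel, mode="same")
    signal = histogram - background
    peak_bin = int(np.argmax(signal))   # arrival time of the direct photons
    round_trip = peak_bin * BIN_SECONDS
    return C * round_trip / 2.0         # out and back, so halve it

# Synthetic pixel: a decaying scattered-light background plus a sharp
# return at bin 180 (an object roughly 1.5 m away).
rng = np.random.default_rng(0)
t = np.arange(400)
histogram = rng.poisson(40 * np.exp(-t / 150.0)).astype(float)
histogram[180] += 120.0
print(pixel_depth(histogram))  # ~1.51 m
```

Repeating this estimate for each of the sensor's 1,024 pixels yields the 3D image, and because the background is estimated independently per pixel, each pixel can tolerate a different density of fog.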
The MIT lab has also used its visible-light-range camera to see objects through other scattering materials, such as human skin. That application could eventually serve as an X-ray alternative, he says.

Driving in bad weather conditions is one of the remaining hurdles for autonomous driving technology. This new technology can address that by making autonomous vehicles "super drivers" through the fog, Satat says.

"Self-driving vehicles require super vision," he says. "We want them to be driven better and safer than us, but they should also be able to drive in conditions where we're not able to drive, like fog, rain, or snow."

Jean Thilmany is an independent writer.