Google is teaching its self-driving cars to share the road with cyclists

Google has taught its self-driving cars to recognize cyclists and their behavior so the software can better predict a rider's course. Using sensors and software, the self-driving car can also detect and interpret cyclists' hand signals, Google says in a report released Tuesday. As a result, the cars have learned to be more cautious around cyclists, and they have also learned to recognize different types of bikes, such as tandems and unicycles. Google said, "Our cars recognize cyclists as unique users of the road, and are taught to drive conservatively around them (it helps to have a number of avid cyclists on our engineering team!).
"Through observing cyclists on the roads and private test track, we’ve taught our software to recognize some common riding behaviors, helping our car better predict a cyclist’s course. Our sensors can detect a cyclist’s hand signals as an indication of an intention to make a turn or shift over. Cyclists often make hand signals far in advance of a turn, and our software is designed to remember previous signals from a rider so it can better anticipate a rider’s turn down the road. Because our cars can see 360 degrees, we’re more aware of cyclists on the road—even in the dark."
Google introduced the prototype, a gumdrop-shaped vehicle it designed itself, in June 2015. The self-driving car has no pedals or steering wheel, only sensors and software. Google hopes to commercialize the technology by 2020. The company’s tests still include Lexus RX450h SUVs equipped with autonomous software. Google is testing its self-driving cars, which carry a backup steering wheel and brakes when on public roads, in Mountain View, Calif., Austin, Phoenix, and Kirkland, Wash.


If Google hopes to launch its self-driving cars for public use in just four years, its software will have to learn all of the nuances required for driving. For instance, the company recently shared how its engineers taught its autonomous prototypes when it’s appropriate to honk, and to use different honks and beeps depending on the circumstances. Of course, the cars also need to be able to see cyclists in the first place.

Google relies on a suite of sensors as well as Lidar (light detection and ranging) units, which are large and expensive. Tesla’s autopilot technology, which allows for hands-free driving on highways, does not use Lidar; Tesla CEO Elon Musk has previously called it unnecessary and overkill for what autopilot needs. But that could change. Concerns surrounding the safety of autonomous vehicle technology, and the continued fallout over the death of a Tesla driver using autopilot, could prompt Tesla and other companies to adopt Lidar as autonomous features continue to spread in cars. To be clear, Tesla does use radar, just not Lidar. In later tweets, Tony Fadell, the former CEO and co-founder of Alphabet-owned Nest and now co-founder of Actev Motors, said radar or Lidar should be used. Lidar is also used by other automakers that are testing self-driving cars, including Ford. But it’s unclear whether the price will fall enough for the technology to be deployed widely in commercial self-driving cars.

Over the past couple of months, Google has focused on teaching its autonomous cars the kinds of tasks human drivers handle on a daily basis. Last month, for example, Google’s self-driving car learned to use its horn at appropriate times (tips that many human drivers could use). Now, the search giant has added the ability for its cars to interact with cyclists on public roadways.
In its latest self-driving car report, Google said its cars recognize cyclists as unique users of the road and have been taught to drive conservatively around them. For example, Google’s cars won’t attempt to pass a cyclist who is riding in the middle of a lane, even if there is technically enough room. When a car does pass, it gives the cyclist ample buffer room.
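The passing behavior Google describes amounts to a conservative decision rule. The sketch below is a hypothetical illustration of that kind of rule in Python; the `Cyclist` type, the mid-lane test, and the 1.5-meter buffer threshold are invented assumptions for illustration, not details taken from Google's report.

```python
# Hypothetical sketch of the passing rule described above.
# Names and thresholds are illustrative assumptions, not Google's implementation.

from dataclasses import dataclass

MIN_PASSING_BUFFER_M = 1.5  # assumed minimum lateral clearance when passing


@dataclass
class Cyclist:
    lateral_offset_m: float   # distance from lane center; 0.0 = riding mid-lane
    lane_width_m: float


def should_pass(cyclist: Cyclist, available_clearance_m: float) -> bool:
    """Return True only when a pass is clearly safe and courteous.

    Mirrors the behavior described in the report: do not pass a cyclist
    who is riding in the middle of the lane, even if there is technically
    enough room, and otherwise require an ample lateral buffer.
    """
    riding_mid_lane = abs(cyclist.lateral_offset_m) < 0.3 * cyclist.lane_width_m
    if riding_mid_lane:
        return False  # hold back rather than squeeze past
    return available_clearance_m >= MIN_PASSING_BUFFER_M


# A cyclist hugging the lane edge with 2 m of clearance: pass.
print(should_pass(Cyclist(lateral_offset_m=1.2, lane_width_m=3.5), 2.0))  # True
# A cyclist taking the lane: wait behind, regardless of clearance.
print(should_pass(Cyclist(lateral_offset_m=0.1, lane_width_m=3.5), 2.0))  # False
```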
What’s more, Google cars can now detect a cyclist’s hand signals for intent to turn or change lanes. Because cyclists often make hand signals well in advance of their turn, Google’s software is programmed to remember previous signals to better anticipate future moves down the road.
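That "memory" of an earlier hand signal can be pictured as a small per-cyclist state store that the planner consults later. The Python sketch below is purely illustrative and assumes a perception module that emits per-frame signal detections; the class, the field names, and the 10-second memory window are invented assumptions, not Google's actual software.

```python
# Illustrative sketch of remembering a cyclist's hand signal over time.

import time
from typing import Dict, Optional, Tuple

SIGNAL_MEMORY_S = 10.0  # assumed window for how long a seen signal stays relevant


class CyclistIntentTracker:
    """Keep the most recent hand signal per cyclist so the planner can
    anticipate a turn even after the rider puts both hands back on the bars."""

    def __init__(self) -> None:
        self._last_signal: Dict[int, Tuple[str, float]] = {}  # id -> (signal, timestamp)

    def observe(self, cyclist_id: int, signal: Optional[str]) -> None:
        # signal is e.g. "left_turn" or "right_turn"; None when no arm is raised
        if signal is not None:
            self._last_signal[cyclist_id] = (signal, time.monotonic())

    def predicted_intent(self, cyclist_id: int) -> Optional[str]:
        entry = self._last_signal.get(cyclist_id)
        if entry is None:
            return None
        signal, seen_at = entry
        # A signal given well before the turn still informs the prediction,
        # but only within a bounded memory window.
        return signal if time.monotonic() - seen_at <= SIGNAL_MEMORY_S else None


tracker = CyclistIntentTracker()
tracker.observe(cyclist_id=7, signal="left_turn")  # rider signals early
tracker.observe(cyclist_id=7, signal=None)         # hands back on the bars
print(tracker.predicted_intent(7))                 # still "left_turn"
```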
Google added that its cars can see 360 degrees, even in the dark, and that it has trained them to recognize a wide variety of bikes, including those with big wheels, bikes with car seats, tandem bikes, bikes with multi-colored frames, and even unicycles.

There “is a need for new and better ways for bicyclists and cars to communicate. Right now bicyclists often rely upon eye contact or hand motions from drivers to tell them that the driver is yielding,” said McLeod. “Self-driving cars could benefit from visual indicators meant for bicyclists and pedestrians that help everyone know how the car will behave.”

