The Defense Advanced Research Projects Agency (DARPA) will hold its third Grand Challenge competition on 3 November 2007. The DARPA Urban Challenge features autonomous ground vehicles conducting simulated military supply missions in a mock urban area. Safe operation in traffic is essential to US military plans to use autonomous ground vehicles to conduct important missions.
The DARPA Urban Challenge is an autonomous vehicle research and development programme with the goal of developing technology that will keep war fighters off the battlefield and out of harm's way. The Urban Challenge features autonomous ground vehicles manoeuvring in a mock city environment, executing simulated military supply missions while merging into moving traffic, navigating traffic circles, negotiating busy intersections, and avoiding obstacles.
The programme is conducted as a series of qualification steps leading to a competitive final event, scheduled to take place on 3 November 2007. DARPA is offering $2m for the fastest qualifying vehicle, with $1m and $500 000 for second and third place.
This programme is an outgrowth of two previous DARPA Grand Challenge autonomous vehicle competitions. The first Grand Challenge event was held in March 2004 and featured a 142-mile desert course; fifteen autonomous ground vehicles attempted the course and none finished. In the 2005 Grand Challenge, four autonomous vehicles successfully completed a 132-mile desert route within the required 10-hour limit, and DARPA awarded the $2m prize to 'Stanley' from Stanford University.
Stanford University
When five autonomous vehicles, including the Stanford Racing Team's winning entry Stanley, finished the 2005 Grand Challenge in the Nevada desert, they passed a milestone in artificial intelligence. The robots in the 2007 Urban Challenge, however, will have to handle traffic. It is a tougher test, one that calls for a new generation of technology.
"In the last Grand Challenge, it did not really matter whether an obstacle was a rock or a bush because either way you would just drive around it," says Sebastian Thrun, an associate professor of computer science and electrical engineering. "The current challenge is to move from just sensing the environment to understanding the environment."
That is because in the Urban Challenge, sponsored by the Defense Advanced Research Projects Agency (DARPA), the competing robots will have to accomplish missions in a simulated city environment, which includes the traffic of the other robots and traffic laws.
This means that on race day, 3 November, the robots will have to not only avoid collisions, but also master concepts that befuddle many humans, such as right of way.
"This has a component of prediction," says Mike Montemerlo, a senior research engineer in the Stanford Artificial Intelligence Lab (SAIL). "There are other intelligent robot drivers out in the world. They are all making decisions. Predicting what they are going to do in the future is a hard problem that is important to driving. Is it my turn at the intersection? Do I have time to get across the intersection before somebody hits me?"
The racing team, based in the School of Engineering, is supported by returning industry team members Intel, MDV-Mohr Davidow Ventures, Red Bull and Volkswagen of America and joined this year by new supporters Applanix, Google and NXP Semiconductors. DARPA also has provided $1 million of funding.
Introducing Junior
Junior is a 2006 Passat wagon whose steering, throttle and brakes have all been modified by engineers at the Volkswagen of America Electronics Research Lab in Palo Alto, California, to be completely computer-controllable. The engineers have also created custom mountings for a bevy of sophisticated sensors.
An important difference between Junior and Stanley is that Junior must be aware of fast-moving objects all around it, while Stanley only had to grapple with stationary objects in front of it. Junior's sensors are therefore much more sophisticated, Thrun says. They include a range-finding laser array that spins to provide a 360°, three-dimensional view of the surrounding environment in near realtime. The laser array is accompanied by a device with six video cameras that 'see' all around the car. Junior also uses bumper-mounted lasers, radar, global positioning system receivers and inertial navigation hardware to collect data about where it is and what is around it.
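As an illustration of what a spinning laser array produces, the sketch below converts a single raw return (range, spin angle, beam elevation) into a 3D point in a vehicle-centred frame. This is generic geometry under an assumed axis convention, not Junior's actual driver code.

```python
# A minimal sketch of how a spinning range-finding laser's raw returns
# can be converted into 3D points in the vehicle's frame. The axis
# convention below is an assumption chosen for the example.

import math

def beam_to_point(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one laser return from spherical to Cartesian coordinates.

    Convention assumed here: x forward, y left, z up.
    """
    horizontal = range_m * math.cos(elevation_rad)  # range projected onto ground plane
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# One full revolution of a multi-beam unit yields a 360-degree point cloud;
# here, a single return 20 m out, 45 degrees to the left, angled 5 degrees down.
point = beam_to_point(range_m=20.0, azimuth_rad=math.radians(45),
                      elevation_rad=math.radians(-5))
```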
Because Junior collects much more data than Stanley did, its computational hardware must be commensurately more powerful, says Montemerlo. Using Intel Core 2 Duo processors (each chip includes multiple processing units), Junior's 'brain' is about four times more powerful than Stanley's.
But what makes Junior truly autonomous will be its software, which is the focus of about a dozen students, faculty and researchers at SAIL. Modules for tasks such as perception, mapping and planning give Junior the machine-learning ability to improve its driving and to convert raw sensor data into a cohesive understanding of its situation. New software development began last autumn, and Montemerlo has been testing some of the team's software modules in simulated traffic situations since the beginning of the year.
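A rough skeleton of such a modular pipeline might look like the sketch below: perception turns raw readings into detections, mapping fuses them into a world model, and planning chooses an action. The module boundaries follow the description above, but all class and method names are invented for illustration.

```python
# Illustrative skeleton of a perception -> mapping -> planning pipeline.
# These are toy stand-ins for the real modules, not SAIL's code.

class Perception:
    def process(self, raw_sensor_data):
        """Turn raw readings into a list of detected objects."""
        return [obj for obj in raw_sensor_data if obj is not None]

class Mapper:
    def __init__(self):
        self.world_model = []

    def update(self, detections):
        """Fuse the latest detections into a persistent world model."""
        self.world_model.extend(detections)
        return self.world_model

class Planner:
    def decide(self, world_model):
        """Pick a driving action given the current world model."""
        return "stop" if world_model else "proceed"

def drive_cycle(raw_sensor_data, perception, mapper, planner):
    """One pass of the sense-model-decide loop, run many times a second."""
    detections = perception.process(raw_sensor_data)
    world = mapper.update(detections)
    return planner.decide(world)

# Example: one detected object in the raw data leads to a "stop" decision.
action = drive_cycle(raw_sensor_data=[{"type": "car"}, None],
                     perception=Perception(), mapper=Mapper(),
                     planner=Planner())
```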
All about Junior
The body
Make and model: 2006 Volkswagen Passat wagon
Engine: 4-cylinder turbo diesel injection
Transmission: Six-speed direct-shift gearbox
Engine cubic capacity: 1968 cc
Fuel consumption: City: 9,2 l/100 km, Highway: 5,5 l/100 km, Combined: 6,8 l/100 km
Power: 103 kW at 4000 rpm
Torque: 320 Nm at 1800–2500 rpm
Top speed: 203 km/h
Acceleration: 0–100 km/h: 10,1 sec
The engine provides power through a high-current prototype alternator and a battery-backed, electronically controlled power system.
The senses
Cutting-edge sensors and custom AI software enable Junior to determine its position and perceive its surroundings, day or night, even in adverse GPS conditions.
Position: position and orientation are estimated by an Applanix POS LV 420 system that provides realtime integration of multiple dual-frequency GPS receivers, a high-performance inertial measurement unit, wheel odometry, and Omnistar’s satellite-based Virtual Base Station service (a simplified sketch of this kind of fusion follows these entries). Realtime accuracy is ≈50 cm and 1/50th of a degree.
Localisation: Junior's position and the path on the road are both optimised in realtime with the help of several active sensors. Two side-facing SICK Lidars, along with a forward-facing RIEGL LMS-Q120 Lidar, allow Junior to find lane markings from brightness differences on the road surface and improve position estimation to within 5 cm.
Perception: for object detection and tracking, a Velodyne HD Lidar looks in every direction 15 times a second, combining 64 individual lasers into millions of 3D points at ranges up to 65 m. In addition, two IBEO ALASCA XT Lidars in the front and two SICK LD-LRS Lidars in the rear handle ranges up to 200 m. This sensor array allows for continuous long-range, high-accuracy coverage of the surrounding environment.
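As flagged under 'Position' above, the following is a deliberately simplified, one-dimensional sketch of the fusion idea: wheel odometry predicts the position forward in time, and each GPS fix corrects it, weighted by the two sources' uncertainties. The real POS LV system estimates full six-degree-of-freedom pose with inertial integration; the function, variable names and numbers below are illustrative assumptions, not Applanix or Stanford code.

```python
# Simplified 1D Kalman filter: odometry predicts, GPS corrects.
# All variances and positions below are made up for illustration.

def kalman_1d(position, variance, odometry_delta, odometry_var,
              gps_position, gps_var):
    """One predict/update cycle of a one-dimensional Kalman filter."""
    # Predict: move by the odometry estimate; uncertainty grows.
    position += odometry_delta
    variance += odometry_var
    # Update: blend in the GPS fix; uncertainty shrinks.
    gain = variance / (variance + gps_var)
    position += gain * (gps_position - position)
    variance *= (1.0 - gain)
    return position, variance

# Start 100 m along the road, drive ~5 m by odometry, then receive a
# GPS fix at 104.6 m; the filter settles between the two estimates.
pos, var = kalman_1d(position=100.0, variance=0.25,
                     odometry_delta=5.0, odometry_var=0.04,
                     gps_position=104.6, gps_var=0.25)
```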
The brains
Hardware: rack-mount servers equipped with Intel’s latest dual- and quad-core processors process data from sensors up to 200 times a second and run all of Junior’s artificial intelligence software.
Software: Junior’s intelligence comes from a suite of integrated, custom-coded modules, including a planner (making decisions, choosing paths), a mapper (transforming sensor data into environment models), a localiser (refining GPS position and road map structure from lane markings), and a controller (turning decisions into driving).
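To give a flavour of the controller's job of turning decisions into driving, here is a toy path-tracking law: steer in proportion to cross-track error, corrected for heading error and speed. This is a standard textbook approach offered as an assumption-laden sketch, not the team's actual controller; the gain and steering limit are made up.

```python
# Toy path-tracking controller: compute a steering angle that drives
# cross-track and heading error toward zero. Illustrative only.

import math

def steering_command(cross_track_error_m, heading_error_rad,
                     speed_mps, gain=0.8, max_steer_rad=0.45):
    """Steering angle from heading error plus a speed-scaled cross-track term."""
    if speed_mps < 0.5:
        speed_mps = 0.5   # avoid over-steering at very low speed
    steer = heading_error_rad + math.atan2(gain * cross_track_error_m,
                                           speed_mps)
    # Clamp to the physical steering limit.
    return max(-max_steer_rad, min(max_steer_rad, steer))

# Example: 0.5 m off the lane centre, heading straight, at 10 m/s.
angle = steering_command(cross_track_error_m=-0.5, heading_error_rad=0.0,
                         speed_mps=10.0)
```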