The idea of autonomous machines doesn’t sit well with everyone — at least not when it comes to driverless cars. You can thank some recent high-profile accidents involving autonomous ride-sharing vehicles for that.
But a less intimidating and perhaps more accurate way to understand the concept of driverless vehicle technology is by describing it as “personalized vehicle control.” That is the preferred phrase of one Texas Engineer who is leading the way by integrating three data sources simultaneously to help improve safety and reliability and eventually bring driverless car technology closer to reality.
Junmin Wang, director of the Mobility Systems Laboratory in the Cockrell School’s Walker Department of Mechanical Engineering and considered one of the world’s leading minds in autonomous vehicle research, is reluctant to use words like “autonomy” when discussing his work, opting instead for “personalized” as a more appropriate, less jarring description.
“In the same way that modern smartphones do so much more than make calls, cars of the future will do much more than get us from one place to another,” he said. “They will be data-driven, smart, intuitive and tailored to individual drivers’ requirements.”
Smartphones constantly collect data on how we use them in order to create a more seamless interaction between human and machine. Though similar in spirit, the technology underpinning personalized vehicle control is more complicated than what you’d find in a smartphone, which might use your mobile search data to determine, for example, how much you like products from a particular store before sending you that store’s special sales.
With an autonomous vehicle, however, there is no margin for error. Simply collecting data on past driving habits and applying that information to future scenarios would be fine if every journey were exactly the same. And while the drive to work can sometimes feel repetitive, no two trips are alike.
According to Amit Bhasin, director of UT’s Center for Transportation Research (CTR), several key elements of that data must first be understood before autonomous vehicles can become a reality.
“It will require a radically different approach to engineering — an approach that allows engineers to search for new solutions by combining research in human behavior, automotive technology and transportation infrastructure altogether,” he said.
That combination is being made possible by Wang, a pioneer in the integration of data who has begun analyzing driving behavior by interconnecting three different data sources — a driving simulator; a standalone engine, or powertrain; and a fully equipped autonomous vehicle prototype designed by UT researchers. He believes this unique approach provides a broad vantage point from which he can observe what drives human behavior when a person is controlling the engine that powers the car.
The Simulator
Wang’s lab recently acquired one of the most sophisticated driving simulators in the world. Its 210-degree, cinema-size screen can display a virtually limitless number of unique driving scenarios, and its specially designed driver’s seat is controlled by a cluster of computers. The computers move the seat in six degrees of freedom, reproducing the physical sensations one might experience in any driving condition. The simulator also generates sounds, such as engine noise and familiar weather effects like heavy rain or wind.
“The simulator is the only one of its kind in the entire country and was designed with our exact research requirements and specifications in mind,” Wang said. “It provides a reliable platform to discover and safely test methods in order to design new infrastructure and renovate existing networks.”
The Powertrain
Data from the simulator is sent in real time to a standalone engine, or powertrain, located in an adjacent lab. The engine is used to determine how individual differences in people’s driving behavior impact performance, as well as to inform artificial intelligence software that will enable the engine to learn the most efficient performance techniques for a range of driving styles. For this project, Wang is using a diesel engine that can be swapped out to also test gasoline, plug-in hybrid or fully electric power sources.
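The loop described here, driving-style data in and engine calibration out, can be sketched in miniature. The sketch below is purely illustrative and is not the lab's actual software: every function name, threshold and calibration label is an assumption, made up to show how throttle-pedal samples from a simulator might be reduced to a style score that selects a matching engine profile.

```python
# Illustrative sketch only: hypothetical names and thresholds,
# not taken from the Mobility Systems Laboratory's software.

def aggressiveness(pedal_positions):
    """Score a driver from 0 to 1 using throttle-pedal samples (each 0..1)."""
    if not pedal_positions:
        return 0.0
    # Mean pedal position is a crude proxy for how hard the driver pushes.
    return sum(pedal_positions) / len(pedal_positions)

def select_engine_map(score):
    """Pick a calibration profile to match the observed driving style."""
    if score < 0.3:
        return "eco"          # gentle driver: prioritize fuel efficiency
    elif score < 0.7:
        return "balanced"
    return "performance"      # aggressive driver: prioritize responsiveness

# Example: pedal samples streamed from one simulated commute.
samples = [0.2, 0.5, 0.8, 0.9, 0.7]
style = aggressiveness(samples)          # 0.62
print(select_engine_map(style))          # prints "balanced"
```

A real system would of course learn from far richer signals (braking, steering, road conditions) and adapt continuously, but the basic shape, summarize behavior and then tune the powertrain to it, is the same.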
The Prototype
Data from both the simulator and the engine are then shared with a fully equipped autonomous vehicle prototype designed by Texas Engineers. The prototype includes multiple remote sensors, such as LIDAR (a laser-based method of measuring distance to surrounding objects), and AI-informed software.
This third and final data source is perhaps the most exciting for many people eagerly awaiting a future where they can just sit back and relax while the car does all the work. CTR demoed all three aspects of Wang’s research at a recent event held at UT’s J.J. Pickle Research Campus for leaders and representatives of the Texas Department of Transportation. Some said the loud “oohs” and “aahs” of the delegation could be heard blocks away from the experiments.
Though researchers like Wang have made great progress in bringing autonomous technologies from concept closer to reality, more data needs to be analyzed and more work needs to be done before we see cars driving themselves down our streets with any frequency.
“I am optimistic that lower levels of personalized autonomy and technology will penetrate the existing automobile market in the near future,” Bhasin said. “Sensors and on-board safety features are rapidly being integrated into several commercially available car models and, while not nearly as exciting as a robot driving a car, they can be very effective in enhancing safety and ultimately making driverless cars a concept that more people can buy into.”