
Image Credit: EyeEm Mobile GmbH/iStock
A robotaxi pulls up to the curb in Los Angeles. The front seat is empty, no driver in sight. The customer slides into the back seat, and the car sets off for the destination she typed into the app, its cameras and sensors silently collecting where she goes and perhaps even recording what she looks like or says inside. When the ride ends, payment is automatically processed through a stored credit card, and the passenger is prompted to rate her experience. What happens to all this data? Is it privacy-protected? Is it safe?
New research from an international group of privacy experts reveals a troubling blind spot for data collection in robotaxis: although well-established frameworks help engineers identify privacy risks, there is little practical guidance for choosing the actual tools, called privacy-enhancing technologies, or PETs, that are supposed to safeguard data. To test the state of the field, researchers applied three leading academic methods for choosing PETs to realistic robotaxi scenarios that traced every stage of data collection, from booking to payment to post-ride analytics. Their finding was stark: none of the methods provided sufficient support for protecting riders’ sensitive information.
The study, presented at the 20th International Workshop on Data Privacy Management (DPM), held September 25, 2025, in Toulouse, France, is a collaborative effort among researchers from USC Viterbi’s Information Sciences Institute (ISI), Ulm University, Bosch Research, University of Halle-Wittenberg, Continental Automotive Technologies, Goethe University Frankfurt, and Qualcomm Technologies. “Ultimately, we are focused on privacy engineering,” said David Balenson, associate director of the networking and cybersecurity division at ISI, who worked on the project. “That question brought us together; specifically, what we perceive as a gap in methods for selecting privacy-enhancing technologies.”
What are PETs?
PETs range from well-known tools like encryption and anonymization to advanced methods such as differential privacy, multiparty computation, and homomorphic encryption. The goal is to ensure data can be used for legitimate purposes, like routing a car or processing a payment, without exposing personal details unnecessarily.
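To make one of these techniques concrete, here is a minimal sketch of differential privacy, one of the advanced PETs named above. It is an illustration of the general idea, not code from the study: a service can publish an aggregate statistic, such as how many rides ended in a given neighborhood, after adding calibrated Laplace noise, so the released number is useful while no single rider's presence in the data can be confidently inferred.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query changes by at most 1 when any one rider's record
    is added or removed (sensitivity 1), so Laplace noise with scale
    1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)

# Example: report roughly how many rides ended near a stadium
# without revealing whether any particular rider is in the data.
random.seed(0)  # fixed seed for reproducibility of this illustration
noisy = dp_count(true_count=128, epsilon=1.0)
```

Smaller values of epsilon add more noise and give stronger privacy at the cost of accuracy; this accuracy-versus-privacy tension is exactly the kind of trade-off the selection methods under study are supposed to help engineers navigate.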
Earlier research by the team concentrated on identifying privacy threats and applying broad design strategies. This latest study, however, tackles the harder question of implementation. “This is where the rubber hits the road in that you need to select the actual technologies,” said Balenson. “The challenge is that there are so many PETs, how do you decide which ones to use, and once selected, how do you best implement and configure them for a real-world system?”
And for riders, privacy protections could come with hidden costs, as co-author Ala’a Al-Momani of Ulm University pointed out: “Each PET may enhance privacy from a certain angle, but can also introduce undesirable effects on the customer experience, such as slight delays caused by the additional computation required.”
Analyzing state-of-the-art PET selection
The team tested three academic PET-selection methods against a realistic robotaxi use case. They modeled the full lifecycle of a ride: creating an account, booking, vehicle assignment, the ride itself, payment, and post-ride analytics. For each phase, they cataloged the types of data collected, from names and credit card details to GPS locations, sensor recordings, and even in-car audio. They then applied the three approaches to see which PETs might best protect this data. In parallel, they created their own pragmatic method based on their collective expertise. The study summarized the strengths and weaknesses of each approach, but found that all of them revealed significant limitations. Some relied on oversimplified assumptions about when technologies could be applied, while others treated threats in isolation. Many failed to account for the interconnected nature of modern systems, where a privacy solution applied to one component might affect others.
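The lifecycle modeling the team describes can be pictured as a catalog mapping each ride phase to the data it collects and the PETs that might protect it. The phase names, data items, and PET pairings below are illustrative assumptions for this sketch, not the study's actual tables; the point is that the same data item can surface in several phases, which is why PETs cannot be selected for one component in isolation.

```python
# Illustrative catalog of a robotaxi ride's data lifecycle.
# Phase names, data items, and candidate PETs are assumptions for
# illustration, not the mappings analyzed in the study.
RIDE_DATA_CATALOG = {
    "account_creation": {
        "data": ["name", "email", "credit_card"],
        "candidate_pets": ["encryption at rest", "tokenization"],
    },
    "booking": {
        "data": ["pickup_location", "destination"],
        "candidate_pets": ["location obfuscation"],
    },
    "ride": {
        "data": ["gps_trace", "cabin_audio", "sensor_recordings"],
        "candidate_pets": ["on-device processing", "data minimization"],
    },
    "payment": {
        "data": ["credit_card", "fare_amount"],
        "candidate_pets": ["tokenization", "secure multiparty computation"],
    },
    "post_ride_analytics": {
        "data": ["aggregated_trips"],
        "candidate_pets": ["differential privacy", "anonymization"],
    },
}

def pets_protecting(data_item: str) -> set[str]:
    """Collect every candidate PET from phases that handle data_item."""
    return {
        pet
        for phase in RIDE_DATA_CATALOG.values()
        if data_item in phase["data"]
        for pet in phase["candidate_pets"]
    }
```

For example, `pets_protecting("credit_card")` pulls candidates from both the account-creation and payment phases, showing how a choice made for one phase constrains the others.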
The team’s pragmatic approach worked better in practice, but it required expert knowledge to apply effectively, which is a significant drawback for widespread adoption. “Privacy threat models are mature, but PET selection methods lag behind,” said Balenson. “Without reliable methods for doing this, robotaxis and other AI services could lose public trust. Privacy engineering must evolve alongside autonomous systems.”
With their work, the research team is calling for more agile, integrated, and iterative approaches to PET selection, according to Balenson. They emphasize the need to align privacy engineering with compliance and real-world constraints. “Our goal is to lay the foundation for trustworthy, privacy-respecting mobility systems of the future,” he said. “Ultimately, a supporting method to select the right PET would enable robotaxi architects and engineers to design privacy-enhanced vehicles that transport passengers safely from point A to point B while also respecting their privacy,” said Al-Momani.
“The challenge of PET selection is not restricted to robotaxis,” said co-author Jonathan Petit of Qualcomm Technologies. “We encourage the privacy engineering community to develop best practices that support developers in all application domains.”
Published on September 24th, 2025
Last updated on September 25th, 2025