A voice in public policy

February 20, 2024

From biotech to misinformation, some of today’s biggest policy challenges are informed by researchers at ISI.

Photo of the White House and an American flag (Douglas Rissing/iStock)

In 2024, AI-powered content generation is projected to take a huge leap forward. How could new capabilities, such as fake but realistic-looking video generators, possibly endanger our society? This is the kind of question that legislators might pose to Wael Abd-Almageed, founder of the visual intelligence lab VIMAL at USC Viterbi’s Information Sciences Institute (ISI) and one of the world’s leading experts on deepfake technology.

ADVISING ON AI, BIOTECH AND MORE

In 2023, Abd-Almageed was working with the Department of Homeland Security on deepfake detection strategies. He is not alone: many other ISI-ers actively engage with public policy. As experts and thought leaders, they are often called on to advise legislators on some of today's most complex issues at the intersection of policy and technology.

Yolanda Gil, USC’s Director of New Initiatives in AI and Data Science and an ISI Principal Scientist, has used her technical expertise to collaborate with numerous policymakers throughout her career. In 2024, she will also lead USC’s participation in the federal government’s first-ever consortium dedicated to AI safety: the U.S. AI Safety Institute Consortium (AISIC). Headquartered at the National Institute of Standards and Technology (NIST), this initiative will bring together government, academia, and industry to support the development and deployment of safe and trustworthy AI.

Alexander Titus, a Principal Scientist in ISI’s AI division, has expertise in biotechnology. With a background in the life sciences, Titus was recently appointed one of twelve Commissioners on the National Security Commission on Emerging Biotechnology (NSCEB), which brings together members of Congress, academia, and the private sector. “We’re people who are tasked with thinking deeply about the US government’s collective approach to biotechnology,” Titus said. Weighing applications such as synthetic biology alongside new risks such as pandemic agents and bioweapons, the Commission puts forth policy recommendations aimed at safeguarding against biotech harms while also leveraging the technology’s potential for economic leadership and national security.

Such opportunities to participate in government affairs are as challenging as they are rewarding. “Public service is an important part of a rich personal and professional life,” says Andrea Belz, ISI Research Director and USC Viterbi’s Vice Dean for Transformative Initiatives. Belz also serves on a federal committee responsible for evaluating the impact of a small business innovation funding program within the Department of Defense. “Studying the impact of these policies in the university setting is another way to perform public service and give back,” she said.

For Terry Benzel, ISI’s Director of the Networking and Cybersecurity Division, giving back happens at the state level. Benzel serves on the Technical Advisory Committee for the California Department of Transportation’s Road Charge Program. This initiative is considering a revolutionary change in how the state collects funds to pay for road repair: replacing the standard gas tax, typically collected at the pump, with a per-mile road use fee. It comes in response to the growing number of electric vehicles in California.

But a mileage-tracking system raises familiar network security and privacy issues. This is where Benzel’s expertise comes in. “It gives me a great sense of fulfillment to take my many years of deep research and research management, and put it into the world and society,” she said.

Only time will tell whether the Road Charge Program, along with Benzel’s efforts, will materialize into legislation. “Policy is always a little bit of a dance,” she said. But ISI research has certainly helped move the needle before.

ISI’S WORK IN “BOT” LAW

Over the past decade, Emilio Ferrara, ISI Principal Scientist and Associate Professor of Computer Science and Communications at USC, has studied how social media bots—automated accounts programmed to mimic human behavior online—contribute to disinformation, misinformation, and political distortion. His work has surfaced misleading bot operations in major political events around the globe, leading him to believe that addressing such threats requires not only technological solutions but also policy ones.

Ferrara’s call was heard. In 2018, California Governor Jerry Brown signed Senate Bill 1001, the nation’s first “bot” disclosure regulation, into law; it took effect in 2019. Drawing heavily on research and findings from Ferrara’s lab, the so-called B.O.T. Act, short for Bolstering Online Transparency, mandated that bots identify themselves as automated accounts before attempting to influence real users’ voting or purchasing behavior. “That’s one example of how our work contributed to the creation of laws and regulations,” said Ferrara. It will not be the last.

Last updated on October 1st, 2024
