Police and federal agencies in the United States are increasingly employing a new artificial intelligence tool called Track, developed by Veritone, to monitor individuals using physical descriptors such as body size, gender, hair color and style, clothing, and accessories. This approach allows authorities to circumvent growing state and municipal restrictions on conventional facial recognition, which has been criticized for privacy violations and racial bias. Track is currently in use by around 400 clients, including local and state police departments, universities, and federal agencies such as the Departments of Justice, Homeland Security, and Defense. The tool enables users to search through video footage by specifying various attributes and to assemble movement timelines, even tracking individuals whose faces are concealed or obscured.
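Attribute-based search of this kind can be pictured as filtering per-frame person detections by descriptor and sorting the matches into a chronological timeline. The sketch below is a hypothetical illustration in Python; Track's actual data model and matching logic are not public, so every name and field here is an assumption.

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    # One person detection extracted from a video frame (hypothetical schema).
    camera_id: str
    timestamp: float          # seconds since the start of the footage
    attributes: dict = field(default_factory=dict)

def matches(detection: Detection, query: dict) -> bool:
    """Return True if every queried attribute matches this detection."""
    return all(detection.attributes.get(k) == v for k, v in query.items())

def build_timeline(detections: list, query: dict) -> list:
    """Filter detections by an attribute query, then order them in time."""
    hits = [d for d in detections if matches(d, query)]
    return sorted(hits, key=lambda d: d.timestamp)

# Example: trace a person in a red jacket with a backpack across two cameras.
footage = [
    Detection("cam_B", 95.5, {"jacket": "red", "backpack": True}),
    Detection("cam_A", 40.2, {"jacket": "blue", "backpack": False}),
    Detection("cam_A", 12.0, {"jacket": "red", "backpack": True}),
]
timeline = build_timeline(footage, {"jacket": "red", "backpack": True})
for d in timeline:
    print(d.camera_id, d.timestamp)
```

Note that nothing here is biometric: the query keys are ordinary descriptors, which is precisely why this style of search sidesteps rules written around faces and fingerprints.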
The technology's adoption has triggered sharp criticism from privacy advocates like the American Civil Liberties Union. Experts express concern that Track presents similar, if not heightened, privacy threats compared to facial recognition, given its capability to monitor individuals across a variety of video feeds without requiring biometric data like facial features. The ACLU notes the potential for abuse, particularly in federal agencies already under scrutiny for monitoring protesters, activists, immigrants, and students. The software can ingest footage from police body cameras, drones, public platforms like YouTube, and private citizen uploads, expanding the scope of surveillance far beyond what was previously possible with facial recognition alone.
Veritone maintains that Track is primarily a tool to streamline investigations by identifying relevant video segments, not a blanket surveillance solution. Nevertheless, critics warn that this level of persistent, broad tracking could create unprecedented risks to privacy, bypassing the intent of new legal restrictions by relying on non-biometric attributes that can still be highly distinctive over time. Current legal definitions of biometric data are vague and often fail to cover attributes like body size and habitual clothing, even though such attributes can enable comparable levels of surveillance. With laws varying widely by jurisdiction and deployments expanding rapidly, Track's real-world impact, and its legality in places where facial recognition is largely banned, remain unclear, prompting calls for stronger oversight and more comprehensive privacy protections.