San Francisco is set to become the latest U.S. city to invest in software, created by Texas-based BRS Labs, that monitors and memorizes movements as they are captured on security cameras. The software, AISight, watches footage in real time and, like a human would, learns to understand, detect, and report “suspicious or abnormal behavior.”

What exactly counts as suspicious or abnormal behavior? That appears to depend on the environment in which AISight is operating. Its creators say it can flag everything from “unusual loitering” to activity occurring in restricted areas. It could issue an alert after spotting a person leaving a bag unattended in a crowded airport, for instance, or raise an alarm if a person is seen trying to cross a perimeter.
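BRS Labs has not published AISight’s actual algorithms, but the general idea behind behavioral anomaly detection can be illustrated with a toy sketch: learn a statistical baseline of “normal” activity from past observations, then flag new observations that deviate sharply from it. The feature below (dwell time at a station entrance) and the three-standard-deviation threshold are illustrative assumptions, not details of the real product.

```python
import statistics

def fit_baseline(dwell_times):
    # Learn what "normal" dwell time looks like from historical observations.
    return statistics.mean(dwell_times), statistics.stdev(dwell_times)

def is_anomalous(observation, baseline, threshold=3.0):
    # Flag observations more than `threshold` standard deviations from the mean.
    mean, stdev = baseline
    if stdev == 0:
        return observation != mean
    return abs(observation - mean) / stdev > threshold

# Hypothetical dwell times (in seconds) observed at a station entrance.
normal = [12, 15, 9, 14, 11, 13, 10, 16, 12, 14]
baseline = fit_baseline(normal)

print(is_anomalous(13, baseline))   # an ordinary pause: not flagged
print(is_anomalous(600, baseline))  # a ten-minute loiter: flagged
```

Even this simple sketch shows why “unusual loitering” is environment-dependent: the same ten-minute pause that trips an alert at a turnstile would be unremarkable on a station bench, because the learned baseline differs.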

San Francisco’s Municipal Transit Authority believes AISight will give it the capacity to track more than 150 “objects and activities” continuously at 12 MTA train stations in San Francisco, according to public procurement documents. BRS Labs has also reportedly struck a deal to monitor the new World Trade Center site in New York. And late last year it was announced that Houston had purchased AISight to be deployed as part of a “citywide surveillance initiative” to “identify potential criminal or terroristic behavioral activity.” It has also been installed in Louisiana for port security, and authorities in El Paso want to use it to monitor water treatment plants near the Mexico border.

The pioneering product has unsurprisingly been lauded by counter-terrorism industry aficionados, but it has caused alarm among privacy and civil liberties advocates. Like surveillance drones, biometric databases, and bomb-proof trash cans, opponents argue, AISight and similar technologies transform citizens into suspects. Because AISight is used to monitor and detect not just acts of crime but potential acts of crime, based purely on a set of algorithms, it is considered part of the push towards pre-emptive—or “pre-crime”—policing, which treats everyone as a potential criminal and targets people for crimes they have not yet committed (and may never commit)...