Arbitrage Blog



How AI Drones Are Changing Policing

Written by Arbitrage | 2025-11-14


AI-powered drones are quickly becoming the newest "beat cops," lifting off from rooftop docks to scan crash scenes, chase stolen cars, search for missing people, and even deliver Narcan during overdoses. In the United States, their adoption has accelerated: San Francisco police recorded 1,371 drone launches between May 2024 and August 2025 - largely for thefts, burglaries, and robberies. Officials there say drones improved response times and situational awareness. Nationwide, police departments from Miami to Cleveland and Charlotte have recently launched or expanded drone programs, reflecting a broader trend reported this month. Journalists have also documented wider deployments, underscoring how agencies under staffing pressure are leaning on autonomous features such as auto-launch, programmed patrol routes, and AI analytics.


Supporters inside policing frame the drones as force multipliers that keep officers out of harm's way and get eyes on scenes in seconds. In New York City, for example, an NYPD official said ahead of Labor Day 2023 that units would use drones to check large gatherings reported by callers - a glimpse of how commanders view them as mobile cameras for crowd and emergency response. Trade and practitioner outlets make a similar case: AI-assisted "assistant patrol drones" can ease workloads, triage calls, and help clear scenes faster.


Critics warn that turning the sky into a moving sensor grid risks normalizing mass surveillance. The ACLU argues that autonomous patrols - even with a required "remote pilot in command" - could hinder free speech and public life, and urges strict guardrails on when drones are launched and how long footage is kept. Civil liberties groups are especially concerned about pairing drones with face recognition. In Europe, lawmakers moved first, restricting real-time remote biometric identification in public spaces to exceptional, court-authorized cases under the European Union Artificial Intelligence Act.


In the United States, the law is a patchwork. Nearly all non-recreational flights fall under the FAA's Part 107 rules, which now require Remote ID beacons; operators who fail to comply face enforcement. Departments can also seek Certificates of Authorization (COAs) and specific waivers - especially for operations beyond visual line of sight (BVLOS) - to expand how and where they fly. The FAA and vendors have been piloting new pathways for public safety BVLOS, replacing case-by-case permissions with a waiver model that allows routine operations without a visual observer in defined airspace. At the state and local level, rules vary widely: California's AB 481 requires police to publicly disclose policies, seek city-council approval, and file annual reports on "military equipment," which in many cities includes drones and their software. Other legislatures have targeted suppliers or weaponization: Connecticut approved restrictions on state and local agencies' use of Chinese and Russian drones and banned drones equipped with deadly or incendiary weapons, while North Dakota's 2015 law famously permitted police drones with certain "less-than-lethal" options.


Police departments cite faster arrivals, fewer risky foot pursuits, better scene documentation, and the ability to "look first" before sending officers into volatile situations. San Francisco's surge in deployments coincided with claims of improved situational awareness and lower overall crime, although the causation is debated. Communities have also seen humanitarian uses, from disaster mapping to overdose response, where small drones can carry lifesaving supplies. Police officers tend to emphasize practicalities: drones are cheaper than helicopters, can be launched by a patrol car, and - when paired with AI for object detection or anomaly alerts - can comb large areas faster than a small team on foot.


While law enforcement praises the drones' efficiency and safety benefits, citizens and civil rights groups warn that without strict regulation, these autonomous eyes in the sky could turn public safety into constant surveillance. Residents worry about backyard flyovers, data retention, and mission creep from emergency response to day-to-day monitoring of protests or nightlife. Advocates have pushed for warrant requirements, short retention windows, and explicit bans on biometric scanning and weaponization to avoid a "surveillance-by-default" future. Some skeptics counter that "autonomy" is often oversold and that today's systems still rely on humans in the loop - but that is exactly why clear, democratically approved policies and auditing matter now, before truly autonomous patrols become technically feasible and common.


The bottom line is that AI-powered police drones are here, spreading fastest in cities that can afford fleets, docks, and analytics, and in places where councils have already passed transparency ordinances. The technology's promise of quicker responses and safer officers sits in tension with the risks of pervasive, AI-driven surveillance. Whether these flying cops earn public trust will turn on rules that are specific and enforced: tight launch criteria, meaningful public reporting, limits or bans on biometric scanning, careful retention policies, and robust oversight of any autonomous features. As the legal frameworks evolve - from FAA waivers and Remote ID in the U.S. to the EU's hard limits on real-time biometric ID - the debate is shifting from "Should police have drones?" to "Exactly how, when, and under what safeguards can they use them?"
