Departmental Policies

Schenectady

The NYCLU conducted an exhaustive inquiry into the policies of each police department in our survey. Through that inquiry, we found that the Schenectady Police Department:

  • Was the only police department included in our survey that acknowledged using predictive policing software, noting that the Department had acquired and deployed the Esri ArcGIS mapping program. The Department stated that they “saturate law enforcement presence” in so-called “hot spots” identified by the software but that they do not have written policies regarding this practice.
  • Heavily redacted its use of force policy. Unlike other departments that provided full and unredacted policies, the Schenectady Police Department withheld information on the types of weapons carried by officers, the situations authorizing use of pepper spray, and the range of force options an officer is permitted to use in response to a person’s actions. These redactions make it impossible to fully assess the merits of the policy as a whole.
  • Completely redacted the criteria for activating the Department’s “Emotionally Disturbed Person Response Team” and the procedures the team must follow. People experiencing a crisis or living with intellectual and developmental disabilities face increased risk of harm during police encounters. Without access to the Department’s policies for these interactions, it is impossible to know whether the Schenectady Police Department adequately addresses these risks when responding to people in distress.
  • Did not provide any data on the number of stops or field interviews carried out by officers, nor did the Department provide any data on investigations into allegations of officer misconduct.

Policy Spotlight: Predictive Policing Software

“Predictive policing” tools are marketed as helping law enforcement predict future crime by running relevant data through an algorithm. But the predictions are only as good as the data the algorithms analyze: if incomplete or racially biased enforcement data goes into the system, the usefulness of any prediction that comes out is undermined.

The Schenectady Police Department told us that they input “all types of criminal activity” into their software for analysis in order to produce a map to “assist in predicting where, when, and what types of crimes will be occurring in particular areas.” Based on these maps, the Department saturates certain locations with police presence and “encourage[s] increasing stops and field interview cards within these areas,” though they noted that they have no written policies regarding these tactics.

Without better data and steps to account for biases, predictive policing software mostly just predicts where police will be deployed and where enforcement activity will be encouraged. The risk of perpetuating racially biased enforcement patterns is of particular concern: we learned from our data requests that arrests for low-level offenses already disproportionately target communities of color, and the Schenectady Police Department told us that their unwritten rule is to encourage additional enforcement action in places identified by the algorithm. The Department should have written and publicly available standards governing their use of this software. But as an initial matter, there needs to be more public debate and scrutiny of these types of tools: before police departments acquire or deploy any new surveillance or predictive policing tool, the public should have an opportunity to review and question the impact it will have on communities.
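
The feedback loop described above can be made concrete with a toy simulation. The sketch below is purely illustrative: it does not model the Esri ArcGIS software or any Schenectady data, and every number in it is hypothetical. It assumes two neighborhoods with identical true offense rates but a skewed historical record, and allocates patrols in proportion to recorded incidents, as a simple hot-spot model would.

    # Toy simulation of a hot-spot feedback loop. Illustrative only:
    # not the Esri ArcGIS software, and all numbers are hypothetical.
    import random

    random.seed(0)

    # Two neighborhoods with the SAME underlying rate of offenses.
    TRUE_OFFENSE_RATE = {"A": 0.05, "B": 0.05}

    # Historical records are skewed because neighborhood A was
    # patrolled more heavily in the past.
    recorded_incidents = {"A": 60, "B": 20}

    PATROLS_PER_ROUND = 100

    for round_num in range(1, 6):
        # "Predictive" step: send patrols in proportion to recorded
        # incidents -- the hot-spot logic described in the report.
        total = sum(recorded_incidents.values())
        patrols = {hood: round(PATROLS_PER_ROUND * n / total)
                   for hood, n in recorded_incidents.items()}

        # Enforcement step: offenses are recorded only where patrols
        # go, at the same true rate in both neighborhoods.
        for hood, n_patrols in patrols.items():
            recorded_incidents[hood] += sum(
                random.random() < TRUE_OFFENSE_RATE[hood]
                for _ in range(n_patrols))

        share_a = recorded_incidents["A"] / sum(recorded_incidents.values())
        print(f"round {round_num}: patrols A={patrols['A']} B={patrols['B']}, "
              f"share of records in A = {share_a:.0%}")

Because new records can only be generated where patrols are sent, the initial skew in the records sustains itself round after round even though the two neighborhoods are identical; the software keeps “confirming” a hot spot that biased data created.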

Documents received by August 2017