Police Bodycams: Crossing the Line from Accountability to Shaming


Police bodycams are an emerging high-profile tool in law enforcement, one upon which many hopes for improved oversight, accountability, and even justice are pinned.

When it comes to police bodycams, there are many perspectives:

  • Some celebrate them as an accountability measure, almost an institutionalized sousveillance.
  • For others, they’re an important new contribution to the public record.
  • And where they are not included in the public record, they can at least serve as internal documents, subject to Access to Information legislation.

These are all variations on a theme – the idea that the use of police bodycams and their resulting footage is about public trust and police accountability.

But what happens when they’re used in other ways?

In Spokane, Washington, a recent decision was made to use bodycam footage for shaming as a form of punishment. In the footage, which has obviously been edited, Sgt. Eric Kannberg deals calmly with a belligerent drunk, using de-escalation techniques even after the confrontation turns physical. Ultimately, rather than meting out the typical visit to the drunk tank, the officer opts to proceed via a misdemeanor charge and the ignominy of having the footage posted to the Spokane P.D.'s Facebook page. The implications of this approach for privacy, dignity, and basic humanity are far-reaching.

The Office of the Privacy Commissioner of Canada has issued Guidance for the Use of Body-Worn Cameras by Law Enforcement – guidance that strives to balance privacy and accountability. The guidelines include:

Use and disclosure of recordings

  • The circumstances under which recordings can be viewed. Viewing should only occur on a need-to-know basis: if there is no suspicion of illegal activity having occurred and no allegation of misconduct, recordings should not be viewed.
  • The purposes for which recordings can be used and any limiting circumstances or criteria, for example, excluding sensitive content from recordings being used for training purposes. 
  • Defined limits on the use of video and audio analytics.
  • The circumstances under which recordings can be disclosed to the public, if any, and parameters for any such disclosure. For example, faces and identifying marks of third parties should be blurred and voices distorted wherever possible.
  • The circumstances under which recordings can be disclosed outside the organization, for example, to other government agencies in an active investigation, or to legal representatives as part of the court discovery process.

Clearly, releasing footage in order to shame an individual would not fall within these parameters. 

The posted video has garnered hundreds of thousands of views, and its subject is now threatening to sue. He is supported by the ACLU, which has expressed concerns about both the editing and the release of the footage.

New technologies offer increasingly powerful tools for policing. They may also intersect with old strategies of social control such as gossip and community shaming. The challenge – or at least an important challenge – is whether those intersections should be encouraged or disrupted.

As always, a fresh examination of the privacy implications of a new technology is an important step as we navigate towards new technosocial norms.

Predictive? Or Reinforcing Discriminatory and Inequitable Policing Practices?

Upturn released its report on the use of predictive policing on 31 August 2016.

The report, entitled “Stuck in a Pattern: Early Evidence on Predictive Policing and Civil Rights,” reveals a number of issues with both the technology and its adoption:

  • Lack of transparency about how the systems work
  • Reliance on historical crime data, which may perpetuate inequities in policing rather than provide an objective basis for analysis
  • Over-confidence on the part of law enforcement and the courts in the accuracy, objectivity, and reliability of information produced by the systems
  • Aggressive enforcement as a result of that (over)confidence in the data produced by the systems
  • Lack of auditing or outcome tracking with which to assess system performance and reliability

The report's authors surveyed the 50 largest police forces in the USA and found that at least 20 were using a “predictive policing system,” with another 11 actively exploring options to do so. In addition, they note that “some sources indicate that 150 or more departments may be moving toward these systems with pilots, tests, or new deployments.”

Concurrent with the release of the report, a number of privacy, technology and civil rights organizations released a statement setting forth the following arguments (and expanding upon them).

  1. A lack of transparency about predictive policing systems prevents a meaningful, well-informed public debate.
  2. Predictive policing systems ignore community needs.
  3. Predictive policing systems threaten to undermine the constitutional rights of individuals.
  4. Predictive policing systems are primarily used to intensify enforcement rather than to meet human needs.
  5. Police could use predictive tools to identify which officers might engage in misconduct, but most departments have not done so.
  6. Predictive policing systems are failing to monitor their racial impact.

Signatories of the statement included:

  • The Leadership Conference on Civil and Human Rights
  • 18 Million Rising
  • American Civil Liberties Union
  • Brennan Center for Justice
  • Center for Democracy & Technology
  • Center for Media Justice
  • Color of Change
  • Data & Society Research Institute
  • Demand Progress
  • Electronic Frontier Foundation
  • Free Press
  • Media Mobilizing Project
  • NAACP
  • National Hispanic Media Coalition
  • Open MIC (Open Media and Information Companies Initiative)
  • Open Technology Institute at New America
  • Public Knowledge

The Right(s) to One’s Own Body

In July, police approached a computer engineering professor in Michigan to assist them with unlocking a murder victim’s phone by 3D-printing the victim’s fingerprints. 

It is a well-established principle of law that ‘there is no property in a corpse.’ This means that the law does not regard a corpse as property protected by rights.  So hey, why not, right? 

There is even an easy argument to be made that this is in the public interest.  Certainly, that seems to be how Professor Anil Jain (to whom the police made the request) feels: “If we can assist law enforcement that’s certainly a good service we can do,” he says.   

Marc Rotenberg, President of the Electronic Privacy Information Center (EPIC), notes that if the phone belonged to a crime suspect, rather than a victim, police would be subject to a Supreme Court ruling requiring them to get a search warrant prior to unlocking the phone – with a 3D-printed finger or otherwise.

I’ve got issues with this outside the victim/suspect paradigm though. 

For instance, I find myself wondering about the application of this to live body parts. 

I’ve always been amused by the R v Bentham case, decided by the UK House of Lords in 2005. Bentham broke into a house to commit robbery and, in the course of doing so, used his fingers inside his pocket to make a gun shape. He was arrested. Though he was originally convicted of possessing a firearm or an imitation thereof, that conviction was overturned on the basis that it wasn’t possible for him to “possess” part of his own body. But…if you can’t “possess” your own body, why should the State wait for death before making a 3D copy of it for its own purposes?

And…we do have legislation about body parts, both live and dead – consider the regulation of organ donation and especially payment for organs.  Consider too the regulation of surrogacy, and of new reproductive technologies. 

Maybe this is a new area to ponder – it doesn’t fit neatly into existing jurisprudence and policy around the physical body.  The increasing use of biometric identifiers to protect personal information inevitably raises new issues that must be examined. 

UPDATE: It turns out that the 3D-printed fingerprint replica wasn’t accurate enough to unlock the phone. Undeterred, law enforcement ultimately used a 2D replica on conductive paper, with the details enhanced and filled in manually. This doesn’t really change the underlying concern, does it?