A “Black Box” for your Body?

UPDATE: Allstate was recently granted a US patent for a "driving-behavior database that it said might be useful for health insurers, lenders, credit-rating agencies, marketers and potential employers." The program is just in the patent stage for now, but the company says: "the invention has the potential to evaluate drivers' physiological data, including heart rate, blood pressure and electrocardiogram signals, which could be recorded from steering wheel sensors." Similarly, Jaguar Land Rover has a number of projects in progress intended to monitor and assess driver focus and brain activity; analyze physiological data; and incorporate predictions of expected behaviour, all in the name of making driving safer (or so we're told).



Permission has been granted in a Canadian personal injury case to introduce FitBit data into evidence – the collected fitness tracking information will be run through an analytics program in order to compare it to the activity levels of the general population.  The idea, in this instance, is to quantitatively demonstrate the effects that the plaintiff’s injury has had on her abilities and activity.

Sounds simple, right?  She has the data, she’s volunteering it, and being able to ground her claim in data is likely to be quite persuasive.  

Why should we worry about this? The concern lies in the broader potential implication: situations where the data is *not* volunteered but subpoenaed.  Consider situations where the activity data isn’t used to support an individual’s claim, but rather to undermine or even dismiss her entitlement to relief. 

Let’s face it – we have seen, and continue to see, a trend towards admitting as evidence social network posts and similar information, even where “privacy settings” have been utilized in an effort to keep the information private.  This information has been ruled to be presumptively public despite the poster’s expressed intention to the contrary.  Given that, it seems likely that the same presumption of publicness will apply to the data generated by FitBits, heart rate monitors, pedometers, smartphone fitness apps, and other devices that collect and record personal fitness and performance information.

The notion of the quantified self is an alluring idea, but ultimately it is a new iteration of an old quandary. It starts with the assertion that the more information we can collect and track, the more we can know.  Building on this is the growing belief that, with this information, we can better control our bodies, even perfect them. 

Essentially, this represents a normalization of surveillance—only at the level of individual body processes, and it is we who are surveilling ourselves.

Concern about surveillance and its effect on those who are surveilled is not new, nor is it unreasonable.  The use and misuse of surveillance data has led to unexpected disclosures and, in some cases, to the reinforcement of stereotypes and inequalities. 

In our fascination with the quantified self, it is important to ensure that the information we choose to collect—for our own purposes—does not become the equivalent of an airplane’s “black box”, a public record out of our control and potentially used against us.