Reclaiming YourSelf

“I felt that my silence implied that I *should* be ashamed….”

I LOVE this project, both the explanatory video and the photo shoot to which it refers: Danish journalist Emma Holten, who was victimized by revenge porn, speaking about the importance of consent.

We have seen the results of public shaming of the sexuality of girls and women – we’ve seen it in the suicide of Amanda Todd, in the death of Rehtaeh Parsons, and in the ways others use the threat of releasing or sharing such photos to extort and manipulate girls and women.  And Holten is correct that this is grounded in misogyny, in the hatred and objectification of women.

It is grounded too in the underlying attitude that female bodies and sexuality are wrong.  If these images, these naked bodies, were not presumptively “shameful”, their revelation could not be leveraged as a threat.  The judgments that perpetuate the sharing of such photos (“you shouldn’t have been such a whore”) reinforce and reiterate that shame.

Holten’s response – to refuse to be shamed about her body and sexuality – is a powerful one.  The decision to participate in a photo shoot and release those photos publicly – to actively share images of her body, to refuse to feel shame about her sexuality – is an important one.  By refusing to allow herself to be subverted or silenced, she takes the site/sight of her “shame” and transforms it, making of it not only a moment of resistance but a response and a refutation.  A celebration and a reclamation.

A “Black Box” for your Body?

UPDATE:  Allstate was recently granted a US patent for a "driving-behavior database that it said might be useful for health insurers, lenders, credit-rating agencies, marketers and potential employers." The program is just in the patent stage for now, but the company says: "the invention has the potential to evaluate drivers' physiological data, including heart rate, blood pressure and electrocardiogram signals, which could be recorded from steering wheel sensors."  Similarly, Jaguar/Land Rover has a number of projects in progress intended to monitor and assess driver focus and brain activity; analyze physiological data; and incorporate predictions of expected behaviour, all in the name of making driving safer (or so we're told).

Permission has been granted in a Canadian personal injury case to introduce FitBit data into evidence – the collected fitness tracking information will be run through an analytics program in order to compare it to activity levels of the general population.  The idea, in this instance, is to be able to quantitatively demonstrate the effects that the plaintiff’s injury has had on her abilities and activity.

Sounds simple, right?  She has the data, she’s volunteering it, and being able to ground her claim with data is likely to be quite persuasive.
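To make that concrete, here is a minimal, purely hypothetical sketch of the kind of comparison such an analytics program might perform – setting a claimant’s daily step counts against a population baseline. The function, the data, and the baseline figures are all invented for illustration; nothing here is drawn from the actual case or any real analytics product.

```python
from statistics import mean

def activity_percentile(daily_steps, population_baseline):
    """Fraction of baseline days that fall below the claimant's average."""
    avg = mean(daily_steps)
    below = sum(1 for day in population_baseline if day < avg)
    return below / len(population_baseline)

# Illustrative, made-up numbers: a week of post-injury step counts
# versus a small sample of "general population" daily step counts.
claimant_steps = [2100, 1850, 2400, 1990, 2250, 2600, 2050]
population_sample = [6500, 7200, 5400, 8100, 4900, 7600, 6800, 5900, 7100, 6300]

pct = activity_percentile(claimant_steps, population_sample)
print(f"Claimant's average activity exceeds {pct:.0%} of sampled baseline days")
```

In a sketch like this, an average that exceeds almost none of the baseline days is exactly the kind of quantitative showing a plaintiff might hope will make her reduced activity persuasive to a court.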

Why should we worry about this? What is concerning is the broader potential implication: situations where the data is *not* volunteered but subpoenaed.  Consider situations where the activity data isn’t used to support an individual’s claim, but rather to undermine or even dismiss someone’s entitlement to relief.

Let’s face it – we have seen and continue to see a trend towards admitting social network posts and similar information as evidence, even where “privacy settings” have been used in an effort to keep the information private.  Such information has been ruled to be presumptively public despite the poster’s expressed intention to the contrary.  Given that, it seems likely that the same presumption of publicness will apply to the data generated by FitBits, heart rate monitors, pedometers, smartphone fitness apps, and other devices that collect and record personal fitness/performance information.

The notion of the quantified self is an alluring idea, but ultimately it is a new iteration of an old quandary.  It starts with the assertion that the more information we can collect and track, the more we can know.  Building on this, we see the growth of the common belief that with this information we can better control our bodies, perfect them.

Essentially, this represents a normalization of surveillance, only now at the level of individual bodily processes, and it is we who are surveilling ourselves.

Concern about surveillance and its effect on those who are surveilled is not new, nor is it unreasonable.  The use and misuse of surveillance data has led to unexpected disclosures and, in some cases, to the reinforcement of stereotypes and inequalities.

In our fascination with the quantified self, it is important to ensure that the information we choose to collect for our own purposes does not become the equivalent of an airplane’s “black box”: a public record beyond our control, one that can potentially be used against us.