ScoreAssured’s unsettling assurances
Hearing a lot of talk about Tenant Assured – an offering from new UK company ScoreAssured. Pitched as an assessment tool for “basic information” as well as “tenant worthiness,” Tenant Assured scrapes content – including conversations and private messages – from social media sites (so far Facebook, Twitter, LinkedIn, and Instagram) and then runs the data through natural language processing and other analytic software to produce a report.
The report rates the selected individual on five “traits” – extraversion, neuroticism, openness, agreeableness, and conscientiousness. The landlord never directly views the potential tenant’s posts, but the report will include detailed information such as activity times, particular phrases, pet ownership, etc.
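To make the mechanics concrete, here is a deliberately crude sketch of what keyword-driven “trait” scoring can look like. To be clear, this is my own toy illustration in Python – the lexicons and scoring are invented for the example and say nothing about how Tenant Assured’s actual model works.

```python
import re
from collections import Counter

# Toy lexicons: invented keyword lists standing in for whatever features
# a real "Big Five" text model would use. Purely illustrative.
TRAIT_LEXICONS = {
    "extraversion":      {"party", "friends", "tonight", "club"},
    "neuroticism":       {"worried", "stressed", "anxious", "hate"},
    "openness":          {"art", "travel", "ideas", "curious"},
    "agreeableness":     {"thanks", "love", "appreciate", "sorry"},
    "conscientiousness": {"deadline", "plan", "schedule", "finished"},
}

def score_traits(posts):
    """Rate a pile of scraped posts on five 'traits' by crude keyword counting."""
    words = Counter(re.findall(r"[a-z']+", " ".join(posts).lower()))
    total = sum(words.values()) or 1
    # Score = share of words matching each trait lexicon, scaled to 0-100.
    return {
        trait: round(100 * sum(words[w] for w in lexicon) / total, 1)
        for trait, lexicon in TRAIT_LEXICONS.items()
    }

if __name__ == "__main__":
    sample = [
        "Big party with friends tonight!",
        "Stressed about the rent deadline, need a plan.",
    ]
    print(score_traits(sample))
```

Crude as it is, the sketch makes the point: once posts are reduced to word counts and lexicon matches, the output looks authoritative while the judgments baked into the lexicons stay invisible.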
Is this really anything new? We know that employers, college admissions offices, and even prospective landlords have long used social media reviews as part of their background-check process.
Tenant Assured would say that at least with their service the individual is asked for consent and provides it. And that is, at least nominally, true. But let’s face it – consent requested as part of a tenancy application is comparable to consent for a background check on an employment application: “voluntary” only if you’re willing to go no further in the process. Saying “no” is perceived as a warning flag that will likely result in not being hired or not getting housing. Declining jobs and/or accommodations is not a luxury everyone can afford.
Asked about the possibility of a backlash from users, co-founder Steve Thornhill confidently asserted that “people will give up their privacy to get something they want.” That may be the case…but personally I’m concerned that people may be forced to give up their privacy to get something they urgently need (or quite reasonably want).
But let’s presume for a second that the consent is “freely” given. Problems with this model remain:
- Reports may include information such as pregnancy, political opinions, age, etc. – information that is protected by human rights codes. (Thornhill says, “all we can do is give them the information, it’s up to landlords to do the right thing”)
- Performed identity – our self-presentation on social media sites is constructed for particular (imagined) audiences. To remove it from that context does not render it presumptively true or reliable – quite the opposite.
- Invisibility of standards – how are these traits being assessed? What values are being associated with particular behaviours, phrases, and activities, and are they justified? An individual who currently works in a bar or nightclub, for instance, might show activity and language that causes them to be rated negatively as an excessive partier or as unstable. In fact, the Telegraph demonstrated this by running reports on its financial journalists (people who, for obvious reasons, tend to use words like “fraud” and “loan” rather frequently) and, sure enough, the algorithm rated them negatively on “financial stability”. (A toy sketch of this failure mode follows this list.)
- Unlike credit bureaus, which are covered by consumer protection laws, this sector is unregulated. What that means, among other things, is that there is not necessarily any way for an individual to know what is included in their report, let alone challenge its accuracy or completeness.
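Here is the toy sketch promised above: a naive “financial stability” rule in the same spirit as the earlier example. The risk words and threshold are my own invention, but they show how context-blind keyword counting will flag anyone – a financial journalist, say – whose working vocabulary happens to overlap with the list.

```python
# Toy "financial stability" flag in the same spirit as the earlier sketch:
# hypothetical risk words and threshold chosen by me, not by any real product.
RISK_WORDS = {"fraud", "loan", "debt", "bankrupt", "default"}

def financial_stability_flag(posts, threshold=3):
    """Flag a profile as 'unstable' once enough risk words appear, regardless of context."""
    text = " ".join(posts).lower()
    hits = sum(text.count(word) for word in RISK_WORDS)
    return "flagged" if hits >= threshold else "ok"

# A financial journalist's work-related posts trip the rule immediately.
journalist_posts = [
    "Filed my piece on the payday loan industry today.",
    "Interviewing regulators about mortgage fraud and default rates.",
    "New column: what rising household debt means for renters.",
]
print(financial_stability_flag(journalist_posts))  # -> "flagged"
```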
The Washington Post quite correctly identifies this as a qualitative shift in social media monitoring, writing that “…Score Assured, with its reliance on algorithmic models and its demand that users share complete account access, is something decidedly different from the sort of social media audits we’re used to seeing. Those are like a cursory quality-control check; this is more analogous to data strip-mining.”
Would it fly here?
Again, we know that background checks and credit checks for prospective tenants aren’t new. We also know that, in Canada at least, our Information and Privacy Commissioners have had occasion to weigh in on these issues.
In 2004, tenant screening in Ontario suffered a setback when the Information and Privacy Commissioner of Ontario instructed the (then) Ontario Rental Housing Tribunal to stop releasing so much personal information in its final orders. As a result, names are now routinely removed from the orders, making it significantly more difficult to scrape the records in bulk. As for individual queries, unless you already know the names of the parties, the particular rental address, and the file number, you will probably not be able to find anything about a person’s history in such matters.
Now, with the release of PIPEDA Report of Findings #2016-002, Feb 19, 2016 (posted 20 May 2016), that line of business is even more firmly shuttered. There, the OPC investigated a “bad tenant” list maintained by a landlord association. The investigation raised numerous concerns about the list:
- Lack of consent by individuals for their information to be collected and used for such a purpose
- Lack of accountability – there was no way for individuals to ascertain whether any information about them was on the bad tenant list, who had placed it there, or what the information was.
- At the same time, the landlord association was not assessing the accuracy or credibility of any of the personal information it collected, placed on the list, and regularly disclosed to other landlords, who then made decisions based upon it.
- Further, there was no way to ensure the accuracy of the information on the list, and no way for individuals to challenge its accuracy or completeness.
It was the finding of the Privacy Commissioner of Canada that, by maintaining and sharing this information, the association was acting as a credit reporting agency, albeit without the requisite licence from the province. Accordingly, the Commissioner found that the purpose for which the tenant personal information was collected, used, or disclosed was not appropriate under s.5(3) of PIPEDA. The association, despite disagreeing with its characterization as a credit bureau, implemented the recommendation to destroy the “bad tenant” list, to cease collecting information for such a list, and to no longer share personal information about prospective tenants without explicit consent.
This is good news, but the temptation to monetize violations of privacy continues. ScoreAssured has expansive plans: it anticipates launching (by the end of July 2016) similar “report” products targeted at human resources officers and employers, as well as parents seeking nannies.
“If you’re living a normal life,” Thornhill asserts, “then, frankly, you have nothing to worry about.” We all need to ask: who defines “normal”? And since when is a corporation’s definition of “normal” the standard for access to things as basic to human dignity as employment and housing?