Say cheese! Privacy and facial recognition

Ben Buckley, Matt Hunter
Linklaters LLP, London, UK

Computer Law & Security Review 27 (2011) 637-640
doi:10.1016/j.clsr.2011.09.011

Abstract

The popular social networking site, Facebook, recently launched a facial recognition tool to help users tag photographs they uploaded to Facebook. This generated significant controversy, arising as much as anything from the company's failure to adequately inform users of this new service and to explain how the technology works. The incident illustrates the sensitivity of facial recognition technology and the potential conflict with data privacy laws. However, facial recognition has been around for some time and is used by businesses and public organisations for a variety of purposes, primarily in relation to law enforcement, border control, photo editing and social networking. There are also indications that the technology could be used by commercial entities for marketing purposes in the future.

This article considers the technology, its practical applications and the manner in which European data protection laws regulate its use. In particular, how much control should we have over our own image? What uses of this technology are, and are not, acceptable? Ultimately, does European data protection law provide an adequate framework for this technology? Is it a framework which protects the privacy of individuals without unduly constraining the development of innovative and beneficial applications and business models?

© 2011 Linklaters LLP. Published by Elsevier Ltd. All rights reserved.

Keywords: Privacy; Facial biometrics; Data protection; Facial recognition; Social networking

1. How does facial recognition technology work?

Facial recognition is interesting. It is a classic example of a task that is trivial for a human but, until recently, has been challenging, if not impossible, for a computer. Just consider the variations in our appearance depending on whether we are wearing glasses or a hat, or have recently acquired a suntan in Saint Tropez or by St. Tropez. Add in the fact that a facial image might be seen from different angles, with different expressions and under different lighting conditions, and the scale of the challenge becomes apparent.

However, great strides have been taken in the last few years, with effective facial recognition products now becoming widely available. Broadly speaking, facial recognition algorithms are either geometric or photometric. Geometric algorithms extract particular features from an image of the subject's face. For example, an algorithm may analyse the size and shape of the eyes, nose, cheekbones and jaw, and their distribution in relation to one another. These details are then used to search for other images with matching details. Photometric algorithms adopt a statistical approach by distilling an image into values and comparing those values with templates to eliminate variances.

New developments in facial recognition applications are increasing in their accuracy and the scope of their use. For example, 3D facial recognition software allows a more comprehensive collection of distinctive features on the surface of a face, such as the contours of the eye sockets, cheekbones, nose and chin. Such applications have the advantage of being largely unaffected by changes in lighting and allow the identification of faces from a variety of angles, including a profile view. Some new applications also make use of skin texture analysis, which captures the visual details of the skin as shown in standard digital or scanned images.
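To make the geometric approach described above concrete, the following is a minimal sketch in Python. The landmark coordinates, the pairwise-distance "signature" and the similarity score are all invented for illustration; this is a toy, not the algorithm used by any product mentioned in this article.

```python
import math

# Toy geometric matcher: a face is reduced to a vector of pairwise
# distances between detected landmarks (eyes, nose tip, chin, ...).
# Real systems locate these landmarks automatically; here the
# coordinates are made up for the example.

def distance_signature(landmarks):
    """Pairwise distances between landmarks, divided by the largest
    distance so the signature does not depend on image resolution."""
    pts = list(landmarks.values())
    dists = [math.dist(a, b) for i, a in enumerate(pts) for b in pts[i + 1:]]
    scale = max(dists)
    return [d / scale for d in dists]

def similarity(sig_a, sig_b):
    """1.0 for identical signatures, falling towards 0 as they diverge."""
    gap = math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))
    return 1.0 / (1.0 + gap)

face_on_file = {"left_eye": (30, 40), "right_eye": (70, 40),
                "nose": (50, 60), "chin": (50, 95)}
# The same face photographed at twice the resolution:
new_photo = {"left_eye": (60, 80), "right_eye": (140, 80),
             "nose": (100, 120), "chin": (100, 190)}

score = similarity(distance_signature(face_on_file),
                   distance_signature(new_photo))
# score is close to 1.0: the relative geometry of the two images matches.
```

The normalisation step is why, as the article notes, such geometric measures tolerate changes in scale, while remaining sensitive to the relative spacing of features that distinguishes one face from another.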
Finally, facial recognition technology is not only used to identify individuals but can also be used to identify particular characteristics of an individual. For example, is the individual a man or a woman? How old are they? What is their physical or emotional state: happy, anxious, tired?

2. Applications of facial recognition technology

2.1. Recent developments

Until recently, one of the most common applications for facial recognition technology was crime prevention, detection and enforcement. The most widespread use has occurred in the United States, where the technology has been used to detect the attendance of wanted criminals and terrorist suspects at large gatherings, such as sporting events and music concerts. It has also been used for law enforcement purposes in Europe. For example, it has been reported that the German Federal Criminal Police Office uses facial recognition tools to scan a centralised database of mug shot images[1] to help recognise unknown suspects, while the London Borough of Newham has trialled a facial recognition system built into the borough-wide CCTV system.[2]

Facial recognition tools have also been used widely in maintaining national border security. For example, the German Federal Police use a voluntary facial recognition system to allow travellers to pass through fully-automated border controls at Frankfurt airport.[3] The UK government also allows travellers with biometric passports to travel through automated border controls that use automated facial recognition technology.[4]

More recently, however, these tools have been adopted for recreational and social networking purposes. The most public example of this development is Facebook's rollout of facial recognition tools. The Facebook facial recognition tool allows users to tag friends more easily by automatically searching uploaded photos for the faces of users' friends. The names of those friends are then suggested to the user for tagging purposes.[5]

Facial recognition has also been rolled out by a number of other technology suppliers in their photo editing and storing software, such as Google's Picasa, Apple's iPhoto, Sony's Picture Motion Browser and Microsoft's Windows Live Photo Gallery.

2.2. The potential power of facial recognition technology

This, however, is merely the tip of the iceberg, and a huge number of other applications are possible. Some of these could serve important, life-saving functions, e.g. in-car systems designed to recognise if a driver is drowsy and to alert the driver accordingly by changing the radio station.

The ability to target marketing material at individuals based on their appearance is one obvious, but as yet largely untapped, opportunity which the technology presents to businesses. This is not simply a matter of speculation; for example, the multinational foodstuffs manufacturer Kraft recently announced a joint project with an unidentified supermarket chain to use the technology to identify and target those shoppers most likely to buy its products. Kraft's CEO summarised the technology as follows:[6]

"If it recognizes that there is a female between 25 and 29 standing there, it may surmise that you are more likely to have minor children at home and give suggestions on how to spice up Kraft Macaroni & Cheese for the kids."

In a similar vein, Adidas has announced that it is working with Intel to create digital walls which recognise the sex and approximate age of individuals who walk by, and present advertisements for the products that those individuals would be most likely to purchase.[7]

At least for the present, it appears that marketing campaigns are using facial recognition only to ascertain the broad demographic category into which an individual falls (e.g. by reference to age and sex). Some fear, however, that by linking to social networking sites or other databases, companies may, in future, be able to target marketing at particular individuals based on their particular personal circumstances.

The future of this technology could deliver much more invasive applications. The Google Goggles app, available on smart phones, allows users to take photographs of inanimate objects and immediately identify them (e.g. wine labels, famous monuments and book covers). Google acknowledges the tool could be rolled out to recognise photographs of individuals. This raises the prospect of your photograph being taken on the street and your life secrets being immediately revealed. Eric Schmidt remarked:[8]

"We do have the relevant facial recognition technology at our disposal. But we haven't implemented this on Google Goggles, because we want to consider the privacy implications and how this feature might be added responsibly. I'm very concerned personally about the union of mobile tracking and face recognition."

Google clearly recognises that a line needs to be drawn somewhere. However, is that line in the right place?
What happens to those who overstep the boundary? These issues are largely determined by rights to privacy conferred by European and domestic law.

3. Data protection implications of facial recognition technology

3.1. How do data protection laws characterise facial biometrics?

European data protection laws protect "personal data", i.e. information about an identified or identifiable living individual.

It has long been accepted that a person's image can, by itself, constitute personal data and benefit from the protection conferred by data protection legislation. In the UK, the Information Commissioner's Office has made this clear in its CCTV code of practice, which states that images of individuals and the information derived from those images relate to individuals, and this data is covered by the Data Protection Act 1998 (the "DPA").[9] However, much depends on the circumstances. For example, it seems less likely that the images captured by Kraft to promote macaroni cheese would be personal data, i.e. it is unlikely that Kraft could identify any of the individuals photographed or that there would ever be any intention to identify those individuals.[10] All Kraft wants is a general demographic assessment.

European data protection laws also apply additional, stricter rules to a sub-category of personal data, "sensitive personal data", which includes data relating to race and ethnicity, sexual life and health. Given that a data subject's facial appearance might constitute personal data, might it also constitute sensitive personal data? It is arguable that facial images are sensitive personal data because, for example, they allow users to determine a subject's racial or ethnic origin. This was the conclusion in the English case, Murray v Express Newspapers & Big Pictures (UK) Ltd [2007] EWHC 1908, though the judge went on to say that the processing of that personal data was justified, as the data was made public by the relevant individual. The potentially bizarre consequences of this decision (for example, that it would be possible to compile a collection of photographs of individuals publicly leaving a particular mosque or a synagogue) are the topic for another article.

A number of other European member states, including the Czech Republic and Estonia, have concluded that this type of information is inherently deserving of enhanced protection and have expanded the definition of sensitive personal data to include biometric data. This suggests that facial images would, in those jurisdictions, automatically be sensitive personal data.

There are, however, powerful public policy reasons for not characterising facial images as sensitive personal data. If such images were sensitive personal data, the grounds required for processing would be greatly restricted, thereby potentially inhibiting even non-invasive uses of facial recognition technology that benefit users. The restrictions on the processing of non-sensitive personal data under the DPA appear well-suited to permitting these non-invasive, beneficial applications of facial recognition technology while prohibiting, or at least heavily restricting, the types of highly invasive uses identified above.

3.2. How do data protection laws control the use of facial recognition technology?

The application of facial recognition technology to an individual's facial image constitutes "processing" of personal data and, therefore, can only take place if a legal justification exists. This means the processing should fall within one of the processing conditions set out in Article 7 of the Data Protection Directive, and individuals ought to be informed of any use of their information under Articles 10 and 11 of that directive. European data protection law also imposes a range of other obligations, e.g. that the data is kept secure, accurate and for no longer than is necessary.[11]

For facial recognition, one likely processing condition is that the relevant individual has consented to the processing. Alternatively, it may be possible to show the processing is in an organisation's legitimate interests and those interests are not overridden by the fundamental rights and freedoms of the individual (the so-called "legitimate interests" test).
Stricter conditions apply to the processing of sensitive personal data, which normally requires consent.

The roll-out of the new Facebook tagging function, described above, provides a useful illustration (assuming Facebook is subject to data protection laws[12]). In rolling out this function, Facebook should have informed individuals of this new feature and ensured it satisfied a relevant processing condition. In this regard, Facebook did not obtain consent from users as to the use of the tool in relation to their images and instead provided it on an "opt-out" basis, such that, in order to avoid the use of the tool, users were required to take action to change their privacy settings on the site. This approach is inconsistent with the "privacy as default" model, which would have dictated an "opt-in" arrangement, generally favoured by European regulators.[13] Whilst it might be possible to rely on the legitimate interests test as an alternative, regulators in Germany and in Switzerland have launched investigations into Facebook's new technology.

Google's Picasa website highlights a more practical difficulty in complying with data privacy laws. The user of Picasa "teaches" the facial recognition system who is in the user's photographs. The Picasa system remembers and applies what it learns to the next photographs uploaded by the same user. However, the Picasa system does not (and could not) seek consent from the people in the photographs to the processing of their personal data, as it has no means of contacting them (in contrast, Facebook's facial technology only works on other Facebook users, so Facebook can easily inform them of that processing). Arguably, this is not an issue for Google, as it is only providing software to the users and is not itself processing the underlying information. However, it would be a different story entirely if Google were to process this information itself (i.e. as a data controller). As it has no direct contact with many of the individuals identified, it would be hard to inform them of this processing or to otherwise justify it under data privacy legislation.

Other forms of facial recognition, e.g. recognising gambling addicts in casinos, minors in age-restricted venues, or drowsy drivers behind a steering wheel, may be able to rely on conditions other than consent or the legitimate interests test. For example, the processing may be necessary to comply with the legal obligations of the data controller or to protect the vital interests of the data subject.

3.3. Data protection and social norms

It is also important to recognise that European data protection principles reflect the public's expectations about how their personal information will be used. Even if, technically, the law permits these uses, how will they be received by the individuals whose information is being processed?

The increased use of facial recognition technology by social networking sites has been met with some nervousness. Recent reports are that the numbers of users and the activity of users on Facebook have decreased.[14] This may in part be attributable to a mistrust of social networking sites when it comes to the level of protection of a user's privacy, and facial recognition might have contributed to this. Indeed, social network sites are now looking to use privacy as a competitive advantage.[15] This could be new ground for competition between the large players like Facebook and Google and will clearly be relevant to the deployment of new technology, such as facial recognition.

4. Conclusion

Facial recognition clearly has the potential to add value and bring benefit to society but could also seriously infringe individuals' privacy rights: it raises the potential for a surveillance society writ large.

Do European data protection laws provide the right model to regulate this technology? Arguably, they do. Their principle-driven approach, both flexible and technologically neutral, allows the development of new and innovative applications whilst curbing excessive and intrusive uses. Nonetheless, this is also very much a question of how those laws are interpreted and applied. Their flexibility, in this instance in relation to whether facial recognition involves processing sensitive personal data, is also their potential downfall. Regulators will need to keep abreast of developments in new facial recognition technologies and new, more intrusive applications of the technology.

Finally, facial recognition is one more example of the dazzling advances in computer science. Applications that were once the realm of science fiction are now reality. This is unlikely to be the last time that an extension to the boundaries of technological possibility raises new and interesting legal issues.

Ben Buckley and Matt Hunter, Linklaters LLP, London.

Notes

1. "Cognitec awarded contract by German federal criminal police office", 20 September 2006.
2. "Robo cop", The Guardian, 13 June 2002.
3. See Fraport AG's website, which includes a video (and background music) to show the technology in action.
4. See the UK Border Agency's website.
5. See the Facebook Blog.
6. "Kraft To Use Facial Recognition Technology To Give You Macaroni Recipes", Forbes, 1 September 2011.
7. "Advertisers start using facial recognition to tailor pitches", Los Angeles Times, 21 August 2011.
8. "Facebook in new privacy row over facial recognition feature", The Guardian, 8 June 2011.
9. See the Information Commissioner's Office CCTV code of practice.
10. See, for example, the Article 29 Working Party's Opinion on the concept of personal data (Working Paper 136). One factor in determining whether or not information is about an identifiable individual is whether the data controller has any intention to make such identification.
11. In Germany a stricter approach to the technology may be taken in the future. A draft bill on protection against severe violations of personal rights on the internet has been updated by the German Minister of the Interior and, although not yet public, it is supposed that it will contain regulations restricting the use of facial recognition software.
12. We assume for these purposes that Facebook is a data controller and is subject to the jurisdiction of European data protection regulators. It has been argued that social networks cannot properly be characterised as data controllers and, therefore, are not subject to European data protection laws, but the Article 29 Working Party's Opinion on the concepts of "controller" and "processor" (Working Paper 169) characterises social networks as data controllers. Again, this is a topic for another article.
13. See, for example, the Article 29 Working Party's Opinion on the definition of consent (Working Paper 187), which stated the position that silence does not constitute consent, nor do pre-ticked boxes, default settings, or opt-out consent processes. Facebook's opt-out tool, it appears, is at odds with this position.
14. "Facebook Global Growth Slows as U.S. Users Decline", Fox Business, 14 June 2011.
15. Coincidentally, as this article was being written, Facebook updated its layout and included a new option which lets users approve or reject tags before they appear on the user's profile.

