{"id":16058,"date":"2019-08-12T08:10:04","date_gmt":"2019-08-12T16:10:04","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2019\/08\/12\/news-9801\/"},"modified":"2019-08-12T08:10:04","modified_gmt":"2019-08-12T16:10:04","slug":"news-9801","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2019\/08\/12\/news-9801\/","title":{"rendered":"Facial recognition technology: force for good or privacy threat?"},"content":{"rendered":"<p><strong>Credit to Author: Christopher Boyd| Date: Mon, 12 Aug 2019 15:00:00 +0000<\/strong><\/p>\n<p>All across the world, governments and corporations are looking to invest in or develop facial recognition technology. From law enforcement to marketing campaigns, facial recognition is poised to make a splashy entrance into the mainstream. Biometrics are big business, and third party contracts generate significant profits for all. However, those profits often come at the expense of users.<\/p>\n<p>There\u2019s much to be said for ethics, privacy, and legality in facial recognition tech\u2014unfortunately, not much of it is pretty. We thought it was high time we take a hard look at this burgeoning field to see <em>exactly<\/em> what&#8217;s going on around the world, behind the scenes and at the forefront.<\/p>\n<p>As it turns out&#8230;quite a lot.<\/p>\n<h3>The next big thing in tech?<\/h3>\n<p>Wherever you look, government bodies, law enforcement, protestors, campaigners, pressure and policy groups, and even the tech developers themselves are at odds. Some want an increase in biometric surveillance, others highlight flaws due to bias in programming.<\/p>\n<p>One US city has banned facial tech outright, while some nations want to embrace it fully. Airport <a href=\"https:\/\/www.pcmag.com\/encyclopedia\/term\/59748\/cctv\" target=\"_blank\" rel=\"noopener noreferrer\">closed-circuit TV (CCTV)<\/a>? Fighting crime with shoulder-mounted cams? 
How about just selling products in a shopping mall using facial tracking to find interested customers? It\u2019s a non-stop battlefield with new lines being drawn in the sand 24\/7.<\/p>\n<h3>Setting the scene: the 1960s<\/h3>\n<p>Facial recognition tech is not new. It was first conceptualised and worked on seriously in the mid &#8217;60s by pioneers such as <a href=\"http:\/\/www.historyofinformation.com\/detail.php?entryid=2495\" target=\"_blank\" rel=\"noopener noreferrer\">Helen Chan Wolf and Woodrow Bledsoe<\/a>. They did what they could to account for variances in imagery caused by degrees of head rotation, using <a href=\"https:\/\/en.wikipedia.org\/wiki\/RAND_Tablet\" target=\"_blank\" rel=\"noopener noreferrer\">RAND tablets<\/a> to map 20 distances based on facial coordinates. From there, a name was assigned to each image. The computer then tried to compensate for head rotation in the distances it had already calculated, and recognise the correct individual placed before it.<\/p>\n<p>Work continued throughout the &#8217;60s, and was by all accounts successful. The computers used <a href=\"https:\/\/web.archive.org\/web\/20070326222249\/http:\/www.utexas.edu\/faculty\/council\/1998-1999\/memorials\/Bledsoe\/bledsoe.html\" target=\"_blank\" rel=\"noopener noreferrer\">consistently outperformed<\/a> humans where recognition tasks were concerned.<\/p>\n<h3>Moving on: the 1990s<\/h3>\n<p>By the mid to late &#8217;90s, airports, banks, and government buildings were making use of tech essentially built on its original premise. A new <a href=\"https:\/\/link.springer.com\/chapter\/10.1007\/978-1-4471-3087-1_65\" target=\"_blank\" rel=\"noopener noreferrer\">tool<\/a>, ZN-face, was designed to work with less-than-ideal angles of faces. 
It ignored obstructions, such as beards and glasses, to <a href=\"https:\/\/www.sciencedaily.com\/releases\/1997\/11\/971112070100.htm\" target=\"_blank\" rel=\"noopener noreferrer\">accurately determine the identity<\/a> of the person in the lens. Previously, this type of technology could flounder without clear, unobstructed shots, which made it difficult for software operators to determine someone&#8217;s identity. ZN-face could determine whether it had a match in 13 seconds.<\/p>\n<p>You can see a good rundown of these and other notable moments in early facial recognition development on <a href=\"http:\/\/adrianomescia.com\/facial-recognition\/\" target=\"_blank\" rel=\"noopener noreferrer\">this timeline.<\/a> It runs from the &#8217;60s right up to the mid &#8217;90s.<\/p>\n<h3>The here and now<\/h3>\n<p>Looking at the global picture for a snapshot of current facial recognition tech reveals\u2026well, chaos to be honest. Several distinct flavours inhabit various regions. In the UK, law enforcement rallies the banners for endless automated facial recognition trials. This despite test results so bad the universal response from researchers and even Members of Parliament is essentially \u201cplease stop.\u201d<\/p>\n<p>Reception in the United States is a little frostier. Corporations jostle for contracts, and individual cities either accept or totally reject what\u2019s on offer. As for Asia, Hong Kong experiences something akin to actual dystopian cyberpunk. 
Protestors not only <a href=\"https:\/\/www.cbc.ca\/news\/world\/hong-kong-protest-lasers-facial-recognition-technology-1.5240651\" target=\"_blank\" rel=\"noopener noreferrer\">evade facial recognition<\/a> tech but attempt to turn it back on the government.<\/p>\n<p>Let\u2019s begin with British police efforts to convince everyone that seemingly faulty tech is as good as they claim.<\/p>\n<h3>All around the world: The UK<\/h3>\n<p>The UK is no stranger to biometrics controversy, having made <a href=\"https:\/\/euobserver.com\/justice\/141919\" target=\"_blank\" rel=\"noopener noreferrer\">occasional forays into breach of privacy and stolen personal information<\/a>. A region <a href=\"https:\/\/www.bbc.co.uk\/news\/10164331\" target=\"_blank\" rel=\"noopener noreferrer\">averse to identity cards and national databases<\/a>, it still makes use of biometrics in other ways.<\/p>\n<p>Here&#8217;s an example of a small slice of everyday biometric activity in the UK. Non-European residents pay for <a href=\"https:\/\/www.rcn.org.uk\/get-help\/member-support-services\/immigration-advice-service\/biometric-residence-permit\" target=\"_blank\" rel=\"noopener noreferrer\">Biometric Residence Permits<\/a> every visa renewal\u2014typically every 30 months. 
Those cards contain biometric information alongside a photograph, visa conditions, and other pertinent information linked to several <a href=\"https:\/\/www.gov.uk\/government\/organisations\/home-office\" target=\"_blank\" rel=\"noopener noreferrer\">Home Office<\/a> databases.<\/p>\n<p>This <a href=\"https:\/\/www.whatdotheyknow.com\/request\/467499\/response\/1153541\/attach\/3\/47497%2520King%2520Response.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Freedom of Information request<\/a> reveals that information on one Biometric Residence Permit card is tied to four separate databases:<\/p>\n<ul>\n<li>Immigration and Asylum Biometric System (Combined fingerprint and facial image database)<\/li>\n<li>Her Majesty\u2019s Passport Office Passports Main Index (Facial image only database)<\/li>\n<li>Caseworking Immigration Database Image Store (Facial image only database)<\/li>\n<li>Biometric Residence Permit document store (Combined fingerprint and facial image database)<\/li>\n<\/ul>\n<p>It&#8217;s worth noting that these are just the ones they\u2019re able to share. On top of this, the UK&#8217;s Data Protection Act <a href=\"https:\/\/www.theregister.co.uk\/2019\/01\/17\/data_protection_act_legal_challenge_immigration\/\" target=\"_blank\" rel=\"noopener noreferrer\">contains an exemption<\/a> that prevents immigrants from accessing their data, or indeed from preventing others from processing it, as is their right under the <a href=\"https:\/\/blog.malwarebytes.com\/glossary\/general-data-protection-regulation-gdpr\/\" target=\"_blank\" rel=\"noopener noreferrer\">General Data Protection Regulation<\/a> (GDPR). 
In practice, this results in a two-tier system for personal data, and it means people can\u2019t access their own case histories when challenging what they feel to be a bad visa decision.<\/p>\n<h4>UK: Some very testing trials<\/h4>\n<p>It is against this volatile backdrop that the UK government wants to introduce facial recognition to the wider public, and residents with biometric cards would almost certainly be the first to feel any impact or fallout should a scheme get out of hand.<\/p>\n<p>British law enforcement have been trialling the technology for quite some time now, but with one problem: All the independent reports claim what\u2019s been taking place is a bit of a disaster.<\/p>\n<p>Big Brother Watch has conducted <a href=\"https:\/\/bigbrotherwatch.org.uk\/all-campaigns\/face-off-campaign\/\">extensive research<\/a> into the various trials, and found that an astonishing 98 percent of automated facial recognition matches at 2018\u2019s Notting Hill Carnival were misidentifications, flagging innocent people as criminals. Faring slightly (but not much) better than the Metropolitan Police were the South Wales Police, who managed to get it wrong 91 percent of the time\u2014yet, just like other regions, continue to promote and roll out the technology. 
On top of that, no fewer than 2,451 people had their biometric photos taken and stored without their knowledge.<\/p>\n<p>Those are some amazing numbers, and indeed the running theme here appears to be: \u201cThis doesn\u2019t work very well and we\u2019re not getting any better at it.\u201d<\/p>\n<p>Researchers at the University of Essex Human Rights Centre <a href=\"https:\/\/www.essex.ac.uk\/news\/2019\/07\/03\/met-police-live-facial-recognition-trial-concerns\" target=\"_blank\" rel=\"noopener noreferrer\">essentially tore the recent trials to pieces<\/a> in a comprehensive rundown of the technology\u2019s current failings.<\/p>\n<ul>\n<li>Across six trials, 42 matches were made by the Live Facial Recognition (LFR) technology, but only eight of those were considered a definite match.<\/li>\n<li>Approaching the tests as if the LFR tech was simply some sort of CCTV device didn\u2019t account for its invasive-by-design nature, or indeed the presence of biometrics and long-term storage without clear disclosure.<\/li>\n<li>There is an absence of clear guidance for the public, and police simply assume the technology is legal despite the lack of any explicit basis in current law, leaving researchers to conclude its use would likely be found unlawful in the courts.<\/li>\n<li>The public might naturally be confused: anyone who didn&#8217;t want to be included in the trial risked law enforcement assuming they had something to hide. 
There\u2019s no better example of this than a man who was fined \u00a390 (US$115) for \u201cdisorderly behaviour\u201d after covering his face to avoid the LFR cameras, because police felt he was up to no good.<\/li>\n<\/ul>\n<p><a href=\"https:\/\/www.youtube.com\/watch?v=KqFyBpcbH9A\" data-rel=\"lightbox-video-0\">https:\/\/www.youtube.com\/watch?v=KqFyBpcbH9A<\/a><\/p>\n<h4>A damning verdict<\/h4>\n<p>The UK\u2019s Science and Technology Committee (made up of <a href=\"https:\/\/www.parliament.uk\/business\/committees\/\" target=\"_blank\" rel=\"noopener noreferrer\">MPs and Lords<\/a>) recently produced their own findings on the trials, and the results were pretty hard hitting. Some highlights from the report, somewhat boringly called \u201c<a href=\"https:\/\/publications.parliament.uk\/pa\/cm201719\/cmselect\/cmsctech\/1970\/1970.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">The work of the Biometrics Commissioner and the Forensic Science Regulator<\/a>\u201d (PDF):<\/p>\n<ul>\n<li>Concerns were raised that UK law enforcement is either unaware of or \u201cstruggling to comply\u201d with a 2012 High Court ruling that the indefinite retention of innocent people\u2019s custody images was unlawful\u2014yet the practice still continues. Those concerns are exacerbated when considering those images would potentially be included in image-matching watchlists for any LFR technology making use of custodial images. There is, seemingly, no money available for investing in the manual review and deletion of said images. There are currently some 21 million images of faces and tattoos on record, which will make for a gargantuan task. 
[Page 3]<\/li>\n<li>From page 4, probably the biggest hammer blow for the trials: \u201cWe call on the Government to issue a moratorium on the current use of facial recognition technology and no further trials should take place until a legislative framework has been introduced and guidance on trial protocols, and an oversight and evaluation system, has been established.\u201d<\/li>\n<li>The <a href=\"https:\/\/www.gov.uk\/government\/organisations\/forensic-science-regulator\" target=\"_blank\" rel=\"noopener noreferrer\">Forensic Science Regulator<\/a> isn\u2019t on the lists it needs to be with regard to whistleblowing, so whistleblowers in (say) the LFR sector wouldn\u2019t be as protected by legislation as they would in others. [Page 10]<\/li>\n<\/ul>\n<p>There\u2019s a lot more in there to digest, but essentially, we have a situation where facial recognition technology is failing any and all available tests. We have academics, protest groups, and even MP committees opposing the trials, saying \u201cThe error rate is nearly 100 percent\u201d and \u201cWe need to stop these trials.\u201d We have a massive collection of images, many of which need to be purged instead of being fed into LFR testing. And to add insult to injury, there\u2019s seemingly little scope for whistleblowers to call time on bad behaviour around technology the government may deploy across a nation\u2019s police force.<\/p>\n<h4>UKGOV: Keep on keeping on<\/h4>\n<p>This sounds like quite the recipe for disaster, yet nobody appears to be listening. Law enforcement insists human checks and balances will help address those appalling trial numbers, but so far they <a href=\"https:\/\/www.theregister.co.uk\/2018\/05\/15\/met_police_slammed_inaccurate_facial_recognition\/\" target=\"_blank\" rel=\"noopener noreferrer\">don\u2019t appear to have helped much<\/a>. 
The Home Office <a href=\"https:\/\/www.bbc.co.uk\/news\/technology-49030595\" target=\"_blank\" rel=\"noopener noreferrer\">claims there is public support<\/a> for the use of LFR to combat terrorism and other crimes, but will \u201csupport an open debate\u201d on uses of the technology. What form this debate takes remains to be seen.<\/p>\n<h3>All around the world: the United States<\/h3>\n<p>The US experience with facial recognition tech is fast becoming a commercial one, as big players hope to roll out their custom-made systems to the masses. However, <a href=\"https:\/\/www.nbcnews.com\/news\/us-news\/how-facial-recognition-became-routine-policing-tool-america-n1004251\" target=\"_blank\" rel=\"noopener noreferrer\">many of the same concerns<\/a> that haunt UK operations are present here as well. Lack of oversight, ethical questions, the technology\u2019s failure rate, and bias against marginalised groups are all pressing concerns.<\/p>\n<h4>Corporate concerns<\/h4>\n<p>Amazon, potentially one of the biggest players in this space, has their own custom tech called <a href=\"https:\/\/aws.amazon.com\/rekognition\/\" target=\"_blank\" rel=\"noopener noreferrer\">Rekognition<\/a>. It\u2019s being licensed to <a href=\"https:\/\/www.cnet.com\/news\/what-is-amazon-rekognition-facial-recognition-software\/\" target=\"_blank\" rel=\"noopener noreferrer\">businesses and law enforcement<\/a>, and it\u2019s entirely possible someone may have already experienced it without knowing. 
The American Civil Liberties Union weren\u2019t exactly thrilled about this prospect, and <a href=\"https:\/\/www.cnet.com\/news\/aclu-wants-amazon-to-stop-offering-surveillance-technology-rekognition\/\" target=\"_blank\" rel=\"noopener noreferrer\">said as much<\/a>.<\/p>\n<p>Amazon\u2019s push to roll out its custom tech to law enforcement, and ICE specifically, was met with pushback from multiple groups, <a href=\"https:\/\/thehill.com\/business-a-lobbying\/393583-amazon-employees-protest-sale-of-facial-recognition-tech-to-law\" target=\"_blank\" rel=\"noopener noreferrer\">including the company\u2019s own employees<\/a>. As with many objections to facial recognition technology, the issue was one of human rights. From the open letter:<\/p>\n<blockquote>\n<p><em>\u201cWe refuse to build the platform that powers ICE, and we refuse to contribute to tools that violate human rights. As ethically concerned Amazonians, we demand a choice in what we build, and a say in how it is used.\u201d<\/em><\/p>\n<\/blockquote>\n<p>Even some shareholders have <a href=\"https:\/\/www.cnet.com\/news\/shareholders-demand-amazon-end-facial-recognition-sales-to-government\/\" target=\"_blank\" rel=\"noopener noreferrer\">cold feet<\/a> over the potential uses for this powerful AI-powered recognition system. However, the best response you\u2019ll probably find to some of these concerns from Amazon is a February blogpost called \u201c<a href=\"https:\/\/aws.amazon.com\/blogs\/machine-learning\/some-thoughts-on-facial-recognition-legislation\/\" target=\"_blank\" rel=\"noopener noreferrer\">Some thoughts on facial recognition legislation<\/a>.\u201d<\/p>\n<h4>And in the blue corner<\/h4>\n<p>Not everyone in US commercial tech is fully on board with facial technology, and it\u2019s interesting to see some of the other tech giant responses to working in this field. 
In April, Microsoft revealed they\u2019d <a href=\"https:\/\/www.theverge.com\/2019\/4\/17\/18411757\/microsoft-facial-recognition-sales-refused-police-access\" target=\"_blank\" rel=\"noopener noreferrer\">refused to sell facial tech<\/a> to Californian law enforcement. According to that article, Google flat out refused to sell it to law enforcement too, but they do have other AI-related deals that have <a href=\"https:\/\/www.mintpressnews.com\/government-facial-recognition-software\/245608\/\" target=\"_blank\" rel=\"noopener noreferrer\">caused backlash<\/a>.<\/p>\n<p>The overwhelming concerns were (again) anchored in possible civil rights abuses. Additionally, the already high error rates in LFR married to <a href=\"https:\/\/www.theverge.com\/2019\/1\/25\/18197137\/amazon-rekognition-facial-recognition-bias-race-gender\" target=\"_blank\" rel=\"noopener noreferrer\">potential bias in gender and race<\/a>\u00a0played a part.<\/p>\n<h4>From city to city, the battle rages on<\/h4>\n<p>In a somewhat novel turn of events, San Francisco became the first US city to <a href=\"https:\/\/www.bbc.co.uk\/news\/technology-48276660\" target=\"_blank\" rel=\"noopener noreferrer\">ban facial recognition technology entirely<\/a>. Police, transport authorities, and anyone else who wishes to make use of it will need approval by city administrators. Elsewhere, Orlando <a href=\"https:\/\/www.engadget.com\/2019\/07\/19\/orlando-amazon-rekognition-pilot\/\" target=\"_blank\" rel=\"noopener noreferrer\">passed on Amazon\u2019s Rekognition tech<\/a> after some 15 months of\u2014you guessed it\u2014<a href=\"https:\/\/www.orlandoweekly.com\/Blogs\/archives\/2019\/07\/18\/orlando-cancels-amazon-rekognition-capping-15-months-of-glitches-and-controversy\" target=\"_blank\" rel=\"noopener noreferrer\">glitches and technical problems<\/a>. 
Apparently, things were so problematic that the city never reached a point where it was able to test images.<\/p>\n<p>Over in Brooklyn, NY, the pressure has started to bear down on facial tech on a much smaller, more niche level. The <a href=\"https:\/\/drive.google.com\/file\/d\/1w4ee-poGkDJUkcEMTEAVqHNunplvR087\/view\" target=\"_blank\" rel=\"noopener noreferrer\">No Biometric Barriers to Housing<\/a> Act wants to:<\/p>\n<blockquote>\n<p><em>&#8230;prohibit the use of biometric recognition technology in certain federally assisted dwelling units, and for other purposes.<\/em><\/p>\n<\/blockquote>\n<p>This is a striking development. A growing number of landlords and building owners are inserting <a href=\"https:\/\/blog.malwarebytes.com\/privacy-2\/2019\/06\/smart-cities-difficult-choices-privacy-and-security-on-the-grid\/\" target=\"_blank\" rel=\"noopener noreferrer\">IoT\/smart technology<\/a> into people\u2019s homes. This is happening <a href=\"https:\/\/thebasispoint.com\/what-happens-when-your-landlord-decides-you-have-to-live-in-a-smart-apartment\/\" target=\"_blank\" rel=\"noopener noreferrer\">whether tenants want it or not<\/a>, regardless of how secure <a href=\"https:\/\/www.zdnet.com\/article\/how-crooks-can-cover-up-crimes-by-hacking-iot-cameras-to-show-fake-footage\/\" target=\"_blank\" rel=\"noopener noreferrer\">it may or may not be<\/a>.<\/p>\n<p>While I accept I may be sounding like a broken record, these concerns are valid. Perhaps, just perhaps, privacy isn\u2019t quite as dead as some would like to think. 
Error rates, technical glitches, and the exploitation of certain communities as guinea pigs for emerging technology are <a href=\"https:\/\/ny.curbed.com\/2019\/7\/29\/8934279\/bill-ban-facial-recognition-public-housing-brooklyn-nyc\" target=\"_blank\" rel=\"noopener noreferrer\">all listed as reasons<\/a> for <a href=\"https:\/\/www.economist.com\/united-states\/2019\/05\/23\/america-is-turning-against-facial-recognition-software\" target=\"_blank\" rel=\"noopener noreferrer\">the great United States LFR pushback of 2019<\/a>.<\/p>\n<h3>All around the world: China<\/h3>\n<p>China is already a place deeply wedded to <a href=\"https:\/\/www.scmp.com\/news\/china\/society\/article\/2157883\/drones-facial-recognition-and-social-credit-system-10-ways-china\" target=\"_blank\" rel=\"noopener noreferrer\">multiple tracking\/surveillance systems<\/a>.<\/p>\n<p>There are currently 170 million CCTV cameras in China, with plans to add an additional 400 million between 2018 and 2021. This system is intended to be matched with facial recognition technology tied to multiple daily activities\u2014everything from getting toilet roll in a public restroom to opening doors. Looping it all together will be <a href=\"https:\/\/www.scmp.com\/news\/china\/society\/article\/2115094\/china-build-giant-facial-recognition-database-identify-any\" target=\"_blank\" rel=\"noopener noreferrer\">190 million identity cards<\/a>, with an intended facial recognition accuracy rate of 90 percent.<\/p>\n<p><a href=\"https:\/\/www.youtube.com\/watch?v=lH2gMNrUuEY\" data-rel=\"lightbox-video-1\">https:\/\/www.youtube.com\/watch?v=lH2gMNrUuEY<\/a><\/p>\n<p>People are also attempting to use \u201chyper-realistic face molds\u201d to bypass biometric authentication payment systems. 
There\u2019s certainly no end of innovation taking place from both government and the population at large.<\/p>\n<blockquote class=\"twitter-tweet\" data-lang=\"en-gb\">\n<p dir=\"ltr\" lang=\"en\">Hyper-realistic face molds capable of tricking face recognition payment authentication systems. High chance of being outlawed in China I feel (subtitles mine) <a href=\"https:\/\/t.co\/7kj3AxA2XL\">pic.twitter.com\/7kj3AxA2XL<\/a><\/p>\n<p>\u2014 Matthew Brennan (@mbrennanchina) <a href=\"https:\/\/twitter.com\/mbrennanchina\/status\/1158435099773304833?ref_src=twsrc%5Etfw\">5 August 2019<\/a><\/p>\n<\/blockquote>\n<p><a href=\"https:\/\/platform.twitter.com\/widgets.js\">https:\/\/platform.twitter.com\/widgets.js<\/a><\/p>\n<h3>Hong Kong<\/h3>\n<p>Hong Kong has already experienced a few run-ins with biometrics and facial technology, but mostly for promotional\/marketing purposes. For example, in 2015, a campaign designed to raise awareness of littering across the region made use of DNA and technology produced in the US to shame litterbugs. Taking samples from rubbish found in the streets, they extracted DNA and produced facial reconstructions. Those face mockups were placed on billboards across Hong Kong in high traffic areas and places where the litter was originally recovered.<\/p>\n<p>How accurate these images were is debatable because, as has been noted, \u201cDNA alone can only produce a high probability of what someone looks like,\u201d and the idea was to generate debate, not point fingers.<\/p>\n<p>All the same, wind forward a few years and the tech is being used to <a href=\"https:\/\/www.channelnewsasia.com\/news\/cnainsider\/shaming-jaywalkers-china-facial-recognition-technology-privacy-11196684\" target=\"_blank\" rel=\"noopener noreferrer\">dispense toilet paper and shame jaywalkers<\/a>. 
More seriously, we\u2019re faced with <a href=\"https:\/\/www.bbc.co.uk\/news\/world-asia-china-48607723\" target=\"_blank\" rel=\"noopener noreferrer\">daily protests in Hong Kong<\/a> over the <a href=\"https:\/\/www.amnesty.org\/en\/get-involved\/take-action\/stop-the-hong-kong-extradition-bill\/\" target=\"_blank\" rel=\"noopener noreferrer\">proposed extradition bill<\/a>. With the ability to protest safely at the forefront of people\u2019s minds, facial recognition technology steps up to the plate. Sadly, all it manages to achieve is to make the <a href=\"https:\/\/www.nytimes.com\/2019\/07\/26\/technology\/hong-kong-protests-facial-recognition-surveillance.html\" target=\"_blank\" rel=\"noopener noreferrer\">whole process even more fraught than it already is<\/a>.<\/p>\n<p>Protestors cover their faces, and phone owners disable facial recognition login technology. Police remove identification badges, so people on Telegram channels share personal information about officers and their families. Riot police carry cameras on poles because wall-mounted devices are hampered by <a href=\"https:\/\/observers.france24.com\/en\/20190806-hong-kong-protesters-use-lasers-confuse-police-damage-cameras\" target=\"_blank\" rel=\"noopener noreferrer\">laser pens and spray paint<\/a>.<\/p>\n<blockquote class=\"twitter-tweet\" data-lang=\"en-gb\">\n<p dir=\"ltr\" lang=\"en\">These lasers Hong Kong protesters are pointing at riot police through billowing tear gas, it&#8217;s like something out of a sci-fi movie. 
<a href=\"https:\/\/twitter.com\/hashtag\/AntiELAB?src=hash&amp;ref_src=twsrc%5Etfw\">#AntiELAB<\/a> <a href=\"https:\/\/t.co\/noTllDuc09\">pic.twitter.com\/noTllDuc09<\/a><\/p>\n<p>\u2014 Alejandro Alvarez (@aletweetsnews) <a href=\"https:\/\/twitter.com\/aletweetsnews\/status\/1155484331344826369?ref_src=twsrc%5Etfw\">28 July 2019<\/a><\/p>\n<\/blockquote>\n<p><a href=\"https:\/\/platform.twitter.com\/widgets.js\">https:\/\/platform.twitter.com\/widgets.js<\/a><\/p>\n<h4>Rules and (bending) regulations<\/h4>\n<p>Hong Kong itself has a <a href=\"https:\/\/www.info.gov.hk\/gia\/general\/201906\/05\/P2019060500349.htm\" target=\"_blank\" rel=\"noopener noreferrer\">strict set of rules<\/a> for Automatic Facial Recognition. One protestor attempted to <a href=\"https:\/\/gizmodo.com\/how-hong-kong-s-protestors-are-hindering-and-hijacking-1836732933\" target=\"_blank\" rel=\"noopener noreferrer\">make a home-brew facial recognition system<\/a> using online photos of police officers. The project was eventually shelved because of lack of time, but the escalation of recognition tech development by a regular resident is quite unique.<\/p>\n<p>This may all sound a little bit out there or over the top. Even so, with\u00a0<a href=\"https:\/\/twitter.com\/alicetruong\/status\/1158303198639058946\" target=\"_blank\" rel=\"noopener noreferrer\">1,000 rounds of tear gas<\/a> being fired alongside hundreds of rubber bullets, protestors aren\u2019t <a href=\"https:\/\/twitter.com\/eriiiic\/status\/1157995567504478211\" target=\"_blank\" rel=\"noopener noreferrer\">taking chances<\/a>. For now, we\u2019re getting a birds-eye view of what it would look like if LFR were placed front-and-center in a battle between government oversight and civil rights. Whether it tips the balance one way or the other remains to be seen.<\/p>\n<h3>Watching&#8230;and waiting<\/h3>\n<p>Slow, relentless legal rumblings in the UK are one thing. 
Cities embracing or rejecting the technology in the US is quite another\u2014especially when stances range from citywide policy all the way down to individual housing. At the other end of the spectrum, seeing LFR in Hong Kong protests is an alarming insight into where the state of biometrics and facial recognition could lead if concerns aren&#8217;t addressed head on before implementation.<\/p>\n<p>It seems technology, as it so often does, has raced far ahead of our ability to define its ethical use.<\/p>\n<p>The question is: How do we catch up?<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/privacy-2\/2019\/08\/facial-recognition-technology-force-for-good-or-privacy-threat\/\">Facial recognition technology: force for good or privacy threat?<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes Labs<\/a>.<\/p>\n<p><a href=\"https:\/\/blog.malwarebytes.com\/privacy-2\/2019\/08\/facial-recognition-technology-force-for-good-or-privacy-threat\/\" target=\"bwo\" >https:\/\/blog.malwarebytes.com\/feed\/<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><strong>Credit to Author: Christopher Boyd| Date: Mon, 12 Aug 2019 15:00:00 +0000<\/strong><\/p>\n<table cellpadding='10'>\n<tr>\n<td valign='top' align='center'><a href='https:\/\/blog.malwarebytes.com\/privacy-2\/2019\/08\/facial-recognition-technology-force-for-good-or-privacy-threat\/' title='Facial recognition technology: force for good or privacy threat?'><img src='https:\/\/blog.malwarebytes.com\/wp-content\/uploads\/2019\/08\/shutterstock_1108927796.jpg' border='0'  width='300px'  \/><\/a><\/td>\n<\/tr>\n<tr>\n<td valign='top' align='left'>It seems facial recognition technology, as technology so often does, has raced far ahead of our ability to define its ethical use. 
We take a hard look at major concerns brewing in cities around the world.<\/p>\n<p>Categories: <\/p>\n<ul class=\"post-categories\">\n<li><a href=\"https:\/\/blog.malwarebytes.com\/category\/privacy-2\/\" rel=\"category tag\">Privacy<\/a><\/li>\n<\/ul>\n<p>Tags: <a href=\"https:\/\/blog.malwarebytes.com\/tag\/amazon\/\" rel=\"tag\">amazon<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/brooklyn\/\" rel=\"tag\">brooklyn<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/cctv\/\" rel=\"tag\">cctv<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/china\/\" rel=\"tag\">china<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/facial-recognition\/\" rel=\"tag\">facial recognition<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/hong-kong\/\" rel=\"tag\">hong kong<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/internet-of-things\/\" rel=\"tag\">Internet of Things<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/iot\/\" rel=\"tag\">IoT<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/lfr\/\" rel=\"tag\">LFR<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/microsoft\/\" rel=\"tag\">microsoft<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/orlando\/\" rel=\"tag\">orlando<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/san-francisco\/\" rel=\"tag\">san francisco<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/surveillanceware\/\" rel=\"tag\">surveillanceware<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/uk\/\" rel=\"tag\">uk<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/united-states\/\" rel=\"tag\">united states<\/a><a href=\"https:\/\/blog.malwarebytes.com\/tag\/us\/\" rel=\"tag\">us<\/a><\/p>\n<table width='100%'>\n<tr>\n<td align=right>\n<p><b>(<a href='https:\/\/blog.malwarebytes.com\/privacy-2\/2019\/08\/facial-recognition-technology-force-for-good-or-privacy-threat\/' title='Facial recognition technology: force for good or privacy threat?'>Read 
more&#8230;<\/a>)<\/b><\/p>\n<\/td>\n<\/tr>\n<\/table>\n<\/td>\n<\/tr>\n<\/table>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/privacy-2\/2019\/08\/facial-recognition-technology-force-for-good-or-privacy-threat\/\">Facial recognition technology: force for good or privacy threat?<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes Labs<\/a>.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10488,10378],"tags":[5588,6285,851,402,14753,5729,6269,10495,22646,10516,1671,5897,6194,18209,6674,403,544],"class_list":["post-16058","post","type-post","status-publish","format-standard","hentry","category-malwarebytes","category-security","tag-amazon","tag-brooklyn","tag-cctv","tag-china","tag-facial-recognition","tag-hong-kong","tag-internet-of-things","tag-iot","tag-lfr","tag-microsoft","tag-orlando","tag-privacy","tag-san-francisco","tag-surveillanceware","tag-uk","tag-united-states","tag-us"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/16058","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=16058"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/16058\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=16058"}],"wp:term":[{"taxonomy":"category","embeddable":true,"hre
f":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=16058"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=16058"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}