{"id":10282,"date":"2017-11-04T02:30:01","date_gmt":"2017-11-04T10:30:01","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2017\/11\/04\/news-4055\/"},"modified":"2017-11-04T02:30:01","modified_gmt":"2017-11-04T10:30:01","slug":"news-4055","status":"publish","type":"post","link":"https:\/\/www.palada.net\/index.php\/2017\/11\/04\/news-4055\/","title":{"rendered":"Critics are wrong to slam iPhone X\u2019s new face tech"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/images.techhive.com\/images\/article\/2016\/01\/intel.web.368.207-100639596-primary.idge.jpg\"\/><\/p>\n<p><strong>Credit to Author: Mike Elgan| Date: Sat, 04 Nov 2017 03:00:00 -0700<\/strong><\/p>\n<p>Apple\u2019s new iPhone X reads faces. And privacy pundits are gnashing their teeth over it.<\/p>\n<p>The phone\u2019s complex TrueDepth image system includes an infrared projector, which casts 30,000 invisible dots, and an infrared camera, which checks where in three-dimensional space those dots land. With a face in view, artificial intelligence on the phone figures out what\u2019s going on with that face by processing locations of the dots.<\/p>\n<p>Biometrics in general and face recognition in particular are touchy subjects among privacy campaigners. Unlike a password, you can\u2019t change your fingerprints \u2014 or face.<\/p>\n<p>Out of the box, the iPhone X\u2019s face-reading system does three jobs: Face ID (security access), Animoji (avatars that mimic users\u2019 facial expressions), and also something you might call \u201ceye contact,\u201d to figure out if the user is looking at the phone (to prevent sleep mode during active use).<\/p>\n<p>A.I. looks at the iPhone X\u2019s projected infrared dots and, depending on the circumstances, can check: Is this the authorized user? Is the user smiling? 
Is the user looking at the phone?<\/p>\n<p>Privacy advocates rightly applaud Apple because Face ID happens securely on the phone \u2014 face data isn\u2019t uploaded to the cloud where it could be hacked and used for other purposes. And Animoji and \u201ceye contact\u201d don\u2019t involve face recognition.<\/p>\n<p>Criticism is reserved for Apple\u2019s policy of granting face-data access to third-party developers, according to <a href=\"http:\/\/www.reuters.com\/article\/us-apple-iphone-privacy-analysis\/app-developer-access-to-iphone-x-face-data-spooks-some-privacy-experts-idUSKBN1D20DZ\" rel=\"nofollow\">a Reuters piece<\/a> published this week.<\/p>\n<p>That data roughly includes where parts of the face are (the eyes, mouth, etc.), as well as rough changes in the state of those parts (eyebrows raised, eyes closed and others). Developers can program apps to use this data in real time, and also store the data on remote servers.<\/p>\n<p>The controversy raises a new question in the world of biometric security: Do facial expression and movement constitute user data or personal information that should be protected in the same way that, say, location data or financial records should be?<\/p>\n<p>I\u2019ll give you my answer below. But first, here\u2019s why it really matters.<\/p>\n<p>The rise of machine learning and A.I. means that over time, face recognition, which is already very accurate, will become close to perfect. 
As a result, it will be used everywhere, possibly replacing passwords, fingerprints and even driver\u2019s licenses and passports for how we determine or verify who\u2019s who.<\/p>\n<p>That\u2019s why it\u2019s important that we start rejecting muddy thinking about face-detection technologies, and instead learn to think clearly about them.<\/p>\n<p>Here\u2019s how to think clearly about face tech.<\/p>\n<p>Face recognition is one way to identify exactly who somebody is.<\/p>\n<p>As I detailed in <a href=\"https:\/\/www.computerworld.com\/article\/3182269\/emerging-technology\/its-time-to-face-the-ugly-reality-of-face-recognition.html\">this space<\/a>, face recognition is potentially dangerous because people can be recognized at far distances and also online through posted photographs. That\u2019s a potentially privacy-violating combination: Take a picture of someone in public from 50 yards away, then run that photo through online face-recognition services to find out who they are and get their home address, phone number and a list of their relatives. It takes a couple of minutes, and anybody can do it free. This already exists.<\/p>\n<p>Major Silicon Valley companies such as Facebook and Google routinely scan the faces in hundreds of billions of photos and allow any user to identify or \u201ctag\u201d family and friends without permission of the person tagged.<\/p>\n<p>In general, people should be far more concerned about face-recognition technologies than any other kind.<\/p>\n<p>It\u2019s important to understand that other technologies, processes or applications are almost always used in tandem with face recognition. And this is also true of Apple\u2019s iPhone X.<\/p>\n<p>For example, Face ID won\u2019t unlock an iPhone unless the user\u2019s eyes are open. That\u2019s not because the system can\u2019t recognize a person whose eyes are closed. It can. The reason is that A.I. 
capable of figuring out whether eyes are open or closed is separate from the system that matches the face of the authorized user with the face of the current user. Apple deliberately chose to disable Face ID unlocking when the eyes are closed to prevent unauthorized phone unlocking by somebody holding the phone in front of a sleeping or unconscious authorized user.<\/p>\n<p>Apple also uses this eye detector to prevent sleep mode on the phone during active use, and that feature has nothing to do with recognizing the user (it will work for anyone using the phone).<\/p>\n<p>In other words, the ability to authorize a user and the ability to know whether a person\u2019s eyes are open are completely separate and unrelated abilities that use the same hardware.<\/p>\n<p>Which brings us back to the point of controversy: Is Apple allowing app developers to violate user privacy by sharing face data?<\/p>\n<p>Critics lament Apple\u2019s policy of enabling third-party developers to receive face data harvested by the TrueDepth image sensors. They can gain that access in apps by using Apple\u2019s ARKit, and the specific new face-related tools therein.<\/p>\n<p>The tools allow the building of apps that can know the position of the face, the direction of the lighting on the face and also facial expression.<\/p>\n<p>The purpose of this policy is to allow developers to create apps that can place goofy glasses on a user\u2019s face (or fashionable glasses to try on at an online eyewear store\u2019s website), or any number of other apps that can react to head motion and facial expression. Characters in multiplayer games will appear to frown, smile and talk in an instant reflection of the players\u2019 actual facial activity. Smiling while texting may result in the option to post a smiley face emoji.<\/p>\n<p>Apple\u2019s policies are restrictive. 
App developers can\u2019t use the face features without user permission, nor can they use them for advertising, marketing or sales to third-party companies. They can\u2019t use face data to create user profiles that could identify otherwise anonymous users.<\/p>\n<p>The facial expression data is pretty crude. It can\u2019t tell apps what the person looks like. For example, it can\u2019t tell the relative size and position of resting facial features such as eyes, eyebrows, noses and mouths. It can, however, tell changes in position. For example, if both eyebrows rise, it can send a crude, binary indication that, yes, both eyebrows went up.<\/p>\n<p>The question to be answered here is: Does a change in the elevation of eyebrows constitute personal user data? For example, if an app developer leaks the fact that on Nov. 4, 2017, Mike Elgan raised his left eyebrow, has my privacy been violated? What if they added that the eyebrow raising was associated with a news headline I just read or a tweet by a politician?<\/p>\n<p>That sounds like the beginning of a privacy violation. There\u2019s just one problem. They can\u2019t really know it\u2019s me \u2014 they just know that someone claiming to be me registered for their app, and that later a human face raised an eyebrow. I might have handed my phone to a nearby 5-year-old, for all they know. Also, they don\u2019t know what the eyebrow was reacting to. Was it something on screen? Or maybe somebody in the room said something to elicit that reaction.<\/p>\n<p>The eyebrow data is not only useless, it\u2019s also unassociated with both an individual person and the source of the reaction. Oh, and it\u2019s boring. Nobody would care. 
It\u2019s junk data for anyone interested in profiling or exploiting the public.<\/p>\n<p>Technopanic about leaked eyebrow-raising obscures the real threat of privacy violation by irresponsible or malicious face recognition.<\/p>\n<p>That\u2019s why I come not to bury Apple, but to praise it.<\/p>\n<p>Face recognition will prove massively useful and convenient for corporate security. The most obvious use is replacing keycard door access with face recognition. Instead of swiping a card, just saunter right in with even better security (keycards can be stolen and spoofed).<\/p>\n<p>This security can be extended to vehicles, machinery and mobile devices as well as to individual apps or specific corporate datasets.<\/p>\n<p>Best of all, the face recognition can be accompanied by peripheral A.I. applications that make it really work. For example, is a second, unauthorized person trying to come in when the door opens? Is the user under duress? Under the influence of drugs, or falling asleep?<\/p>\n<p>I believe great, secure face recognition could be one answer to the BYOD security problem, which still hasn\u2019t been solved. Someday soon enterprises could forget about authorizing devices, and instead authorize users on an extremely granular basis (down to individual documents and applications).<\/p>\n<p>Face recognition will benefit everyone, if done right. Or it will contribute to a world without privacy, if done wrong.<\/p>\n<p>Apple is doing it right.<\/p>\n<p>Apple\u2019s approach is to radically separate the parts of face scanning. Face ID deals not in \u201cpictures,\u201d but in math. The face scan generates numbers, which are crunched by A.I. to determine whether the person now facing the camera is the same person who registered with Face ID. That\u2019s all it does.<\/p>\n<p>The scanning, the generation of numbers, the A.I. 
for judging whether there\u2019s a match and all the rest happens on the phone itself, and the data is encrypted and locked on the phone.<\/p>\n<p>It\u2019s not necessary to trust that Apple would prevent a government or hacker from using Face ID to identify a suspect or dissident or target. Apple is simply unable to do that.<\/p>\n<p>Meanwhile, the features that detect changes in facial expression and whether the eyes are open are super useful, and users can enjoy apps that implement these features without fear of privacy violation.<\/p>\n<p>Instead of slamming Apple for its new face tech, privacy advocates should be raising awareness about the risks we face with irresponsible face recognition.<\/p>\n<p><a href=\"https:\/\/www.computerworld.com\/article\/3236244\/mobile-wireless\/critics-are-wrong-to-slam-iphone-x-s-new-face-tech.html#tk.rss_security\" target=\"bwo\" >http:\/\/www.computerworld.com\/category\/security\/index.rss<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/images.techhive.com\/images\/article\/2016\/01\/intel.web.368.207-100639596-primary.idge.jpg\"\/><\/p>\n<p><strong>Credit to Author: Mike Elgan| Date: Sat, 04 Nov 2017 03:00:00 -0700<\/strong><\/p>\n<article>\n<section class=\"page\">\n<p>Apple\u2019s new iPhone X reads faces. And privacy pundits are gnashing their teeth over it.<\/p>\n<p>The phone\u2019s complex TrueDepth image system includes an infrared projector, which casts 30,000 invisible dots, and an infrared camera, which checks where in three-dimensional space those dots land. With a face in view, artificial intelligence on the phone figures out what\u2019s going on with that face by processing locations of the dots.<\/p>\n<p>Biometrics in general and face recognition in particular are touchy subjects among privacy campaigners. 
Unlike a password, you can\u2019t change your fingerprints \u2014 or face.<\/p>\n<aside class=\"fakesidebar\"><strong>[ Further reading: <a href=\"https:\/\/www.computerworld.com\/article\/3235140\/apple-ios\/what-is-face-id-apples-new-facial-recognition-tech-explained.html?nsdr=true\">What is Face ID? Apple\u2019s facial recognition tech explained<\/a> ]<\/strong><\/aside>\n<p>Out of the box, the iPhone X\u2019s face-reading system does three jobs: Face ID (security access), Animoji (avatars that mimic users\u2019 facial expressions), and also something you might call \u201ceye contact,\u201d to figure out if the user is looking at the phone (to prevent sleep mode during active use).<\/p>\n<p class=\"jumpTag\"><a href=\"\/article\/3236244\/mobile-wireless\/critics-are-wrong-to-slam-iphone-x-s-new-face-tech.html#jump\">To read this article in full or to leave a comment, please click here<\/a><\/p>\n<\/section>\n<\/article>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[11062,10643],"tags":[10554,5897,714,11094],"class_list":["post-10282","post","type-post","status-publish","format-standard","hentry","category-computerworld","category-independent","tag-mobile","tag-privacy","tag-security","tag-smartphones"],"_links":{"self":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/10282","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=10282"}],"version-history"
:[{"count":0,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/10282\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=10282"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=10282"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=10282"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}