{"id":12834,"date":"2018-07-17T10:45:11","date_gmt":"2018-07-17T18:45:11","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2018\/07\/17\/news-6601\/"},"modified":"2018-07-17T10:45:11","modified_gmt":"2018-07-17T18:45:11","slug":"news-6601","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2018\/07\/17\/news-6601\/","title":{"rendered":"RealNetworks Launches Free Facial Recognition Tool for Schools"},"content":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5b4d3359f4e6546add647f4d\/master\/pass\/FacialRecog-Schools-Security-525409577.jpg\"\/><\/p>\n<p><strong>Credit to Author: Issie Lapowsky| Date: Tue, 17 Jul 2018 04:05:00 +0000<\/strong><\/p>\n<p><span class=\"lede\">Like many parents <\/span>in the United States, Rob Glaser has been thinking a lot lately about how to keep his kids from getting shot in school. Specifically, he\u2019s been thinking of what he can do that doesn\u2019t involve getting into a nasty and endless battle over what he calls \u201cthe g-word.\u201d<\/p>\n<p>It\u2019s not that Glaser opposes gun control. A steady Democratic donor, Glaser <a href=\"https:\/\/www.wired.com\/1997\/10\/progressive\/\">founded<\/a> the online streaming giant RealNetworks back in the 1990s as a vehicle for broadcasting left-leaning political views. It\u2019s just that any conversation about curbing gun rights in America tends to lead more to gridlock and finger-pointing than it does to action. \u201cI know my personal opinions aren\u2019t going to carry the day in this current political environment,\u201d Glaser says.<\/p>\n<p class=\"paywall\">So he started working on a solution that he believes will prove less divisive, and therefore more immediately actionable. Over the last two years, RealNetworks has developed a facial recognition tool that it hopes will help schools more accurately monitor who gets past their front doors. 
Today, the company launched a <a href=\"https:\/\/safr.ai\/\" target=\"_blank\">website<\/a> where school administrators can download the tool, called SAFR, for free and integrate it with their own camera systems. So far, one school in Seattle, which Glaser\u2019s kids attend, is testing the tool and the state of Wyoming is designing a pilot program that could launch later this year. \u201cWe feel like we\u2019re hitting something there can be a social consensus around: that using facial recognition technology to make schools safer is a good thing,\u201d Glaser says.<\/p>\n<p class=\"paywall\">But while Glaser\u2019s proposed fix may circumvent the decades-long fight over gun control in the US, it simultaneously positions him at the white-hot center of a newer, but still contentious, debate over how to balance privacy and security in a world that is starting to feel like a scene out of <em>Minority Report<\/em>. Groups like the Electronic Frontier Foundation, where Glaser is a former board member, have published a <a href=\"https:\/\/www.eff.org\/wp\/law-enforcement-use-face-recognition\" target=\"_blank\">white paper<\/a> detailing how facial recognition technology often misidentifies black people and women at higher rates than white men. Amazon\u2019s own employees have <a href=\"https:\/\/www.wired.com\/story\/why-tech-worker-dissent-is-going-viral\/\">protested<\/a> the use of its product Rekognition for law enforcement purposes. And just last week, Microsoft President Brad Smith <a href=\"https:\/\/www.wired.com\/story\/microsoft-calls-for-federal-regulation-of-facial-recognition\/\">called for federal regulation<\/a> of facial recognition technology, writing, \u201cThis technology can catalog your photos, help reunite families or potentially be misused and abused by private companies and public authorities alike.\u201d<\/p>\n<p>&#x27;This isn\u2019t just sci-fi. 
This is becoming something we, as a society, have to talk about.&#x27;<\/p>\n<p name=\"inset-left\" class=\"inset-left-component__el\">Rob Glaser, RealNetworks<\/p>\n<p class=\"paywall\">The issue is particularly fraught when it comes to children. After a school in Lockport, New York <a href=\"https:\/\/theintercept.com\/2018\/05\/30\/face-recognition-schools-school-shootings\/\" target=\"_blank\">announced<\/a> it planned to spend millions of dollars on facial recognition technology to monitor its students, the <a href=\"https:\/\/www.nyclu.org\/en\/press-releases\/nyclu-urges-state-block-facial-recognition-technology-lockport-schools\" target=\"_blank\">New York Civil Liberties Union<\/a> and the Legal Defense Fund voiced concerns that increased surveillance of kids might <a href=\"https:\/\/www.nytimes.com\/2018\/04\/04\/us\/politics\/racial-bias-school-discipline-policies.html\" target=\"_blank\">amplify existing biases<\/a> against students of color, who may already be over-policed at home and in school.<\/p>\n<p class=\"paywall\">&quot;The use of facial recognition in schools creates an unprecedented level of surveillance and scrutiny,&quot; says John Cusick, a fellow at the Legal Defense Fund. &quot;It can exacerbate racial disparities in terms of how schools are enforcing disciplinary codes and monitoring their students.&quot;<\/p>\n<p class=\"paywall\">Glaser, who says he is a \u201ccard-carrying member of the ACLU,\u201d is all too aware of the risks of facial recognition technology being used improperly. That\u2019s one reason, in fact, why he decided to release SAFR to schools first. \u201cIn my view when you put tech in the market, the right thing to do is to figure out how to steer it in good directions,\u201d he says.<\/p>\n<p class=\"paywall\">\u201cI personally agree you can overdo school surveillance. 
But I also agree that, in a country where there have been so many tragic incidents in schools, technology that makes it easier to keep schools safer is fundamentally a good thing.\u201d<\/p>\n<p class=\"paywall\">RealNetworks began developing the technology underpinning SAFR shortly after Glaser returned from a three-year hiatus. He hoped to <a href=\"https:\/\/www.wired.com\/2015\/05\/rob-glaser-gets-real\/\">reinvent the company<\/a>, a pioneer of the PC age, to compete in the mobile, cloud computing era. RealNetworks\u2019 first major product launch with Glaser back at the helm was a photo storing and sharing app called RealTimes. Initially, the facial recognition technology was meant to help the RealTimes app identify people in photos. But Glaser acknowledges that RealTimes \u201cwas not that big a success,\u201d given the dominance of companies like Google and Facebook in the space. Besides, he was beginning to see how the technology his team had developed could be used to address a far more pressing and still unsolved problem.<\/p>\n<p class=\"paywall\">Glaser approached the administrators at his children\u2019s school in Seattle, University Child Development School, which had just installed a gate and camera system, and asked if they might try using SAFR to monitor parents, teachers, and other visitors who come into the school. The school would ask adults, not kids, to register their faces with the SAFR system. After they registered, they\u2019d be able to enter the school by smiling at a camera at the front gate. (Smiling tells the software that it\u2019s looking at a live person and not, for instance, a photograph.) If the system recognizes the person, the gates automatically unlock. If not, they can enter the old-fashioned way by ringing the receptionist.<\/p>\n<p class=\"paywall\">According to head of school Paula Smith, the feedback from parents was positive, though only about half of them opted in to register their faces with the system. 
The school is approaching the technology with a light touch. It deliberately decided not to allow its students, who are all younger than 11, to participate, for instance. \u201cI think it has to be a decision that\u2019s very thoughtfully made,\u201d Smith says of using this technology on kids. Today, University Child Development School uses SAFR\u2019s age filter to prevent children from registering themselves. The software can predict a person&#x27;s age and gender, enabling schools to turn off access for people below a certain age. But Glaser notes that if other schools want to register students going forward, they can.<\/p>\n<p class=\"paywall\">Each face logged by SAFR gets a unique, encrypted hash that\u2019s stored on local servers at the school. Today, Glaser says it&#x27;s technically unfeasible to share that data from one site with another, because the hashes wouldn&#x27;t be compatible with other systems. But that may change going forward, Glaser says. If, for instance, a school system wanted to deploy SAFR to all of its schools, the company may allow data to flow between them.<\/p>\n<p>&#x27;It&#x27;s tempting to say there&#x27;s a technological solution, that we&#x27;re going to find the dangerous people, and we&#x27;re going to stop them.&#x27;<\/p>\n<p name=\"inset-left\" class=\"inset-left-component__el\">Rachel Levinson-Waldman, Brennan Center<\/p>\n<p class=\"paywall\">For now, RealNetworks doesn\u2019t require schools to adhere to any specific terms about how they use the technology. The brief approval process requires only that they prove to RealNetworks that they are, in fact, a school. After that, the schools can implement the software on their own. 
There are no guidelines about how long the facial data gets stored, how it\u2019s used, or whether people need to opt in to be tracked.<\/p>\n<p class=\"paywall\">That&#x27;s concerning, says Rachel Levinson-Waldman, senior counsel to the Brennan Center&#x27;s Liberty and National Security Program. &quot;Facial recognition technology can be an added danger if there aren&#x27;t well-articulated guidelines about its use,&quot; she says.<\/p>\n<p class=\"paywall\">Schools could, for instance, use facial recognition technology to monitor who&#x27;s associating with whom and discipline students differently as a result. &quot;It could criminalize friendships,&quot; says Cusick of the Legal Defense Fund.<\/p>\n<p class=\"paywall\">Glaser acknowledges the company will have to develop some clearer terms as it amasses more users. That\u2019s especially true if it begins branching out to other types of customers, including law enforcement agencies, a market Glaser is not ruling out. But he says the company is still figuring out whether it will implement strict user guidelines for schools or simply offer \u201cgentle encouragement\u201d about how SAFR should be used.<\/p>\n<p class=\"paywall\">There are also questions about the accuracy of facial recognition technology, writ large. SAFR boasts a 99.8 percent overall accuracy rating, based on a test, created by the University of Massachusetts, that vets facial recognition systems. But Glaser says the company hasn\u2019t tested whether the tool is as good at recognizing black and brown faces as it is at recognizing white ones. RealNetworks deliberately opted not to have the software proactively predict ethnicity, the way it predicts age and gender, for fear of it being used for racial profiling. Still, testing the tool&#x27;s accuracy among different demographics is key. 
<a href=\"https:\/\/www.wired.com\/story\/photo-algorithms-id-white-men-fineblack-women-not-so-much\/\">Research has shown<\/a> that many top facial recognition tools are particularly bad at recognizing black women. Glaser notes, however, that the algorithm was trained using photos from countries around the world and that the team has yet to detect any such \u201cglitches.\u201d Still, the fact that SAFR is hitting the market with so many questions still to be ironed out is one reason why experts say the government needs to step in to regulate the use cases and efficacy of these tools.<\/p>\n<p class=\"paywall\">&quot;This technology needs to be studied, and any regulation that\u2019s being considered needs to factor in people who have been directly impacted: students and parents,&quot; Cusick says.<\/p>\n<p class=\"paywall\">If all schools were to use SAFR the way it&#x27;s being used in Seattle\u2014to allow parents who have explicitly opted into the system to enter campus\u2014it seems less likely to do much harm. The question is whether it will do any good. This sort of technology, Levinson-Waldman points out, wouldn&#x27;t have stopped the many school shootings that have, with a few high-profile exceptions like the shooting in Parkland, Florida, been perpetrated by students who had every right to be inside the classrooms they shot up. &quot;It&#x27;s tempting to say there&#x27;s a technological solution, that we&#x27;re going to find the dangerous people, and we&#x27;re going to stop them,&quot; she says. &quot;But I do think a large part of that is grasping at straws.&quot;<\/p>\n<p class=\"paywall\">Glaser, for one, welcomes federal oversight of this space. He says it&#x27;s precisely because of his views on privacy that he wants to be part of what is bound to be a long conversation about the ethical deployment of facial recognition. \u201cThis isn\u2019t just sci-fi. This is becoming something we, as a society, have to talk about,\u201d he says. 
\u201cThat means the people who care about these issues need to get involved, not just as hand-wringers but as people trying to provide solutions. If the only people who are providing facial recognition are people who don\u2019t give a shit about privacy, that\u2019s bad.\u201d<\/p>\n<p><a href=\"https:\/\/www.wired.com\/story\/realnetworks-facial-recognition-technology-schools\" target=\"bwo\" >https:\/\/www.wired.com\/category\/security\/feed\/<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p><img decoding=\"async\" src=\"https:\/\/media.wired.com\/photos\/5b4d3359f4e6546add647f4d\/master\/pass\/FacialRecog-Schools-Security-525409577.jpg\"\/><\/p>\n<p><strong>Credit to Author: Issie Lapowsky| Date: Tue, 17 Jul 2018 04:05:00 +0000<\/strong><\/p>\n<p>A new facial recognition tool by RealNetworks aims to keep kids safe in school. 
But privacy experts fear the unchecked surveillance of kids could go awry.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10378,10607],"tags":[714],"class_list":["post-12834","post","type-post","status-publish","format-standard","hentry","category-security","category-wired","tag-security"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/12834","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=12834"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/12834\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=12834"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=12834"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=12834"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}