{"id":18356,"date":"2022-02-24T14:10:10","date_gmt":"2022-02-24T22:10:10","guid":{"rendered":"http:\/\/www.palada.net\/index.php\/2022\/02\/24\/news-12089\/"},"modified":"2022-02-24T14:10:10","modified_gmt":"2022-02-24T22:10:10","slug":"news-12089","status":"publish","type":"post","link":"http:\/\/www.palada.net\/index.php\/2022\/02\/24\/news-12089\/","title":{"rendered":"\u201cEthnicity recognition\u201d tool listed on surveillance camera app store built by fridge-maker\u2019s video analytics startup"},"content":{"rendered":"<p><strong>Credit to Author: David Ruiz| Date: Tue, 22 Feb 2022 23:37:51 +0000<\/strong><\/p>\n<p>The <a href=\"https:\/\/store.azena.com\/shop\/p\/A_00140002\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">bizarre promotional video<\/a> promises \u201cFace analysis based on best of breed Artificial Intelligence algorithms for Business Intelligence and Digital Signage applications.\u201d What follows is footage of a woman pushing her hair behind her ears, a man grimacing and baring his teeth, and an actor in a pinstripe suit being slapped in the face against a green screen. Digitally overlaid on each person\u2019s face are colored outlines of rectangles with supposed measurements displayed: \u201cF 25 happiness,\u201d \u201ccaucasian_latin,\u201d \u201cM 38 sadness.\u201d<\/p>\n<p>The commercial reel advertises just one of the many video analytics tools available for download on an app store monitored by the Internet of Things startup Azena, itself a project from the German kitchen appliance maker Bosch.<\/p>\n<p>Bosch, known more for its line of refrigerators, ovens, and dishwashers, also develops and sells an entire suite of surveillance cameras. 
Those surveillance cameras have become increasingly \u201csmart,\u201d <a href=\"https:\/\/theintercept.com\/2022\/02\/11\/surveillance-video-ai-bosch-azena\/\">according to recent reporting from The Intercept<\/a>, and to better equip those cameras with smart capabilities, Bosch has tried to emulate the success of the smartphone\u2014offering an app store through Azena where users can download and install new, developer-created tools onto Bosch camera hardware.<\/p>\n<p>According to Bosch and Azena, the apps are safe, the platform is secure, and the entire project is innovative.<\/p>\n<p>\u201cI think we\u2019re just at the beginning of our development of what we can use video cameras for,\u201d said Azena CEO Hartmut Schaper, in speaking with The Intercept.<\/p>\n<h3><strong>Facial recognition\u2019s flaws<\/strong><\/h3>\n<p>Many of the available apps on the Azena app store claim to provide potentially useful analytics, like alerting users when fire or smoke is detected, monitoring when items are out of stock on shelves, or checking for unattended luggage at an airport. But others veer into the realm of pseudo-science, claiming to be able to scan video footage to detect signs of \u201cviolence and street fighting,\u201d and, as The Intercept reported, offering up \u201cethnicity detection, gender recognition, face recognition, emotion analysis, and suspicious behavior detection.\u201d<\/p>\n<p>Such promises about video analysis have flooded the market for years, but their accuracy has always been suspect.<\/p>\n<p>In 2015, the image recognition algorithm rolled out in Google Photos <a href=\"https:\/\/www.theverge.com\/2018\/1\/12\/16882408\/google-racist-gorillas-photo-recognition-algorithm-ai\">labeled Black people as gorillas<\/a>. 
In 2018, the organization Big Brother Watch found that the facial recognition technology rolled out by the UK\u2019s Metropolitan Police at the Notting Hill carnival registered a <a href=\"https:\/\/www.theguardian.com\/uk-news\/2018\/may\/15\/uk-police-use-of-facial-recognition-technology-failure\">mismatch 98 percent of the time<\/a>. And in the same year, the American Civil Liberties Union scanned the face of every US Congress member against a database of alleged criminal mugshots using Amazon\u2019s own facial recognition technology and found that the technology made 28 erroneous matches.<\/p>\n<p>When it comes to analyzing video footage to produce more nuanced results, like emotional states or an unfounded calculation of \u201csuspicion,\u201d the results are equally bad.<\/p>\n<p>According to a <a href=\"https:\/\/www.article19.org\/emotion-recognition-technology-report\/\">recent report from the organization Article 19<\/a>, which seeks to maintain global freedom of expression, \u201cemotion recognition technology is often pseudoscientific and carries enormous potential for harm.\u201d<\/p>\n<p>One need look no further than the promotional video described earlier. In the span of less than one second, the actor being slapped in the face goes from being measured as \u201ceast_asian\u201d and \u201cM 33 sadness\u201d to \u201ccaucasian_latin\u201d and \u201cM 37 sadness.\u201d<\/p>\n<p>Of equal concern are the security standards Azena has put in place on its app store.<\/p>\n<h3><strong>Security and quality concerns<\/strong><\/h3>\n<p>According to documentation viewed by The Intercept, Azena reviews prospective apps for their \u201cdata consistency\u201d and the company also \u201cperforms \u2018a virus check\u2019 before publishing to its app store. 
\u2018However,\u2019 reads the documentation, \u2018we do not perform a quality check or benchmark your app.\u2019\u201d<\/p>\n<p>That process is a little different from those of the Apple App Store and the Google Play Store.<\/p>\n<p>\u201cWhen it comes to Apple, there&#8217;s definitely more than just a virus scan,\u201d said Thomas Reed, director of Mac and Mobile at Malwarebytes. \u201cFrom what I understand, there&#8217;s a multi-step process designed to flag both App Store rule violations and malicious apps.\u201d<\/p>\n<p>That doesn\u2019t mean that junk apps don\u2019t end up on the Apple App Store, Reed said\u2014it just means that there\u2019s a known, public process about what types of apps are and are not allowed. And that same premise is true for the Google Play Store, as Google tries to ensure that submitted apps do not break an expansive set of policies meant to protect users from being scammed out of money, for example, or from invasive monitoring. In 2020, for instance, Google implemented <a href=\"https:\/\/www.zdnet.com\/article\/google-formally-bans-stalkerware-apps-from-the-play-store\/\">stricter controls against stalkerware-type applications<\/a>.<\/p>\n<p>According to The Intercept\u2019s reporting on Azena, though, the company\u2019s review process relies heavily on the compliance of its developers. The Intercept wrote:<\/p>\n<p>\u201cBosch and Azena maintain that their auditing procedures are enough to weed out problematic use of their cameras. 
In response to emailed questions, spokespeople from both companies explained that developers working on their platform commit to abiding by ethical business standards laid out by the United Nations, and that the companies believe this contractual obligation is enough to rein in any malicious use.<\/p>\n<p>At the same time, the Azena spokesperson acknowledged that the company doesn\u2019t have the ability to check how their cameras are used and doesn\u2019t verify whether applications sold on their store are legal or in compliance with developer and user agreements.\u201d<\/p>\n<p>The Intercept also reported that the operating system used on modern Bosch surveillance cameras could potentially be out of date. The operating system is a \u201cmodified version of Android,\u201d The Intercept reported, which feasibly means that Bosch\u2019s cameras could receive some of the same updates that Android receives. But when The Intercept asked a cybersecurity researcher to take a look at the updates that Azena has publicized, that researcher said the updates only accounted for vulnerabilities patched as late as 2019.<\/p>\n<p>In speaking with The Intercept, Azena\u2019s Schaper denied that his company is failing to install necessary security updates, and he explained that some of the vulnerabilities in the broader Android ecosystem may not apply to the cameras\u2019 operating system because of features that do not carry from one device to another, like Bluetooth connectivity.<\/p>\n<h3><strong>A bigger issue<\/strong><\/h3>\n<p>Malwarebytes Labs has written repeatedly about invasive surveillance\u2014from <a href=\"https:\/\/blog.malwarebytes.com\/stalkerware\/2019\/10\/how-to-protect-against-stalkerware-a-murky-but-dangerous-mobile-threat\/\">intimate partner abuse<\/a> to targeted government spying\u2014but the mundane work of security camera analysis often gets overlooked.<\/p>\n<p>It shouldn\u2019t.<\/p>\n<p>With the development of the Azena app platform and its many 
applications, an entire class of Internet of Things devices\u2014surveillance cameras\u2014has become a testing ground for video analysis tools that have little evidence to support their claims. Emotion recognition tools are nascent and largely unscientific. \u201cEthnicity recognition\u201d seems forever stuck in the past, plagued by earlier examples of when a <a href=\"https:\/\/www.zdnet.com\/article\/tech-racism-will-dark-skinned-gamers-have-trouble-with-kinects-facial-recognition\/\">video game console couldn\u2019t recognize dark-skinned players<\/a> and when a soap dispenser <a href=\"https:\/\/gizmodo.com\/why-cant-this-soap-dispenser-identify-dark-skin-1797931773\">famously failed to work for a Facebook employee in Nigeria<\/a>. And \u201csuspicious behavior\u201d detection relies on someone, somewhere, determining what \u201csuspicious\u201d is, without having to answer why they feel that way.<\/p>\n<p>Above all else, facial recognition itself has failed to prove effective, with multiple recent experiments showing embarrassing failure rates.<\/p>\n<p>This is not innovation. 
It\u2019s experimentation without foresight.<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/privacy-2\/2022\/02\/ethnicity-recognition-tool-listed-on-surveillance-camera-app-store-built-by-fridge-makers-video-analytics-startup\/\">\u201cEthnicity recognition\u201d tool listed on surveillance camera app store built by fridge-maker\u2019s video analytics startup<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes Labs<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p><strong>Credit to Author: David Ruiz| Date: Tue, 22 Feb 2022 23:37:51 +0000<\/strong><\/p>\n<p>A video analytics startup has built an app store for unproven facial recognition tools that can be loaded onto surveillance cameras.<\/p>\n<p>The post <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\/privacy-2\/2022\/02\/ethnicity-recognition-tool-listed-on-surveillance-camera-app-store-built-by-fridge-makers-video-analytics-startup\/\">\u201cEthnicity recognition\u201d tool listed on surveillance camera app store built by fridge-maker\u2019s video analytics startup<\/a> appeared first on <a rel=\"nofollow\" href=\"https:\/\/blog.malwarebytes.com\">Malwarebytes 
Labs<\/a>.<\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"colormag_page_container_layout":"default_layout","colormag_page_sidebar_layout":"default_layout","footnotes":""},"categories":[10488,10378],"tags":[16484,25092,25093,25094,14753,16352,5897,4053,25095,25096],"class_list":["post-18356","post","type-post","status-publish","format-standard","hentry","category-malwarebytes","category-security","tag-app-store","tag-azena","tag-bosch","tag-ethnicity-recognition","tag-facial-recognition","tag-google-play-store","tag-privacy","tag-surveillance","tag-surveillance-camera","tag-surveillance-cameras"],"_links":{"self":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/18356","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/comments?post=18356"}],"version-history":[{"count":0,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/posts\/18356\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/media?parent=18356"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/categories?post=18356"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.palada.net\/index.php\/wp-json\/wp\/v2\/tags?post=18356"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}