Concerned about what's happening on Twitter?
Check out the Nexus of Privacy's Dreamwidth community – or follow us on Mastodon!
Dr. Apryl Williams on Panopto (yalelaw.hosted.panopto.com)
"Karen memes constitute public surveillance of Black bodies by white women. This is no different than the racialized surveillance Black people have always experienced. It’s just a new way of talking about it. Talking about it as a meme allows it to reach new publics."
The Yale Information Society Project's live-tweet thread has some of the highlights.
Lorrie Faith Cranor on Communications of the ACM (cacm.acm.org)
We're used to nutrition labels on food. Why not take the same approach to apps? It's an attractive idea, but current implementations fall short. CMU professor Lorrie Faith Cranor, who's researched short, standardized privacy notices for years, looks at today's Android and iOS privacy labels and concludes that they "confuse developers and end users." Cranor includes a set of recommendations to improve the labels. Don't hold your breath though. If Apple and Google really cared about giving people good information, they'd have responded to the well-publicized confusing nature of the labels long ago. It's almost like they care more about looking like they're doing something to help users than about actually helping users.
Today at 1:30 pm Pacific: Viral Justice: Pandemics, Policing, and Public Bioethics.
Register on Zoom (columbiacuimc.zoom.us)
Join us for our first 2022-23 Ethics Grand Rounds, “Viral Justice: Pandemics, Policing, and Public Bioethics,” by Ruha Benjamin, PhD, Professor of African American Studies at Princeton University and Founding Director of the Ida B. Wells Just Data Lab.
In this talk, Ruha Benjamin examines the twin crises of COVID-19 and police violence, mapping the multiple vectors through which racism gets under the skin and into the bloodstream, attacking our bodies and body politic. She offers a vision of change, viral justice, as a practical and principled approach to transmuting a hostile racial climate into one that is more habitable, hopeful, and just.
Monday, November 7: #PoliceAbuseTech: No Tech 4 Tyrants Report Launch
Sign up on Eventbrite (eventbrite.com)
Among global movements to reckon with police powers, NT4T's new report "Surveillance Tech Perpetuates Police Abuse of Power" looks at how police use surveillance technology to abuse their power in the UK and globally.
Thursday November 10: Limits of Decolonial Debate in Internet Studies - DigiLabour Online Roundtable.
Register on Zoom (digilabour-br.zoom.us)
Continuing the discussion started at the AoIR conference, this online roundtable analyzes the limits of the decolonial approach in internet studies. What are the pitfalls of the decolonial approach, particularly in terms of being co-opted by the colonizers as a brand? What is missing from the decolonial debate in internet studies? How can we make sense of the oversized role played by institutions from the Global North in this debate? Starting from these and other questions, this roundtable will bring together researchers to share their views and engage in a constructive debate.
Participants include Abeba Birhane (Mozilla Foundation / University College Dublin), Tarcizio Silva (Mozilla Foundation/ Federal University of ABC, Brazil), Anita Gurumurthy (IT for Change), Syed Mustafa Ali (Open University), Cheryll Soriano (Manila University), Rigoberto Lara Guzmán (Data & Society), and Walter Lippold (Rio Grande do Sul Institute)
Federal privacy legislation
The Facial Recognition Act: A Promising Path to Put Guardrails on a Dangerously Unregulated Surveillance Technology
Jake Laperruque on Lawfare (lawfareblog.com)
There’s consensus in Congress that facial recognition needs to be reined in, but not nearly enough action to bring about effective rules. A new bill could jump-start the debate and move the nation toward the comprehensive set of limits that are needed.
Kendra Albert on SSRN (papers.ssrn.com)
Much ink has been spilled in defense of or against Section 230. But most scholars do not cite Section 230 as an example of abolition of the police state or prison industrial complex despite the fact that Section 230 may represent the largest single carveout of people and entities from state criminal liability in United States history.
FTC Brings Action Against Ed Tech Provider Chegg for Careless Security that Exposed Personal Data of Millions of Customers
Federal Trade Commission (ftc.gov)
Chegg, a homework help app, exposed the data of 40 million users, including details about some students’ sexual orientation and religion. That's only one of a series of security problems for Chegg, which suffered four breaches in four years. In its settlement with the FTC, Chegg has agreed to lock the door after the data has been leaked and adopt a comprehensive security program.
- F.T.C. Accuses Ed Tech Firm Chegg of ‘Careless’ Data Security, Natasha Singer on NYTimes (nytimes.com)
- FTC Chair Lina Khan has a short Twitter thread summarizing the order
Across the pond ...
Khari Johnson on WIRED (wired.com)
The Digital Markets Act will force big tech platforms to break open their walled gardens in 2023, says the EU’s new ambassador to Silicon Valley.
Joris van Hoboken on Verfassungsblog (verfassungsblog.de)
The Digital Services Act (DSA) was finally published in the Official Journal of the European Union on 27 October 2022. This publication marks the end of a years-long drafting and negotiation process, and opens a new chapter: that of its enforcement, practicable access to justice, and potential to set global precedents.
Tara on GDPR Beetle (gdprbeetle.eu)
Discover new data protection cases, pending at the EU Court of Justice.
Annie Greenley-Giudici on TrustArc Privacy Blog (trustarc.com)
When does the GDPR apply to your data collection activities? Review three common misconceptions about how the GDPR applies.
Jason Leopold, Katrina Manson, and William Turton on Bloomberg (bloomberg.com)
An “experienced” analyst working at the National Security Agency developed a surveillance project about a decade ago that resulted in the unauthorized targeting and collection of private communications of people or organizations in the US, newly unearthed documents show. An investigation into the matter, which hasn’t been previously reported, found that the analyst “acted with reckless disregard” and violated numerous rules and possibly the law, according to a 2016 report by the NSA’s Office of Inspector General.
Online age-verification system could create ‘honeypot’ of personal data and pornography-viewing habits, privacy groups warn
Josh Taylor on The Guardian (theguardian.com)
As the government develops online safety guidelines, digital rights groups say any approach requiring the use of ID is ‘invasive and risky’
Justin Hendrix on Tech Policy Press (techpolicy.press)
Prof. Citron's “The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age” was published in October by W.W. Norton and Penguin Vintage UK. It's especially timely in the aftermath of the Dobbs decision.
- In her new book, Danielle Keats Citron makes the case for the ‘right to intimate privacy’, Tonya Riley on CyberScoop (cyberscoop.com)
- The Lawfare Podcast: Danielle Citron on Intimate Privacy and How to Preserve It in a Digital Age, Jen Patja Howell on Lawfare (lawfareblog.com)
- On protecting intimate privacy: A chat with Danielle Citron, Jedidiah Bracy on International Association of Privacy Professionals (iapp.org)
Alex Bellos on The Guardian (theguardian.com)
Techniques which allow the sharing of data whilst keeping it secure may revolutionise fields from healthcare to law enforcement
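The privacy-enhancing technologies the article surveys include things like secret sharing, where data is split so that no single party holds the real value. As a rough illustration (not tied to any specific system in the article; the modulus and field names here are our own), additive secret sharing splits a number into random shares that individually reveal nothing, yet sums can still be computed on the shares:

```python
import secrets

MOD = 2**61 - 1  # a large prime modulus; arbitrary choice for this sketch

def split(value, n=3):
    """Split `value` into n additive shares; any n-1 shares look random."""
    shares = [secrets.randbelow(MOD) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

def combine(shares):
    """Recombine all shares to recover the original value."""
    return sum(shares) % MOD

# Reconstruction recovers the secret exactly.
salary = 72000
assert combine(split(salary)) == salary

# Each party can add its shares of two secrets locally; combining the
# resulting shares yields the sum without anyone seeing either input.
a, b = split(100), split(250)
sum_shares = [(x + y) % MOD for x, y in zip(a, b)]
assert combine(sum_shares) == 350
```

Real deployments (secure multiparty computation, homomorphic encryption) are far more involved, but this is the basic trick that lets parties compute on data they can't individually read.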
Rachel Eisen on TechNative (technative.io)
Five states — California, Colorado, Connecticut, Utah, and Virginia — have enacted comprehensive consumer data privacy laws, and Massachusetts is expected to pass the Massachusetts Information Privacy and Security Act (MIPSA). While these regulations appear to hamper marketers, using data consumers give willingly has advantages: it allows one-to-one personalization and gives consumers more control over their online shopping interactions without sacrificing privacy. That’s why zero- and first-party data will grow increasingly important for marketers’ efforts to humanize digital interactions that drive long-term customer value.
Michela Meister on SSRN (papers.ssrn.com)
Reproductive rights in the United States are under threat, and the threat is growing more serious by the day.
Eileen Yu on ZDNET (zdnet.com)
Government says it will push up maximum fines for serious or repeated data privacy breaches to AU$50 million, up from the current AU$2.22 million, in a move that follows a spate of cybersecurity incidents that compromised customer data, including Medibank.
Howard Solomon on IT World Canada (itworldcanada.com)
As Canada’s public and private sectors launch new digital identity programs, federal, provincial, and territorial regulators say rights to privacy and transparency must be fully respected throughout their design and operation. “The development and implementation of a digital ID ecosystem is a tremen…
Chris Burt on BiometricUpdate.com (biometricupdate.com)
Data privacy regulators agree to a resolution that personal information should be used for facial recognition only in accordance with a set of six principles.
Mack DeGeurin on Gizmodo (gizmodo.com)
The watchdog’s deputy commissioner warned the tech wasn’t backed by science and that any benefits were outweighed by the risks.
Ashwin Krishnan on TechTarget (techtarget.com)
Metaverse privacy is a moving target. Learn about the main data privacy concerns the metaverse poses and how to mitigate them.
Georgetown Law Welcomes Natalie Roisman as New Executive Director of its Institute for Technology Law & Policy
on Georgetown Law (law.georgetown.edu)
Leading technology attorney Natalie Roisman joins the Institute for Technology Law & Policy at Georgetown Law this week as its new Executive Director. Roisman, a veteran practitioner and former president of the Federal Communications Bar Association (FCBA), will bring extensive experience on technol…
Daniel Berrick and Jameson Spivack on Future of Privacy Forum (fpf.org)
Today’s virtual (VR), mixed (MR), and augmented (AR) reality environments, collectively known as extended reality (XR), are powered by the interplay of multiple sensors, large volumes and varieties of data, and various algorithms and automated systems, such as machine learning (ML). These complex relationships enable functions like gesture-based controls and eye tracking, without which XR experiences would be less immersive or unable to function at all. However, these experiences often depend on sensitive personal information, and the collection, processing, and transfer of this data to other parties may pose privacy and data protection risks to both users and bystanders.
ALSO: Understanding Extended Reality Technology & Data Flows, the FPF infographic discussed in the blog post
Eileen Yu on ZDNET (zdnet.com)
Southeast Asia’s digital economy is projected to reach $200 billion in gross merchandise value this year, but market players will need to understand consumer behaviour to sustain growth as adoption of online services matures across the region.
Mike Morper, Virtru on VentureBeat (venturebeat.com)
Best practices for implementing zero trust data control for better data protection, wherever your data resides.
On Lifewire (lifewire.com)
Imagine your webcam recording your face the whole time you’re using your computer or phone, and giving that data to a company like Facebook. Welcome to the future of AR and VR.
Tom Jackson on Disrupt Africa (disrupt-africa.com)
Cape Town-based “privacy-by-design” startup Omnisient has raised an undisclosed amount of funding from old and new investors to support its continued international expansion and further develop its platform.
Odia Kagan on JD Supra (jdsupra.com)
Sixth in a series of articles on the Colorado Privacy Act draft rules.
Joao-Pierre S. Ruth on InformationWeek (informationweek.com)
Brian Ebert, Lokker advisory board member and former chief of staff of the Secret Service, discusses findings and issues companies may face on data privacy.
Sebastian Klovig Skelton on ComputerWeekly.com (computerweekly.com)
The massive proliferation of ethical frameworks for artificial intelligence has done little to change how the technology is developed and deployed, with experts questioning the tech industry’s commitment to making it a positive social force.