In case you haven't heard, there's a new Chief Twit. His other company is known for racial abuse of Black workers and software bugs, and as soon as he took over he fired Vijaya Gadde, the widely-respected head of Trust & Safety. What could possibly go wrong?
A new Chief Twit – and a big Twitter privacy issue
Privacy's only one of the many reasons the Twitter acquisition's a big deal. Unfortunately, most of the articles I've read provide a fairly limited range of perspectives. As Andre Brock says, "Black Twitter’s dominance over the platform has a LOT to do with how many tech pundits/liberal voices on here see Twitter as a ‘hell site’." I've collected a bunch of other reactions in a Twitter moment.
Sara Morrison on Vox (vox.com)
That said, privacy's certainly a big concern! Morrison notes that Twitter doesn't really allow you to delete data, and observes that the new owner – or whoever he designates – could certainly get at private direct messages if they wanted to. And that's not the only threat; firing Gadde is likely to lead to a lot of other departures from Twitter's security team, so it's a prime opportunity for intelligence agencies, organized crime, or teenagers to get access to data.
For somebody like me, this isn't a big deal. I've said a few things in DMs that might be a bit embarrassing if they were public, but nothing that would put me or anybody else at risk. But think about whistleblowers or activists who have shared information with journalists or other activists in Twitter DMs – and remember that some of the new Chief Twit's statements certainly make it seem like he's sucking up to countries like Russia and China.
Darius Kazemi on runyourown.social
As people consider moving off Twitter, it's worth thinking about setting up your own small social network. Kazemi's excellent overview lays out some general principles of running a small social network site, focusing more on community building than on specific technologies.
Running a social network site is community building first and a technical task second.
And while community building is hard work, it's often worth it.
This is my pitch to you: using big social media sites is easy, but you pay a steep price for it. You should consider running your own site, which is harder, but can be extremely rewarding.
Privacy after Roe
Jordan Famularo and Richmond Wong on Brookings (brookings.edu)
Steps the tech sector should take to prevent the misuse of data in abortion-related cases.
Jack Gillum on bloomberg.com
In the hours after the US Supreme Court overturned Roe v. Wade, federal authorities were monitoring social media to gather intelligence about nationwide protests and possible violence.
Jake Laperruque on Center for Democracy and Technology (cdt.org)
Fusion centers are law enforcement hubs that stockpile and share data to aid state and local investigations. At least 80 fusion centers operate across the country, with at least one in each state. They are owned and operated by the states and nearly all staff at facilities are from state and local agencies. However, fusion centers significantly rely on the federal government, which can leverage that reliance to ensure that its assistance is not used to facilitate abortion-related investigations.
This is one of a series of blog posts examining how the federal government could prevent federal surveillance assistance to state and local law enforcement from being used to investigate and prosecute reproductive health care choices. Other posts examined Regional Computer Forensic Laboratories, the National Domestic Computer Assistance Center, and The Computer Analysis Response Team.
Automated decision systems
Margot E. Kaminski on Social Studies Research Network (papers.ssrn.com)
From the abstract:
Companies and governments now use Artificial Intelligence (AI) in a wide range of settings. But using AI leads to well-known risks—that is, not yet realized future harms that arguably present challenges for a traditional liability model. It is thus unsurprising that lawmakers in both the United States and the European Union (EU) have turned to the tools of risk regulation for governing AI systems.
This Article observes that constructing AI harms as risks is a choice with consequences. Risk regulation comes with its own policy baggage: a set of tools and troubles that have emerged in other fields. Moreover, there are at least four models for risk regulation, each with both overlapping and divergent goals and methods. Emerging conflicts over AI risk regulation illustrate the tensions that emerge when regulators employ one model of risk regulation, while stakeholders call for another.
Sam Biddle on The Intercept (theintercept.com)
The documents provide an inside look at an Iranian government program that lets authorities monitor and manipulate people’s phones.
on International Association of Privacy Professionals (iapp.org)
The IAPP’s Jedidiah Bracy interviews Danielle Citron about her new book “The Fight for Privacy: Protecting Dignity, Identity and Love in the Digital Age.”
Ariel Bogle on ABC News (abc.net.au)
While adults are also at risk of scams or stigma due to the disclosure of health records, children are a special case — particularly because they don’t always get a say about what carers sign them up for or share, experts say.
Zack Whittaker on TechCrunch (techcrunch.com)
The server, which Amazon took offline, was not protected with a password.
Hector Dominguez on Smart City PDX (smartcitypdx.com)
This zine is an educational resource for learning more about surveillance technologies and digital justice.
An Unrepresentative Democracy: How Disinformation and Online Abuse Hinder Women of Color Political Candidates in the United States
Dhanaraj Thakur, DeVan Hankerson Madrigal on Center for Democracy and Technology (cdt.org)
Executive Summary In a press interview, former Vermont state house representative Kiah Morris said she reported at least 26 incidents to the local police where she and her family felt threatened between 2016 and 2018. The severity of the targeted abuse both on and offline ultimately led Rep. Morris…
Mike Pearl on Mashable (mashable.com)
In fact, you might have already gotten paid.
Future of Privacy Forum (fpf.org)
As businesses increasingly develop and adopt extended reality (XR) technologies, including virtual (VR), mixed (MR), and augmented (AR) reality, the urgency to consider potential privacy and data protection risks to users and bystanders grows. Lawmakers, regulators, and other experts are increasingl…
Rumsha Sajid and Danny Cendejas on Apple Podcasts (podcasts.apple.com)
From a growing culture of deputized surveillance, to shows like “Ring Nation,” surveillance is evolving with us. Two deeply experienced organizers working at MediaJustice discuss the historic roots of present day surveillance—colonialism and slavery—and the continued disproportionate impact that surveillance has on communities of color.
Brandy Betz on CoinDesk (coindesk.com)
Notebook Labs aims to accelerate DeFi adoption with its crypto identity protocol.
Dana Hatherly on Yukon News (yukon-news.com)
Jason Pedlar of the Yukon is among those federal, provincial and territorial privacy regulators
Daniel Solove on TeachPrivacy (teachprivacy.com)
Prof. Solove’s new Halloween cartoon – Privacy Law Frankenstein – about the multitude of different privacy laws in the United States.
Brandon Vigliarolo on The Register (theregister.com)
China-owned boredom-killing biz issues precision-engineered denial
John Pavolotsky on JD Supra (jdsupra.com)
It’s a great time to be a privacy attorney.
Natasha Lomas on TechCrunch (techcrunch.com)
The UK privacy watchdog has warned against use of “emotion analysis” technologies, saying “immature” biometrics pose a discrimination risk.
WSJ on mint (livemint.com)
The iPhone maker plans to comply with new EU mandate governing how devices must be charged
Elizabeth Burgin Waller on JD Supra (jdsupra.com)
Generally, biometric privacy laws seek to protect the unique attributes of human beings that could be leveraged to access sensitive information about...
Daniel Susser on ojs.library.queensu.ca
Surveillance studies scholars and privacy scholars have each developed sophisticated, important critiques of the existing data-driven order. But too few scholars in either tradition have put forward alternative substantive conceptions of a good digital society. This, I argue, is a crucial omission. Unless we construct new “sociotechnical imaginaries,” new understandings of the goals and aspirations digital technologies should aim to achieve, the most surveillance studies and privacy scholars can hope to accomplish is a less unjust version of the technology industry’s own vision for the future.
on noyb.eu (noyb.eu)
More and more websites have added an option to say “no” to cookies and other tracking – as foreseen by the GDPR. Where did this trend come from?
Cody Venzke on Center for Democracy and Technology (cdt.org)
This fall, a federal trial court in Ohio ruled that certain uses of remote proctoring violate students’ right to be free from unreasonable searches under the Fourth Amendment. The court recognized that some aspects of remote proctoring technology intrude on the “core places” where society recognizes…
Ashwin Krishnan on TechTarget (techtarget.com)
Metaverse privacy is a moving target. Learn about the main data privacy concerns the metaverse poses and how to mitigate them.
Diana Diamond on Almanac Online (almanacnews.com)
The Palo Alto Police Department wants the city to spend thousands of dollars each year, for an indefinite period, to purchase an unspecified number (10, 20, or?) of Automated License Plate Readers (ALPRs) to help them identify stolen vehicles, trace cars involved in crimes and help investigators locate perpetrators once a crime has occurred, according to a report from the Police Department.