EFFecting Digital Freedom

by Jason Kelley

Supreme Court Decision Overturning Abortion Rights Is a Privacy Wake-Up Call

Fair and meaningful protections for data privacy are essential to independence and autonomy in the modern era.  Everyone deserves strong controls over the collection and use of the information they necessarily leave behind as they go about their normal activities: using apps, running search queries, posting on social media, texting friends, and so on.  But after June's Supreme Court ruling in Dobbs v. Jackson Women's Health Organization, people are more aware than ever that those controls are sorely lacking.

Today, the concern is over abortion access - and for good reason: changing laws across the country now mean that those seeking, offering, or facilitating abortion access must assume that any data they provide online or offline could be sought by law enforcement.  But tomorrow, the concern could be over something else.  It is essential to remember that what is legal today may not be legal tomorrow, and that who is in power can always shift, along with which laws are enforced and how.

This isn't idle speculation - digital trails have already been ransacked in the service of questionable prosecutions.  After the "J20" protests against the presidential inauguration in January 2017, Department of Justice prosecutors served a search warrant on DreamHost, the hosting provider of a site dedicated to organizing and planning the protests.  The warrant would have required DreamHost to turn over the IP logs of everyone who visited the site - whether journalist, activist, or just interested reader - though it was later narrowed after pushback (and even the narrowed request was not without its flaws).  By early July 2018, federal prosecutors had dropped charges against many of the defendants in the case, but overbroad demands for digital data like this one still have a chilling effect on future protest organizers.

To ensure that digital data isn't misused by companies, courts, law enforcement, or the government, EFF supports data privacy for all.  There are three main fronts in the fight to protect digital privacy and, whoever you are, you can help.

First, there are basic steps everyone can take to minimize data collection.  EFF recommends a variety of methods in our Surveillance Self-Defense guides, available at ssd.eff.org, as well as in other guides on EFF.org.  We've got tips for protesters (for example: disabling fingerprint unlock and Face ID on your phone, enabling full-disk encryption on your devices, and using encrypted chat).  We've got recommendations for those in the abortion access movement (keeping more sensitive activities separate from your day-to-day ones, and carefully reviewing the privacy settings on each app and account you use).  And we've got basic guides for anyone who wants to take their privacy into their own hands, while recognizing that there is no one-size-fits-all digital security solution.

Second, it should not be entirely up to individuals to take these elaborate steps to protect their own privacy.  Because privacy is a fundamental human right, it should be protected by law.  Digital privacy protections in the U.S. are generally weak, and we all must push our local, state, and federal representatives to do better.  Perhaps the most important thing we can do to minimize data harms is to ban online behavioral advertising, which creates staggering profits for tech companies by targeting ads to us based on our online behavior.  This incentivizes all online actors to collect as much of our behavioral information as possible, and then sell it to ad tech companies and the data brokers that service them.  This pervasive online behavioral surveillance apparatus is what turns our online lives into open books.  But there are also smaller bills that help fill the gaps: for example, Rep. Sara Jacobs' "My Body, My Data" Act would protect the privacy and safety of people seeking reproductive health care.  Specifically, the bill would restrict businesses and non-governmental organizations from collecting, using, retaining, or disclosing reproductive health information that isn't essential to providing the service someone asks them for.  Wherever you live - in the U.S. or elsewhere - you can visit EFF's Action Center at act.eff.org to find out how to speak up for better laws.

Lastly, we must demand that companies do better.  There is a lot that companies - from ISPs to app developers to platforms and beyond - can do to protect privacy, and those steps will benefit all users.  We must push companies to minimize the harm that can be done.  In some cases, that means demanding better privacy options by default.  In other cases, it means minimizing the use of "dark patterns" that push users into choices harmful to their privacy.  And generally, it means ensuring that companies allow pseudonymous access, rethink their data retention policies, encrypt what they can, don't share or sell user data, and make their tools interoperable with others, so we can have choices about how we use their products.

This is a tall order - we don't always have much say in what companies do, especially when it hits their bottom line - but you can start by switching to more privacy-protective apps where possible, and by speaking up when a company whose service you use is negligent with its data protections.  And if you're building your own tool or app, of course, you can make privacy a priority.

The Supreme Court decision to overturn abortion rights turns what was once benign data into potential criminal evidence.  But it might not stop there.  What we know from the past 30 years of work protecting digital rights is that if technology can be used to aid criminalization, it will be - and it might not matter whether the law is just.  As always, EFF will be fighting back in the courts, in the legislature, and online - and you can fight back too.  It will take all of us, working together, to protect each other.
