The Case for Privacy by Design

Do you dream of a future where you can opt-in to share your data — instead of opting out? It could be closer than you think.

Written by Sandra Matz
Published on Jan. 22, 2025

Back in the village, my neighbors had to observe my behavior to be helpful. And there was no way to prevent them from gaining access to the intricate details of my life if I wanted their support. And once they’d been helpful, I couldn’t just go back and ask them to forget what they knew. In any case, I doubt any of them would have shown much interest in that proposition. They enjoyed the gossip. 

That requirement no longer holds true in the digital world. We now have technologies that allow your data to remain in its safe harbor while still generating the insights you are looking for. It’s as if your neighbor were lending you their brain and resources for a day to process all your problems, without storing any of the data themselves (well, kind of).

Sounds like magic, right? It definitely felt like that when I first heard about it. It’s math magic called federated learning.

What Is Privacy by Design?

Privacy by design means building technology with the user’s privacy at its core. Its purpose is to improve adherence to data protection by building data-processing procedures directly into the technology itself.


How Does Federated Learning Work?

The truth is, you don’t need to hand over all your data to a third party to get personalized recommendations and convenient services tailored to you. We all carry mini supercomputers in our pockets.

Remember the historic Apollo 11 mission that landed the first man on the moon? Your iPhone today has over 100,000 times more processing power than the Apollo 11 computer. It has over 1 million times more memory and over 7 million times more storage.

Federated learning taps into this computing power to run algorithms (and insights) locally.

Take Netflix. Instead of sending your viewing data to a central server it owns, Netflix could send its recommendation model to your device (your laptop or smartphone, for example). The model would then update itself based on your data and recommend the best shows and movies for you. To make sure everyone benefits from the learning, your device would send an encrypted version of the updated model back to Netflix.
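The round-trip described above can be sketched in a few lines of Python. This is a toy illustration of federated learning, not Netflix’s actual system: the “model” is a single weight per feature trained with one gradient step per example, and all names and data here are made up.

```python
def local_update(global_weights, local_data, lr=0.1):
    """On-device step: refine the model using only local viewing data.

    Each example is a (features, rating) pair. Only the resulting
    weight update leaves the device -- the raw data never does.
    """
    weights = list(global_weights)
    for features, rating in local_data:
        prediction = sum(w * x for w, x in zip(weights, features))
        error = prediction - rating
        weights = [w - lr * error * x for w, x in zip(weights, features)]
    return weights


def federated_average(updates):
    """Server step: combine the devices' updates by simple averaging.

    (In practice the updates would arrive encrypted and the averaging
    scheme would be weighted and far more sophisticated.)
    """
    n = len(updates)
    return [sum(column) / n for column in zip(*updates)]


# Two hypothetical devices, each holding private (features, rating) pairs.
global_model = [0.0, 0.0]
device_a = [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.0)]
device_b = [([1.0, 1.0], 1.0)]

updates = [local_update(global_model, data) for data in (device_a, device_b)]
global_model = federated_average(updates)
```

The key property is visible in the code: `federated_average` only ever sees model weights, never the viewing histories that produced them.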

The result? You benefit, Netflix benefits, and all the other users benefit. But your personal data never leaves its safe harbor. You don’t need to trust a third party (regardless of how trustworthy that party might be) to store your data securely and use it only for its intended purposes. Federated learning replaces the need for trust with a system that is inherently trustworthy.

 

How Federated Learning Keeps Our Data Safer

This might sound like science fiction, but it’s not. Chances are you’re already benefiting from federated learning technology. Apple’s Siri, for example, is trained locally on your device. Using federated learning, Apple can send copies of its speech-recognition models to your iPhone or iPad, where it will process your audio data locally.

This means that none of your voice recordings ever need to leave your phone, but Siri still gets better at understanding your needs. Because your phone sends back the updated model to Apple to integrate the new insights into its master model, you are helping to improve the experience of other users. 
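One way the updates your phone sends back can be protected is secure aggregation: devices mask their updates with random values that cancel out when the server sums them, so the server learns the aggregate but never any individual contribution. The sketch below shows only the masking idea; it is not Apple’s protocol, and real deployments use cryptographic key agreement between devices rather than a shared seed.

```python
import random


def add_masks(updates, seed=42):
    """Mask each device's update with pairwise random noise.

    For every pair of devices (i, j), device i adds a random mask and
    device j subtracts the same mask. Each masked update looks like
    noise on its own, but the masks cancel exactly in the sum.
    (Simplified sketch: a real protocol would derive masks from
    per-pair key exchanges, not one shared random seed.)
    """
    rng = random.Random(seed)
    n = len(updates)
    masked = [list(u) for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            mask = [rng.uniform(-1.0, 1.0) for _ in updates[0]]
            masked[i] = [m + k for m, k in zip(masked[i], mask)]
            masked[j] = [m - k for m, k in zip(masked[j], mask)]
    return masked


# Three hypothetical devices' model updates (made-up numbers).
updates = [[0.2, -0.1], [0.4, 0.3], [-0.1, 0.0]]
masked = add_masks(updates)

# The server sums the masked updates; the masks cancel, so the total
# equals the sum of the true updates without revealing any one of them.
total = [sum(column) for column in zip(*masked)]
```

This is why the article can say the server integrates “new insights into its master model” without ever seeing what any single phone learned.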

Governments could mandate technologies like federated learning for companies that have reached a certain number of users or that handle sensitive data. But using such technologies might also be in the best interest of companies.

Hoarding large amounts of personal data has become a growing security risk that can be incredibly costly. You don’t want to sit on a pile of gold if you know robbers are lurking all around you, waiting for their opportunity to steal it. You’d much rather keep it somewhere safe and still use it to do your business, without carrying the burden of protecting it yourself. The same is true for personal data.


 

How Privacy by Design Works — and Empowers Us

Importantly, the shift to privacy by design could also significantly improve the products and services we use. This might seem counterintuitive. Less data should mean lower quality, shouldn’t it? It’s the classic argument you hear from tech companies.

But privacy by design doesn’t mean no data. It means trading data for better services and products. Today, much of this exchange amounts to mere lip service. Once a company has acquired your data, it has no incentive to fulfill its promises, leaving you in a weak position at the bargaining table.

But if companies depended on their customers’ active consent to collect and use personal data, they would be compelled to deliver value in return.

The formula is simple: no value, no data. Vague promises would no longer suffice. If you don’t perceive a benefit from sharing your personal data, you simply wouldn’t share it and would move on to another service that does a better job.


Take Instagram. The app’s recommendation algorithms promise to deliver the most relevant and engaging content by tapping into users’ personal data. That sounds helpful, but how can I be sure it’s actually true? Currently, I have to take Instagram’s word for it. There is no way for me to compare my personalized feed to a more generic version of the app or one that is based on only a subset of my data that I might feel comfortable sharing.

Once we shift the default to opt in, that changes. The generic version of the app would become my new baseline. For me to change the default, Instagram would need to show me how sharing my personal data gives me a much better experience. If it fails to do so, I could simply revoke data access and either go back to the generic version or move to a competitor that keeps its promises.

Privacy by design empowers us all to ask for more. 

Reprinted by permission of Harvard Business Review Press. Adapted from MINDMASTERS: The Data-Driven Science of Predicting and Changing Human Behavior by Sandra Matz. Copyright 2025 Sandra Matz. All rights reserved.
