In 2000, computer scientists at Georgia Tech launched the “Aware Home” experiment, a living laboratory where sensors embedded throughout the house and wearable computers would create a “human-home symbiosis”. The technology would capture countless data points about daily life, generating what researchers understood would be an entirely new knowledge domain.
The most remarkable thing about this experiment – as recounted by Shoshana Zuboff – wasn’t the technological ambition involved, but the moral clarity that guided it.1 The Georgia Tech team operated under three assumptions that seem almost quaint today: first, that these data systems would create unprecedented knowledge; second, that the rights to this knowledge would belong exclusively to the people who lived in the house; and third, that the Aware Home would remain what homes have always been: a private sanctuary.
Twenty-five years later, that clarity has vanished. We live in an era where data has become the most valuable resource on Earth, yet the question of who owns it, controls it, and benefits from it has become strangely opaque. The intuitive understanding that data generated in our homes, from our bodies, and about our behaviors should belong to us has been systematically eroded, commercialized, and ignored. Digital sovereignty remains a promise yet to be fulfilled.
Current frameworks have demonstrably failed to protect digital rights from Big Tech dominance. And in the most politically unstable period since World War II, a harder question looms: if we couldn’t solve yesterday’s relatively simple privacy violations, how will we handle brain-reading technologies and quantum-cracked encryption?
As technological change accelerates while governance structures stagnate, the question of data ownership and digital sovereignty becomes the defining challenge of the 21st century.
The Regulatory Retreat: How GDPR Lost Its Way
The transformation from the Aware Home’s moral clarity to today’s confusion resulted from deliberate choices, business models, and regulatory failures.
Maybe you remember how the early 2010s marked a peak moment when digital rights actually mattered in public discourse. Political parties were drafting real policy recommendations, citizens were starting to care about their data, and revelations about mass surveillance and social media monetization sparked meaningful debates about who should control our digital lives. For a brief window, it felt like we might actually get this right.
In this context, the EU’s General Data Protection Regulation (GDPR), a law governing data protection and privacy for individuals within the EU and EEA, represented a milestone: the first serious attempt to codify digital rights, and one that was supposed to mark a turning point in the fight for data protection. But almost a decade later, even this landmark legislation has proven incomplete. The legislative process revealed how dramatically corporate influence in Brussels had shifted the stakes. As documented in David Bernet’s film “Democracy – Im Rausch der Daten” (Democracy – In the Intoxication of Data), tech giants deployed armies of lobbyists, spending millions to water down provisions and create loopholes. What emerged, while still a milestone, fell far short of what digital rights advocates had envisioned: a compromise shaped more by corporate interests than citizen protection.2
Research from George Washington University found that while GDPR strengthened individual rights on paper, it created significant unintended consequences that actually consolidated power in large technology companies, mostly because firms like Google and Facebook were better equipped to absorb the high compliance costs and complex regulatory requirements. Medium-sized companies spent close to $3 million each to comply with GDPR, while Fortune 500 firms paid an average of $16 million. These compliance costs forced smaller competitors out of the market, and concentration in the digital advertising market increased by 17% following GDPR implementation.3
More problematically, large platforms like Google and Facebook leveraged their multiple product offerings to gather more data than ever before, using broad privacy policies that allow internal data sharing across services. Meanwhile, external data sharing, which might have enabled competition, became prohibitively expensive for smaller firms.
In May 2025, the European Commission published its “Simplification Omnibus IV” package, an active retreat from GDPR’s transparency requirements. It proposes to relax record-keeping obligations for companies with up to 750 employees, a dramatic expansion from the current threshold of 250. This change would exempt thousands of medium-sized companies from maintaining detailed records of their data processing activities, precisely when such oversight is most needed.4 Without accountability, we risk further eroding the principles of digital sovereignty that regulations like the GDPR were meant to protect.
While gaps in enforcement remain unresolved, we’ve begun to lose sight of the conversation itself. In the noise of AI regulation and mounting global crises, the important debates about digital sovereignty – who owns our data, and who holds the power – have been quietly drowned out. This retreat comes precisely as emerging technologies create entirely new categories of data vulnerability.
The New Vulnerabilities: When Everything Becomes Data
While the GDPR struggles with yesterday’s challenges, those new vulnerabilities are multiplying. Our recent articles on brain-computer interfaces, quantum computing, and the Internet of Things revealed how the boundaries between body, behavior, and machine are dissolving in ways that make current privacy protections obsolete.
Brain-computer interfaces represent perhaps the most profound challenge to our understanding of privacy and human autonomy. These technologies capture neural activity, the very electrical patterns that constitute our thoughts, emotions, and intentions. Neuralink has already conducted human trials, Meta is investing heavily in non-invasive BCI research, and military contractors are exploring neural interfaces for enhanced soldier performance. But if we already struggled to regulate cookies, how will we regulate cognition? The implications challenge the very foundation of digital sovereignty and individual autonomy.
The Internet of Things collects billions of intimate data points: when you leave your house, how you like your coffee, what’s in your fridge. It has already begun to eliminate meaningful choice about data collection, with billions of connected devices that are deeply integrated, wildly insecure, and increasingly inescapable. You rely on your heating system to keep your home warm, your car’s computer to get to work, and your medical devices to monitor your health – and all of these now require constant data sharing to function. When smart meters are mandatory and medical devices require constant connectivity, opting out of data collection becomes equivalent to opting out of modern life.
Quantum computing will fundamentally change the data game. By breaking much of the public-key cryptography that secures data today, quantum computers will make previously secure information suddenly vulnerable – including all that sensitive data you’ve been revealing while using ChatGPT or other chatbots as your personal therapist – and transform current data garbage into digital goldmines. Encrypted traffic harvested now can simply be stored and decrypted later. At the same time, quantum systems will enable unprecedented pattern recognition across massive datasets, potentially revealing connections and insights from our digital footprints that we never imagined possible.
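To make that “harvest now, decrypt later” threat concrete, here is a minimal, purely illustrative Python sketch. It uses textbook RSA with toy primes, and trial division stands in for Shor’s algorithm, which a sufficiently large quantum computer could run against real key sizes; none of the numbers here reflect real-world parameters.

```python
# Illustrative sketch of "harvest now, decrypt later" with textbook RSA.
# Toy primes only: the trial division below stands in for Shor's algorithm,
# which would do the same to real-world key sizes on a quantum computer.

def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    """Modular inverse of a mod m (assumes gcd(a, m) == 1)."""
    _, x, _ = egcd(a, m)
    return x % m

# --- Today: a message is encrypted in transit, the ciphertext is recorded ---
p, q = 61, 53                       # toy primes; real RSA uses ~1024-bit primes
n, e = p * q, 17                    # public key (n, e)
plaintext = 42
ciphertext = pow(plaintext, e, n)   # harvested and stored by an adversary

# --- Tomorrow: the adversary factors n and reads the old message ---
factor = next(f for f in range(2, n) if n % f == 0)  # stand-in for Shor
d = modinv(e, (factor - 1) * (n // factor - 1))      # rebuild private exponent
recovered = pow(ciphertext, d, n)

print(recovered == plaintext)       # True: yesterday's secret, readable today
```

The point is not the arithmetic but the asymmetry: data intercepted and stored today does not have to be broken today to be exploited.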
The Great Disconnect: Why We’ve Given Up
The most perplexing aspect of our current situation is not that our data is being exploited; it’s that we know it’s happening and continue to participate anyway.
According to Pew Research Center’s 2023 study, 67% of Americans say they understand little to nothing about what companies are doing with their personal data, a figure that has actually increased from 59% in previous years. This growing ignorance occurs alongside rising concern: 81% worry about how companies use their data, yet 56% frequently click “agree” on privacy policies without reading them.5
This behavior reflects “privacy fatigue”, a state of learned helplessness where people become so overwhelmed by digital privacy complexity that they simply surrender. The study also found that 73% of Americans feel they have little to no control over what companies do with their data, while 79% feel powerless regarding government data collection. When people believe their actions won’t make a difference, they stop trying altogether.
The Stakes: Why Giving Up Is Not an Option
But this acceptance may be the most dangerous aspect of all. We have grown accustomed to trading privacy for convenience, autonomy for efficiency, and democratic participation for algorithmic optimization.
Yet there is still time to change course. And giving up isn’t an option – at least not for democracies.
When citizens disengage from data governance, they surrender democratic control over the digital infrastructure that increasingly shapes every aspect of social, economic, and political life – and ultimately, how we exist as humans and what our future societies will look like.
The convergence of public apathy, regulatory retreat, and technological advancement is creating the conditions for digital authoritarianism. When every device is a potential surveillance tool, when thoughts themselves can be monitored, and when existing privacy protections can be quantum-cracked, we face a world where human agency becomes obsolete. And that world is far closer than we might think. In fact, it’s the logical endpoint of current trends.
Business models based on extracting and monetizing personal data are being applied to increasingly intimate aspects of human experience. The implications for global peace and democratic governance are profound. When data becomes the primary source of power, those controlling data infrastructure control society itself. In such a world, digital sovereignty becomes not just a policy goal, but a democratic imperative.
Reclaiming the Future
We need a return to the moral clarity that guided the Aware Home project: the simple but revolutionary idea that data belongs to the people who generate it. This principle must be embedded in legislation, technological design, economic models, and democratic institutions. To reclaim digital sovereignty, we need more than policy tweaks: we need compelling narratives that help citizens understand what’s at stake, enforceable rights with real weight, and infrastructures that protect users instead of exploiting them. Most importantly, we must recognize that this isn’t a technical problem, and there are no technical solutions to be found. It’s a political and moral challenge that goes to the heart of what kind of society we want.
The path forward requires both regulatory backbone and societal transformation. We need constitutional data rights, fundamental ownership principles as unshakeable as property law. This means real enforcement: collective lawsuits with penalties that genuinely hurt corporate bottom lines, mandatory algorithmic audits for systems affecting public welfare, and immediate moratoriums on brain-computer interfaces until robust protection standards exist.
Equally crucial is building societal understanding from the ground up. Digital literacy must start in elementary schools – not just “how to use apps” or, even worse, “how to use AI”, but “What are the risks involved in using new technologies? Who owns your data? And why does this even matter?” We need citizen juries deliberating major technology decisions, local digital rights organizations providing concrete help to individuals, and a new generation of politicians who understand code as fluently as they understand economics.
Most critically, states and international bodies like the EU and UN must prepare for corporate resistance. History shows that entrenched interests don’t surrender power voluntarily. Governments need the legal tools, technical expertise, and political will to break up data monopolies, enforce genuine transparency, and protect citizens even when trillion-dollar companies push back.
Twenty-five years ago, the Georgia Tech researchers proved that building technologies that respect human dignity is entirely possible. The question is whether we have the collective will to demand such technologies and the political courage to implement necessary changes.
The fight for data sovereignty will determine whether democracy survives the 21st century. It will determine whether the digital future serves human flourishing or subjugation, whether technology becomes a tool for liberation or oppression, whether we preserve democratic self-governance or surrender it to algorithmic control. The choice is still ours. But time is running out, and the stakes could not be higher.
We need to talk about data again – not as a technical abstraction or business opportunity, but as the foundation of human freedom in the digital age. The clarity of the “Aware Home” experiment awaits our return. The question is whether we will choose to go home.
It’s time to rethink the future we’re building!
References
- Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Public Affairs.
- Bernet, D. (Director). (2015). Democracy – Im Rausch der Daten [Documentary film]. INDI FILM; gebrueder beetz filmproduktion; ARTE.
- Prasad, A. (2020, September 3). Unintended Consequences of GDPR. GW Regulatory Studies Center. https://regulatorystudies.columbian.gwu.edu/unintended-consequences-gdpr
- de Souza, R., & Waem, H. (2025, May 28). Europe: European Commission publishes proposal for simplification of the GDPR. Privacy Matters. https://privacymatters.dlapiper.com/2025/05/europe-european-commission-publishes-proposal-for-simplification-of-the-gdpr/
- McClain, C. (2023, October 18). How Americans View Data Privacy. Pew Research Center. https://www.pewresearch.org/internet/2023/10/18/how-americans-view-data-privacy/