Designing Democratic Digital Futures: Exploring Deliberation on Data for the Public Interest
PhD project
As society navigates the transition from the industrial age to a digital one, we find ourselves on the brink of an ‘unevenly distributed future.' This transition has profound consequences for democracy, and design plays a pivotal role in shaping the societal changes it brings. Crafting designs that ensure diversity, equity, and inclusion is therefore crucial for our digital futures.
Society's ongoing digital transformation calls for public deliberation, which remains a path for contesting the conditions under which data-driven systems operate. This research highlights deliberation as a mechanism for sense-making and storytelling as a tool for comprehending the intricate issues that stem from our increasing reliance on data.
This PhD project is part of DCODE, a European network and PhD program under the umbrella of the Marie Skłodowska-Curie Innovative Training Networks (H2020-MSCA-ITN-2020) within the European Horizon 2020 Framework Program.
Technologies powered by data, the basis for large language models, image recognition, and automated decision-making, have profound implications. The better the data companies hold, the more precisely they can target behavioral modification campaigns. Simultaneously, data instrumentation reinforces pre-existing power structures through monitoring, prediction, and control. As we face the ramifications of the extractive data practices associated with data colonialism, how can we reclaim our right to a more just digital future?
What alternative futures might emerge from exploring different stories, narratives, and metaphors that challenge the fundamental assumptions driving the development and operation of data-driven systems?
When we examine the metaphors through which data production is understood, such as the widespread notion of data as the 'new oil,' we uncover exploitative and extractive practices driven by a profit-oriented model that prioritizes fiduciary responsibilities to shareholders over the public interest. Moreover, these practices and the dominant narratives around AI and data serve vested interests aligned with an 'unevenly distributed future.' Instead, we need stories about our future(s), both individual and collective, that serve our own interests.
This is especially significant given that the profit-driven model has deviated from the original promise of a democratizing open web. As a result, we have seen intrusions on our privacy, the commercialization of our behavior, the transformation of human experience into data, and the disenfranchisement of citizens. Alongside these challenges, we witness a loss of autonomy and the gradual erosion of a shared vision for a more equitable future.
One vital response to these issues is to legislate and regulate online platforms' operations and data practices, ideally creating safeguards grounded in collective decision-making through democratic governance.
However, as José van Dijck (2021) points out, platform governance is a complex process that has produced a mosaic of regulatory frameworks within the European context. This complexity stems from the emergence of corporate and state-controlled platform ecosystems, which blur the traditional boundaries between state, market, and civil society that delineate governmental structures. Consequently, regulatory efforts have been directed toward individual firms, markets, or specific platforms, addressing market concentration, freedom of information, privacy rights, and labor rights. Hence, there is a "growing need to understand how platformization works and to create new imaginaries".
The idea of data-driven technologies serving the public interest fills a void in the current design landscape and opens pathways toward alternative, just digital futures. Our right to self-determination lies in the stories that could shape these futures.