Lethe Machina: A Theory of Digital Forgetting, Ritualized Erasure, and the Ethics of Disappearance // Sasha Shilina


“At the water of Lethe’s stream they drink … long forgetfulness.”

— Virgil, Aeneid VI (6.713–715)


“…by the river Unmindful … he who drank forgot all things.”

— Plato, Republic X, “Myth of Er” (621a–b)


“In conditions of digital recall, loss is itself lost.”

— Mark Fisher, “The Slow Cancellation of the Future,” Ghosts of My Life (2014)


“… any semblance of my speech, with perhaps the exception of rhythm, is destroyed.”

— Alvin Lucier, I Am Sitting in a Room (1969)


“The tape runs on in silence.”

— Samuel Beckett, Krapp’s Last Tape (1958)




Prelude: The River That No Longer Runs

Lethe names an older mercy: water that loosens a life’s grip on itself. You drink and what happened stays happened — nothing erased, nothing revised — yet the past loses its adhesive hold; it becomes livable, digestible, something the present can metabolize rather than endlessly restage. In this older sense, forgetting is not failure but a biological competence, a boundary-making art. Without it, the human animal moves forward only by dragging its own sediment.


Our century, though, has built a rival hydrology. Memory no longer behaves like flow; it encrusts into infrastructure built from geological sediment. It stores. It indexes. It waits. It lithifies. The past doesn’t recede so much as sit in a limitless warehouse: a human-exclusion zone, meticulously indexed and ready for instant retrieval the moment a system decides that ‘now’ is the time for its return — often without you ever making a request. An archive, as part of an ensemble of logistical media (Durham Peters, 2015), becomes a kind of perpetual present, a vast stillness where every moment ever captured rests in identical containers under humming lights, neither alive nor dead but suspended in the amber of the server rack, waiting to be called forth by algorithms that have learned the grammar of resurrection better than we ever knew the art of letting go.

You can feel the shift in your gut: your psyche begins to resemble customer support for its own history. A photo surfaces on the wrong date, smiling from inside a day that cannot hold it. A search field returns a former self as if it were current evidence. A “memories” carousel offers intimacy by algorithmic schedule.

The harm here is rarely spectacular; it is incremental. The right to let things settle is taken from you. The past touches the present on someone else’s terms, packaged for circulation and calibrated for attention. It arrives stripped of its relations. What disappears is the privilege of forgetting on your own terms: the right to decide when and how the past is allowed to matter again.

This loss of timing, the expropriation of the ‘when’, makes another distinction urgent. Recollection is a human event: uneven, interpretive, replete with gaps, sometimes merciful, sometimes false, always bound to a living situation that forces meaning to be remade. Recall is an operation: indexed, repeatable, indifferent. Under platform conditions, recall no longer waits to be called. It seeps into the spaces between actions — surfaced without summons, threaded through feeds, suggested alongside faces. And because it is automatic, it is scalable: your memory, rendered into a format that can be shared, compared, aggregated, set beside the memories of strangers.

“Lethe Machina” names the paradox that follows. We built machines that should have made forgetting easier – frictionless deletion, graceful disappearance, the option of silence – yet our dominant machines mechanize remembrance instead, expanding the reach of the trace and shrinking the room for endings. Still, the title also holds open a second meaning: a counter-machine not yet achieved, an apparatus that could grant fading without falsification, disappearance without violence, quiet without denial.

None of this is a plea for amnesia. Some records protect the vulnerable; some evidence is the minimum cost of justice. But a civilization that treats total retention as a moral reflex produces and unleashes a specific cruelty: traces become perpetually actionable, former selves indefinitely transportable, and ended relations are capable of reappearing with the insistence of the present tense. The problem is therefore architectural: not whether traces exist, but whether systems provide thresholds – attenuation, de-amplification, gated verification – by which what-has-been can become what-is-no-longer-operative. What follows attempts to sketch a technical grammar for endings: infrastructures that can stage departure, encode thresholds, and let time release its hold.


Hiroshi Sugimoto. Lake Superior, Eagle River, 2003.



1. A New Thanatology for the Internet: Archives Without Threshold

A theory of digital forgetting begins with a shift in temporal texture. In Memory, History, Forgetting, Paul Ricœur (2004) treats forgetting as a condition of narratability: a thinning that makes compression possible, that transmutes experience into life rather than holding it as inert remainder. “Narrative structure, which memory and history have in common,” he writes, “confirms this law of the necessity of forgetfulness” (Ricœur, 2004).

Forgetting then opens an interval: reinterpretation becomes possible, memory can be held without dominating, and continuity can persist without saturation. It marks the subtle threshold between retention and release. When that interval contracts, the archive grows denser. As Wolfgang Ernst discerns in our delirious age of archive fever, “Previously emphatically differentiated qualities of linear time, which belong to the symbolical order of time (past, present and future) increasingly fold into one, in a compressed, dense present” (Ernst, 2016). In platform regimes, this density is less a matter of individual psychology than a property of infrastructures that prioritize addressability. What persists is what remains legible to systems of retrieval — “the archive has no narrative memory, only a calculating one” (Ernst, 2013).

If the archive only calculates, then distance itself is recalculated. Digital infrastructures do not eliminate distance, but they decouple distance from operational presence. This ties into what Zala Volčič and Mark Andrejevic (2023) characterise as an aesthetics of framelessness, which makes such decoupling both possible and perilous. The same logic that drives 360-degree sensing, total situational awareness, and the ambition of “collecting everything and holding it forever” also governs the temporal environment: the past is captured without a frame, held without a horizon, available without a situated perspective to give it proportion. Just as the self-driving car’s Lidar maps the entire surround because it cannot rely on a driver’s situated intentionality, platforms hold every trace because they cannot trust a subject’s judgment of relevance. Framelessness in space becomes framelessness in time. This framelessness has profound implications for how we manage endings, a problem that classical funerary architectures arguably understood intimately.



1.1 The Disappearance of Disappearance


Classical funerary architectures — graves, tombs, memorials — serve a dual purpose: they commemorate, but more fundamentally, they mediate. They operate as interfaces between presence and absence, translating the raw fact of loss into a legible structure of relation: here lies, here ended, here is remembered, here is allowed to fade. The grave functions as boundary-marker; the memorial as social contract, organizing attention, care, and release.


As we established, digital infrastructures invert this logic. Retention and reactivation are their default conditions. We see this all the time: a profile endures as a durable address; posts drift free of their originating context, becoming objects of repeated recontextualization; images persist as coordinates in a circulation system that continues long after the life they belonged to has dissolved. To understand why this inversion is so profound requires a brief, more fundamental account of how memory relates to, and is shaped by, its own exteriorization.

Bernard Stiegler’s concept of tertiary retention helps name this process and its transformation (Stiegler, 1998). Across human culture, memory has always been exteriorized — in tools, writing, recordings — allowing transgenerational learning, what Stiegler calls “long circuits” of meaning. But when exteriorized memory expands without proportionality, when traces remain actionable while growing semantically emaciated, memory does not deepen; it dilates.

It is precisely this dilation that creates a thanatological problem. When tertiary retention ceases to be a cultivated soil for long circuits of meaning and instead becomes an inert, ever-accessible mass, it loses its ability to distinguish between the living and the dead — because the system’s ontological categories are not organized around finitude. Traces persist as informational revenants: detached from embodied subjects, still actionable within technical and economic circuits, easily called back by search. And beneath all of this lies a familiar political economy: traces become behavioral surplus, harvested for prediction markets where retention is the business model (Zuboff, 2019).

From this economy surfaces a punitive architecture: the constant availability of reactivation, the ease with which fragments can be reframed. Memory becomes liability. The archive is no longer a record but a reserve — a standing stockpile of potential accusation, potential embarrassment, potential recalculation, waiting to be drawn upon. Where forgetting has no infrastructural support, forgiveness has no threshold to hold it. Time remains open for audit.


1.2 Oblivion, Erasure, Forgetting: Three Temporalities of Disappearance


If forgetting is the atmosphere life requires, then we must ask what kind of disappearance is possible when that atmosphere has been poisoned by total recall. Digital forgetting is often discussed as if it were a single gesture: deletion. Yet legal and philosophical debates distinguish several regimes that everyday language compresses into one promise (Tamò & George, 2014). These distinctions matter because they correspond to different temporal logics and different moral claims, and because each implies a different design and governance response.

Oblivion is a social-juridical theory of time and relevance: the claim that certain facts, once public, may lose their legitimate public salience. Oblivion does not make things false; it allows them to recede. It is the ethics of distance, the idea that what happened should not remain equally actionable forever, that exposure has a half-life, that publicness is not destiny. Its logic is threshold-based: when does “news” become “history,” and when does history become mere punishment-by-repetition?

Erasure, by contrast, belongs to the grammar of data protection and procedural remedy. It concerns unlawful processing, revoked consent, excessive retention, or disproportional collection. Where oblivion asks about public temporality – what deserves to remain visible – erasure asks about governance and compliance – what must be removed to restore legal order to data flows.

Forgetting, finally, is neither purely public nor purely procedural; it is the structural condition of livable time: the selective occlusion without which continuity becomes unbearable and action becomes impossible. Human memory forgets not because it fails, but because it must: it compresses, reorders, and enables meaning to emerge by letting detail die. To treat forgetting as mere deletion is to misunderstand its phenomenological function. Forgetting is not subtraction; it is the condition by which life stays open.

These three registers clarify why “the right to be forgotten”[1] often becomes conceptually overloaded. Total disappearance is rarely achievable and rarely required. Much of the harm arises from indefinite resurfacing and amplification rather than from the mere existence of traces. The design task, then, is the restoration of thresholds: de-indexing, contextual decay, and friction against effortless retrieval. Which traces warrant endurance, which warrant attenuation, and how transitions between the two are staged becomes the central problem, because the livability of time depends less on perfect subtraction than on the possibility of dignified passage across a threshold.


Hiroshi Sugimoto: Lightning Fields

Who on the hidden side of the tomb
Scatter freckles
All glide while screeching over the black river of the ear

– Joyce Mansour


2. Blockchain as Cemetery and Temple


Nowhere is this need for thresholds more urgently tested than in technologies built for permanence. Among contemporary technologies, blockchain systems make the metaphysics of permanence unusually explicit — almost naïvely so, with the candor of an object that does not pretend to be neutral. Their core promise is immutability: once recorded, a transaction is inscribed into a distributed ledger whose persistence is diffused across a consensus of machines [2]. This property underwrites practical uses — finance, governance, provenance — where tamper-resistance is treated as a condition of trust. Yet beneath this pragmatic surface lies a philosophical wager: that truth is what cannot be revised, that memory should be preserved in perpetuity, and that the highest ethical posture of a record is simply to remain.

This is why the cemetery metaphor returns so easily and poignantly. A cemetery stabilizes names, gives them a strange second residence. The chain does something similar with actions: it is an ossuary of decisions, housing remnants that refuse to compost back into time. But blockchains also contain a second impulse, one that changes the scene: they can be scripted. Smart contracts transcend mere archiving; they drive processes. They introduce gates, thresholds, permissions. They allow a record to have a life cycle and access to become an ethical decision rather than a default condition of storage.

If the cemetery is what blockchains are by default, the temple is what they could become through design. A temple epitomises how such rules can be implemented. It shapes when a space opens, who may enter, what must remain covered, what is sealed after a rite is complete. The temple disciplines memory. It gives time a liturgy. A Lethean chain embraces this idea: a ledger that understands seasons. It knows when to consecrate and when to withdraw. Permanence becomes something you choose, not something that just happens.

A number of existing protocols already gesture in this direction: tokens whose metadata is designed to erode, like the TimeGuardian protocol for time-bound NFT rights, which enables “presetting authorization and revocation periods” so that ownership includes duration and disappearance becomes part of the artwork’s ontology (Wang et al., 2026); contracts that seal themselves when their purpose is complete, implemented through group signatures with “natural expiration,” where keys lose validity after their intended period, leaving proof of past action without perpetual usability (Malina et al., 2013); and storage regimes that expire unless actively renewed — realised through time-based revocable encryption, where keys are bound to hierarchical time periods and “can no longer be used to decrypt ciphertext generated after their expiry time” (Liu et al., 2018), pushing memory from passive accumulation into ongoing intention.


This governance layer finds its technical analogue in dynamic access control frameworks where “access policies are dynamically adjusted” based on collective risk assessment and trust metrics (Muppidi et al., 2025), or in autonomous cryptographic capabilities that enable delegation without infrastructure — where old content remains but new access can be excluded, creating de facto social expiration without technical deletion (Allen, 2025).


These gestures matter because permanence is not innocence. It is capacity, leverage, a way of holding the past within reach. As Jacques Derrida reminds us, the archive is never merely about preservation: “It is a question of the future, the question of the future itself, the question of a response, of a promise and of a responsibility for tomorrow. The archive: if we want to know what that will have meant, we will only know in times to come” (Derrida, 1996, p. 36). Immutability, in this light, is a bet on what will matter, a wager that tomorrow’s judgments should have access to today’s traces.


Luciano Floridi’s warning about total recall — that it risks turning informational environments into something carceral — bears relevance here too, because it names what happens when legibility becomes continuous and compulsory (Floridi, 2013). But Floridi’s work also offers a positive resource. His concept of “informational personhood” insists that we are our information; to harm someone’s data is to harm the person (Floridi, 2011). This cuts both ways. If informational personhood means our data deserves protection, it also means that perpetual exposure is a form of perpetual wounding. The right to be forgotten, in Floridian terms, is not about erasing the past but about defending the integrity of the informational self against an environment that refuses to let it change. Forgetting becomes a condition of personhood, not its violation.


2.1 Audience-Dependent Ephemerality: Gradients of Disappearance


Forgetting has never been singular. Human memory is stratified: private recollection, social circulation, institutional archives, each embedded in what Friedrich Kittler would call a discourse network, a technical regime determining what can be recorded, stored, and circulated. A family’s right to remember is not a stranger’s right to search. A court’s need for documentation is not a platform’s license to index forever. The violence of digital systems is to collapse these regimes into a single surface: public, persistent, frictionless — a persistence without remainder, without refuge.


A Lethean architecture in turn would seek to restore stratification. It would give different audiences different temporal horizons. Public visibility would be the most perishable layer; social visibility would persist longer but with limits; institutional access might endure, increasingly gated and accountable. As we noted above, recent proposals are springing up that push toward precisely this kind of audience-dependent transience: content can remain in some contexts while fading in others, with time and permissions encoded into the access logic (Darwish & Smaragdakis, 2024). Something heavier happens: visibility becomes a relationship.
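The gradient itself is almost trivial to state in code. A minimal sketch, with invented horizon values, in which the same trace answers visibility queries differently depending on who is asking and how much time has passed:

```python
from dataclasses import dataclass

# Hypothetical horizons, in days: public visibility is the most perishable
# layer; institutional access endures longest (values are illustrative only).
HORIZONS = {"public": 30, "social": 365, "institutional": 3650}

@dataclass
class Trace:
    created_day: int

    def visible_to(self, audience: str, today: int) -> bool:
        # Unknown audiences get no horizon at all: default opacity.
        horizon = HORIZONS.get(audience, 0)
        return (today - self.created_day) < horizon
```

After forty days such a trace has already left public view while remaining socially and institutionally legible — disappearance as a gradient across relationships, not a single switch.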


The route is oblique. The ciphertext can persist while the key decays. The trace can remain while its addressability weakens. Revocation schedules, time-bounded keys, contract-enforced expiration on a blockchain – these methods do not promise metaphysical disappearance (Darwish & Smaragdakis, 2024). What changes is power: its reach, its ease, its casualness. But this raises a question that code alone cannot answer: who guards the thresholds? On a blockchain, the rules execute without mercy or discretion — but who decides which traces are consecrated and which are allowed to fade? If the contract encodes the terms of forgetting, who holds the authority to rewrite it when the social consensus shifts? This is not a bug in the protocol; it lies at the heart of a politics of infrastructure, rendered inescapable precisely because the ledger never forgets. And it grows more urgent when we consider what survives even after the keys decay — etched permanently into a chain that may answer to no one.


Hiroshi Sugimoto: Lightning Fields, 2009.


3. Liturgy of Disappearance: Ritual, Opacity, Interface Ethics

If Lethe Machina began as diagnosis, its counter-movement unfolds as liturgy: a grammar through which digital traces may enter dusk. Cultures that understood mortality built thresholds into their worlds; they gave endings a surface to move across.


3.1 Mourning as Interface Ethics

What seems to disappear leaves no visible crossing; what remains leaves no clear boundary. Without a threshold, the dead cannot cross. They linger in data purgatory: profiles persisting as sites of unresolved address, platforms circulating reminders of the deceased as algorithmic events (Brubaker & Hayes, 2011; Brubaker, Hayes, & Dourish, 2013). Grief without boundary. Memory without dusk. The contemporary social media landscape already makes this dynamic unnervingly explicit: Meta, for example, has patented technology that would train AI on a user’s behaviour to keep posting, replying, even speaking as them after death.


But if the dead can speak on platforms, who, or indeed what, across the networkscape decides whether they are grievable? Judith Butler’s insight that mourning defines the border of the grievable (2004) acquires infrastructural force here. In Precarious Life, she writes: “Loss and vulnerability seem to follow from our being socially constituted bodies, attached to others, at risk of losing those attachments, exposed to others, at risk of violence by virtue of that exposure” (Butler, 2004, p. 20). If we are, as she claims, constituted through our attachments, then digital traces are not external possessions but extensions of that constitutive vulnerability.

Some identities receive a veil; others are left in perpetual exposure, as if never attached to anyone at all. A liturgy of disappearance would answer differently. Not deletion that annihilates, but deletion that seals. Destruction attended to — given time, given form — turning negation into something livable. Weakening the claim of the trace without pretending the trace never was. This is what it might mean for a platform to let the dead be dead: not erased, but sealed; not forgotten, but finished.


3.2 Departure as a UI Primitive


What would it mean, then, to build interfaces that honor this kind of destruction — to give death a form the system can recognize? To treat departure as a core design function is to insist that mortality belongs inside the system’s vocabulary (Massimi & Charise, 2009). Users improvise veils and thresholds within existing structures — turning profile pages into shrines, comment threads into wake rooms (Brubaker et al., 2013). These improvisations reveal a demand the architecture does not formally acknowledge.


Design theory also contains resources for this work: Slow Media’s call for duration, Calm Technology’s retreat to the periphery, Hito Steyerl’s poor image, Rosa Menkman’s glitch aesthetics — all treat degradation as revelation, meaning emerging through compression and fracture (Slow Media Collective, 2010; Weiser & Brown, 1997; Steyerl, 2009; Menkman, 2011). The shared implication is that disappearance should be modeled as a temporal process rather than a binary event. A trace can thin rather than vanish. Circulation can cool rather than accelerate. Resurfacing can meet friction, so retrieval regains weight.


These improvisations point toward a deeper need: ritual darkness. “Dark patterns” in design discourse name deception; Lethe Machina reclaims darkness as a necessary sanctuary — shelter from compulsory visibility, the right to opacity, the dignity of the veil. A ritual darkness designs for restraint: logging becomes selective; retention becomes intentional. Deactivation unfolds through a sequence of notice, review, and fade, so the trace moves across marked ground. Dormancy and de-indexing function as soft veils, limiting circulation without denying existence. The system ultimately learns to hush.


3.3 Collective Rites: DAO-led Deletion


But a hush, left to infrastructure alone, remains a fragile thing — a silence that can be broken by the next update, the next terms of service, the next regime of accumulation. Silence requires not only design but guardians. This is why forgetting has also been profoundly communal. Societies curate memory through repetition and omission, through canon and taboo, through silence as much as inscription. The digital displaces this labor onto individuals inside architectures that favor permanence.


This is where on-chain governance introduces the possibility of enacting collective thresholds. Building on Meg Leta Jones’s account of petitional erasure (Jones, 2016), DAOs could institute periodic rites of retirement: sunsetting proposals, pruning abandoned datasets, sealing contracts whose purpose has dissolved. Burn ceremonies and memory-pruning epochs become governance rituals — visible acknowledgments that endless inscription is not synonymous with virtue. This critically reframes forgetting as a collective governance capacity rather than an exclusively individual burden.


Even ragequit reads differently in this light: a ritual crossing, a withdrawal into dusk, a sealing of contributions against indefinite exposure.


Hiroshi Sugimoto: Seascapes

You choose the site of the wound where we speak our silence – Alejandra Pizarnik


4. Shadow Protocols: Metadata Ghosts and Infrastructures of Silence

Still, beneath every scene of deletion, systems keep their own clandestine accounts — logs, caches, backups, the quiet bureaucracy of persistence. And now a newer substrate of survival: models that hold statistical impressions of what was ostensibly removed, residues that no longer resemble a record yet continue to scaffold memory. Call these residual formations ‘metadata ghosts’: presences without content, survivals without voice, traces that do not look like memory while still functioning as its infrastructure.


But metadata ghosts are the least of it. The more persistent challenge today is no longer exclusively the database, but the models built from it. Even perfect deletion at the level of records does not guarantee that the information has truly disappeared, because contemporary systems are increasingly shaped by learned, high-dimensional structures — such as the embeddings and neural network weights that power machine learning models. These structures capture and encode patterns from training data, meaning that the influence of deleted records may persist indirectly.

Consequently, ensuring data privacy requires new approaches, a field now known as machine unlearning, which goes beyond traditional record-level deletion to address the lingering influence of data within these models [3]. What machine unlearning confronts, then, is not the record but its afterlife: ‘parametric ghosts’, deletions that survive as foresight, residues that no longer resemble a trace yet continue to shape what systems expect, recommend, and assume. The data is forgotten. Its direction is not.
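One published family of unlearning strategies makes the bound on this afterlife explicit by design: sharded training, in which data is partitioned so that deleting a record requires retraining only the shard that saw it. The toy below gestures at that idea under drastic simplification — “training” is reduced to a mean, and all names are invented.

```python
def train_shard(shard: list[float]) -> float:
    # Stand-in for training: a shard's "model" is just its mean.
    return sum(shard) / len(shard) if shard else 0.0

class ShardedModel:
    """Toy sketch of sharded unlearning: each shard gets its own sub-model,
    so forgetting a record means retraining one shard, not the whole
    ensemble. (Illustrative only.)"""

    def __init__(self, data: list[float], n_shards: int = 2):
        self.shards = [data[i::n_shards] for i in range(n_shards)]
        self.models = [train_shard(s) for s in self.shards]

    def predict(self) -> float:
        # The ensemble's output aggregates the shard models.
        return sum(self.models) / len(self.models)

    def unlearn(self, value: float) -> None:
        for i, shard in enumerate(self.shards):
            if value in shard:
                shard.remove(value)
                self.models[i] = train_shard(shard)  # retrain this shard only
                return
```

After `unlearn`, the deleted value's influence on the ensemble is genuinely gone, not merely hidden — the parametric ghost is contained to one shard, and that shard has been reborn without it.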


This also invites further political implications: simply put, ghosts do not distribute themselves evenly. Achille Mbembe’s necropolitics (2019) names the differential governance of death — the power to decide who lives and who dies, who is afforded a dignified passing and who is left to perish. In the digital register, we must also speak of a necropolitics of the trace: a regime that decides not only whose life matters, but also whose past is allowed to fade. In other words, inference does not haunt everyone equally. The models that remember what has been deleted — that continue to score, rank, and target based on absorbed traces — perpetuate and amplify existing hierarchies of visibility and vulnerability. This requires attending to the differential effects of visibility.


Notably, danah boyd reminds us, “By and large, those who are looking are those who hold power over the person being observed. Parents look. Teachers look. Employers look. Governments look. Corporations look. These people are often looking to judge or manipulate. Given the powerful position they are in, those doing the looking often think that they have the right to look. The excuse is simple: ‘it’s public.’ But do they have the right to judge? The right to manipulate? This, of course, is the essence of conversations about surveillance” (boyd, 2009).


The right to disappear, in this light, is a condition of freedom for those whom the system never stops watching. This asymmetry has an economic form: where prediction is profitable, disappearance becomes costly; where disappearance is costly, it becomes a privilege. The market in behavioral futures converts forgetting into a luxury good.


If earlier parts of this essay approached forgetting through forms, rites, and permissions, this section insists on a harder requirement: any serious architecture of forgetting must govern what survives the ending. Hence shadow protocols — design patterns and infrastructural logics that attend to residues, afterlives, and the operational power of what remains. Shadow protocols do not promise total erasure. They propose an ethics of remainder. They treat silence as a possible system state, rather than an illusion performed by a clean interface.


Building on the directions highlighted in section 2, both those worth pursuing and those already taking shape, we can now articulate more concretely what we envision: trace-decay timers that cause metadata to lose resolution unless renewed through explicit, justified use; residual gating that allows records to persist for integrity or audit purposes while becoming effectively non-queryable except under rare, accountable conditions; Lethean rate limits, meaning aggressive querying triggers progressive occlusion — extraction that fades with each attempt; and context-preserving dormancy, which lets traces remain while disabling their circulation — no ranking, no recommendation, no resurfacing — so that persistence loses its coercive force. What these patterns share is the reintroduction of temporality into afterlives. The remainder is no longer a technical accident; it becomes an ethical design object. What survives is allowed to survive differently: less actionable, less contagious, less coercive.
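One of these patterns, the Lethean rate limit, is concrete enough to sketch directly. The following is a purely hypothetical illustration — every name and parameter is invented — in which each retrieval lowers the record's resolution, so extraction literally fades with use while the underlying record is never destroyed.

```python
class LetheanGate:
    """Hypothetical sketch of a 'Lethean rate limit': repeated querying
    triggers progressive occlusion instead of amplification."""

    def __init__(self, record: str, fade_per_query: float = 0.25):
        self.record = record
        self.clarity = 1.0          # fraction of the record still legible
        self.fade = fade_per_query  # clarity lost on each retrieval

    def query(self) -> str:
        visible = int(len(self.record) * self.clarity)
        self.clarity = max(0.0, self.clarity - self.fade)
        # Occluded characters are veiled, not deleted: the record persists.
        return self.record[:visible] + "\u2591" * (len(self.record) - visible)
```

With an aggressive fade of 0.5, three successive queries against a ten-character record return the whole, then half, then nothing legible — retrieval regains weight, and persistence loses its coercive force.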


A final recognition: not only data has afterlives. Systems do too. Deprecated formats, abandoned platforms, obsolete hardware persist as ruins that still act, leak, and haunt — zombie infrastructures whose partial life creates new exposures (Parikka & Hertz, 2012). Lethean architectures must therefore extend the right to die beyond user content to protocols and infrastructures themselves: designing decay rather than leaving it to accident.


The task, finally, is to nourish the capacity to produce silence — deliberately, accountably, and with teeth: non-retrievability, non-resurfacing, and bounded influence. Systems that know how to remember, and also how to dissolve into the riverbanks of memory.


Hiroshi Sugimoto: Lightning Fields, 2009

At the borders of sight, bright stars rotate. The earth stands steady, seen from the heights.
All the black holes are altered to confined circuits and links.

– Forough Farrokhzad


Conclusion: Learning to Lose

Digital infrastructures remember too well — not because they are unusually faithful, but because retention has become the default temperament of the system, the path of least resistance that silently hardens into moral posture. Platforms, clouds, ledgers treat storage as prudence and addressability as virtue, as if the ability to summon a trace were already a form of care. Under this regime, endings fail to register where contemporary life increasingly happens: at the interface. Bodies end; accounts persist. Context dissolves; indices remain.


This is not simply more memory. It is a rearrangement of time. Once that rearrangement is seen, the design problem stops looking like a binary switch and begins to look like an ethics of thresholds. The enmeshment of human and non-human infrastructures therefore requires Lethean gradients — differential lifetimes and access across contexts — that make explicit who can still see and for how long.


Ultimately, in myth, Lethe is not merely oblivion. It is renewal: a passage through which life becomes livable again. The question for our systems is whether they can learn that lesson: how to store, yes, but also how to end; how to preserve integrity without demanding perpetual replay; how to allow closure without violence; how to build dusk into the archive. Lethe flows.




Footnotes


[1] European data-protection law makes disappearance operational: GDPR Article 17 establishes a right to erasure (“right to be forgotten”) under defined grounds and limits, formalizing when data should be allowed to “die” rather than persist by default (Regulation (EU) 2016/679). The Court of Justice’s Google Spain ruling clarifies a key design fact: controlling visibility can transform lived reality without rewriting the underlying record—de-indexing changes discoverability and reputational force while leaving the source publication intact (Google Spain SL v. AEPD and Mario Costeja González, C-131/12, 2014). Consumer “disappearing messages” illustrate the structural limit: time-based deletion can support intent, but disappearance is never purely technical because social re-circulation (screenshots, forwarding, re-uploads) produces remainder—so systems should design for partial, layered withdrawal rather than promising total erasure.


[2] Legal scholarship has shown how sharply blockchain immutability collides with the GDPR: if data is replicated across globally distributed nodes, who is the controller, and what could “erasure” mean in a ledger designed not to delete? Finck (2019) shows that the standard fixes (hashes-only, off-chain storage, encryption + key destruction, redactable designs like chameleon hashes) don’t remove the tension; they expose it: immutability is a technical choice with ontological weight. A recent systematic review makes the same point in governance terms: “forgetting on-chain” is not just engineering but accountability—roles, jurisdictions, and procedures that decide who can authorize withdrawal, who must honor it, and what traces inevitably remain (Celador, 2024).


[3] This is where the contemporary literature on machine unlearning becomes philosophically load-bearing rather than merely technical. Surveys on deletion and unlearning emphasize the core mismatch (Xu et al., 2023): storage systems can delete records, but learning systems entangle records into distributed parameters, making “removal” an epistemic operation rather than a file operation. Even when systems implement deletion workflows, they may preserve functional remnants through embeddings, cached features, distilled models, or downstream derivatives, producing what we might call the illusion of deletion: a user-visible disappearance that does not reliably imply causal withdrawal. The question Lethe Machina adds is stricter than compliance: what would count as evidence that a system has stopped using the past? Not only “was the row removed?” but “were the pathways of influence genuinely constrained?”
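The mismatch between file deletion and causal withdrawal can be made concrete in a few lines. This is a toy sketch with invented record names and a deliberately trivial “model” (an aggregate mean standing in for trained parameters), not an unlearning algorithm:

```python
# Toy illustration: deleting a record is a file operation; withdrawing
# its influence from a derived parameter is an epistemic operation.
records = {"u1": 10.0, "u2": 20.0, "u3": 90.0}

# "Training": derive an aggregate model parameter from the records.
model_mean = sum(records.values()) / len(records)  # 40.0

# Storage-level deletion: the row disappears from the data store...
del records["u3"]
assert "u3" not in records

# ...but the learned parameter still carries u3's influence.
assert model_mean == 40.0

# Causal withdrawal requires retraining (or certified unlearning,
# in the sense of Bourtoule et al., 2021).
retrained_mean = sum(records.values()) / len(records)  # 15.0
```

Real models entangle records far less separably than a mean does, which is precisely why “was the row removed?” and “were the pathways of influence constrained?” come apart.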


REFERENCES


Abouzied, A., & Chen, J. (2015). Harnessing data loss with forgetful data structures. In Proceedings of the Sixth ACM Symposium on Cloud Computing (pp. 168–173). ACM.


Ambrose, M. L., & Ausloos, J. (2013). The right to be forgotten across the pond. Journal of Information Policy, 3, 1–23. https://doi.org/10.5325/jinfopoli.3.2013.0001


Anderson, M. C., & Hanslmayr, S. (2014). Neural mechanisms of motivated forgetting. Trends in Cognitive Sciences, 18(6), 279–292.


Anderson, M. C., & Hulbert, J. C. (2021). Active forgetting: Adaptation of memory by prefrontal control. Annual Review of Psychology, 72, 1–33.


Andrejevic, M., & Volcic, Z. (2023). Atmospheric mediation: From smart dust to customizable governance. In L. Parks, J. Velkova, & S. De Ridder (Eds.), Media backends: Digital infrastructures and sociotechnical relations (pp. 25–41). University of Illinois Press. http://www.jstor.org/stable/10.5406/jj.10405519.5


Assmann, A. (2011). Cultural memory and Western civilization: Functions, media, archives (D. Henry Wilson, Trans.). Cambridge University Press. (Original work published 1999)


Beckett, S. (1960). Krapp’s last tape, and other dramatic pieces. Grove Press.


Bernal, P. (2011). A right to delete? European Journal of Law and Technology, 2(2).


Biega, A. J., Gummadi, K. P., & Weikum, G. (2018). The right to be forgotten in the media: A data-driven study. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), Article 88.


Bishop, M., Butler, E. R., Butler, K., Gates, C., & Greenspan, S. (2013). Forgive and forget: Return to obscurity. In Proceedings of the 2013 New Security Paradigms Workshop (pp. 1–10). ACM.


Blumenberg, H. (1985). Work on myth (R. M. Wallace, Trans.). MIT Press. (Original work published 1979)


Bourtoule, L., Chandrasekaran, V., Choquette-Choo, C. A., Jia, H., Travers, A., Zhang, B., Lie, D., & Papernot, N. (2021). Machine unlearning. In 2021 IEEE Symposium on Security and Privacy (SP) (pp. 141–159). IEEE.


boyd, d. (2009, December 1/10). Do you see what I see?: Visibility of practices through social media [Conference presentation]. Supernova and Le Web, San Francisco, CA, United States; Paris, France. https://www.danah.org/papers/talks/2009/SupernovaLeWeb.html


boyd, d. (2010). Social network sites as networked publics: Affordances, dynamics, and implications. In Z. Papacharissi (Ed.), A networked self: Identity, community, and culture on social network sites (pp. 39–58). Routledge.


Brubaker, J. R., & Hayes, G. R. (2011). “We will never forget you [online]”: An empirical investigation of post-mortem MySpace comments. In Proceedings of the ACM 2011 Conference on Computer Supported Cooperative Work (pp. 123–132). ACM. https://doi.org/10.1145/1958824.1958843


Brubaker, J. R., Hayes, G. R., & Dourish, P. (2013). Beyond the grave: Facebook as a site for the expansion of death and mourning. The Information Society, 29(3), 152–163. https://doi.org/10.1080/01972243.2013.777300


Butler, J. (2004). Precarious life: The powers of mourning and violence. Verso.


Cao, Y., & Yang, J. (2015). Towards making systems forget with machine unlearning. In 2015 IEEE Symposium on Security and Privacy (pp. 463–480). IEEE.


Celador, O. (2024). General Data Protection Regulation, right to be forgotten, blockchain technology and human rights. The Age of Human Rights Journal, 23, e8702. https://doi.org/10.17561/tahrj.v23.8702


Carlini, N., Tramer, F., Wallace, E., Jagielski, M., Herbert-Voss, A., Lee, K., Roberts, A., Brown, T., Song, D., Erlingsson, Ú., Oprea, A., & Raffel, C. (2023). Extracting training data from large language models. In Proceedings of the 32nd USENIX Security Symposium. USENIX.


Castelluccia, C., De Cristofaro, E., Francillon, A., & Kaafar, M.-A. (2011). EphPub: Toward robust ephemeral publishing. In 2011 19th IEEE International Conference on Network Protocols (pp. 165–175). IEEE.


Chun, W. H. K. (2011). Programmed visions: Software and memory. MIT Press.


Darwish, M. A., & Smaragdakis, G. (2024). Disjunctive multi-level digital forgetting scheme. In Proceedings of the 39th ACM/SIGAPP Symposium on Applied Computing (SAC ’24). ACM. https://doi.org/10.1145/3605098.3635904


Darwish, M. A., & Zarras, A. (2023). Digital forgetting using key decay. In Proceedings of the 38th ACM/SIGAPP Symposium on Applied Computing (pp. 34–41). ACM.


Derrida, J. (1996). Archive fever: A Freudian impression (E. Prenowitz, Trans.). University of Chicago Press. (Original work published 1995)


Eichhorn, K. (2019). The end of forgetting: Growing up with social media. Harvard University Press.


Ernst, W. (2013). Digital memory and the archive (J. Parikka, Ed.). University of Minnesota Press.


Ernst, W. (2016). Chronopoetics: The temporal being and operativity of technological media (A. Enns, Trans.). Rowman & Littlefield International. (Original work published in German)


Ernst, W. (2016, December 7). The delayed present: Media-induced tempor(e)alities and techno-traumatic irritations of “the contemporary” [Lecture presentation]. ARoS Aarhus Art Museum, Aarhus, Denmark. https://arts.au.dk/en/news-and-events/events/show/artikel/public-lecture-wolfgang-ernst-on-the-delayed-presence-1


Finck, M. (2019). Blockchain and the General Data Protection Regulation: Can distributed ledgers be squared with European data protection law? European Parliament (Policy Department).


Fisher, M. (2014). The slow cancellation of the future. In Ghosts of my life: Writings on depression, hauntology and lost futures (pp. 2–29). Zer0 Books.


Floridi, L. (2011). The informational nature of personal identity. Minds and Machines, 21(4), 549–566.


Floridi, L. (2013). The ethics of information. Oxford University Press.


Geambasu, R., Kohno, T., Levy, A. A., & Levy, H. M. (2009). Vanish: Increasing data privacy with self-destructing data. In Proceedings of the 18th USENIX Security Symposium (pp. 299–316). USENIX.


Ginart, A., Guan, M., Valiant, G., & Zou, J. (2019). Making AI forget you: Data deletion in machine learning. In Advances in Neural Information Processing Systems (NeurIPS).


Glissant, É. (1997). Poetics of relation (B. Wing, Trans.). University of Michigan Press. (Original work published 1990)


Gondry, M. (Director). (2004). Eternal sunshine of the spotless mind [Film]. Focus Features; Anonymous Content; This Is That Productions.


Groys, B. (2008). Art power. MIT Press.


Hansen, M. (2012). Ubiquitous sensation: Toward an atmospheric, collective, and microtemporal model of media. In U. Ekman (Ed.), Throughout: Art and culture emerging with ubiquitous computing (p. 70). MIT Press.


Haraway, D. J. (2016). Staying with the trouble: Making kin in the Chthulucene. Duke University Press.


Hertz, G., & Parikka, J. (2012). Zombie media: Circuit bending media archaeology into an art method. Leonardo, 45(5), 424–430. https://doi.org/10.1162/LEON_a_00438


Hoskins, A. (2009). Digital network memory. In A. Erll & A. Rigney (Eds.), Mediation, remediation, and the dynamics of cultural memory (pp. 91–106). De Gruyter.


Hui, Y. (2019). Recursivity and contingency. Rowman & Littlefield International.


Jones, M. L. (2016). Ctrl+Z: The right to be forgotten. NYU Press.


Koops, B.-J. (2011). Forgetting footprints, shunning shadows: A critical analysis of the “right to be forgotten” in big data practice. SCRIPTed, 8(3), 229–256. https://doi.org/10.2966/scrip.080311.229


Kundera, M. (1980). The book of laughter and forgetting (M. H. Heim, Trans.). Alfred A. Knopf. (Original work published 1979)


Lacour, S. (Ed.). (2008). La sécurité de l’individu numérisé: Réflexions prospectives et internationales. L’Harmattan.


Liu, J. K., Yuen, T. H., Zhang, P., & Liang, K. (2018). Time-based direct revocable ciphertext-policy attribute-based encryption with short revocation list (Paper 2018/330). Cryptology ePrint Archive. https://eprint.iacr.org/2018/330


Lucier, A. (1990). I am sitting in a room [CD]. Lovely Music, Ltd. (Original work composed 1969)


Malabou, C. (2012). The ontology of the accident: An essay on destructive plasticity (C. Shread, Trans.). Polity.


Malina, L., Hajny, J., & Martinasek, Z. (2013). Efficient group signatures with verifier-local revocation employing a natural expiration. In Proceedings of the 10th International Conference on Security and Cryptography (SECRYPT 2013). SciTePress.


Marton, A. (2015). Digital forgetting and the future of the past: Dis-membering social media. BI-Digital 2015: Abstract booklet.


Massimi, M., & Charise, A. (2009). Dying, death, and mortality: Toward thanatosensitivity in HCI. In CHI ’09 Extended Abstracts on Human Factors in Computing Systems (pp. 2459–2468). ACM.


Mayer-Schönberger, V. (2009). Delete: The virtue of forgetting in the digital age. Princeton University Press.


Mbembe, A. (2019). Necropolitics (S. Corcoran, Trans.). Duke University Press.


Menkman, R. (2011). The glitch moment(um). Institute of Network Cultures.


Nietzsche, F. (1997). On the advantage and disadvantage of history for life (P. Preuss, Trans.). Hackett Publishing Company. (Original work published 1874)


Öhman, C., & Floridi, L. (2017). The right to be forgotten: A duty to remember? Ethics and Information Technology, 19, 1–8.


Parikka, J. (2012). What is media archaeology? Polity.


Perlman, R. (2005). The ephemerizer: Making data disappear. Journal of Information System Security, 1(4), 51–68.


Peters, J. D. (2015). The marvelous clouds: Toward a philosophy of elemental media. University of Chicago Press.


Plato. (n.d.). The republic (B. Jowett, Trans.). The Internet Classics Archive. https://classics.mit.edu/Plato/republic.html


Politou, E., Alepis, E., & Patsakis, C. (2018). Forgetting personal data and revoking consent under the GDPR: Challenges and proposed solutions. Journal of Cybersecurity, 4(1), tyy001. https://doi.org/10.1093/cybsec/tyy001


Ricœur, P. (2004). Memory, history, forgetting (K. Blamey & D. Pellauer, Trans.). University of Chicago Press. (Original work published 2000)


Rouvroy, A. (2008). Réinventer l’art d’oublier et de se faire oublier dans la société de l’information? In S. Lacour (Ed.), La sécurité de l’individu numérisé: Réflexions prospectives et internationales (pp. 249–278). L’Harmattan.


Sindhwani, D. (2025). Beyond permanent memory: Digital forgetting in the age of intelligent systems, reconciling human cognition, machine unlearning, and the right to be forgotten. International Journal for Multidisciplinary Research, 7(5).


Slow Media Collective. (2010). The slow media manifesto. https://www.slow-media.net/manifesto


Steyerl, H. (2009). In defense of the poor image. e-flux journal, (10). https://www.e-flux.com/journal/10/61362/in-defense-of-the-poor-image/


Stiegler, B. (1998). Technics and time, 1: The fault of Epimetheus (R. Beardsworth & G. Collins, Trans.). Stanford University Press.


Tamò, A., & George, D. (2014). Oblivion, erasure and forgetting in the digital age. Journal of Intellectual Property, Information Technology and E-Commerce Law, 5(2), 71–87.


Thylstrup, N. B. (2019). The politics of mass digitization. MIT Press.


Virgil. (1916). Eclogues. Georgics. Aeneid: Books 1–6 (H. R. Fairclough, Trans.). Harvard University Press. (Loeb Classical Library, 63)


Virilio, P. (1991). The aesthetics of disappearance (P. Beitchman, Trans.). Zone Books.


Wang, Q., & Hoskins, A. (Eds.). (2025). The remaking of memory in the age of the Internet and social media. Oxford University Press. https://doi.org/10.1093/oso/9780197661260.001.0001


Wang, W., et al. (2026). A time-bound NFT rights protocol from time interval signatures. IEEE Transactions on Dependable and Secure Computing, 23(1), 209–220. https://doi.org/10.1109/TDSC.2025.3604511


Xu, H., Zhu, T., Zhang, L., Zhou, W., & Yu, P. S. (2023). Machine unlearning: A survey. arXiv. https://doi.org/10.48550/arXiv.2306.03558