The Extraction of Essence

Key Points:

  • In the United States, existing legislation treats an individual’s privacy as something of financial value, and ultimately as negotiable. So while privacy is for sale, it also comes bundled with large ethical questions of ownership. Who owns an individual’s data, likeness, engagement history or network?

  • Legal protections don't do a lot to prevent abuse of populations that we agree should be protected, especially when those populations can't represent themselves. Even when someone has given up their privacy in advance, and that consent is documented, the person who has died is not the end user.

  • It’s often the case that we willingly say yes to digital identity harvesting because we value the exchange, at least on the terms that we understand it, even if we don't have the language to fully express what we're consenting to.

  • Legislative recommendations from the IEEE propose that an individual has the right to access services allowing them to create a trusted identity to control the safe, specific, and finite exchange of their data. But these recommendations omit considerations of posthumous use, and of who may consent to it, whether surviving loved ones or third-party providers.

Privacy As A Human Right

As we think through ethical questions of digital privacy, we need to focus on three ideas: consent, extraction and consequence. Underlying all three is privacy as an individual state free from public attention, interference and intrusion. How does that intersect with grieftech’s motivation to digitize and synthesize human essence, where we’re willingly uploading as much of ourselves as possible in order to improve the training model of a bot? What are the acknowledged, but also the unintended, consequences of bequeathing a digital presence to those who survive us and to the platforms that host us in the afterlife?

The ability to navigate the world without being harassed or surveilled, and without having associations drawn about what we’re doing based on our movements, represents the bedrock of free societies, argues Dr. Chris Gilliard (Gilliard, 2022). Yet in western societies the price of entry for even being a contributing member of society overrides much of our ability to consent to such activity. Especially in public spaces, but also online, the absence of permissions and the removal of agency are often positioned as convenience, prevention, or simple cost-effectiveness. Gilliard argues that the right to consent is a human right, with issues of information, communication and individual privacy key components for ensuring human dignity, safety and self-determination, especially within already marginalized communities (Gilliard, 2023).

Privacy is culturally defined, shaping not only what an individual thinks or feels, but also what someone can do out in the world (Krieger, 2023). In the United States for example, existing legislation treats an individual’s privacy as something of financial value, and ultimately as negotiable. So while privacy is for sale, it also comes bundled with large ethical questions of ownership. Who owns an individual’s data, likeness, engagement history or network? The user? Those they’ve interacted with? The platform? The communication provider? The obfuscated answer to most of these questions is, unfortunately, a resounding… maybe. A person’s own sense of who they are is often determined by their group affiliation, but groups themselves are shaped by the physical, social and psychological characteristics of their members (Krieger, 2023).

The Right To Be Forgotten & Remembered

So while many of our conversations around privacy in digital spaces coalesce around the right to be forgotten or ignored, especially when consent hasn’t been given, there also exists the right to be remembered. It’s not just information we’ve already shared, as Dr. Gilliard argues (Gilliard, 2023), it’s information that’s extracted without permission, and information that’s freely offered as part of an often superficially understood value exchange. In different parts of our lives, we take on accountability for how we make these decisions, and how we make these choices as citizens. It’s often the case that we willingly say yes to such harvesting because we value the exchange, at least on the terms that we understand it, even if we don't have the language to fully express what we're consenting to.

But the right to be remembered and to leave a digital presence for others is something we're starting to see in the emerging field of grieftech, where the value exchange is one of easing the emotional strain of human loss for those left behind. We hold ourselves accountable for the legacy we choose to leave, both in the real world and, increasingly, in the digital world too. So in many ways the value exchange of essentially downloading a life into a product can be viewed as worth it. Our digital identities are strongly connected with our social identities, and as Arntson describes, when we think of our roles as citizens, we can choose to educate ourselves to make responsible choices, and to be heard in such a way as to make a difference (Arntson, 1989).

The Institute of Electrical and Electronics Engineers’ Standards Association (IEEE SA, 2020) has proposed a number of legislative recommendations aimed at increasing the level of agency citizens hold. They propose that every individual be paired with a personal data or algorithmic agent which they curate to represent their terms and conditions in any real, digital, or virtual environment, and that every individual be provided with the means to create and project their own terms and conditions regarding their personal data, terms that can be read and agreed to at a machine-readable level. Ultimately they conclude that an individual has the right to access services allowing them to create a trusted identity to control the safe, specific, and finite exchange of their data (IEEE SA, 2020). These initiatives require mass consensus, and cross-national consensus as well, and are obviously incredibly challenging. But they also omit considerations of posthumous use of data, and of who may consent to it, whether surviving loved ones or third-party providers. In many ways, we're talking about containment versus avoidance here, where we’re more likely to gravitate towards damage mitigation than to accept the consequences of our choices and do something about them upstream in the lifecycle of the product.
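
To make that gap concrete, here is a minimal sketch of what machine-readable personal terms might look like, written in Python. Everything in it is my own illustrative assumption rather than anything specified by the IEEE SA: the class names, the fields, the `did:example` identifier, and especially the posthumous-use clause, which is precisely the part the current recommendations leave out.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical posthumous-use clauses. The IEEE SA recommendations
# define nothing like this; the enum exists to show what is missing.
class PosthumousUse(Enum):
    PROHIBITED = "prohibited"        # no use of the data after death
    LOVED_ONES_ONLY = "loved_ones"   # surviving family may interact with it
    PROVIDER_LICENSED = "provider"   # third-party platforms may synthesize it

# A curated, machine-readable statement of an individual's terms:
# the kind of thing their personal data agent would project.
@dataclass
class PersonalDataTerms:
    subject_id: str                # the individual's trusted identity
    purposes_allowed: list[str]    # e.g. ["remembrance"], never ["ad_targeting"]
    expiry_days: int               # "finite exchange": the terms lapse
    posthumous_use: PosthumousUse  # the clause today's proposals omit
    revocable: bool = True         # consent can be withdrawn while alive

def agent_permits(terms: PersonalDataTerms, purpose: str, subject_alive: bool) -> bool:
    """Decide whether a requested use falls within the projected terms."""
    if purpose not in terms.purposes_allowed:
        return False
    if not subject_alive:
        # Without an explicit posthumous clause, a platform querying a
        # deceased person's agent has no machine-readable answer here.
        return terms.posthumous_use != PosthumousUse.PROHIBITED
    return True

# Example: remembrance use is permitted for loved ones, nothing else.
terms = PersonalDataTerms(
    subject_id="did:example:1234",
    purposes_allowed=["remembrance"],
    expiry_days=365,
    posthumous_use=PosthumousUse.LOVED_ONES_ONLY,
)
print(agent_permits(terms, "remembrance", subject_alive=False))  # True
print(agent_permits(terms, "ad_targeting", subject_alive=True))  # False
```

Even a toy model makes the omission visible: the posthumous-use field has to be invented from scratch, because nothing in the proposed recommendations tells an agent what to answer once its subject has died.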

And legal protections don't do a lot to prevent abuse of populations that we agree should be protected, especially when those populations can't represent themselves. Even when someone has given up their privacy in advance, and that consent is documented, the person who has died is not the end user. They're the content creator in that exchange, and the survivors' future interactions with grieftech are still being surveilled and billed monthly.

Remembrance, Reinvented And Questions Of Unintended Consequence

Let’s look at a real example. When James Vlahos’ father was nearing the end of his life, Vlahos began to record as many of his stories as he could. In hours of audio recordings, Vlahos learned about how his parents first met, his father’s childhood, his career, and his thoughts on the world, especially sports. In extracting his father’s essence, his way of being in the world, he was able to digitally reassemble his father’s voice and anecdotes into an artificial intelligence powered chatbot. Using software originally pioneered by Pixar, Vlahos recreated the emotional feeling of ‘speaking’ to his father after he had passed away, in a product he lovingly christened the ‘Dadbot’. It’s clear that Vlahos and his father were close, and the interactions with the bot are poignant and laced with sadness. But Vlahos is very clear that his efforts are not designed to create a simulated avatar of his father. As he explains, ‘it’s remembrance, not emulation’. In this sense he sees his work as closer to the old family photographs we hang on the walls of our homes, rather than the stuff of dystopian nightmare.

Vlahos has become an advocate for the digitizing of human essence for posthumous use with artificial intelligence, and is currently scaling his efforts into his company, Hereafter.ai, which provides all the necessary services and tools for creating your own version of the Dadbot. Hereafter offers the ‘gift of being remembered’ for as low as $3.99 per month, or for an unlimited single-payment lifetime cost of $199. Its promotional literature promises to preserve an individual’s voice and stories forever, offers privacy controls, and is one of the first conversational AI tools to focus on bereavement as an opportunity. Hereafter’s work has been closely followed by competitors such as Storyfile, which is more video-avatar based and strongly endorsed by spokesperson William Shatner, and Eternime, which more bombastically offers a ‘journey to digital immortality’.
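
As a quick back-of-the-envelope sketch, using only the two prices quoted above, the lifetime option breaks even against the entry subscription after a little over four years:

```python
monthly_price = 3.99     # Hereafter's quoted entry subscription, per month
lifetime_price = 199.00  # quoted single-payment lifetime option

breakeven_months = lifetime_price / monthly_price
print(f"Break-even after {breakeven_months:.1f} months "
      f"(about {breakeven_months / 12:.1f} years)")
# Break-even after 49.9 months (about 4.2 years)
```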

As reporter Joanna Stern explains, there are two distinct parts to a digital legacy: the experiences a person wants to leave behind, and what the people who survive them want (Stern, 2020). Lucy Watts, a woman diagnosed with a degenerative condition, knew her time was ending, and connected with Vlahos’ work through Stern’s reporting. Watts passed away in May 2023, but leaves a digital legacy as large as her personality, including a Hereafter voice chatbot. Prior to Lucy’s passing, her mother felt that an interactive text and audio remembrance of Lucy would be welcome, but drew the line at video (Stern, 2020). In our non-reductive grief we still appeal to feelings of the departed being somewhere, despite the reductionist code of preserved digital essence (Shoemaker & Tobia, 2022). Even though the somewhere experienced is a remote server, we still feel the presence of an identity. A person we loved.

When it comes to aspects of digital privacy, the right to be remembered, and the value exchange of consensual surveillance, we’re often left with more questions than answers. Here are three worth thinking through:

  • This week, the release of Apple’s Vision Pro, perhaps the ultimate personal surveillance device, brought mass awareness to spatial computing. It feels like we may be inviting an unprecedented level of data harvesting into our lives in the spirit of entertainment. I’m particularly concerned that the user interface and navigation are driven by an intensely close level of eye tracking coupled with gestural activations and voice recognition. It goes far beyond tracking and targeting clicks. Is privacy really the price of digital innovation?

  • I'm thinking a lot about passive data extraction and the ethics of fairness, and about how often we're simply unaware of the data being harvested from us every day.

    • But we can't live like hermits either.

    • Is there an equitable / 'fair trade' end point here which goes beyond awareness or current legislation to prohibit unintended data collection?

    • Can we make the price somehow 'fairer' for everyone, beyond initiatives like GDPR and CCPA?

  • When wouldn't privacy and the right to consent be a human right?

    • Are those under investigation permitted to have their data collected without their consent?

    • When is surrendering consent made so attractive that we give it up willingly?
