The rise of artificial intelligence is subtly, and radically, transforming human identity, introspection, and agency. As our experience with AI systems deepens, we enter a new frontier where human consciousness and machine feedback converge, a space some artists have dubbed the "Algorithmic Self." This interaction promises real benefits, but it also constrains and challenges self-concept.
Surveillance-based AI systems constrain the personal self, self-awareness, and agency. Predictive text already distorts the intent of our communication, and AI-powered journaling assistants actively guide, shape, and codify private notions of our identity. These influences raise concerns about autonomy and can leave people feeling alienated from their own emotional experience.
One of the most recognizable examples of AI's role in shaping how we see ourselves is Spotify Wrapped. This yearly "algorithmic rite of passage" remixes each user's listening habits into a personalized data narrative, creating and reproducing an algorithmic persona. Such representations are easily mistaken for objective truths about users' identities, shaping how they view themselves.
AI-enhanced journaling assistants and emotion-tracking platforms funded by Big Tech companies have emerged to offer individuals powerful, personalized tools for understanding and regulating their emotional states. Therapeutic AIs such as Woebot and Wysa provide emotional support by validating users' feelings and helping them navigate difficult emotions. Yet turning to AI for "emotional reflection" risks cutting people off from their own emotional processes.
When algorithmic nudges begin to shape human agency, they call into question the ideal of the self-directed, autonomous individual. One result is emotional normativity: users are subtly encouraged to express emotions the system rewards rather than what they actually feel. Moreover, algorithms tend to reinforce what users already prefer, which can produce cognitive entrenchment and a narrowed self-perception.
Predictive personalization, one of the most touted features of AI, carries insidious costs for the human psyche. On one hand, it creates convenience and personalized experiences. On the other, it fosters filter bubbles and reduces exposure to differing viewpoints. Over time, this can narrow identity and impoverish intellectual engagement.
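The feedback loop behind this narrowing is easy to see in miniature. The toy model below is a hypothetical sketch, not any real platform's algorithm: a naive engagement-maximizing recommender mostly repeats whatever the user clicked most, and a simulated user who always clicks the first recommendation ends up in an ever-narrower feed.

```python
import random
from collections import Counter

random.seed(0)  # deterministic run for illustration

TOPICS = ["politics", "science", "sports", "art", "travel"]

def recommend(history, n=5, exploit=0.9):
    """Toy recommender: with probability `exploit`, repeat the user's
    most-clicked topic; otherwise pick a random one (hypothetical model)."""
    counts = Counter(history)
    top = counts.most_common(1)[0][0] if counts else random.choice(TOPICS)
    return [top if random.random() < exploit else random.choice(TOPICS)
            for _ in range(n)]

# A simulated user who clicks the first recommended item each round.
history = ["science"]
for _ in range(20):
    feed = recommend(history)
    history.append(feed[0])

# The feed collapses toward one topic: the filter bubble in miniature.
diversity = len(set(history)) / len(TOPICS)
print(Counter(history).most_common(3))
print(f"topic diversity: {diversity:.0%}")
```

The `exploit` parameter stands in for an engagement objective: the higher it is set, the faster the simulated user's exposure converges on a single topic.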
The creation of digital identity is about something deeper: the construction of power, representation, and equity. Hyperconnected digital infrastructures do more than influence users' behavior or how others label them; they shape how users identify themselves. This deepens concerns about algorithmic bias and the risk of unfair, discriminatory, or otherwise unlawful outcomes.
"Algorithmic Self" - [No specific author mentioned, but it seems to be the title of a concept discussed in the text]
First, we should recognize that algorithmic representations like those produced by Spotify Wrapped are not neutral. They are built on proprietary algorithms and data sets that shape how users think about themselves. Naming and monitoring these influences is a prerequisite for any positive outcome, and we must push back against the epistemic violence of algorithmic myths that invent who we are.