Algorithmic Self (how AI harms individuals)
Posted: Tue Jan 20, 2026 11:33 am
We're all well aware AI exists, but I wanted to voice some thoughts, partly for fun and partly in hopes that other folks will share their experiences and observations.
It's intertwined with everything online: Spotify playlists, personalized news feeds, apps that monitor your health. I'll stop there, but you get the idea. The "algorithmic self" refers to a digitally mediated identity, where self-awareness, preferences, and emotional patterns are shaped by feedback from AI systems. No surprise there. The issue is when algorithms don't passively reflect the self but actively participate in forming it. Seems far out? Settle in, we're going on a trip.
The point of this thread is how AI co-produces identity, emotion, and agency.
Personal identity used to be formed primarily by family dynamics, cultural narratives, and self-reflection. Now, AI systems with predictive language models and behavior monitoring shape so much of what we see, how we feel, and how we think. The self-categorizing part is what I find particularly worrisome.
I'm trying to keep this from getting super long, so I'll skim over what feels obvious and doesn't need much explaining. If anyone wants more detail, comment and I'll provide it.
So algorithms are getting way more prominent in shaping identity; they've become cognitive tools for how we understand ourselves. These systems define our preferences and behaviors, and their interpretations of our moods, thoughts, and intentions are shifting personal understanding entirely. What I'm trying to say is that we're outsourcing our understanding of ourselves. I see it super often: people posting on social media with no emotional intelligence or even accountability, just complaining and expecting improvement.
Delegation in the typical sense isn't a problem. Problems can arise when the AI system is the one providing insight or handling emotional regulation. I hope I don't sound like I'm hating; I use AI too, including for help with this post, since I tend to suck at writing, and I by no means intend to drop AI entirely. But excessive AI-assisted learning can cause poor memory retention (phones have directly impacted me on this), and decreased self-awareness is becoming way too common. When AI is used for emotional reflection, people can become estranged from the richness of their own emotions. It changes how individuals relate to their feelings, producing a diminished sense of emotional awareness that makes things really tough without their phone.
Used correctly, AI is super helpful, even for mental health, for understanding what you're feeling, or for help navigating things. But if we grow accustomed to receiving emotional and cognitive feedback, it impacts the decision-making process to where suggestions aren't taken as options but as defaults. Over time it's hard not to believe it will shape our choices, preferences, and more. Predictive algorithms don't just offer suggestions; they define what we like by constantly reinforcing the same stuff over and over. I don't always know exactly what I want, but it would be nice to see things more randomly. If I don't know something exists, I can't know whether I'd benefit from it. If it only ever shows me one flashlight, maybe I'd actually want a Maglite but never learned such a thing was available. Ugh, my examples.
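The "show me things more randomly" idea actually has a name in recommender design: exploration. Here's a minimal toy sketch (my own illustration, not any real platform's code; the item names and scores are made up) of the difference between always showing the predicted favorite and occasionally showing something random:

```python
import random

def recommend(scores, epsilon=0.1, rng=random):
    """Pick an item: usually the highest-scored one, sometimes a random one."""
    if rng.random() < epsilon:
        return rng.choice(list(scores))   # explore: any item, even ones never clicked
    return max(scores, key=scores.get)    # exploit: the predicted favorite

# Hypothetical catalog: the feed is confident I want a generic flashlight.
catalog_scores = {"flashlight": 0.9, "maglite": 0.1, "headlamp": 0.05}

random.seed(0)
picks = [recommend(catalog_scores, epsilon=0.2) for _ in range(1000)]
print(picks.count("flashlight") / len(picks))  # most picks, but not all
print(picks.count("maglite"))                  # the Maglite still surfaces sometimes
```

With `epsilon=0`, the Maglite would never appear at all, which is exactly the narrowing I'm complaining about.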
On with it. Digital identity feedback loops produce self-concepts that can hinder self-evolution. Say a user like me is introverted. The system can create an echo chamber of self-perception that reinforces a limited self-image. I see this everywhere online, and it's super disturbing, but I'm not entirely objective; I get that I could be seeing what I'm looking for. That's also the point of this forum: keeping minds thinking instead of assuming or being biased all the time.
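To make the feedback loop concrete, here's a toy simulation (again my own illustration, not any real platform's algorithm; the topics and numbers are invented). The user genuinely likes two topics equally, but the feed shows whichever topic has the most recorded clicks, and a topic can only earn clicks while it's being shown. One lucky early click becomes a permanent self-image:

```python
import random

random.seed(0)
true_interest = {"introvert": 0.5, "extrovert": 0.5}  # the real, mixed self
clicks = {"introvert": 1, "extrovert": 0}             # one early click on introvert content

shown_counts = {"introvert": 0, "extrovert": 0}
for _ in range(1000):
    shown = max(clicks, key=clicks.get)       # rich-get-richer: always show the leader
    shown_counts[shown] += 1
    if random.random() < true_interest[shown]:
        clicks[shown] += 1                    # only the shown topic can earn clicks

print(shown_counts)  # {'introvert': 1000, 'extrovert': 0}
```

The feed ends up "knowing" I'm an introvert with total confidence, even though the simulated user liked both equally. The data isn't measuring the self; the loop is manufacturing it.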
The psychological risks are significant. Labels like "depressed" or "anxious" can become self-fulfilling when the diagnosis comes not from a doctor but from a self-concept being reinforced all over the internet. The label often gets used as an excuse where people feel it's an explanation, and it just gets reinforced over and over with no room for contradiction. If anyone suggests they take action about it, it's treated as wicked offensive (and then I get shadow-banned).
The quantified self. Take a sleep app, for example. The number tells people how well they slept instead of them actually noticing how they feel. Does that make sense? If I read that I slept like shit, I might not take the time to notice how I actually felt in my body. I think people can see where I'm going here.
Emotional conditioning. Instead of building emotional intelligence, AI could be promoting emotional conformity, because people aren't exploring what they're actually feeling so much as reading a description of what a feeling is. I'm observing that it's happening; I'm unsure how common any of this is yet, but I think it's way more common than it should be. No one seems to care.
I know I'm yelling into the void, but we can take some action, which is why I created this forum; surprisingly, I want to do something to help on this fucking planet. We need to offset passive exposure to these algorithms. Individuals need to engage in active self-construction. AI developers could implement features that promote personal interpretation of data by building in time to reflect. I know that's hard to picture, but folks need to take individual action; besides trying to spread awareness, I don't know how else to do any of this. Schools could help by integrating digital-selfhood curriculums and educating teachers on how to identify and cope with these psychological effects. Teachers, I'm 100% sure, care. From my reading, they really are noticing a shift, especially in kids' understanding of social dynamics and self-awareness. It's pretty bad: one article said teachers report over 80% of students are lacking in emotional intelligence compared to pre-COVID.
- Source: some of this is off the top of my head. Just saying stuff, not always intending it to apply to everyone or everything.