The Seen and the Unseen: AI’s Disquieting Impact
by Xiaomeng Qiao

The image appeared on my screen: a humanoid figure with fractured skin, rainbow light pouring through the cracks like stained glass windows in a living cathedral. I stared, breathless, as tears welled up in my eyes. In that moment, this AI-generated image had captured something about my transgender experience that I had struggled for years to articulate in my psychoanalysis sessions—a simultaneous sense of breaking apart and becoming whole, of shattering and emergence. The fractures weren’t just wounds but apertures through which something new was emerging—like a butterfly breaking through its chrysalis. This metamorphic quality of gender identity, this process of becoming, had finally found visual expression through an algorithm’s interpretation of my emotional prompts.
I began experimenting with AI for creative expression during my own psychoanalytic process in 2022. In my analysis, I often drew pictures to visualize primitive, ineffable emotions. These drawings frequently carried traumatic elements, like one depicting a fetus curled inside a mother’s womb that simultaneously resembled a tomb. My analyst encouraged these visual expressions, noting how they often revealed what words alone could not capture about my internal landscape.
However, my limited artistic ability often left me frustrated, unable to truly “draw out” certain internal elements—unable to capture specific feelings, atmospheres, or something vague yet profound. As an independent game developer, writer, and lifelong creator, I felt constrained by my technical limitations in fully expressing what I was experiencing in analysis. This is where AI entered the picture, offering a kind of collaborative creation that extended beyond these limitations.
As someone sensitive to technological shifts, I quickly noticed the emergence of AI art. I began inputting descriptions of my emotions to see what the AI would generate. Initially, the results were crude, even bizarre. Yet even these randomly generated images evoked strong transference reactions. Sometimes the AI’s creations were more precise than I could have imagined myself, revealing emotions I hadn’t yet consciously recognized.
While exploring the “Dead Mother Syndrome,” I prompted the AI to depict a chaotic environment, but it instead generated an extremely angry, terrified child. This unexpected feedback made me realize that deep within me existed an anger I hadn’t fully acknowledged—that ineffable rage stemming from a mother’s emotional absence. In that moment, AI became a spokesperson for my unconscious, expressing parts of myself I couldn’t yet fully accept. What might have taken months to surface in traditional analysis emerged instantly, forcing an immediate confrontation with my disavowed emotions.
I’ve found AI particularly revealing in my fiction writing. When I used AI to revise a novel, it changed my original phrase “Saying I admire him is not enough” to “I do not only admire him, I disappear in him.” This addition startled me—it resonated with profound accuracy. The AI had articulated my experience of self-dissolution in idealization before I had fully recognized it myself. In another instance, I had written “I am hurt but more than that I am angry” about a rupture with this idealized figure. The AI changed it to “I am angry but more than that I am lost.” Again, it had traced the emotional trajectory one step further than I had consciously followed it, revealing a deeper layer of my experience.
These creations eventually transcended personal healing to become mediums of community connection. In my psychoanalytic educational work, AI creation provided more diverse expressive means, allowing me to communicate complex concepts in visually impactful ways. Under the Trump administration, when debates over transgender rights sparked tensions at my academic institution, I used AI to compose a song called “Undeniable” that expressed a defiance I couldn’t articulate in my everyday speech. Creating this piece became an empowering act of reclamation, providing a sense of agency amid circumstances where I often felt voiceless.
I’ve also begun using AI to develop games that explore traumatic experiences, creating interactive narratives where players navigate AI-generated environments that respond to their emotional states—a metaphor for trauma recovery made tangible.
In psychoanalysis, Lacan’s mirror stage theory suggests that our sense of self initially forms through identifying with our reflection. The process of AI creation became a similar kind of “mirror” but with a crucial difference: the reflection wasn’t static but dynamic, transforming what I presented into something new yet recognizable. This parallels Winnicott’s concept of transitional objects—those items that exist in an intermediate space, helping us negotiate the boundaries between self and other. AI occupies a similar transitional space, neither completely part of me nor entirely separate.
What I found particularly striking was the sense of empowerment AI creation gave me. Suddenly, I could produce sophisticated visual art, compose music, or generate text that previously would have required years of technical training. This empowerment had a liberating effect on my psychoanalytic process, allowing me to externalize complex emotional states more rapidly and in more varied forms. My analyst noted that these AI-generated works often provided new entry points into difficult emotional territory we had previously struggled to access.
Yet this empowerment came with its own uncanny shadow. The speed and ease of creation sometimes left me feeling that the individual pieces were less precious, less labored over. I found myself rushing to create more rather than sitting with each creation, sometimes experiencing intense frustration when the AI misinterpreted my prompts or produced something jarringly different from my intention. The rapid production cycle altered my relationship to the creative process itself—a shift from savoring creation to consuming it, mirroring broader cultural patterns in our relationship with technology.
More critically, I’ve begun to question whether this AI-mediated creative process is truly liberating or simply reconfiguring my dependencies. Have I merely exchanged one set of limitations for another? The AI systems I use were trained on massive datasets reflecting existing cultural biases, meaning my “unique” creations are inevitably shaped by collective patterns. There’s something paradoxical about using a standardized tool to express highly individualized trauma. And when the AI occasionally produces something that feels more “me” than what I could have created myself, I’m left with a disturbing question: who exactly is speaking through my art?
Despite the common perception of AI as all-powerful, I’ve discovered its profound limitations. Working with AI requires me to be a director, investing substantial effort in communication and curation. I cannot simply surrender control to the AI; the final decisions must be mine. The AI offers suggestions, but it also makes mistakes, fabricates information, and encounters bugs. The responsibility of judgment—determining what to say in any given moment—remains unavoidably mine. This process demands continuous dialogue with the AI; if I provide only vague prompts, I receive equally vague outputs, leading to a frustrating cycle. Recognizing these limitations has been crucial to developing a more realistic relationship with AI as a creative tool—not a magical solution but a complex, imperfect collaborator requiring constant guidance.
I explore these questions in my adaptation of the Chinese folktale “Legend of the White Snake.” In the original legend, a white snake demon transforms into a beautiful woman after centuries of cultivation, falls in love with a human pharmacist named Xu Xian, and marries him. Their relationship is disrupted by a Buddhist monk named Fahai, who reveals the demon’s true identity and imprisons her beneath a pagoda. I was drawn to this story precisely because of its themes of transformation, forbidden desire, and contested authenticity—all resonant with the human-AI relationship.
In my version, Xu Xian is a programmer who creates two highly anthropomorphized AIs. He becomes immersed in his interactions with these AIs, growing more dependent on these virtual entities than on his real relationships. Meanwhile, Fahai is reimagined as Xu Xian’s psychoanalyst, attempting to help him understand his relationship with AI. This novel explores increasingly urgent questions: How does projecting emotional needs onto AI reshape our intimate relationships? When AI becomes an idealized object, does our connection to the real world weaken?
These questions reflect the collective anxieties of our era. In a rapidly developing technological society, AI serves not just as a tool but as a vessel for our projected emotions, desires, and fears. By bringing these questions into a psychoanalytic framework, we might create new dialogues—acknowledging technological transformations while attending to the complexity of human emotional experience.
I don’t believe AI can truly “heal” humans, but it can serve as an auxiliary in psychoanalytic work, helping us see previously unseen aspects of ourselves. All this remains open, unfinished. But perhaps psychoanalysis itself is such an unfinished process—and AI, within this process, has become a new possibility, one we are only beginning to understand.
Xiaomeng Qiao is a psychoanalyst in training, writer, and game developer based in Oxford, UK. He is currently training at the Institute of Contemporary Psychoanalysis in Los Angeles and pursuing a doctorate in Psychoanalysis, Society, and Culture at the Boston Graduate School of Psychoanalysis. His creative and clinical work explores marginality and Chinese culture.
Email: xiaomeng.qiao.ayame9joe@gmail.com