Is AI changing how blind people understand beauty and identity?

Artificial intelligence is changing how blind people receive visual feedback about their bodies, sometimes for the first time in their lives. While the technology offers new access and independence, its emotional and psychological effects are only beginning to be understood.

Milagros Costabel. Photo: CNN

Milagros Costabel has been completely blind since birth. Each morning, she follows a detailed skincare routine and then takes a photograph of herself, which she shares with an artificial intelligence tool through the app Be My Eyes. The app functions as a kind of mirror, offering descriptions and feedback about her appearance and suggesting whether anything should be changed.

For many blind people, the idea of seeing themselves has long felt impossible. Lucy Edwards, a blind content creator known for her work on beauty and styling, says blind people were taught to believe that appearance mattered less to them, and that they would never have access to visual self-judgement. AI has changed that assumption: by providing detailed descriptions of faces, bodies and expressions, it is fundamentally altering how blind people relate to themselves and the world around them.

AI-powered image recognition tools now offer far more than simple scene descriptions: they can provide comparisons, evaluations and advice. These features have become popular among blind users, many of whom use the technology to apply makeup, choose outfits or assess how they look before social interactions. According to Karthik Mahadevan, chief executive of Envision, one of the first companies to develop such tools, the first question many users ask is the same: how do I look?

The technology has evolved rapidly. In its early stages, AI could only generate short, basic descriptions. Today, advanced models are integrated into mobile apps, smart glasses and web-based assistants, allowing blind users to interact with visual information in real time. Some applications can even rate a person’s appearance according to conventional beauty standards and suggest changes.

For some, this access feels empowering. Edwards, who lost her sight in adulthood, says AI allows her to form an opinion about her face for the first time in over a decade. Although it does not replicate sight, she describes it as the closest alternative currently available.

However, experts in psychology and body image warn that the effects may not always be positive. Research shows that people who frequently seek feedback about their bodies often report lower body image satisfaction. Helena Lewis-Smith, an applied health psychology researcher at the University of Bristol, says AI now enables blind people to engage in the same kinds of comparisons that already affect sighted individuals.

Much of the concern lies in the way AI systems are trained. Image models often reflect narrow, Western beauty standards, favouring certain facial features and body types. These biases can be unsettling for anyone, but may be particularly difficult for blind users, who must rely entirely on textual descriptions without visual context.

According to reporting by the BBC, blind users may find it harder to challenge or contextualise AI feedback, especially when it compares them to idealised or so-called perfect versions of themselves. This can increase pressure, dissatisfaction and, in some cases, anxiety or depression.

Costabel describes uploading multiple photos of herself to an AI tool and asking questions rooted in long-standing insecurities. The responses reflected dominant, media-driven ideas of beauty, exposing her to standards she had previously been shielded from. While this information helped her understand visual concepts, it also introduced new forms of self-doubt.

Researchers such as Meryl Alper from Northeastern University argue that body image is shaped by many factors beyond appearance, including context, relationships and physical capability. AI systems, which focus almost entirely on visual traits, are unable to account for these complexities.

Another challenge is accuracy. AI tools can sometimes misread images or invent details that are not there, a problem known as hallucination. Blind users often place a high level of trust in these descriptions. Joaquín Valentinuzzi, a blind university student, says inaccurate feedback about his expressions or hair colour made him feel insecure, especially when using AI to select photos for dating profiles.

Some services attempt to address this by offering human verification on request. Others rely entirely on automated systems. Developers argue that giving users control over how descriptions are generated can help, allowing people to request neutral, brief or even poetic feedback. Yet this flexibility can also reinforce insecurities, depending on how prompts are framed.

Despite these risks, many blind users remain optimistic. The technology allows them to access information once thought permanently unavailable, from understanding how they looked on their wedding day to navigating daily life more independently. For them, AI mirrors represent both opportunity and challenge.

As the technology continues to develop, researchers stress the need for further study into its long-term emotional impact. For now, blind users are learning to live with this new kind of reflection, balancing empowerment with caution as AI reshapes how they see themselves.