Virtual Makeup on YouTube?

Google, YouTube’s parent company, recently announced a new feature that will let viewers try on makeup from the comfort of their phones. The tool, called “AR Beauty Try-On,” is expected to be released at the end of the summer. The hope is that users will be able to see whether a color works for them while watching their favorite YouTubers demo the product. But first, we have some questions on behalf of the Brown Beauty community.

Will this tool be able to accurately show what a shade will look like on different skin tones?

Will this tool be able to recognize and work for a diverse range of facial features?

Or will it perpetuate Eurocentric beauty standards?


We are by no means bashing the feature before it’s even released, but these are important things to think about. YouTube is not the first to come out with this idea, so will it learn from the mistakes of its predecessors?

AR (augmented reality) is a live view of the real world with computer-generated effects overlaid on top. It’s something we are all familiar with (just think of Snapchat’s filters). To work on your face, it uses software similar to facial recognition to locate your facial features. Unfortunately, facial recognition has been shown to work less accurately for women and people of color than it does for white men. Luckily, makeup software does not need to know who you are, only where your facial features are. That does not, however, prevent it from assuming that your lips and nose are smaller than they actually are.

Facial feature recognition is only one issue with AR. The more important one is the actual augmentation. If you have ever used Snapchat, you will notice that its “beauty filters” usually have a few things in common: they lighten your skin, shrink your nose, and enlarge your eyes. All of these changes play into a Eurocentric standard of beauty, which can be damaging to the self-esteem of brown girls (and guys). However, it’s not only the “beauty filters” that change your appearance. Snapchat’s “unfiltered” camera automatically “enhances” your picture, making you look lighter by lowering the saturation or overexposing the image.
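Snapchat’s actual processing is proprietary, but the two effects described above, overexposure and desaturation, are easy to illustrate. Here is a minimal Python sketch (the skin-tone value and adjustment factors are hypothetical examples, not anything taken from Snapchat’s code) showing how each operation pushes a deep brown pixel toward lighter or grayer values:

```python
import colorsys

def overexpose(rgb, factor=1.4):
    """Brighten a pixel by multiplying each channel, clipped at 255."""
    return tuple(min(255, round(c * factor)) for c in rgb)

def desaturate(rgb, amount=0.5):
    """Pull a pixel toward gray by scaling down its HLS saturation."""
    h, l, s = colorsys.rgb_to_hls(*(c / 255 for c in rgb))
    r, g, b = colorsys.hls_to_rgb(h, l, s * amount)
    return tuple(round(c * 255) for c in (r, g, b))

# A deep brown tone (hypothetical sample value)
brown = (139, 69, 19)

print(overexpose(brown))   # every channel gets brighter
print(desaturate(brown))   # channels converge toward gray
```

The real filters almost certainly differ in detail; the point is only that both operations, applied automatically, move darker skin tones toward lighter or more washed-out values.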

In order from left to right, with no modifications or makeup: iPhone front-facing camera, Snapchat camera, Ulta GLAMLab (with a virtual nude lipstick), Sephora Virtual Artist

Snapchat is not the only one guilty of this. Sephora’s Virtual Artist feature overexposes the live image from your camera, which, again, makes you look lighter (not to mention that its lipstick feature does not fit fuller lips as well as it does thinner ones). Unlike on Snapchat, this cannot be corrected by tapping the image. How are you supposed to know whether you like a shade if your true skin tone is not being shown?

From left to right: Sephora Virtual Artist, Ulta GLAMLab

YouTube’s AR beauty try on could be an amazing addition to the beauty world… if it learns from the mistakes of the software that came before it.

The observations about image quality in the apps above come from the author’s firsthand experience with them.

