This AR Startup Has 600 Million Users. Can You Name It? Neither Could I.
I wrote about Modiface in my most recent book, but thought I was writing about Sephora. I wrote about it again in a previous ARBW issue, but that time I thought I was writing about L’Oréal.
It’s an easy oversight. Founded in 2008, Toronto-based Modiface is privately held and is not a consumer-facing company at all.
It provides a software development kit (SDK) and real-time video technology to leading brands in the $445 billion global beauty industry. It has more than 100 partners, including Sephora, Walmart, Clinique, Smashbox, MAC, Estée Lauder, Shiseido, Dior, Tarte, YSL, Armani, Urban Decay, Bobbi Brown and Tom Ford.
According to Parham Aarabi, the University of Toronto engineering professor who founded Modiface, the company began as an attempt to use machine learning and AR to read lips. The Modiface SDK is the overwhelming choice of beauty brands because of its unique color simulation, which provides superior, more realistic product rendering in the “Magic Mirrors” appearing on upscale beauty counters all over the world.
Aarabi estimates that, so far, over 600 million shoppers have used products containing the AR software to virtually try on lipstick, eyeshadow, liner, mascara, hair color and other cosmetics. Simply by touching the screen (or mirror), people can see how they look in any particular item without actually applying it, providing a more enjoyable and efficient experience. The technology cuts down shopping time, mess and sample product waste.
The result is a measurable sales boost. According to Aarabi, partner brands report an 84 percent increase in online conversions and a 31 percent lift in stores, where 92 percent of all beauty products are still sold. Growth has accelerated significantly over the past four years and is edging toward the exponential.
The brands have different goals, I learned. Mobile apps are intended to boost sales online, while in-store, it’s about engagement and upselling.
Beauty products are a very big business, and the lessons of the Modiface experience are highly relevant to the rest of the consumer shopping industry. Already, similar technologies are appearing in retail stores such as Neiman Marcus, where “Memory Mirrors” allow shoppers to see how they would look in clothing that is still on the hanger.
If a shopper takes an outfit on a hanger and holds it in front of the mirror, the mirror shows precisely how she would look if she tried it on. She can then hold up a second or third outfit. The mirror shows how she looks in each outfit simultaneously, once again using AR to improve the shopper experience.
But customer experience is only half the story, and it is a story that applies to most brands. The other half is the extreme value of the facial data being collected.
These AR mirrors are actually computer screens equipped with 3D sensors and cameras. While shoppers try on items, the screens gather non-intrusive data. Modiface cannot recognize individual people, but it can recognize facial shapes, skin tone, hair texture and color, wrinkles and other facial characteristics.
As this data accumulates, machine learning enables Modiface to identify which facial characteristics correlate with purchases of various products at certain times of the day or year and in particular regions.
It will know what eyeliners are preferred by people who have particular facial shapes and skin tones, and how that may change between winter and summer. This allows stores to predict inventory with increased accuracy.
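The mechanics can be imagined as a simple frequency model over anonymized shopper profiles. The sketch below is purely illustrative: the profile features, product names and counting approach are all my own invention, since Modiface's actual features and models are not public.

```python
from collections import Counter, defaultdict

# Hypothetical purchase records: (face_shape, skin_tone, season) -> product bought.
# Every value here is invented for illustration.
records = [
    (("oval", "light", "winter"), "liner_charcoal"),
    (("oval", "light", "winter"), "liner_charcoal"),
    (("oval", "light", "summer"), "liner_bronze"),
    (("round", "dark", "winter"), "liner_espresso"),
    (("round", "dark", "winter"), "liner_espresso"),
    (("round", "dark", "summer"), "liner_plum"),
]

# Count purchases per anonymized profile.
counts = defaultdict(Counter)
for profile, product in records:
    counts[profile][product] += 1

def predict(face_shape, skin_tone, season):
    """Return the most frequently bought product for this profile, if any."""
    profile = (face_shape, skin_tone, season)
    if not counts[profile]:
        return None
    return counts[profile].most_common(1)[0][0]

print(predict("oval", "light", "winter"))  # most popular winter liner for this profile
```

A real system would use far richer features and a trained model rather than raw counts, but the retail payoff is the same: per-profile, per-season preferences that feed inventory forecasts.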
This is how all of retail will be transformed over the next five years, or so I believe. It starts with a simple idea that seems to make a small difference, but usage reveals that it makes a bigger difference than people originally thought. Then the tech is adopted by others for new applications. Eventually, it becomes a universal tool for any party trying to enhance customer experiences in stores, on phones or, soon, in headsets.
Thus, a startup aspiring to aid the hearing impaired becomes a beauty industry standard. The concept spills into apparel and from there, who knows where it will go.
Special thanks to my Facebook friend Jim Courtney, who pointed me to this great story.
Elsewhere in AR World
From Voice to Brain Interfaces
It is clear that the most rapidly adopted digital interface has become voice. Amazon and Google are already driving a near-exponential ascent of voice interaction as a faster, better, safer interface. But wait, there's more. There's the potential for eye interaction, and beyond that, there is technology coming out that is operated with your brainwaves. In The Fourth Transformation, we dedicated an entire chapter to Eyefluence, a startup that has since been acquired by Google.
Unlike eye-tracking software, eye interaction allows you to actually manipulate objects with your eyes. For example, I could type this newsletter using an AR headset and Eyefluence software about nine times faster than I can type on a good day.
It is faster because the eyes are the fastest part of the external body, and they are the shortest route from the outside world into the human brain. But they are not the brain itself, which is the fastest part of the entire body.
There is a fair amount of activity unfolding in something called BCI, or brain-computer interface. All of the action I have found involves healthcare, particularly patients who may not have full use of limbs, eyes or voice. In the book, I reported on Mindmaze, a Swiss med-tech company that has developed a headset connected to external sensors that pick up electrical brain impulses to treat myriad chronic issues, including schizophrenia, Parkinson's, amputation trauma and stroke trauma.
While it has produced promising results in clinical trials, the headset remains a cumbersome device weighing nearly a pound, the last time I checked. For more than ten years, researchers have been experimenting with brainwaves and prosthetics. The refinements have become impressive. Amputees are demonstrating the ability to wiggle prosthetic fingers and to sense heat by brainwave.
Now I've learned from Rob Mowery, another Facebook friend, about BCI technology from another early-phase med-tech company.
The Emotiv EPOC+ is a $799 device that enables EEG testing more easily and, supposedly, more accurately than the current method of attaching sensor plates to the skull with gel.
It is a relatively lightweight system and avoids the look of Mindmaze devices, whose impressive functions require something that resembles a prop from an old Frankenstein movie. The EPOC+ is more limited and much less expensive.
Mowery also pointed me to the Neurosky Mindwave, which is available for as little as $79.95 from Amazon.com, of all places. The device can measure EEG responses to commercials and ads for marketers.
Mowery has been trying out both devices, and his early assessment is that they are not quite ready for prime time. He has been playing with the EPOC and Google. He is getting about a 50-60 percent success rate when he really tries.
Over on Amazon, the Neurosky low-end device has a two-star rating, and the higher-end version is unrated altogether. Despite all these flaws and limitations, I believe this is a remarkable start for a technology that will change our relationship with personal digital devices. I do believe that sometime in the next decade, typing, tapping and swiping keyboards or screens will be about as commonplace as the rotary phone dial is today.
If this topic holds your interest, you may want to attend AR & the Future of Healthcare, a live, online class presented by Kristi Hansen Onkka, founder and CEO of HealthiAR, a med-tech consulting group, and myself. We will discuss these and at least 20 other AR/VR medical case studies. The class runs from 10 a.m. to noon Pacific time on March 6, and the cost is regularly $147. Use the code ARBW and save $20.
Magic Leap & the NBA
If you have been reading ARBW for a while you probably know that I am skeptical about Magic Leap becoming anything close to what it promises to be.
Despite all that, I hope to be proven wrong. I am on the side of users in terms of AR and VR. I favor more companies competing harder for our business and thus driving innovation up and prices down.
Last week, Magic Leap announced a deal with the NBA, and showed Shaq O’Neal wearing a prototype of the headsets that have been promised for release this year. When he was a player, he demonstrated just how magically he could leap. I hope the headsets actually do the same for AR when they come out.
We shall see.
According to the announcement, delivered earlier this month from the stage of Recode, a top-tier tech conference, fans in the headsets will be able to view multiple screens overlaid on a wall, or watch the game in a corner of their view while walking around. They can also watch the game on one screen while keeping Facebook open on another, or … well, you get the 3D picture.
Ultimately, the way I want it to work is that I can convert my living room, dining room table, yard or outstretched hand into a live game. I would be able to watch from the perspective of any seat in the stadium, or from the court itself. I could see the game the way a drone would, or I could sit on the bench or join a team huddle.
Magic Leap does not promise anything like that in its 2018 version. Nor does anyone else. But someday, that is likely to be how fans enjoy basketball, soccer, rugby, football or live professional wrestling.
Will Magic Leap be the one to take us there?
Maybe… Maybe not. Perhaps it will be Apple, which also seems likely to get into the headset game sometime soon. But Apple may also be looking at applications that are a lot less playful, as the following story suggests.
Could Apple be looking at Enterprise Headsets?
Last week, TechCrunch wrote that it could confirm rumors that Apple had quietly bought VRvana, a Canadian maker of the Totem, an attractive mixed-reality headset that has not yet shipped. The Totem uses technology considered superior to HoloLens for integrating AR and VR into a single headset.
The interesting part may be that VRvana has been entirely focused on enterprise applications, a target that Apple CEO Tim Cook has declared part of the company's AR future, despite most observers regarding Apple as a consumer-facing company whose products are so desirable that they wend their way into the enterprise.
This makes some sense, because most number crunchers see today's non-gaming headset market situated overwhelmingly in the enterprise, where price and appearance matter far less than in the consumer market.
Either way, this strikes me as a bit like the Magic Leap story above. If the rumor is true, then users win: innovation increases and prices fall. And if Apple is involved, you can bet the appearance of the device is likely to improve.
VR in public education
If you have read Ready Player One, Ernest Cline's excellent sci-fi novel (soon to be a movie), you have seen a vision in which the best teachers and the best classrooms exist not on Earth but in a virtual world called Ludus, where student avatars learn from virtual teachers who take them to the places they are studying, from ancient cultures to space travel.
I believe that education is an extremely promising application for VR on Earth. I have previously reported on China experimenting with virtual teachers whose look, teaching methods and pace are selected by each student. I've also written about US and UK high schoolers who take virtual tours of historic points of interest such as the White House and Buckingham Palace. Now comes this ABC report on teens at a Brooklyn high school learning about life on an upstate New York farm.
VR education is not just for school kids. Walmart is training 140,000 in-store employees with it. Energy companies are training oil rig workers before they leave land-based classrooms. Med students are learning anatomy with virtual humans, rather than frozen cadavers.
While we don't yet know the long-term effects on the developing human brain, or the implications of changing school socialization structures, it is already more than clear that immersive experiences are far more effective for teaching than lectures, textbooks and 2D training films.