Newsletter: Future thoughts: Why smart glasses are the next compute platform and why Meta's Ray-Ban smart glasses are AWESOME!
This newsletter aims to give an insight into why smart glasses will replace mobile phones as the dominant compute platform and how brands can prepare for this eventuality.
Why will smart glasses replace the phone?
The simple reason is utility. Smartphones are fantastic, but there are many things about them that inhibit human productivity, socialisation and work. Smart glasses solve these issues: they will deliver more utility, improve human connection and reduce distraction. The phone won't disappear, but smart glasses will become the key compute platform.
You're talking rubbish
Currently the industry almost collectively thinks that Meta's huge investment in Reality Labs is a mistake, because they assume the investment is all about virtual reality. Virtual reality will always be a platform, but it will be a minority one. The other investment Reality Labs is making is in smart glasses, and these will be the new compute platform.
The implications of this for brands are as significant as the implications of the growth in mobile computing, and the growth trajectory will be just as quick once the hardware is up to standard. I see it as being two years before we get a product the mass market will want to use, and then another two to three years for major adoption.
Should brands care?
As a brand, if you had set up your mobile web, content and shop strategy before your competitors, you would have had a distinct advantage. It's the same here. It is hard to know exactly how the commerce and content will work, but there are things we do know.
Issues with mobiles
We know there is device addiction. A lot of this is related to social media and the constant dopamine hit it gives, and the problem will continue to grow, with more children and adults becoming addicted and mental health issues increasing.
What makes this worse is that we also use our phones for business and admin, so we are always getting them out, flipping around and looking for apps... and then we get sucked back into social media.
Having your head down to obtain a piece of information from a mobile phone also separates you from everyone else; it breaks the connection and interaction between you and the other person.
If, however, you want some information and you're wearing smart glasses, you can simply pause, ask the glasses, keep looking at your counterpart, keep nodding, and they actually become part of the question and response when the answer comes up on the screen in front of you or is spoken in your ear.
They add to socialisation; they don't reduce it.
I have done this many times with my Meta Ray-Ban smart glasses (see image and video), and it is a much more fluid way of socialising than having to stop, look down at your phone and type.
For entertainment, having an AR screen where you can watch video, again without having to bend your head down, is truly revelatory. It enables you to sit comfortably and be entertained absolutely anywhere, and you don't have to find somewhere to rest your phone or, even worse, hold it.
AR lens tech isn't there yet. There is eyewear like Xreal that has decent AR lenses, but the glasses are a bit bulky and the field of view isn't quite large enough... but it is coming.
Work and AR smart glasses
And then there's work. For me, this has been truly fantastic. Every time I'm on Teams, instead of fiddling around with AirPods, speakers or headphones, I just have my glasses on. It's a smooth, flawless process. When I get a WhatsApp message, the glasses whisper it in my ear and I don't have to stop looking at the screen or look down at my phone.
I can also live stream from them, which I do at conferences (pretty cool), but when I am able to have my desktop or laptop screen on my lenses, I will be over the moon!
And finally, there is image capture. If you have a look at the videos I've captured on my Meta Ray-Bans, you can see how easy they make capturing video. It's also incredibly unobtrusive: by not having to pick up a phone, hold it and point it at someone, I find people are much more relaxed, act naturally and just do their thing, as you can see in the video of my daughter and me playing with the Aussie rules ball.
AI Assistant:
Meta has announced that it will be upgrading the Ray-Bans with the Meta AI assistant. This means you will be able to ask the glasses anything and have Meta's Llama LLM answer the question for you. I have been using Meta's LLM (it's open source!) as a foundation model for app development and it is very, very good. This could see AI assistant usage adopted on a whole new scale and give Meta a huge competitive advantage in the adoption stakes of GPTs, if people buy their eyewear, which I honestly believe they will.
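For the more technically minded, here is a minimal sketch of what building on an open Llama model can look like, using the Hugging Face transformers library. The model id, prompt and settings are my own illustrative assumptions, not anything Meta prescribes, and the gated Llama weights require access approval before this will run.

```python
# Minimal sketch: prompting an open Llama model via Hugging Face transformers.
# Model id, prompt and generation settings are illustrative assumptions only.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-2-7b-chat-hf",  # assumed model id; gated, needs access approval
)

prompt = "Suggest three hands-free use cases for smart glasses in retail."
result = generator(prompt, max_new_tokens=120, do_sample=True, temperature=0.7)
print(result[0]["generated_text"])
```

The point of the sketch is simply that the same family of models Meta is putting into its eyewear assistant is openly available for brands and developers to prototype with today.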
Limitations
But smart glasses are not quite there yet.
For ease of use and simplicity, the Meta Ray-Bans are great; however, they do not have AI yet, and there is no in-lens display. AR eyewear is coming out, circa 2025, under the name Orion. These will most likely have an AR screen controlled by hand gestures, where you can flick through applications and tap on things by moving your hand, or by some other unobtrusive input method.
Xreal glasses show what this AR functionality could be like, but they are 'tethered' to a power pack of sorts; that will change.
What does this mean for marketers?
At the moment, not too much. But what it will change as adoption grows is location-based data and marketing. The data these glasses have about the user is intense: they know where you are, what you are doing and what questions you are asking. There are serious data privacy issues here, but Meta, for example, seems to be respecting data privacy on eyewear, for now.
As new products come out and adoption grows, I will come back to this subject; we expect to see new mixed reality / AR eyewear releases from Apple, Meta, Google and others in the next couple of years.