A few weeks ago, I had the amazing opportunity to speak at the Future of Work event at Meta (formerly known as Facebook)! Alongside the dozen or so showcase partners demoing their solutions on the newest hardware available, a full room of people came to listen to Nathan P. King from Accenture and me, moderated by Stephanie Seeman from Meta Reality Labs, talk about a lot of fun topics and questions, like:
What is your AR/VR/XR/MR journey?
It started back in the 1980s, developing with 16-color graphics, using only 4 of those colors – black, red, blue and purple 🙂 With these colors, some rudimentary 3D calculations, and of course a cheap paper-and-celluloid pair of glasses from the back of a comic book, you could already do amazing 3D visualizations 🙂 I tried cross-eye development too, but I liked the idea that I did not have to do anything besides wearing the glasses to achieve the experience. This continued with me creating the colored celluloid glasses for more people to try it out. Of course, interaction patterns weren’t available as such, besides using a joystick and later (when moving to PCs) a mouse. I tried some other experiences as well, like light guns (they worked only with particular CRT monitors) or using sound waves to create the effect of haptic feedback – looking back, I probably looked like a crazy scientist with all this tech around me all the time. I am not old enough to have played with the original Sword of Damocles – but I surely would have done that if I had been around then 🙂
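For the curious: the core of those rudimentary 3D calculations is simple enough to fit in a few lines. This is a minimal sketch (my hypothetical reconstruction, not the original 1980s code) of the anaglyph idea – project each 3D point twice, once per eye, with a small horizontal offset, then draw the left image in red and the right in blue/cyan so the colored glasses separate them again:

```python
def project(point, eye_x, screen_z=1.0):
    """Perspective-project a 3D point (x, y, z) for an eye at (eye_x, 0, 0).

    screen_z is the assumed distance from the eyes to the screen plane.
    """
    x, y, z = point
    scale = screen_z / (screen_z + z)  # simple pinhole projection
    return ((x - eye_x) * scale + eye_x, y * scale)

def anaglyph(point, eye_separation=0.06):
    """Return (left_xy, right_xy): where to draw the red and the cyan copy."""
    half = eye_separation / 2
    return project(point, -half), project(point, +half)

# The farther the point is behind the screen, the larger the horizontal
# disparity between the red and cyan copies -- that disparity is what the
# brain reads as depth once each eye sees only its own color.
left, right = anaglyph((0.0, 0.0, 1.0))
print(left, right)
```

Points on the screen plane (z = 0) get zero disparity, so they appear flat; everything else separates into the red/cyan pair that the glasses recombine.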
The technology kept me interested for a long time, even though these early tries did not bring me the full roaring success of, say, a lucrative startup exit. So when Microsoft, the firm I worked at at the time, started to work on the Perceptive Pixel devices, I jumped on the opportunity to work on and with those devices – less 3D, more spatial experience, for sure, but I learned a lot about hardware, projection, understanding what ‘context’ means, how the physical and the digital blend, and more.
Somewhat later, when working on marketing projects for brands like Coca-Cola, Merck, Procter & Gamble and more, I kept pushing the limits again, and helped create many hugely entertaining and amusingly successful solutions – kicking virtual balls on an overhead-projected area, using your webcam to augment your hand with a bottle in it, and more 🙂
And this interest stayed with me, so when the innovation office at Morgan Stanley asked me for ideas to solve specific problems, I said ‘Spatial Computing’ so many times that they happily agreed to procure devices and support projects using them. The rest is history. What started with 1-2 devices has now grown into a device park; what started with a small POC has now grown into a portfolio of projects. Of course, we had to learn new ropes – how to get data in and out of these devices, how to solve problems and questions around device management and security, how graphical design services were at the time fundamentally about 2D design and how they had to change, and more.
What were the early goals of the proof of concepts (POCs)?
AR, VR, XR, whatever you call it, was (though I think it no longer is) an emerging technology (a side-channel introduction of https://zenith.finos.org, the Emerging Technologies SIG of FINOS/The Linux Foundation, which I co-chair). As with other similar technologies, embracing them early, as we tried to, usually turns out to be the key to learning (and, if needed, failing) fast, and to understanding how they would generate long-term ROI.
We also made sure we started small and gradually, not trying to replace processes but rather trying to figure out alternate methods for already existing ones. For this very same reason, while we have been working on specialized, sometimes one-off solutions for our employees and our clients, we haven’t embarked on a journey to create a ‘mass’ experience yet – no digital lounge or similar, but rather a focus on bespoke, tailored experiences for high-level clients and employees.
These cover a wide range of solutions – holoportation for financial advisors, e-banking, IPO pitch book augmentation, pathfinding, datacenter discovery, various physical and social trainings, a digital art gallery, and dozens more. The reactions to these POCs were overwhelmingly positive, but most of them stayed POCs, waiting for the mass availability of devices, helped along by a proper MDM (Mobile Device Management) solution. My hope is that the consumer familiarity of the vendors now entering the market will help grow the addressable market to that level. We saw a similar turn of events first with “we have a computer at work, I need a personal one at home”, which continued with “I have a (non-BlackBerry) smartphone at home, I would like to use smartphones at work”, so I hope a similar “I have a semi-entertainment, semi-professional XR device at home, can I use it at work?” is going to be the next step 🙂
From the many POCs, hallway testing, working with vendors, etc., our view has crystallized: instead of using a particular vendor’s solution (be it hardware or software), we have been looking at solutions that are more generic and applicable across multiple vendors’ platforms – this is where knowledge of development platforms like Unity or Unreal comes in handy.
What did we learn?
Beyond some of the items I mentioned above, we learned a lot of things we did not expect to – among them how to draw up a matrix of incompatible versions of Unity, Unreal, software plugins and hardware connections, flaky over-the-air updates, the different physical and virtual machine requirements, the harder-than-expected initial device management, and more.
Was it worth it? Completely and deeply, yes. In many cases we were the first enterprise company using a solution or two, or the first allowed to check out an in-progress hardware device and help find its flaws from the perspective of a finance company or an enterprise. Especially when we started on this journey, back in 2016, everything was new – graphical designers lacked the skills, software lacked non-admin install options; I could continue endlessly.
Luckily, if we were to start now, in 2023, this would be very different. We have our trusted partners for designing XR interfaces, who understand the limitations and requirements of our industry and the technology sometimes better than we do. We have elaborate integrations with our data feeds and with our device management, making it minimal hassle to onboard a new device and, most importantly, to enable people to ‘bring their own’ devices if needed to participate in the experiences.
We also saw how the words of Unity CEO John Riccitiello, from a previous AWE presentation of his, are coming true – his definition of the Metaverse was much less about the headset and the spinning 3D objects and more: “The metaverse is the next generation of the internet that is always real time, and mostly 3D, mostly interactive, mostly social and mostly persistent.” When we built our cybersecurity tool, The Wall – a 100+ foot long, ~4 foot high touchscreen where you could conjure various real-time data feeds and interact together standing in front of it – it was a good reason to soften our approach and tailor the definition of ‘metaverse’ a bit better. Similarly, many experiences can be delivered via a phone, a tablet, or even your computer screen – and then you are not affected by data security, device management, etc., and when the market and technology arrive at the right point, if your solution used something like Unity or Unreal, you would be able to easily transfer it to an actual XR device.
What is the advice I would give to someone trying to start on this journey in their organization now?
Although you are not necessarily a pioneer in the field anymore, you would be one in your company. You have to be brave and bold 🙂 Will all solutions work out of the box? Likely not, but we know the world has been moved ahead by people thinking outside the box.
Make sure to watch / read a lot of sci-fi 😀 Many of the ideas explained in Star Trek became reality in the decades since – tablets, communicators, and more. It will surely give you a good base for inspiration.
When it comes to your actual projects – first, think about augmenting an existing process instead of outright replacing something; this will make it an easier sell for sure. The most important point, though, is to find tech-savvy sponsors from day one – it will help propel your projects forward tremendously. What do I mean by this? At the actual event, when asked who hadn’t tried such an experience yet, only around a quarter of the people raised their hand. This means they knew that using the device wouldn’t expose them to the ‘ridiculous’ factor – i.e., most of the room had already worn these strange contraptions on their head and seemingly 1) survived it, 2) kept their job after being seen wearing one (not necessarily on the street, we are probably not there yet 😀 ). In a similar situation in a C-suite boardroom, most likely everyone would skip wearing the devices, as it would run the risk of them looking ridiculous.
In conclusion, the Future of Work event at Meta not only showcased the exciting developments in XR but also gave Nathan and me a wonderful opportunity to share the valuable lessons learned on our journey. Do not hesitate – please do join us: by embracing immersive technologies, organizations can unlock new possibilities, enhance existing processes, and create transformative experiences that shape the Future of Work.