Metaverse: What Now?!
The “Metaverse” is a term that has really taken off in recent months and is buzzing throughout investment publications and software developer collateral. I pause before I even use the word because, frankly, Facebook needed to reboot its fledgling multi-billion-dollar investment and VR agenda with a re-branding strategy… hence the company name change to “Meta”.
Despite all the great advancements in VR systems and real-time 3D engines such as Unreal Engine and Unity, my foreseeable reality for the Metaverse is not the cinematic-quality one that most artists, game companies, and Virtual Production studios would lead you to believe. Don’t get me wrong: the hardware and engines are capable of delivering highly realistic, fully raytraced, dense geometry models and environments. Modern shader and material workflows add realism with multi-layered PBR shaders that often combine 4-6 separate images into one single material. Adobe has even jumped on that bandwagon by acquiring Substance.
A 5MB download is obviously lighter in file size and friendlier to data plans than a 5GB experience.
The Cost of Hyper-Realism
Unfortunately, this realism comes at a very high cost: multi-gigabyte assets take a long time to download and require costly video cards that can handle all the texture memory and instanced geometry.
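To put that cost in numbers, here is a back-of-the-envelope sketch of the VRAM one hyper-realistic material can eat. All the figures are illustrative assumptions (4K maps, uncompressed RGBA8 at 4 bytes per texel, a full mip chain adding roughly a third), not measurements of any particular engine:

```python
# Rough GPU memory footprint for one material's texture set.
# Assumptions: 4K square maps, RGBA8 (4 bytes/texel), full mip chain (~4/3x).

BYTES_PER_TEXEL = 4
MIP_OVERHEAD = 4 / 3

def texture_mb(resolution, map_count):
    """Approximate VRAM in MB for `map_count` square maps of `resolution`."""
    texels = resolution * resolution
    return map_count * texels * BYTES_PER_TEXEL * MIP_OVERHEAD / (1024 ** 2)

# A 6-map PBR material (albedo, normal, roughness, metallic, AO, height)
# versus a single baked diffuse map.
print(f"6-map 4K PBR material: ~{texture_mb(4096, 6):.0f} MB of VRAM")
print(f"single 4K diffuse map: ~{texture_mb(4096, 1):.0f} MB of VRAM")
```

Multiply that by every unique material in a scene and it becomes clear why hyper-realistic content demands high-end video cards.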
These dense 3D payloads can be accommodated by gamers and lovable nerds like me who have the available hardware, time, dedication, and effort to make it happen. The casual user, however, will be on a mobile device, tablet, or laptop, often downloading 3D assets over a mobile data plan, and will only be capable of interacting with more traditional 3D experiences. Most modern web browsers ship with WebGL support, which makes the browser a great universal platform for interactive 3D experiences.
To refresh your memory: starting around 2012, as AR, then VR, and then the catch-all “XR” took off, the market became saturated with snake-oil salesmen promising the hottest new immersive marketing technology and production. BMW had a famous 360 video ad of Gigi Hadid driving in a shell game of identical cars around the 360 camera. It may have been cool technically, but it wasn’t very cool when mobile users were smacked with a huge spike of data, downloading 100MB+ from their Facebook app!
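The pain of that 100MB+ spike is easy to quantify. A quick sketch, using an assumed ~10 Mbps mobile link (a ballpark figure, not a measurement) and ignoring latency and protocol overhead:

```python
# Seconds to pull a payload over a link, ignoring latency and overhead.
# Link speed is an assumed ballpark for a mobile connection.

def download_seconds(payload_mb, link_mbps):
    """payload in megabytes, link speed in megabits per second."""
    return payload_mb * 8 / link_mbps

# A 100 MB 360 video ad versus a 5 MB interactive 3D ad.
print(f"100 MB over 10 Mbps: {download_seconds(100, 10):.0f} s")
print(f"  5 MB over 10 Mbps: {download_seconds(5, 10):.0f} s")
```

Over a minute of stalling (and a real dent in a data cap) versus a few seconds: that difference is the whole argument for lightweight 3D.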
Real-time 3D For Mass Consumption
During this period, I was Director of 3D Production at an online advertising company which, at the time, had invested in a proprietary real-time 3D engine and interactive platform that could serve 3D experiential ads through standard banner ad delivery. This was a great advertising channel for most product verticals, since user interactions could be tracked for metrics: what users looked at most, which color options were most desired, time-in-ad, and so on.
The magic of this platform was that the typical 3D ad could fit into a very reasonable 5MB download. Due to a company-wide change of mission, this product was unfortunately binned; during its run, however, we were THE go-to for the highest-quality 3D ads available. Out of a total of seventy ads, we won a dozen coveted advertising awards for clients such as Fiat-Chrysler, Dell, Boeing, Lexus, and Disney. The metrics for these ads were off the charts, and our clients came back for more. We even adapted our ads for optional use with Google Cardboard-compatible headsets, making us the first to ever serve a VR ad at scale through banner ad inventory; it was a Honda Odyssey ad.
It was amazing what we could pull off within our file size limitations. One executive at a luxury car company thought that one of our automotive ads was a pre-rendered image until we started interacting with it! We could bundle a low poly car (exterior *and* interior), along with one high-resolution texture map that contained all the baked textures and surfaces from the interior CAD files, and it looked as good as a 360 spherical photo, except there was nice parallax and we could animate the camera through the interior.
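As a feel for how such a budget might break down, here is a hypothetical allocation of a 5MB ad like the ones described. Every line item and number below is an illustrative assumption, not the real pipeline’s accounting:

```python
# Hypothetical breakdown of a 5 MB banner-ad budget.
# All sizes are illustrative assumptions, in kilobytes.

BUDGET_KB = 5 * 1024

assets_kb = {
    "low-poly car mesh (exterior + interior), compressed": 900,
    "single 2048x2048 baked interior texture (JPEG)": 1400,
    "environment tile + misc textures": 300,
    "engine + ad logic (gzipped)": 1800,
}

total = sum(assets_kb.values())
print(f"total: {total} KB of {BUDGET_KB} KB budget "
      f"({BUDGET_KB - total} KB headroom)")
```

The key move is visible even in this toy budget: one big baked texture carries almost all the visual richness, so the geometry can stay cheap.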
Because most of our texture space was allotted to the interior textures, we had to be very efficient with the surrounding environment, which was often a single, strategically repeated texture map.
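That repetition is where the bandwidth savings come from: a small tileable texture can cover the same area as one enormous unique map. The file sizes below are assumed compressed sizes for illustration only:

```python
# Why tileable textures matter for bandwidth: one small repeating tile
# versus one giant unique map covering the same surface area.
# Sizes are assumed compressed file sizes, for illustration.

TILE_KB = 150         # one 512x512 tileable texture
UNIQUE_MAP_KB = 9000  # one 8K unique texture covering the same area

savings = 1 - TILE_KB / UNIQUE_MAP_KB
print(f"the repeating tile downloads {savings:.0%} less data")
```

The GPU repeats the tile for free at render time, so the only real cost of tiling is the artistry needed to hide the seams.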
Optimizing Assets for Mobile Delivery
We almost NEVER used bump maps, and used a normal map only once or twice across our entire portfolio. Adding those extra maps would have increased texture download time and hurt rendering performance on mobile screens.
What I’m getting at here is that mobile 3D needs to be swift and efficient, using as little geometry and as few texture draw calls as possible. At Facebook’s OC5 conference, the first Quest VR headset was introduced to the world and to the developers in attendance. I was lucky to attend the design lectures meant to help developers optimize their apps and games to perform properly within the new headset.
The first Quest headset was basically a mobile phone packed into a headset form factor, so content design required major simplification: removal of all the extra material shader maps, such as bump maps, normal maps, specular maps, ambient occlusion maps, and so on. All that remained in the 3D scene was geometry, a diffuse map for most materials, and simple lighting setups.
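That simplification step can be sketched in a few lines. Here a material is modeled as a plain dictionary of map slots; the slot and file names are generic assumptions, not any particular engine’s API:

```python
# Sketch of the mobile-VR simplification described above: keep only the
# diffuse (base color) map and drop every other shader map.
# Slot and file names are hypothetical, not a real engine's API.

FULL_MATERIAL = {
    "diffuse": "car_paint_albedo.png",
    "normal": "car_paint_normal.png",
    "specular": "car_paint_spec.png",
    "ambient_occlusion": "car_paint_ao.png",
    "bump": "car_paint_bump.png",
    "height": "car_paint_height.png",
}

def strip_for_mobile(material, keep=("diffuse",)):
    """Drop every shader map except the slots listed in `keep`."""
    return {slot: tex for slot, tex in material.items() if slot in keep}

mobile_material = strip_for_mobile(FULL_MATERIAL)
print(mobile_material)  # only the diffuse slot survives
```

In practice the detail from the discarded maps is baked into that surviving diffuse map ahead of time, which is exactly the trade the Quest design lectures were teaching.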
(Diagram: six separate shader maps consolidated into a single diffuse map)
The Over-Promise of Marketing and Trendy Buzzwords
Most 3D game engines today tout how much geometry, real-time lighting, and post-FX they can handle, and most texture map companies are pushing ever-higher resolutions with full ranges of support maps for PBR shaders. This is all well and good, but there is still a great need for texture maps that can hold their own as a tileable texture within a 3D or VR scene, so that download times are minimized and bandwidth is used efficiently.
Many texture designers build materials around a base color, or “albedo”, map, and shaders that use the albedo map depend on lighting and shadows to add realism. At Texturama, we design and remaster our diffuse color maps to contain a subtle amount of shading within: enough that a map looks realistic on its own, but subtle enough that if you add more shader map types, the material will still look natural and adapt to your custom lighting designs when your efficiency thresholds are less stringent.
Snake Oil vs Reality
The Metaverse is in a “Wild West” phase and will remain fragmented until there is a clear winner in the battle for dominance. Facebook has claimed the name Meta, but that doesn’t mean they own the space (yet). Apple is still quietly doing its thing and has been hiring and employing 3D talent for many years now; it will release its offering when it is market-tested and functional enough to be successful. One could say that Second Life was one of the early successful Metaverses, but it too is fragmented from the rest of the web: a walled garden with its own economy, its own culture, and its own ways of policing and litigation.
Until there is a unified Metaverse, I believe the space will remain fragmented and will mostly live on the World Wide Web. For the best user experience and fastest adoption, it will need to be lightweight, quick to load, smooth in frame rate, glasses-free, and universally consumable through web browsers on devices large and small.
At the very least, a lightweight metaverse can always live alongside a dense one.