a16z: Metaverse Unlocks New Opportunities in Game Infrastructure
DeFi之道
Guest columnist
2022-06-27 03:20
How can we shed the legacy of the walled garden to unlock the potential of the Metaverse?

Original author: James Gwertzman, games partner at a16z

Translated by: DeFi之道

You install that new parkour game everyone's talking about, and your avatar instantly gains a new set of skills. After a few minutes in the tutorial level, climbing walls and vaulting obstacles, you're ready for a bigger challenge. You teleport into your favorite game, Grand Theft Auto: Metaverse, follow a tutorial course set up by another player, and soon you're leaping over car hoods and jumping from rooftop to rooftop. Wait a minute... what's that glow down there? A super-evolved Charizard! You pull a Poké Ball from your pocket, capture it, and get back on the road...

This gaming scenario can't happen today, but it seems to me that it will happen in our future. I believe the concepts of composability (recycling, reusing, and recombining basic building blocks) and interoperability (getting components of one game to work in another) are emerging in games and will revolutionize how games are built and experienced.

Game developers will build faster because they won't have to start from scratch every time. They will build more creatively, freer to try new things and take new risks. And there will be more games, because the barrier to entry will be lower. The very definition of a game will expand to include new "meta-experiences" that, like the example above, work both inside and outside of other games.

Of course, any discussion of "meta-experiences" also sparks discussion of another much-hyped idea: the Metaverse. It's true that many see the Metaverse as simply a very well-crafted game, but its potential is far greater than that. Ultimately, the Metaverse represents the entirety of how we humans will interact and communicate online in the future. In my opinion, game creators -- building on game technology and game production processes -- will be the key to unlocking the potential of the Metaverse.

Why game creators? No other industry has such extensive experience building large-scale online worlds where hundreds of thousands (sometimes tens of millions!) of participants interact simultaneously. Modern games are about more than just "playing" -- "trading," "skilling," "streaming," and "buying" matter just as much. The Metaverse will add even more verbs to that list -- think "work" or "love." Just as microservices and cloud computing kicked off a wave of innovation in the tech industry, I believe the next generation of gaming technology will usher in a new generation of gaming innovation and creativity.

This is already happening in limited ways. Many games now support user-generated content (UGC), which lets players build their own extensions to existing games. Some games, like Roblox and Fortnite, have become so extensible that they've taken to calling themselves metaverses. But the current generation of gaming technology, still built primarily for standalone games, can only take us so far.

This post outlines my vision for a transformative phase of gaming, then dissects the new areas of innovation needed to usher in this new era.


Upcoming Game Change

Games have long been primarily a single, fixed experience. The developers will build them, release them, and start building the sequel. Players buy them, play them, and then move on to other games after they've beaten all of the game's content -- usually in just 10 to 20 hours of gameplay.

We are now in the games-as-a-service era, where developers continually update their games after release. Many of these games also host UGC and adjacent experiences in their virtual worlds, such as virtual concerts and educational content. Roblox and Minecraft even have marketplaces where player creators can get paid for their work.

Crucially, however, these games remain (purposefully) isolated from each other. While their respective worlds may be large, they're closed ecosystems, and nothing can be transferred between them -- not resources, skills, content, or friends.

So how do we shed the legacy of the walled garden to unlock the potential of the Metaverse? As composability and interoperability become important concepts for metaverse-minded game developers, we will need to rethink how we approach the following (a minimal sketch of what such a portable profile might look like follows this list):

  • Identity. In the Metaverse, players will need a single identity that they can carry between games and between gaming platforms. Today's platforms insist on their own player profiles, so players must tediously rebuild their profiles and reputations from scratch with each new game they play.

  • Friendship. Likewise, today's games maintain separate friend lists -- or at best, use Facebook as the de facto source of truth. Ideally, your network of friends would follow you from game to game, making it easier to find friends to play with and to share competitive leaderboard information.

  • Property. Currently, items you acquire in one game cannot be transferred to or used in another game -- and for good reason. A game that lets players bring modern assault rifles into a medieval setting might be fun for a while, but it would quickly ruin the experience. With the right constraints in place, however, exchanging (some) items across games can open up new levels of creativity and emergent play.

  • Gameplay. Games today are tightly coupled to their gameplay. For example, all the fun of the "platformer" genre, with titles like Super Mario Odyssey, comes from mastering movement through a virtual world. But by opening up games and allowing their elements to be remixed, players can more easily create new experiences and explore their own "what-if" narratives.
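To make this concrete, here is a minimal sketch, written in TypeScript purely for illustration, of what a portable player profile might look like if identity, friends, property, and abilities traveled with the player. Every type and field name here is hypothetical; nothing below corresponds to an existing platform or standard.

```typescript
// Hypothetical shape of a portable, cross-game player profile. These types
// only illustrate what "identity + friends + property + gameplay abilities"
// could look like if they were carried between games and platforms.

interface PortableItem {
  tokenId: string;                       // on-chain or platform-native identifier
  kind: "cosmetic" | "tool" | "companion";
  allowedWorlds: string[];               // "*" means any world may admit it
}

interface PortableProfile {
  did: string;                           // a single identity, e.g. "did:example:abc123"
  displayName: string;
  reputation: number;                    // carried across games instead of rebuilt per game
  friends: string[];                     // identities of friends, shared across titles
  inventory: PortableItem[];             // items that may (with constraints) cross games
  abilities: string[];                   // e.g. ["parkour.wallrun", "parkour.vault"]
}

// A receiving world still decides which parts of the profile it honors.
function admitItem(item: PortableItem, worldId: string): boolean {
  return item.allowedWorlds.includes("*") || item.allowedWorlds.includes(worldId);
}
```

The key design point is that the receiving game keeps veto power: portability does not mean every item or ability is valid everywhere, only that the data can travel.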


I see these changes happening at three clear levels of game development: technical (game engine), creative (content production), and experiential (live operations). At each layer, there are obvious opportunities for innovation, which I describe below.



Tech Layer: Reimagining the Game Engine

At the heart of most modern game development is the game engine, which powers the player experience and makes it easier for teams to build new games. Popular engines like Unity or Unreal provide common functionality that can be reused in games, freeing up game creators to build something unique to their game. Not only does this save time and money, but it also levels the playing field, allowing smaller teams to compete against larger teams.

That said, the fundamental role of a game engine relative to the rest of a game hasn't really changed in the past 20 years. While engines have increased the number of services they provide -- expanding from just graphics rendering and audio playback to multiplayer and social services, post-launch analytics, and in-game advertising -- they are still delivered primarily as code libraries, packaged in full into each game.


However, when considering the Metaverse, the engine plays a much more important role. To break down the walls that separate one game or experience from another, games may be encapsulated and hosted in an engine, rather than the other way around. In this expanded view, engines become platforms, and the communication between these engines will largely define what I think of as a shared metaverse.

Take Roblox, for example. The Roblox platform provides the same key services as Unity or Unreal, including graphics rendering, audio playback, physics, and multiplayer. However, it also offers other unique services, such as player avatars and identities that can be shared across its game catalog; expanded social services, including shared friend lists; robust safety features to help keep communities safe; and a library of tools and assets to help players create new games.

Interoperability and Composability

For example, in order to unlock the Metaverse and allow hunting of Pokemon in the Grand Theft Auto universe, these virtual worlds will require an unprecedented level of cooperation and interoperability. While it is possible for a single corporation to control the common platform that powers the global Metaverse, that is neither desirable nor likely. Instead, a decentralized game engine platform is more likely to emerge.

Of course, I can't talk about decentralized technologies without mentioning web3. Web3 refers to a set of blockchain-based technologies that use smart contracts to decentralize ownership by transferring control of key networks and services to users/developers. In particular, concepts such as composability and interoperability in web3 help to solve some of the core issues faced when moving towards the metaverse, notably identity and property, and a lot of research and development is going into the core web3 infrastructure.

That said, while I believe web3 will be a key component in reimagining game engines, it's not a panacea.

Perhaps the most obvious application of web3 technologies in the Metaverse is allowing users to buy and own items in the Metaverse, such as virtual real estate plots or clothing for digital avatars. Since transactions written to the blockchain are a matter of public record, a player who purchases an item as a non-fungible token (NFT) could theoretically own that item and use it across multiple Metaverse platforms, as well as in several other applications.
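For the ownership piece, reading who owns an NFT is already straightforward today. Below is a minimal sketch using the ethers.js library against a standard ERC-721 contract; the RPC URL, contract address, and token ID are placeholders, and a real integration would also need chain selection, caching, and error handling.

```typescript
import { ethers } from "ethers";

// Minimal ERC-721 ownership check (ethers v6). The RPC URL, contract
// address, and token ID used here are placeholders, not real deployments.
const provider = new ethers.JsonRpcProvider("https://rpc.example.org");
const erc721Abi = ["function ownerOf(uint256 tokenId) view returns (address)"];
const itemContract = new ethers.Contract(
  "0x0000000000000000000000000000000000000000",
  erc721Abi,
  provider
);

// Returns true if the given wallet currently owns the token.
async function ownsItem(wallet: string, tokenId: bigint): Promise<boolean> {
  const owner: string = await itemContract.ownerOf(tokenId);
  return owner.toLowerCase() === wallet.toLowerCase();
}
```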

However, I don't think this will happen in practice until the following issues are addressed:

  • Single user identity, so players can move between virtual worlds or games with a single, consistent identity. This is necessary for everything from matchmaking to content ownership to stopping trolls. One service trying to solve this problem is Hellō, a multi-stakeholder collaboration aiming to transform personal identity around a user-centric vision, built primarily on web2 centralized identity. Others use a web3 decentralized identity model, such as Spruce, which lets users control their digital identities through their wallet keys. Meanwhile, Sismo is a modular protocol that uses zero-knowledge proofs to enable decentralized identity management and more.

  • Common content formats, so content can be shared between engines. Today, each engine has its own proprietary format, which is necessary for performance. To exchange content between engines, however, a standard open format is required. One such standard is Pixar's Universal Scene Description (USD), which originated in film; NVIDIA's Omniverse builds on it. Standards are needed for all content types.

  • Cloud-based content storage, so that content needed by one game can be located and accessed by others. Today, the content required by a game is typically either packaged with the game as part of the distribution package, or made available for download over the web (and accelerated by a content delivery network or CDN). For content to be shared between worlds, there needs to be a standard way to find and retrieve this content.

  • A shared payment mechanism, so metaverse owners have a financial incentive to allow assets to be transferred between metaverses. Digital asset sales are one of the main ways platform owners are compensated, especially under the “free to play” business model. Thus, to incentivize platform owners to loosen control, asset owners can pay a “corkage fee” for the privilege of using their assets within the platform. Alternatively, if the asset in question is famous, the metaverse might also be willing to pay the asset owner to bring their asset into their world.

  • Negotiated look and feel, so content assets can change their appearance to match the world they are entering. For example, if I have a high-tech sports car that I want to drive into your strictly steampunk-themed world, my car may have to be turned into a steam-powered buggy to be allowed in. It could be that my assets know what to do, or that the metaverse world I'm entering is responsible for providing an alternate look and feel. (A hypothetical asset descriptor combining these requirements is sketched after this list.)
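One way to picture these requirements working together is a shared asset descriptor that any engine could read before admitting an asset. The sketch below is purely illustrative: the field names, the fee structure, and the idea of per-theme variants are my assumptions, not an existing specification (USD and glTF are real interchange formats, but their use here is hypothetical).

```typescript
// Hypothetical descriptor for an asset that can travel between worlds. It
// bundles the requirements above: an open content format, cloud-hosted
// content, payment terms for the hosting world, and negotiated appearance.

interface CrossWorldAsset {
  id: string;                                      // globally unique asset id
  owner: string;                                   // wallet or identity of the owner
  format: "usd" | "gltf";                          // an open interchange format
  contentUri: string;                              // where any engine can fetch the content
  usageFee?: { currency: string; amount: number }; // optional "corkage fee" for the host world
  variants: Record<string, string>;                // theme -> alternate content URI
}

// A steampunk world picks the matching variant, or rejects the asset outright.
function resolveAppearance(asset: CrossWorldAsset, theme: string): string | null {
  if (asset.variants[theme]) return asset.variants[theme];
  return theme === "default" ? asset.contentUri : null;
}
```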


Improved Multiplayer Systems

A big area of focus is the importance of multiplayer and social features. More and more games today are multiplayer, because games with social features are vastly superior to single-player ones. Because the Metaverse will, by definition, be entirely social, it will be subject to the issues unique to online experiences. Social games must worry about harassment and toxicity; they are more vulnerable to DDoS attacks that drive players away; and they often must run servers in data centers around the world to minimize player latency and provide the best possible experience.

Given the importance of multiplayer functionality to modern games, there is a surprising lack of full-featured, off-the-shelf solutions. Engines like Unreal or Roblox and services like Photon or PlayFab provide the basics, but developers still have to fill in gaps such as advanced matchmaking themselves.

Innovations in multiplayer game systems may include:

  • Serverless multiplayer, so developers can implement authoritative game logic and have it automatically hosted and scaled in the cloud, without having to worry about spinning up actual game servers.

  • Advanced matchmaking, to help players quickly find others of a similar level to play against, including AI tools to help determine player skill and ranking. In the Metaverse, this becomes even more important because "matchmaking" gets much broader (e.g., "find me another person to practice my Spanish with," rather than just "find me a group of players to raid a dungeon with"). A toy sketch of rating-based matchmaking follows this list.

  • Guilds or Clans, to help players unite with other players, compete with other groups, or just for a more social sharing experience. Virtual worlds are also full of opportunities for players to join forces with other players in pursuit of common goals, creating opportunities for services such as creating or managing guilds, and syncing with external community tools such as Discord.
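As a concrete illustration of the matchmaking item above, here is a toy rating-based matcher that pairs players whose skill ratings fall within a window that widens the longer they wait. It is a sketch of one common approach, not any particular service's algorithm, and a real system would also consider latency, party size, and role.

```typescript
// Toy rating-based matchmaker: pair players whose ratings are within a
// window that widens the longer they have been waiting in the queue.

interface QueuedPlayer {
  id: string;
  rating: number;    // e.g. an Elo-style skill estimate
  queuedAt: number;  // ms timestamp when the player entered the queue
}

function findMatch(queue: QueuedPlayer[], now: number): [QueuedPlayer, QueuedPlayer] | null {
  for (let i = 0; i < queue.length; i++) {
    for (let j = i + 1; j < queue.length; j++) {
      const a = queue[i];
      const b = queue[j];
      // Base window of 100 rating points, widened by 10 points per second waited.
      const secondsWaited = Math.min(now - a.queuedAt, now - b.queuedAt) / 1000;
      const window = 100 + 10 * secondsWaited;
      if (Math.abs(a.rating - b.rating) <= window) return [a, b];
    }
  }
  return null; // no acceptable pairing yet; try again on the next tick
}
```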


Automated Testing Services

Testing is a costly bottleneck when releasing any online experience, as a small group of game testers must iterate through the experience to ensure that everything works as expected, without glitches or bugs.

Games that skip this step do so at their peril. Consider the recent launch of the highly anticipated Cyberpunk 2077, which was heavily criticized by players for the number of bugs present at launch. However, because the Metaverse is essentially an "open world" game with no set path, testing it can be prohibitively expensive. One way to relieve this bottleneck is to develop automated testing tools, such as AI agents that can play games like players do, looking for glitches, crashes, or bugs. A side benefit of this technology is believable AI players that can be swapped in for real players who unexpectedly quit a multiplayer match, or that can provide early multiplayer "liquidity" to reduce the time players must wait for a match to start.

Innovations in automated testing services may include (a minimal playtest-agent sketch follows this list):

  • Automatically train new agents by observing real players interacting with the world. One benefit of this is that the agent will continue to get smarter and more believable the longer the virtual world runs.

  • Swapping in AI agents for real players, so that if one player suddenly quits a multiplayer experience, it doesn't end for everyone else. This feature also raises some interesting questions, such as whether players could "tag in" an AI substitute at any time, or even "train" their own substitutes to compete on their behalf. Will "AI-assisted" play become a new category of competition?
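To make the test-agent idea more tangible, here is a minimal sketch of a random-walk playtest loop that exercises a game world and reports failures. The GameWorld interface is hypothetical; it stands in for whatever scripting API a real engine exposes. The useful part is the structure: act pseudo-randomly from a fixed seed so any failure can be reproduced.

```typescript
// Minimal random-walk playtest harness. GameWorld is a hypothetical interface
// standing in for an engine's scripting API; the loop acts randomly from a
// fixed seed, watches for errors, and reports reproducible failures.

interface GameWorld {
  availableActions(): string[];
  perform(action: string): void;
  hasCrashed(): boolean;
}

function playtest(world: GameWorld, steps: number, seed: number): string[] {
  const issues: string[] = [];
  let state = seed; // seed must be a positive integer so runs are reproducible
  const nextRandom = () => (state = (state * 48271) % 2147483647) / 2147483647;

  for (let step = 0; step < steps; step++) {
    const actions = world.availableActions();
    if (actions.length === 0) break; // nothing left to do in this state
    const action = actions[Math.floor(nextRandom() * actions.length)];
    try {
      world.perform(action);
    } catch (err) {
      issues.push(`step ${step} (seed ${seed}): '${action}' threw ${err}`);
    }
    if (world.hasCrashed()) {
      issues.push(`step ${step} (seed ${seed}): world crashed after '${action}'`);
      break;
    }
  }
  return issues; // an empty array means the run completed cleanly
}
```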


The Creative Layer: Reimagining Content Production

As 3D rendering technology becomes more powerful, the amount of digital content required to create games continues to increase. Consider the latest Forza Horizon 5 racing game -- the largest Forza download ever, requiring over 100 GB of disk space, compared to 60 GB for Horizon 4. And this is just the tip of the iceberg: the original "source art" files, the files created by artists and used to build the final game, can be many times larger. Assets keep growing because these virtual worlds continue to grow in size and quality, with higher levels of detail and fidelity.

Now consider the Metaverse. As more and more experiences move from the physical to the digital world, the demand for high-quality digital content will continue to increase.

This has already happened in film and television. The recent Disney+ show The Mandalorian broke new ground by shooting on "virtual sets" running in the Unreal game engine. This was revolutionary because it reduced production time and cost while increasing the range and quality of the finished product. I expect more and more productions to be shot this way in the future.

Also, unlike physical film sets, which are usually torn down after shooting because keeping them intact is costly, digital sets can easily be stored for future reuse. In fact, it makes sense to invest more money rather than less, building fully realized worlds that can later be reused to produce fully interactive experiences. Hopefully, in the future, we'll see these worlds made available to other creators to create new content within these fictional realities, furthering the evolution of the Metaverse.

Given these challenges, I see three major areas of innovation in digital content production: 1) AI-assisted content creation tools, 2) cloud-based asset management, build and publish systems, and 3) collaborative content generation.


AI-Assisted Content Creation

Today, almost all digital content is still built by hand, adding to the time and cost required to release modern games. Some games have experimented with "procedural content generation," where algorithms can help generate new dungeons or worlds, but building those algorithms themselves can be very difficult.

However, a new wave of AI-assisted tools is coming that will help artists and non-artists alike create content faster and with higher quality, reduce the cost of content production, and democratize the production of game tasks.

This is especially important for the Metaverse, since nearly everyone will be expected to be a creator -- but not everyone will be able to create world-class art. By art, I mean the entire class of digital assets, including virtual worlds, interactive characters, music, sound effects, and more.

Innovations in AI-assisted content creation will include conversion tools that can convert photos, videos, and other real-world artifacts into digital assets such as 3D models, textures, and animations. Examples include Kinetix, which creates animations from video; Luma Labs, which creates 3D models from photos; and COLMAP, which creates navigable 3D spaces from still photos.

An important aspect of using AI-assisted content creation in game creation is reproducibility. Since creators must often go back and make changes, simply storing the output of an AI tool is not enough. Game creators must store the entire set of instructions that created that asset so that artists can go back later and make changes, or duplicate the asset and modify it for a new purpose.
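One simple way to picture that requirement is to store a "recipe" alongside every generated asset rather than only the output. The sketch below uses hypothetical field names and does not correspond to any existing tool's format.

```typescript
// Hypothetical "recipe" stored next to an AI-generated asset so it can be
// regenerated, tweaked, or forked later, instead of keeping only the output.

interface AssetRecipe {
  tool: string;                                  // e.g. "text-to-texture-generator"
  toolVersion: string;                           // exact version, so results are reproducible
  inputs: string[];                              // URIs of source photos, video, or prompts
  parameters: Record<string, number | string>;   // every knob the artist turned
  seed: number;                                  // random seed used by the generator
  outputHash: string;                            // hash of the produced asset, for verification
}

// Forking an asset: copy the recipe, override some parameters, regenerate.
function fork(recipe: AssetRecipe, overrides: Record<string, number | string>): AssetRecipe {
  return {
    ...recipe,
    parameters: { ...recipe.parameters, ...overrides },
    outputHash: "", // cleared until the derivative asset is regenerated
  };
}
```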


Cloud-Based Asset Management, Build, and Publish Systems

One of the biggest challenges game studios have to face when building modern video games is managing all the content needed to create an engaging experience. Today, it's a relatively unsolved problem with no standardized solution; each studio has to cobble together its own.

To understand why this is such a difficult problem, consider the sheer amount of data involved. Large games can require millions of files of different types, including textures, models, characters, animations, levels, visual effects, sound effects, recorded dialogue, and music.

Each of these files is changed repeatedly during production, and a copy of every variation must be kept in case a creator needs to go back to an earlier version. Today, artists often meet this need by simply renaming files (for example, "forest-ogre-2.2.1"), which leads to file proliferation. Due to the nature of these files, this takes up a lot of storage space: they are often large and hard to compress, and each revision must be stored separately. This differs from source code, where each revision can be stored as just its changes, which is very efficient; for many content files, such as artwork, changing even a small portion of an image changes almost the entire file, so storing deltas doesn't help.
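One common way teams tame the rename-and-hope problem is content-addressed storage, where every revision is keyed by a hash of its bytes. This doesn't shrink the data, but it deduplicates identical revisions and gives every version a stable name. A minimal Node.js sketch follows; the directory layout is my own illustration, not a real tool's.

```typescript
import { createHash } from "node:crypto";
import { mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";

// Store an asset revision under a key derived from its contents, so each
// distinct version is kept exactly once and can always be retrieved by hash.
function storeRevision(storeDir: string, assetPath: string): string {
  const bytes = readFileSync(assetPath);
  const key = createHash("sha256").update(bytes).digest("hex");
  mkdirSync(storeDir, { recursive: true });
  writeFileSync(join(storeDir, key), bytes);
  return key; // e.g. record "forest-ogre" -> key in a separate manifest
}
```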

Furthermore, these files do not exist in isolation. They are part of an overall process commonly called the content pipeline, which describes how all of these individual content files come together to create a playable game. In this process, the "source art" files created by artists are converted and assembled into "game assets" through a series of intermediate files for use by the game engine.

Today's pipelines are not very intelligent and are often unaware of the dependencies that exist between assets. For example, the pipeline usually doesn't know that a particular texture belongs to the 3D basket held by a particular farmer character who lives in a particular level. Therefore, whenever any asset changes, the entire pipeline must be rebuilt to ensure all changes are picked up and merged. This is a time-consuming process that can take hours or more, slowing down the pace of creative iteration.

The demands of the Metaverse will exacerbate these existing problems and create some new ones. For example, the Metaverse is going to be huge - bigger than the biggest games today - so all existing content storage issues apply to the Metaverse. Additionally, the "always-on" nature of the Metaverse means that new content needs to be streamed directly into the game engine; it is not possible to "stop" the Metaverse to create new builds. The Metaverse needs to be able to update itself on the fly. To achieve composability goals, remote and distributed creators will need ways to access source assets, create their own derivatives, and then share them with others.

There is also a lot of work to be done to automate the content pipeline itself, which could modernize and standardize the art pipeline. This includes exporting source assets to intermediate formats and building those intermediate formats into game-ready assets. A smart pipeline would understand the dependency graph and be able to do incremental builds, so that when an asset changes, only the files with downstream dependencies are rebuilt -- drastically reducing the time it takes to see new content in-game.
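A minimal sketch of that incremental-build idea: given a map of which assets depend on which, rebuild only the files downstream of whatever changed. The graph shape and asset names below are hypothetical and mirror the farmer-and-basket example above.

```typescript
// Given "asset -> assets that directly depend on it", find everything
// downstream of a changed asset so only those files are rebuilt.

type DependencyGraph = Map<string, string[]>;

function downstreamOf(changed: string, graph: DependencyGraph): Set<string> {
  const dirty = new Set<string>();
  const queue = [changed];
  while (queue.length > 0) {
    const current = queue.pop()!;
    for (const dependent of graph.get(current) ?? []) {
      if (!dirty.has(dependent)) {
        dirty.add(dependent);
        queue.push(dependent);
      }
    }
  }
  return dirty;
}

// Example: the basket texture changed, so only the basket model, the farmer
// character, and the level containing them need to be rebuilt.
const graph: DependencyGraph = new Map([
  ["basket-texture", ["basket-model"]],
  ["basket-model", ["farmer-character"]],
  ["farmer-character", ["village-level"]],
]);
console.log(downstreamOf("basket-texture", graph));
// Set { "basket-model", "farmer-character", "village-level" }
```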


Improved Collaboration Tools

Despite the distributed, collaborative nature of modern game studios, many of the professional tools used in game production are still centralized, single-creator tools. For example, by default, both the Unity and Unreal level editors only support one designer editing one level at a time. This slows down the creative process because teams cannot work in parallel on one world.

On the other hand, both Minecraft and Roblox support collaborative editing; this is one reason why these consumer platforms are so popular despite their lack of other professional features. Once you've watched a group of kids build a city together in Minecraft, it's impossible to imagine building it any other way. I believe collaboration will become a fundamental feature of the Metaverse, allowing creators to come together online to build and test their creations.

Overall, collaboration in game development will become real-time across almost every aspect of the creation process. Some of the ways collaboration may develop in order to unlock the Metaverse include (a rough sketch of the first item follows this list):

  • Real-time collaborative world building, so multiple level designers or "world builders" can simultaneously edit the same physical environment and see each other's changes in real time, with full version control and change tracking. Ideally, level designers should be able to switch seamlessly between playing and editing for the fastest possible iteration. Some studios are experimenting with this using proprietary tools, such as Ubisoft's AnvilNext game engine, and Unreal has offered real-time collaboration as a beta feature originally built to support TV and film production.

  • Real-time content review and approval, so teams can experience and discuss their work together. Group discussions have always been a key part of the creative process. Movies have long had "dailies" in which production teams can collectively review each day's work. Most game studios have a large room with a large screen for group discussions. But the tools for remote development are much weaker. The fidelity of screen sharing in tools like Zoom is not high enough to provide an accurate view of the digital world. One solution for the game could be a "spectator mode," where an entire team can log in and see through the eyes of a single player. Another is to improve the quality of screen sharing, trading less compression for higher fidelity, including faster frame rates, stereo sound, more accurate color matching, and the ability to pause and annotate. Such tools should also be integrated with task tracking and assignment. Companies trying to solve this problem for movies include frame.io and sohonet.

  • A virtual game studio entirely in the cloud, where game team members (artists, programmers, designers, etc.) can log in from anywhere, on any type of device (including low-end PCs or tablets), and instantly access a high-end game development platform and a complete library of game assets. Remote desktop tools like Parsec may play a role here, but it's not just about remote desktop capabilities -- it's also about how creative tools are licensed and how assets are managed.
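As a rough sketch of the real-time collaborative world building item above, the snippet below relays level-edit operations to every connected editor using the ws package for Node.js. The operation shape is hypothetical, and a production system would need conflict resolution, permissions, and version control on top of this simple relay.

```typescript
import { WebSocketServer, WebSocket } from "ws";

// Each connected level editor sends edit operations; the server relays them
// to every other editor so all designers see changes as they happen.
interface EditOp {
  author: string;
  entityId: string; // the object being moved or modified
  change: { position?: [number, number, number]; properties?: Record<string, unknown> };
}

const server = new WebSocketServer({ port: 8080 });

server.on("connection", (socket: WebSocket) => {
  socket.on("message", (data) => {
    const op: EditOp = JSON.parse(data.toString());
    for (const client of server.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(JSON.stringify(op)); // relay the edit to all other editors
      }
    }
  });
});
```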


Experience Layer: Reimagining Live Operations Services

The final layer of this reimagining involves creating the tools and services needed to actually operate the Metaverse itself, and this is arguably the hardest part. Building an immersive world is one thing; running it 24/7 for millions of players around the world is another.

Developers must deal with:

  • The social challenges of running any large municipality, full of citizens who don't always get along and whose disputes need to be adjudicated.

  • The economic challenges of effectively running a central bank, with the ability to mint new money and to monitor how money enters and leaves the economy in order to control inflation and deflation.

  • The monetization challenges of running a modern e-commerce site with thousands or even millions of items for sale and the associated need for offers, promotions and global marketing tools.

  • The analytical challenge of knowing what is happening in their sprawling world in real time, so they can be quickly alerted to problems before they escalate out of control.

  • The communication challenges of reaching their digital constituents, both individually and collectively, in any language, since the Metaverse is (in principle) global.

  • The content challenges of shipping frequent updates to keep their metaverse growing and evolving.

To meet all of these challenges, companies need well-equipped teams with access to extensive levels of back-end infrastructure as well as the necessary dashboards and tools to allow them to operate these services at scale. Two areas that are particularly ripe for innovation are LiveOps services and in-game commerce.

LiveOps as a field is still in its infancy. Commercial tools such as PlayFab, Dive, Beamable, and LootLocker each implement only part of a complete LiveOps solution. As a result, most games still feel compelled to build their own LiveOps stack. An ideal solution would include: a real-time event calendar, with the ability to schedule events, predict events, and create event templates or replicate previous events; personalization, including player segmentation and targeted promotions and offers; messaging, including push notifications, email, and in-game inboxes, with translation tools to communicate with users in their own language; notification authoring tools, so non-programmers can create in-game pop-ups and notifications; and testing tools to simulate upcoming events or new content updates, including a mechanism to roll back changes if something goes wrong. (A hypothetical event definition is sketched below.)
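To make the event-calendar idea more tangible, here is a hypothetical LiveOps event definition of the kind such a system might schedule. None of the fields correspond to a specific product; they simply combine the capabilities listed above (scheduling, segmentation, messaging, and rollback) into one object.

```typescript
// Hypothetical LiveOps event definition: a scheduled window, player-segment
// targeting, localized notifications, content overrides, and a rollback switch.

interface LiveOpsEvent {
  id: string;
  template: string;                      // e.g. "double-xp-weekend"
  startsAt: string;                      // ISO 8601 timestamps
  endsAt: string;
  segments: string[];                    // e.g. ["lapsed-30d", "region:latam"]
  notifications: { channel: "push" | "email" | "inbox"; locKey: string }[];
  contentOverrides: Record<string, unknown>;
  rollbackOnError: boolean;              // revert automatically if something goes wrong
}

const doubleXpWeekend: LiveOpsEvent = {
  id: "evt-2022-07-weekend",
  template: "double-xp-weekend",
  startsAt: "2022-07-01T00:00:00Z",
  endsAt: "2022-07-03T23:59:59Z",
  segments: ["lapsed-30d"],
  notifications: [{ channel: "push", locKey: "event.double_xp.title" }],
  contentOverrides: { xpMultiplier: 2 },
  rollbackOnError: true,
};
```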

For in-game commerce, the solutions that exist today likewise solve only part of the problem. An ideal solution would need to include an item catalog, including arbitrary metadata for each item; an app-store interface for real-money sales; offers and promotions, including limited-time and targeted offers; reporting and analytics, with targeted reports and graphs; user-generated content support, so games can sell player-created content and return a percentage of the revenue to those players; advanced economic systems, such as item crafting (combining two items into a third), auction houses (so players can sell items to each other), trading, and gifting; and full integration with the web3 world and blockchains.


Next Steps: Game Development Team Transformation

In this article, I share my vision for how games will transform as new technologies enable composability and interoperability between games. I hope that others in the gaming community will share my excitement for the potential to come, and that they will be inspired to join me in building the new companies needed to unleash this revolution.

The coming wave of change will offer more than just opportunities for new software tools and protocols. It will change the nature of game studios themselves, as the industry moves away from monolithic, do-everything studios toward new levels of specialization.

In fact, I think in the future we'll see greater specialization in the game-making process. I expect we'll see:

  • World builders specialize in creating playable worlds that are both believable and fantastic, filled with creatures and characters that fit that world. Think of an incredibly detailed version of the Wild West built for Red Dead Redemption. Instead of investing so much in that world for one game, why not repurpose that world to create many games? Why not keep investing in that world, letting it grow and evolve, shaping itself over time to meet the needs of these games?

  • Narrative designers craft compelling interactive narratives in these worlds, full of storylines, puzzles, and quests for players to discover and enjoy.

  • Experience creators who focus on gameplay, reward mechanics, and control schemes to create playable experiences that span worlds. Creators who can bridge the gap between the real and virtual worlds will be especially valuable in the coming years as more companies try to bring parts of their existing businesses into the virtual world.

  • Platform builders that provide the underlying technology these experts use to get their work done.

The team at a16z Games and I are excited to be investing in this future, and I can't wait to see the incredible levels of creativity and innovation that these changes transforming our industry will unleash. Gaming is already the largest single sector in the entertainment industry, but as more and more sectors of the economy move online and into virtual worlds, gaming is poised to grow even bigger.

And we haven't even talked about some of the other exciting developments on the horizon, like Apple's new augmented reality headset, Meta's recently announced VR prototype, or WebGPU bringing 3D technology to web browsers.
