Gryphsis Academy: The bottlenecks of generative AI and the opportunities for Web3, one year after ChatGPT


Author: @chenyangjamie, @GryphsisAcademy

TL;DR:

  • Commercial applications of generative AI took the world by storm in 2022, but as the novelty wears off, some of generative AI's current problems have gradually surfaced. The increasingly mature Web3 field, relying on the fully transparent, verifiable and decentralized characteristics of blockchain, provides new ideas for solving them.

  • Generative AI is an emerging technology of recent years, built on the deep-learning neural network framework. The diffusion models used for image generation and the large language models behind ChatGPT have both shown great commercial potential.

  • The implementation architecture of generative AI in Web3 comprises infrastructure, models, applications and data. The data layer is especially important in combination with Web3 and has huge room for development; on-chain data models, AI agent projects and vertical-field applications in particular have the potential to become key development directions.

  • The currently popular AI-track projects in Web3 all show weak fundamentals and poor token value capture. Going forward, the main things to watch are new waves of attention or updates to their token economies.

  • Generative AI has huge potential in the Web3 field, and many new narratives combining it with other software and hardware technologies are worth looking forward to.

1. Why do generative AI and Web3 need each other?

2022 can be called the year generative AI (artificial intelligence) took the world by storm. Before then, generative AI was merely an auxiliary tool for professionals. But after DALL·E 2, Stable Diffusion, Imagen and Midjourney were released in quick succession, AI-Generated Content (AIGC) became the hot new technology application, producing a wave of trendy content on social media. ChatGPT, released soon after, was a blockbuster that pushed the trend to its peak. As the first AI tool that can answer almost any question from a simple text command (prompt), ChatGPT has already become a daily work assistant for many people, handling tasks such as document writing, homework tutoring, email drafting, paper revision and even emotional counseling. The internet is full of enthusiasm for studying the arcane prompts that optimize ChatGPT's output. For the first time, people can truly feel the intelligence of artificial intelligence. According to a report by Goldman Sachs' macro team, generative AI could become a booster for U.S. labor productivity growth: within 10 years of its development it could lift global GDP by 7% (nearly $7 trillion) and raise productivity growth by 1.5 percentage points.


The Web3 field also felt the spring breeze of AIGC: the AI sector rose across the board in January 2023.

Source: https://www.coingecko.com/

However, after the initial novelty faded, ChatGPT's global traffic declined in June 2023 for the first time since its release (data source: SimilarWeb). It is time to rethink what generative AI means and where its limits lie. Judging from the current situation, the dilemmas generative AI faces include (but are not limited to): first, social media is flooded with unlicensed, untraceable AIGC content; second, ChatGPT's high operating costs force OpenAI to trade off generation quality to cut costs and improve efficiency; finally, even the world's leading large models still show bias in some of their outputs.


ChatGPT global desktop and mobile traffic

Source: Similarweb

At the same time, the steadily maturing Web3 field offers new solutions to generative AI's current dilemmas through its decentralized, fully transparent and verifiable characteristics:

  • The full transparency and traceability of Web3 can address the data copyright and privacy challenges raised by generative AI. These two features allow the source and authenticity of content to be effectively verified, significantly raising the cost of producing false or infringing AI content, such as remix short videos with murky copyright or DeepFake face-swap videos that violate others' privacy. In addition, applying smart contracts to content management promises to resolve copyright issues and ensure content creators receive fairer compensation for their work.


DeepFake Video: This is not Morgan Freeman

Source: YouTube

  • The decentralized nature of Web3 can reduce the risk of centralized AI computing power. Developing generative AI requires enormous computing resources: training a GPT-3-based ChatGPT is estimated to cost at least US$2 million, with a daily electricity bill of about US$47,000, and these numbers rise exponentially as the technology scales. At present, computing resources are still heavily concentrated in the hands of large companies, which entails huge R&D, maintenance and operating costs as well as centralization risk, making it hard for smaller companies to compete. Although large-model training may still need to happen in centralized environments in the short term because of its heavy compute demands, in Web3 blockchain technology makes distributed model inference, community voting governance and model tokenization possible. Taking existing decentralized exchanges as a mature precedent, we can envision a community-driven decentralized AI inference system in which the large model is owned and governed by the community.


Even with the latest H100 GPUs, the cost per FLOP of training GPT-3 remains high

Source: substake.com  

  • Web3's features can improve the diversity of AI datasets and the interpretability of AI models. Traditional data collection relies on public datasets or the model maker's own collection, and the data gathered is often constrained by geography and culture. This can leave AIGC output and ChatGPT answers with subjective biases toward certain groups, such as altering the skin color of the person in an image. Through Web3's token incentive model, data collection can be optimized: data is gathered from all corners of the world and weighted accordingly. At the same time, Web3's full transparency and traceability can further improve model interpretability and encourage output that reflects diverse backgrounds.


An AI designed to improve image resolution turned Obama into a white man

Source: Twitter

  • Web3's massive on-chain data can be used to train unique AI models. Current AI models are designed and trained around the structure of their target data (text, speech, image or video). A distinctive future direction for combining Web3 and AI is to borrow the construction and training methods of large natural-language models and use the unique structure of Web3's on-chain data to build a large on-chain data model. This could give users perspectives that other data analysis cannot reach (smart-money tracking, project fund flows, etc.), and compared with manual on-chain analysis, AI can process huge volumes of data concurrently.


Automate on-chain analysis and monitor on-chain information to obtain first-hand information
Source: nansen.ai

  • Generative AI is expected to be a powerful help in lowering the barrier to entering the Web3 world. The mainstream way of participating in Web3 projects today requires users to understand a range of complex on-chain concepts and wallet operation logic, which greatly raises learning costs and the risk of mis-operation. By contrast, comparable Web2 applications have practiced frictionless product design for years, letting users get started easily and safely. Generative AI is expected to empower intent-centric projects by acting as an intelligent assistant between users and protocols in Web3, greatly improving the user experience of Web3 products.


  • Web3 has also created huge demand for content, and generative AI is a key means of meeting it. Generative AI can produce large volumes of articles, images, audio and video for Web3, driving the development of decentralized applications. From NFT marketplaces to smart contract documentation, everything can benefit from AI-generated content.

While generative AI and Web3 have their own challenges, their mutual needs and collaborative solutions are expected to shape the future of the digital world. This collaboration will improve the quality and credibility of content creation, drive further development of the digital ecosystem, and provide users with a more valuable digital experience. The co-evolution of generative AI and Web3 will chart an exciting new chapter in the digital age.

2. Technical summary of generative AI

2.1 Technical background of generative AI

Since the concept of AI was proposed in the 1950s, the field has gone through several ups and downs; each key technological breakthrough brings a new wave, and generative AI is no exception. Generative AI is an emerging concept proposed only in the past decade; with the dazzling performance of recent technologies and products, it has stood out from AI's many research sub-directions and seized the world's attention almost overnight. Before going further into its technical architecture, we first need to explain what this article means by generative AI and briefly review the core technical components behind its recent popularity.

Generative AI is a type of artificial intelligence that can create new content and ideas (including conversations, stories, images, videos and music). It is built on the deep-learning neural network framework, trained on large amounts of data, and contains models with huge numbers of parameters. The generative AI products that have recently drawn public attention fall roughly into two categories: image (or video) generation products driven by text or style input, and ChatGPT-style products driven by text input. Both rest on the same core technology: a pre-trained large language model (LLM) based on the Transformer architecture. On top of that, the former adds a diffusion model that conditions on text input to generate high-quality images or video, while the latter adds Reinforcement Learning from Human Feedback (RLHF) to produce output close to human-level logic.
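A minimal sketch of those two product families, using the open-source Hugging Face transformers and diffusers libraries as stand-ins; the model names are common public checkpoints chosen for illustration, not the proprietary models behind ChatGPT or Midjourney:

```python
# pip install torch transformers diffusers
import torch
from transformers import pipeline
from diffusers import StableDiffusionPipeline

# Category 1: text-to-image -- a Transformer text encoder conditioning a diffusion model.
image_pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
image = image_pipe("an astronaut riding a horse, oil painting").images[0]
image.save("astronaut.png")

# Category 2: text-to-text -- a pre-trained LLM; ChatGPT additionally applies RLHF
# on top of this pre-training step to align outputs with human preferences.
chat = pipeline("text-generation", model="gpt2")
print(chat("Explain Web3 in one sentence:", max_new_tokens=40)[0]["generated_text"])
```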

2.2 Current technical architecture of generative AI:

Many excellent past articles have discussed generative AI's significance for the existing technical stack from different angles, such as a16z's article "Who Owns the Generative AI Platform?", which comprehensively summarizes the current technical architecture of generative AI:


The main technical architecture of generative AI

Source: Who owns the generative AI platform?

That article divides the current Web2 generative AI architecture into three layers: infrastructure (computing power), models and applications, and gives views on the state of each.

For infrastructure: although Web2 infrastructure logic still dominates, very few projects genuinely combine Web3 with AI at this layer. Infrastructure is also the layer capturing the most value at this stage: by selling shovels during the AI exploration phase, the Web2 technology oligarchs, with decades of depth in storage and compute, have made considerable profits.

For models: model authors should be the real creators and owners of AI, yet at this stage very few business models let them capture commensurate commercial value.

For applications: although several vertical fields have produced applications with revenue in the hundreds of millions of dollars, high maintenance costs and low user retention are not yet enough to support a long-term business model.

2.3 Application examples of generative AI and Web3

2.3.1 Apply AI to analyze massive data of Web3

Data is the core of building technical moats in future AI development. To understand its importance, let's first look at a study on where large-model performance comes from. The study shows that large AI models exhibit a unique emergent ability: as model scale keeps increasing past a certain threshold, model accuracy suddenly rises sharply. In the figure below, each panel represents a training task and each curve plots a large model's performance (accuracy). Experiments across many large models reach a consistent conclusion: once model scale exceeds a certain threshold, performance on different tasks shows breakthrough growth.


Relationship between model size and model performance
Source: Emergent Analogical Reasoning in Large Language Models

Simply put, quantitative change in model scale produces qualitative change in model performance. Scale is determined by parameter count, training time and training-data quality. At this stage, when the gap cannot be widened through parameter count (every major company has a top R&D team designing architectures) or training time (compute hardware is bought from NVIDIA), there are two ways to build a product that leads competitors. One is to find an excellent niche with real pain points and build a killer application there, which requires deep understanding of and insight into the target area. The other is more practical and feasible: collect data that is larger and more comprehensive than competitors'.

This also gives large generative AI models a good entry point into Web3. Existing large or foundation models are trained on huge volumes of data from different fields, and the uniqueness of Web3's on-chain data makes a large on-chain data model a feasible path worth anticipating. There are currently two product logics at the data layer in Web3. The first is to incentivize data providers: while protecting data owners' privacy and ownership, users are encouraged to share data usage rights with each other; Ocean Protocol provides a good data-sharing model of this kind. The second is for the project team to integrate data and application, providing users with a service for a specific task; for example, Trusta Lab collects and analyzes users' on-chain data and, through its proprietary MEDIA scoring system, offers services such as Sybil account analysis and on-chain asset risk analysis.

2.3.2 Web3 AI agent application

The on-chain AI agent applications mentioned above are also gaining momentum: with the help of large language models, they provide users with quantifiable on-chain services while preserving user privacy. According to a blog post by Lilian Weng, an AI research lead at OpenAI, an AI agent can be decomposed into four components: Agent = LLM + Planning + Memory + Tool use. As the core of the agent, the LLM is responsible for interacting with the outside world, learning from massive data and expressing it logically in natural language. The Planning + Memory part resembles the action, policy and reward concepts in the reinforcement learning used to train AlphaGo: task goals are broken into sub-goals, and an optimized solution is learned step by step from the results and feedback of many repeated trainings, with the information obtained stored in different types of memory serving different functions. Tool use refers to the agent's ability to call modular tools, retrieve internet information, and access proprietary information sources or APIs; notably, most such information is hard to change once pre-training is done.
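A minimal Python sketch of that four-part decomposition; llm_complete() is a hypothetical placeholder for a real model API call, and the planning and tool-routing logic is deliberately simplified:

```python
from dataclasses import dataclass, field
from typing import Callable

def llm_complete(prompt: str) -> str:
    """Hypothetical placeholder for a real LLM API call."""
    return ""

@dataclass
class Agent:
    goal: str
    memory: list[str] = field(default_factory=list)      # Memory: past steps and results
    tools: dict[str, Callable[[str], str]] = field(default_factory=dict)  # Tool use

    def plan(self) -> list[str]:
        # Planning: ask the LLM to decompose the goal into sub-tasks.
        steps = llm_complete(f"Break this goal into numbered steps: {self.goal}")
        return [s for s in steps.splitlines() if s.strip()]

    def act(self, step: str) -> str:
        # Tool use: let the LLM pick a registered tool, else answer directly.
        choice = llm_complete(f"Choose one tool from {list(self.tools)} for: {step}")
        result = self.tools[choice](step) if choice in self.tools else llm_complete(step)
        self.memory.append(f"{step} -> {result}")        # store feedback for later steps
        return result

    def run(self) -> list[str]:
        return [self.act(step) for step in self.plan()]
```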


Global diagram of AI Agent
Source: LLM Powered Autonomous Agents

Given the concrete implementation logic of AI agents, we can boldly imagine what the combination of Web3 + AI agents might bring, for example:

  • An AI agent layer can be added to existing trading applications, giving users a natural-language interface to trading functions including but not limited to price prediction, trading strategies, stop-loss strategies, dynamic leverage adjustment, intelligent KOL copy-trading and lending.

  • When executing quantitative strategies, a strategy can be decomposed into sub-tasks, each carried out by a different AI agent. Cooperating agents can not only strengthen privacy protection but also monitor in real time to prevent adversaries from exploiting vulnerabilities to counter the trading bots.

  • The many NPCs in on-chain games are another direction naturally compatible with AI agents. Some projects already use GPT to generate dialogue for game characters dynamically. In the future this should move beyond preset text toward more realistic, real-time NPC (or even digital-human) interaction that proceeds on its own without player intervention. The "virtual town" published by Stanford University is an excellent example of such an application.

Although current Web3 + AI agent projects are still concentrated in the primary market or on the AI infrastructure side, and no to-C killer application has yet appeared, we believe that by combining the blockchain's characteristics, such as distributed on-chain governance, zero-knowledge-proof inference, model distribution and improved interpretability, game-changing Web3 + AI projects are worth looking forward to.

2.3.3 Potential vertical field applications of Web3 + AI

  • A. Application in education field

The combination of Web3 and AI promises a revolution in education. Among the innovations, the generative virtual-reality classroom stands out. By embedding AI into online learning platforms, students get a personalized learning experience, with the system generating customized educational content from each student's learning history and interests. This personalized approach is expected to improve students' motivation and effectiveness, making education fit individual needs more closely.


Students participate in virtual reality classes through immersive VR equipment
Source: V-SENSE Team

Token-based credit incentives are another innovative practice in education. Through blockchain technology, students' credits and achievements can be encoded as tokens, forming a digital credit system. Such an incentive mechanism encourages students to participate actively in learning activities and creates a more participatory, motivating learning environment.

Meanwhile, inspired by the recently popular SocialFi project FriendTech, similar key-pricing logic bound to identities could establish a peer-evaluation system among classmates, bringing more social elements into education. Leveraging the tamper-resistance of blockchain makes peer evaluation fairer and more transparent. This mechanism not only helps cultivate students' teamwork and social skills, but also offers a more comprehensive, multi-angle assessment of performance, introducing more diverse evaluation methods into the education system.

  • B. Medical field applications

In the medical field, the combination of Web3 and AI drives the development of federated learning and distributed inference. By joining distributed computing with machine learning, medical institutions can share data at very large scale for deeper, more comprehensive collective learning. This collective-intelligence approach can accelerate the development of disease diagnosis and treatment and drive progress in medicine.

Privacy protection is a key issue that cannot be ignored in medical applications. Through Web3's decentralization and the blockchain's immutability, patients' medical data can be stored and transmitted more securely. Smart contracts enable precise control and permissioning of medical data, ensuring only authorized personnel can access patients' sensitive information and thereby safeguarding its privacy.

  • C. Application in insurance field

In the insurance field, the integration of Web3 and AI is expected to bring more efficient and intelligent solutions to traditional business. In car and home insurance, for example, computer vision lets insurers analyze and price from images, assessing property value and risk levels more efficiently. This gives insurers more refined, personalized pricing strategies and raises the industry's level of risk management.


Using AI technology for claims valuation
Source: Tractable Inc

Meanwhile, automatic on-chain claims settlement is another industry innovation. Based on smart contracts and blockchain, the claims process becomes more transparent and efficient, reducing cumbersome procedures and the scope for human interference. This not only speeds up settlement but also cuts operating costs, creating a better experience for insurers and customers alike.

Dynamic premium adjustment is another innovative practice. Through real-time data analysis and machine learning, insurers can adjust premiums more accurately and promptly, pricing each insured according to their actual risk profile. This not only makes premiums fairer but also incentivizes insureds to adopt healthier, safer behavior, promoting risk management and prevention across society.

  • D. Application in copyright field

In the copyright field, the combination of Web3 and AI brings a new paradigm to digital content creation, planning and code development. Through smart contracts and decentralized storage, copyright information for digital content can be better protected, and creators can more easily track and manage their intellectual property. Blockchain also enables transparent, tamper-proof records of creation, providing a more reliable means of tracing and certifying works.

Innovation in working models is another important change in the copyright field. Token-incentivized collaboration ties contributions to token rewards, encouraging creators, planners and developers to participate in projects together. This not only promotes collaboration within creative teams but also gives participants a direct share in a project's success, encouraging more outstanding work.

On the other hand, using tokens as proof of copyright reshapes how benefits are distributed. Through dividend mechanisms executed automatically by smart contracts, every contributor to a work receives their share of profits in real time whenever the work is used, sold or transferred. This decentralized dividend model addresses the opacity and delays of the traditional copyright model and gives creators a fairer, more efficient distribution mechanism.

  • E. Application in the metaverse field

In the metaverse field, the integration of Web3 and AI opens new possibilities for creating low-cost AIGC to fill on-chain game content. AI-generated virtual environments and characters can enrich on-chain games, giving users a more vivid and varied experience while reducing the labor and time costs of production.

Digital-human creation is another metaverse innovation. Combining detailed appearance generation, down to individual strands of hair, with minds built on large language models, generated digital humans can play various roles in the metaverse, interact with users, and even take part in digital twins of real scenes. This gives virtual reality a more realistic and profound experience and promotes the wide application of digital-human technology in entertainment, education and beyond.

Automatically generating advertising content from on-chain user profiles is an intelligent advertising application in the metaverse. By analyzing users' behavior and preferences in the metaverse, AI algorithms can generate more personalized, attractive ads, improving click-through rates and engagement. This style of ad generation fits user interests better and gives advertisers a more efficient channel.

Generative interactive NFTs are a striking technology in the metaverse. By combining NFTs with generative design, users can take part in creating their own NFT artwork in the metaverse, giving it interactivity and uniqueness. This opens new possibilities for creating and trading digital assets and advances digital art and the virtual economy in the metaverse.

3. Related Web3 projects

Here the author has chosen five projects: Render Network and Akash as veteran leaders in general AI infrastructure on the AI track; Bittensor as a hot project in the model category; Alethea.ai as an application project strongly tied to generative AI; and Fetch.ai as a landmark project in the AI agent field. Together they give a picture of the current state of generative AI projects in Web3.

3.1 Render Network ($RNDR)

Render Network was founded in 2017 by Jules Urbach, founder of its parent company OTOY, whose core business is cloud graphics rendering. OTOY has worked on Oscar-winning film and television projects, counts co-founders of Google and Mozilla among its advisors, and has collaborated with Apple on many projects. Render Network, OTOY's extension into Web3, was created to use the distributed nature of blockchain to connect smaller-scale rendering and AI demand with resources on a decentralized platform, saving small studios the cost of renting expensive centralized computing resources (such as AWS, Microsoft Azure and Alibaba Cloud) while generating income for parties with idle compute.

Because Render comes from OTOY, which independently developed the high-performance renderer Octane Render, and has genuine business logic, it was seen at launch as a Web3 project with real demand and fundamentals. During the generative AI boom, distributed verification and distributed inference tasks, for which demand has risen sharply, fit Render's technical architecture perfectly and are regarded as a promising direction for its future. At the same time, having led the AI track in Web3 for years, Render has acquired a degree of meme status: whenever an AI, metaverse or distributed-computing narrative heats up, it reaps the rising dividends. It is, so to speak, a jack of all trades.

In February 2023, Render Network announced a new pricing-tier system and a community-voted price stabilization mechanism for $RNDR (its launch date is still undetermined), and also announced the project's migration from Polygon to Solana (with the $RNDR token upgraded to a $RENDER token on the Solana SPL standard; the migration was completed in November 2023).

The new pricing-tier system published by Render Network divides on-chain services into three tiers, from high to low, corresponding to rendering services at different prices and quality levels, from which rendering demanders can choose.


Three tiers of Render Network’s new pricing tiering system

As for the community-voted price stabilization mechanism for $RNDR, the change from the previous irregular buybacks to a Burn-and-Mint Equilibrium (BME) model positions $RNDR more clearly as a price-stable payment token rather than a long-term holding asset. The business flow within one BME epoch is as follows:

  • Product creation. "Product creators" on Render are rendering-resource providers: they package idle rendering resources into products (nodes) that stand by online on the network.

  • Product purchase. Customers with rendering needs who hold $RNDR burn the tokens directly as payment for the service; if they hold none, they can first purchase $RNDR on a DEX and then burn it. The price paid for each service is publicly recorded on-chain.

  • Token minting. New tokens are minted and allocated according to preset rules.

Note: Render Network takes 5% of each buyer's payment as a fee for project operations.


Burn-and-Mint Equilibrium Epoch

Credit to Petar Atanasovski
Source: Medium 

Under rules preset in advance, each BME epoch mints a preset number of new tokens (a number that gradually decreases over time). Newly minted tokens are distributed to three parties, as summarized below and sketched in the code after this list:

  • Product Creator. Product creators receive two benefits:

    • Rewards for completing tasks. Rewards are based on the number of rendering tasks completed by each product node, which is easy to understand.

    • Online rewards. Each product node is rewarded for standing by online in the marketplace, encouraging providers to keep more resources online to take on more work.

  • Product buyers. Similar to retail rebates, buyers can receive back up to 100% of the burned $RNDR as a rebate, encouraging continued use of Render Network.

  • DEX (decentralized exchange) liquidity providers. Liquidity providers on partnered DEXs earn rewards in proportion to the $RNDR they stake, because they ensure that anyone who needs to burn $RNDR can buy enough of it at a reasonable price.
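A toy Python simulation of one BME epoch. The emission size and the three-way split below are illustrative assumptions, not Render's actual parameters; only the 5% operations fee comes from the note above:

```python
def bme_epoch(burned_for_jobs: float, epoch_emission: float,
              ops_fee_rate: float = 0.05) -> tuple[float, dict, float]:
    """burned_for_jobs: $RNDR burned by buyers paying for rendering this epoch.
    epoch_emission: preset number of new tokens minted this epoch
    (the preset figure decays over time in the real schedule)."""
    ops_fee = burned_for_jobs * ops_fee_rate          # 5% kept for project operations
    minted = {                                        # illustrative three-way split
        "node_operators": epoch_emission * 0.60,      # task completion + online rewards
        "buyer_rebates":  epoch_emission * 0.30,      # pool funding the buyer rebates
        "dex_liquidity":  epoch_emission * 0.10,      # rewards for staked LPs
    }
    net_supply_change = epoch_emission - burned_for_jobs
    return ops_fee, minted, net_supply_change

fee, rewards, delta = bme_epoch(burned_for_jobs=120_000, epoch_emission=100_000)
print(fee, rewards, delta)   # delta < 0: supply shrinks when burns exceed emission
```

The equilibrium intuition: when network usage (burns) outpaces the fixed emission, net supply contracts, which is the stabilizing pressure the BME design relies on.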


Source: coingecko.com

The past year's price action shows that, as the long-standing leader of the AI track in Web3, $RNDR reaped the dividends of the ChatGPT-driven AI wave in late 2022 and early 2023, and with the release of the new token mechanism its price hit a high in the first half of 2023. After trading sideways in the second half of the year, the AI revival sparked by OpenAI's developer conference, plus positive expectations around the Solana migration and the imminent new token mechanism, pushed $RNDR to a multi-year high. Since $RNDR's fundamentals have changed very little, future investments in it call for more careful position management and risk control.


Render Network monthly number of nodes


Render Network: number of rendered scenes per month
Source: Dune.com

The Dune dashboard also shows that since early 2023 total rendering tasks have grown while rendering nodes have not. This indicates the new users are demand-side (rendering needs) rather than supply-side (rendering resources). Given the generative AI craze that began in late 2022, it is reasonable to infer the added rendering tasks are generative-AI-related. Whether this demand is durable is hard to say and needs further observation.

3.2 Akash Network ($AKT)

Akash Network is a decentralized cloud computing platform designed to give developers and enterprises a more flexible, efficient and economical cloud solution. The project's "super cloud" is built on distributed blockchain technology, using the blockchain's decentralized characteristics to give users decentralized cloud infrastructure on which applications can be deployed and run globally, with diverse computing resources including CPU, GPU and storage.

Akash Network's founders, Greg Osuri and Adam Bozanich, are serial entrepreneurs who have worked together for many years, each with long project experience. Together they founded Overclock Labs, still a core participant in Akash Network. That experience let the founding team set Akash Network's mission clearly: reduce cloud computing costs, improve availability and give users more control over computing resources. Through open bidding that encourages providers to open up idle computing resources on their networks, Akash Network achieves more efficient resource utilization and thus more competitive prices for demanders.

Akash Network started the Akash Network Economics 2.0 update plan in January 2023, with the goal of solving many shortcomings of the current token economy, including:

  • Market price fluctuations for the $AKT token have caused the price of long-term contracts to misalign with their value

  • Incentives for resource providers are not enough to release the large amount of computing power in their hands

  • Insufficient community incentives are not conducive to the long-term development of the Akash project

  • Insufficient value capture of $AKT tokens risks affecting project stability

According to the official website, the solutions proposed under Akash Network Economics 2.0 include introducing stablecoin payments, adding maker/taker order fees to increase protocol revenue, raising incentives for resource providers, and enlarging community incentives. Among these, stablecoin payments and the maker/taker order fees are already live.

As Akash Network's native token, $AKT serves multiple purposes in the protocol, including staking for verification (security), incentives, network governance and payment of transaction fees. Per official data, total $AKT supply is 388M, of which 229M (about 59%) had been unlocked as of November 2023. The genesis tokens distributed at launch were fully unlocked by March 2023 and entered secondary-market circulation. The genesis allocation is as follows:

Genesis allocation of $AKT

Notably, on value capture, one feature mentioned in the white paper but not yet in effect is a take fee on every successful lease, routed to a take-income pool for distribution to holders: the plan specifies a 10% fee on leases paid in $AKT and 20% on leases paid in other cryptocurrencies. Akash also plans to reward holders who lock their $AKT for longer periods, so longer-term holders would earn more generous rewards. If this plan launches, it will be a major driver for the token price and will also make the project easier to value.
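A quick worked example of that planned take fee. The lease volumes and staking share below are assumed inputs; only the 10%/20% rates come from the white-paper plan described above:

```python
# Hypothetical annual lease volume settled on Akash (illustrative numbers).
akt_leases_usd   = 5_000_000   # leases paid in $AKT -> 10% take fee
other_leases_usd = 3_000_000   # leases paid in other cryptocurrencies -> 20% take fee

take_income = akt_leases_usd * 0.10 + other_leases_usd * 0.20
print(f"Annual take-income pool: ${take_income:,.0f}")      # $1,100,000

# Distributed pro-rata to stakers: a holder staking 1% of all locked $AKT.
holder_share = take_income * 0.01
print(f"A 1% staker would receive: ${holder_share:,.0f}")   # $11,000
```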


Source: coingecko.com

The price trend on coingecko.com shows that $AKT also rallied in mid-August and late November 2023, though less than other AI-track projects over the same periods, which may reflect current market sentiment. Overall, Akash is one of the few high-quality projects on the AI track, with fundamentals better than most competitors, and its potential business revenue may bring future profits to the protocol. As the AI industry develops and demand for cloud computing grows, Akash Network may well soar in the next AI wave.

3.3 Bittensor ($TAO)

If you are familiar with $BTC's technical architecture, Bittensor's design is easy to grasp: its author borrowed many traits from the cryptocurrency veteran, including a 21-million token cap, emission halving roughly every four years, and a PoW-style consensus mechanism. Concretely, imagine Bitcoin's original issuance process, then replace the mining of economically meaningless random-number puzzles with the training and verification of AI models, with rewards based on the models' performance and reliability. That is a simple summary of the Bittensor ($TAO) project structure.

Bittensor was founded in 2019 by two AI researchers, Jacob Steeves and Ala Shaabana, with its main framework based on a white paper by the mysterious author Yuma Rao. In brief, it is a permissionless open-source protocol that builds a network of many interconnected subnets, each responsible for a different task (machine translation, image recognition and generation, large language models, and so on). Good task completion is rewarded, and subnets can interact with and learn from each other.

Looking at the large AI models on the market today, without exception they stem from the enormous compute and data invested by technology giants. The resulting AI products perform impressively, but this pattern also carries a high risk of centralized abuse. Bittensor's infrastructure lets a network of communicating experts exchange knowledge and learn from each other, laying a foundation for decentralized training of large models. Bittensor's long-term vision is to compete with the closed-source models of giants like OpenAI, Meta and Google, aiming for matching inference performance while keeping the model decentralized.

The technical core of the Bittensor network is the consensus mechanism uniquely designed by Yuma Rao, hence called Yuma consensus, a hybrid of PoW and PoS. The supply side consists of servers (miners) and validators, while the demand side consists of clients who use models on the network. Miners provide pre-trained models for their subnet's tasks and are rewarded according to model quality; validators verify model performance and act as middlemen between miners and clients. The specific flow, sketched in code after the list below, is:


  • The client sends a request to use a model in a given subnet, together with the data to be processed, to a validator.

  • The validator distributes the data to the miners in that subnet.

  • Each miner runs inference with its own model on the received data and returns the result.

  • The validator ranks the returned inference results by quality, and the ranking is stored on-chain.

  • The best inference result is returned to the client; miners are rewarded according to their ranking, and validators according to their workload.
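A minimal sketch of that request flow. The quality scoring and reward split below are simplified placeholders; the real Yuma consensus weighting and emission rules are more involved:

```python
import random

def miner_infer(model_id: int, data: str) -> str:
    """Hypothetical placeholder for a miner running its own model."""
    return f"result-from-model-{model_id}"

def yuma_round(client_data: str, miners: list[int]):
    # Steps 1-2: the validator receives the client request and fans it out to miners.
    results = {m: miner_infer(m, client_data) for m in miners}
    # Steps 3-4: the validator scores each returned inference (placeholder: random),
    # and the resulting ranking is what gets committed on-chain.
    scores = {m: random.random() for m in results}
    ranking = sorted(scores, key=scores.get, reverse=True)
    # Step 5: the best result goes back to the client; rewards follow rank order.
    best_result = results[ranking[0]]
    miner_rewards = {m: 1.0 / (rank + 1) for rank, m in enumerate(ranking)}
    return best_result, ranking, miner_rewards

best, ranking, rewards = yuma_round("translate: hello", miners=[101, 102, 103])
print(best, ranking, rewards)
```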

Note that in the vast majority of subnets, Bittensor itself does not train any models; its role is more like linking model providers with model demanders, and on that basis using interaction among smaller models to improve performance on different tasks. Currently 30 subnets are (or have been) online, corresponding to different task models.


As Bittensor's native token, $TAO is used for creating subnets, registering in subnets, paying for services, staking to validators and more, playing a pivotal role in the ecosystem. In tribute to the spirit of BTC, the team chose a fair launch for $TAO: all tokens are generated by contributing to the network. The daily output is currently about 7,200 $TAO, split equally between miners and validators; roughly 26.3% of the 21-million total has been produced since launch, and 87.21% of those tokens are staked for verification. Like BTC, the emission halves roughly every four years; the next halving is expected on September 20, 2025, which should be a major driver of price appreciation.
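A back-of-the-envelope emission projection from the figures above (7,200 $TAO per day, a roughly four-year halving cycle, a 21M cap); the flat-rate-then-halve schedule here is a simplification for illustration:

```python
DAILY_EMISSION = 7_200        # current $TAO minted per day (split miners/validators)
HALVING_DAYS   = 4 * 365      # approximate four-year halving cycle
CAP            = 21_000_000

supply, rate, day = 0.263 * CAP, DAILY_EMISSION, 0   # start near the stated ~26.3%
while supply < CAP * 0.99:    # project forward until ~99% of the cap is issued
    day += 1
    if day % HALVING_DAYS == 0:
        rate /= 2             # halving event (next expected around September 2025)
    supply = min(CAP, supply + rate)

print(f"~99% of supply issued after roughly {day / 365:.1f} more years")
```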


Credit: taostats.io 


The price trend shows $TAO rising sharply since late October 2023, presumably driven mainly by the new round of AI enthusiasm around OpenAI's developer conference, which rotated capital into the AI sector. At the same time, as an emerging project on the Web3 + AI track, $TAO's strong project quality and long-term vision are a major draw for funds. Still, as with other AI-track projects, we must admit that while the Web3 + AI combination has great potential, its actual business applications are not yet enough to support a long-term profitable project.

3.4 Alethea.ai ($ALI)

Founded in 2020, Alethea.ai is dedicated to using blockchain technology to bring decentralized ownership and governance to generative content. Its founder believes generative AI is taking us into an era of information redundancy, in which vast amounts of digital content can simply be copy-pasted or generated in one click while the original creators of value earn nothing. By connecting on-chain primitives (such as NFTs) to generative AI, the ownership of generative AI and its content can be secured, with community governance built on top.

Driven by this idea, Alethea.ai initially launched a new NFT standard, the iNFT, which uses an Intelligence Pod to embed AI animation, speech synthesis and even generative AI into images. Alethea.ai also partnered with artists to turn their artworks into iNFTs, one of which was auctioned at Sotheby's for US$478,000.


Inject soul into NFT
Source: Alethea.ai

Alethea.ai later launched the AI Protocol, which lets any generative AI developer or creator build on the iNFT standard without permission. To set an example for other projects using its AI Protocol, Alethea.ai also drew on GPT large-model theory to launch CharacterGPT, a tool for making interactive NFTs. More recently it released Open Fusion, which lets any ERC-721 NFT on the market be combined with an Intelligence and released onto the AI Protocol.

Alethea.ai’s native token is $ALI, which has four main uses:

  • Lock a certain amount of $ALI to create iNFTs

  • The more $ALI locked, the higher the Intelligence Pod's level

  • $ALI holders participate in community governance

  • $ALI can be used as a credential to participate in interactions between iNFTs (no actual use case yet)


Source: coingecko.com

As these use cases show, $ALI's value capture still rests at the narrative level, which the year's price action confirms: $ALI rode the generative AI boom that ChatGPT set off in December 2022, and the June announcement of the Open Fusion feature brought another wave of gains. Otherwise the price has trended downward, and even the late-2023 AI rally failed to lift it to the average gains of projects on the same track.

Beyond the native token, let's look at how Alethea.ai's iNFT products (including the officially released collection) have performed in the NFT market.


Daily sales of Intelligence Pods on OpenSea


Daily sales of the Revenants collection on OpenSea

Source: Dune.com

The Dune dashboards show that both the Intelligence Pods sold to third parties and Alethea.ai's first-party Revenants collection faded away after the initial release period. The main reason, I think, is that once the novelty wears off, there is no real utility or community traction to retain users.

3.5 Fetch.ai($FET)

Fetch.ai is a project dedicated to fusing artificial intelligence with blockchain technology. Its goal is to build a decentralized smart economy that combines machine learning, blockchain and distributed ledger technology to support economic activity among intelligent agents.

Fetch.ai was founded in 2019 by Humayun Sheikh, Toby Simpson and Thomas Hain, scientists from the UK with very strong backgrounds: Humayun Sheikh was an early investor in DeepMind, Toby Simpson held executive roles at several companies, and Thomas Hain is a professor of artificial intelligence at the University of Sheffield. The founding team's depth brings rich industry resources spanning traditional IT companies, star blockchain projects, and medical and supercomputing ventures.

Fetch.ai's mission is to build a decentralized network platform of autonomous economic agents and AI applications, letting developers accomplish preset tasks by creating autonomous agents. The platform's core is its three-tier architecture:

  • Bottom layer: the underlying smart-contract network, based on PoS-uD (a permissionless proof-of-stake consensus mechanism), supports collaboration among miners and basic machine-learning training and inference.

  • Middle layer: the OEF (Open Economic Framework) provides a shared space for AEAs to interact with each other and with the underlying protocol, and supports mutual search, discovery and transactions between AEAs.

  • Upper layer: the AEA (Autonomous Economic Agent) is Fetch.ai's core component. Each AEA is an intelligent agent program that gains capabilities through various skill modules and completes preset tasks on the user's behalf. Agents do not run directly on the blockchain; they interact with the blockchain and smart contracts through the OEF middle layer. An agent can be pure software or bound to hardware such as phones, computers and cars. The official Python development kit, the AEA framework, is composable, and developers can use it to build their own agents (a simplified sketch follows this list).
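A simplified illustration of the AEA idea: an agent composed of skill modules acting on its owner's behalf. This mirrors the structure described above but is not the real aea-framework API; class and method names are invented for illustration:

```python
from typing import Callable

class AutonomousEconomicAgent:
    """Toy stand-in for an AEA: composable skills, acting for its owner."""
    def __init__(self, owner: str):
        self.owner = owner
        self.skills: dict[str, Callable[[dict], dict]] = {}   # composable skill modules

    def add_skill(self, name: str, handler: Callable[[dict], dict]) -> None:
        self.skills[name] = handler

    def handle(self, task: str, payload: dict) -> dict:
        # Route a task to the matching skill; in Fetch.ai, discovery and settlement
        # would go through the OEF middle layer rather than a direct call.
        return self.skills[task](payload)

# Example: an agent that quotes a price for its owner's idle compute.
agent = AutonomousEconomicAgent(owner="alice")
agent.add_skill("quote", lambda p: {"price": max(p["floor"], p["bid"] * 0.95)})
print(agent.handle("quote", {"floor": 10.0, "bid": 12.0}))   # {'price': 11.4}
```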

On this architecture, Fetch.ai has launched a series of follow-up products and services, such as Co-Learn (machine-learning models shared between agents) and Metaverse (a cloud hosting service for agents), to support users in developing their own intelligent agents on the platform.

As for tokens, $FET, Fetch.ai's native token, covers the usual functions of paying gas, staking for verification and purchasing services on the network. More than 90% of $FET has been unlocked; the distribution is as follows:

Distribution of $FET tokens

Since launch, Fetch.ai has raised multiple rounds of financing by diluting its token holdings, most recently US$30 million from DWF Labs on March 29, 2023. Since $FET does not capture value from project revenue, price momentum comes mainly from project updates and AI-track market sentiment. Riding the two waves of AI enthusiasm, the price of $FET surged more than 100% at both the beginning and the end of 2023.


Source: coingecko.com

Compared with how other blockchain projects develop and gain attention, Fetch.ai's path looks more like a Web2 AI startup: polishing the technology, building reputation through continuous financing and broad partnerships, and searching for profit points. This leaves broad room for applications built on Fetch.ai in the future, but the model also makes it less attractive to other blockchain projects and has failed to activate the ecosystem (one of Fetch.ai's founders personally built the Mettalex DEX on Fetch.ai, but it came to nothing). For an infrastructure-oriented project, a languishing ecosystem makes it hard for Fetch.ai's intrinsic value to grow.

4. Generative AI has a promising future

NVIDIA CEO Jensen Huang has called the release of large generative models AI's "iPhone moment", and the scarce resource for producing AI at this stage is infrastructure centered on high-performance computing chips. As the AI sub-track with the most value locked in Web3, AI infrastructure projects have long been the focus of investors' research. It is foreseeable that as chip giants upgrade compute hardware, AI computing power grows and more AI capabilities are unlocked, more Web3 AI infrastructure projects in niche fields will be spawned; we can even look forward to chips designed and produced specifically for AI training in Web3.

Although to-C generative AI products are still experimental, some to-B industrial-grade products have shown great potential. One is digital twin technology, which migrates real-world scenes into the digital realm. Combined with the digital-twin scientific computing platform NVIDIA released for its metaverse vision, and considering the massive data value industry has yet to unlock, generative AI will become an important aid to digital twins in industrial scenarios. Deeper into Web3, the metaverse, digital content creation, real-world assets and more will all be shaped by AI-powered digital twin technology.

The development of new interactive hardware also cannot be ignored. Historically, every hardware innovation in computing has brought earth-shaking change and new opportunities, from the now-commonplace computer mouse to the iPhone 4's multi-touch capacitive screen. The Apple Vision Pro, announced for launch in the first quarter of 2024, has already attracted worldwide attention with its stunning demos, and its actual release should bring unexpected changes and opportunities to many industries. With its fast content production and wide reach, the entertainment sector is usually first to benefit from each hardware upgrade, and that includes the metaverse, on-chain games, NFTs and other visual-entertainment tracks in Web3, all worth readers' long-term attention and research.

In the long run, the development of generative AI is a process of quantitative change producing qualitative change. ChatGPT is, in essence, a solution to reasoning QA, a problem long and widely studied in academia; after long iteration of data and models it finally reached the astonishing level of GPT-4. The same applies to AI in Web3: we are still at the stage of importing Web2 models into Web3, and models built entirely on Web3 data have yet to appear. Getting Web3's own ChatGPT-level killer app closer will require far-sighted teams and substantial resources devoted to Web3's real problems.

At this stage, the underlying technology of generative AI also has many directions worth exploring. One is the Chain-of-Thought (CoT) technique: simply put, by having the model spell out intermediate reasoning steps, chain-of-thought prompting gives large language models a qualitative leap in multi-step reasoning. That said, CoT has not fully solved, and in some respects has exposed, large models' weakness at complex logical reasoning. Interested readers should consult the original Chain-of-Thought paper.
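A minimal illustration of the difference between a plain prompt and a chain-of-thought prompt; llm_complete() is a hypothetical placeholder for any LLM API call:

```python
def llm_complete(prompt: str) -> str:
    """Hypothetical placeholder for a real LLM API call."""
    return ""

question = ("A DAO treasury holds 1,200 ETH, spends 15% on grants, "
            "then stakes half of the remainder. How much ETH is staked?")

# Plain prompt: the model must jump straight to the final answer.
plain_answer = llm_complete(question)

# Chain-of-thought prompting: a cue that elicits intermediate steps
# (1200 - 180 = 1020; 1020 / 2 = 510) markedly improves multi-step accuracy.
cot_answer = llm_complete(question + " Let's think step by step.")
```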

The success of ChatGPT has spawned all kinds of hyped "GPT chains" in Web3, but crudely bolting GPT onto smart contracts does not really meet users' needs. The year since ChatGPT's release is, in the long run, only an instant. Future products should start from Web3 users' real needs; as Web3 technology matures, I believe generative AI will play an important role in Web3, with endless application possibilities worth looking forward to.

references

Google Cloud Tech - Introduction to Generative AI

AWS - What is Generative AI

The Economics of Large Language Models 

Will GANs Become Obsolete Once Diffusion Models Take Off?

Illustrating Reinforcement Learning from Human Feedback (RLHF)

Generative AI and Web3 

Who Owns the Generative AI Platform?

Rethinking One Month After the Apple Vision Pro Release: XR, RNDR, and the Future of Spatial Computing

How is AI minted as NFT?

Emergent Analogical Reasoning in Large Language Models

Akash Network Token (AKT) Genesis Unlock Schedule and Supply Estimates 

Statement

This report is an original work produced by @chenyangjamie, a student of @GryphsisAcademy, under the guidance of @CryptoScott_ETH and @Zou_Block. The authors are solely responsible for all content, which does not necessarily reflect the views of Gryphsis Academy, nor the views of any organization that commissioned the report. Editorial content and decisions are not influenced by readers. Please be aware that the authors may own the cryptocurrencies mentioned in this report. This document is for informational purposes only and should not be relied upon for investment decisions. It is strongly recommended that you conduct your own research and consult an unbiased financial, tax or legal advisor before making any investment decisions. Remember, the past performance of any asset does not guarantee future returns.
