
DEGA Insight: The DEGA AI Infrastructure Framework #2

by BSCN

September 6, 2024


DEGA's second AI framework article takes us deeper into the heart of the project's architecture...

AI Node Layered Architecture

 

This is the second of many articles explaining the underlying components of the DEGA AI Framework, which will form the core of DEGA Realms Game Builder, the DEGA DeFi Chat, and Da Degens social simulation gameplay.

 

Last week we shared the high-level solutions architecture for the DEGA AI Infrastructure Framework. It showed how client software connects with the AI Nodes and how they interact with tools and modules to execute tasks. Now we’ll take a look under the hood to understand what exactly an AI Node is composed of.

 

DEGA AI NODE

 

Figure 1: The layered architecture of a DEGA AI Node.

 

Figure 1 shows the elements that compose a DEGA AI Node, arranged hierarchically by the flow of data in and out. Client requests flow inward from the REST API and WebSocket interfaces, are then processed by the modules in the “AI Framework” layer, and finally the Tools tap into external data sources or system APIs to execute their own requests. Let’s dig a little deeper into each module set.
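To make that inward flow concrete, here is a minimal Python sketch of the three layers passing a request along. The class and method names (GatewayLayer, AIFrameworkLayer, ToolsLayer, handle, process, execute) are illustrative assumptions, not DEGA’s actual API.

```python
# Illustrative only: a request travels Gateway -> AI Framework -> Tools.

class ToolsLayer:
    """Taps into external data sources or system APIs to execute requests."""
    def execute(self, action: dict) -> dict:
        # e.g. query a search index, call a trading bot, read on-chain data
        return {"action": action, "result": "stub"}

class AIFrameworkLayer:
    """Processes a request with LLMs, agents and workflows, using Tools as needed."""
    def __init__(self, tools: ToolsLayer):
        self.tools = tools

    def process(self, request: dict) -> dict:
        # An agent would plan here; we simply forward a single action for illustration.
        return self.tools.execute({"task": request["task"]})

class GatewayLayer:
    """Entry point (REST API / WebSocket): authorization first, then dispatch inward."""
    def __init__(self, framework: AIFrameworkLayer):
        self.framework = framework

    def handle(self, request: dict) -> dict:
        if not request.get("api_key"):            # authorization check
            raise PermissionError("missing API key")
        return self.framework.process(request)    # inward flow to the AI Framework layer

node = GatewayLayer(AIFrameworkLayer(ToolsLayer()))
print(node.handle({"api_key": "demo", "task": "summarize market data"}))
```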

 

Modules

 

Gateway (REST API + WebSocket): The entry point for client requests. A client can be a chat application interface, a video game NPC, or another backend system making an API call. This layer handles authorization, payments, I/O logging, network throttling, and the other subsystems that protect the AI Node.
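As an illustration of the kind of protective subsystems this layer describes (authorization, throttling, I/O logging), here is a minimal, hypothetical sketch; none of these names come from the DEGA codebase.

```python
# Illustrative gateway-style checks: authorize, rate-limit, log, then accept.
import time
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("gateway")

class TokenBucket:
    """Simple per-client rate limiter (network throttling)."""
    def __init__(self, rate_per_sec: float, capacity: int):
        self.rate, self.capacity = rate_per_sec, capacity
        self.tokens, self.last = float(capacity), time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

def handle_request(client_id: str, payload: dict, buckets: dict, authorized: set) -> dict:
    if client_id not in authorized:                       # authorization
        raise PermissionError(f"client {client_id} is not authorized")
    bucket = buckets.setdefault(client_id, TokenBucket(rate_per_sec=5, capacity=10))
    if not bucket.allow():                                # throttling
        raise RuntimeError("rate limit exceeded")
    log.info("request from %s: %s", client_id, payload)   # I/O logging
    return {"status": "accepted", "payload": payload}

buckets: dict = {}
print(handle_request("alice", {"task": "quote"}, buckets, authorized={"alice"}))
```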

 

AI Framework: The layer containing all the components related to artificial intelligence, machine learning, and data processing. Some of its components, sketched in code after the list below, are:

 

  • LLMs: Large Language Models, the state-of-the-art AI architecture powering a new generation of automations that were previously impossible.
  • AI Agents: Autonomous entities programmed to carry out specific tasks, make decisions, and collaborate towards a common goal. Agents use one or more LLMs and connect them with the tools needed to perform their tasks. You can think of an agent as a wrapper that holds the context for the LLMs: things like short-term and long-term memory are specified in the agent configuration so the LLMs can retain state over time.
  • Workflows: Based on a graph data structure, workflows provide a “path” for the AI Nodes to operate on. They can be pre-built or assembled dynamically by the AI agents themselves, which allows for explicit control in critical operations as well as the flexibility to expand the capabilities, and therefore the “intelligence”, of the AI Node.
  • Metadata: Most AI agents currently in development account only for explicitly coded rules and parameters. At DEGA we believe a truly autonomous AI Infrastructure Framework needs a bottom-up feedback loop: the design of the AI Node, its agents, its workflows, and its other modules shouldn’t be set in stone. The whole system needs to be fluid and adaptive. The metadata module holds descriptive data on how the workflows should be structured, which LLMs to call, and the credentials to access third-party APIs, among other things.
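To ground these ideas, here is a minimal sketch of an agent configuration with short- and long-term memory, a workflow expressed as a graph of steps, and a metadata record describing how the pieces fit together. All class and field names are illustrative assumptions, not DEGA’s actual schema.

```python
# Illustrative data structures for agents, workflows, and metadata.
from dataclasses import dataclass, field

@dataclass
class AgentConfig:
    name: str
    llms: list[str]                                               # LLM backends the agent may call
    short_term_memory: list[str] = field(default_factory=list)    # rolling conversation context
    long_term_memory_store: str = "vector-db"                     # where durable state is kept

@dataclass
class Workflow:
    """A directed graph of named steps the AI Node walks through."""
    edges: dict[str, list[str]]                                   # step -> next steps

    def next_steps(self, step: str) -> list[str]:
        return self.edges.get(step, [])

@dataclass
class NodeMetadata:
    """Descriptive data: which workflow shape to build, which LLMs to call,
    and which third-party credentials to use."""
    workflow_template: Workflow
    default_llm: str
    api_credentials: dict[str, str]

trader = AgentConfig(name="defi-trader", llms=["llm-a", "llm-b"])
flow = Workflow(edges={"fetch_prices": ["analyze"], "analyze": ["execute_trade"]})
meta = NodeMetadata(workflow_template=flow, default_llm="llm-a",
                    api_credentials={"exchange": "example-key"})  # placeholder credential
print(trader.name, flow.next_steps("fetch_prices"), meta.default_llm)
```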

 

Tools: The layer with the connectors and software packages needed to operate third-party components and systems. Some of these components might be downloadable modules, while others might be lightweight clients for accessing an external API. As you can see in Figure 1, we list a few examples (a minimal connector sketch follows the list):

 

  • Search to query the web
  • Hummingbot to perform trading on both CEXs and DEXs
  • The Graph for accessing on-chain data
  • And other blockchain modules.
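As a sketch of how such connectors might share a uniform interface behind the Tools layer, here is a hypothetical example; the connector classes below are illustrative stubs that mirror the examples above, not shipped DEGA modules.

```python
# Illustrative tool connectors exposing one common "call" contract.
from abc import ABC, abstractmethod

class ToolConnector(ABC):
    """Common contract every tool exposes to the AI Framework layer."""
    @abstractmethod
    def call(self, request: dict) -> dict: ...

class WebSearchConnector(ToolConnector):
    def call(self, request: dict) -> dict:
        # would query a search API; stubbed here
        return {"results": [f"stub result for {request['query']}"]}

class OnChainDataConnector(ToolConnector):
    def call(self, request: dict) -> dict:
        # would query an indexer such as The Graph; stubbed here
        return {"data": f"stub on-chain data for {request['subgraph']}"}

REGISTRY: dict[str, ToolConnector] = {
    "search": WebSearchConnector(),
    "onchain": OnChainDataConnector(),
}

print(REGISTRY["search"].call({"query": "BNB price"}))
```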

 

It’s important to note that these are just examples. As the framework grows, more connectors and packages will be integrated to increase the capabilities of the AI Nodes.

 

Application Example: Liquid Reality

 

In the previous article, we highlighted potential use cases for the DEGA AI Infrastructure Framework. We’d like to add a new one this week. We call it “Liquid Reality”.

 

The internet required its own infrastructure, such as CDNs, and client standards like HTML. We predict the next era of content distribution will be powered by powerful generative AI systems and blockchain-automated payments. This framework will need new standards for porting data from one medium to another.

 

NVIDIA is already creating a set of tools and services for the Universal Scene Description (USD) framework, which allows for seamless 3D computer graphics data exchange. However, everything from music to 2D assets will need similar standards and intermediary “converters” of content from one medium to another. The DEGA AI Nodes will be part of the pipeline that automates a frictionless exchange of assets, which is why we chose the name “Liquid Reality”.
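To illustrate the “converter” idea in the abstract, here is a purely hypothetical sketch of registering format-to-format converters and chaining them into a pipeline; the formats and functions are made up for illustration and imply nothing about DEGA’s eventual design.

```python
# Illustrative converter registry: port an asset through a chain of formats.
from typing import Callable

Converter = Callable[[bytes], bytes]
CONVERTERS: dict[tuple[str, str], Converter] = {}

def register(src: str, dst: str, fn: Converter) -> None:
    CONVERTERS[(src, dst)] = fn

def convert(asset: bytes, path: list[str]) -> bytes:
    """Walk a chain of formats, applying each registered converter in turn."""
    for src, dst in zip(path, path[1:]):
        asset = CONVERTERS[(src, dst)](asset)
    return asset

# Hypothetical converters for illustration only.
register("concept-text", "usd-scene", lambda b: b + b" -> 3D scene description")
register("usd-scene", "game-asset", lambda b: b + b" -> engine-ready asset")

print(convert(b"a forest village", ["concept-text", "usd-scene", "game-asset"]))
```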

 

Special Thanks

 

We’d like to give special thanks to Venus Protocol, the largest lending platform on the BNB Smart Chain, for the grant that has supported our creation of the AI Infrastructure Framework. Additionally, the Da Degens game will integrate the Venus on-chain smart contracts as the first DeFi protocol in our automation roadmap.

 

[Disclaimer: This is a paid press release. BSCN does not endorse and is not responsible for or liable for any content, accuracy, quality, advertising, products, or other materials on this page. The project team has purchased this advertisement article as part of a package for 9,375,000 Dega tokens. Readers should do their own research before taking any actions related to the company. BSCN is not responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods, or services mentioned in the press release.] 
