A Tokenomics Classification Proposal: How to Take Advantage of It?


Introduction

Welcome to a journey into the realm of tokenomics and blockchain tokens, where innovation meets disruption, and the future of finance is being reshaped.

In this comprehensive blog post, we will delve into the key elements of the groundbreaking report “Tokenomics and blockchain tokens: A design-oriented morphological framework” that sheds light on the transformative potential of tokenization and decentralized ledger technology. Let’s explore how these concepts are revolutionizing traditional business models and paving the way for a new era of digital economy.

The tokenomics landscape

The report discusses the significant impact of blockchain technology, particularly in decentralizing the source of truth and eliminating the need for centralized authorities in various economic activities. Here are the key points:

  1. Decentralization of Source of Truth: By utilizing a peer-to-peer disintermediated network, blockchain allows for a shared consensus on a unique source of authenticity without the need for a central authority.
  2. Shift from Centralized Guarantors: With blockchain, transactions can be securely conducted without the need for intermediaries, as the system is algorithmically governed.
  3. Decentralization Beyond Currency: The decentralization introduced by blockchain extends beyond monetary transactions. It empowers individuals with prerogatives previously exclusive to institutions, enabling actions to be transformed into transparent transactions and fostering virtuous behaviors.
  4. Role of Blockchain Tokens: Blockchain tokens represent value in digital form and are essential for incentivizing user interactions within blockchain networks. Understanding the nature of tokens and their classification is fundamental to designing and implementing effective token-based systems.
  5. Token Classification Framework: To address the need for a comprehensive understanding of blockchain tokens, a token classification framework has been developed. This framework aims to provide a structured approach to designing and classifying tokens, offering a tool for describing tokens in detail for various purposes.

Evolution from Economics to Tokenomics

The tokenization process involves encapsulating value into tradeable units known as tokens or coins, expanding beyond purely economic terms to include reputation, work, copyright, utility, and voting rights. This digitalization of value, akin to the Internet’s circulation of digitized information, enables a borderless flow of digitized value, solving the double-spending problem and introducing digital scarcity.

Digital scarcity facilitates a new digital economy with assets that are liquid, divisible, borderless, and potentially appreciating over time, with implications that range from moving beyond a debt-based economy to enhancing people’s lives and democratic processes.

Tokenization allows any form of value to be managed as a digital asset using virtual tokens, with individuals or organizations defining the rules governing them, including token features, monetary policy, and user incentives.

This process creates a self-governed tokenomic system, where rules are programmed by the token designer. This shift from economics to tokenomics signifies a move from passive observation to active design of ecosystem laws to align stakeholders’ behavior with desired goals, emphasizing the pivotal role of tokens in unleashing the disruptive potential of blockchain technology.
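To make the idea of “programming the rules” slightly more concrete, here is a minimal, purely illustrative Python sketch of a token whose supply cap, burn rule, and contribution reward are encoded by the designer; the class name, numbers, and rules are assumptions for illustration and do not come from the report.

```python
# Illustrative only: a toy token whose monetary policy and incentive rule
# are programmed by the designer rather than observed as emergent laws.

class SimpleToken:
    def __init__(self, max_supply):
        self.max_supply = max_supply   # monetary policy: hard supply cap
        self.total_supply = 0
        self.balances = {}

    def mint(self, account, amount):
        if self.total_supply + amount > self.max_supply:
            raise ValueError("mint would exceed the programmed supply cap")
        self.balances[account] = self.balances.get(account, 0) + amount
        self.total_supply += amount

    def burn(self, account, amount):
        if self.balances.get(account, 0) < amount:
            raise ValueError("insufficient balance to burn")
        self.balances[account] -= amount
        self.total_supply -= amount    # burning shrinks the circulating supply

    def reward_contribution(self, account, units_of_work):
        # user incentive: one token minted per unit of contributed work
        self.mint(account, units_of_work)


token = SimpleToken(max_supply=1_000_000)
token.reward_contribution("alice", 100)  # incentive rule in action
token.burn("alice", 20)                  # programmable scarcity
print(token.total_supply)                # 80
```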

Rationale to Token Classification

The underlying rationale for token classification emphasizes the crucial yet challenging task of comprehensively describing the nature of tokens.

Tokens can be viewed as units of value created by organizations to self-govern their business models, empower users, and facilitate reward distribution. Their roles extend beyond mere currency applications to include granting access to services, rewarding community contributions, and regulating governance.

Tokens do not have an intrinsic definition but derive their nature from what they represent, encapsulating value within a regulated ecosystem. The diverse value represented by tokens, such as discounts, ownership proofs, or rewards, poses a challenge in defining token-represented value universally.

Trust emerges as a fundamental aspect of token value, where token holders trust issuers to honor the rights associated with tokens. Tokens can represent various forms of trust, serving as quantifiable representations of decentralized and disintermediated trust.

The rapid development and deployment of tokens have been experimental and iterative, leading to a diverse range of features and applications. Defining the nature of tokens requires a dynamic and iterative approach due to their evolving nature and broad scope of innovation.

A comprehensive taxonomy is essential to classify the various branches of value representations conveyed by tokens. The proposed framework aims to assist non-technical decision-makers in navigating the tokenomics space, making informed strategic decisions in token design processes, and facilitating comparisons among different design intentions and outcomes.

Several token classifications and taxonomies have been proposed:

  • Oliveira et al. conducted a comprehensive study characterizing tokens using 13 parameters and identifying 8 token archetypes.
  • Thomas Euler’s Token Classification Framework categorizes tokens based on purpose, utility, legal status, underlying value, and technical layer, aiming to capture the multifaceted nature of tokens.
  • William Mougayar’s framework focuses on three classification dimensions: role, features, and purpose, particularly emphasizing business-related aspects and incentives for token holders.
  • The “Taxonomy Report on Cryptoassets” by CryptoCompare provides valuable approaches to token classification, examining over 200 tokens based on 30 unique attributes related to economic, legal, and technological features.
  • The Token Taxonomy Framework, developed by the Token Taxonomy Initiative (now InterWork Alliance), aims to establish a common standard for describing and designing tokens. This framework classifies tokens based on five characteristics: Token Type, Token Unit, Value Type, Representation Type, and Template Type, with a focus on interoperability and creating a shared language among technology experts and business practitioners.

Various studies, such as those by Cong and He, Sockin and Xiong, Chiu and Koeppl, and Catalini and Gans, explore topics like consensus generation, utility tokens, cryptocurrency design, ICO funding mechanisms, and the market for tokens. These studies analyze the implications of blockchain technology on economic properties, consensus quality, and the security of digital assets. The literature on game theoretical aspects of blockchain is closely linked to the token classification framework, enhancing the understanding of token design and classification within the blockchain ecosystem.

GMA Methodology for Tokenomics

The methodology employed in the study utilizes General Morphological Analysis (GMA) as a structured approach to address the diverse nature of tokens and their classifications. GMA, developed by Fritz Zwicky, is designed to describe and evaluate complex problems characterized by non-quantifiable and multi-dimensional properties by mapping all possible relationships or configurations within the problem complex. This morphological approach aims to identify recurring patterns and construct non-quantified inference models.

GMA is considered a comprehensive methodology suitable for describing tokens, encompassing their non-quantifiable and multi-dimensional properties effectively. In the context of token classification frameworks, GMA facilitates the creation of a morphological field through the following steps:

  1. defining the dimensions of the problem (parameters for token mapping)
  2. specifying a spectrum of values for each dimension
  3. generating a morphological field that includes all possible combinations of values across the dimensions. The total number of configurations in the problem complex is given by the product of the number of values under each parameter (a short sketch follows the list).
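As a minimal sketch of those three steps, the snippet below builds a toy morphological field from three hypothetical dimensions (placeholders, not the paper’s actual parameters) and verifies that its size equals the product of the number of values per dimension.

```python
from itertools import product

# Steps 1 and 2: dimensions of the problem and a spectrum of values for each.
# These dimensions and values are illustrative placeholders.
dimensions = {
    "Permission": ["permissionless", "permissioned"],
    "Fungibility": ["fungible", "non-fungible", "hybrid"],
    "Tradability": ["tradable", "non-tradable", "delegable"],
}

# Step 3: the morphological field is every combination of one value per dimension.
field = list(product(*dimensions.values()))

# Total configurations = product of the number of values under each parameter.
total = 1
for values in dimensions.values():
    total *= len(values)

print(total)       # 2 * 3 * 3 = 18
print(len(field))  # 18, one tuple per configuration
```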

The GMA methodology is initially applied to conduct a comparative analysis of token classification frameworks. This process involves normalizing the frameworks to a morphological field representation to enable meaningful comparisons. By normalizing the frameworks, the study aims to identify recurring morphological dimensions and values, address gaps, reveal token properties, and eliminate redundant or misleading definitions within the classification frameworks.

The Proposal

The analysis revealed varying approaches among authors, with some prioritizing multi-dimensionality while others focused on detailed alternatives for selected dimensions.

The number of dimensions in a framework quantifies its multi-dimensionality, while the average number of values per dimension reflects the level of detail available for token representation. The total number of configurations, given by the product of the number of values under each dimension, indicates the overall size of the morphological field. Multi-dimensionality was emphasized to achieve a comprehensive framework.
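Once a framework is normalized to a morphological field (a mapping from dimensions to value lists), the three comparison metrics are simple to compute; the snippet below uses a made-up two-dimension framework purely to show the arithmetic, not one of the surveyed frameworks.

```python
import math

def framework_metrics(framework):
    """Return (number of dimensions, average values per dimension,
    total configurations) for a framework given as {dimension: [values]}."""
    n_dims = len(framework)
    avg_values = sum(len(v) for v in framework.values()) / n_dims
    total_configs = math.prod(len(v) for v in framework.values())
    return n_dims, avg_values, total_configs

# Made-up example framework with two dimensions.
toy = {
    "Purpose": ["currency", "utility", "security"],
    "Layer": ["native", "non-native"],
}
print(framework_metrics(toy))  # (2, 2.5, 6)
```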

Based on the comparative assessment, a new morphological token classification framework was proposed. The 42 dimensions found across the eight frameworks were critically analyzed and consolidated into 14 dimensions spanning three domains: Technology, Behavior, and Coordination. With its 14 dimensions and almost 5 million possible configurations, the framework aims to provide guidance for describing tokens comprehensively while ensuring comparability.

Maps of classification frameworks. The size of the bubble represents the number of all possible configurations (Oliveira is out of scale). Source: Blockchain: Research and Applications 3 (2022) 100069

Morphological token classification framework

The following sections walk through the dimensions of the proposed morphological token classification framework, grouped into three domains: Technology, Behavior, and Coordination.

Proposed morphological token classification framework. Source: Blockchain: Research and Applications 3 (2022) 100069

The Technology Domain

The Technology domain of the token classification framework consists of four dimensions (a brief classification sketch follows the list):

  1. Chain: This dimension describes the blockchain characteristics where the token is issued. Tokens can be chain-native, created with the blockchain, and essential to its functioning. Tokens can also be issued on a new chain with forked code, a forked chain with forked code, or on top of an existing protocol like ERC20 tokens.
  2. Permission: This dimension relates to the underlying blockchain’s characteristic regarding participation in the network. A blockchain is “permissionless” when open to everyone, while it is “permissioned” when nodes need accreditation to join.
  3. Number of Blockchains: Indicates if the token can be used on a single chain or across different chains. Cross-chain tokens, though still emerging, are expected to play a significant role in Decentralized Finance (DeFi). Examples include BNB, Polygon, and MIMO.
  4. Representation Type: This dimension distinguishes between “Common” and “Unique” tokens. Common tokens share the same properties; owners simply hold a balance, and any change to those shared properties affects all tokens at once. Unique tokens are individually identified and tracked, so they do not benefit from bulk property changes. This dimension should not be confused with fungibility, which refers to the interchangeability of tokens based on value.
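To make the four Technology dimensions concrete, here is a small hedged sketch describing a hypothetical ERC-20-style token along them; the dataclass and the chosen values are illustrative assumptions, not part of the framework’s specification.

```python
from dataclasses import dataclass

@dataclass
class TechnologyProfile:
    chain: str                  # chain-native / new chain / forked chain / on top of existing protocol
    permission: str             # permissionless / permissioned
    number_of_blockchains: str  # single chain / cross-chain
    representation_type: str    # common / unique

# A typical ERC-20-style token (illustrative values only).
erc20_like = TechnologyProfile(
    chain="on top of an existing protocol",
    permission="permissionless",
    number_of_blockchains="single chain",
    representation_type="common",
)
print(erc20_like)
```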

The Behavior Domain

The Behavior domain of the token classification framework consists of six dimensions that define the token’s inherent functional characteristics (a brief example follows the list):

  1. Burnability: This dimension determines if the token can be burned by the issuer to create artificial scarcity, terminate a right, or revoke access. Burning tokens can adjust the overall token supply and is often related to the ecosystem’s monetary policy. Examples of tokens that use burning for programmable scarcity include BNB, CHSB, and XLM.
  2. Expirability: Indicates if the token can expire programmatically after a certain time, suitable for tokens providing discounts or temporary access to services.
  3. Spendability: Determines if the token can be spent within the ecosystem, for example to access services or pay fees. Non-spendable tokens, typically used for reputation or voting power, help prevent misbehavior in governance.
  4. Fungibility: Refers to the token’s interchangeability with others of the same value. Tokens can be fungible, non-fungible, or hybrid, with limited fungibility within a subset of the supply. Hybrid fungibility is common in tokens representing partitioned real assets.
  5. Divisibility: Defines if the token can be split into sub-units (Fractional), represents an indivisible Whole, or is a Singleton token with a quantity of 1 that cannot be divided. Examples include indivisible tokens like CryptoPunks or Bored Ape.
  6. Tradability: Illustrates whether the token can change ownership within a platform or on secondary markets. Most tokens are tradable, offering swift exchangeability in DeFi applications; non-tradable tokens are common for those representing reputation or voting rights. Additionally, tokens can be Delegable, allowing the transfer of rights without changing ownership, which is crucial in blockchain ecosystems that use the Delegated Proof of Stake consensus protocol, such as Tezos and EOS.
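The six Behavior dimensions can be captured in the same way; the sketch below contrasts a hypothetical fungible utility token with a CryptoPunks-style collectible, with all values chosen purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class BehaviorProfile:
    burnable: bool
    expirable: bool
    spendable: bool
    fungibility: str   # fungible / non-fungible / hybrid
    divisibility: str  # fractional / whole / singleton
    tradability: str   # tradable / non-tradable / delegable

# Hypothetical fungible utility token used to pay fees in its ecosystem.
utility_token = BehaviorProfile(
    burnable=True, expirable=False, spendable=True,
    fungibility="fungible", divisibility="fractional", tradability="tradable",
)

# CryptoPunks-style collectible: a single, indivisible, non-fungible unit.
collectible = BehaviorProfile(
    burnable=False, expirable=False, spendable=False,
    fungibility="non-fungible", divisibility="singleton", tradability="tradable",
)
```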

The Coordination Domain

The Coordination domain in tokenomics focuses on how tokens are utilized to encourage specific behaviors among ecosystem participants. It consists of four dimensions (a sketch of the complete framework follows the list):

  1. Underlying Value: This dimension explains the source of the token’s inherent value and the factors influencing its value fluctuations. It can be categorized into Asset-based, Network Value, and Share-like, depending on the underlying value’s nature.
  2. Supply Strategy: This dimension outlines the token supply structure and monetary policy implemented by the issuer. It influences the token’s dynamics, especially in Network Value tokens, and defines the Token Velocity, which measures how often token ownership changes.
  3. Incentive Enablers: This dimension is essential for token-based incentive schemes as it describes the characteristics necessary for incentivized behaviors to occur. It includes values like the Right to work, Right to use, Right to vote, Unit of account, Medium of exchange, and Store of value.
  4. Incentive Drivers: Complementary to Incentive Enablers, Incentive Drivers are token characteristics that actively promote incentivized behaviors among stakeholders. Values include Get access, Get discount, Get revenue, Get reward, Dividend/Earning potential, Appreciation potential, Participate in governance, and Gain reputation.
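Pulling the three domains together, the proposed framework can itself be represented as a morphological field in the GMA sense. The sketch below encodes the dimensions with the values named in this article; the Supply Strategy values are placeholders (the article does not enumerate them), so the resulting configuration count is only indicative and will not exactly reproduce the paper’s figure of almost 5 million.

```python
import math

# Dimensions and values as named in this article; Supply Strategy values are
# hypothetical placeholders because the article does not list them.
framework = {
    # Technology domain
    "Chain": ["chain-native", "new chain, forked code",
              "forked chain, forked code", "on top of existing protocol"],
    "Permission": ["permissionless", "permissioned"],
    "Number of blockchains": ["single chain", "cross-chain"],
    "Representation type": ["common", "unique"],
    # Behavior domain
    "Burnability": ["burnable", "non-burnable"],
    "Expirability": ["expirable", "non-expirable"],
    "Spendability": ["spendable", "non-spendable"],
    "Fungibility": ["fungible", "non-fungible", "hybrid"],
    "Divisibility": ["fractional", "whole", "singleton"],
    "Tradability": ["tradable", "non-tradable", "delegable"],
    # Coordination domain
    "Underlying value": ["asset-based", "network value", "share-like"],
    "Supply strategy": ["placeholder A", "placeholder B", "placeholder C"],  # hypothetical
    "Incentive enablers": ["right to work", "right to use", "right to vote",
                           "unit of account", "medium of exchange", "store of value"],
    "Incentive drivers": ["get access", "get discount", "get revenue", "get reward",
                          "dividend/earning potential", "appreciation potential",
                          "participate in governance", "gain reputation"],
}

total_configurations = math.prod(len(v) for v in framework.values())
print(len(framework), "dimensions")            # 14
print(total_configurations, "configurations")  # depends on the placeholder values
```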

Conclusion

In conclusion, the report on tokenomics and blockchain tokens offers a glimpse into the transformative power of decentralized ledger technology and tokenization.

By embracing these concepts, businesses can unlock new opportunities for value creation, peer-to-peer transactions, and decentralized governance. As we continue to explore the potential of blockchain technology, it is essential to consider the regulatory implications, token design principles, and the broader impact on traditional business models.

The future of finance is evolving, and tokenomics is at the forefront of this digital revolution.
