🧮 Pricing Algorithm

The pricing methodology consists of four steps: Data Ingestion, Wash Trading Detection, Feature Engineering, and Modeling.

Data Ingestion

Currently, data is gathered only for Ethereum-hosted collections, but a roadmap for onboarding other L1s and L2s is defined. Data is drawn from a number of sources:

  • NFT marketplaces such as OpenSea, LooksRare and X2Y2

  • NFT aggregators such as Gem, Genie and Blur

  • Decentralized NFT Applications such as SudoSwap and BendDAO

  • Pricing Data for Bitcoin and Ethereum

  • Financial Data such as the DJIA and S&P 500

Wash Trading

Several methods are employed for Wash Trading Detection:

  • Directed Graph Encoding of transaction data for wash trading pattern recognition

  • Flagging system for Anomalous Price Change
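The detection logic itself is not specified in this document; as an illustrative sketch, the two methods above might look as follows (the ownership-cycle rule and the z-score threshold are assumptions, not MetaQuants' actual implementation):

```python
from collections import defaultdict

def find_wash_cycles(trades):
    """Flag tokens whose ownership path revisits a wallet (A -> B -> A),
    a common wash-trading signature in a directed trade graph.

    trades: chronological list of (token_id, seller, buyer) tuples.
    """
    holders = defaultdict(set)  # token_id -> wallets that previously held it
    flagged = set()
    for token_id, seller, buyer in trades:
        if buyer in holders[token_id]:
            flagged.add(token_id)  # token cycled back to a prior holder
        holders[token_id].add(seller)
    return flagged

def flag_price_anomalies(prices, threshold=3.0):
    """Flag sale indices whose price deviates from the mean by more than
    `threshold` standard deviations (a simple z-score rule)."""
    n = len(prices)
    mean = sum(prices) / n
    std = (sum((p - mean) ** 2 for p in prices) / n) ** 0.5
    if std == 0:
        return []
    return [i for i, p in enumerate(prices) if abs(p - mean) / std > threshold]
```

In practice the directed-graph encoding would also track longer cycles and shared funding wallets, but the round-trip check above captures the simplest wash-trading pattern.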

Feature Engineering

While most existing solutions consider only an asset's traits, MetaQuants takes into account:

  • Macroeconomic Conditions - Proxy Variables:

    • S&P500

    • DJIA

    • Feature engineering on the S&P 500 and DJIA to extract insights from the raw data

  • Industry Conditions - Proxy Variables:

    • Bitcoin

    • Ethereum

    • Feature engineering on Bitcoin and Ethereum to extract insights from the raw data

  • NFT Collection Conditions:

    • Feature engineering based on indicators from traditional financial markets, for example price volatility

    • Feature engineering based on domain-specific indicators, for example the collection's floor price

  • NFT-Specific Conditions:

    • Rarity Score

    • New features that account for the past performance of individual assets, for example the average price of previous sales
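To make the feature categories concrete, here is a hedged sketch of collection-level and asset-level features (the function names, rolling window, and input shapes are illustrative assumptions):

```python
def collection_features(sales, window=5):
    """Sketch of collection-level features from a chronological list of
    sale prices in ETH: rolling volatility (a traditional-market
    indicator) and floor price (a domain-specific indicator)."""
    recent = sales[-window:]
    mean = sum(recent) / len(recent)
    volatility = (sum((p - mean) ** 2 for p in recent) / len(recent)) ** 0.5
    return {"floor_price": min(sales), "rolling_volatility": volatility}

def asset_features(past_sales):
    """Asset-level features from an individual token's sale history."""
    if not past_sales:
        return {"avg_past_price": None, "num_sales": 0}
    return {"avg_past_price": sum(past_sales) / len(past_sales),
            "num_sales": len(past_sales)}
```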

After all metrics are generated, a forward variable selection method quantifies the predictive power of each indicator and excludes redundant variables from the optimal model.
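Forward variable selection can be sketched as a greedy loop: start from an empty model, repeatedly add the candidate feature that most reduces the fitting error, and stop when no candidate helps. The sketch below scores candidates with ordinary least squares and in-sample MSE; the actual scoring criterion is not stated in this document:

```python
def _ols_mse(X_cols, y):
    """Fit y on the given feature columns via the normal equations
    (solved by Gaussian elimination) and return the in-sample MSE."""
    n, k = len(y), len(X_cols)
    A = [[sum(X_cols[i][t] * X_cols[j][t] for t in range(n))
          for j in range(k)] for i in range(k)]          # X^T X
    b = [sum(X_cols[i][t] * y[t] for t in range(n)) for i in range(k)]  # X^T y
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k
    for r in range(k - 1, -1, -1):  # back substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    preds = [sum(coef[j] * X_cols[j][t] for j in range(k)) for t in range(n)]
    return sum((p - yt) ** 2 for p, yt in zip(preds, y)) / n

def forward_select(features, y, tol=1e-6):
    """Greedy forward selection: add the feature that most reduces MSE,
    stop when no candidate improves the score by more than `tol`.

    features: dict mapping feature name -> column of values.
    """
    selected, best_mse = [], float("inf")
    while True:
        best = None
        for name, col in features.items():
            if name in selected:
                continue
            mse = _ols_mse([features[s] for s in selected] + [col], y)
            if mse < best_mse - tol:
                best, best_mse = name, mse
        if best is None:
            return selected  # no remaining feature adds predictive power
        selected.append(best)
```

A production version would score candidates on held-out data rather than in-sample error to avoid selecting noise.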

NFT Pricing Algorithm

  • Model

A serious problem with NFT pricing solutions is that most fail to handle the case of overpriced tokens. This is especially costly for lending services: issuing a mispriced loan disincentivizes its repayment, leaving the protocol with unrealized losses. MetaQuants employs a model with a loss function specifically defined to penalize overpredictions more than underpredictions.
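The exact loss function is not disclosed here; one standard way to penalize overpredictions more heavily is an asymmetric squared error, sketched below (the overprediction weight of 3.0 is an arbitrary illustration):

```python
def asymmetric_loss(y_true, y_pred, over_weight=3.0):
    """Asymmetric squared error: overpredictions (y_pred > y_true) are
    weighted `over_weight` times more than underpredictions, so a model
    trained under this loss learns to err on the conservative side."""
    total = 0.0
    for yt, yp in zip(y_true, y_pred):
        err = yp - yt
        w = over_weight if err > 0 else 1.0  # penalize overpricing harder
        total += w * err * err
    return total / len(y_true)
```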

  • Interpretability

Evaluation metrics are indicators of a model's ability to predict unseen data. A single figure, however, brings clarity neither to the algorithm's feature selection nor to its computational approach. As already stated, existing pricing services function as black boxes, which undermines them regardless of their performance. MetaQuants enumerates the factors that determine a token's value and specifies the impact of each variable in terms of ETH.
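For a linear model, per-variable impact in ETH can be reported as the coefficient times the feature's deviation from a baseline. The sketch below is an assumption about how such a breakdown could be computed; the feature names are hypothetical:

```python
def eth_contributions(coefs, baseline, asset):
    """Attribute a linear model's predicted price to each feature, in ETH:
    contribution = coefficient * (feature value - baseline value).

    coefs, baseline, asset: dicts keyed by feature name.
    """
    return {name: coefs[name] * (asset[name] - baseline[name])
            for name in coefs}
```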

  • Confidence Intervals

There is a wide variety of buyer profiles within the NFT ecosystem - from firm believers in the technology to the so-called "degens". Apart from a point estimate, MetaQuants provides a value range based on the model's residuals, so each user can accommodate their risk appetite. Risk-averse market participants and lending protocols can choose the lower bound of the interval to protect themselves against overpricing. Conversely, bullish traders may select the upper bound for a possible opportunity. The range is dynamically adjusted and accounts for rarity, so different assets have different interval widths. Further, a significance level α can be chosen for a (1 − α) confidence interval. A lower α corresponds to a wider price range, but also an additional level of insurance.
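A minimal residual-based interval can be built from the empirical quantiles of the model's residuals; the sketch below assumes that approach and omits the rarity-dependent width adjustment:

```python
def price_interval(point_estimate, residuals, alpha=0.1):
    """(1 - alpha) interval from the model's empirical residuals: shift
    the alpha/2 and 1 - alpha/2 residual quantiles by the point estimate.
    A smaller alpha yields a wider, more conservative range."""
    r = sorted(residuals)
    n = len(r)
    lo_idx = int((alpha / 2) * (n - 1))
    hi_idx = int((1 - alpha / 2) * (n - 1))
    return point_estimate + r[lo_idx], point_estimate + r[hi_idx]
```

A lender would take the first element of the returned pair as a conservative valuation; a bullish trader might look at the second.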
