the decentralised data ecosystem - where does theAgora fit?

who's who in the (dataFi) zoo....🐵

Alongside the rapid growth in the defi space, there’s an increasing awareness of the value of the emergent decentralised data economy - dataFi - and the projects within it.

We’re going to take a helicopter pass over this ecosystem, look at some of the players, and get a gauge on where theAgora fits and the opportunities this decentralised data economy represents.

Firstly, some background

Data is a tricky business, with a multitude of factors to consider. From the consumer’s perspective: can the data source be trusted - is it accurate, timely and truthful? How can they access it in an acceptable format? How can they discover the data they actually need? From the producer’s side: how do they maintain privacy and control? How do they maximise their opportunities to monetise? It’s a long list of challenges!

Participants in a data ecosystem come in many shapes and sizes: individuals, centralised applications, machine learning algorithms, decentralised exchanges and smart contracts, to name a few. All require or produce data, but their specific needs can differ significantly.

The size and scope of the data ecosystem (centralised and decentralised) is massive and growing, and the problems above are being grappled with constantly.

So tell me about dataFi?

The commonality across dataFi projects is that they seek to link data producers and data consumers while addressing the specific technical requirements of the blockchain space. More importantly, they all seek to use a range of crypto-economic incentive mechanisms to solve many of the data problems listed above.

There’s little value in writing inaccurate data to unstoppable smart contracts and an immutable blockchain ledger - or in consuming it from them.


The hottest sector in dataFi right now is ‘oracles’, the best known being Chainlink (LINK), which leads in both adoption and circulating market capitalisation (in excess of $6.2Bn).

Other participants in this space include Band Protocol ($248m) and DIA ($28m).

Each has a slightly different value proposition, but at their core all aim to create ‘trusted’ data bridges between the off-chain and on-chain worlds - bringing reliable, accurate and timely data from the ‘real’ world to be consumed (for computation) by blockchain smart contracts.

Oracles provide a trusted bridge for streaming data between the ‘off-chain’ and ‘on-chain’ worlds.

In the real world, developers use Application Programming Interfaces (APIs) to allow software applications to share data with one another, and these APIs require varying levels of trust between counterparties.

Oracles like Chainlink can be thought of as trustless APIs, connecting off-chain applications to on-chain applications: smart contracts and dapps. By using combinations of staking, slashing and consensus mechanisms, they look to ensure redundancy and reliability and, critically, that bad actors and bad data are removed so only truthful data is consumed.
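To make those incentive mechanisms concrete, here is a toy sketch of oracle-style aggregation. It is not Chainlink’s actual protocol - the stake-weighted median, the 5% tolerance, and all node names and numbers are invented for illustration - but it shows the basic idea: many staked nodes report a value, the network settles on a consensus, and reports that deviate too far become slashing candidates.

```python
# Toy sketch of oracle-style aggregation (hypothetical, not Chainlink's
# actual implementation). Nodes report a price; the network takes a
# stake-weighted median as truth and flags far-off reports for slashing.

def aggregate(reports, tolerance=0.05):
    """reports: list of (node, stake, value). Returns (consensus, slashed nodes)."""
    # Stake-weighted median: sort by value, walk until half the stake is covered.
    ordered = sorted(reports, key=lambda r: r[2])
    total_stake = sum(stake for _, stake, _ in ordered)
    cumulative = 0
    consensus = ordered[-1][2]
    for _, stake, value in ordered:
        cumulative += stake
        if cumulative >= total_stake / 2:
            consensus = value
            break
    # Nodes deviating more than `tolerance` from consensus lose their stake.
    slashed = [node for node, _, value in reports
               if abs(value - consensus) > tolerance * consensus]
    return consensus, slashed

reports = [("node-a", 100, 1500.0), ("node-b", 80, 1502.0),
           ("node-c", 120, 1499.0), ("node-d", 50, 900.0)]  # node-d is lying
value, slashed = aggregate(reports)
```

Note how the dishonest node-d cannot move the median unless it controls most of the stake, and its wildly wrong report costs it its deposit - the crypto-economics do the policing.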


Another data service, soon to be released, is The Graph - it likely fits under the oracle category too.

Those familiar with centralised APIs may have heard of GraphQL - the cool new(ish) kid on the API block, offering some benefits over the more widely used and understood REST APIs. GraphQL APIs make more complex queries easier to express, often reducing a software application’s need to make multiple data calls or parse large data payloads - in short, they aim to make data transfer between applications faster and easier.

GraphQL APIs provide an efficient way to access complicated data via indexing and advanced querying
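To make the difference concrete, here is a toy Python contrast between the two styles. Everything in it - the user and post records, the “endpoint” functions, the field names - is invented for illustration; it mimics the shape of the two approaches rather than any real API.

```python
# Toy contrast between REST-style and GraphQL-style data access.
# All data, "endpoints" and field names here are invented for illustration.
USERS = {1: {"id": 1, "name": "Ada"}}
POSTS = {1: [{"id": 10, "title": "Oracles 101"}, {"id": 11, "title": "Data lego"}]}

# REST style: one round trip per resource; the client stitches results together.
def rest_get_user(user_id):
    return USERS[user_id]

def rest_get_posts(user_id):
    return POSTS[user_id]

# GraphQL style: the client declares the shape it wants and the server
# resolves all nested fields in a single round trip.
def graphql_query(user_id, fields):
    result = dict(USERS[user_id])
    if "posts" in fields:
        result["posts"] = POSTS[user_id]
    return {k: v for k, v in result.items() if k in fields}

# REST needs two calls, and returns every field whether wanted or not...
profile = {**rest_get_user(1), "posts": rest_get_posts(1)}
# ...GraphQL-style needs one call, returning only the fields requested.
slim = graphql_query(1, fields={"name", "posts"})
```

The payoff scales with nesting depth: a query three relations deep is still one round trip in the GraphQL style, versus a cascade of dependent calls in the REST style.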

The Graph aims to bring the features and benefits of GraphQL-style APIs to the blockchain space, using crypto-economic mechanisms to incentivise participants to create accurate data “graphs”, or indexes, that can be consumed (for a fee) by decentralised applications - a cool project!

Chainlink-style oracles and The Graph are valuable data infrastructure, playing a vital role in the decentralised data economy.

TheAgora is a data marketplace and, like the oracle solutions above, it consists of data producers and consumers exchanging data, with crypto-economic incentive mechanisms to drive data exchange, quality and truthfulness.

Data marketplaces form an important part of any healthy data ecosystem, centralised or not. What’s even more exciting for theAgora are the properties of web3.0 that make decentralised data exchanges infinitely more powerful than their centralised counterparts.

Where it differs from the oracles is in the type and form of data, and in the ways data is discovered and exchanged. Data markets like theAgora can be used for person-to-person dataset or information exchange, person-to-application exchange, or application-to-application exchange. All models are possible under this paradigm.

Imagine Amazon and Netflix had a data baby and called it theAgora! 👶👶

TheAgora can enable a gig economy for data workers to provide data services, from tasks as basic as data cleaning to advanced data analysis. Defi (or any) traders can request and purchase from the long tail of datasets, generating unique insight to create portfolio alpha. Participants can create and sell datasets and information, and earn for doing so. Applications can interact with the data and information within the market to drive their ML/AI models. The opportunities are endless.

Data protocols

hut34’s hutX data exchange protocol powers theAgora, allowing marketplace users to trustlessly secure, share and exchange their data - with privacy, control and auditability where required. The protocol has real-world traction across academia, research and enterprise.

Other high-profile data protocols include Ocean Protocol ($198m) and Erasure Protocol ($244m) - high quality projects building innovative solutions towards a similar goal: opening up the data economy and allowing secure, fair and free trade of data assets.

Centralised data services

Organisations like Messari, Nansen.ai and Dune Analytics are examples of the rapidly expanding blockchain data and information business. They consume, aggregate and analyse raw data to produce valuable information and insights.

As the space grows and becomes even more fragmented, their role will only become more important, as people seek trusted sources of information to navigate an increasingly complex environment.

We believe players like these will act as both consumers and producers of data for exchanges like theAgora - buying unique datasets to drive their analytics, and using marketplaces as a means of distributing their value-added data products.

Building data lego - composable data protocols

With Chainlink providing live data feeds, The Graph indexed data and complex queries, theAgora by hut34 a data marketplace, and Ocean secure data exchange protocols, we envisage a world where these different projects operate synergistically - creating a data ecosystem greater than the sum of its parts: interoperable data lego.

Just as the goal of defi is “composability” - creating ‘money lego’ - there’s no reason dataFi protocols can’t, or shouldn’t, strive for the same.

Imagine a Chainlink node operator produces Chainlink-trusted datasets, packaging them as hutX smart data objects to sell on theAgora. A data service like Nansen.ai might purchase that dataset to produce value-added analytics, either to use themselves or to sell back to the market at a higher price - perhaps also via a data exchange like theAgora.

Or imagine a defi trader makes a Request for Data (RFD) on theAgora for a specific dataset they believe will give them unique market insight, and the request is fulfilled by a single data analyst working on the other side of the world. That analyst could fulfil the request by querying trusted base data from The Graph, merging it with a dataset purchased from Messari, and returning the finished analytics to the trader to act upon (and perhaps profit from).

Each component of this dataFi lego works together to create value and insight, and each participant is fairly paid for their contribution.
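The pipeline above can be sketched in a few lines. Every name, fee and number below is hypothetical - invented to illustrate the composable flow of data and payments, not any real API of Chainlink, hutX or a data service.

```python
# Hypothetical sketch of a composable dataFi pipeline. All names, fees and
# figures are invented for illustration; no real protocol API is used.

def chainlink_style_feed():
    # Step 1: an oracle node produces a trusted base dataset.
    return {"prices": [1500.0, 1502.0, 1499.0], "provenance": ["oracle-node"]}

def package_as_hutx(dataset, seller, fee, ledger):
    # Step 2: packaged for sale on a marketplace; the seller is paid on sale.
    ledger.append((seller, fee))
    return dataset

def add_analytics(dataset, analyst, fee, ledger):
    # Step 3: a data service buys the set, adds value, and is paid in turn.
    ledger.append((analyst, fee))
    prices = dataset["prices"]
    dataset["mean_price"] = sum(prices) / len(prices)
    dataset["provenance"].append(analyst)
    return dataset

ledger = []  # records who gets paid for which contribution
report = add_analytics(
    package_as_hutx(chainlink_style_feed(), "node-operator", 10, ledger),
    "analytics-service", 25, ledger)
```

The point of the sketch is the shape, not the arithmetic: each stage adds value on top of the last, provenance travels with the data, and the ledger captures a payment to every contributor along the chain.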

The opportunities ahead…

As it stands, the cumulative market cap of some of the better known players in the dataFi space is less than $6Bn - a drop in the ocean compared to the potential size of a decentralised data economy.

The space is growing rapidly, but it remains very early days.

Similar to the Cambrian explosion of data solutions in the centralised world, it’s unlikely to be winner-takes-all: there’s a huge variance of use cases, participants, and successful protocols and services. We’re excited to be a part of this new data economy.


If you want to learn more or join the conversation, don’t forget to follow us on Twitter or join the community on Discord.

join us on our data odyssey

theAgora Team

*this was a very brief traverse, and we’ve missed many (many) other great players, from D5 - a data science DAO - to singularitynet.io, working on decentralised AGI, to Streamr, a blockchain-based IoT messaging solution. There are too many to cover in such a short piece.