The decentralised data ecosystem - where does theAgora fit?
Who's who in the (dataFi) zoo…
Alongside the rapid growth in the defi space, there's an increasing awareness of the value of the emergent decentralised data economy - dataFi - and its subset of projects.
We're going to take a helicopter pass over this ecosystem, look at some of the players, and get a gauge on where theAgora fits and the opportunities this decentralised data economy represents.
Firstly, some background
Data is tricky business, with a multitude of factors to be considered. From the consumer perspective: can the data source be trusted - is it accurate, timely and truthful? How can they access it in an acceptable format? How can they discover the data they actually need? How do producers maintain privacy and control? How do they maximise their opportunities to monetise? …it's a long list of challenges!
Participants in a data ecosystem come in many shapes and sizes: individuals, centralised applications, machine learning algorithms, decentralised exchanges and smart contracts, to name a few. All require or produce data, but their specific needs and requirements can differ significantly.
The size and scope of the data ecosystem (centralised and decentralised) is massive and growing, and the problems above are being grappled with constantly.
So tell me about dataFi?
What the dataFi projects have in common is that they all seek to link data producers and data consumers while addressing the specific technical requirements of the blockchain space. More importantly, they all seek to use a range of crypto-economic incentive mechanisms to solve many of the data problems listed above.
It's not much help writing inaccurate data to - or reading it from - unstoppable smart contracts and an immutable blockchain ledger.
The hottest sector in dataFi right now is 'oracles', the best known being Chainlink (LINK) - the leader in both adoption and circulating market capitalisation, in excess of $6.2bn.
Other participants in this space include Band Protocol ($248m) and DIA ($28m).
Each has a slightly different value proposition, but at their core all aim to create 'trusted' data bridges between the 'off-chain' and 'on-chain' worlds - bringing reliable, accurate and timely data from the 'real' world to be consumed (for computation) by blockchain smart contracts.
Oracles provide a trusted bridge for streaming data between the 'off-chain' and 'on-chain' worlds.
In the real world, developers use Application Programming Interfaces (APIs) to let software applications share data with one another, and these APIs require varying levels of trust between counterparties.
Oracles like Chainlink can be thought of as trustless APIs, connecting 'off-chain' applications to 'on-chain' applications: smart contracts and dapps. By using combinations of staking, slashing and consensus mechanisms, they look to ensure redundancy and reliability and, critically, that bad actors and bad data are removed so that only truthful data is consumed.
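As a toy illustration of the idea (not Chainlink's actual mechanism - the node names, prices and threshold here are all invented), an oracle network might aggregate node reports by taking the median as consensus and flagging large deviations as slashable:

```python
from statistics import median

def aggregate_reports(reports, max_deviation=0.05):
    """Aggregate price reports from oracle nodes.

    Returns the median price, plus the set of nodes whose report
    deviated from it by more than `max_deviation` (candidates for
    slashing in a staking-based scheme).
    """
    consensus = median(reports.values())
    outliers = {
        node for node, price in reports.items()
        if abs(price - consensus) / consensus > max_deviation
    }
    return consensus, outliers

# Three honest nodes and one bad actor reporting an ETH/USD price:
reports = {"node_a": 1500.0, "node_b": 1502.0, "node_c": 1498.0, "node_d": 900.0}
price, slashed = aggregate_reports(reports)
```

The point of the sketch: because the consensus value is a median rather than a mean, a single dishonest node can't move the reported price, and its deviation from consensus makes it easy to identify and penalise.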
Another soon-to-be-released data service is The Graph - it likely fits under the oracle category too.
Those familiar with centralised APIs may have heard of GraphQL - the cool new(ish) kid on the API block, offering some benefits over the more widely used and understood REST APIs. GraphQL APIs allow more complex queries to be made more easily, often reducing a software application's need to make multiple data calls or parse large data payloads - basically, they aim to make the process of data transfer between applications faster and easier.
GraphQL APIs provide an efficient way to access complicated data via indexing and advanced querying
The Graph aims to bring the features and benefits of GraphQL-style APIs to the blockchain space, using crypto-economic mechanisms to incentivise participants to create accurate data 'graphs', or indexes, that can be consumed (for a fee) by decentralised applications - a cool project!
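To make the "fewer round trips" benefit concrete, here's a toy sketch in plain Python (no real GraphQL library, and the data and field names are invented): a REST-style client fetches each resource with a separate call and stitches the results together itself, while a GraphQL-style resolver answers one nested query in a single request:

```python
# Toy backing store: a trading pair and its recent trades.
DB = {
    "pairs": {"ETH-DAI": {"token0": "ETH", "token1": "DAI"}},
    "trades": {"ETH-DAI": [{"price": 1500.0}, {"price": 1502.0}]},
}

# REST style: one endpoint per resource, so the client makes
# two calls and merges the results itself.
def rest_get_pair(pair_id):
    return DB["pairs"][pair_id]

def rest_get_trades(pair_id):
    return DB["trades"][pair_id]

# GraphQL style: the client sends one query describing the whole
# shape it wants; the server resolves it in a single round trip.
def graphql_query(pair_id, fields):
    pair = DB["pairs"][pair_id]
    result = {f: pair[f] for f in fields if f in pair}
    if "trades" in fields:
        result["trades"] = DB["trades"][pair_id]
    return result

# Two round trips vs one, for the same information:
rest_result = {**rest_get_pair("ETH-DAI"), "trades": rest_get_trades("ETH-DAI")}
gql_result = graphql_query("ETH-DAI", ["token0", "token1", "trades"])
```

The two results are identical, but the GraphQL-style path needed one request instead of two - and the gap widens as the data shape gets more nested.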
Chainlink-style oracles and The Graph are valuable data infrastructure - playing a vital role in the decentralised data economy.
TheAgora is a data marketplace and, like the oracle solutions above, consists of data producers and consumers exchanging data. It also includes crypto-economic incentive mechanisms to drive data exchange, quality and truthfulness.
Data marketplaces form an important part of any healthy data ecosystem - centralised or not. What's even more exciting for theAgora is that the properties of web3.0 make decentralised data exchanges infinitely more powerful than their centralised counterparts.
Where it differs from the oracles is in the type and form of data, and in the ways data is discovered and exchanged. Data markets like theAgora can be used for person-to-person dataset or information exchange, person-to-application exchange, or application-to-application exchange. All models are possible under this paradigm.
Imagine Amazon and Netflix had a data baby and called it theAgora!
TheAgora can enable a gig economy for data workers to provide data services, from basic data cleaning to advanced data analysis. Defi (or any) traders can request and purchase from the long tail of datasets, generating unique insights to create portfolio alpha. Participants can create and sell datasets and information, and earn. Applications can interact with the data and information contained within the market to drive their ML/AI models. The opportunities are endless.
Data protocols
hut34's hutX data exchange protocol powers theAgora, allowing marketplace users to trustlessly secure, share, and exchange their data - with privacy, control, and auditability where required. The protocol has real-world traction across academia, research and enterprise.
Other high-profile data protocols include Ocean Protocol ($198m) and Erasure Protocol ($244m) - high-quality projects building innovative solutions towards a similar goal: opening up the data economy and allowing the secure, fair and free trade of data assets.
Centralised data services
Organisations like Messari, Nansen.ai and Dune Analytics are examples of the rapidly expanding blockchain data and information business. They consume, aggregate and analyse raw data to produce valuable information and insights.
As the space grows and becomes even more fragmented, their role will only become more important, as people seek trusted sources of information to navigate an increasingly complex environment.
We believe players like these will act as both consumers and producers of data for exchanges like theAgora - buying unique datasets to drive their analytics, and using marketplaces as a means of distributing their value-added data products.
Building data lego - composable data protocols
Chainlink with live data feeds, The Graph with indexed data and complex queries, theAgora by hut34 as a marketplace, and Ocean with secure data exchange protocols - we envisage a world where these different projects operate synergistically, creating a data ecosystem greater than the sum of its parts: interoperable data lego.
Just as the goal of defi is 'composability' - to create 'money lego' - there's no reason dataFi protocols can't, or shouldn't, strive for the same.
Imagine a Chainlink node operator produces Chainlink-trusted datasets, packaging them as hutX smart data objects to sell on theAgora. A data service like Nansen.ai might purchase that dataset to produce value-added analytics, either to use themselves or to sell back to the market at a higher price - perhaps also via a data exchange like theAgora.
Or imagine a defi trader makes a Request for Data (RFD) on theAgora for a specific dataset they believe will give them unique market insight, and the request is fulfilled by a single data analyst working on the other side of the world. That analyst could fulfill the request by querying trusted base data from The Graph, merging it with a dataset purchased from Messari, and then returning the finished analytics to the trader to act upon (and perhaps profit from).
Each component of this dataFi lego works together to create value and insight, and each participant is fairly paid for their contribution.
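The flow above could be sketched, very loosely, as a pipeline in which each step adds value and records a fee for its contributor. Everything here - names, prices, fees, the "insight" itself - is hypothetical, purely to illustrate the composability idea:

```python
def base_data():
    # e.g. trusted on-chain prices queried from an indexer (fee to the indexer)
    return {"prices": [1500.0, 1502.0, 1498.0], "fee": 10}

def purchased_dataset():
    # e.g. a sentiment dataset bought from a centralised data service
    return {"sentiment": [0.2, 0.6, 0.4], "fee": 25}

def analyst(base, extra):
    # merge the two sources into a toy "signal", adding the analyst's own fee
    signal = [p * (1 + s) for p, s in zip(base["prices"], extra["sentiment"])]
    return {"signal": signal, "fee": base["fee"] + extra["fee"] + 50}

result = analyst(base_data(), purchased_dataset())
total_paid = result["fee"]  # every contributor along the chain gets paid
```

The structural point is that each stage only needs to agree on the shape of the data it passes along - the indexer, the data vendor and the analyst can all be independent parties, each compensated for their link in the chain.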
The opportunities aheadâŠ
As it stands, the cumulative market cap of some of the better known players in the dataFi space is less than $7bn - a drop in the ocean compared to the potential size of a decentralised data economy.
The space is growing rapidly, but it remains very early days.
Similar to the Cambrian explosion of data solutions in the centralised world, it's unlikely to be winner-takes-all, given the huge variance of use cases, participants, and successful protocols and services. We're excited to be a part of this new data economy.
If you want to learn more or join the conversation, don't forget to follow us on Twitter or join the community on Discord.
Join us on our data odyssey,
theAgora Team
*This was a very brief traverse, and we've missed many (many) other great players, from D5, a data science DAO, to singularitynet.io, working on decentralised AGI, to Streamr, a blockchain-based IoT messaging solution. There are too many to cover in such a short piece.