The AGIX ERC20 converter testnet is now live
A public testnet is available for you to try out the migration of AGIX tokens to the Cardano ecosystem
7 December 2021 6 mins read
In our previous blog post earlier this summer, we shared how Cardano would support the migration of ERC20 tokens from Ethereum, working initially with SingularityNET and their AGIX token. Today we can announce that the AGIX ERC20 converter testnet is live and ready for community evaluation.
SingularityNET is our first partner in this initiative. And the converter is a significant step in our shared journey towards a much deeper collaboration with the SingularityNET community.
Dr. Ben Goertzel, CEO and Chief Scientist at SingularityNET says:
I'm extremely excited by the emergence of the AGIX-ADA/AGIX-ETH converter onto Cardano testnet, and soon after that onto mainnet. Every revolution is carried out one step at a time, and this is the first in a series of steps whose result will be the porting of the full SingularityNET decentralized AI platform onto Cardano. The importance of this port for SingularityNET and the whole blockchain and AI ecosystems cannot be overestimated – it will yield not only a far faster and more economical AI network, but also a massively superior foundation for adding advanced new functions to SingularityNET and moving toward realizing our vision of decentralized AGI.
In this initial testnet version, users can move SingularityNET’s AGIX tokens to Cardano and back to Ethereum via the permissioned bridge. This marks a significant step forward in driving interoperability between blockchains to establish a functional environment for decentralized finance (DeFi). Users can assess the capabilities of the testnet and pilot the transfer of AGIX tokens to benefit from Cardano’s higher transaction capacity, lower fees, and proven security benefits.
Blockchain bridges power interoperability
Blockchain interoperability is key to boosting adoption and growth for the entire space. Alongside our open-source approach, this has always been one of our priorities – to make blockchain solutions accessible for everyone, regardless of the chosen protocol. However, speed of transaction processing, security properties, and scalability are critical to satisfying the needs of the crypto community.
We are currently building out and collaborating on multiple bridges to connect Cardano to other blockchains, and this first converter is a vital artery in this system. The more these connections grow, the higher the network effect to boost the flow of liquidity within the Cardano ecosystem.
So, let’s take a closer look at how exactly the AGIX ERC20 converter tool works.
Working with the converter
The converter enables the migration of AGIX ERC20-based tokens from the source network to Cardano. Users can access the converter via a URL and move their tokens in just a few clicks. The converter ‘translates’ an ERC20 token into a native token on Cardano with the same value and functionality, which can be moved into Daedalus or Yoroi wallets to make payments or other transactions. The built-in conversion system allows the tokens to be converted back into ERC20 format, if desired.
Users do not need technical expertise or coding experience to use the converter. They simply access the tool through a URL and then proceed by creating a new account or configuring an existing Metamask account.
It is essential to configure the associated Cardano address, which corresponds to either a testnet Daedalus or a Yoroi Nightly wallet, to store the migrated tokens. After initial setup, users are welcome to use some testnet AGIX and Ethereum Kovan test network (KETH) tokens to start testing the tool.
The converter reflects the token balance and its equivalent value in US dollars on the token card on a dashboard:
Figure 1. ERC20 converter dashboard
To migrate testnet tokens to Cardano, users need to select the token card, choose the amount, and click the Convert button:
Figure 2. The process of token migration from Ethereum to Cardano
The user will be notified once the transaction is processed both on the Cardano and Ethereum Kovan testnets, and the balance will update accordingly.
For the reverse process, the user needs to click the conversion arrow to point to the target blockchain. The system will notify the user about smart contract execution, and the steps to follow.
The converter provides a user-friendly interface that features tips, notifications, and additional information to guide users throughout their token migration journey. For example, the testnet version of the converter utilizes the Kovan test network. If a user is in a different environment, the system will notify the user to change networks. The same applies to the Cardano address setup, sending values that exceed the actual balance, and so on.
Finally, all the activity can be tracked on both blockchain explorers:
- Kovan Etherscan and
- Cardano testnet explorer
It is also possible to check recent transactions in the converter’s Transaction history section:
Figure 3. ERC20 converter transaction history
Our commercial team is now running the process to allow for secure and seamless token migration from other blockchains and sidechains to Cardano. Projects that want to initiate a dialog can get in touch here. We will continue pursuing Cardano’s interoperability mission across a range of permissioned and permissionless blockchains, producing a mesh of interconnected sidechains with decentralized applications (DApps) written in Solidity, Glow, and more. This will expand the base ecosystem of DApps written in Plutus on Cardano.
Following our philosophy where security comes first, we are treating the converter deployment with the highest scrutiny to always secure the funds of individuals. That is why we are inviting the community to put it through its paces on the testnet while the code is constantly monitored and audited to ensure that everything is working properly. While the user flow and UI for the testnet converter will likely be very similar on mainnet, the current build is not yet optimized for performance. The testnet phase is an essential part of this process: gathering user data – particularly at times of high network saturation – will help us address this and improve throughput as we get closer to the mainnet launch.
Ready to try out the AGIX converter? First, make sure to visit the dedicated testnet page with step-by-step instructions. And if you’re ready to get started, then go to the ERC20 converter – the testnet is now live and waiting for you to try it out!
Network traffic and tiered pricing
Decentralized finance will continue to build demand on Cardano. Our research project is looking at ways to maintain fair access and throughput for every user
26 November 2021 7 mins read
A recent blog post outlined some of the ways in which the Cardano network would flex and evolve to meet the global demands of smart contracts and DeFi. Similarly, it will become necessary to upgrade the transaction fee system used for Cardano.
The current system is simple and fair: every transaction is treated the same and it is not possible for users to alter their priority by paying higher fees. As long as the throughput capacity is comparable to the demand, this approach works well.
There are, however, drawbacks. As the use of Cardano increases, there will eventually come a point when not all transactions can be included in the blockchain, even with adjustments to the parameterization. Although increasing the capacity of the main chain and/or diverting transactions to Hydra or other layer 2 solutions can alleviate this concern, the core system must still work in an agile way in all possible cases and at all times.
This is especially relevant in the case of a denial of service (DoS) attack. With the system as is, an attacker could take advantage of the fair treatment and pass off their malicious spam as legitimate transactions, increasing waiting times for everyone else. There are measures in place (eg, relating to transaction propagation through the peer-to-peer network) that make such an attack technically challenging. However, for extra protection, we would like to be able to increase the costs of such attacks without jeopardizing the fairness and price efficiency of the whole system.
This is a topic members of IO Group’s research team have been looking at this year. The resulting approach proposed in this post maintains the pillars of Cardano transaction processing (predictability, fairness and inexpensive access) while mitigating the issues that could arise from greater demand. Our approach puts forth a novel transaction fee mechanism for blockchains. The key to the design is partitioning each block into three ‘tiers’ based on use case. Each tier makes up a set percentage of the maximum block size and is designed for different types of transactions (Figure 1). The tiers, along with the suggested split we are analyzing at present, would be:
- fair (50%)
- balanced (30%)
- immediate (20%)
Figure 1. Each block would be split into three tiers.
We will discuss the fair segment last, because it works differently from the other two. Balanced and immediate work by having a ‘fee threshold’, which is different for each. To be included in a block, transaction issuers would specify the tier of service they need. This can be done by setting a maximum fee for the transaction. Then, each block would be filled starting with the immediate, then balanced, and finally fair tiers. Similar transactions within the same tier would pay the same fee. To make this choice simple, each transaction would only be charged the lowest fee that would guarantee its entry in the block. After every block, fees for immediate and balanced tiers would be updated dynamically and deterministically (reflecting the level of demand in previous blocks) to ensure that each segment uses its target percentage of the block.
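As an illustration, the per-block update of the balanced and immediate fee thresholds could be sketched as follows. This is a toy model, not the formula under research: the multiplicative update rule, the adjustment speeds, the fee floor, and all numbers are assumptions chosen only to show the mechanism's shape.

```python
# Illustrative sketch of deterministic per-block fee-threshold updates.
# The update rule and every constant here are assumptions.

MIN_FEE = 0.17  # hypothetical fee floor


def update_threshold(threshold: float, used_fraction: float,
                     target_fraction: float, speed: float) -> float:
    """Raise the threshold when the tier exceeded its target share of
    the last block, lower it when the tier was under-used. `speed`
    controls how sharply the threshold reacts to demand."""
    error = used_fraction - target_fraction
    return max(threshold * (1 + speed * error), MIN_FEE)


# The immediate tier (20% target) reacts more sharply to demand than
# the balanced tier (30% target), per the design described above.
immediate = update_threshold(threshold=2.0, used_fraction=0.25,
                             target_fraction=0.20, speed=2.0)
balanced = update_threshold(threshold=1.0, used_fraction=0.25,
                            target_fraction=0.30, speed=0.5)

print(immediate)  # rises: the immediate tier was over its 20% target
print(balanced)   # falls: the balanced tier was under its 30% target
```

Note how the same observed load (25% of the block) pushes the two thresholds in opposite directions, because each tier is compared against its own target share.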
The difference between immediate and balanced tiers is in the way fees would be updated, specifically the ‘speed’ at which they adjust given the current load. The threshold for immediate service would always be higher than balanced and would react more sharply to demand, ensuring that the transaction asking for it would be serviced as soon as possible. The balanced threshold would be slower to adapt and more stable: this would make it unsuitable for time-sensitive transactions, but would provide a lower, more reliable price at the expense of more varied waiting time.
While the balanced and immediate tiers aim to handle transactions with different levels of urgency, the fair tier handles ordinary transactions. The fair segment is intended to serve as a refinement of the current system in Cardano, keeping the fees low (or in the future even stable, by pegging to a basket of commodities/fiat, as explained in the post on stablefees) and removing any unpredictability from the user’s perspective. As long as demand is low (and the transactions fit into half of the block) this segment would function as Cardano does now.
However, once demand rises, a special mechanism would kick in for fair tier transactions. The mechanism would filter transactions in a manner independent of fees and be based on a prioritization function. One example of this would be to give priority to transactions depending on the age and amount of their UTXOs. In particular, the priority of a given transaction would be equal to the sum of the amount of each input multiplied by its age and then divided by the total size of the transactions in bytes. This priority could be used in conjunction with a threshold (updated dynamically after every block) that would filter transactions whose priority is too low. Such an approach guarantees liveness for each transaction at a low and predictable price and limits the effect a malicious attacker (or a surge of demand) could have on prices, by always providing an inexpensive way into each block.
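The example prioritization function above translates directly into a small sketch. The data shapes and the threshold value are illustrative assumptions; only the formula (sum of input amount × age, divided by transaction size in bytes) comes from the text.

```python
# Sketch of the example fair-tier prioritization function:
#   priority = sum(amount_i * age_i for each input) / tx_size_bytes

from dataclasses import dataclass


@dataclass
class TxInput:
    amount: int   # value of the UTXO (units are illustrative)
    age: int      # age of the UTXO, e.g. in slots


def priority(inputs: list[TxInput], tx_size_bytes: int) -> float:
    """Older, larger inputs raise priority; bigger transactions lower it."""
    return sum(i.amount * i.age for i in inputs) / tx_size_bytes


def admit(tx_priority: float, threshold: float) -> bool:
    """A dynamically updated threshold filters out low-priority transactions."""
    return tx_priority >= threshold


tx = [TxInput(amount=1_000_000, age=500), TxInput(amount=250_000, age=2_000)]
p = priority(tx, tx_size_bytes=400)
print(p)  # (1_000_000*500 + 250_000*2_000) / 400 = 2_500_000.0
print(admit(p, threshold=1_000_000.0))  # True: included in the fair tier
```

Because the threshold is fee-independent, an attacker cannot buy their way past legitimate fair-tier transactions; they would instead need old, high-value UTXOs of their own.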
The tiered pricing idea presented here also extends and clarifies the concept of the multiplier that we introduced in the stablefees post. Viewed in this way, each of the three tiers is associated with a deterministically calculated multiplier (with the fair tier always at a multiplier of 1) whose value depends on the congestion of the respective tier in previous blocks.
This mechanism is different from current pricing approaches, as used by Bitcoin or Ethereum (even with Ethereum Improvement Proposal 1559), where there is a variable fee that each transaction must exceed to make it into a block. The downside of this approach is that the fee everyone needs to pay is dictated by the ‘richest’ consumers. Even worse, this is the fee paid by the richest consumers to make it into a block ‘immediately’. In addition, even though the fees are mostly a function of supply and demand, these particular types of transaction fee mechanism can inadvertently ‘shape’ demand, or inadvertently increase prices because the optimal bidding strategy is not clear to users. Imagine if the transaction fees of Bitcoin were suddenly halved and everyone forgot what they used to be: would they still rise to their current levels? Answering ‘no’ to this question illustrates the downsides of such mechanisms and is a predicament that tiered pricing precludes by design.
The tiered approach is more refined. It understands that not every transaction has the same needs, ensuring that different use cases can happen simultaneously and making it easy for users to choose the desired type of service. In this way, tiered pricing makes it possible to have predictable and fair fees while managing periods of congestion on the main chain. Combined with design improvements to be revealed in later posts that focus on increasing the raw throughput capacity and processing power of the main chain, tiered pricing shows how Cardano will be able to accommodate any circumstance of transaction processing demand.
I would like to acknowledge the contributions of Giorgos Panagiotakos, Aggelos Kiayias and Elias Koutsoupias to this post. Together we form the research group working on the design of this mechanism. A technical paper will be available soon.
Welcome to the age of RealFi
By integrating digital identity with Cardano, we can create real value and opportunity for people across the globe
25 November 2021 8 mins read
“The point of RealFi is to serve real people real finance, and thus be able to innovate and do interesting things.” - Charles Hoskinson
When it comes to the story of blockchain’s recent evolution, it is decentralized finance (DeFi) that has taken center stage. DeFi uses ‘smart contracts’ on a blockchain to give anyone access to banking using a public, decentralized, ledger. By removing the middlemen, the bankers and the brokers, DeFi has brought in a new generation of users and the sector has enjoyed meteoric growth, now estimated to be worth some $100 billion.
DeFi is an open financial network. People do not need to go through any one private enterprise, such as PayPal, Western Union, or a bank, to send, borrow, or lend money. Instead, this is done between individuals, in a peer-to-peer fashion, using blockchain as the underlying ledger to support and enable the transaction.
The basic concept of DeFi is sound. Loans are typically fully-, or over-collateralized. This is because little (or in some cases, nothing) is known about the borrower, and there is little recourse if the loan is not paid. Also, borrowing is normally re-invested in further crypto opportunities. Users interact in a peer-to-peer fashion with no central governing body, relying instead on smart contracts and the inherent transparency and immutability of blockchain-based systems. In this way, most of the usual regulatory constraints around lending and borrowing do not apply, while fees tend to be significantly lower.
Speaking about the adoption of RealFi, Charles Hoskinson said: ‘It is our belief that over the next 12 to 24 months the majority of DeFi providers will be systematically upgrading to RealFi, to actually encumber identity, and metadata, to create proper standards and make sure they're secure and functional and to ensure they resolve the issues of regulation and governance, and also they're in the market for new customers.’
Among nations with developed economic systems, DeFi has highlighted the potential for blockchain to disrupt financial ‘legacy’ systems and open up access to new users hunting for better yields and moving liquidity around. It has established an entirely new financial paradigm and the $100bn DeFi market is expected to grow significantly over the next few years as models continue to evolve.
However, as much as the age of DeFi is creating fresh markets and driving compelling new use cases, it has also further highlighted the economic divide between people who can easily access financial products, and those who cannot.
Bridging the DeFi divide
Pricing credit is about trying to assess and mitigate the risk of defaulting. Traditional consumer finance and credit reduces risk by understanding how borrowers behave – how much they spend, their income, and so on. DeFi's approach to risk mitigation is different.
A mature credit scoring system is key to delivering credit in developed economies. But it is even more critical in emerging markets. The reason banks refuse credit or loans in emerging markets is often that they don’t have enough data about the person or organization intending to borrow. The systems are either less sophisticated or simply not there, making it impossible to create an accurate financial picture through a credit score.
However, it is possible to build up a credit score by querying proxies, linked to an identity. You could contact utility companies to check if the customer has always paid their bills, or check with a phone provider to see how often the prospective borrower topped up their mobile. An identity is out there, the problem is how to tie data to the identity. Once that is achieved, the data can be presented to a local bank, microfinance initiative – or a decentralized pool of capital provided by people across the world in the Cardano community.
We have now reached a point when we can enable this through innovations in crypto. All the necessary financial information can be stored and relayed in a verifiable manner through an Atala PRISM ID. The monetary building bricks of DeFi can be used to structure these loans and hedge the currency risk, while scalable payment rails provided by Cardano and various layer 2 solutions will make it possible to transfer capital across the world without friction.
The world has DeFi. Now it needs RealFi
This year we have announced two very significant blockchain agreements. Our partnership with the Ethiopian Ministry for Education is creating a national attainment recording system to verify grades, monitor school performance, and boost nationwide education. Meanwhile, our collaboration with World Mobile in Tanzania will connect the unconnected and enable access to essential online services through blockchain.
Powerful deals like these are the jumping-off point for Cardano's mission to build RealFi: real finance targeted at the people who really need new ways to access finance, creating that real value often missing from DeFi. RealFi is an ecosystem of products that remove the frictions between crypto liquidity and real-world economic activities to offer attractive yields to crypto holders, and cheaper credit and financial products for real people.
Cardano adds the final piece of the financial puzzle by unlocking real economic value at the end of the transaction chain: personal identity.
Identity is central to everything. Once someone has an economic identity, a world of opportunity and inclusivity opens up. Real opportunity comes with access to essential services that were hitherto out of reach. And real finance, such as loans to open a business or maintain an existing one. RealFi.
Identity can become an asset in so far as it can be a substitute for collateral. A lender's overriding concern is to ensure that loans (plus any interest accrued) are paid back. One way of enforcing this is by collateralizing the loan, but if the lender has enough and clear information about a borrower (if they know the borrower is a high-earner, or a long-standing customer), the lender might be more inclined to forgo the collateral.
Reversing financial exclusion
Microlending platforms such as Kiva offer a successful business model, perfectly suited to the African continent's emerging economy. In this environment, small loans can be life-changing for farmers, entrepreneurs, and people with the drive to create and succeed.
But access to finance is only part of a larger picture. Without access to insurance, education, and health services, people would still be exposed to huge risks. RealFi, through the power of blockchain and a digital identity platform like Atala PRISM, offers a comprehensive solution to this quandary. Digital identity enables access to not only financial products, but also to services that enable people to thrive on a level playing field with their counterparts in more developed parts of the world.
Charles Hoskinson said: ‘Cardano has always been about entering the developing world. We really want to focus on the 3 billion people who don't have reliable access to financial services. If we look at a lot of the markets that we play in, we're super excited to bring those markets in through identity and through wallets into the cryptocurrency space, and then giving them access to RealFi, so for the first time ever, they can participate in a global market fairly.’
RealFi will herald a new age of on-chain credit activity. Cardano owners currently hold coins worth $80bn, and many of them will soon be looking for more yield options besides staking. The tangibility of this real-world application of crypto, the provision of real finance to real people, is the switch that can light up Cardano's power and uniqueness. This is the real use case that many people fail to see in cryptocurrencies. With validated identity, someone in Rome might have the confidence to make an uncollateralized loan to a business in Kenya, for example.
To start down this road, IO has partnered with Pezesha to facilitate loans to small and medium-sized businesses looking for short-term loans for working capital. These loans have a default rate of just 2%, but struggle to be funded in the Kenyan market due to a lack of local liquidity. The goal is to build simple, friction-free tools that enable crypto holders to seamlessly lend into real-world opportunities, receiving repayments in crypto directly into their wallets. We see a world where people can lend into real-world opportunities as seamlessly as they do crypto.
RealFi: democratizing opportunity
RealFi represents the very antithesis of financial exclusion and heralds the arrival of a new era of financial and societal inclusivity in Africa and elsewhere. Through RealFi, Cardano becomes a beacon of identity provision to help people help themselves. RealFi's mission is the creation of real financial value for real people, and the key differentiator between Cardano and other blockchain platforms.
The Ethiopia, Tanzania, and Kenya projects are the first steps of a much longer journey towards global fairness and inclusiveness. This is the start of a drive to end the marginalization of the disadvantaged while offering attractive yields to cryptocurrency holders. Much like DeFi, RealFi will be an ecosystem of products and technologies that reduce the friction between crypto liquidity and real-world economic opportunities. Through RealFi, we want to make the world smaller, connecting everyone into a global community of capital and opportunity that's now open and welcoming to them.
Slow and steady wins the race: network evolution for network growth
After a successful start to Cardano’s smart contract era, we’ll soon make the first in a program of network adjustments to support future growth
22 November 2021 8 mins read
From its conception, Cardano has been architected as a platform to best balance the perennial trade-offs of security, scalability, and decentralization. We have therefore designed and built a solid and secure network layer, yet with the flexibility to grow and scale to support a global base of millions of users.
With a secure, highly decentralized proof of stake network now firmly established, and core smart contract capability deployed, we’re now heading into the Basho phase, focused on optimization, scaling and network growth.
As a decentralized permissionless blockchain, Cardano is open to anyone who wants to use it or build on it. Recent hard forks (adding native tokens and smart contract capability) have brought many new users into the Cardano ecosystem, and we have seen rapid growth (and spikes) in transaction volumes and network traffic.
As core components – including wallet connectors and the Plutus Application Backend (PAB) – are finalized and integrated into mainnet, we anticipate significant growth in network activity. A constellation of projects building on Cardano will begin to launch, first on testnet, then mainnet. These will only increase, with potentially hundreds of thousands of new users coming into Cardano over the coming months, from all sides of the blockchain spectrum.
Inevitably, we can expect significant traffic around the launch of new decentralized applications (DApps), especially in the early days and weeks. To accommodate this ongoing growth, and ensure that Cardano maintains its resilience and robustness, we’re now starting to make a series of adjustments to network parameters. These parameter changes will provide ongoing improvements and enhancements to Cardano's usability and experience across its entire range of users.
Architected for growth
Ouroboros is designed to handle a large volume of data as well as transactions and scripts of different complexity and size. At present, and with current parameters, the Cardano network is utilizing on average only around 25% of its capacity. This is sub-optimal: the most efficient scenario is for Cardano to run at or close to 100% of its capacity (i.e., with the network ‘saturated’).
While many networking solutions would suffer under such conditions, both Ouroboros and the Cardano network stack have been designed to be fair and highly resilient, even under heavy saturation.
Efficient systems are designed to minimize congestion while enabling effective management when it does happen. You can read more in this recent blog, but in short, the network uses backpressure to manage the overall system load. So while some individual users during a large NFT drop may experience longer wait times for their transactions, this does not mean that the network is ‘struggling’. It actually means the network is performing as intended. We call it ‘graceful degradation’ and you can study this in greater depth in the network design paper.
Aside from the original architectural design, and significant benchmarking across a range of simulated situations, it is only in the real world that we can truly gauge demand and the effectiveness of any changes.
Following extensive benchmarking and developer feedback, we’re now starting to make gradual adjustments and have today submitted two initial changes. These changes are planned to take effect on the testnet on Thursday 25th November. Once tested, we anticipate subsequently applying them to the mainnet, taking effect at epoch 306, on Wednesday December 1, 2021 at 21:45:00 UTC.
So what are we adjusting?
We’re increasing the block size by 8KB, to 72KB (a 12.5% increase)
There are now well over 2 million Cardano wallets in use, and traffic has grown more than 20-fold in a year (from fewer than 10,000 transactions per day in November 2020 to over 200,000 transactions per day). Because of the anticipated rise in traffic as developers roll out new DApps, the block size is quickly becoming a key consideration. Larger blocks mean that more transactions can fit into each block, thus providing greater capacity for users. Being able to fit 12.5% more transactions into a block is significant: it means we’re processing more transactions per second, or – a more useful metric, we would argue – achieving greater data throughput.
We are taking a steady, methodical approach to changes in Cardano's parameterization. A 12.5% increase is sizable, but not too big. It leaves room for further expansion, and allows stake pool operators (SPOs) to adjust to the increased demands. We will take a 'slow and steady' approach to further block size changes so that we make the underlying network capacity available to end users, while ensuring that we can continue to operate successfully as a globally decentralized blockchain. The current generation of Ouroboros (named Praos) has specific requirements that must be satisfied to ensure its security goals are met. One of the most important parameters is block propagation time: a measure of how long it takes for a freshly minted block to be propagated across nodes on the network representing 95% of the staked ada. For Praos to stay secure, the network must propagate new blocks within 5 seconds.
We can consider this 5s limit a ‘budget’ we can ‘spend’ on things like increasing the block size. Changes such as increased block size will naturally increase the time needed to propagate blocks, so we must monitor carefully to ensure changes we make to increase performance don’t affect the security of the network. In future iterations of Ouroboros this budget will be increased. Meanwhile our focus will be on maintaining security while flexing the network to growing demand.
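As a back-of-the-envelope illustration, the change and the propagation ‘budget’ can be sketched numerically. The previous 64KB block size is implied by the 12.5% figure above; the linear propagation cost model and the 2-second baseline are simplifying assumptions, not measured values.

```python
# Back-of-the-envelope view of the block size change and the 5s
# propagation 'budget'. The linear cost model and baseline time
# are assumptions for illustration.

OLD_BLOCK_KB = 64
NEW_BLOCK_KB = OLD_BLOCK_KB + 8       # 72KB after the change
increase = (NEW_BLOCK_KB - OLD_BLOCK_KB) / OLD_BLOCK_KB
print(f"{increase:.1%}")              # 12.5% more capacity per block

# If propagation time scaled roughly linearly with block size
# (a simplification), spending part of the 5s budget looks like:
BUDGET_S = 5.0
baseline_propagation_s = 2.0          # hypothetical baseline
new_propagation_s = baseline_propagation_s * (NEW_BLOCK_KB / OLD_BLOCK_KB)
print(new_propagation_s <= BUDGET_S)  # still within budget
```

In practice propagation time is measured empirically across the live network rather than modeled this simply, which is why each change is monitored before the next one is made.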
We’re also increasing Plutus script memory units per transaction to 11.25 million (again, a 12.5% increase)
This is a powerful change and one that we know DApp developers will very much appreciate. An increase in Plutus memory limits means that they can develop more sophisticated Plutus scripts, or that existing scripts will be able to process more data items, increase concurrency, or otherwise expand their capabilities. This will be the first of a series of changes to the memory unit settings that will greatly enhance the real-world capabilities of Plutus scripts. As with block sizes, we will roll out the changes gradually, but steadily, so that the network and SPOs adjust to the increased demand.
The changes described above (the block size increase and the increase in Plutus script memory units per transaction) were requested by many DApp developers. Both changes go hand-in-hand: it’s not just about creating more complex scripts, it’s also about putting more data through.
Steady and sure
As the Cardano platform evolves, every change will be carefully considered and once actioned, subsequently monitored to gauge its impact on performance. All changes will be based on empirical data drawn from the network and based on real, sustained user demand. Critically, it is important not to make decisions with a long-tail impact around short-term surges in network usage. We won't make changes prematurely or make them at a pace that could potentially compromise Cardano's longer-term security, for example.
Cardano development is grounded in both fundamental and ongoing research. Further network enhancements in the mid-longer term will collectively deliver substantial capacity improvements, as well as tuning the network to deliver the best overall experience.
I’ll be joining the November Cardano360 to share further thoughts on this. But in short, this is about building new and capable blockchain infrastructure, built on advanced and fundamentally decentralized technologies. Initially, we will focus on a number of performance improvements that will enable us to exploit the limits of the Ouroboros Praos protocol. We will then focus on optimizing the size of Plutus scripts and the underlying performance of the Plutus interpreter and Cardano node implementations. This will allow us to process more useful work within the same protocol parameters. Related to this will be the use of compression techniques, to reduce the size of scripts and transactions, meaning that more transactions can be carried within the same sized block. All of this (and more) will improve layer 1 performance and capacity. Looking ahead, Hydra will then introduce a layer 2 solution, providing hugely increased scalability by allowing users to provision multiple chains that reuse the same ledger representation.
Cardano is, in a manner of speaking, a living entity that grows and adapts with every evolutionary step. It may sound like a contradiction in terms, yet while its foundations are formed from rock-solid fundamental research, flexibility (the ability to change even the entire protocol via the hard fork combinator (HFC)) has been designed in from the beginning.
Parameterization changes are part of this transformative process. While inevitably there will be folks who want to move faster, our focus will remain on steady, secure evolution as Cardano grows in reach and adoption.
Thanks to Duncan Coutts, Kevin Hammond, and Fernando Sanchez for their contributions to this article.
Architecting DApps on the EUTXO ledger
Taking a closer look at approaches to DApp architecture on Cardano
16 November 2021 5 mins read
Following up on our recent blog post about Cardano’s performance and ledger optimization roadmap, we prepared a deeper technical dive into the architecture of the EUTXO ledger.
Here, we offer an example architecture and also discuss possible improvements that will boost transaction throughput and minimize delays in transaction processing.
Cardano’s EUTXO model is a solid basis for decentralized finance (DeFi) and decentralized application (DApp) development, as it facilitates parallel transaction processing, which enables greater scalability than account-based models, as well as providing enhanced security properties.
However, using a design or mechanisms suited to account-based systems rather than the EUTXO model (in particular, when building decentralized exchanges) may result in contention issues. Contention arises when a new transaction depends on the output of a previous transaction, causing delays, especially when there is a large number of transactions. To avoid this, developers should not adopt a single-threaded state machine style, and should instead design applications specifically around EUTXO's properties.
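To make the contention point concrete, here is a minimal Python sketch of a UTXO-style ledger (heavily simplified, with illustrative names; this is not real Cardano semantics). Two transactions that try to spend the same output conflict, while transactions spending distinct outputs can proceed independently, which is exactly what enables parallelism:

```python
# Minimal, illustrative model of UTXO spending (not real Cardano semantics).
class Utxo:
    def __init__(self, tx_id, index, value):
        self.ref = (tx_id, index)  # unique output reference
        self.value = value

class Ledger:
    def __init__(self, utxos):
        # the UTXO set, keyed by output reference
        self.utxos = {u.ref: u for u in utxos}

    def apply_tx(self, tx_id, inputs, outputs):
        # a transaction is valid only if every input is still unspent
        if any(ref not in self.utxos for ref in inputs):
            return False  # contention: some input was already consumed
        for ref in inputs:
            del self.utxos[ref]
        for i, value in enumerate(outputs):
            u = Utxo(tx_id, i, value)
            self.utxos[u.ref] = u
        return True

ledger = Ledger([Utxo("genesis", 0, 100), Utxo("genesis", 1, 50)])

# Two transactions race to spend the *same* output: only one succeeds.
assert ledger.apply_tx("tx1", [("genesis", 0)], [100]) is True
assert ledger.apply_tx("tx2", [("genesis", 0)], [100]) is False

# A transaction spending a *different* output is unaffected.
assert ledger.apply_tx("tx3", [("genesis", 1)], [50]) is True
```

A DEX that keeps its entire state in one UTXO forces every user transaction into the tx1/tx2 race above; the batching patterns discussed below are one way to avoid that bottleneck.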
What does a well-formed architecture look like?
An order book pattern is one of the approaches applicable to DEX development, if compatible with the smart contract logic. Most of the protocol architectures evaluated and presented in SundaeSwap’s blog post rely on a general approach whereby:
- a user locks funds in an intermediate script (which we will call the request script) together with a description of the submitted orders (e.g., token or datum)
- a third party (referred to as a batcher) aggregates the orders sitting at the request script into one single transaction such that:
  - the locked orders are spent together with the UTXO holding the global state of the main script (e.g., liquidity pool) to be updated
  - results of executed orders are sent back to the original users
  - a new UTXO holding the updated global state resides at the main script address
When adopting such a batching pattern, one should bear in mind that whenever N orders sitting at the request script are consumed within a single transaction, the request script will be executed N times on transaction submission. Moreover, the memory limit check (triggered when the transaction is submitted) is performed by aggregating the memory consumption of each individual request script execution, of the main script execution, and of any MintingPolicy scripts that may also be executed (according to the protocol design). Additionally, the same transaction context, whose size is proportional to the number of orders spent, will be passed as an argument to each script execution.
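The aggregation described above can be sketched as a simple budget calculation. All numeric costs below are made-up assumptions for illustration (they are not measured Plutus execution costs); the point is the shape of the formula: the request script cost scales with N, and the per-execution context cost itself grows with N, so total memory grows faster than linearly:

```python
# Illustrative memory budget check for a batched transaction.
# Unit costs are invented assumptions, not measured Plutus costs.
MEMORY_LIMIT = 10_000_000  # protocol memory limit per transaction (units)

def batch_memory(n_orders, request_cost, main_cost, minting_cost,
                 context_cost_per_order):
    """Aggregate memory use when N orders are consumed in one transaction.

    The request script runs once per order, the main script and the
    minting policy run once each, and every script execution receives a
    transaction context whose size grows with the number of orders.
    """
    per_script_context = context_cost_per_order * n_orders
    executions = n_orders + 1 + 1  # N request scripts + main + minting policy
    return (n_orders * request_cost
            + main_cost
            + minting_cost
            + executions * per_script_context)

# With these assumed costs, a 30-order batch fits but a 50-order batch
# blows the budget, because the context term is itself proportional to N.
assert batch_memory(30, 150_000, 400_000, 100_000, 2_000) <= MEMORY_LIMIT
assert batch_memory(50, 150_000, 400_000, 100_000, 2_000) > MEMORY_LIMIT
```

This superlinear growth is why shrinking the per-order script work (as in the variant below) directly translates into larger feasible batches.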
Although this is a good approach, there are possible improvements to make it even better.
One potential solution to avoid triggering the execution of the request script N times (i.e., within the aggregated transaction) is for the user to submit orders directly to their own public key address instead. The request script is used solely to signal the presence of pending orders and to lock transaction fees, which can afterward be claimed by the batcher once the orders have been processed. With this solution, users are also required to sign the aggregated transaction to authorize the spending of their orders. It is also important to note that, in such a case, all users in the batch must be online to participate. A simplified architecture for such a solution can be summarized as follows:
- A specific MintingPolicy script can be used to mint an ‘order’ token submitted to the user's public key address.
- The hash of the user's public key address, together with the order description and any necessary transaction fees, can be sent to the request script for order notification.
- The batcher inspects the UTXOs sitting at the request script address to collect ‘order’ tokens and build the aggregated transaction, such that the ‘order’ tokens are used by the main script to validate the aggregated transaction. Note that if an ‘order’ token is not present at the corresponding public key address, the order is considered void.
- The UTXOs sitting at the request script are not spent by the aggregated transaction. They are only used to collect the UTXOs holding the ‘order’ tokens.
- The batcher notifies the relevant users to sign the aggregated transaction for submission.
- A MintingPolicy, bound to the main script, is used to mint a ‘receipt’ token for each processed order. This ‘receipt’ token will be used by the batcher to claim the transaction fees locked at the request script.
Transaction fee collection:
- The batcher can consume each UTXO sitting at the request script by providing the corresponding ‘receipt’ token.
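The steps above can be sketched as a small Python model. Everything here is hypothetical (token names, fields, and fee amounts are illustrative; a real implementation would express this in Plutus scripts), but it captures the key checks: orders whose ‘order’ token is missing are void, the request-script UTXOs are only inspected when building the batch, and fees are claimed against minted ‘receipt’ tokens afterward:

```python
# Illustrative model of the order-token batching pattern.
# Token names and fields are hypothetical; real logic lives in Plutus scripts.

# Orders sit at users' own addresses as 'order' tokens plus a description.
orders_at_user_addresses = {
    "user1": {"order_token": True,  "order": ("swap", 100)},
    "user2": {"order_token": True,  "order": ("swap", 200)},
    "user3": {"order_token": False, "order": ("swap", 50)},  # token absent: void
}

# The request script holds only notifications plus locked fees.
request_script_utxos = [
    {"user": "user1", "fee": 2},
    {"user": "user2", "fee": 2},
    {"user": "user3", "fee": 2},
]

def build_aggregated_tx(notifications, user_utxos):
    """The batcher collects 'order' tokens and builds one transaction.

    Orders whose 'order' token is missing at the user's address are void.
    The request-script UTXOs are NOT spent here; they are only inspected,
    and later consumed when the batcher claims its fees.
    """
    included, receipts = [], []
    for note in notifications:
        u = user_utxos[note["user"]]
        if u["order_token"]:               # token present: order is live
            included.append((note["user"], u["order"]))
            receipts.append(note["user"])  # a 'receipt' token minted per order
    return included, receipts

def claim_fees(notifications, receipts):
    # the batcher spends each request-script UTXO by presenting the
    # matching 'receipt' token for the processed order
    return sum(n["fee"] for n in notifications if n["user"] in receipts)

included, receipts = build_aggregated_tx(request_script_utxos,
                                         orders_at_user_addresses)
assert [u for u, _ in included] == ["user1", "user2"]  # user3's order is void
assert claim_fees(request_script_utxos, set(receipts)) == 4
```

Note how the request script's role has shrunk to bookkeeping: it is never executed per order inside the aggregated transaction, which is precisely what keeps the batch's memory consumption low.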
Benchmarks conducted on the public testnet show that with this simple architecture, around 25 to 30 orders can easily be handled within one single transaction, without exceeding the memory limits of 10 million units. We believe that some additional optimizations can still be performed to increase this figure.
Developers can also extend this architecture to consider more sophisticated mechanisms guaranteeing deterministic ordering, order cancellation by users within a specific time frame, and additional protection against malicious batchers.
This is just one example of how one can take an EUTXO-specific approach to DApp design. We are in the process of extending our documentation and will share other examples in due course. Currently, you can find some code samples for avoiding concurrency issues using multi-signatures here.
We also anticipate that the development community will identify many further models and we’ll be happy to include these in our repos to build a body of resources for the Plutus development community over the months ahead.
Thanks to John Woods and the team for their input and support in preparing this blog post.