# Proof-of-Stake Protocol - IOHK

21 September 2016 Bernardo David <1 min read


This is the regular seminar of the Input Output and Tokyo Tech/Tanaka Laboratory members. The topic this time is the Proof-of-Stake protocol designed by Aggelos Kiayias, Ioannis Konstantinou, Alexander Russell, Bernardo David and Roman Oliynykov.

Bernardo, the presenter, divided the talk into two parts: the first reviews the main topics in cryptography that help the viewer understand the presentation and the protocol itself, while the second covers the protocol.

First Part - Cryptography background

• Commitments
• Coin Tossing/Guaranteed Output Delivery
• Verifiable Secret Sharing

Second Part - Proof-of-Stake Protocol

• Comparison Proof-of-Work (PoW) and Proof-of-Stake (PoS)
• Follow the Satoshi technique
• The protocol

In this article I'm going to provide a brief review of protection methods against replay attacks arising from the signature malleability of elliptic curve cryptography.

### Problem

Most cryptocurrencies are based on public-key cryptography. Each owner transfers coins to the next one by digitally signing a transaction `Tx` containing the public key of the next owner.

Thus everyone can verify that the sender wants to send her coins to the recipient, but a problem arises: how do we prevent the inclusion of transaction `Tx` in the blockchain twice? Without such protection an unscrupulous recipient may repeat `Tx` for as long as the sender has enough coins on her balance, making it impossible to reuse the same address for more than one transaction. In particular, an adversary can withdraw some coins from an exchange and repeat this transaction until there are no coins left on the exchange (such attacks have already been executed in practice, e.g. the MtGox attack).

The simplest approach, keeping all the included transactions and comparing each new one against them, doesn't work because of elliptic curve signature malleability: it is possible to change a signature while keeping it valid.

Scala code that modifies a signature in this way is very simple:

```scala
def forgeSignature25519(signature: Array[Byte]): Array[Byte] = {
  // The group order of the curve25519 base point
  val modificator: BigInt =
    BigInt("7237005577332262213973186563042994240857116359379907606001950938285454250989")
  // Keep R (the first 32 bytes) intact and add the group order to the
  // little-endian S component: the result is a different, still-valid signature
  signature.take(32) ++
    (BigInt(signature.takeRight(32).reverse) + modificator).toByteArray.reverse
}
```

Thus we now have a sequence of transactions `Tx1`, ..., `TxN` with the same fields but different signatures, and the challenge is to determine whether they were all generated by the sender or whether some of them were generated by an adversary.

## Solutions

In this section I'll provide examples of how this problem is solved in different cryptocurrencies and will try to describe the merits and drawbacks of each approach.

### Canonical signature (Factom, Ripple, Nxt)

As long as there is a sequence of valid signatures, there may be a rule to select only one of them. The usual way is to select the canonical signature, whose S component is lower than the group order ("7237005577332262213973186563042994240857116359379907606001950938285454250989" for curve25519). Unfortunately, for some elliptic curves, for any given canonical signature an alternative form of that signature can be formed that is also canonical. In such a case it's required to define a fully canonical signature, being the minimum of all equivalent signatures. The main drawback of this approach is that the fully canonical signature is not specified in the protocol, and default elliptic curve implementations usually neither check whether a signature is canonical nor generate canonical signatures, which makes cross-platform implementation much harder.
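A canonicality check along these lines can be sketched in Scala. My assumptions: a 64-byte `R || S` signature with `S` encoded little-endian, matching the forging snippet earlier; the constant is the curve25519 group order.

```scala
// Sketch of a canonical-signature check; the R||S layout and little-endian S
// encoding are assumptions matching the forging snippet, not a full spec.
object CanonicalCheck {
  // Group order of the curve25519 base point
  val GroupOrder: BigInt =
    BigInt("7237005577332262213973186563042994240857116359379907606001950938285454250989")

  def isCanonical(signature: Array[Byte]): Boolean = {
    require(signature.length == 64, "expected a 64-byte R||S signature")
    // Interpret the little-endian S half as an unsigned integer
    val s = BigInt(1, signature.takeRight(32).reverse)
    s < GroupOrder
  }
}
```

A signature mauled by adding the group order to `S` fails this check while the original passes, so nodes that accept only canonical signatures reject the forged duplicates.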

### Signature independent id

It's also possible to modify the transaction uniqueness check, leaving all signatures valid exactly as specified by the elliptic curve protocol. For example, it's possible to use the rule that the transaction data excluding the signature should be unique. That has the drawback of making it impossible to create two transactions with the same fields, while non-deterministic signatures may indicate that the sender really wanted to send two transactions with the same fields. To fix this it's possible to include a transaction id in the transaction explicitly; e.g. some cryptocurrencies sign the transaction, use this signature as the id, and then sign this internal transaction together with its signature one more time.
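A minimal sketch of such a signature-independent id in Scala (the `Tx` fields and the SHA-256 choice are illustrative assumptions, not any particular coin's format):

```scala
import java.security.MessageDigest

// Sketch: the transaction id covers every field except the malleable signature,
// so two copies of the same payload with mauled signatures share one id and the
// duplicate is rejected. The field layout here is a made-up example.
case class Tx(sender: String, recipient: String, amount: Long, signature: Array[Byte]) {
  def id: Array[Byte] = {
    val payload = s"$sender|$recipient|$amount".getBytes("UTF-8")
    MessageDigest.getInstance("SHA-256").digest(payload)
  }
}
```

With this rule, a node's duplicate check compares ids instead of full serialized transactions, so signature mauling no longer produces a "new" transaction.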

### Nonce (Ethereum, Waves)

Another way is to add an additional field, a nonce, to transactions, increasing with each transaction from the same address. The current nonce for every account should be stored in the blockchain state, and for each new transaction `Tx` nodes verify that `Tx.nonce = account.nonce + 1`. On top of replay attack protection, a nonce allows you to broadcast several transactions with the same nonce and be confident that only one of them will be included in the blockchain.

For example, if you need your transaction to be included in a block as soon as possible, you may rebroadcast it with the same nonce but an increased fee, and be sure that only one of your transactions will pass all checks. A nonce provides additional benefits for the transactional layer, but not for free: every transaction should contain a nonce, so transactions become bigger. On the other hand, the nonce field should be big enough, because it's not clear how to handle the situation when the nonce limit is reached. To mitigate this it's possible to reuse an existing transaction field as the nonce, for example the timestamp.

In such an approach it's not possible to require the nonce to increase by exactly 1, so the rule `Tx.timestamp > account.timestamp` should be used instead. This leads to another attack: if someone broadcasts a sequence of transactions `Tx1`, ..., `TxN` with increasing timestamps, an "evil" miner may include only `TxN`, making transactions `Tx1`, ..., `TxN-1` invalid.
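The two validity rules above can be sketched as simple predicates in Scala (the `Account` shape is an illustrative assumption):

```scala
// Sketch of the two replay-protection rules; the case class is illustrative,
// not any particular chain's data model.
case class Account(nonce: Long, timestamp: Long)

// Strict nonce rule (Ethereum-style): only exactly the next nonce is accepted,
// so a replayed transaction fails the check.
def validNonce(txNonce: Long, account: Account): Boolean =
  txNonce == account.nonce + 1

// Timestamp-as-nonce rule: any later timestamp is accepted, which is what lets
// an "evil" miner include only TxN and thereby invalidate Tx1, ..., TxN-1.
def validTimestamp(txTimestamp: Long, account: Account): Boolean =
  txTimestamp > account.timestamp
```

The difference between `==` and `>` is exactly the difference between the two attacks discussed: the strict rule forbids gaps, the timestamp rule permits them.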

## Conclusion

It's not yet clear to me what the best way is to protect against replay attacks arising from the signature malleability of elliptic curves: each approach has its benefits and drawbacks. These approaches may be combined with each other, e.g. it's possible to require a canonical signature together with a nonce. Feel free to provide more approaches and examples in the comments, it would be cool to choose an optimal solution!

# Thoughts on 9/11

11 September 2016 Charles Hoskinson 5 mins read


Fallen Memories

Each generation has defining events. I remember the morning of 9/11 as a young teenager seeing the iconic footage of smoking towers with the eventual collapses. Much later in life, I had a chance to parse quite a comprehensive set of data points from that day.

Bush's memoir Decision Points contained a fairly in-depth blow-by-blow account that captured the paranoia and helplessness. Many others have published some account of their experiences. Ted Olson, the solicitor general at the time, famous for arguing for Bush in Bush v. Gore, lost his wife on Flight 77. She fatally delayed her travel by a day to wake up next to Ted on his birthday. Seth MacFarlane, the creator of Family Guy, missed Flight 11 by just ten minutes due to a hangover.

There doesn't seem to be an end to the stories of how pervasive 9/11 has been on our collective psychology. There also doesn't seem to be an end to the retrospective analysis of the causes and motives.

The cryptocurrency and liberty movements in particular are extraordinarily skeptical of official government positions (often justifiably so; just look at Operation Northwoods). The formative lesson I learned from 9/11 is that the United States government seems perfectly capable of purposely ignoring reality.

Inconvenient Truths

There's a wonderful book by the co-chairs of the 9/11 Commission, Kean and Hamilton, entitled Without Precedent, that addresses the frustrations of the commission's members, ranging from chronic underfunding and an artificial deadline to Kafkaesque requirements for testimony from top-level officials such as Bush and Cheney:

• They would be allowed to testify jointly
• They would not be required to take an oath before testifying
• The testimony would not be recorded electronically or transcribed; the only record would be notes taken by one of the commission staffers
• These notes would not be made public

The investigation was so badly hampered there was a desire within the commission to launch a second investigation for obstruction of justice. Obviously, the government seems to have moved on since those days.

We've now committed ourselves to a 15-year war on terrorism without a clear end in sight. We've now committed ourselves to an intelligence goliath (best described in Bruce Schneier's recent book Data and Goliath) that is systematically robbing us of our constitutional rights. Perhaps darkest of all, we've also committed ourselves to giving the US government the legal right to kill its own citizens without due process (Jake Tapper's scathing questioning of Jay Carney is probably the most elegant account).

The harsh reality that every American faces is that we were let down by our government, and that 9/11 seems to be a symptom of an inconvenient truth about the consequences of empire. A proper investigation would have forced a brutal journey through US foreign policy, the conduct of friends/enemies such as Saudi Arabia (see Bob Graham's crusade), the dense shadowy web of interconnections amongst politicians and private industry, and how the intelligence complex works. Succinctly, it just wasn't going to happen.

Lying Truths

I can greatly empathize with those seeking more answers and also harboring deep resentment. It's clear there are abundant lies that we have paid for in much blood and treasure. Some of the most frustrated are veterans who committed to fighting in the ensuing wars after 9/11 to seek retribution against the designated enemies only to find a more nuanced situation alongside TBIs, lost limbs and a general apathy upon returning home.

We didn't even get to see the body (much less an independently verified DNA sample) of the architect of the entire attack after spending trillions of dollars finding and killing him. I guess that's too offensive, except for the times it's not.

To wear my Viktor Frankl hat, I suppose we can honor the losses and derive meaning from that dark day, carrying its terrible 15-year fallout on the world, by making a commitment to changing our government. We need to decouple money and politics. We need better channels to communicate ideas, because the media is failing us. We need to decentralize the US government, with much more power returning to the states. We need to change the way we hold elections and bind our voting system to something like a blockchain for fidelity. We need to change how the United States commits its military to adventures abroad. Finally, we need to end the two-party system.

The American Way

These are enormously challenging tasks and some would say beyond our abilities in the current political system. Yet, the global reinvention of money and ending the Soviet Empire were just as difficult if not more so. What gives me hope is that the American people are pretty special. We seem to have a knack for doing the impossible and then acting as if there was a certainty of success.

For example, our space program was trapped in a hyper-bureaucratic loop of half-started missions and low-hanging fruit. Now we have SpaceX, Blue Origin and others effectively building a roadmap to Mars and beyond. Tesla has proven battery-powered cars are an inevitability. Some of our scientists have even built a star from lasers.

It just takes a bit of courage and also a willingness to experience failure in the process. It also takes an utter rejection of cynicism. Another reality, to paraphrase the late Steve Jobs: all the rules around us are made by people no smarter than you. To treat them as immutable gods is to discount one's own abilities.

Veritatem Cognoscere

On this anniversary of 9/11, I'd like to thank those who fought and those who continue to fight for freedom and truth. And, I'd also like to believe we will evolve as a world beyond the root causes of these events. We just have to be honest, disciplined and resilient.

# Ethereum Classic: An Update

9 September 2016 Charles Hoskinson 8 mins read


I wanted to draft a brief update on IOHK's efforts on Ethereum Classic (ETC). We've had the opportunity to schedule more than three dozen meetings with developers, community managers and academic institutions. We've also managed to have long discussions with several of the community groups supporting ETC to get a better sense of commitments, goals and philosophy. Overall, it's been a really fun experience getting to know a completely decentralized philosophical movement. It's also been illuminating to parse the challenges ahead for the fragile movement as it charts its own path forward. I'll break the report down into what we learned and what we are going to do.

What We Learned

Carlo Vicari and I have been trying to map out the total ETC community and also get some metadata about who they are (vocation, age, geography, interests...) so we can better understand the core constituencies. We will publish some preliminary stats sometime next week, but as a rough summary there are currently several meetup groups, a Telegram group, a subreddit, several China-specific hubs, a Slack with over 1,000 members and a few other lingering groups.

Daily activity is growing and there is interest in more formal structure. With respect to developers, there are about a dozen people in the developer channel with development skills and knowledge of the EVM and Solidity. They have been holding pretty deep discussions about potential directions and roadmaps. The biggest topics are currently pivoting consensus to PoW without the difficulty bomb, new monetary policy and also safer smart contracts.

There is also interest in forming a pool of capital to pay for development efforts ranging from core infrastructure to DApps on top of the system. I haven't taken a position on this effort because we still need to address some governance and legal questions. Regardless of whether this pool materializes, IOHK's commitment of three full time developers will be funded solely by IOHK.

It seems that the price and trading volume of ETC have held relatively stable despite the virtual sword of Damocles that is the DAO hacker and RHG funds. It seems that there is enough community interest in ETC to keep liquidity. I do think there will be tremendous volatility ahead and it's going to be impossible to predict when black swans are going to land in our laps, but I suppose that's what makes it fun?

IOHK's Commitment

After the initial conversations and analysis, we have determined the following serious deficits with ETC:

1. There isn't an official or reliable channel of information about the events of the ecosystem or the commitments of various actors. This reality has led to FUD, impersonations and attempts at market manipulation in the absence of clarity.
2. The roadmap of ETC needs to include at a minimum an emphasis on safety, sustainability and stability. There is a strong desire amongst the ETC community members we had discussions with to focus on reliable, high assurance applications that run on a network with proven fundamentals. Effectively, this needs to be the antithesis of move fast and break things.
3. There is a desire amongst several well-capitalized actors to donate capital to a pool to fund the growth of ETC. This desire has been complicated by the lack of a clear governance structure that will avoid fraud or misuse of funds. Furthermore, an open pool would allow funds to potentially become tainted by the RHG or the DAO hacker donating funds to it. While code-is-law covers the protocol-level use of funds, it does not shield actors from the legal realities of their actions. It is unclear how these funds should be treated or if accepting them would constitute a crime.
4. The media is uncertain how to report on ETC outside of a referential curiosity to Ethereum itself. There needs to be a re-branding and media strategy to ensure new users enter the ecosystem with a clear understanding of what ETC is about and how it differs from ETH.
5. Concepts like the replay attack, and also new potential technology that could be adopted, are not fully understood by ETC community members or general developers. There need to be actors dedicated to education and explanation.
6. The Ethereum Foundation owns the Ethereum trademark. Further use of this branding could provoke a trademark infringement lawsuit against companies using the Ethereum brand and name. This complicates the formation of a centralized governance entity or steering committee if it chooses to use Ethereum Classic as its name. It also complicates business commitments to building on the ETC chain.
There are likely more problems, but these seem to be the most pressing for the time being. They are also compounded by the decentralized nature of the movement, which seems to be a boon for resilience, but a curse for agility. Given this fact, IOHK obviously cannot move unilaterally to address all of these problems; however, we can chart a course and invite the community to follow where they deem reasonable.

Thus IOHK is in the process of doing the following:

1. We have interviewed several community manager candidates and will make our final selection sometime next week. He or she will be responsible for assisting meetup group founders, managing social media channels, broadcasting accurate information, combating FUD, collecting feedback from the ETC community and dealing with media entities. My hope is this position will be defined by its interactions with the ETC community and give us a starting point for timely and credible information at the very least.
2. IOHK is going to subsidize an educator to produce content on ETC ranging from the replay attack to new proposals suggested in various roadmaps. We have one candidate in mind and are finalizing the contract and duration of the relationship. All content will be released under a Creative Commons license and our hope is to again let this role be community driven.
3. IOHK has had numerous discussions with academic partners about the consensus algorithm of ETC and also the smart contract model. We would like to see if the EVM can be improved to be more secure, and whether TypeScript and PureScript could be used as ETC's smart contract languages, representing both a functional and an imperative approach to development that maps nicely onto the skillsets of existing developers. We are seeing what types of partnerships are possible in the next few months and will provide an update.
4. We've also spent quite a bit of time looking at smart contract languages on the horizon. There are some excellent ideas coming from Synereo and Juno's Hopper. IOHK has entered into a partnership with the University of Kent to begin an analysis of transaction languages used in cryptocurrencies. We will have a survey report available sometime in Q4 of 2016. This report will form the basis of our organization's understanding of the interplay of smart contracts in cryptocurrencies. Once available, we will release it to the general public as a whitepaper.
5. We have decided that Scorex 2 will make a good base for our ETC main client (read Alex's first blog post on it). The core is going through a massive refactoring that will be finished sometime this month. From this point, we will retain a Scala-specific team (our three-developer commitment) to fork Scorex 2 and build a full ETC node, including a wallet with a GUI. The architecture of Scorex should allow for much faster iterative improvements and also a great opportunity to test our new blockchain-specific database, IODB.
6. With respect to the developer hires in particular, we have taken quite a few resumes already, but also want to make the process open to the general public. Our new community manager will post the job ad on the ETC reddit once they've been hired. I expect the first developer to be announced sometime in September. Quality Scala developers with the requisite skills to make meaningful contributions to Ethereum are rare and require careful vetting.
7. With respect to a technological steering committee to guide the roadmap process, we are proposing the formation of a federated group tentatively called the smart contract engineering taskforce (after the IETF). Ideally we could develop an RFC process to propose improvement proposals from the community without the need for a formal, centralized entity. We'd love to see this form as a DAO. There could be two tracks covering changes requiring forks and changes that are iterative in nature. We will start the discussion about this group sometime in early October.
8. IOHK cannot resolve the trademark issue, but will make a commitment to not use the Ethereum brand or name in its repos or company assets. This said, we would like to see some form of bilateral resolution to this situation. It seems pyrrhic to seek trademark enforcement on a decentralized movement. We also understand the confusion this issue is causing the general public and developers.
Overall, it's been a great two months and I look forward to the next few to see ETC continue to grow and become a strong, stable cryptocurrency. I'd like to thank the awesome community and all their help. I'd also like to thank the people who had enough patience to talk with Carlo and me despite the long meetings.

Edit: Special shout-out to the Ethereum Classic Russian community: https://ethclassic.ru/

When you start a Bitcoin node, it downloads all the transactions from more than seven years of history in order to check them all. People often ask in community resources whether it is possible to avoid that. In a more interesting formulation, the question would be: "can we get full-node security without replaying the chain from the genesis block?" The question becomes even more important if we consider the following scenario. Full blocks with transactions are needed only in order to update a minimal state, that is, some common state representation sufficient to check whether an arbitrary transaction is valid (against the state) or not. In the case of Bitcoin, the minimal state is the set of unspent outputs (we call this state minimal because a node could also store some additional information, e.g. historical transactions for selected addresses, but this information is not needed to check the validity of an arbitrary transaction). Having this state (with some additional data to perform possible rollbacks), full blocks are not needed anymore and so could be removed.

In Bitcoin, full nodes store all the full blocks since genesis without a clear selfish need. This is altruistic behavior, and we cannot expect nodes to follow it in the long term. But if all the nodes are rational, how can a new node download and replay the history?

A proposal recently put on arXiv tries to solve the problems mentioned with a new Proof-of-Work consensus protocol. I only lightly list the modifications from Bitcoin here; for details, please read the paper.

1. The state is to be represented as an authenticated data structure (a Merkle tree is the simplest example of such a data structure), and its root value is to be included in the block header. This is a pretty old idea, already implemented in Ethereum (and some other coins).
2. We then modify the Proof-of-Work function. A miner chooses uniformly k state snapshot versions out of the last n that a (sufficiently large) network stores collectively. In order to generate a block, the miner needs to provide proofs of possession for all the chosen state snapshots. On a new block's arrival, a miner updates k+1 states, not one, so full blocks (since the oldest of the chosen snapshots) are also needed.
Thus miners store a distributed database of the last n full blocks AND state snapshots, getting rewards for that activity. A new node first downloads block headers since genesis (a header in Bitcoin is just 80 bytes; in Rollerchain it is 144 bytes if a hash function with a 32-byte output is chosen). Then it could download the latest snapshot, or the one from n blocks ago, or one from somewhere in between. It is proven that this scheme achieves the security of a full node going from genesis, with the probability of failure going down exponentially in n (see Theorem 2 in the paper). Full blocks not needed for mining could be removed by all the nodes. They may still store them as in Bitcoin, but we do not expect it anymore.
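The snapshot-selection step in point 2 can be sketched in Scala as a deterministic draw from a block-header seed, so every node can recompute which snapshots a miner must prove possession of. The seeding and hash choice here are my assumptions; the paper specifies the actual Proof-of-Work function.

```scala
import java.security.MessageDigest

// Sketch: deterministically pick k distinct snapshot heights out of the last n,
// driven by a seed taken from the block header. Seeding and hashing details are
// illustrative assumptions, not Rollerchain's exact construction.
def chooseSnapshots(seed: Array[Byte], currentHeight: Int, n: Int, k: Int): Seq[Int] = {
  require(k <= n && n <= currentHeight + 1)
  val candidates = ((currentHeight - n + 1) to currentHeight).toVector
  val md = MessageDigest.getInstance("SHA-256")
  var chosen = Vector.empty[Int]
  var i = 0
  while (chosen.size < k) {
    // Derive the i-th pseudo-random pick from hash(seed || i)
    val h = md.digest(seed ++ BigInt(i).toByteArray)
    val pick = candidates(BigInt(1, h).mod(BigInt(candidates.length)).toInt)
    if (!chosen.contains(pick)) chosen :+= pick
    i += 1
  }
  chosen
}
```

Determinism is the point: a block verifier re-runs the same draw from the same seed and checks the miner's proofs of possession against exactly these heights.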

The RollerChain full node stores only a sliding window of full blocks, thus saving disk space; less bandwidth is also needed in order to feed new nodes in the network, and the bandwidth saved could be repurposed for other tasks.