A few thoughts - Thursday, May 7, 2015


Post by Steve Sokolowski » Thu May 07, 2015 12:59 pm

Today, I'm going to focus my thoughts entirely on the increase of the bitcoin block size proposed by Gavin Andresen.

More people means less agreement

It's great to see that the 1MB block size limit is finally getting attention. Since 2013, I've criticized the development team for not paying more attention to this issue. The team has focused largely on features like blockchain pruning that, while certainly laudable, do not address the #1 problem bitcoin has faced for the past several years.

Unfortunately, what's done is done, and the time wasted cannot be recovered. I bring up this criticism because it is important to understand that we are in trouble because the problem was allowed to worsen until the last minute, and now it is a crisis. It is a crisis because the development team is not a single entity; it consists of many different people who do not easily agree on major issues like this. They do not agree because each team member answers to different people: a mining pool operator is concerned for his miners, a payment processor is concerned for his merchants, and so on.

In last year's posts, I advocated increasing the block size immediately, because as more people get involved in bitcoin, it becomes more difficult to get all of them to agree on a solution. If the block size fix had been rolled out a year ago, some of the people who now disagree would have had no say, because they weren't involved at the time. While that may sound harsh, the truth is that the more people involved, the more difficult it is to come to an agreement.

But now, a solution is being offered by Gavin Andresen in a series of carefully constructed blog posts, launched on Monday morning, the best time for advertising. Instead of one summary of his points, numerous posts are being published so that each can be featured separately on reddit. The urgency of this problem has been clear for some time, but it is no more urgent now than it was before this marketing campaign appeared on Monday. The wrong solution is being proposed in the name of urgency, and this post explains why, despite there being a real issue, Andresen's solution will make things worse.


We only have one chance

It's clear that it will be difficult enough now to get agreement on this increase, if agreement can be obtained at all. It will be absolutely impossible to obtain agreement on any change to the protocol whatsoever once bitcoin is used as a world currency. For a real-world example, consider IPv6, which has been around for roughly 15 years and has made little progress in deployment. Instead, people have developed increasingly complex network address translation (NAT) workarounds for the limitations of IPv4. History is rife with decisions that were made arbitrarily but are now unchangeable.

Once big banks get involved, there will be no way for people with millions of dollars at stake to come to agreement. Even if they did agree to make a change, a huge amount of equipment designed for the previous protocol version would exist and could not possibly be upgraded in any reasonable timeframe.

Furthermore, a hard fork is extremely dangerous. No matter how much preparation is made and how much lead time is given, people are going to lose money. Our pool loses money all the time when coin developers make forks, for a variety of reasons. Many times, we are forced to discontinue a coin when a hard fork occurs. A bitcoin hard fork, even now, will cost tens of millions of dollars in developer labor, equipment upgrades, and money lost due to downtime or bugs. A hard fork in the future would cost an unthinkable amount of money, and Andresen's power to sway people is going to decline as people with more money on the line move into the industry.

Therefore, everyone involved with bitcoin has exactly one chance to get it right, which is now. There is no opportunity for a "patchwork solution." We can't just put out a temporary fix and then deal with the problem when it inevitably recurs.


Build it and they will come

In light of the above, Gavin Andresen's proposal to increase the block size to 20MB as a one-time temporary fix would be disastrous. It is worth noting that most of Andresen's blog posts are correct, and he has made a compelling case as to why an increase is necessary. However, one of the key points he fails to address is one that has been known to highway engineers for decades.

When new lanes are added to a highway, they often fill up almost immediately, and traffic returns to its previous state shortly after the improvement is made. The extra capacity draws more people to commute on the road, because it is faster. Businesses and homes spring up along the road until traffic becomes just barely tolerable again. Then everything is back to the way it was before, in an equilibrium dictated by how much inconvenience people are willing to put up with. Highway engineers call this phenomenon induced demand.

The bitcoin network as it stands now is hobbled by a lack of capacity. There are undoubtedly people who would love to do innovative things with the network, but who do not because the cost of doing so is high. Some people write off uses like proof of existence or colored coins as spam, but those uses provide value to groups of people by doing things that weren't possible before. Even spam transactions are an example of market forces dictating what gets included in blocks. Spam is not going to go away whether or not the block size is increased, but the opportunity for all uses of the blockchain will grow if it is.

However, if the block size is increased to 20MB and not allowed to grow thereafter, all sorts of people will start consuming the fixed capacity immediately, just as new cars fill a widened but otherwise fixed highway. There will be more payments, more "blockchain 2.0" solutions, more useless poetry and images, and more spam transactions. Days or weeks after the size increase, the blocks will be back to nearly full. Even if demand for payments does not increase, the other uses will be so cheap that new entrants will take advantage of them. Furthermore, unlike the expense and time required to buy a car and drive it on a highway, a single person can modify code in a day to increase the number of transactions he sends a hundredfold.
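
For a sense of scale, here is a back-of-the-envelope sketch (in Python) of what these caps mean in sustained throughput. The 250-byte average transaction size and the ten-minute block interval are typical figures rather than protocol constants, so treat the results as rough estimates.

[code]
# Rough throughput estimate for a given block size cap.
# Assumptions (not protocol constants): ~250 bytes per average transaction,
# one block every ten minutes.
AVG_TX_BYTES = 250
BLOCK_INTERVAL_SECONDS = 10 * 60

def transactions_per_second(max_block_bytes):
    """Approximate sustained transaction rate under a block size cap."""
    tx_per_block = max_block_bytes // AVG_TX_BYTES
    return tx_per_block / BLOCK_INTERVAL_SECONDS

for cap_mb in (1, 20):
    rate = transactions_per_second(cap_mb * 1000000)
    print("%2d MB blocks: ~%.0f transactions per second" % (cap_mb, rate))

# Prints roughly:
#  1 MB blocks: ~7 transactions per second
# 20 MB blocks: ~133 transactions per second
[/code]

Even a twentyfold jump only moves the ceiling from single digits to low hundreds of transactions per second, which is why a one-time increase merely resets the clock on the same problem.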

If Andresen's proposal is adopted, all that will happen is that progress will be delayed by an entire year. Millions of dollars will be spent or lost in a high-risk fork, and bitcoin volatility will reach all-time highs as the fork approaches. A few weeks after the fork, the same problem will recur as people take advantage of the cheaper resource. Finally, there will be too many participants at the table to come to an agreement on a real solution, and the network will be permanently limited to 20MB blocks.


Gavin Andresen is wrong, and right

I was surprised by this week's push of blog posts promoting Andresen's new proposal, because he had already made the correct proposal earlier in the year. He spent months testing larger block sizes and proposed a system in which the block size would be increased to 20MB and then grow automatically over time according to a formula. His formula was too conservative in its growth, but the exact numbers aren't that important; an agreement could be reached over the rate of increase.
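
A minimal sketch of what such a schedule could look like: start at a fixed cap and grow it by a constant percentage per year, derived from the block height so that every node computes the same limit independently. The 40% annual growth rate and the height-based bookkeeping below are illustrative assumptions, not Andresen's exact formula.

[code]
# Illustrative sketch of a self-adjusting block size cap: start at 20 MB and
# grow by a fixed percentage per year, computed deterministically from the
# block height. The 40% annual rate is an assumed, negotiable number, not
# Andresen's published formula.
INITIAL_CAP_BYTES = 20 * 1000000
ANNUAL_GROWTH = 0.40
BLOCKS_PER_YEAR = 52560       # roughly one block every ten minutes

def max_block_size(height, activation_height=0):
    """Deterministic block size cap at a given block height."""
    years_elapsed = max(0, height - activation_height) / float(BLOCKS_PER_YEAR)
    return int(INITIAL_CAP_BYTES * (1.0 + ANNUAL_GROWTH) ** years_elapsed)

# At 40% per year the cap roughly doubles every two years:
for year in range(0, 11, 2):
    cap_mb = max_block_size(year * BLOCKS_PER_YEAR) / 1000000.0
    print("year %2d: ~%.0f MB" % (year, cap_mb))
[/code]

Because every node derives the cap from the block height alone, no further coordination or additional hard fork is needed once the formula is in place, which is the entire appeal of this approach.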

One worry that can be dismissed is that, at some point in the future, these huge blocks will sit mostly empty or filled with spam. If the network does not grow exponentially, it will fail regardless. It does not matter if the network withers away by 2025 with only 400KB of real transactions per block; by that time, bitcoin will either be widely used or a different technology will have supplanted it.

The important part about Andresen's earlier proposal is that this change would be the last hard fork that bitcoin would ever intentionally make. Even if bitcoin outgrew Andresen's conservative growth numbers, in the worst case time could just be allowed to pass and the problem would resolve itself.

There would be no IPv6 rehash, where everyone needs to replace servers all at once. There would be no opposition from people who went bankrupt during the first fork and do not want to see the same thing happen again. There would be no thousands of people to consult, no long waiting period, no productivity wasted trying to agree on a solution instead of building new technology, and no extreme volatility before the next fork.

The best decision is to increase the block size to 20MB and then to increase it automatically at a continuous rate according to a formula. This is exactly what Gavin Andresen proposed earlier, and his justification for abandoning it, that there isn't enough time to consider the consequences, is incorrect. If more research is necessary, it's time to stop attending conferences, stop writing blog posts, pull down the shades, hang a "do not disturb" sign on the door, and spend a month testing to confirm there are no unintended consequences.

While I feel that more than enough research has been performed already, an additional month or even three months of research is far better than releasing a temporary solution that would require a year before implementation anyway.


Everyone can veto the change

A common misconception about this block size increase is that miners will decide whether the change is accepted. Miners only have the power to prevent a hard fork; they cannot force its acceptance, even if they mine blocks on the new fork.

To see why, consider the story of Elacoin, a proof-of-work altcoin created several years ago. After some initial interest, activity declined until there were no more development updates and the difficulty plummeted. After about six months, however, a new developer came onto the scene and decided to fork Elacoin to proof-of-stake. He released a new version with great fanfare, and many would-be stakers arrived to use the new coin. He set a date in late 2014 at which the chains would diverge. In anticipation of the change, volatility increased and trading volume rose dramatically.

The exchange Cryptsy, seeing the trading volume, had no reason to change its market, and CEO Paul Vernon ignored customer support tickets from the new Elacoin developers. As the price rose in anticipation of the fork, so did the difficulty. The date of the fork came and went, and the price continued to rise, but Cryptsy did not upgrade its daemon, so mining continued on the old fork. Huge buy orders of 5+ bitcoins appeared. A large group of people continued to stake the new fork, but the only party that actually mattered, Cryptsy, kept ignoring their tickets. Eventually, the hard fork was declared a failure, and Elacoin continued to be traded and mined on the old fork, despite almost every user, developer, and staker wanting the new one.

This story demonstrates that exchanges, not just miners, decide whether a proposed change is accepted. In fact, almost anyone can veto a hard fork. Just 20% of miners can decide not to mine the new blocks, and the fork fails. Just one massive exchange (Coinbase, or the Bitcoin Investment Trust) can continue trading the old fork, and the fork fails. A few of the richest coin owners, like Tim Draper, can decide to keep their bitcoin investments on the old fork, and the fork fails. Big merchants like Overstock and Microsoft can announce that they will only accept the old fork, and the fork fails.

Miners do not have the final word, or even a significant say, in whether the fork succeeds. There are several lessons to be taken from the Elacoin story: first, pool operators like Slush and Luke Jr. do not have the power to make this fork happen, only to block it. Second, exchanges matter more than miners in determining whether the fork succeeds. Third, a bare majority of people accepting the fork is not sufficient for it to happen, since the default decision is to fall back to the old chain if there appears to be even the slightest chance of failure.


Conclusion

To summarize, we are in this situation because it has been nearly impossible to reach a consensus on what to do about the problem. Everyone involved in bitcoin has known for years that the 1MB block size limit would be a problem, and it has taken many years just to get the conversation started. There is nothing to suggest that, with more interests and more money involved, it will take fewer years to solve the newly created 20MB limit problem. The 20MB limit is not even a "better problem to have" than the 1MB limit, since the exact same solutions (off-chain transactions) are needed to solve both. A static 20MB limit represents no real progress.

I agree with Andresen that this problem is critically urgent. It was urgent last year, too. The time wasted can't be regained, so it is a sunk cost. Instead of rushing to a static limit, Andresen should return to his earlier solution of automatic block size increases. If the core developers believe more research is necessary, they should stop the publicity, sit down in front of the code, devote 100% of their time to the research, and report the results. If Andresen's previous proposal is flawed in some way, then a new solution should be proposed that is dynamic, with block sizes increasing by default unless a reason is found to stop them.

I am not confident that agreement will be reached even at this stage, and precisely because of that unlikelihood, there is but one chance to get this right. The only correct solution is one intended to be bitcoin's last hard fork, one that addresses the block size problem permanently.