Source: Lightning HSL
In the latest BCH protocol upgrade proposal, bch-vm-limits, I came across a brand-new concept: "density of computation." Nothing like it exists on other UTXO chains, so I got curious and did some research.
What is computational density?
Bitcoin limits blocks to a 1 MB base (transaction) portion plus up to roughly 3 MB of segregated witness (signature) data, and each transaction faces size and opcode-count restrictions. Each Ethereum block likewise has a gas limit. Anyone who has studied EOS closely knows it has three resources, CPU, RAM, and NET, and sending an EOS transaction consumes all three.
All of these exist to keep the blockchain network secure and to prevent malicious transactions from attacking it.
The most classic case of legitimate transactions being used to attack the Ethereum network is the grab for the final Fomo3D jackpot in 2018. If you want the full story of Fomo3D, you will need to search for it yourself; this article won't spend words retelling it.
At the time, the attacker constructed special contract transactions between block heights 6191897 and 6191902 that consumed the entire gas limit of each block, so that no other users' transactions could be included, only the attacker's own, and finally walked away with the 10,469 ETH prize.
These are the three classic designs for preventing attacks on a blockchain network: BTC's block-size and transaction-size limits, ETH's block gas limit, and EOS's CPU/RAM/NET resource limits.
BTC's design is the most primitive and the safest, and it has stood the test of time. BCH has continued the same design since its birth, only adjusting specific parameters.
ETH's gas-limit design is arguably the most successful; it has become an industry standard.
EOS's design has so far been a failure: RAM and the other resources never fulfilled the design concept and instead became objects of token speculation.
Off topic: ETH once had a project called GasToken that let users speculate on gas as a token, but it was shut down by Vitalik Buterin and others.
Note the claims above: BTC's design is the safest, while gas limits are the most successful.
Measured on the axis of security versus programmability, the gas-limit design pushes programmability to the extreme: the EVM is Turing-complete, which is the decisive factor behind Ethereum's prosperous economic ecosystem.
UTXO ecosystems such as BTC are clearly hobbled by programmability, especially the limit on the number of opcodes per transaction. If the number of additions, subtractions, multiplications, and divisions is capped, how can you program anything serious? But it is equally clear that BTC's block-size and transaction-size limits provide the ultimate guarantee for Bitcoin's decentralization and security. The Bitcoin network has never suffered a major problem from a DDoS. When "malicious" floods of transactions or complex contract transactions (P2SH) pour into the mempool, the worst that happens is waiting for blocks to be mined one by one; nothing else breaks.
For UTXO technology, improving programmability is likely to introduce new security issues.
BCH's new concept of computational density is meant to balance the programmability and security of UTXO: to greatly improve programmability while keeping the BCH network secure.
Computational density is defined by limiting the computation that may be performed on an input according to the byte length of that input's data. That is, each input in a transaction is allocated a computational budget based on its size in bytes, and this budget caps the maximum amount of work a node may perform when validating the transaction.
The bch-vm-limits proposal provides a calculation formula that I don't fully understand; I only know that the computation being metered mainly refers to hashing. That level of detail gets into the construction and verification of BCH transactions, and I can't cover all of it here.
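The budget idea can be sketched roughly as follows. This is only my reading of the CHIP: the constants (a fixed per-input base of 41 bytes and 800 budget units per byte) are assumptions taken from that reading and should be treated as illustrative, not authoritative.

```python
# Sketch of "density of computation": each input earns a compute budget
# proportional to its encoded size, and script evaluation must stay within
# that budget. The constants are ASSUMED from my reading of the
# bch-vm-limits CHIP and may not match the final spec exactly.
BASE_INPUT_BYTES = 41   # assumed fixed overhead credited to every input
BUDGET_PER_BYTE = 800   # assumed budget units granted per input byte

def operation_cost_budget(unlocking_bytecode_len: int) -> int:
    """Maximum operation cost a node will spend validating this input."""
    return (BASE_INPUT_BYTES + unlocking_bytecode_len) * BUDGET_PER_BYTE

# A 100-byte unlocking script would get (41 + 100) * 800 = 112,800 units.
print(operation_cost_budget(100))  # → 112800
```

The point is the shape of the rule, not the numbers: bigger inputs buy more computation, so a tiny input can never force validators into huge amounts of work.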
Unlike the gas limit, which ties computation directly to fees, computational density does not change BCH's mining-fee design: fees are still calculated in sats/byte.
The gas-limit design means that as long as you pay (gas fees), you can deploy arbitrarily complex contracts, provided you stay under the per-block gas limit. The current block gas limit is 30 million gas; at a gas price of 10 gwei, consuming the full 30 million gas costs 0.3 ETH. Thirty million gas is a very large amount, enough to design very, very complex contracts.
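The 0.3 ETH figure follows directly from the numbers in the article (30M gas at 10 gwei):

```python
# Cost, in ETH, of consuming an entire Ethereum block's gas limit
# at a given gas price. Figures are those quoted in the article.
GWEI_PER_ETH = 10**9

def block_fill_cost_eth(block_gas_limit: int, gas_price_gwei: float) -> float:
    """Total fee (in ETH) to burn through the whole block gas limit."""
    return block_gas_limit * gas_price_gwei / GWEI_PER_ETH

print(block_fill_cost_eth(30_000_000, 10))  # → 0.3
```

This is exactly the budget the Fomo3D attacker was willing to spend per block to keep everyone else's transactions out.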
BCH's computational density constrains the amount of computation per unit of transaction size. I estimate that the contracts it allows are certainly far less complex than what the gas-limit design permits, but far more complex than what BTC's and BCH's original transaction-size and opcode limits allowed. The document describes the increase as roughly 100x.
Jason Dreyzehner, the proposal's author, lists many benefits of computational density in the document and praises it at every turn in comparisons with the gas limit. I hope it holds up in real production.
In the application-scenario sections, the developers go even further, mentioning quantum-resistant cryptography, zero-knowledge proofs, homomorphic encryption, and other crown-jewel areas of cryptocurrency technology. For now there is no way to tell how much of that is real.
Finally, I feel that BCH's developers are genuinely innovative: CashTokens, activated in 2023, and computational density, to be rolled out this year, are both firsts I have seen anywhere in the crypto space.