Vitalik Buterin: The Core Difficulties of Blockchain Scaling, from Easiest to Hardest, Are Computation, Data, and State
Odaily News: Vitalik Buterin published an article laying out his layered view of blockchain scalability, ranking the three bottlenecks from easiest to hardest to scale: computation, data, and state.
Vitalik stated that computation is the easiest to scale: it can be parallelized, block builders can supply "hints" that speed up verification, or large amounts of computation can be replaced with proofs via methods such as zero-knowledge proofs. Data scaling is of moderate difficulty: if a system requires data availability guarantees, that requirement cannot be avoided, but it can be optimized through data splitting and erasure coding (e.g., PeerDAS). Data also supports "graceful degradation," meaning nodes with lower data capacity can still produce correspondingly smaller blocks.
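To make the erasure-coding idea concrete, here is a toy sketch in Python. It uses a single XOR parity chunk, which tolerates exactly one missing chunk; the Reed-Solomon coding used in practice (e.g., by PeerDAS) tolerates many more, but the recovery principle, rebuilding lost data from the surviving coded chunks, is the same. All names here are illustrative, not from any real implementation.

```python
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    # Byte-wise XOR of two equal-length chunks.
    return bytes(x ^ y for x, y in zip(a, b))

def encode(chunks: list[bytes]) -> list[bytes]:
    # Append one parity chunk: the XOR of all data chunks.
    # Toy single-erasure code; real systems use Reed-Solomon.
    return chunks + [reduce(xor, chunks)]

def recover(coded: list[bytes], missing_index: int) -> bytes:
    # Rebuild the one missing chunk by XOR-ing the survivors.
    survivors = [c for i, c in enumerate(coded) if i != missing_index]
    return reduce(xor, survivors)

# Three hypothetical data chunks; after encoding, any single
# chunk (data or parity) can be lost and rebuilt.
data = [b"aaaa", b"bbbb", b"cccc"]
coded = encode(data)
assert recover(coded, 1) == b"bbbb"  # lost chunk rebuilt from the rest
```

The "graceful degradation" point follows naturally: a node that can only store a fraction of the coded chunks still contributes to availability, because the full data remains reconstructible from any sufficiently large subset.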
In contrast, state is the hardest part to scale. Vitalik pointed out that verifying even a single transaction requires the complete state; even if the state is abstracted into a tree and only the root hash is kept, updating that root still depends on the complete state. Methods to split the state do exist, but they typically require significant architecture-level changes and are not universal solutions.
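A minimal sketch can illustrate why holding only the root is not enough. Below, the state is modeled as four hypothetical account records under a binary SHA-256 Merkle tree (all names and values are illustrative): changing one leaf changes the root, and computing the new root requires hashes derived from the rest of the state, not just the old root.

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    # Hash the leaves, then pair-and-hash upward to a single root.
    # Assumes the number of leaves is a power of two.
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Four hypothetical account states.
state = [b"acct0=10", b"acct1=20", b"acct2=30", b"acct3=40"]
root = merkle_root(state)

# Update a single account. Recomputing the new root needs the
# sibling hashes along the leaf's path, i.e. information about
# the rest of the state; the old root alone is insufficient.
state[2] = b"acct2=31"
new_root = merkle_root(state)
assert new_root != root
```

In practice, the sibling hashes can be shipped alongside a transaction as a Merkle proof, but someone must still maintain the full state to produce those proofs, which is why splitting state tends to force architecture-level changes.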
Based on this, Vitalik concluded that if data can replace state without introducing new centralization, it should be prioritized; and if computation can replace data without introducing new centralization, that trade should also be seriously considered.
