Ethereum layer-2 scaling network Starknet has outlined plans to improve the decentralization of three core components of its zero-knowledge (ZK) proof rollup solution.
Speaking exclusively to Cointelegraph, Starknet product manager and blockchain researcher Ilia Volokh outlined the firm’s intent to address certain centralized elements of its protocol, with the aim of defending against censorship and making the system more robust.
Starknet operates as a validity rollup using ZK-proof technology to bundle transactions, with cryptographic proofs submitted to Ethereum to achieve security and finality for layer-2 transactions.
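The validity-rollup flow described above can be illustrated with a toy sketch. This is a simplification for intuition only: the function names (`bundle_transactions`, `compute_proof`, `l1_verify`) are hypothetical, and real validity proofs use STARK cryptography rather than the hash stand-in shown here.

```python
# Toy model of a validity rollup: sequencer bundles transactions,
# a prover attests to the resulting state, and an L1 verifier
# accepts the state update only if the proof checks out.
# NOTE: hashes stand in for real STARK proofs; names are hypothetical.
import hashlib
import json


def bundle_transactions(txs):
    """Sequencer step: collect L2 transactions into a block."""
    state_root = hashlib.sha256(json.dumps(txs).encode()).hexdigest()
    return {"txs": txs, "state_root": state_root}


def compute_proof(block):
    """Prover step: stand-in for generating a validity proof."""
    return hashlib.sha256(block["state_root"].encode()).hexdigest()


def l1_verify(block, proof):
    """L1 verifier: rejects any state update whose proof does not check out."""
    return proof == hashlib.sha256(block["state_root"].encode()).hexdigest()


block = bundle_transactions([{"from": "alice", "to": "bob", "amount": 5}])
proof = compute_proof(block)
assert l1_verify(block, proof)           # valid update is accepted
assert not l1_verify(block, "forged")    # forged proof is rejected by L1
```

The key property the sketch captures is the one Volokh describes next: whoever operates the sequencer and prover, an invalid state transition cannot pass the on-chain verifier.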
According to Volokh, Starknet’s protocol remains dependent on StarkWare for creating L2 blocks, computing proofs and initiating layer-1 state updates to the Ethereum blockchain.
“In this sense, the operation of the network is centralized. This temporary situation, until full decentralization, is not necessarily a bad thing. Although StarkWare operates the network, it cannot steal money and can’t do any invalid state transitions because they require executing the verifier on Ethereum,” Volokh explained.
While StarkWare remains a “centralized gateway” to enter Starknet, Volokh added that the protocol is “100% honest” and cannot falsify transactions or information, as Ethereum’s layer-1 blockchain acts as a filter.
The only tangible ways in which Starknet can “misbehave” are by idling and failing to relay proofs to Ethereum, or by deliberately censoring certain parties and excluding their transactions or proofs.
“For example, if the sequencer decides to exclude a transaction from a particular entity, they’re free to do so. As long as the other things that they are trying to promote are valid.”
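The scenario Volokh describes can be sketched in a few lines. This is a hypothetical illustration, not Starknet code: a single sequencer operator can silently drop transactions from a blacklisted sender while still producing a block that every validity check accepts.

```python
# Toy illustration of sequencer-level censorship: the block built below
# is perfectly valid, yet one party's transaction never makes it in.
# All names and data are hypothetical.
def build_block(mempool, blacklist):
    """A censoring sequencer: include only transactions from allowed senders."""
    return [tx for tx in mempool if tx["from"] not in blacklist]


mempool = [
    {"from": "alice", "to": "bob", "amount": 5},
    {"from": "mallory", "to": "bob", "amount": 2},
]

block = build_block(mempool, blacklist={"mallory"})
# mallory's transaction is silently excluded; the block remains valid,
# so validity proofs alone cannot detect or prevent this behavior.
```

Because the resulting block is valid, the L1 verifier has no way to object, which is why censorship resistance requires decentralizing the operators themselves rather than relying on proof verification.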
For Starknet, this possibility of censorship is a key motivation for decentralizing parts of its protocol, in an effort to combat two main causes of censorship in consensus-based systems.
Intentional censorship is one consideration, while “non-robust” systems that have a single point of failure present another threat to decentralization, given that all network participants would be “censored” if this central point caused a network or system outage.
“We want to solve both of these problems, and we think the obvious solution to both of them at the same time is to have as many people operating Starknet as possible.”
Decentralizing these different components of Starknet’s system entails varying degrees of difficulty. This includes decentralizing block production through its consensus protocol, decentralizing the proving layer, which is in charge of computing proofs for blocks, and decentralizing the process of L1 state updates.
“I want to emphasize that it’s crucial to decentralize each of them because as long as even one of them is centralized, you haven’t achieved much,” Volokh added before unpacking the relevant challenges of each component.
Decentralizing block production has been fairly straightforward, given that all blockchains rely on a consensus protocol and a Sybil-resistance mechanism. Decentralizing Starknet’s prover, meanwhile, has required a more novel approach.
“As far as I know, we’re the first rollup that has come out with a fairly complete and concrete solution,” Volokh said. He went on to explain that competing ZK-rollups all essentially aggregate transactions into proofs and post them on Ethereum, which by extension transfers Ethereum’s decentralization to the rollup.
However, these systems all rely on their respective central entities to create and prove blocks, which means these layer 2s are “equally centralized.” Whether end users care about the philosophical implications of centralized L2 components is, for Volokh, another conversation altogether:
“The people who appreciate decentralization do so because they understand that it gives more security, and we share those values more than we think people will like them for commercial reasons.”
Volokh added that Starknet is still mapping out how it will test and implement these decentralized mechanisms in its network, likely through a series of interconnected testnets that exercise the different components simultaneously.