Scaling Ethereum to Billions of Users
When you look into the scaling issues associated with the Ethereum blockchain, you will find a number of holes in Ethereum’s jacket. Many technical challenges have been identified, some specific to Ethereum and some inherent in the blockchain model in general. The issues are in clear sight of developers, with different answers to the challenges posited, and avid discussion and coding of potential solutions. Insiders have varying degrees of confidence as to whether and how these issues can be overcome so that the industry can evolve into its next phases of development.
Some think that the de facto standard will be the Ethereum blockchain: it is the incumbent, with the most widely deployed infrastructure and network effects strong enough that it cannot help but become the standardized base. Others are building new and separate blockchains, or technology that does not use a blockchain at all (like Ripple). One central challenge with the underlying Ethereum technology is scaling up from the current ceiling of roughly 13 transactions per second (the VISA credit card processing network routinely handles 2,000 transactions per second and can accommodate peak volumes of 10,000 transactions per second), especially if there were to be mainstream adoption of Ethereum. Other issues include increasing the block size, addressing blockchain bloat, countering vulnerability to 51 percent mining attacks, and implementing hard forks (changes that are not backward compatible) to the code, as summarized here:
Throughput
- The Ethereum network has a potential issue with throughput in that it processes only on the order of ten transactions per second (tps), with a practical ceiling of roughly 13 tps under the current block gas limit (a back-of-the-envelope derivation of that ceiling is sketched after this section). Core developers maintain that this limit can be raised when it becomes necessary. One way Ethereum could handle higher throughput is to make each block bigger, though right now that leads to other problems with size and blockchain bloat. Comparison metrics from other transaction-processing networks are VISA (2,000 tps typical; 10,000 tps peak), Twitter (5,000 tps typical; 15,000 tps peak), and advertising networks (>100,000 tps typical).
Facebook handles about 175,000 requests per second (roughly 900,000 users are on the site in any given minute; assuming each takes an action every 5 seconds, that works out to about 180,000 actions per second). And this probably doesn’t include API requests, which are a better analogue and probably 3–4x higher.
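To make the throughput ceiling concrete, here is a minimal back-of-the-envelope sketch. The block gas limit (8,000,000), the average gas per transaction (45,000), and the average block interval (14 seconds) are illustrative assumptions rather than measured figures; plugging them in lands in the neighborhood of the ~13 tps quoted above.

```python
# Rough Ethereum throughput estimate: sustained transactions per second is
# bounded by how much gas fits in a block, divided by the gas an average
# transaction uses, spread over the average block interval.
# All three parameters are assumptions chosen for illustration only.
BLOCK_GAS_LIMIT = 8_000_000   # assumed block gas limit
AVG_GAS_PER_TX = 45_000       # assumed average gas per transaction (mixed workload)
AVG_BLOCK_TIME_S = 14         # assumed average block interval, in seconds

def max_tps(gas_limit: int, gas_per_tx: int, block_time_s: float) -> float:
    """Upper bound on sustained transactions per second."""
    txs_per_block = gas_limit / gas_per_tx
    return txs_per_block / block_time_s

print(f"mixed workload: ~{max_tps(BLOCK_GAS_LIMIT, AVG_GAS_PER_TX, AVG_BLOCK_TIME_S):.1f} tps")
# A plain ETH transfer costs 21,000 gas, so a transfer-only block does better:
print(f"transfers only: ~{max_tps(BLOCK_GAS_LIMIT, 21_000, AVG_BLOCK_TIME_S):.1f} tps")
```

Raising the gas limit or shortening the block interval raises this ceiling, which is exactly the “bigger blocks” trade-off with size and bloat mentioned above.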
Latency
- Right now, each new Ethereum block takes on the order of 15 seconds to be mined, so it takes at least that long for a transaction to receive its first confirmation. For sufficient security you should wait for several confirmations, which in practice means a few minutes, and for larger transfer amounts the wait needs to be even longer, because the cost of reversing the payment must outweigh the payoff of a double-spend attack (in which the same ether is spent in a separate transaction before the merchant can confirm receipt of what appears to be the intended payment). Again, as the comparison metric, VISA confirms in seconds at most. A sketch of such a confirmation policy follows.
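The “wait longer for larger amounts” rule can be expressed as a simple policy. The block interval and the confirmation thresholds below are hypothetical illustrative values, not protocol constants; real services set their own policies.

```python
# A hypothetical merchant confirmation policy: wait for more blocks before
# treating a payment as final when the amount at stake is larger, so that the
# cost of attempting a double-spend outweighs the payoff.
AVG_BLOCK_TIME_S = 14  # assumed average block interval, in seconds

def required_confirmations(amount_eth: float) -> int:
    """Illustrative thresholds; actual services choose their own."""
    if amount_eth < 0.1:
        return 1    # low value: accept after a single block
    if amount_eth < 10:
        return 12   # mid value: a commonly used exchange-style heuristic
    return 30       # high value: wait considerably longer

def expected_wait_seconds(amount_eth: float) -> float:
    return required_confirmations(amount_eth) * AVG_BLOCK_TIME_S

for amount in (0.05, 1.0, 500.0):
    confs = required_confirmations(amount)
    print(f"{amount:>7} ETH -> {confs:>2} confirmations, ~{expected_wait_seconds(amount):.0f} s")
```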
Size and bandwidth
- The blockchain is 25 GB and grew by 14 GB in the last year, so it already takes a long time to download (e.g., a day). If throughput were to increase by a factor of 2,000 to VISA standards, for example, the chain would grow by 1.42 PB/year, or about 3.9 TB/day; at 150,000 tps, it would grow by 214 PB/year (the daily equivalents are worked out in the sketch below). The Ethereum community calls the size problem “bloat,” but that term assumes we want a small blockchain; to really scale to mainstream use, the blockchain would need to be big, just more efficiently accessed. The size also pushes toward centralization, because it takes real resources to run a full node, and only about 7,000 servers worldwide actually run full Ethereum nodes (the node software, such as geth, running continuously in the background). Although 25 GB of data is trivial in many areas of the modern “big data” era and data-intensive science, where terabytes are the standard, that data can usually be compressed, whereas the blockchain cannot be, for security and accessibility reasons. Perhaps this is an opportunity to innovate new kinds of compression algorithms that would keep the blockchain usable and storable at much larger future scales while retaining its integrity and accessibility.
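As a quick sanity check on those projections, here is a minimal sketch that converts the annual growth figures quoted above into daily figures; the only inputs are the numbers already in the text, plus the unit conversion itself.

```python
# Convert the projected annual blockchain growth quoted above into daily figures.
def tb_per_day(pb_per_year: float) -> float:
    """Daily growth in terabytes, taking 1 PB = 1,000 TB and a 365-day year."""
    return pb_per_year * 1_000 / 365

projections_pb_per_year = {
    "VISA-scale (2,000 tps)": 1.42,  # PB/year, figure from the text
    "150,000 tps": 214,              # PB/year, figure from the text
}

for label, pb_year in projections_pb_per_year.items():
    print(f"{label}: {pb_year} PB/year ≈ {tb_per_day(pb_year):.1f} TB/day")
```

At those rates, storage and bandwidth, not consensus, become the binding constraint for anyone trying to run a full node.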
How can Ethereum systematically scale better?
The way forward is hiring more developers, at rates competitive with companies like Google, so that forward-thinking engineers are willing to take the plunge, and developing more projects based on smart contracts, DApps, and DAOs. Working in this industry is developing its own risk spectrum: working on Ethereum could come to resemble working at a Google, lower risk with broad impact right away.
The capital available in the Ethereum ecosystem is sufficient to fund this work and to scale the platform to a billion users across many applications. It will not happen in days or months, and perhaps not even within a year, but it is surely achievable.