Lee Center for Advanced Networking

Outside of biology and economics, the Internet is the best example of a complex system, says Lee Center member Steve Low, an associate professor of computer science and electrical engineering at Caltech. Low argues that the apparent convenience and simplicity of communication services are an illusion. The reality, he says, is that the Internet is a complex array of mechanisms working in concert over patchy, unreliable, and changing networks. These mechanisms determine which data gets sent in what order and along which path; how the system detects and recovers from errors, failures, and attacks; and how applications interface with various hardware and software platforms.

Robust and flexible as the Internet is, its rapid growth may pose threats to its original design. For example, a particular component of the network could limit its future growth and flexibility. While researchers understand the individual computers, protocols, and technologies that make up the Internet, the behavior of these components within a complex network is less certain. How do we understand the emergent properties of a network based on our understanding of the individual components?

Low has been looking at the particular problem of congestion, which can slow e-mail and Web page loading. On the Internet, all packets of data travel through routers that direct the packets to their next destination. If too much data arrives at a particular router, congestion results. As protocols operate now, when there is no congestion, senders steadily increase their transmission rates, but when congestion occurs, they drastically cut back. As the Internet grows, this volatile ebb and flow of data could lead to instability, says Low.
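The increase-then-cut-back behavior described above can be pictured with a toy simulation. This is only an illustrative sketch, not Low's model or any real protocol implementation: a single flow additively increases its rate until it exceeds an assumed link capacity, then multiplicatively cuts back, producing the sawtooth ebb and flow the article describes. All parameter names and values (`capacity`, `alpha`, `beta`) are made up for illustration.

```python
def aimd(capacity=20.0, alpha=1.0, beta=0.5, steps=100):
    """Toy additive-increase / multiplicative-decrease rate trajectory.

    The sender probes for bandwidth by adding alpha each step; when its
    rate exceeds the (assumed) link capacity, congestion is detected and
    the rate is cut by the factor beta.
    """
    rate = 1.0
    trajectory = []
    for _ in range(steps):
        if rate > capacity:      # congestion: the router is overloaded
            rate *= beta         # drastic multiplicative reduction
        else:                    # no congestion detected
            rate += alpha        # gentle additive increase
        trajectory.append(rate)
    return trajectory

rates = aimd()
# The rate saw-tooths between roughly beta * capacity and capacity,
# never settling: the volatile ebb and flow the article describes.
```

The sharper the cut-back factor and the larger the link, the wider these oscillations become, which is one intuition for why this behavior could destabilize a much larger network.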

Ideally, as the Internet grows, its protocols should be able to rescale themselves so that packets of data always flow in a smooth manner. Low and his colleagues have shown that such scalable protocols are possible, and they are now implementing the new algorithms in software to test their ideas.
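One way to picture this "rescaling" idea, again as a toy sketch rather than the actual algorithms Low's group developed: if the rate adjustment is made proportional to how far the flow is from equilibrium instead of a fixed step, the rate converges smoothly at any link capacity, small or large. The update rule and parameter `gamma` here are invented for illustration.

```python
def scaled_rate(capacity, gamma=0.5, steps=60):
    """Toy controller whose step size scales with the congestion signal.

    Each step moves the rate a fixed fraction gamma of the remaining gap
    to capacity, so the trajectory converges geometrically and smoothly,
    with no sawtooth, regardless of how large the capacity is.
    """
    rate = 1.0
    for _ in range(steps):
        rate += gamma * (capacity - rate)
    return rate

# The same controller settles smoothly on links of very different sizes.
small = scaled_rate(100.0)
large = scaled_rate(10_000.0)
```

The contrast with the fixed additive increase of the sawtooth scheme is the point: a controller that scales its response with the network keeps flowing smoothly as capacities grow.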