“Prove it!” How a deep dive into the internet’s vital protocols earned a “best paper” honor

Author: Milton Posner
Date: 07.10.23

If you’ve ever had your WiFi cut out for a few hours, especially in the era of remote work, you know how frustrating it can be to go without internet access for even a short period. But what we seldom consider is just how damaging it could be if the bedrock of the internet were to crumble altogether.

It is in that bedrock that Max von Hippel spends much of his time. The rising fifth-year doctoral student works in formal verification — essentially, taking his trusty mathematical methods and computer systems, sidling up to an algorithm that’s too long and tricky to work out by hand, and asking, “So you think you’re functioning properly? Let’s prove it.”

In doing so, he’s proved himself worthy of a “best paper” award. “A Formal Analysis of Karn’s Algorithm” — co-authored with his advisor Cristina Nita-Rotaru, Lenore Zuck at the University of Illinois Chicago, and Ken McMillan at the University of Texas at Austin — won the honor at NETYS, an international conference for networked systems. Its findings provide insight into components of the internet that we rely on every day, but rarely recognize.

Max von Hippel (left) and Cristina Nita-Rotaru (right).

To understand those findings, let’s backtrack a bit. The internet runs on protocols, which, as von Hippel explains, are “the elaborate handshakes that allow computers to communicate over the internet.” Like most things on the internet, protocols are vulnerable to attack, whether from state-level hackers seeking intelligence or solitary actors fishing for personal gain.

“An attack on a protocol consists of finding some edge case that the developers never thought of, then guiding the system into that edge case,” von Hippel says, describing a scenario where the protocols are severely overtaxed. “There are edge cases that can happen due to some natural event, but we can also imagine an adversary controlling the connection between two peers and changing the pace at which data makes it through. If they do that in a strategic way, they may cause a failure.”

This focus on pace underpins the team’s dive into two essential ingredients of congestion control algorithms, which account for changes in traffic and stabilize the internet. The first is Karn’s algorithm, which estimates the time it takes for one computer to acknowledge the receipt of packets (chunks of data) from another computer, allowing the machines to deduce the transit conditions of the network. The second is the retransmission timeout (RTO) calculation, which notifies the data sender if the data is not acknowledged in time, meaning the data didn’t arrive properly and must be retransmitted more slowly. Without these components, the network would be like a car in the dark without headlights: liable to come to a sudden and messy halt.
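To make that concrete, here is a minimal sketch in Python of how Karn’s rule and the standard RTO calculation (as specified in RFC 6298) fit together. This illustrates the textbook algorithm, not the formal model the paper analyzes, and the class and variable names are purely illustrative.

```python
# A simplified sketch of the classic RTO estimator from RFC 6298,
# including Karn's rule: round-trip samples from retransmitted packets
# are discarded, because an ACK for such a packet can't be matched
# unambiguously to a single transmission.

ALPHA, BETA, K = 1 / 8, 1 / 4, 4   # standard RFC 6298 smoothing constants
MIN_RTO = 1.0                      # RFC 6298 lower bound, in seconds

class RtoEstimator:
    def __init__(self):
        self.srtt = None    # smoothed round-trip time
        self.rttvar = None  # round-trip time variation estimate
        self.rto = MIN_RTO  # current retransmission timeout

    def on_ack(self, rtt_sample, was_retransmitted):
        """Update the estimator when an acknowledgment arrives."""
        if was_retransmitted:
            return  # Karn's rule: ambiguous sample, ignore it
        if self.srtt is None:
            # First valid sample initializes both estimates.
            self.srtt = rtt_sample
            self.rttvar = rtt_sample / 2
        else:
            # Exponentially weighted updates; RTTVAR uses the old SRTT.
            self.rttvar = (1 - BETA) * self.rttvar + BETA * abs(self.srtt - rtt_sample)
            self.srtt = (1 - ALPHA) * self.srtt + ALPHA * rtt_sample
        self.rto = max(MIN_RTO, self.srtt + K * self.rttvar)

    def on_timeout(self):
        """Exponential backoff: the sender slows down after each timeout."""
        self.rto *= 2
```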

So even when the internet appears to be functioning ideally on the surface, the components underneath may not be. And without hard proof that those components work properly, the door stays open to protocol vulnerabilities and wasteful, repeated retransmissions.

“There are attributes of Karn’s algorithm that people have taken for granted for 30 years, attributes we’ve never formally verified or even defined. We’ve never written down what it meant for these things to be true at a mathematical level,” von Hippel cautions. “When you study something that’s built on top of Karn’s algorithm, you want to know what assumptions you can make. And in our view, prior works had made too many assumptions.”

Through their research, the team formally verified Karn’s algorithm and the RTO calculation, proving important properties of both. They confirmed that several elements function as intended; for instance, they showed that Karn’s algorithm produces a pessimistic (deliberately high) estimate of the round-trip time for data transfers, which keeps senders from pushing too much data into the network. But they also discovered that it is theoretically possible for retransmission timeouts to recur indefinitely even amid steady rates of communication, which would degrade internet performance.
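To see why runaway timeouts matter, consider a toy run of the sketch above (the numbers are hypothetical): each timeout doubles the RTO, so a sustained run of timeouts slows the sender geometrically even while data is still flowing.

```python
# Hypothetical numbers, continuing the sketch above: one clean RTT
# sample followed by three consecutive timeouts.
est = RtoEstimator()
est.on_ack(rtt_sample=0.2, was_retransmitted=False)
print(est.rto)  # 1.0 -- SRTT + 4*RTTVAR is 0.6s, but the 1s floor applies
for _ in range(3):
    est.on_timeout()
print(est.rto)  # 8.0 -- each timeout doubled the retransmission timeout
```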

“By formally verifying the algorithm, we made it clearer which assumptions you could make and which you couldn’t,” von Hippel explains. “Our work gives you a set of guarantees and requirements to understand Karn’s algorithm, which means that future work on higher-level systems will know what they can and can’t take for granted instead of guessing — which is what people have been doing.”

Some of that future work may be von Hippel’s. Next year, he will defend his dissertation, which will include his Karn’s algorithm work, after which he plans to seek formal verification roles. This could mean a job with Galois, the consultancy he’s interning with this fall. It could mean a security position with the military. He could even work on verifying supercomputers, which are essentially just groups of computers talking to each other through protocols. For now, though, he and his collaborators can bask in the award from the NETYS conference.
