
Big data versus big congestion: Using information to improve transport

By Carl-Stefan Neumann

Digitization in infrastructure networks could improve forecasting, promote reliability, and increase efficiency.

Congestion—on the road, in the air, on the rails—wastes time, increases pollution, and is costly to society. Commuters in Brussels and London waste more than 50 hours a year in traffic jams; that’s the equivalent of more than a full week of work. Across Europe as a whole, infrastructure congestion costs 1 percent of GDP. In the United States, airport delays alone cost the economy some $6 billion.

It doesn’t have to be this way. In 2013, the McKinsey Global Institute concluded that, globally, $400 billion a year could be saved by “making more of existing infrastructure” through improved demand management and maintenance.1 That is where digitization, in the form of big data, can help. The collection and strategic use of information can improve forecasting and help to nudge behavior in ways that improve the reliability of transport infrastructure and increase its efficiency and utilization. In fact, some of this is already happening.

  • Israel has introduced a 13-mile fast lane on Highway 1 between Tel Aviv and Ben Gurion Airport. The lane uses a toll system that calculates fees based on traffic at the time of travel. To make it work, the system counts the cars on the road; it can also evaluate the space between cars to measure congestion. This is real-time pattern recognition of a very high order. The information is then put to use in a way that increases “throughput,” or the amount of traffic the road can bear. If traffic density is high, tolls are high; if there are few cars on the road, charges are low. This not only keeps toll revenues flowing but also reduces congestion by “steering” demand.
  • In Brazil, aviation traffic has been growing fast for the past decade, and annual passenger traffic is expected to more than double by 2030, reaching more than 310 million passengers. Not surprisingly, airspace congestion is a growing concern. To deal with the problem, Brazil is introducing a system that harnesses GPS data to optimize the use of available airspace, enabling less separation between aircraft and shorter routes.

    The usual practice has been to line up planes preparing to land in an airborne queue. Under the new system, each plane is assigned its own flight path. It may sound simple, but making the system work requires enormous amounts of data, as well as fast and sophisticated evaluation of the data. The distance, speed, and capabilities of each aircraft are processed in a way that results in the shortest flight path. Instead of queuing up on approach, planes can “curve in” much closer to the airport.

    The first deployment, at Brasília International Airport, is saving 7.5 minutes and 77 gallons of fuel per landing, as planes fly 22 fewer nautical miles on average. Brazil plans to roll out the system to the country’s ten busiest airports. Initial impact estimates suggest that deployment of this system at North American airports could increase capacity 16 to 59 percent, depending on airport conditions.

  • Railway-infrastructure providers in Europe typically ask operating companies for detailed itineraries of the trains they want to run, and then the providers create a schedule that tries to fulfill every request. The system is well intentioned but rigid—and it doesn’t lead to optimal capacity usage or operational stability. In Germany, the great majority of cargo trains do not depart as scheduled, a fact that inevitably leads to complications down the track.

    Recently, some railway companies have started to follow a more “industrialized” approach that uses big data. They are splitting track capacity across the network into “slots” of different speed profiles based on an analysis of past demand and are allocating trains to available slots as requests for capacity come in. Capturing these opportunities requires advanced planning techniques that can, for example, allow trains to swap slots along their itinerary in order to recoup time lost to operational delays. Such innovations can improve punctuality and reliability while accommodating up to 10 percent more traffic.
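The density-based tolling on Highway 1 can be sketched in a few lines of code. The density bands and fees below are invented for illustration; the actual Israeli tariff schedule works on the same principle but with its own parameters:

```python
# Illustrative dynamic-toll sketch: the fee rises with measured traffic
# density to steer demand. All thresholds and prices here are hypothetical,
# not the published Highway 1 tariff.

def toll_for_density(vehicles_per_km: float) -> float:
    """Return an illustrative toll (in dollars) for the current density."""
    bands = [
        (10.0, 2.00),   # light traffic: low charge keeps revenue flowing
        (25.0, 6.00),   # moderate traffic
        (40.0, 12.00),  # heavy traffic: higher price steers demand away
    ]
    for threshold, fee in bands:
        if vehicles_per_km <= threshold:
            return fee
    return 20.00        # congested: maximum fee

print(toll_for_density(8.0))    # light traffic -> 2.0
print(toll_for_density(55.0))   # congested -> 20.0
```

The key design point is that price is a function of measured state, not a fixed schedule, which is what lets the operator trade off revenue against congestion in real time.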
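The rail slot-swapping idea can likewise be sketched: when a delayed train can no longer meet its slot, it trades with a later train that can. The train names, times, and swap rule below are a made-up toy illustration of the planning logic, not any operator's actual system:

```python
# Toy slot allocator: trains hold departure slots; a delayed train swaps
# with a later train whose own readiness lets it take the earlier slot.
# All data and the pairwise-swap rule are invented for illustration.

def swap_delayed(slots, ready_times):
    """slots: list of (train, slot_time) sorted by slot_time.
    ready_times: train -> earliest time it can actually depart.
    Returns an assignment where simple pairwise swaps have repaired
    slots that their holders could no longer meet."""
    slots = list(slots)
    for i, (train, t) in enumerate(slots):
        if ready_times[train] > t:                  # train would miss its slot
            for j in range(i + 1, len(slots)):
                other, t2 = slots[j]
                # swap only if the other train can take the earlier slot
                # and the delayed train can still make the later one
                if ready_times[other] <= t and ready_times[train] <= t2:
                    slots[i], slots[j] = (other, t), (train, t2)
                    break
    return slots

assignment = swap_delayed(
    [("cargo-1", 100), ("cargo-2", 110)],
    {"cargo-1": 105, "cargo-2": 95},   # cargo-1 is running 5 minutes late
)
print(assignment)  # [('cargo-2', 100), ('cargo-1', 110)]
```

Real planning systems solve this at network scale with far richer constraints (speed profiles, conflicting track segments), but the core move is the same: capacity is held in standardized slots so that late trains can recoup time by trading rather than forcing a full replan.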

In spite of these (and other) encouraging examples of the integration of information and infrastructure, progress in general has been slow. At airport-industry gatherings, there’s lots of enthusiasm about using big data from tracking passengers’ mobile devices for tailored information and management. Ideas include text-message alerts telling passengers when to head to the departure gate (taking individual walking speeds into account), shorter security queues based on better short-term demand predictions, and tailored shopping suggestions. At the moment, though, no more than a few dozen airports are actually implementing ideas like these.

Why are infrastructure providers so slow to integrate big data? And what can be done to speed things up? Economic viability cannot be the reason. The payback from investing in such technologies is usually much better than that from investing in equipment with a similar ability to boost capacity.

Based on conversations with industry practitioners, we have identified three significant barriers to leveraging information effectively to improve transport-infrastructure usage.

First, there is a lack of transparency. Transport infrastructure involves complex networks with many participants. An airport, for example, will have dozens of different airlines, ground-handling companies, and retailers, plus air-traffic control, customs, and the airport-operating company itself. Each player collects its own data and does not necessarily want to share it. That can sometimes make sense; no retailer wants to give away the store. But the ability to track passengers could benefit just about everyone. For example, knowing where foot traffic is and how it moves can help to optimize gate and asset allocation. That could not only increase airport capacity but also boost retail revenues. For that to happen, though, the data need to be pooled.

Another issue is how to divvy up the costs and benefits of sharing information; different players do not always have the same goals. Airlines might want faster transit times—for example, in order to minimize travel times for connecting passengers—while retailers might prefer passengers to linger to increase store sales. Airports would generally prefer high utilization of their assets, but they might accept lower utilization to preserve flexibility and enable them to recover quickly after irregular events. Collectively finding a solution that makes every stakeholder a winner is not a simple task and requires a certain level of mutual trust that cannot be assumed.

Finally, there are regulatory constraints. Infrastructure in many cases is a natural monopoly. Governments therefore have an important role to play—in ensuring that operations are fair and cost-effective, and in creating a regulatory environment that allows data to be collected and used while protecting confidentiality and privacy. But before that can happen, competition and data-protection authorities need to be convinced of digitization’s benefits. One sizable challenge would be to overcome users’ privacy concerns by clearly stating what data are being collected, how they’re being used, and the ultimate benefit to consumers of cost-effective solutions emerging from data insights.

All three barriers are interdependent and therefore need to be addressed at the same time. Without transparency, there is no way to build trust and achieve equitable sharing. Without equitable sharing (and clear public benefits), regulators will not be sympathetic. Without responsible regulation, players will be reluctant to make their data available.

It’s no easy matter to get all the parties in an infrastructure network to work together. A leader is required. Governments have an obvious interest in making the most of existing infrastructure, so one option is a national or multinational government entity. But it could also be the main concession holder, such as an airport operator or railway company. Or it might be a combination, with the government setting goals and establishing the conditions on data use and sharing, and the concession holders setting up structures to put the data to work.

Using big data in infrastructure is a work in progress; in important ways, it is just getting started. To build momentum, one proven strategy is to launch a pilot program, perhaps at a single airport or railway station, that tests data strategies and documents the benefits. But perhaps the most important thing is simply to recognize the potential that information has to improve infrastructure.

About the author(s)

Carl-Stefan Neumann is a director in McKinsey’s Frankfurt office.