Opinion: Ashok Sinha, December 2014

This article was originally published in the December 2014 edition of London Cyclist.

LCC’s chief executive considers the issues with traffic modelling and why intuition shouldn’t be underestimated

One of my children is getting into maths.  Despite the pedagogical peculiarities of the former Secretary of State for Education, his teachers are exploring the subject rather than simply drilling the rules of long division into the kids by rote. I think he likes this, plus the fact that maths is clear cut.  Dividing 12 oranges equally between three people gives a definite right answer, and that is that.

I too enjoyed the subject. But becoming a physics undergraduate was an eye-opener to me. I already knew that the physics I had learned at school, with all its absolute certainties, was not quite the full picture.  Nonetheless it gave me brain-ache to find that the universe works in a probabilistic and frankly weird way (a particle really can be in two places at once).

After that, as a graduate student and then a salaried researcher, I developed and applied computer models for complex situations where approximations were a necessity, made more complex still by the unavoidable imperfections in the real-world data used to calibrate the models. Not only is uncertainty built into the rules of the universe, it's inherent in trying to model pretty much anything.

This has a great deal of relevance to cycling, because the decisions Transport for London makes rely on the models it runs to predict the consequences of any proposed change. If the models foretell that a road development will create a level of congestion deemed unacceptable then — no matter the safety benefits that may accrue — that intervention is ruled out.

It’s an approach that has long frustrated cycle campaigners and, more broadly, anyone pressing for a reduction in road danger. Campaigning orthodoxy is to complain that such models don’t reflect the real world.

Predicting human behaviour


It’s easy to dismiss the traffic modellers and accuse the whole science of being no better than alchemy. But I have sympathy for them. Some of the biggest and most complex models ever constructed — climate models — at least have the advantage of being based on physical laws that have been validated by experiment, and can be tuned by physical data.

By contrast, traffic modellers arguably have much in common with economic modellers, in that they are ultimately modelling that most difficult of natural phenomena to capture — human behaviour.

Economies operate according to the sum total of the actions of the humans in them, constrained (or not) by regulation and the availability of resources. Traffic systems are somewhat similar, with the net behaviour being the aggregate of the decisions taken by road users within the physical constraints of the road network. On the plus side, at least the boundaries are clear: a traffic jam in Manchester will have negligible impact in London, but changes in the frequency of El Niño events in the Pacific can affect the climate in London, and we all know what betting on the US sub-prime mortgage market did to the global economy.

Still, I’d expect experienced and well-resourced traffic modellers to do a good job with the kinds of questions they are used to facing. The problem lies with using these models to determine the impacts of radical change. For example, traffic models are poor at accounting for variations in human behaviour: maybe that filtered permeability scheme won’t cause mass tailbacks as predicted, because people will switch from cars and buses to bikes (or change their journey patterns).

Indeed, the more people move around efficiently by bike rather than in big metal boxes, the more space will surely be freed up on our roads in the long term. This is the opposite of the gridlock that the alarmists would have us believe will result from creating space for cycling.

It’s not that the traffic models TfL uses are wrong, or that traffic modelling itself is a fool’s errand; rather, we have to keep the limitations of modelling front of mind when attempting a transformation of our city, and not be bound by their predictions.

Solution to congestion


I can see why in the short term people are worried by the modelled impacts of the proposed North-South and East-West Cycle Superhighways. Anything that makes it harder for, say, a photocopier repair technician to reach their daily roster of clients is bound to cause concern.  As campaigners we should be sensitive to such anxieties, and not simply dismiss them with trite assertions that in the long run we’ll all adapt. The future will be better, but for some the journey will be harder.

But if we assume the prediction of a 40% increase in the number of people working in central London over the coming decades is anywhere near accurate, then politicians should view cycling not as a cause of congestion but as a critical part of the solution to it.

That will require politicians across London to take a political risk by putting aside the dire warnings from some — as Ken Livingstone did with the Congestion Charge and bus priority lanes — and applying some human reasoning that the models can’t offer.

Sometimes the best model we have is our determination and intuition. We should trust it.