From vol. 106, spring 2014
By Paul Cordy

DEPOSIT FROM PATH 19.3 IN KOOTENAY PASS. A SIZE 3 AVALANCHE TRIGGERED BY HELI BOMBING // MOTI
IN THE WORLD OF BIG DATA we have become accustomed to interacting with computer models. The search for good snow inevitably begins by consulting the ensemble weather forecast (the consensus prediction of five different detailed models of the atmosphere), just as most searches for knowledge these days begin by consulting Google (a complex and dynamic model of the relevance of digital information). So one might expect that any day now Big Data will begin to spread its tentacles into the world of avalanche safety. But are there particular challenges to using computer models for avalanche prediction? Not the least of these may be the complexity of the geographic and human factors leading to avalanche formation, along with the scarcity of reliable, continuous information about conditions in the start zones.
So how far have avalanche prediction models come, and how might they benefit organizations and individuals? Will they ever be good enough to rely on in Canada? The British Columbia Ministry of Transportation and Infrastructure (MOTI) has a long history of taking the lead in creating digital tools for avalanche practitioners. These efforts have led to the development of one tool that we know and use already: SnowPro. A lesser-known MOTI innovation is the computer-based avalanche forecasting system that began more than 15 years ago at Kootenay Pass. Ted Weick, former information systems manager for the MOTI avalanche and weather branch, spent over a decade developing the MOTI’s first digital highway, weather, and avalanche database. In the beginning, this meant a considerable amount of tedious data entry for technicians, who would rather have spent more time in the hills and on the road. Ted wanted to make all that data useful to the people who were assiduously collecting it for him, and so he became a fervent supporter of computer-based avalanche prediction.
In the mid-nineties, Dr. David McClung and John Tweedy developed and tested software that used manual weather observations (input by the user, of course) to predict the probability of avalanche activity that day. The prediction was based on a statistical model built from historical weather data and avalanche occurrence records from the previous ten seasons at Kootenay Pass. As in all computer models (including Google’s search engine), historical data is used to train the model; in the case of MOTI, training determined the relative importance of the various weather variables and how to combine them into accurate predictions of avalanches.
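To make the idea concrete, here is a minimal sketch of training such a model in Python. The article does not specify the statistical method, so logistic regression stands in for it, and the file name and weather variables are hypothetical:

```python
# A sketch of training an avalanche-day classifier on historical records.
# Logistic regression is a stand-in for whatever statistical method the
# Kootenay Pass model actually used; the file and columns are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Ten seasons of daily weather observations paired with avalanche records.
history = pd.read_csv("kootenay_pass_history.csv")
features = ["new_snow_cm", "storm_snow_cm", "air_temp_c",
            "wind_speed_kmh", "precip_intensity_mm_h"]
X, y = history[features], history["avalanche_day"]  # y: 1 = avalanches observed

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# "Training" is exactly the step described above: the fitted coefficients
# encode the relative importance of each weather variable.
print(dict(zip(features, model.coef_[0].round(3))))
print("hit rate on held-out days:", model.score(X_test, y_test))
```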
This is not too dissimilar to the way we humans learn: experience is combined with training to create mental models of how weather creates avalanches. Often we compare current weather or snowpack structure with previous seasons’ observations to refine our decisions. The original Kootenay Pass model also retrieved the ten most similar historical instances of weather and presented them to the human forecaster as a further aid to decision making. In the end, both approaches were 70-80% accurate. Early in the 2000s, James Floyer demonstrated that similar models could be trained on Bear Pass datasets with comparable results.
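The analogue retrieval can be sketched the same way: a nearest-neighbour search over standardized weather variables returns the ten most similar historical days. The file, column names, and today's values are again hypothetical:

```python
# A sketch of the "ten most similar days" retrieval described above.
# Features are standardized so no single variable dominates the distance.
import pandas as pd
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

history = pd.read_csv("kootenay_pass_history.csv")  # hypothetical file
features = ["new_snow_cm", "storm_snow_cm", "air_temp_c",
            "wind_speed_kmh", "precip_intensity_mm_h"]

scaler = StandardScaler().fit(history[features])
nn = NearestNeighbors(n_neighbors=10).fit(scaler.transform(history[features]))

# Today's manual weather observations (illustrative values).
today = pd.DataFrame([[25.0, 40.0, -4.5, 35.0, 2.0]], columns=features)
_, idx = nn.kneighbors(scaler.transform(today))

# Show the ten analogue days and their avalanche outcomes to the forecaster.
print(history.iloc[idx[0]][features + ["avalanche_day"]])
```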

A CROWN ON PATH 19.8, KOOTENAY PASS // MOTI
As a Masters student with McClung at the University of British Columbia, I contributed to this effort by dynamically integrating numerical weather forecasts and optimizing versions of the model for each of five highway corridors with active avalanche control programs. In each place, we fed ensemble weather forecasts up to 48 hours ahead into the model, thus extending avalanche predictions into the future (all previous avalanche models predicted only the present probability of avalanches). As it happens, predicting future avalanches depends mostly on the accuracy of the weather forecast, and most avalanche forecasting models achieve similar accuracy irrespective of their type or complexity.
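Conceptually, extending the prediction into the future just means running forecast weather, rather than observed weather, through the trained model. Here is a sketch under the same hypothetical names as above, with an assumed table of ensemble forecast values at each lead time:

```python
# A sketch of forecasting future avalanche probability: the trained model
# is applied to ensemble weather forecast values instead of observations.
# File names, columns, and the forecast table layout are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.read_csv("kootenay_pass_history.csv")    # hypothetical
forecast = pd.read_csv("ensemble_forecast_48h.csv")   # hypothetical: one row per lead time
features = ["new_snow_cm", "storm_snow_cm", "air_temp_c",
            "wind_speed_kmh", "precip_intensity_mm_h"]

model = LogisticRegression(max_iter=1000).fit(history[features],
                                              history["avalanche_day"])

# Probability of avalanche activity at each lead time (e.g. +24 h, +48 h).
# Accuracy now hinges on the weather forecast itself, as noted above.
forecast["p_avalanche"] = model.predict_proba(forecast[features])[:, 1]
print(forecast[["lead_time_h", "p_avalanche"]])
```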
Of course, a 70 to 80% prediction rate is horribly inaccurate given that the consequences range from traffic hazard to loss of life, so there always had to be a human forecaster calling the shots. But before dismissing computer models, one must consider the constraints under which they work.
Take weather forecasting as an analogy. European weather prediction is far better than that of western North America because of differences in the density of meteorological stations. Weather systems en route to Europe are reported by countless sensors on myriad islands and land masses in the Atlantic, not to mention by the North American sensor network. Reliable data makes for more reliable weather models. By contrast, weather on its way to western North America passes over the Pacific Data Void, a vast stretch of ocean almost uninterrupted by islands and permanent weather stations. So the very same computer models are often inaccurate more than 24 hours in advance.
So too with computer models of avalanche prediction. Greater complexity and precision in avalanche models is unlikely to improve forecast accuracy until we feed those models more and better information. The data we provide to prediction models cannot compete with human experience. Avalanche technicians explore the terrain, doing hand shears and listening to the snow settling under their skis. They feel temperature changes when fronts come through, just as the sensor networks do, but sensors can't see the sun hit certain start zones, and they can't see how the snow is loading up there. Really, it's a miracle that numerical prediction algorithms are accurate at all.
Therefore, the next goal was to integrate information about the snowpack into the model. The MOTI avalanche models had a built-in mechanism for updating avalanche probabilities based on new information. Previously, this "prior" information was added by the forecaster in response to avalanche control results or other knowledge not available to the model. Prior probabilities could just as easily come from a model of snowpack structure and stability, such as the red flag method of SnowPro or the SNOWPACK physical model used in Switzerland. Unfortunately, changing funding priorities and personnel at the MOTI meant that snowpack information was never integrated into the predictions, although the system is still used in Kootenay Pass. It's up to the next generation to take it to another level.
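One simple way to picture such an updating mechanism is a Bayesian adjustment in odds form, where each independent piece of evidence scales the model's odds. The likelihood ratios below are illustrative assumptions, not the MOTI model's actual formulation:

```python
# A minimal sketch of updating a model probability with "prior" evidence,
# such as avalanche control results or a snowpack-stability assessment.
# The numbers, and the assumption that each piece of evidence can be
# expressed as an independent likelihood ratio, are illustrative only.

def update_probability(p_model: float, likelihood_ratio: float) -> float:
    """Combine a model probability with extra evidence in odds form."""
    prior_odds = p_model / (1.0 - p_model)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

p_weather_model = 0.30    # weather-only model output
lr_control_results = 4.0  # e.g. explosives produced avalanches this morning
lr_snowpack = 2.0         # e.g. a red-flag weak layer reported in profiles

p = update_probability(p_weather_model, lr_control_results)
p = update_probability(p, lr_snowpack)
print(f"updated probability of avalanche activity: {p:.2f}")
```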

NORTH FORK AVALANCHE AREA ON THE EAST SIDE OF KOOTENAY PASS // MOTI
Generational change itself was also a major driver of interest in creating the model. During the latest round of modeling studies, MOTI was facing the near-simultaneous retirement of all of its technicians. MOTI saw that new staff might get up to speed more quickly if they could scan the historical records for the size, type, and spatial distribution of natural or controlled avalanches. The idea was to decouple the memories of seasons from the people who observed them, and to help bridge the loss of team experience when seasoned professionals retire. Furthermore, the benefits of such systems will be more apparent to successive generations of technicians who are ever more native to the digital environment. Whereas the old ironsides of the avalanche patch are likely to boast that their Rite in the Rain books have never crashed nor printed error messages, younger generations are more likely to wish they could just use their iPhones and store everything in the cloud.
Computers can supplement our memories, help us see broad patterns, and evaluate the importance of the various causal factors that govern avalanche formation. Snowpack depths and precipitation intensity can be measured by satellite, and soon we'll have satellites sensing atmospheric structure and conditions over the Pacific Data Void. With more and more wired backcountry users and the Canadian Avalanche Centre's geo-referenced recreationist observation database, avalanche information is set to explode. Models can help us synthesize an oversupply of data into relevant knowledge. That knowledge will always be limited by the data and the model that generate it, and may always require a human to make life-and-death decisions. However, with changing personnel and a changing climate, it helps to maintain historical perspective on present events. Avalanche prediction models can bridge present and past, and help us tease out the most relevant information for managing risk.
As the analytical techniques of Big Data inexorably penetrate all aspects of life, I expect that one day they will be as much a part of the furniture of our lives as smartphones. However, research and development in avalanche risk modeling advances through the vision, passion, and forward thinking of people like John Tweedy and Ted Weick, who championed the initiative within the MOTI. Although my main research focus has shifted from avalanche models to pollution modeling and mitigation, I maintain a deep interest in the topic. As we approach the critical information density with respect to snow and weather, I look forward to collaborating with the next generation of visionaries and institutional champions who will bring the avalanche world back in step with Big Data.