
3 Jan 2013

Wildfires, Computer Models and Sophisticated GIGO

Early on, I learned about GIGO: Garbage In, Garbage Out. Translated: if you don’t put good data into the computer, don’t expect good data in return. The assumption is still common, but as computers have become more sophisticated, so has their GIGO. They no longer receive data only from us; they also receive it from sensors we program. Increasingly sophisticated algorithms then manipulate that data, making it harder both to spot inaccuracies and to believe the output could be inaccurate at all.

The article “Burning Question” (The Atlantic, September 2012), by Michael Behar, exemplifies this by exploring why long-standing computer models struggled to predict the wildfires of 2012. Much of the problem stemmed from the models relying on historical experience, and thus assuming fires would behave much as they had in the past. The article delivers this point, which applies to all of us whenever we enlist the help of computers:

. . . some experts worry that younger fire analysts lean a bit too heavily on their data-crunching skills, and have little field experience. Dawson is thankful to have spent his early career fighting fires with an ax and a shovel.

This challenges us further because even when we suspect the data, it can still influence us. It anchors in our minds, and we unconsciously judge other information against it, including our own observations. So not only does the data taint what we observe, but the computer’s sophistication can lead us to doubt what we saw with our own eyes. It’s much like people giving false confessions because they believe the police know what they are doing, except here we believe the computer “knows what it’s doing.”

So, if you want your computers to work better, remember to apply your experience. After all, they don’t have any.


