|A segment of a West Antarctic ice core.|
How complexity science can quickly detect climate record anomalies
To unpack the vast amount of information held in an ice core, scientists face a forensic challenge: separating useful information from corrupted data.
A new study published in the journal Entropy shows how a tool from information theory, a branch of complexity science, can address this challenge by homing in on the portions of the data that require further investigation.
With this kind of data, there are limited opportunities to get it right, said Joshua Garland in a statement; he is a mathematician at the Santa Fe Institute who works with 68,000 years of data from the West Antarctic Ice Sheet (WAIS) Divide ice core. Extracting the ice and processing the data takes hundreds of people and an enormous amount of processing and analysis, and because of those constraints, cores are rarely re-drilled. The good data that do exist can be analyzed very precisely, but that same precision makes the analysis highly susceptible to anomalies.
By the time Garland and his team received the data, more than a decade had passed since the initial drilling of the ice. More than two miles of ice were extracted over five years, from 2007 to 2012, by teams from several universities funded by the National Science Foundation. From the field camp in West Antarctica, the core was packaged and shipped to the National Science Foundation's ice core facility in Colorado, and finally to the University of Colorado. There, in a stable isotope lab at the Institute of Arctic and Alpine Research, a state-of-the-art processing facility helped scientists extract water-isotope records from the ice.
The result is a highly resolved, complex dataset. Whereas previous ice core data allowed analysis at 5-centimeter intervals, the WAIS Divide core permits analysis at millimeter resolution.
One of the exciting things about ice core research in the last decade is that we have developed laboratory systems that can analyze ice at very high resolution, said Tyler Jones, a paleoclimatologist at the University of Colorado Boulder. In the past, we were limited in our ability to analyze climate because we could not get enough data points, or collecting them took too long. These new technologies have given us millions of data points, which are difficult to manage and interpret without new advances in data processing.
In earlier cores, decades and even centuries were consolidated into a single data point. The WAIS data, by contrast, sometimes provides more than forty data points per year. But as scientists push to analyze the data on ever shorter time scales, even small discrepancies become problematic.
To quickly identify which inconsistencies or anomalies warrant further investigation, the team used information-theoretic techniques to measure how much complexity appears at each point in the time series. A sudden spike in complexity can mean either that a major, unpredictable climatic event occurred, such as a supervolcano eruption, or that there was a problem in the data or the data-processing pipeline.
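The complexity measure behind this kind of flagging is permutation entropy, which scores a signal by how evenly the rank orderings of its short subsequences are distributed. The following is a minimal illustrative sketch, not the authors' actual pipeline; the window length, pattern order, and any data fed to it are assumptions for demonstration:

```python
from collections import Counter
import math

def permutation_entropy(series, order=3):
    """Normalized permutation entropy of a 1-D sequence.

    Each length-`order` subsequence is reduced to its ordinal pattern
    (the rank ordering of its values); the Shannon entropy of the
    pattern distribution is normalized by log(order!) to lie in [0, 1].
    """
    counts = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        counts[tuple(sorted(range(order), key=lambda k: window[k]))] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))

def sliding_entropy(series, window=100, order=3):
    """Permutation entropy in a sliding window along the record;
    sudden jumps flag stretches worth a closer look."""
    return [permutation_entropy(series[i:i + window], order)
            for i in range(len(series) - window + 1)]
```

On a smooth signal the entropy stays low; if a stretch of the record is replaced by noise, mimicking a processing glitch, the windows covering that stretch show a sharp entropy spike, which is the signature used to flag segments for human review.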
Anomalies of this kind would go unnoticed without a painstaking, fine-grained, point-by-point analysis of the data, which would take a human expert months to perform. Even though information theory cannot tell us the underlying cause of an anomaly, we can use these techniques to rapidly flag the segments of the dataset that paleoclimate experts should investigate, said Elizabeth Bradley, a computer scientist at the University of Colorado Boulder and an external professor at the Santa Fe Institute.
Bradley compared the ice core dataset to a Google search that returns a million pages. "It is not that you could not go through all those pages. But imagine if you had a technique that could point you to the ones that were potentially meaningful?" When analyzing large, real-world datasets, she said, information theory can distinguish a processing error from an important climatic event.
In their Entropy paper, the scientists detail how they used information theory to identify and repair a problematic stretch of data from the original ice core. Ultimately, their investigation led to a resampling of the archived ice core, the longest resampling of a high-resolution ice core to date. When that section of ice was re-melted and re-processed, the team was able to resolve an anomalous spike in entropy from roughly 5,000 years ago.
Getting this segment right is especially important because it contains climate information from the beginnings of human civilization, Jones said. Climate change is among the most difficult problems humanity has ever faced, and ice cores are undoubtedly the best record of Earth's climate over the past several hundred thousand years. Information theory can help us work through the data and produce the best, most definitive record we possibly can.
Journal Reference: Entropy, 2018; 20 (12): 931 DOI: 10.3390/e20120931