Great Lakes Monitoring makes data easier to access
02.08.13
by Nicole Gaynor
The Great Lakes contain about 21 percent of the Earth’s and 84 percent of North America’s supply of fresh water, according to the Environmental Protection Agency (EPA). Is the water safe to drink? Is it safe to swim? Can we eat the fish?
“These are three fundamental questions the public wants to know about the health of the Great Lakes,” says Brian Miller, director of the Illinois-Indiana Sea Grant, the organization that sponsored the Great Lakes Monitoring (GLM) project.
The GLM GeoDashboard is online to help. It is a new data-aggregating web application created at the National Center for Supercomputing Applications (NCSA) to keep an eye on the 94,000 square miles of water with free, easily accessible data. The website brings together observations from multiple sources, such as the U.S. Geological Survey and the EPA, for scientists, decision makers, and the public to use.
“The GeoDashboard is designed to provide access to the data collected by the EPA since the early 1980s and other state agencies in an easy-to-use web app,” says NCSA’s Terry McLaren, project manager for GLM.
Scientists help GLM meet challenges, and vice versa
Miller says that the biggest challenges in GLM are deciding which data are important and how to clearly convey the data. The straightforward solution: ask scientists what they want.
Paris Collingsworth, an aquatic ecologist with the Illinois-Indiana Sea Grant, is one of these scientists. He wants to know more about the deep chlorophyll layer. The layer lies just below the thermocline, the transition zone between the sun-heated layer near the surface and the cold, dark water below. The deep chlorophyll layer accounts for much of the lakes' algal growth, but little is known about its relationship to the food web.
Collingsworth uses data from an instrument called Seabird on the EPA research vessel Lake Guardian. Lake Guardian traverses the Great Lakes, sampling the water, the mud at the bottom, and the air above. In order to measure the chlorophyll layer, the Lake Guardian crew first has to find it. That's where GLM steps in, helping to visualize the data quickly. The graphic view is indispensable on the ship, according to the crew.
Lake Guardian also measures water conductivity, which rises with salinity and thus serves as an indirect measure of salt content. A prehistoric inland sea left salt beds under the Great Lakes that are slowly dissolving into the lake water. Scientists use the conductivity measurements to investigate how deep a layer the salty plumes influence and how salinity is changing over time. In an annual time series, conductivity spikes every spring because of runoff that includes road salt. These data will be added to GLM in the next iteration of the website.
The Great Lakes ecosystem is sensitive to pollutants in runoff, like road salt, and precipitation that washes out atmospheric pollutants. Data from GLM can help raise awareness of problems, says Barbara Minsker, a University of Illinois professor of Civil and Environmental Engineering and GLM principal investigator.
As Seabird is lowered from Lake Guardian and TriAxis, another instrument cluster on the ship, is towed behind it, the two measure quantities like temperature, depth, conductivity, and chlorophyll. The crew then uses the data to choose sampling depths for phosphorus as they bring Seabird up. Tristan Weitsma, a graduate student with Minsker, is working on an algorithm to identify the thermocline, and soon the chlorophyll maximum, to automate processing of the vertical profiles.
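One plausible starting point for locating the thermocline in a vertical profile, though Weitsma's actual algorithm is not described here, is to find the depth where temperature changes fastest with depth. A minimal sketch in Python; the profile values below are illustrative, not real Lake Guardian data:

```python
def find_thermocline(depths, temps):
    """Approximate the thermocline depth from a vertical profile.

    depths: increasing depths in meters; temps: temperatures in deg C.
    The thermocline is taken as the midpoint of the layer with the
    largest temperature change per meter.
    """
    best_depth, best_gradient = None, 0.0
    for i in range(len(depths) - 1):
        dz = depths[i + 1] - depths[i]
        gradient = abs(temps[i + 1] - temps[i]) / dz
        if gradient > best_gradient:
            best_gradient = gradient
            best_depth = (depths[i] + depths[i + 1]) / 2
    return best_depth

# Illustrative summer profile: warm surface layer, sharp drop, cold bottom water
depths = [0, 5, 10, 15, 20, 25, 30]
temps = [22.0, 21.8, 21.5, 12.0, 6.0, 5.0, 4.8]
print(find_thermocline(depths, temps))  # steepest drop lies between 10 m and 15 m
```

A real implementation would work on much finer depth resolution and handle noise, but the core idea of scanning for the steepest gradient is the same.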
Phosphorus is already on the GLM site. High phosphorus content is promoting algal blooms and E. coli outbreaks that close beaches along Lake Erie, says Minsker. The GLM website shows several red triangles over Lake Erie that indicate that phosphorus levels are above the 10 microgram per liter threshold set for the lake—in some cases, more than double the threshold.
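The red triangles amount to a simple threshold check against the 10 microgram per liter target. A sketch of that logic in Python; the site names and readings are hypothetical, not actual GLM data:

```python
PHOSPHORUS_THRESHOLD = 10.0  # micrograms per liter, the Lake Erie target

def flag_sites(samples, threshold=PHOSPHORUS_THRESHOLD):
    """Return the names of sites whose phosphorus reading exceeds the threshold."""
    return [site for site, value in samples if value > threshold]

# Hypothetical readings as (site, micrograms per liter) pairs
samples = [("Erie-01", 8.2), ("Erie-02", 14.5), ("Erie-03", 22.7)]
print(flag_sites(samples))  # sites over the threshold would get a red triangle
```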
On the other hand, low phosphorus content is starving the Pacific salmon stocked in Lake Huron. Experts at Illinois-Indiana Sea Grant, part of the National Oceanic and Atmospheric Administration (NOAA) program for ocean, coastal, and Great Lakes stewardship, worry that Lake Michigan will follow Lake Huron.
A map on the front page lets users explore data through symbols that represent the location and the agency that took the sample. Clicking on a site displays more about the data types and the most recent values. From the pop-up bubble, users can dive deeper into the data ("View Data") and explore a timeline that shows when readings were above or below safe levels, along with statistics like the minimum, maximum, average, and trend over the last 10 years. "Download Data" allows users to download the dataset as comma-separated values for offline use.
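Once downloaded as comma-separated values, the data can be summarized offline with standard tools. A minimal sketch of computing the same statistics the site shows; the column names and values here are assumptions for illustration, not the actual GLM export format:

```python
import csv
import io
import statistics

def summarize(csv_text):
    """Compute min, max, mean, and a least-squares linear trend (per year)
    from CSV text with 'year' and 'value' columns."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    years = [float(r["year"]) for r in rows]
    values = [float(r["value"]) for r in rows]
    my, mv = statistics.mean(years), statistics.mean(values)
    # Slope = covariance(year, value) / variance(year)
    slope = (sum((y - my) * (v - mv) for y, v in zip(years, values))
             / sum((y - my) ** 2 for y in years))
    return {"min": min(values), "max": max(values),
            "mean": mv, "trend_per_year": slope}

# Hypothetical export: three annual readings trending upward
data = "year,value\n2003,8.0\n2008,10.0\n2013,12.0\n"
print(summarize(data))
```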
GLM can quickly provide reports for a meeting held every fall to review the state of the Lakes. Minsker says reports that used to take months are now quick to prepare. The team is working on additional tools to create graphs for environmental reports to bodies like Congress and the White House, says Paul Horvatin, a user and data provider with the EPA.
The GLM project started in the summer of 2011. The first version of the website offers only wet chemistry data from Lake Guardian. Version two, slated for late spring 2013 and now in initial testing, will include more water quality data from Seabird, TriAxis, and other state and federal sources and will add a user-friendly data search interface.
“The products we’re going to get are going to be right on the money, just what we need,” says Horvatin.