MAGIC Emergency Preparedness Project: Analyzing the 2017 Meramec River Flood

I was privileged to have the opportunity to work on the MAGIC Emergency Preparedness project as part of the NASA SEES High School Internship program. My team included myself, Janine Fleming, Rishu Mohanka, Valerie Chen, Sam Mosby, and Sara Komaiha. An account of my experiences can be found here. This post details my work more extensively and gives an overview of my team's findings. Parts of this may also be found in brief within the SEES presentations video. Credit and thanks to my team mentor Ms. Teresa Howard for her support and mentorship in this thought-provoking project.

My team dealt with remote sensing and gathering datasets of the Meramec River near St. Louis, Missouri, to better understand the 2017 Meramec River Flood (April 29 – May 9). Our objectives included uncovering the flood's aftermath, studying maps created in response to the flood to determine their accuracy and usefulness, learning about the challenges of working with the data, and improving the information collected during the flooding event. We compared and contrasted pre-event, event, and post-event imagery obtained by U.S. and European satellites.

To get a handle on what I would be doing, I first learned the basic principles of remote sensing and the strengths and weaknesses of each satellite system so that I could go about our project more effectively. I used NASA's ARSET training webinars, including Introduction to Synthetic Aperture Radar. These principles (i.e. how remote sensing works, active/passive sensors, how SAR works, and SAR surface/radar parameters) are explained in a nutshell in our presentation. Understanding the process behind remote sensing for rapid data acquisition is vital for emergency preparedness and, ultimately, for emergency responders during a natural disaster.

I also used online tools to research and familiarize myself with the flood. A primary tool was Streamer, the USGS's interactive mapping application, which I used to explore the Meramec River watershed.

[Image: Meramec River watershed map created with USGS Streamer]

The Meramec River watershed map above, created with Streamer, shows the various other streams and rivers connected to the Meramec River that can influence a flood. I also looked at aerial imagery of the flood to assess the damage it caused, as well as a website showing conditions before and during the flood.

Afterwards, I retrieved data of the Meramec River through the Alaska Satellite Facility and EarthExplorer.

Then, each of us analyzed Landsat-7 images (acquired through optical remote sensing) and Sentinel images (including SAR data) with QGIS and SNAP. File naming conventions were key to keeping our work organized: <Assignment name>_<your last name>_<your preferred name>_<YYYYMMDD>.
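As a small illustration, a helper like the one below (hypothetical, written just for this post) could build names that follow the convention:

```python
from datetime import date

def make_filename(assignment, last_name, preferred_name, when=None):
    """Build a name following the team convention:
    <Assignment name>_<last name>_<preferred name>_<YYYYMMDD>."""
    when = when or date.today()
    return f"{assignment}_{last_name}_{preferred_name}_{when:%Y%m%d}"

# Example with made-up values:
# make_filename("BandCombos", "Doe", "Jane", date(2017, 7, 20))
# -> 'BandCombos_Doe_Jane_20170720'
```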

First, we worked with Landsat-7 images (a simple explanation of how Landsat-7 images are created can be found here) and viewed individual bands (groupings of wavelengths along the electromagnetic spectrum) in black and white. Then, we combined the satellite image bands into a virtual raster, along with elevation and land usage layers, to see the image in color. We created maps of our images in different band combinations with QGIS's Map Composer to compare and contrast.
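We built the virtual raster through QGIS's menus, but the same band stacking can be sketched with GDAL's Python bindings. This is only a rough sketch; the band file names are hypothetical, and it assumes each Landsat-7 band has been saved as a single-band GeoTIFF:

```python
from osgeo import gdal

# Hypothetical single-band GeoTIFFs exported from a Landsat-7 scene,
# listed in the order they should appear in the stack (red, green, blue).
band_files = ["LE07_band3.tif", "LE07_band2.tif", "LE07_band1.tif"]

# separate=True stacks each input file as its own band in the virtual raster
# instead of mosaicking the inputs side by side.
vrt = gdal.BuildVRT("landsat7_321_stack.vrt", band_files, separate=True)
vrt = None  # close the dataset so the .vrt file is flushed to disk
```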

[Images: Landsat-7 3-2-1 band combination composites of the Meramec River area]

The Landsat-7 3-2-1 band combination approximates what a human would most likely see because it uses the visible bands. It is effective for seeing sediment and noticing obvious features, such as healthy vegetation displayed in green. Together with the aerial photos, these images support quantifying the sediment that the flood spread and deposited.

As seen clearly in the images above, the Landsat-7 bands we used cannot penetrate clouds the way SAR can with its radio waves. This reflects the general principle that the longer the wavelength, the greater the penetration.

Other features were observed by examining different band combinations.

[Images: Landsat-7 4-3-2 false color composites of the Meramec River area]

The Landsat-7 4-3-2 combination is called the "false color" composite. These images are clearly not what the human eye would see: vegetation is shown in red rather than green, because the near-infrared band (band 4) is displayed through the red channel.
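To make the comparison concrete, the sketch below shows that the two composites differ only in which bands are sent to the red, green, and blue display channels. The file names are hypothetical, and it assumes the individual bands have been exported as GeoTIFFs and that the rasterio and matplotlib packages are available:

```python
import numpy as np
import rasterio
import matplotlib.pyplot as plt

def composite(paths, bands):
    """Stack the given bands into an RGB array stretched to 0-1 for display."""
    layers = []
    for b in bands:
        with rasterio.open(paths[b]) as src:
            arr = src.read(1).astype(np.float32)
        lo, hi = np.percentile(arr, (2, 98))   # simple contrast stretch
        layers.append(np.clip((arr - lo) / (hi - lo), 0, 1))
    return np.dstack(layers)

# Hypothetical per-band GeoTIFFs from the Landsat-7 scene
paths = {1: "LE07_band1.tif", 2: "LE07_band2.tif",
         3: "LE07_band3.tif", 4: "LE07_band4.tif"}

natural = composite(paths, (3, 2, 1))      # 321: visible red, green, blue
false_color = composite(paths, (4, 3, 2))  # 432: near-infrared shown as red

fig, (ax1, ax2) = plt.subplots(1, 2)
ax1.imshow(natural); ax1.set_title("3-2-1 natural color")
ax2.imshow(false_color); ax2.set_title("4-3-2 false color")
plt.show()
```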

Then, we worked with Sentinel-1 SAR data and Sentinel-2 multi-spectral data using SNAP. First, we practiced with a dataset from Houston, Texas before working with the St. Louis data. Our method for processing satellite imagery for flood mapping was Calibration -> Speckle Filtering/Multilook -> Terrain Correction -> Binarization (through Band Math). Slides 20-24 of my team's presentation show how each of these steps alters the image of Houston, Texas.
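We ran these steps in the SNAP desktop application, but for anyone who prefers scripting, here is a rough sketch of the same chain using SNAP's Python bindings (snappy). The product name and parameter values are illustrative assumptions, not our exact settings:

```python
from snappy import ProductIO, GPF, HashMap

# Hypothetical Sentinel-1 GRD product covering the Meramec River
product = ProductIO.readProduct("S1A_IW_GRDH_example_20170516.zip")

def run_operator(name, source, **params):
    """Run a single SNAP operator through the Graph Processing Framework."""
    p = HashMap()
    for key, value in params.items():
        p.put(key, value)
    return GPF.createProduct(name, p, source)

calibrated = run_operator("Calibration", product, outputSigmaBand=True)
filtered = run_operator("Speckle-Filter", calibrated, filter="Lee")
corrected = run_operator("Terrain-Correction", filtered, demName="SRTM 3Sec")

ProductIO.writeProduct(corrected, "meramec_sigma0_20170516", "GeoTIFF")
```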

Calibration relates each pixel directly to the backscatter (the radar signal scattered back to the sensor). Speckle filtering and multilook filtering reduce the speckle in the image (increasing the number of looks produces a smoother image but at a lower resolution). Both filtering methods can be used to view varying features; however, speckle filtering captures information from the data differently than multilook filtering. As seen on slide 23 of the presentation, speckle filtering caught more areas of flooded water than the multilook approach. Terrain Correction fixes the geometry and orientation of the once-inverted image; the orientation is initially incorrect because of the direction in which the satellite acquired the image. Binarization is done with Band Math to distinguish two different things within the image using the values 0 and 1. For instance, anything that is water in the image is represented by 0 and land by 1. I also briefly tinkered with classification in QGIS, which serves a similar purpose to Band Math in SNAP. Creating a subset prior to the calibration step was useful because processing a small section of the image takes less storage than processing the whole image.
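In SNAP the binarization itself is a Band Math expression; the same idea, thresholding the backscatter so that water becomes 0 and land becomes 1, is sketched below in Python on an exported GeoTIFF. The file names and threshold value are illustrative assumptions, not the exact ones we used:

```python
import numpy as np
import rasterio

# Hypothetical terrain-corrected Sigma0 backscatter exported from SNAP
with rasterio.open("meramec_sigma0_20170516.tif") as src:
    sigma0 = src.read(1)
    profile = src.profile

# Smooth water surfaces scatter the radar signal away from the sensor, so
# flooded pixels have low backscatter. Pixels below the threshold become
# 0 (water) and everything else becomes 1 (land).
THRESHOLD = 0.02  # illustrative value; in practice picked from the image histogram
mask = (sigma0 >= THRESHOLD).astype(np.uint8)

profile.update(dtype=rasterio.uint8, count=1)
with rasterio.open("meramec_water_mask.tif", "w", **profile) as dst:
    dst.write(mask, 1)
```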

[Image: Sentinel-1 data of the Meramec River taken on May 16 and processed in SNAP]

I also set up automated workflows in SNAP using processing chains built with Graph Builder, which provided a quick alternative to performing each of the aforementioned steps manually.
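Graph Builder saves its chain as an XML graph file, and the same graph can be run from the command line with SNAP's gpt tool. A rough Python sketch of pushing a batch of scenes through such a graph might look like the following (the graph file name, folder layout, and graph variables are hypothetical):

```python
import subprocess
from pathlib import Path

GRAPH = "flood_mapping_graph.xml"  # hypothetical graph exported from Graph Builder

for scene in sorted(Path("sentinel1_scenes").glob("S1*_GRDH_*.zip")):
    output = Path("processed") / f"{scene.stem}_processed.tif"
    # gpt runs the saved graph; -P options fill in ${input}/${output} variables
    # that the graph is assumed to define for its Read and Write nodes.
    subprocess.run(
        ["gpt", GRAPH, f"-Pinput={scene}", f"-Poutput={output}"],
        check=True,
    )
```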

One of the key ideas behind remote sensing is to always validate the data we use against other data to determine its accuracy and effectiveness.

In this Copernicus map of St. Louis, if we zoom in on the Meramec River, we can see the flooding that occurred, indicated by the light blue areas:

[Image: Copernicus emergency mapping product showing flooded areas along the Meramec River in light blue]

However, by examining the data my team and I collected, we found areas that the map did not show as flooded but that actually were flooded. This is because the map was created on May 1 for emergency responders to use and therefore misses any flooding that occurred after that day. As a result, drawing on data from other sources, as we did, gave us a greater understanding of the Meramec River Flood's effects.

Our project can be used for future floods that may occur in the St. Louis area, and its methods apply more broadly to remote sensing of other natural disasters.

Reflections:

I learned a lot about remote sensing, including new concepts, satellites, and tools. Prior to this internship, I never realized how accessible satellite data is. I always thought it was top secret and that only high-ranking officials and government personnel could look at this sort of data. This project has opened my eyes to the planning, response, and recovery phases of natural disasters and to the behind-the-scenes data processing that is essential for rapid and effective emergency response.

In the future, I will most definitely continue taking advantage of this data and learning more about analyzing it. As a result of my internship, I have included various links to useful sources so that others can share in the plethora of data that is available on the web and do some remote sensing of their own.

Applications and Links:

Acquiring data: Alaska Satellite Facility, EarthExplorer, Hazards Data Distribution System (HDDS) Explorer

Processing data: QGIS, SNAP, ArcGIS (I did not use this, but it is commonly used)

QGIS: Training Manual, tutorials

SNAP: tutorials, Flood Mapping methods

Extra applications: Streamer, Global Flood Monitoring System (GFMS), National Map, NASA Worldview, Copernicus

Project Presentation PowerPoint: https://docs.google.com/presentation/d/1kA2LfzMKGtFTHdSNxmN2Uqti67IVcF0tzU3pgh0H_NI/edit?usp=sharing
