Thursday, September 29, 2016

Mountain Top Removal - Analyze Week

Special Topics - Project 2 Week 2

This week we used 2010 Landsat data to create a polygon of current Mountain Top Removal (MTR) areas in the Appalachian Coal Region of West Virginia and surrounding states.  The first step was to create a single raster dataset from 7 Landsat bands using the Composite Bands tool (Data Management > Raster > Raster Processing), saving it as an .img file.  The Spatial Analyst Extract by Mask tool was then used to clip the composite raster to the study area, again saving to an .img file for use in ERDAS Imagine.
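Outside of ArcMap, those two steps amount to stacking bands into one array and masking it to the study area. A minimal NumPy sketch of that logic (the toy 3x3 arrays and the nodata value of 0 are assumptions, not the actual Landsat data):

```python
import numpy as np

def composite_bands(bands):
    """Stack single-band arrays into one multi-band array (bands, rows, cols)."""
    first = bands[0]
    for b in bands[1:]:
        if b.shape != first.shape:
            raise ValueError("all bands must share the same dimensions")
    return np.stack(bands, axis=0)

def extract_by_mask(raster, mask, nodata=0):
    """Keep raster cells where the mask is True; set the rest to nodata."""
    return np.where(mask, raster, nodata)

# Toy 3x3 example with 7 "bands" holding constant values 1..7
bands = [np.full((3, 3), i, dtype=np.int16) for i in range(1, 8)]
composite = composite_bands(bands)

# Hypothetical study-area mask covering the upper-left corner
mask = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 0]], dtype=bool)
clipped = extract_by_mask(composite[0], mask)
```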

In ERDAS, an Unsupervised Classification was performed on the clipped image using 50 classes.  Areas of MTR were then identified by picking a pixel on the classified image, changing that pixel's Class_Name field in the attribute table to MTR, and setting its color to Red.  Once all the MTR areas were identified, the remaining pixels were assigned a class designation of NonMTR and their color changed to Dark Green.

This saved image was then added to ArcMap and reclassified with the Spatial Analyst Reclassify tool, populating the New Values field with a 1 for the MTR classes, leaving it blank for the NonMTR classes, and assigning missing values to NoData.  This created a new raster delineating the MTR areas.
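The Reclassify step boils down to a simple per-pixel mapping; here is a toy sketch of that logic (the sample grid is made up, and None stands in for NoData):

```python
def reclassify(class_names):
    """Map each classified pixel to 1 for MTR, None (NoData) for everything else."""
    NODATA = None
    return [[1 if name == "MTR" else NODATA for name in row]
            for row in class_names]

# Hypothetical 2x2 grid of class names from the unsupervised classification
grid = [["MTR", "NonMTR"],
        ["NonMTR", "MTR"]]
mtr_raster = reclassify(grid)
```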

Monday, September 26, 2016

Intro to Electromagnetic Radiation (EMR)

Photo Interpretation & Remote Sensing - Mod 5a

This lesson had us work in ERDAS IMAGINE for the first time, which was rather different.  We brought in a raster image, selected an area of Washington State with the Inquire Box, and created a subset image to export to ArcMap.  After running the process we moved to ArcMap to manipulate the image properties and legend to display the information appropriately.  Below is the result.


Friday, September 23, 2016

Mountain Top Removal - Prepare Week

Special Topics - Project 2 Week 1

Mountain Top Removal (MTR) is a method of mining for coal that destroys a mountaintop or ridgeline.  This project will explore MTR in the Appalachian Coal Region of West Virginia, using data and methods provided by SkyTruth.org to investigate evidence of human-caused changes to the landscape from MTR in remotely sensed data.  Since this is such a large area to cover, the project was divided among four groups.  I am part of Group 3.

Starting with four Digital Elevation Models (DEMs) that were merged using the Raster to New Mosaic tool and clipped to the Study Area with Extract by Mask tool, a hydrology dataset was created.  This was accomplished using the following Spatial Analyst Hydrology tools:

  • Fill - fills sinks in the DEM so water flow does not pool anywhere in the Study Area.
  • Flow Direction - assigns each pixel a value representing the direction of water flow across that cell.
  • Flow Accumulation - calculates, for each cell, the total number of other cells that flow into it.
  • Con - defines the flow accumulation threshold that qualifies a cell as a stream.  Before running this tool, 1% of the pixel count of the original clipped DEM was calculated to quantify the condition a pixel must meet to be classified as a stream.
  • Stream to Feature - creates a polyline vector feature from the Con tool output raster.
  • Basin - delineates the drainage areas in the raster, giving each a unique value.
  • Raster to Polygon - creates a polygon shapefile from the raster.
This is the basemap that will be worked from in the upcoming Analyze and Report weeks.  Also part of this week's assignment was to create a story map of the MTR process and a Story Map Journal to be filled in over the next couple of weeks.  These are the links for both:

Story Map:  http://arcg.is/2daFw7K

Story Map Journal:  http://arcg.is/2daNCgz


Tuesday, September 20, 2016

Ground Truthing and Accuracy Assessment

Photo Interpretation and Remote Sensing - Mod 4


Using the map from last week's lab, we investigated random areas of the map to verify the classification schemes we had selected.  Since we weren't able to field check the areas, we used Google Maps to do so.  We started by creating a point shapefile, then started an editing session to add 30 points to the file.  I chose the Stratified Random Sample protocol to select the locations for my points.
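A stratified random sample allocates verification points to each land-cover class in proportion to its area; a minimal sketch of that allocation (the class names and areas below are hypothetical, not my actual map):

```python
def stratified_allocation(strata_areas, total_points=30):
    """Allocate sample points to classes in proportion to each class's area."""
    total_area = sum(strata_areas.values())
    return {name: round(total_points * area / total_area)
            for name, area in strata_areas.items()}

# Hypothetical class areas (any consistent units)
areas = {"Forested Wetland": 40, "Water": 30, "Residential": 20, "Barren": 10}
alloc = stratified_allocation(areas)
```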

Once I had the points selected, I used the Identify tool to determine their coordinates in degrees, minutes and seconds and entered that data in Google.  The image in Google was much clearer than the TIFF we worked from originally.  Even in those areas where I wasn't able to get a street view, I could see the area clearly enough to tell whether I had gotten it right.  I was not surprised to discover I made the most mistakes on the islands in the river.  They were very difficult to distinguish in the TIFF, so I guessed at most of them.  In Google I was better able to tell there were no trees, so what I had marked as Forested Wetlands was actually Non-Forested Wetlands.  The only real surprise was finding a house with a nice lawn, trees and even a few flower gardens in what had once been just a sandy area.  Overall my accuracy came to 67%.

Saturday, September 17, 2016

Network Analyst - Report Week

Special Topics - Project 1 Week 3

This week was about putting together the final results of the previous two weeks.  It seemed a simple task, but turned out to be rather daunting.  Eight maps had to be created.  Two of them were added to a pamphlet for the hospital evacuation routes.  Those maps and the next three were pretty simple, though a little time consuming.  The next three were supply routes from the National Guard armory to each of the three shelters.

The next map was the most difficult.  This was the map with multiple routes leading out of Downtown Tampa to a shelter.  A lot of information had to go on that map, including the color-coded routes, street names, and arrows pointing in the correct direction.  That alone wouldn't have been too bad, but it was supposed to be finished off in Adobe Illustrator.  I spent a considerable amount of time realizing I no longer remember how to use that program.  Finally I had to give up and just finish it in ArcMap.

The last map was also supposed to be done in Illustrator, and I thought that since it was a simpler map it might work, but I still had no luck.  This time I finished it off in PowerPoint, but I'm not really happy with the results.

Tuesday, September 13, 2016

Land Use/Land Cover Classification

Photo Interpretation and Remote Sensing - Mod 3

This week's lesson was to classify a TIFF image to Level II.  A polygon shapefile was created and two fields, Code and Code_Descr, were added to the attribute table.  Next an editing session was started and the boundaries for the different classifications were digitized.  Digitizing each of those islands was a very long, drawn-out process, but the most difficult part was determining how to classify everything.  So many things could be either one thing or another.  Fortunately most of what I was uncertain about fit into the same category, so I didn't have to make a final determination.  If I had had to drill down to Level III, there were several things I wouldn't have known how to classify.  There were a couple of buildings that could have been schools, hospitals or nursing homes.  Fortunately they all belonged to the Level II Commercial and Services category, so I didn't have to decide.

This lesson was pretty intense.  There was a lot of reading material to cover and so much information it was difficult to absorb it all.

Thursday, September 8, 2016

Network Analyst - Analyze Week

Special Topics - Project 1 Week 2

Using the data that was prepared last week, this week the assignment was to create a map with evacuation routes for patients from Tampa General Hospital to other hospitals, evacuation routes from downtown Tampa, supply routes from the National Guard armory to the shelters, and the service areas for each of the shelters.

To accomplish these analyses I created a new network dataset and added three attributes, verifying the evaluators had the correct type and value for each:

  • Seconds - to create fast, simple routes.
  • FloodPublic - to calculate routes that avoid flooded streets closed to the public.
  • FloodEmer - to calculate routes for emergency vehicles, which may need to use flooded streets to respond to emergencies but will otherwise avoid them.

Because of its location on Davis Islands and the very low elevations there, Tampa General Hospital will need to be evacuated before the hurricane hits.  Memorial Hospital of Tampa and St. Joseph’s Hospital were chosen to accept those patients.  Using Network Analyst’s New Route feature, routes were created from Tampa General Hospital to each hospital, ensuring Impedance was set to Seconds and Restriction only to Oneway to be sure the shortest time route was selected to each.
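Under the hood, a shortest-time route with a Oneway restriction is a shortest-path search over a directed graph weighted in seconds. A minimal Dijkstra sketch over a made-up mini network (the node names and travel times are assumptions, not the actual Tampa streets):

```python
import heapq

def fastest_route(graph, start, goal):
    """Dijkstra over a directed graph weighted in seconds.

    Oneway streets are modeled by simply omitting the reverse edge."""
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    done = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in done:
            continue
        done.add(node)
        if node == goal:
            break
        for nxt, secs in graph.get(node, {}).items():
            nd = d + secs
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(pq, (nd, nxt))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1], dist[goal]

# Hypothetical mini street network (edge weights in seconds)
net = {
    "TGH": {"A": 60, "B": 120},
    "A": {"Memorial": 120},
    "B": {"Memorial": 30, "StJosephs": 200},
}
route, secs = fastest_route(net, "TGH", "Memorial")
```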

The US Army National Guard will deliver the emergency supplies from their armory to the local storm shelters.  Because the supplies may not arrive until after the storm hits the routes have been designed to avoid flooded roadways.  Three new routes were created from the armory to each of the shelters.  For these routes the Restrictions were set to FloodEmer to avoid flooded streets as much as possible, but to use them if necessary.

Multiple evacuation routes were also created for downtown Tampa because it is so heavily populated.  The DEM polygon layer, which was used during the Prepare week to specify the predicted flood elevation levels, was used again here.  A new Short Integer field named ScaledCost was added so groups of elevations could be used with the Impedance attribute to model the drive times affected by flooding and determine the best routes to the shelter located at Middleton High School.
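The ScaledCost idea can be sketched as a banding function: lower, flood-prone elevations get a larger cost multiplier so the solver steers routes away from them. The elevation breaks and multipliers below are illustrative assumptions, not the values used in the lab:

```python
def scaled_cost(elevation_ft):
    """Assign a hypothetical travel-cost multiplier from a cell's elevation."""
    if elevation_ft <= 2:
        return 10  # likely flooded: strongly penalized
    elif elevation_ft <= 6:
        return 5   # at risk of flooding
    else:
        return 1   # normal travel cost

# Costs for three sample elevations (feet)
costs = [scaled_cost(e) for e in (1, 4, 15)]
```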

To cut down on confusion and uncertainty among the general population, the map has been color coded by shelter service area so people can tell at a glance which shelter they can reach the quickest.  This was done with the New Service Area feature of Network Analyst, using the shelter points as Facilities, setting the Impedance to Seconds, the Default Breaks value to 10,000 (to make sure all areas would be included in the analysis), and the Direction to Toward Facility, with Not Overlapping selected for the polygons, which were then symbolized with different colors.

Monday, September 5, 2016

Visual Interpretation of Aerial Photography

Photo Interpretation & Remote Sensing - Mod 2

This lab covered the methods and techniques used to visually interpret aerial photos through three exercises.  The first exercise focused on tone and texture.  Five areas of the map were selected to represent different tones, in a range from very light to very dark.  Polygons were drawn around these areas and then converted to features so the attribute table could be edited to add the tone of each feature and its label added to the map.  Next, the same was done for texture, selecting five areas that ranged from very smooth to very coarse.




Exercise 2 covered shape and size, shadow, pattern and association.  Three objects were selected to represent each of these elements except association, which only required two.  Instead of polygons, markers were used to highlight the features identified in each category.  The markers were also converted to features and each attribute table edited to include the name of the feature so it could be labeled.



The last exercise was a color comparison between the same image in True Color and in False Color IR.  Again, markers for the selected features were converted to features to be edited and labeled.  Working in a data frame with the True Color image, the colors of the selected features were noted and recorded in a table in the process summary.  Those same features were then examined in the False Color image and the differences in color were also noted in the table.