Thursday, December 8, 2016

Open Sourced - From Analysis to Communication in the Age of Web Mapping - Report Week

Special Topics - Project 4 Week 4

The final final.  This project ended with the creation of a PowerPoint slide presentation of the analysis we performed in previous weeks.  Creating the presentation was fairly straightforward until I got to the audio part.  I tried several different ways to get the audio to start automatically, but I was never able to accomplish that.  After recording the whole presentation three times in different ways, I finally gave up.  The rest of it was simple enough until I went into QGIS to create a finished map.  For Analyze Week I didn't need a finished map from QGIS, just a map that had the analysis done and the layers created.  It should have been an easy enough task, but every time I made one change in the layout view it would change all kinds of things.  I was very careful to switch over to the data window and turn the appropriate groups on or off before unlocking the groups in the layout view, and yet it still changed things in ways I never could figure out.  Finally I got the map to the point where I wasn't too embarrassed by it, so I did a quick save and export before anything else could happen.

All in all this last project was rather interesting, but I thought it was a bit much for a final project to have so much new stuff thrown at us with very little guidance in using it.  Had each of the programs been worked into the previous lessons it wouldn't have been so bad, but at the very end it was rather frustrating.

But it's all over and done now and time to move on.

Here are the links to both my web map and my PowerPoint presentation.


Wednesday, December 7, 2016

Final Project

Photo Interpretation and Remote Sensing - Final

The final project for this class was to use what was learned in all the previous classes to answer a question.  I chose to go with the suggested project, for which the data was provided.  The question: is urbanization to blame for the decrease in clarity of the waters of Lake Tahoe?

To answer this question a composite image was created of individual bands of a 2010 Landsat 5 TM image in ArcMap, then imported into Erdas Imagine where spectral signatures were created and a supervised classification was run.  This image was then returned to ArcMap to compare to a 1992 National Land Cover Database image.  The comparison showed urbanization is likely the cause of the problem.

It's a relief to finally finish this course as it was the most difficult of them all.  I think part of the problem was I kept looking for things to be black and white, and quite frequently they were more a matter of best guess.  It was a difficult concept for me to adjust to.  In the end I think I did okay, and somewhere along the way I also came to enjoy it.  I think that was at the very end, when I knew it was almost over and I'd likely never need to do this again.  But at least now I can look back at this with fond memories.


Friday, December 2, 2016

GIS Portfolio

GIS Internship - Final Assignment

My final assignment was to create a GIS Portfolio with a little information about me, my goals, my resume, and examples of my work.  It was a little difficult selecting which maps to use for examples since so many showed off some awesome skills, but I think the ones I selected provide a good picture of what has been learned during this past year.

Here is a link to my portfolio:

Wednesday, November 30, 2016

Open Sourced - From Analysis to Communication in the Age of Web Mapping - Analyze Week 2

Special Topics - Project 4 Week 3

I downloaded the 2010 census shapefile from FGDL and used the Florida Counties shapefile from Prepare Week to create my county shape to clip the census shapefile to, then created my study area and centroid shapefiles from that within QGIS.  I then used Google Earth to identify grocery stores within that study area, created a KML file from that search, opened it in ArcMap, and saved the results to a shapefile, then added my centroid shapefile.  With these two layers I ran the Near tool to locate those stores within 1 mile of the census tract centroids and add that data to the centroid attribute table.
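
The heavy lifting in that step is the Near tool; a minimal arcpy sketch, assuming hypothetical shapefile names and the same 1-mile search radius, looks like this:

# A minimal sketch of the Near step described above.  Shapefile names are
# hypothetical; Near adds NEAR_FID and NEAR_DIST fields to the centroid
# layer, recording the closest grocery store within the search radius.
import arcpy

arcpy.env.workspace = r"C:\GIS\FoodDesert"    # hypothetical working folder
arcpy.Near_analysis("tract_centroids.shp", "grocery_stores.shp", "1 Miles")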

With the centroid layer modified in ArcMap, I opened the shapefile's database (.dbf) file in Excel and saved it as a .csv file, then went back into QGIS, added the table as a Delimited Text Layer, and joined it to the Study Area layer.  This added the new distance fields to the Study Area layer.  From the distance field I was able to select those records in my Study Area that did not have a store within a 1-mile radius of the tract centroid and create a Food Desert shapefile from them.  Reversing the selection, I then created a Food Oasis shapefile from the rest.

Moving on to Mapbox, I uploaded my zipped Food Desert and Grocery Store shapefiles to new tilesets, then created a basemap from the Basic style, changed a few elements, and added the tilesets to the basemap.  After that I added the Study Area to ArcMap and investigated the results of different classes and classification methods, finally settling on Jenks Natural Breaks with 4 classes as the most reasonable distribution, then switched to ColorBrewer for the HEX and RGB color codes of the color scheme I wanted to use.  Back in Mapbox I created a group for the food desert layer and made three duplicates of the layer to represent the four classes, then added filters and changed colors so each layer would represent the correct class.

The last step was to create a webmap in Leaflet.  To do this I copied the text file from Analyze Week 1 and made the adjustments that were appropriate for the differences between the maps.  Basically, all the things that needed to be added or adjusted for Analyze Week 1 also needed to be adjusted for Analyze Week 2.  With that done I saved the text file as an HTML file on my I drive.  This is the link for it:  http://students.uwf.edu/mr80/STGIS/OkDesert.html

The Study Area encompasses the Greater Fort Walton Beach area in South Okaloosa County, Florida, including the surrounding rural area.  Within this area are 10 grocery stores that service most of the urban area, but not all, and there is not one grocery store within the rural area.

Fort Walton Beach is not my home town but I am pretty familiar with it and was actually surprised to see so little of it falls within a food desert.  I didn’t realize there were so many grocery stores within town.  I was also surprised to discover there were no grocery stores at all within the rural area outside of town.  I would have expected there to be at least one. 

Friday, November 25, 2016

Open Sourced - From Analysis to Communication in the Age of Web Mapping - Analyze Week 1

Special Topics - Project 4 Week 2

Analyze Week 1 introduced us to Mapbox, Leaflet, and web mapping.  Using the data we created last week in QGIS, this week we uploaded our food desert and grocery store shapefiles to Mapbox, then symbolized them.  This was a little more complicated than in ArcMap.  In Mapbox, copies first needed to be made of the food desert layer for every class we had, then the class and color needed to be assigned to each layer.  There were a few ways this data could be derived.  I chose to use ArcMap and experiment a bit with the classes in the layer symbology window until I felt I had a reasonable distribution of the classes, then I made a note of those ranges, went back to Mapbox, and added that many layers.  Next I went to ColorBrewer, selected the color scheme I liked best, specified the number of classes, and made note of the HEX and RGB codes for those colors.  Back in Mapbox I started with the lowest copy of the food desert layer and, in the Select data window, associated that layer with the lowest range of data based on the POP2000 field of the attribute table, doing the same for all the layers.  Then I went back through all the layers and, in the Style window, used the RGB codes noted from ColorBrewer to assign the correct symbol color to each layer.

Next we moved to Leaflet to create a web map to be hosted on our UWF I Drive.  We started with the source of the Leaflet map as a base template, then adjusted the code to meet our requirements.  This included changing the location of the style sheet and script files, changing the center point so the map would open centered on Pensacola, FL, editing pop-ups, polygons, and circles, and adding a legend and a geocoder to make the map searchable.  Adding the legend required us to add a long section of code that was provided, then change the population ranges and the HEX codes we had noted earlier.  I also adjusted the size of the map view and the scale so almost my whole study area would show when the map opened.  This file was then saved to the I Drive as an HTML file.

It was an interesting assignment, but a lot of new stuff to deal with for a final project.  Here is the link to the web map:

Monday, November 21, 2016

GIS Day Around the Kitchen Table

GIS Internship - Week 13

This year on GIS Day I was in New Mexico visiting my daughter so I decided to make my GIS day an opportunity to share GIS with her.  My daughter is a park ranger and has been looking for openings at parks on the east coast, preferably in Virginia because she'd like to go back to school and get her Masters in History, focusing on Constitutional History.  One of the things she has said over and over is that she'd like to be close enough to make trips up to DC so she can go to the Library of Congress to do research. So of course the very first thing I did was share a fellow classmate's post about her GIS Day visit to the Library of Congress.  My daughter was very jealous and very impressed.  She was also surprised to hear they had material from the medieval age since that preceded not only the founding of this country, but also its discovery. So she is very anxious now to discover what that material is and why it was included in the library's collection.
Once I got her to quit drooling over the library I showed her the Residential Location Study I did for my final project for Applications in GIS.  She got a kick out of seeing I had done it for a mother and daughter moving to St. Augustine, a place I've told her many times I'd like to go visit with her, with the mother working at the county GIS office and the daughter as a park ranger at Castillo de San Marcos National Monument.  I summarized the process of creating the location study and we talked about the different criteria we could use to help her choose a location to start looking for houses once she finds a job back east.
Lastly, we used Google Maps to plan her drive from New Mexico to Florida, selecting good stopping points along the way and investigating the hotels in those areas where she can spend the night with her two furry babies when she drives out to spend Christmas with me.
We had a good time and several giggles, and she said she thought she understood much better now what I've been studying, but it really wasn't anything to take pictures of.

Thursday, November 17, 2016

Open Sourced - From Analysis to Communication in the Age of Web Mapping - Prepare Week

Special Topics - Project 4 Week 1


The first week of our final project was spent learning Quantum GIS (QGIS) and about Food Deserts.  Part A taught how to use QGIS by adding layers, setting a coordinate reference system, clipping layers, and grouping layers in the main window.  Grouping the layers allows for multiple frames in the Print Composer, QGIS' Layout View.  Just as in ArcMap's Layout View, Print Composer is where map elements, such as legends, scale bars, etc., are added.  Unlike ArcMap, in QGIS the data frames must be locked when they are not being manipulated, or what is done in one frame will affect what is in another.  This was less of a problem in Part A than it was in Part B.

In Part B the layers were grouped by the results of our analyses.  Using the census tracts of Escambia County, a study area was created, and centroids were then created for those polygons.  Using the Join tool on the Study Area layer, a join with the Near.csv file was performed to add the NEAR_DIST field from that table to the Study Area attribute table.  From this field we were able to create selection sets for those census tracts that were within Food Deserts and those that were not.  Statistics were run on each of these layers to determine the percent of the population that fell into each category.  With the statistics done, the layers were duplicated where necessary and grouped for their different data frames.  In this second map there were more layers to contend with, and forgetting to turn layers on and off in the main window and to lock and unlock the data frames in the Print Composer window resulted in a lot of repetitive steps.  But in the end it all came out right.  Except I still haven't figured out how to put a neatline around the entire map.

Monday, November 7, 2016

Supervised Classification

Photo Interpretation and Remote Sensing - Mod 10

This week we created a land use map of Germantown, Maryland using Supervised Classification tools in Erdas.  Training sites were created using a combination of polygons and the Growing Properties tool.  With these tools, features were selected and defined so an algorithm would know what class to assign to each pixel.  We started by defining 14 classes, then merged them down to 8 using the Recode tool.    Once that was done a Class Name column was added and the classes renamed, then an area column was added to the attribute table to calculate the area covered by each class.  To finish the map off it was added to ArcMap and symbolized appropriately.  

Saturday, November 5, 2016

Statistical Analysis with GIS - Analyze Week

Special Topics - Project 3 Weeks 2 & 3

For the analyze portion of this project we ran numerous iterations of the Ordinary Least Squares (OLS) regression, starting with 29 independent variables and removing one with each iteration.  To determine which variable to remove, the first three of six checks were performed in combination.  Each variable was tested for Probability, Variance Inflation Factor (VIF), Coefficient, and Importance.  If the answer to all of these checks was "No" the OLS was repeated with that variable removed.  If just one check returned a "Yes" the variable remained.  This process was repeated until either all the variables that failed the test were removed or the Adjusted R-Squared value was as high as it could get.  The Adjusted R-Squared value was expected to come in between 0.0 and 1.0, but during this process mine seldom came close.
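
Conceptually the loop looks something like the sketch below; it is a hedged arcpy outline rather than the actual lab workflow, and the geodatabase, field names, and the choice of which variable to drop are all hypothetical placeholders:

# A rough sketch of iterating OLS while dropping one explanatory variable per
# run.  Paths and field names are hypothetical; deciding which variable fails
# the checks was done by hand from the OLS report, so it is hard-coded here.
import arcpy

arcpy.env.workspace = r"C:\GIS\MethLabs.gdb"   # hypothetical geodatabase
arcpy.env.overwriteOutput = True

def run_ols(run_number, explanatory_fields):
    # Run OLS and return the output feature class of residuals.
    out_fc = "ols_run_{0}".format(run_number)
    arcpy.OrdinaryLeastSquares_stats(
        "census_tracts",               # joined tracts + meth lab attributes
        "UNIQUE_ID",                   # unique ID field
        out_fc,                        # output feature class with residuals
        "LAB_DENSITY",                 # dependent variable
        ";".join(explanatory_fields))  # explanatory variables
    return out_fc

# Start with the full variable list (29 in the lab; a short stand-in here).
variables = ["MED_INCOME", "PCT_RENTER", "PCT_VACANT", "POP_DENSITY"]
run_ols(1, variables)

# After reading the report (probability, VIF, coefficient, importance),
# drop the variable that failed all three checks and run again.
variables.remove("PCT_VACANT")
run_ols(2, variables)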

Once all the failing variables were removed, the next step was to check the Jarque-Bera statistic for bias.  If the p-value is less than 0.05 and has an asterisk next to it, the model is biased.  To analyze the data and find the skewed results, a Scatterplot Matrix graph was created.  This graph also included a histogram of each variable, providing a second way to view the data.  For each variable that was skewed, the OLS was run again with that variable removed and the Jarque-Bera score checked for improvement.

Check 5 was the first time we viewed the map for results.  Part of the OLS process is the creation of a layer of standard residual values.  The standard residual standardizes the residual values, making them comparable between different models.  The residual is the difference between the density of meth labs the model predicted would be in a census tract and the density that actually exists.

Check 6 was to see how well the model was predicting the dependent variable.  It seemed okay to me.

Saturday, October 29, 2016

Unsupervised Classification

Photo Interpretation & Remote Sensing - Mod 9

This week's lesson in Unsupervised Classification was pretty straightforward and simple, though somewhat time consuming.  Most of the work was done in Erdas Imagine, starting with running the Unsupervised Classification tool on a high resolution aerial photograph of the UWF campus.  This resulted in a thematic raster that allowed us to simplify the image into fewer classes by selecting the pixels and changing them.  Our task was originally to classify the image into four categories: Trees, Grass, Buildings/Roads, and Shadows.  This seemed simple enough until some of the clusters affected multiple features.  In order to deal with this problem a Mixed class was added as well, and any cluster that affected more than one feature could be added to that.  Sometimes the effect on a second feature was so limited it made more sense to keep that cluster as the first feature.  Once we had the image classified we merged the classes so we had only the five specified.

Next we added fields to the attribute table for Class Name and Area.  This allowed us to calculate the area for each of the classes so we could determine the percentage of permeable versus impermeable surfaces.  Permeable surfaces include Grass and Trees; impermeable, Buildings/Roads.  The other two classes included a combination of both, so before we could make that calculation we first had to calculate what percentage of each of those classes was made up of each surface.
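
The bookkeeping behind that percentage is simple enough to show in a few lines of Python; the class areas and the permeable splits assumed for the Shadows and Mixed classes below are made-up placeholders, not the lab's numbers:

# Hypothetical class areas (acres) and an assumed permeable fraction for the
# two mixed classes; illustration only.
areas = {"Grass": 120.0, "Trees": 310.0, "Buildings/Roads": 95.0,
         "Shadows": 22.0, "Mixed": 41.0}

permeable_fraction = {"Shadows": 0.6, "Mixed": 0.4}   # assumed splits

permeable = areas["Grass"] + areas["Trees"]
impermeable = areas["Buildings/Roads"]
for cls, frac in permeable_fraction.items():
    permeable += areas[cls] * frac
    impermeable += areas[cls] * (1.0 - frac)

total = permeable + impermeable
print("Permeable:   {0:.1f}%".format(100.0 * permeable / total))
print("Impermeable: {0:.1f}%".format(100.0 * impermeable / total))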

Tuesday, October 25, 2016

Thermal & Multispectral Analysis

Photo Interpretation & Remote Sensing - Mod 8

This week's assignment was to use image manipulation and interpretation techniques to identify a feature from an image, using the thermal infrared band as part of the analysis.  The selection was made in Erdas Imagine using the TM Thermal Infrared Composite band combination of red for the thermal layer, Band 6, green for Band 3, and blue for Band 2.  In ArcMap the same image was symbolized with a composite combination of red for Band 1, green for Band 2, and blue for Band 3 for a more real-world image, which made the feature clearer and more easily identified as an airport.  The coordinates for this feature are 30° 28' 34.8306" N, 86° 31' 7.3224" W, which allowed it to be identified as the Destin - Ft. Walton Beach Airport (VPS) in northwest Florida.

Friday, October 21, 2016

Statistical Analysis with ArcGIS - Prepare Week

Special Topics - Project 3 Week 1

The week starts off with a new project:  Statistical Analysis of Methamphetamine Laboratory Busts in West Virginia.  For Prepare Week we had to do a bit of reading up on methamphetamine, its history, and its users.  We also created a basemap to use in upcoming weeks to display and report on the analysis of the data we were supplied or downloaded, in order to assist the government and law enforcement in anticipating future criminal activity.

The study area of census tracts for Kanawha and Putnam Counties was provided along with a point file of Charleston meth labs.  The cities, roads, rivers, and counties of West Virginia were downloaded from the US Census Bureau's TIGER shapefile site.

In preparation for the analysis process to occur next week a spatial join was performed on the census tracts and meth lab layers in order to combine the attributes of both tables into one.  Once the attribute tables were combined the unnecessary fields were turned off and this is what we were left with:

Some of the attribute names were a little difficult to interpret.  Best guess is sometimes the best you can do.  It will be interesting to see what we do with them next week.
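
For reference, the join itself boils down to a single geoprocessing call; a hedged arcpy sketch with hypothetical feature class names:

# A minimal sketch of the spatial join described above.  Each output tract
# polygon carries the joined lab attributes plus a Join_Count field with the
# number of labs that fall inside it.
import arcpy

arcpy.env.workspace = r"C:\GIS\WestVirginia.gdb"   # hypothetical workspace
arcpy.env.overwriteOutput = True

arcpy.SpatialJoin_analysis(
    target_features="census_tracts",
    join_features="charleston_meth_labs",
    out_feature_class="tracts_with_labs",
    join_operation="JOIN_ONE_TO_ONE",
    join_type="KEEP_ALL",
    match_option="CONTAINS")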

Tuesday, October 18, 2016

Image Preprocessing 2: Spectral Enhancement and Band Indices

Photo Interpretation and Remote Sensing - Mod 7

This week was very challenging.  We worked with histograms a lot, learning how to analyze them in order to interpret images.  Our final exercise was to identify certain areas on the map based on histogram information.  That was the most challenging part, but by the end I had a little better understanding of this week's lesson.  Here are the maps created:




Friday, October 14, 2016

Mountain Top Removal - Report Week

Special Topics - Project 2 Weeks 3 & 4

This week was the culmination of all the prep work and analysis performed in previous weeks.  We had to publish our group Mountain Top Removal (MTR) Analysis map on ArcGIS Online UWF Org, complete our Story Map Journal and add our published map to that, along with a link to this blog.

Mountain Top Removal (MTR) is a method of mining for coal that destroys a mountaintop or ridgeline.  All too frequently the plans described in work permits are significantly different from the actual mining activities that occur on the ground.  Remotely sensed data has been used to investigate evidence of human-caused changes to the landscape as a result of MTR and to compare what was permitted to what has actually taken place.  The data and methods were provided by SkyTruth, a non-profit agency monitoring MTR in the Appalachian Coal Mining Regions within the states of Tennessee, Kentucky, Ohio, Virginia and West Virginia.  This dataset was created by Group 3 of the University of West Florida’s (UWF) Online GIS Certification Program 2016 class and will be shared with SkyTruth for a comparison study.  Group 3 students include: Rachel Hamaty, Maggie Roth, Charmaine Hingada and Austin Adkison.

Satellite imaging was chosen as an independent and cost-effective method of identifying, mapping and quantifying landscapes disrupted and altered by MTR.  2010 Landsat data was used to create this dataset of polygons covering those areas of MTR in eastern Kentucky.  It is a compilation of Landsat images LT50190332010243EDC00 and LT50190342010243EDC00.  The combined accuracy is 97%, with a total acreage of 131,144 acres.  This dataset contains only those areas 40 acres or larger.  Areas within 50 meters of roads and rivers or within 400 meters of major rivers and highways have been removed.  With those exceptions, only those areas that intersect with mountain ridges have been included.

The Story Map Journal is basically a compilation of slides with a column on the side for descriptive text.  My Story Map Journal, A Journal of Mountain Top Removal, starts with an introduction of MTR, then background of MTR and the role of GIS in defining and analyzing it, an image of the study area my group was responsible for, an image of our analysis that we published though ArcGIS Online, and finally a discussion through a link to this blog.  Below is the link to my Story Map Journal:




Sunday, October 9, 2016

Image Processing 1: Spatial Enhancement and Radiometric Correction

Photo Interpretation and Remote Sensing - Mod 6

This week we used radiometric and spatial enhancements to enhance an image and reduce striping.  The first step was to perform a Fourier transformation in order to run some of the tools in the Fourier Transform Editor.  Prior to running this step the image was just a big white blur with a few black splotches until it was zoomed to 1:150,000.  After running the Fourier Transform Editor tools it was a complete image that could be viewed at its full extent, but it still had striping.  I tested numerous tools trying to find the right combination that would lessen the stripes without diminishing the clarity of the image, but nothing I tried worked.  Finally I settled on an image that still had all the stripes it came out of the Fourier Transform Editor with, but was the clearest image I had managed to achieve.  The tools I used to accomplish this after the Fourier Transform Editor were the Convolution tools Sharpen and Haze Reduction in ERDAS Imagine, and in ArcMap the Spatial Analyst Focal Statistics tool with a width and height of 3 and a statistics type of Range.
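
The ArcMap end of that experiment is nearly a one-liner with Spatial Analyst; a hedged sketch with a hypothetical raster name:

# A minimal sketch of the Focal Statistics step: 3 x 3 neighborhood, RANGE
# statistic, run on a hypothetical input raster.
import arcpy
from arcpy.sa import FocalStatistics, NbrRectangle

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Mod6"          # hypothetical folder
arcpy.env.overwriteOutput = True

neighborhood = NbrRectangle(3, 3, "CELL")      # 3-cell width and height
out_raster = FocalStatistics("striped_image.img", neighborhood, "RANGE", "DATA")
out_raster.save("focal_range.img")

arcpy.CheckInExtension("Spatial")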

Thursday, September 29, 2016

Mountain Top Removal - Analyze Week

Special Topics - Project 2 Week 2

This week we used 2010 Landsat data to create a polygon of current Mountain Top Removal (MTR) areas in the Appalachian Coal Regions of West Virginia and surrounding states.  The first step was to create a single raster dataset from 7 Landsat bands using the Data Management Raster Processing tool Composite Bands, saving it as an .img file.  The Spatial Analyst Extraction tool Extract by Mask was used to clip the composite raster to the study area, again saving to an .img file to use in ERDAS Imagine.
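
Scripted in arcpy, those two preprocessing steps look roughly like the sketch below; the band file names and paths are hypothetical:

# A sketch of building the composite and clipping it to the study area.
# The Landsat band file names and the study area path are hypothetical.
import arcpy
from arcpy.sa import ExtractByMask

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\MTR\Landsat2010"   # hypothetical folder
arcpy.env.overwriteOutput = True

bands = ["band1.tif", "band2.tif", "band3.tif", "band4.tif",
         "band5.tif", "band6.tif", "band7.tif"]
arcpy.CompositeBands_management(";".join(bands), "composite_2010.img")

# Clip the composite raster to the group study area polygon.
clipped = ExtractByMask("composite_2010.img", "study_area.shp")
clipped.save("composite_2010_clip.img")

arcpy.CheckInExtension("Spatial")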

In ERDAS, an Unsupervised Classification was performed on the clipped image with 50 classes.  Areas of MTR were then identified by picking a pixel on the classified image and changing the class designation for that pixel in the Class_Name field of the attribute table to MTR, and its color to Red.  Once all the MTR areas were identified and changed, the rest of the pixels were assigned a class designation of NonMTR and the color changed to Dark Green.

This saved image was then added to ArcMap and reclassified with the Spatial Analyst Reclassify tool, populating the New Values field with a 1 for the MTR classes, leaving the NonMTR classes blank, and assigning missing values to NoData.  This created a new raster that isolates the MTR areas, ready to be turned into polygons.
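
A hedged arcpy version of that reclassify step, with made-up class values standing in for the 50 ERDAS classes, plus one way of getting from the reclassified raster to MTR polygons:

# A sketch of reclassifying the ERDAS output so only MTR classes remain,
# then converting the result to polygons.  The MTR class values are
# hypothetical examples.
import arcpy
from arcpy.sa import Reclassify, RemapValue

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\MTR"            # hypothetical workspace
arcpy.env.overwriteOutput = True

# Map the classes flagged as MTR to 1; anything not listed becomes NoData.
mtr_classes = [3, 7, 12, 28]                   # hypothetical MTR class values
remap = RemapValue([[value, 1] for value in mtr_classes])
mtr_raster = Reclassify("classified_50.img", "Value", remap, "NODATA")
mtr_raster.save("mtr_only.img")

# Convert the MTR-only raster to a polygon layer for the analysis map.
arcpy.RasterToPolygon_conversion("mtr_only.img", "mtr_areas.shp", "NO_SIMPLIFY", "Value")

arcpy.CheckInExtension("Spatial")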

Monday, September 26, 2016

Intro to Electromagnetic Radiation (EMR)

Photo Interpretation & Remote Sensing - Mod 5a

This lesson had us work in ERDAS IMAGINE for the first time which was rather different.  We brought a raster image in and selected an area of Washington State to export to ArcMap using the Inquire Box and created a subset image.  After running the process we moved to ArcMap to manipulate the image properties and legend to display the information appropriately.  Below is the result.


Friday, September 23, 2016

Mountain Top Removal - Prepare Week

Special Topics - Project 2 Week 1

Mountain Top Removal (MTR) is a method of mining for coal that destroys a mountaintop or ridgeline.  This project will explore MTR in the Appalachian Coal Region of West Virginia, using remotely sensed data along with data and methods provided by SkyTruth.org to investigate evidence of human-caused changes to the landscape brought about by MTR.  Since this is such a large area to cover, the project was divided among four groups.  I am part of Group 3.

Starting with four Digital Elevation Models (DEMs) that were merged using the Raster to New Mosaic tool and clipped to the Study Area with the Extract by Mask tool, a hydrology dataset was created.  This was accomplished using the following Spatial Analyst Hydrology tools (a rough scripted version of the same workflow follows the list):

  • Fill - to modify the raster to prevent water flow from pooling up anywhere in the Study Area.
  • Flow Direction - to assign each pixel a value representing the direction of water flow across that cell.
  • Flow Accumulation - to calculate, for each cell, the total number of other cells that flow into it.
  • Con - to define the threshold of flow accumulation values that qualify a cell as a stream.  Prior to running this tool, 1% of the pixel count of the original clipped DEM needed to be calculated in order to quantify the condition a pixel must meet to be classified as a stream.
  • Stream to Feature - to create a polyline vector feature from the Con tool output raster.
  • Basin - to delineate the drainage areas in the raster, giving each a unique value.
  • Raster to Polygon - to create a polygon shapefile from the Basin raster.
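
A hedged arcpy sketch of the same workflow; the paths, the clipped DEM name, and the stream threshold are hypothetical placeholders:

# A sketch of the hydrology workflow listed above.  The workspace, the
# mosaicked DEM name, and the 1% stream threshold value are hypothetical.
import arcpy
from arcpy.sa import Fill, FlowDirection, FlowAccumulation, Con, StreamToFeature, Basin

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\MTR\Prepare.gdb"   # hypothetical workspace
arcpy.env.overwriteOutput = True

dem = "dem_clipped"                # mosaicked DEM already clipped to the study area

filled = Fill(dem)                                 # remove sinks so flow cannot pool
flow_dir = FlowDirection(filled)                   # direction of flow for each cell
flow_acc = FlowAccumulation(flow_dir)              # number of upstream cells per cell

# Cells whose accumulation exceeds the threshold (about 1% of the DEM's
# pixel count in the lab) are treated as streams.
threshold = 15000                                  # hypothetical threshold
streams = Con(flow_acc > threshold, 1)

StreamToFeature(streams, flow_dir, "streams_line", "NO_SIMPLIFY")

basins = Basin(flow_dir)                           # unique value per drainage area
arcpy.RasterToPolygon_conversion(basins, "basins_poly", "NO_SIMPLIFY", "Value")

arcpy.CheckInExtension("Spatial")
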
This is the basemap that will be worked from in the upcoming Analyze and Report weeks.  Also a part of this week's assignment was to create a story map of the MTR process and a Story Map Journal to be filled in over the next couple weeks.  These are the links for both:

Story Map:  http://arcg.is/2daFw7K

Story Map Journal:  http://arcg.is/2daNCgz


Tuesday, September 20, 2016

Ground Truthing and Accuracy Assessment

Photo Interpretation and Remote Sensing - Mod 4


Using the map from last week's lab we investigated random areas of the map to verify the classification schemes we had selected.  Since we weren't able to field check the areas we used Google Maps to do so.  We started by creating a point shapefile, then started an editing session to add 30 points to the file.  I chose the Stratified Random Sample protocol to select the locations for my points.

Once I had the points selected I used the Identify tool to determine their coordinates in degrees, minutes, and seconds and entered that data into Google.  The image in Google was much clearer than the TIFF we worked from originally.  Even in those areas where I wasn't able to get a street view, I was able to see the area clearly enough to tell whether I had gotten it right or not.  I was not surprised to discover I made the most mistakes on the islands in the river.  They were very difficult to distinguish in the TIFF, so I guessed at most of them.  In Google I was better able to tell there were no trees, so what I had marked as Forested Wetlands was actually Non-Forested Wetlands.  The only real surprise was to find a house with a nice lawn, trees, and even a few flower gardens in what had once been just a sandy area.  Overall my accuracy came to 67%.

Saturday, September 17, 2016

Network Analyst - Report Week

Special Topics - Project 1 Week 3

This week was putting together the final results of the previous two weeks.  It seemed a simple task, but turned out to be rather daunting.  Eight maps had to be created.  Two of them were added to a pamphlet for the hospital evacuation routes.  Those maps and the next three were pretty simple though a little time consuming.  The next three were supply routes from the National Guard armory to each of the three shelters.

The next map was the most difficult.  This was the map with multiple routes leading out of downtown Tampa to a shelter.  There was a lot of information that had to go on that map, including the color coded routes, street names, and arrows pointing in the correct direction.  That alone wouldn't have been too bad, but it was supposed to be finished off in Adobe Illustrator.  I spent a considerable amount of time realizing I no longer remembered how to use that program.  Finally I had to give up and just finish it in ArcMap.

The last map was also supposed to be done in Adobe Illustrator, and I thought since it was a simpler map that might work, but I still had no luck.  This time I finished it off in PowerPoint, but I'm not really happy with the results.

Tuesday, September 13, 2016

Land Use\Land Cover Classification

Photo Interpretation and Remote Sensing - Mod 3

This week's lesson was to classify a TIFF image to Level II.  A polygon shapefile was created and two fields, Code and Code_Descr, were added to the attribute table.  Next an editing session was started and the boundaries for the different classifications were digitized.  Digitizing each of those islands was a very long, drawn out process, but the most difficult part was determining how to classify everything.  So many things could be either one thing or another.  Fortunately most of what I was uncertain about fit into the same category so I didn't have to make a final determination.  If I had had to drill down to Level III there were several things I wouldn't have known how to classify.  There were a couple buildings that could have been schools, hospitals or nursing homes.  Fortunately they all belonged to the Level II Commercial and Services category so I didn't have to decide.

This lesson was pretty intense.  There was a lot of reading material to cover and so much information it was difficult to absorb it all.

Thursday, September 8, 2016

Network Analyst - Analyze Week

Special Topics - Project 1 Week 2

Using the data that was prepared last week, this week the assignment was to create a map with evacuation routes for patients from Tampa General Hospital to other hospitals, evacuation routes from downtown Tampa, supply routes from the National Guard armory to the shelters, and the service areas for each of the shelters.

To accomplish these analyses I created a new network dataset and added three attributes: Seconds, to create fast and simple routes; FloodPublic, to calculate routes that avoid flooded streets closed to the public; and FloodEmer, to calculate routes for emergency vehicles, which may need to use flooded streets to respond to emergencies but will otherwise avoid them.  I also verified the evaluators had the correct type and value for each of the attributes.

Because of its location on Davis Islands and the very low elevations there, Tampa General Hospital will need to be evacuated before the hurricane hits.  Memorial Hospital of Tampa and St. Joseph’s Hospital were chosen to accept those patients.  Using Network Analyst’s New Route feature, routes were created from Tampa General Hospital to each hospital, ensuring Impedance was set to Seconds and Restriction only to Oneway to be sure the shortest time route was selected to each.

The US Army National Guard will deliver the emergency supplies from their armory to the local storm shelters.  Because the supplies may not arrive until after the storm hits the routes have been designed to avoid flooded roadways.  Three new routes were created from the armory to each of the shelters.  For these routes the Restrictions were set to FloodEmer to avoid flooded streets as much as possible, but to use them if necessary.

Multiple evacuation routes were also created for downtown Tampa because it is so heavily populated.  The DEM polygon layer, which was used in the Prepare data to specify the predicted flood elevation levels, was also used here, with a new Short Integer attribute field added named ScaledCost so groups of elevations could be used with the impedance attribute to determine the drive times affected by flooding and, in turn, the best routes to take to the shelter located at Middleton High School.

To cut down on confusion and uncertainty among the general population, the map has been color coded by shelter service area to make it easier to tell at a glance which shelter people can get to the quickest.  This was done with the New Service Area feature of Network Analyst, using the shelter points as Facilities, setting the Impedance to Seconds and the Default Breaks value to 10,000 to make sure all areas would be included in the analysis, and the Direction to Toward Facility, with Not Overlapping selected for the polygons, which were then symbolized with different colors.
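
Scripted with arcpy's Network Analyst module, the service area piece would look roughly like the sketch below; the lab itself was done interactively in ArcMap, and the network dataset, layer, and shelter feature class names here are hypothetical:

# A hedged sketch of building the shelter service areas with arcpy.na.
# Dataset and layer names are hypothetical placeholders.
import arcpy

arcpy.CheckOutExtension("Network")
arcpy.env.workspace = r"C:\GIS\Tampa.gdb"        # hypothetical workspace
arcpy.env.overwriteOutput = True

network = r"Transportation\Tampa_ND"             # hypothetical network dataset

# Service areas toward each shelter, measured in Seconds, with a break value
# large enough to cover the whole study area and non-overlapping polygons.
sa_layer = arcpy.na.MakeServiceAreaLayer(
    network, "ShelterServiceAreas", "Seconds", "TRAVEL_TO",
    "10000", "SIMPLE_POLYS", "NO_OVERLAP").getOutput(0)

arcpy.na.AddLocations(sa_layer, "Facilities", "shelters")
arcpy.na.Solve(sa_layer)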

Monday, September 5, 2016

Visual Interpretation of Aerial Photography

Photo Interpretation & Remote Sensing - Mod 2

This lab covered the methods and techniques used to visually interpret aerial photos through three exercises.  The first exercise focused on tone and texture.  Five areas of the map were selected to represent different tones in a range from very light to very dark.  Polygons were drawn around these areas, then converted to features so the attribute table could be edited to add the tone of each feature and its label added to the map.  Next, the same was done for texture, selecting five areas that ranged from very smooth to very coarse.




Exercise 2 covered shape and size, shadow, pattern, and association.  Three objects were selected to represent each of these elements except association, which only required two.  Instead of polygons, markers were used to highlight the features identified in each category.  The markers were also converted to features and each attribute table edited to include the name of the feature so it could be labeled.



The last exercise was a color comparison between the same image in True Color and in False Color IR.  Again, markers for the selected features were converted to features to be edited and labeled.  Working in a data frame with the True Color image, the colors of the selected features were noted and recorded in a table in the process summary.  Those same features were then examined in the False Color image and the differences in color were also noted in the table.

Wednesday, August 31, 2016

Network Analyst - Prepare Week

Special Topics - Project 1 Week 1

Our first project was to act as a city employee preparing for a major hurricane projected to make landfall the next week.  The project was divided into three stages to complete over three weeks: Prepare, Analyze, and Report.

With a major hurricane projected to hit Tampa, Florida, the city has to make preparations for the storm.  Those preparations included determining road closures in low-lying areas likely to be impacted by the storm, evacuation routes and plans, distribution plans for FEMA aid to storm shelters, and basic information for the public.

This week, for the Prepare stage, a basemap was created to use in the following weeks.  We started with several feature classes of data required to complete the analysis in the next stage.  These included streets, fire and police departments, hospitals, shelters, National Guard armory locations, a DEM, and a boundary of the study area.

The first task was to clip all the feature classes to the boundary of the study area and reproject them to the specified projection.  Python tools were provided to perform these tasks in batches, but an update to ArcMap seems to have made them a bit glitchy so it took a lot more time and effort to use the tools than to do each feature class one by one.  Still, it was a good refresher on how to edit tools.
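
The idea behind those batch tools is a simple loop; a hedged sketch of doing the same thing by hand, with hypothetical paths and an assumed Florida State Plane target projection:

# A sketch of batch-clipping and reprojecting feature classes.  The folder
# paths, boundary name, and target spatial reference are hypothetical, and
# no datum transformation is assumed to be needed.
import arcpy

arcpy.env.workspace = r"C:\GIS\Tampa\Original"   # hypothetical source folder
arcpy.env.overwriteOutput = True

out_folder = r"C:\GIS\Tampa\Prepared"            # hypothetical output folder
boundary = "study_area_boundary.shp"
target_sr = arcpy.SpatialReference(2237)         # assumed: NAD 1983 StatePlane Florida West (US feet)

for fc in arcpy.ListFeatureClasses():
    if fc == boundary:
        continue
    clipped = arcpy.Clip_analysis(fc, boundary, "in_memory/clipped")
    arcpy.Project_management(clipped, out_folder + "\\" + fc, target_sr)
    print("Finished " + fc)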

Once the clipping and projecting was done the DEM was reclassified into a discrete data set with several categories of elevations, then converted from a raster to a polygon for those areas less than 6 feet.  A couple fields were added to the streets layer in order to determine which streets were likely to flood and create the best routes throughout the city and around the flooded areas.

Another tricky step was editing and exporting the metadata.  Some of the datasets had complete metadata, while others had only some, or none at all.  For those that needed data added, the export process worked just fine.  The datasets that didn't require added data transferred to the new folder, but the metadata didn't transfer with them.  In order to get the files to transfer with their metadata I had to make and save some small change in each of the files.

The map was saved to a Map Package and zipped with the Metadata folder.

Friday, August 5, 2016

A Residential GIS Location Study

Applications in GIS Final Project

Well, it's easy to see why this project usually gets three weeks during a normal semester.  The project was to produce a residential location study using GIS to help a woman and her daughter get started on house hunting in St. Augustine, FL.  The analysis involved two proximity analyses, three calculated analyses, and two weighted analyses, one equal and one favored.  With the completion of the analysis, recommendations were made on three tracts that best fit the criteria they had set.

To present the data a Power Point presentation was made.  It is located at this site:

http://students.uwf.edu/mr80/AppsFinalProject_MR.pptx

Wednesday, August 3, 2016

Sharing Tools

GIS Programming - Mod 11

Our final assignment for this course was Sharing Tools.  We started with a script that was already written and a tool that was already made.  The script had a couple of paths that were hard coded to non-existent files or locations, so our first step was changing those paths to parameters defined by the tool.  To do that we used sys.argv[], which, like GetParameterAsText(), always returns string objects.  With the variables set to pull the paths from the tool parameters, the tool ran correctly, added random points, and put buffers around them, so we moved on to cleaning up the tool a little bit.
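
The swap looks something like this hedged sketch; the tool in the lab created random points and buffered them, and the parameter order and names here are hypothetical:

# A sketch of reading script tool parameters with sys.argv instead of
# hard-coded paths.  sys.argv[0] is the script itself, so user parameters
# start at index 1, and every value arrives as a string.
import sys
import arcpy

out_workspace = sys.argv[1]    # folder or geodatabase for the outputs
out_name = sys.argv[2]         # name for the random points feature class
num_points = sys.argv[3]       # number of points, e.g. "50"
buffer_dist = sys.argv[4]      # buffer distance, e.g. "500 Meters"
buffer_fc = sys.argv[5]        # output buffer feature class

arcpy.env.overwriteOutput = True

# Create the random points, then buffer them.
points = arcpy.CreateRandomPoints_management(out_workspace, out_name,
                                             "", "", int(num_points))
arcpy.Buffer_analysis(points, buffer_fc, buffer_dist)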

The dialog box has a help window that explains the purpose of the tool and the requirements for all the parameters, but it doesn't just magically appear with the creation of a tool, so we entered it by editing the Item Description in ArcCatalog.  Since this tool was a combination of two existing tools, Create Random Points and Buffer, the explanations could pretty much be copied from those tools with just a little tweaking.

Once the script and the tool were both cleaned up and grade A professional looking the script was imported to the tool so the tool could more easily be shared, requiring only the passing of the toolbox.  And to make sure nobody could sneak in and sabotage all my hard work on this my last GIS Programming assignment, I password protected it with a diabolically clever password nobody would ever figure out in a million gazillion years.

While I really enjoyed the entire course, I think what I enjoyed most was learning how to create custom tools.  It frustrates me to use a program and always wonder how it knows what to require of me and what to do with the information I give it.  Creating custom tools gave me a behind the scenes, behind the screen, view of what is going on.  Now I have a sense of what goes into the making of dialog boxes and what the limitations and possibilities can be.  I think that is something that can come in very handy in the future, especially for making GIS more usable to others who don’t have time to learn the program.  They only have to learn how to enter the data required, and they’re making their own maps.

Another thing I really enjoyed was debugging.  I just don’t feel I have a deep enough knowledge yet to really be very successful at it.  Hopefully once I get back to work I’ll have the opportunity to use Python and get to really dig deep into it. 

Wednesday, July 27, 2016

Creating Custom Tools

GIS Programming - Mod 10

This week we modified a script to create a custom tool and a toolbox.  We modified a stand-alone script to create a tool that performs the clip function on multiple features based on a single clip boundary file.  We did this by adding the new script to the new toolbox we created, adding a description, and selecting the "Store relative path names" option.  Next we set the tool parameters and the options that go along with them.
Once the tool was prepared we went back into the script to make the adjustments required to take it from a stand-alone script to one that would work within ArcMap.  To do this we had to replace the hard-coded values in the script with parameters passed by the tool, using arcpy.GetParameter() with the appropriate index value for each variable, and replace the print statements with arcpy.AddMessage() statements in a for loop.

To do this we followed these basic steps (a short before-and-after sketch follows the list):
1.  Create a new toolbox in a location where it can be selected along with the script.
2.  Add the stand-alone script to the toolbox.
3.  Add a Description and select the "Store relative path names" option.
4.  Select the script as the Script File.
5.  Run to make sure it works.
6.  Enter the parameters and their options in the Properties box of the tool.
7.  Update the script to replace hard-coded values with arcpy.GetParameter(#), replacing # with the appropriate index value for that particular parameter.
8.  Update the script to change print statements to arcpy.AddMessage statements, making sure everything to be printed is inside the parentheses and any commas are changed to plus signs.
9.  Any variable that needs to pass a string and uses the result of the arcpy.GetParameter function must be converted to a string.
10.  Verify the "Store relative path names" option is checked in the tool's Properties, then select the script and the toolbox and compress them to a zipped folder.
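
Steps 7 through 9 are the heart of the conversion.  Below is a hedged before-and-after sketch with hypothetical file names; whether GetParameter or GetParameterAsText is the better fit depends on the parameter type:

# Before: a hard-coded stand-alone version (hypothetical names).
#
#   import arcpy
#   arcpy.env.workspace = r"C:\GIS\Mod10"
#   clip_boundary = "boundary.shp"
#   features = ["rivers.shp", "roads.shp", "cities.shp"]
#   for fc in features:
#       arcpy.Clip_analysis(fc, clip_boundary, "clip_" + fc)
#       print "Clipped " + fc
#
# After: the same logic reading its inputs from the tool parameters.
import arcpy

arcpy.env.workspace = arcpy.GetParameterAsText(0)   # workspace parameter
clip_boundary = arcpy.GetParameterAsText(1)         # clip boundary parameter
features = arcpy.GetParameterAsText(2).split(";")   # multivalue parameter arrives as one semicolon-delimited string

for fc in features:
    out_name = "clip_" + fc
    arcpy.Clip_analysis(fc, clip_boundary, out_name)
    arcpy.AddMessage("Clipped " + fc)                # shows up in the tool's progress dialog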

Wednesday, July 20, 2016

Working With Rasters

GIS Programming - Mod 9

The assignment this week was to write a script that would create a raster output identifying certain areas with a particular set of parameters.  Those parameters were: slope, aspect, and land cover type.  We accomplished this by piecing together many of the scripts we created in our exercise, using multiple modules and functions and the one method available to the raster object, .save.

As usual we started by importing our modules and classes and setting our environments.  We had the option of creating a geodatabase to save our final raster to or just saving it to a file.  I chose to create a geodatabase because I wanted to refresh myself on the process since I think it's something I will use often.

There are different tools available in ArcMap depending on the licensing agreement you have.  Many of the tools we use are Spatial Analyst tools, which require a higher level of licensing than the basic ArcMap package.  The script we wrote was mostly contained within an if statement that would make sure the Spatial Analyst extension was available before running, with an else branch to print a message letting the user know if the license wasn't available.  In between, we checked out the license to use the Spatial Analyst tools, remapped and reclassified the land cover raster, recalculated the slope and aspect of the elevation raster, combined all five temporary rasters, and then used the .save method to save the result to a permanent file in the geodatabase.  Once done with all of that the Spatial Analyst extension could be checked back in and the script ended.
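
A hedged outline of that structure; the land cover codes, slope and aspect ranges, and dataset names below are hypothetical placeholders, not the lab's actual criteria:

# A sketch of the license check / reclassify / combine / save pattern.
# All thresholds, class codes, and names are hypothetical.
import arcpy
from arcpy.sa import Reclassify, RemapRange, Slope, Aspect

arcpy.env.workspace = r"C:\GIS\Mod9\results.gdb"     # hypothetical geodatabase
arcpy.env.overwriteOutput = True

if arcpy.CheckExtension("Spatial") == "Available":
    arcpy.CheckOutExtension("Spatial")

    # Reclassify land cover: forest codes (assumed 41-43) become 1, all else NoData.
    good_cover = Reclassify("landcover", "Value",
                            RemapRange([[41, 43, 1]]), "NODATA")

    # Slope and aspect from the elevation raster.
    slope = Slope("elevation", "DEGREE")
    aspect = Aspect("elevation")

    # Keep gentle slopes (5-20 degrees) facing roughly south (150-270 degrees).
    good_slope = (slope >= 5) & (slope <= 20)
    good_aspect = (aspect >= 150) & (aspect <= 270)

    # Cells meeting all three criteria get a value of 1.
    suitable = good_cover * good_slope * good_aspect
    suitable.save("suitable_areas")

    arcpy.CheckInExtension("Spatial")
else:
    print("Spatial Analyst license is not available.")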

This assignment was pretty straightforward.  Most of the steps were defined in the exercise, so it was just a matter of going back through that to see what we had done.  The one thing that caused me problems was that I had copied a portion of the script exactly as it appeared in the exercise, even though there were a few things about it I didn't understand.  When my script was complete and I went to ArcMap to see what I had done, I found I had large patches of land that had no color.  I went back through my script several times looking for the culprit, searched the discussion groups, and consulted with classmates to try to determine the problem.  Eventually it was discovered that one of the parameters in the reclassifying step, "NODATA", which had never been explained, was what was causing the problem.  Once I took that out everything came out as it was supposed to.  And then hours later I found out the way it originally was, was how it was actually supposed to be, so I had to go back through everything and fix it all back to the way I originally had it.

To help keep track of where the script was in the process of running, a print statement was put in prior to each step.  This not only helped to keep track of where the script was in the process, it was also a way to trace back where a problem was if the script failed.  These results show the progress of the script, right to the successful end.  It is a very useful step in the process of running the script, but very frustrating to deal with when creating the flow chart, since it adds so many extra elements to the chart.  I often wonder if a generic note on the bottom of the page stating each step was preceded by a print statement would be good enough, but never think to ask.

Sunday, July 17, 2016

Urban Planning - GIS for Local Government

Applications in GIS - Participation Assignment


The web address for the property appraiser of Walton County is www.waltonpa.com.  On this site I was able to discover the highest property sold for the month of June, 2016 was $10,200,000.  Um, let me spell that out just so we're clear...Ten million, two hundred thousand dollars.  That is what was paid on June 10, 2016 for this single family home of 4 bedrooms, 4 baths.  It previously sold for $7,200,000 in October of 2006.

Legal Information:  LOT 1 BLK 19 ROSEMARY BEACH PH3 PB 13-2 IN OR 1973-61 OR 2072-84 OR 2623-810 OR 2733-3116 OR 3012-3183 

The legal description shown here may be condensed for assessment purposes. Exact description should be obtained from the recorded deed.

The assessed land value for this property is $4,856,300.  This site didn't offer the ability to check the assessed land value from the last sale, only the total price.

This two story house was built in 2000 in Rosemary Beach, Walton County, Florida.  It is 6,200 square feet, which is a pretty fair sized house, but it is the location that jacks the price up so high.  


The second part of this assignment was to assess the values of parcels in a subdivision named West Ridge Place.  One of the parcels, 090310165, looks like it should have its value lowered $6,175 to match the assessment of the lots around it.  Lots 090310320 and 090310325 should have their values raised $2,137 to match those lots of the same size around them.  The other three lots in the $24,938 range should stay the same, even though they are comparable in size to others around them, because they have easements on them.




Urban Planning - GIS for Local Government

Applications in GIS - Mod 9

Local governments worldwide use zoning as a method of land use planning.  The first scenario of this week's exercise had us viewing property and obtaining data from the Marion County Appraiser's website to create a site map of the parcel of a client who wanted to find out what impacts, if any, a fly-in community would have on adjacent parcels.  To do this we learned about different types of zoning codes and their meanings and how to download a table of parcel data to a GIS file.  We then joined that table to our parcels feature class to access the data.  Next we did some editing to capture only those parcels we were interested in.
With our selection set we moved on to Data Driven Pages, an awesome tool that allows you to generate a series of maps using a single layout applied to numerous map extents.  To do this we created a grid index that covered our entire area of interest, then used this grid as an intersect with our zoning feature class to create a zoning index for the area of interest, symbolized according to zone codes.

Next we followed some seemingly convoluted steps to create a locator map.  We copied our index grid, Index2400, to another feature class with a different name, Locator_Mask, then copied that and renamed the copy Locator_Mask Current Page.  So in the end we had three identical layers with different names, all in the locator map data frame.  Which makes no sense at all, until you manipulate each of them to do something different.  With Data Driven Pages enabled we were able to set up the Locator_Mask Current Page layer to highlight within the locator map the current page being displayed in the extents.  Locator_Mask masked the rest of the grid, and the original, Index2400, was symbolized to outline each page in white and label the pages.  So Data Driven Pages allowed the locator map to match the portion of the map being displayed in the extents, and using Dynamic Text allowed the page number and the name of the index grid to update as you page through the file.

This setup allowed me to add all the standard map elements just once and yet produce 16 different maps.  I did this quite simply by exporting my map to a PDF file, specifying that all pages be exported to a single file.  For the last step in this portion we generated a report using the Create Report option in the attribute table and exported that to a PDF file as well.

The second scenario had us doing a lot of editing.  We had to merge a couple parcels and update their attribute information, then split a portion out of that to create a new parcel and update its attribute information.  This was actually a lot of fun because it included drawing lines.  Next we had to do some data manipulation and searches to report on suitable sites for a new Extension office to be built in Gulf County.  The requirements were that the land had to be county owned, 20 acres or larger, and vacant.  Gathering this data required using the Query Builder to limit the records to those with acreage equal to or greater than 20, and joining tables.  Again, we used the Create Report feature of the attribute table to report our results.

Wednesday, July 13, 2016

You Can't Get There From Here

GIS Programming - Participation Assignment #2

You can’t get there from here.  It’s a comment frequently heard.  Often it’s said with a grin or a smirk, sometimes with chagrin.  But when it’s a matter of life or death or the loss of all you own, it’s not so funny.  Narrow streets and curvy roads can be a serious impediment for large emergency vehicles like fire engines.  Even wider streets with cars parked on both sides can be difficult to navigate.  These are issues that concern fire departments around the world.  How you get there from here could save a life, or lose it.

The article I found on sciencedirect.com, published in Procedia Social and Behavioral Sciences from the 11th International Conference of the International Institute for Infrastructure Resilience and Reconstruction (I3R2): Complex Disasters and Disaster Risk Management, titled “Models and Applications of Firefighting Vulnerability,” discusses the many investigations researchers have done into the use of GIS in firefighting, which weren’t comprehensive enough, so another study was called for.

This study focused on the scope of firefighting models with regard to applications in the system, and on showing how much more GIS can be used in new information technology environments with state-of-the-art devices and more in-depth information from fire scenes.  The four models were based on fire vulnerability types, considering firefighting vulnerabilities and critical factors through the different stages of a fire, from the notification at the fire station, to the fire scene, through the firefight, and back to the station.  The study was conducted in four cities in South Korea.

This is the link for the article if you’d like to read it in detail:

Tuesday, July 12, 2016

Working With Geometries

GIS Programming - Mod 8

This week, in working with geometries, we created a text file and wrote lines of data to it from a rivers polyline shapefile.  The text file is the result.  Each segment of line for each object is listed individually.  The first number is the object ID; the second number is a vertex ID showing which segment of the object it belongs to.  Next are the XY coordinates, and last is the river name.  This data was derived using a SearchCursor and a couple of for loops, then written to the text file with a .write() statement.
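
A hedged sketch of that pattern; the shapefile path, the NAME field, and the output file are hypothetical stand-ins for the lab data:

# A sketch of walking a polyline shapefile's vertices with a SearchCursor
# and writing one line per vertex to a text file.  Paths and the NAME field
# are hypothetical.
import arcpy

rivers = r"C:\GIS\Mod8\rivers.shp"       # hypothetical input shapefile
output = open(r"C:\GIS\Mod8\rivers.txt", "w")

with arcpy.da.SearchCursor(rivers, ["OID@", "SHAPE@", "NAME"]) as cursor:
    for oid, shape, name in cursor:
        vertex_id = 0
        for part in shape:               # each part is an array of points
            for point in part:
                vertex_id += 1
                output.write("{0} {1} {2} {3} {4}\n".format(
                    oid, vertex_id, point.X, point.Y, name))

output.close()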

Also as part of the assignment was writing the pseudocode and a flowchart for the script.



Sunday, July 10, 2016

Location Decisions

Applications in GIS - Mod 8

This week's assignment was to make a recommendation on places to live for a couple: a doctor who will be working at the North Florida Regional Medical Center, and her husband, who will be teaching at the University of Florida, in Alachua County, Florida.  The criteria set by the couple were nearness to work for both of them, a neighborhood with a high percentage of people 40 to 49 years old, and a neighborhood with high house values.  This map was created with four data frames, one for each of the criteria.  For the distance frames we ran the proximity analysis tool Euclidean Distance to generate a raster showing distances out from the points of interest in 3 mile increments.  To allow for easier interpretation and analysis we then reclassified them.  Next we added fields to our census tract attribute table to calculate the percentage of people 40-49 and of homeowners, then converted these feature classes to rasters and reclassified them so they could be compared to the distance rasters, and recommended a few locations for each criterion.
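
The distance frames boil down to two Spatial Analyst calls; a hedged sketch with hypothetical names, rank values, and 3-mile bands expressed in meters:

# A sketch of the distance-frame workflow: Euclidean Distance from a point
# of interest, then a reclassify into ranked distance bands.  Paths, ranks,
# and the cell size are hypothetical.
import arcpy
from arcpy.sa import EucDistance, Reclassify, RemapRange

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Alachua.gdb"     # hypothetical workspace
arcpy.env.overwriteOutput = True

miles = 1609.34                                  # meters per mile

# Straight-line distance surface out from the hospital point layer.
dist = EucDistance("hospital_point", cell_size=30)

# Reclassify into ranked 3-mile bands (closer gets a higher score).
remap = RemapRange([[0, 3 * miles, 9],
                    [3 * miles, 6 * miles, 6],
                    [6 * miles, 9 * miles, 3],
                    [9 * miles, 100 * miles, 1]])
ranked = Reclassify(dist, "Value", remap)
ranked.save("dist_hospital_rank")

arcpy.CheckInExtension("Spatial")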

The next map is where our comparisons were made.  We added our four reclassified layers to the map, then created a model with them as the input to the Spatial Analyst Weighted Overlay tool.  For the first data frame we gave each of the criteria equal weight, but for the second we responded to the clients' request to focus more on proximity to workplaces and restricted some of the scale values in the distance layers, placing them within the least favorable values so they wouldn't be influenced by the weight of the other criteria.  Next the percent of influence was adjusted to give more weight to the distances and less to the population and home ownership.  With this new information we were then able to recommend a few areas evenly spaced between the two locations, with the rest of the criteria accounted for.




Sunday, July 3, 2016

Homeland Security - MEDS Protect

Applications in GIS - Mod 7

Critical infrastructure includes not only those infrastructures vital to our defense and security, but also the people, places and things necessary for everyday living and working.  Certain key resources, such as large gathering places, may be deemed critical at certain times, when circumstances can cause them to become a target.  One such example of this is the Boston Marathon bombing of 2013.

Our lesson this week took us back to that incident as though we were preparing for it.  We started with the MEDS data we had put together for last week's assignment.  The Boston Marathon was determined to be at risk, so security needed to be stepped up.  We were tasked with identifying critical infrastructure in the vicinity of the marathon finish line and implementing protective measures, such as securing the perimeter of a protective buffer zone around targeted sites by locating surveillance posts at ingress and egress points, and siting the most optimal observation positions in and around the event site.

We started with the Buffer tool to create a 3 mile buffer around the finish line.  Our next step was to locate those key resources within this buffer zone that could potentially be secondary targets.  For this we used our GNIS (Geographic Names Information System) layer to run a select by location query to isolate those features that fell within our zone of interest.  For this lesson we focused our attention on hospitals, selecting those nearest to the finish line using the Generate Near Table tool, then, of those, manually selecting the ones that met our criteria.  We then created a 500' buffer around those hospitals and the finish line to emphasize the areas that required stepped up security.  Extra security precautions were required at the finish line, so we used the Intersect tool to determine appropriate locations for security checkpoints along the ingress and egress routes at the border of our finish line buffer zone.
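
Scripted, the buffer, select-by-location, and near-table sequence might look like this hedged arcpy sketch; the feature class names are hypothetical:

# A sketch of the buffer, select-by-location, and near-table steps.
# Feature class names are hypothetical placeholders.
import arcpy

arcpy.env.workspace = r"C:\GIS\Boston.gdb"      # hypothetical workspace
arcpy.env.overwriteOutput = True

# 3-mile buffer around the finish line point.
arcpy.Buffer_analysis("finish_line", "finish_buffer_3mi", "3 Miles")

# Select GNIS features that fall inside the buffer and save them out.
arcpy.MakeFeatureLayer_management("gnis_points", "gnis_lyr")
arcpy.SelectLayerByLocation_management("gnis_lyr", "INTERSECT", "finish_buffer_3mi")
arcpy.CopyFeatures_management("gnis_lyr", "gnis_in_buffer")

# Distance from each hospital in the buffer to the finish line.
arcpy.GenerateNearTable_analysis("hospitals_in_buffer", "finish_line", "hospital_near_table")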

Safety and security at large events like the Boston Marathon depend upon surveillance.  Since the finish line for this event was considered to be a high target area, security needed to be heightened with the use of surveillance equipment providing line of sight coverage of the finish line for the entire block.  15 sites were chosen to provide the greatest amount of coverage without obstruction.  The height of the equipment ranges from 10 to 15 meters in elevation.  Determining the placement of these cameras started with converting LiDAR data to a raster, then generating a hillshade effect to see how the shadows would fall at the time and date of the event.  Next we created a new feature class of surveillance points strategically placed around the finish line to provide the most unobstructed view possible.  To verify the placement of these points we ran the Viewshed tool to show us any obstructed areas between the surveillance points and the finish line.  With this information we were able to determine which points needed adjustment, either through relocation or increased height.
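
A hedged sketch of the hillshade and viewshed checks; the LiDAR-derived surface, the observer layer name, and the sun position are hypothetical:

# A sketch of the hillshade and viewshed steps on a LiDAR-derived surface.
# Raster and feature class names, plus the sun azimuth and altitude, are
# hypothetical placeholders.
import arcpy
from arcpy.sa import Hillshade, Viewshed

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Boston.gdb"     # hypothetical workspace
arcpy.env.overwriteOutput = True

# Shadows for the date and time of the event (assumed sun position).
shade = Hillshade("lidar_surface", 135, 40, "SHADOWS")
shade.save("hillshade_event")

# Visible and not-visible cells as seen from the proposed surveillance points.
visibility = Viewshed("lidar_surface", "surveillance_points")
visibility.save("viewshed_check")

arcpy.CheckInExtension("Spatial")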

With that all set, we then used the Create Line of Sight tool, which allowed us to run a line from our surveillance points to the finish line, including the elevation of the points, enabling it to determine where there might be sections of obstructed view.  We created a graph of one of the lines that had an obstructed view, and the graph shows how that line of sight is obstructed.

It would have been nice to see how all this showed up in a 3D view, but I was never able to get the line of sight lines copied over to ArcScene to see it.

Wednesday, June 29, 2016

Exploring and Manipulating Spatial Data

GIS Programming - Mod 7

This week we created a geodatabase using the CreateFileGDB_management function, then created a list of the shapefiles in our Data folder and copied them to our geodatabase using the CopyFeatures_management function.

With our geodatabase set, we then created a SearchCursor for the city names and populations of those cities within the cities feature class that were county seats in the state of New Mexico.  This was the only real problem I had with the assignment.  I was having a little difficulty figuring out how I was going to populate the dictionary two steps ahead, but I decided not to worry about that step until after I finished this one.  In the exercises we performed a SearchCursor and it was pretty straightforward, but we were only looking to search the attributes of one field in the table, whereas for the assignment we were dealing with two fields.  It took me a while to figure out that the SearchCursor was basically going to build a list of the two fields based on the attributes of a third field.  Once I realized that I was able to figure out my syntax and move on.  I created an empty dictionary, then populated it within a for loop by pulling the fields from the list I created with the SearchCursor and updating the dictionary with them.
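
The pattern that finally clicked looks something like this hedged sketch; the field names, query, and feature class are hypothetical stand-ins for the lab data:

# A sketch of building a dictionary of county seat names and populations
# from a SearchCursor.  Field names, the query, and the feature class are
# hypothetical.
import arcpy

arcpy.env.workspace = r"C:\GIS\Mod7\NewMexico.gdb"   # hypothetical geodatabase

county_seats = {}
where = "FEATURE = 'County Seat'"                    # hypothetical query field and value

with arcpy.da.SearchCursor("cities", ["NAME", "POP_2000"], where) as cursor:
    for name, population in cursor:
        county_seats[name] = population              # city name -> population

print(county_seats)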

Getting all this down in the flowchart was a bit of a chore.  Not so much the different steps, but adding all the print statements was kind of a pain since we had to add one before and after each step.  I ended up combining the ones that came between steps and just hope that will be good enough.