Wednesday, June 29, 2016

Exploring and Manipulating Spatial Data

GIS Programming - Mod 7

This week we created a geodatabase using the CreateFileGDB_management function, then created a list of shapefiles in our Data folder and copied them to our geodatabase using the CopyFeatures_management function.
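A minimal sketch of those two steps is below; the folder and geodatabase names are placeholders, not the actual paths from the lab.

```python
import arcpy
import os

# Placeholder paths -- adjust to the actual Data and Results folders
data_folder = r"C:\Module7\Data"
gdb_folder = r"C:\Module7\Results"
gdb_name = "module7.gdb"

arcpy.env.workspace = data_folder
arcpy.env.overwriteOutput = True

# Create the file geodatabase
arcpy.CreateFileGDB_management(gdb_folder, gdb_name)

# List every shapefile in the Data folder and copy it into the geodatabase
for shp in arcpy.ListFeatureClasses():
    out_name = os.path.splitext(shp)[0]          # drop the .shp extension
    out_fc = os.path.join(gdb_folder, gdb_name, out_name)
    arcpy.CopyFeatures_management(shp, out_fc)
```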

With our geodatabase set, we then created a SearchCursor for the names and populations of the cities in the cities feature class that were county seats in the state of New Mexico.  This was the only real problem I had with the assignment.  I was having a little difficulty figuring out how I was going to populate the dictionary two steps ahead, but I decided not to worry about that until I finished this step.  In the exercises we performed a SearchCursor and it was pretty straightforward, but there we were only searching the attributes of one field in the table, whereas for the assignment we were dealing with two fields.  It took me a while to figure out that the SearchCursor was basically going to build a list of rows holding those two fields, filtered on the attributes of a third field.  Once I realized that, I was able to figure out my syntax and move on.  I created an empty dictionary, then populated it within a for loop by pulling the fields from the rows returned by the SearchCursor and updating the dictionary with them.
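Boiled down, the cursor-and-dictionary part looks something like this; the path, field names, and query value are placeholders rather than the exact ones from the cities feature class.

```python
import arcpy

cities_fc = r"C:\Module7\Results\module7.gdb\cities"   # placeholder path

# Placeholder field names: city name, population, and a query on the field
# that flags county seats
fields = ["NAME", "POP_2000"]
where_clause = "FEATURE = 'County Seat'"

county_seats = {}
with arcpy.da.SearchCursor(cities_fc, fields, where_clause) as cursor:
    for row in cursor:
        # row[0] is the city name, row[1] its population
        county_seats[row[0]] = row[1]

print(county_seats)
```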

Getting all this down in the flowchart was a bit of a chore.  Not so much the different steps, but adding all the print statements was kind of a pain since we had to add one before and after each step.  I ended up combining the ones that came between steps and just hope that will be good enough.

Saturday, June 25, 2016

Homeland Security - Prepare Minimum Essential Data Sets

Applications in GIS - Mod 6

The Department of Homeland Security, recognizing the importance of geospatial information and technology in securing our nation, has developed the DHS Geospatial Data Model (DHS GDM) in support of urgent DHS mission requirements. This standards-based logical data model is to be used to collect, discover, store and share homeland security geospatial data.

During a crisis, emergency operations personnel depend on a comprehensive geospatial database that is prepared, ready, and available for immediate use.  To this end, a series of Minimum Essential Data Sets (MEDS) for urban areas and large-area spatial extents was established as vital to successful homeland security operations.  The MEDS layer configuration is a standardized file and directory structure that enables interoperable and efficient data analysis in times of crisis.  There are eight MEDS.

Boundaries are used to define the geographic area of interest.  Urban areas are defined as Tier 1 and Tier 2; Tier 1 Urban Areas are the largest, most populated metropolitan centers in the country.  These boundaries are set by mapping the extent of the combined entities in the metropolitan centers plus a 10-mile buffer beyond.

Elevation is represented by digital elevation models (DEMs), or digital terrain models (DTMs).  These are representations of continuous elevation values over a topographic surface, typically used to present terrain relief.  They are used to make 3D graphics that display terrain slope, aspect, and profiles between selected points.

Geographic Names data from the Geographic Names Information System (GNIS), developed by the US Geological Survey (USGS), contain information about physical and cultural geographic features in the US and associated areas.  This database contains federally recognized names of features and defines each location by state, county, USGS topographic map, and geographic coordinates.

Hydrology includes information about surface water features such as lakes, ponds, streams, rivers, springs and wells and other hydrologic areas that have great significance to homeland security such as spillways and dam weirs.  These critical infrastructures that control the flow and direction of flood waters require close surveillance and protection from possible attack.

Land Cover data sets are important to homeland security planning and operations.  They provide detailed information about land uses and ground conditions which assist in preventing and mitigating the impacts of a catastrophic event.

Orthoimagery is aerial photography that has had distortion and relief displacement removed so ground features are displayed in their true position.  This type of imagery allows direct measurements to be made while providing a realistic visualization of the landscape.  This ability for first responders to view geographic features that may not be mapped can reduce response times considerably and save lives.

Transportation includes roads, railroads, airports, ferry crossings and more.  Each plays a critical role in our lives and as such is an attractive target to terrorists.  Transportation systems are used not only to move people from place to place and consumer goods to distribution centers, but also to transport highly toxic and volatile substances.  For this reason they pose a significant public safety and security threat that must constantly be guarded against.

Structures consist of significant buildings and facilities that are focal meeting places for large groups of people or that house critical infrastructure everyday operations depend on heavily.  This could include government offices, police stations, power lines and substations, sports arenas and entertainment centers, houses of worship, schools, hospitals, and more.

This week's lesson was to put together a MEDS package.  While we didn't have time to download all the data ourselves, we did have to organize the data into group layers and manipulate some of it so it could be analyzed.

We started by creating group layers for each of the MEDS except Structures and adding to them the datasets that had been downloaded for us.  Next we manipulated the transportation data by joining a table of Census Feature Class Codes (CFCC) to our roads layer.  We then used this code to create separate local, secondary, and primary roads layers we could symbolize individually.  The Land Cover needed to be clipped to the boundary area and symbolized according to the National Land Cover Database standards.  The Geographic Names file had all its data jumbled into one column, so we had to fix that before we could add it to the map and define its projection.  Once that was done we were able to do a Select by Location to create a dataset of just those points that fell within the boundary of our study area.  None of the other datasets required any manipulation other than being put into a group layer.  With all that done we were able to save all the group layers to layer files that could be shared.
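Although all of this was done through the ArcMap interface, the same data preparation could be scripted.  Here is a rough arcpy sketch of a few of the steps, with placeholder file, layer, and field names standing in for the actual lab data.

```python
import arcpy

arcpy.env.workspace = r"C:\MEDS\Data"   # placeholder folder
arcpy.env.overwriteOutput = True

# Join the CFCC code table to the roads layer (layer, table, and field
# names are placeholders)
arcpy.MakeFeatureLayer_management("roads.shp", "roads_lyr")
arcpy.AddJoin_management("roads_lyr", "CFCC", "cfcc_codes.dbf", "CFCC")

# Clip the land cover raster to the study-area boundary
arcpy.Clip_management("landcover.img", "#", "landcover_clip.img",
                      "boundary.shp", "#", "ClippingGeometry")

# Keep only the geographic-names points that fall inside the boundary
arcpy.MakeFeatureLayer_management("gnis_points.shp", "gnis_lyr")
arcpy.SelectLayerByLocation_management("gnis_lyr", "WITHIN", "boundary.shp")
arcpy.CopyFeatures_management("gnis_lyr", "gnis_in_boundary.shp")
```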

Wednesday, June 22, 2016

Geoprocessing with Python

GIS Programming - Mod 6

Module 6 had us running three geoprocessing tools with a Python script.  AddXY added XY coordinates to a hospitals point shapefile, Buffer generated a 1000-meter buffer around the points, and Dissolve dissolved the buffer file into a single feature.




After each tool ran, I printed its messages with the GetMessages function to show that it had completed successfully.
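The heart of the script is just the three tool calls with a GetMessages print after each; a minimal sketch, with placeholder workspace and file names, would be:

```python
import arcpy

arcpy.env.workspace = r"C:\Module6\Data"   # placeholder workspace
arcpy.env.overwriteOutput = True

hospitals = "hospitals.shp"                # placeholder file name

# Add X and Y coordinate fields to the hospital points
arcpy.AddXY_management(hospitals)
print(arcpy.GetMessages())

# Buffer the points by 1000 meters
arcpy.Buffer_analysis(hospitals, "hospitals_buffer.shp", "1000 Meters")
print(arcpy.GetMessages())

# Dissolve the buffers into a single feature
arcpy.Dissolve_management("hospitals_buffer.shp", "hospitals_buffer_dissolve.shp")
print(arcpy.GetMessages())
```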

With that successfully done I created my flow chart and the assignment was done.




Sunday, June 19, 2016

Homeland Security - DC Crime Mapping

Applications in GIS - Mod 5

This week's lesson was in four parts that resulted in two maps.  We started with organizing and documenting our work by setting up our project folders, defining our Map Document Properties, including assigning our default geodatabase, and setting our environments.  

Part 2 involved prepping our first map for analysis by adding the datasets we would need, creating an Address Locator, and geocoding the addresses of the police stations in our area of interest.  During this process only one address came up unmatched, so rematching it shouldn't have taken very long, but I realized after completing the next step, symbolizing the main highways, that locating that unmatched address would have gone much more quickly if I had symbolized the highways first.  After that I took the time to read through the rest of the lab instructions again to make sure there wasn't anything else I would want to do in a different order.

Part 3 had us analyze crime in Washington DC and police stations by crime proximity.  We started by creating a graph of all the crimes in DC and the number of occurrences reported for each one.  Next we used the Multiple Ring Buffer tool to create buffers of 0.5, 1, and 2 miles around each police station.  This, together with the point features of our Crimes layer, let us see at a glance how well the police stations were located compared to where crimes were reported.  But we needed something better than a quick glance to run an analysis, so we did a spatial join that joined the crimes to the buffer zone in which they occurred.  We now had an attribute table associated with our buffers that allowed us to run queries.  With this we were able to determine the percentage of crimes reported in each buffer zone and see that most occurred within one mile of a police station.  The next analysis was to determine how many crimes each police station was nearest to, which we represented with graduated symbols for the police stations.
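For reference, the buffer and spatial join steps could be scripted along these lines; the workspace and layer names are placeholders, and the lab itself ran the tools through ArcToolbox.

```python
import arcpy

arcpy.env.workspace = r"C:\DC_Crime\DC_Crime.gdb"   # placeholder workspace
arcpy.env.overwriteOutput = True

# Rings of 0.5, 1, and 2 miles around each police station
arcpy.MultipleRingBuffer_analysis("police_stations", "station_buffers",
                                  [0.5, 1, 2], "Miles")

# Attach to each crime the buffer ring it falls inside
arcpy.SpatialJoin_analysis("crimes", "station_buffers", "crimes_with_ring",
                           "JOIN_ONE_TO_ONE", "KEEP_ALL", "", "WITHIN")
```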

The last step was to propose a location for a new police substation based on the results of our analysis.  Our analysis showed there were 127 reported crimes that occurred outside the 2-mile radius of any police station, mostly in three different areas.  One of those areas included a cluster of 73 crimes, so that seemed to me like the best place to locate a new police station.

Part 4 was creating density maps for three types of crime: burglaries, homicides, and sex abuse.  We did this using the Spatial Analyst tool Kernel Density.  I chose a search radius of 1500 for all three crimes and classified the results with natural breaks and 5 classes, so the most crimes per square mile occurred in the darkest areas.  This layer sat above a census block layer symbolized by population per square mile to show the density of the population, and beneath that was a layer of roads.  Setting up the transparency of the layers to show all the information was a bit difficult.  Getting a clearer picture of how the density of crimes compared to the density of population was easy enough with the Swipe tool from the Effects toolbar, but it was more difficult to make this information clear in a printout or image file that couldn't be manipulated.  Still, with a side-by-side comparison of the three crimes it is easier to see the population density showing through under the frames for homicides and sex abuse.  This showed that most burglaries tend to occur in the more densely populated areas, whereas homicides tend to occur on the outer edges of the most densely populated areas, and sex abuse falls between the two, in the mid-range of population density.
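A sketch of how the Kernel Density step could be run per offense type with arcpy's Spatial Analyst module is below; the crimes layer name, the OFFENSE field, and its values are assumptions for illustration.

```python
import arcpy
from arcpy.sa import KernelDensity

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\DC_Crime\DC_Crime.gdb"   # placeholder workspace
arcpy.env.overwriteOutput = True

# One density surface per offense type; field and value names are placeholders
for offense in ("BURGLARY", "HOMICIDE", "SEX ABUSE"):
    arcpy.MakeFeatureLayer_management("crimes", "crime_lyr",
                                      "OFFENSE = '{0}'".format(offense))
    density = KernelDensity("crime_lyr", "NONE", search_radius=1500)
    density.save("density_" + offense.replace(" ", "_").lower())

arcpy.CheckInExtension("Spatial")
```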

Tuesday, June 14, 2016

Geoprocessing in ArcGIS

GIS Programming - Mod 5


This week we learned about geoprocessing tools, models and scripts. We started out creating a new toolbox and then, within that, creating a model using ModelBuilder.  Our task was to clip the soils layer to the extent of the basin layer and remove from that selection the soils classified as "Not prime farmland."  To do this I dragged the soils and basin layers from the table of contents into the model along with the Clip tool from the Toolbox menu, added the parameters to the Clip tool, and ran the model.  As soon as I clicked run I realized I hadn't set my workspace and scratch workspace environments or adjusted the path for my Output Feature Class, so I had to delete the output from the geodatabase of a previous lesson, set the workspace and scratch workspace environments correctly, verify the Overwrite option was selected, then rerun the model.

With that portion working correctly I dragged Select into the model, added the output of my Clip as the input, defined the output, and created an SQL expression to select only the specified attributes to add to the selection set.  Using the results of the Select tool, I used Erase to remove those attributes from the Clip output.  Before running this step I right-clicked on the output and selected Add to Display so the output would show up in the table of contents and I could check the attribute table to make sure the selection worked correctly.  The attribute table confirmed I had done it right, so I exported the model as a Python script and moved on to the next part of the assignment.
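Translated out of ModelBuilder, the chain of tools amounts to something like the sketch below; the file names are placeholders and the SQL expression is an assumed form of the farmland-class query, not the exact one from the lab data.

```python
import arcpy

arcpy.env.workspace = r"C:\Module5\Data"   # placeholder workspace
arcpy.env.overwriteOutput = True

# Clip the soils layer to the basin extent
arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_clip.shp")

# Select the "Not prime farmland" polygons (field name is a placeholder)
arcpy.Select_analysis("soils_clip.shp", "soils_not_prime.shp",
                      "FARMLNDCL = 'Not prime farmland'")

# Erase those polygons from the clipped soils
arcpy.Erase_analysis("soils_clip.shp", "soils_not_prime.shp",
                     "soils_prime_only.shp")
```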

Exporting a model to Python allows the script to work within ArcMap, but for it to work outside of ArcMap as a stand-alone script a few things need to be changed.  Because the input files came directly from the map's table of contents, they didn't have a path associated with them; for the script to stand alone, the path needed to be added to those variables.  The other change was telling the script to overwrite the output, since that option had been set within the map workspace, not the model.
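In practice that means adding a few lines like these near the top of the exported script; the paths here are placeholders.

```python
import arcpy

# Environments the model previously inherited from the map document
arcpy.env.workspace = r"C:\Module5\Data"            # placeholder path
arcpy.env.scratchWorkspace = r"C:\Module5\Results"  # placeholder path
arcpy.env.overwriteOutput = True                    # set here instead of in the map

# Full paths replace the bare layer names taken from the table of contents
soils = r"C:\Module5\Data\soils.shp"
basin = r"C:\Module5\Data\basin.shp"
```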

 Now my script was able to stand alone as though I had written the whole thing from the beginning. From this script I was able to create a script tool within the toolbox.

Sunday, June 12, 2016

Natural Hazards - Hurricanes

Applications in GIS - Mod 4

In October of 2012 Hurricane Sandy hit the northeast coast of the United States with devastating consequences.  This week's lab was to create two different maps of this event: the first showing Hurricane Sandy's track from its inception to its end stage, and the second a damage assessment for just one side of one street impacted by the storm.

This map shows the path Hurricane Sandy took and the ten states and the District of Columbia that were affected.  We started by creating a polyline feature class from the points of the x,y values we added to the map from a table.  We next created a new symbol to represent the storm and used the Unique Values option in the Symbology category to display the different levels of the storm in different colors.  Using the Expression Window on the Labels tab we were able to display both the wind speed in miles per hour and the barometric pressure at each of our points, which provided a little more information than just the category of the storm at each location.  The last step was to add graticules to the map.  This was my first time working with graticules and it took a little getting used to.  I didn't want the lines to overwhelm my drawing, but there was a fine line between not overwhelming and not showing up, and I had to adjust the color of the lines a few times to get that right.  With that done it was just a matter of adding the labels for the states and countries, then the map elements, and it was done.
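For what it's worth, the point-to-line step could also be done in arcpy along these lines; the table, field, and output names are placeholders.

```python
import arcpy

arcpy.env.workspace = r"C:\Sandy\Sandy.gdb"   # placeholder workspace
arcpy.env.overwriteOutput = True

# Turn the table of storm positions into points, then connect them into a track.
# Table and field names are placeholders.
arcpy.MakeXYEventLayer_management("sandy_track_table", "Lon", "Lat", "sandy_pts_lyr")
arcpy.CopyFeatures_management("sandy_pts_lyr", "sandy_points")
arcpy.PointsToLine_management("sandy_points", "sandy_track")
```

The label expression itself is just a string built from the two fields, something along the lines of str([WIND_MPH]) + " mph, " + str([PRESSURE]) + " mb", with placeholder field names standing in for the actual ones in the track table.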

Next on the list was creating a map showing the before and after images of a block in Toms River, New Jersey and assessing the damage done to the homes on that block.  We started by creating a new geodatabase, a feature dataset, and a couple of raster datasets to store the data we'd use or create for this portion.  We also created an attribute domain, which allowed us to specify what kind of information would be acceptable in the attribute table of the feature class we were creating to identify the damage done to the homes in our study area.

Up to this point things were pretty simple.  The difficult part came in trying to determine how much damage had been done to each property and what caused that damage.  The "before Sandy" image was pretty bright and clear, with the houses all very well defined.  The "after Sandy" image, however, was darker and the details not nearly as easy to distinguish.  Plus it was left up to us to use our own best judgment, with no criteria for what constitutes major damage, minor damage, or what has just been affected.  Affected?  What does that mean?  That there was just some level of damage?  Or that nothing happened to this house, but the property value is now trashed because everything around it is destroyed?  So it was a guessing game I wasn't really comfortable playing about something so serious.

Looking at the houses from overhead it was difficult to tell what damage, if any, was done to them, other than the ones that were obviously destroyed or moved from where they had originally stood.  But one thing that was pretty clear was that the inundation was more likely to be the cause of damage than the wind.  Some houses appeared to have roof damage, but just about all of the houses seemed to have mud and debris pressed up against them, so I had to assume most of them had been damaged to some extent by that.  It was rather astonishing to me that the house closest to the coast had no visible damage in the image.  I'm sure it had to have sustained some damage from the inundation since it was hit first, but from overhead it would almost appear to have survived the storm unscathed.

Wednesday, June 8, 2016

The Disappearing West

GIS Programming - Participation Assignment #1


Every 2.5 minutes the American West loses a football field's worth of natural area to human development.  With less than 12% of land in the West protected from development and resource extraction, a team of scientists at the nonprofit Conservation Science Partners (CSP), including some of the top researchers in landscape ecology and conservation biology, analyzed nearly three dozen datasets, a dozen types of human activity, and more than a decade of satellite imagery for 11 western states to answer the questions: How fast is the United States losing natural areas in the West, and, most importantly, why?  Established methods were used to map the degree of human modification at high spatial resolution and to estimate the amount of natural area lost between 2001-2006 and 2006-2011.  The central dataset used in this project is the National Land Cover Dataset.

Between 2001 and 2011 approximately 4,300 square miles of natural areas disappeared because of development, a loss amounting to an area larger than Yellowstone National Park.  Wyoming and Utah were hit the hardest of the 11 states, with the largest percentage of area changed by human development over that decade.

Three quarters of the natural areas in the West that have disappeared were lost to development on private lands.  Many state lands, which are managed primarily for energy and timber extraction, also lost large areas to development.  Federal lands fared better than state lands in protecting natural areas; national parks, in particular, did the best among all ownership categories, with the lowest proportion of land converted to development.

GIS was instrumental in tracking and analyzing all this data.  The approach, data, and analytical methods used to estimate natural land loss in the western U.S. can be viewed here:  https://www.disappearingwest.org/methodology.pdf
For the full article on this project follow this link:  https://www.disappearingwest.org/

Tuesday, June 7, 2016

Debugging and Error Handling

GIS Programming - Mod 4

Debugging and error handling are crucial aspects of programming.  Mistakes happen, and when they do it's very important to know how to fix them.  Our assignment this week was to work on three scripts that had errors in them.  The first script was to print out the names of all the fields in the attribute table of a shapefile.  It had a couple of typos that needed to be fixed, then ran fine.

Script 1 Results


The second script was to print out the names of all the layers in the data frame of a map.  This one was a little more complex: we had to correct eight errors of different types, from incorrect paths to typos to arguments that needed to be added to or removed from statements.

Script 2 Results

The last script was the most difficult in that the concept behind it was harder to understand.  The intent of the script was to return an error statement for Part A and certain information from the data frame of a map for Part B.  Part B had no errors in it, but we had to get through Part A before we could get there.  Instead of correcting the errors in Part A, we had to use a try-except statement to allow the script to run through the error portion without failing while returning a relevant error message, then move on to the second portion of the script.  To accomplish this I ran the script to see where my first error was, then investigated it.  I saw I was missing an argument in a statement, so I placed my try before that line.  At this point I wasn't really sure how the try-except worked, so I tried putting the except after the for statement to see what that would do.  I ran the script again and came up with another error.  I realized at this point that nothing after setting mapdoc could work right, because everything was based on mapdoc being correct, so I moved my except to the end of Part A, added the second error type to its argument, and ran the script.  This time the script ran correctly, but I realized that by moving the except where I did I shouldn't be getting that second error any more, so I removed it, tried again, and it was good.
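Stripped of the assignment's specifics, the pattern ends up looking like this minimal sketch; the map document path is a placeholder, not the actual file from the lab.

```python
import arcpy

try:
    # Part A: everything that depends on the map document goes inside the try
    mapdoc = arcpy.mapping.MapDocument(r"C:\Module4\SomeMap.mxd")  # placeholder path
    for lyr in arcpy.mapping.ListLayers(mapdoc):
        print(lyr.name)
except Exception as e:
    # Report the problem instead of crashing, then fall through to Part B
    print("Error in Part A: " + str(e))

# Part B continues from here with its own, working map document...
```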

Script 3 Results

We also had to do a flow chart for one of the scripts.  I chose to do it for Script 1.

Flow Chart of Script 1



Saturday, June 4, 2016

Tsunami Evacuation

Applications in GIS - Mod 3

In March of 2011 the Great East Japan Earthquake occurred off the Pacific coast of northeastern Japan, causing devastating damage and extremely destructive tsunami waves, which caused even more damage.  This damage included the disabling of the reactor cooling systems of the Fukushima Daiichi Nuclear Power Plant, one of the largest nuclear power stations in the world, which led to nuclear radiation leaks.  This week's assignment was to map the radiation evacuation zones surrounding the Fukushima Daiichi Nuclear Power Plant, analyze the tsunami runup on the Fukushima coast for three zones, and determine the at-risk population locations within each of those zones.

We started the assignment by creating a geodatabase and datasets to store the data we'd be using and creating for the assignment.  Next we isolated the Fukushima II Power Plant for analysis to determine radiation evacuation zones using the Multiple Ring Buffer and Clip tools, then adjusted the symbology to create the different zones.  With the evacuation zones in place we ran queries to isolate the impacted cities and determine the population affected.

Our last portion was to determine the evacuation zones for the tsunami runup, estimate the at-risk cities within each, and identify the roads and nuclear power plants that would be inundated.  Again we worked from a buffer, this time a 10,000-meter buffer of the coastline.  The rest was accomplished through a model automating the Con, Raster to Polygon, Append, and Intersect tools to create the tsunami runup evacuation zones and clip out the roads, power plants, and cities that fell within those zones.
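A rough arcpy sketch of one pass through that model is below; the dataset names and the 10-meter elevation cutoff for the lowest runup zone are assumptions for illustration, not the lab's actual values.

```python
import arcpy
from arcpy.sa import Con, Raster

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\Tsunami\Tsunami.gdb"   # placeholder workspace
arcpy.env.overwriteOutput = True

# Keep elevation cells at or below the cutoff (assumed 10 m for illustration)
runup = Con(Raster("elevation_clip") <= 10, 1)
runup.save("runup_10m")

# Convert the runup raster to polygons and append them to the combined
# evacuation zones feature class (assumed to exist already)
arcpy.RasterToPolygon_conversion("runup_10m", "runup_10m_poly", "SIMPLIFY")
arcpy.Append_management("runup_10m_poly", "evacuation_zones", "NO_TEST")

# Pull out the cities, roads, and power plants that fall inside the zones
for fc in ("cities", "roads", "power_plants"):
    arcpy.Intersect_analysis([fc, "evacuation_zones"], fc + "_at_risk")

arcpy.CheckInExtension("Spatial")
```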