Wednesday, July 27, 2016

Creating Custom Tools

GIS Programming - Mod 10

This week we modified a script to create a custom tool and a toolbox. We took a stand-alone script and turned it into a tool that performs the Clip function on multiple features using a single clip boundary file. We did this by adding the script to the new toolbox we created, adding a description, and selecting the "Store relative path names" option. Next we set the tool parameters and the options that go along with them.
Once the tool was prepared we went back into the script to make the adjustments required to take it from a stand-alone script to one that would work within ArcMap. To do this we replaced the hard-coded values in the script with parameters passed by the tool, using arcpy.GetParameter() with the appropriate index value for each variable, and replaced the print statements with arcpy.AddMessage() statements in a for loop.

To do this we followed these basic steps:
1.       Create a new toolbox in a location where it can be selected along with the script.
2.       Add the stand-alone script to the toolbox.
3.       Add a Description and select the “Store relative path names” option.
4.       Select the script as the Script File.
5.       Run to make sure it works.
6.       Enter the parameters and their options in the Properties box of the tool.
7.       Update the script to replace hard-coded values with arcpy.GetParameter(#), replacing # with the appropriate index value for that particular parameter (see the sketch after this list).
8.       Update the script to change print statements to arcpy.AddMessage() statements, making sure everything to be printed is inside the parentheses and any commas separating items are changed to plus signs.
9.       Any result from arcpy.GetParameter() that needs to be used as a string must be converted with str().
10.   Verify “Store relative path names” option is checked in the tool’s Properties, then select the script and the toolbox and compress them to a zipped folder.
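
Here is a minimal sketch of how the updated script might look. The parameter order, feature class names, and output naming are assumptions for illustration, not the exact values from the assignment; arcpy.GetParameterAsText() is used here because it returns strings directly, so no str() conversion is needed.

import arcpy

# Hypothetical parameter order set in the tool's Properties dialog:
# 0 = workspace, 1 = clip boundary feature class, 2 = output folder
arcpy.env.workspace = arcpy.GetParameterAsText(0)
clip_boundary = arcpy.GetParameterAsText(1)
out_folder = arcpy.GetParameterAsText(2)

fc_list = arcpy.ListFeatureClasses()

for fc in fc_list:
    # baseName strips any file extension so the output name stays clean
    name = arcpy.Describe(fc).baseName
    out_fc = out_folder + "/" + name + "_clip.shp"
    arcpy.Clip_analysis(fc, clip_boundary, out_fc)
    # AddMessage() replaces print so the text shows up in the tool's dialog;
    # items are joined with + instead of commas
    arcpy.AddMessage("Clipped " + name + " to " + out_fc)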

Wednesday, July 20, 2016

Working With Rasters

GIS Programming - Mod 9

The assignment this week was to write a script to create a raster output identifying areas that met a particular set of criteria: slope, aspect, and land cover type. We accomplished this by piecing together many of the scripts we created in our exercise, using multiple modules, functions, and the one method available to the Raster object, .save().

As usual we started by importing our modules and classes and setting our environments.  We had the option of creating a geodatabase to save our final raster to or just saving it to a file.  I chose to create a geodatabase because I wanted to refresh myself on the process since I think it's something I will use often.

There are different tools available in ArcMap depending on the licensing agreement you have. Many of the tools we use are Spatial Analyst tools, which require a higher level of licensing than the basic ArcMap package. The script we wrote was mostly contained within an if statement that made sure the Spatial Analyst extension was available before running the analysis, with an else branch that printed a message letting the user know the license wasn't available.
In between those if and else statements we checked out the license to use the Spatial Analyst tools, remapped and reclassified the land cover raster, calculated the slope and aspect of the elevation raster, combined all five temporary rasters, and used the .save() method to write the result to a permanent raster in the geodatabase. Once all of that was done the Spatial Analyst extension could be checked back in and the script ended.
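
As a rough, simplified sketch of that structure (the paths, land cover codes, and slope/aspect ranges below are placeholders, not the assignment's actual values, and only three of the temporary rasters are shown):

import arcpy
from arcpy.sa import *

arcpy.env.workspace = r"C:\data\mod9"   # placeholder path
arcpy.env.overwriteOutput = True

if arcpy.CheckExtension("Spatial") == "Available":
    arcpy.CheckOutExtension("Spatial")

    print "Reclassifying land cover..."   # progress message before each step
    landcover = Reclassify("landcover.tif", "VALUE",
                           RemapValue([[41, 1], [42, 1], [43, 1]]), "NODATA")

    print "Calculating slope and aspect..."
    slope = Slope("elevation.tif", "DEGREE")
    aspect = Aspect("elevation.tif")

    print "Isolating suitable slope and aspect ranges..."
    good_slope = Con((slope >= 5) & (slope <= 20), 1)        # placeholder range
    good_aspect = Con((aspect >= 150) & (aspect <= 270), 1)  # placeholder range

    print "Combining temporary rasters..."
    final = landcover & good_slope & good_aspect

    print "Saving final raster..."
    final.save(r"C:\data\mod9\results.gdb\suitable_areas")

    arcpy.CheckInExtension("Spatial")
else:
    print "Spatial Analyst license is not available."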

This assignment was pretty straightforward. Most of the steps were defined in the exercise, so it was just a matter of going back through it to see what we had done. The one thing that caused me problems was that I had copied a portion of the script exactly as it appeared in the exercise, even though there were a few things about it I didn't understand. When my script was complete and I went to ArcMap to see what I had done, I found I had large patches of land that had no color. I went back through my script several times looking for the culprit, searched the discussion groups, and consulted with classmates to try to determine the problem. Eventually it was discovered that the "NODATA" parameter in the reclassifying step, which had never been explained, was what was causing the problem. Once I took it out everything came out as it was supposed to. And then, hours later, I found out that the way it originally was was how it was actually supposed to be, so I had to go back through everything and change it all back to the way I had it at first.

To help keep track of where the script was in the process of running, a print statement was placed before each step. This not only showed the script's progress, it was also a way to trace where a problem occurred if the script failed. These messages show the progress of the script right to the successful end. It is a very useful practice when running the script, but very frustrating to deal with when creating the flowchart, since it adds so many extra elements to the chart. I often wonder whether a generic note at the bottom of the page stating that each step was preceded by a print statement would be good enough, but I never think to ask.

Sunday, July 17, 2016

Urban Planning - GIS for Local Government

Applications in GIS - Participation Assignment


The web address for the property appraiser of Walton County is www.waltonpa.com. On this site I was able to discover that the highest-priced property sale for the month of June 2016 was $10,200,000. Um, let me spell that out just so we're clear... ten million, two hundred thousand dollars. That is what was paid on June 10, 2016 for a single-family home of 4 bedrooms and 4 baths. It previously sold for $7,200,000 in October of 2006.

Legal Information:  LOT 1 BLK 19 ROSEMARY BEACH PH3 PB 13-2 IN OR 1973-61 OR 2072-84 OR 2623-810 OR 2733-3116 OR 3012-3183

The legal description shown here may be condensed for assessment purposes. Exact description should be obtained from the recorded deed.

The assessed land value for this property is $4,856,300. This site didn't offer the ability to check the assessed land value from the last sale, only the total price.

This two-story house was built in 2000 in Rosemary Beach, Walton County, Florida. It is 6,200 square feet, which is a pretty fair-sized house, but it is the location that jacks the price up so high.


The second part of this assignment was to assess the values of parcels in a subdivision named West Ridge Place. One of the parcels, 090310165, looks like it should have its value lowered by $6,175 to match the assessment of the lots around it. Lots 090310320 and 090310325 should have their values raised by $2,137 to match the lots of the same size around them. The other three lots in the $24,938 range should stay the same, even though they are comparable in size to others around them, because they have easements on them.




Urban Planning - GIS for Local Government

Applications in GIS - Mod 9

Local governments worldwide use zoning as a method of land use planning. The first scenario of this week's exercise had us viewing property and obtaining data from the Marion County Appraiser's website to create a site map of the parcel of a client who wanted to find out what impacts, if any, a fly-in community would have on adjacent parcels. To do this we learned about different types of zoning codes and their meanings and how to download a table of parcel data to a GIS file. We then joined that table to our parcels feature class to access the data. Next we did some editing to capture only those parcels we were interested in.
With our selection set we moved on to Data Driven Pages, an awesome tool that allows you to generate a series of maps by applying a single layout to numerous map extents. To do this we created a grid index that covered our entire area of interest, then intersected that grid with our zoning feature class to create a zoning index for the area of interest, symbolized according to zone codes.

Next we followed some seemingly convoluted steps to create a locator map. We copied our index grid, Index2400, to another feature class with a different name, Locator_Mask, then copied that and renamed the copy Locator_Mask Current Page. So in the end we had three identical layers with different names, all in the locator map data frame. Which makes no sense at all, until you manipulate each of them to do something different. With Data Driven Pages enabled we were able to set up the Locator_Mask Current Page layer to highlight within the locator map the current page being displayed in the extent. Locator_Mask masked the rest of the grid, and the original, Index2400, was symbolized to outline each page in white and label the pages. So Data Driven Pages allowed the locator map to match the portion of the map being displayed in the extent, and using Dynamic Text allowed the page number and the name of the index grid to update as you page through the file.

This setup allowed me to add all the standard map elements just once and yet produce 16 different maps. I did this quite simply by exporting my map to a PDF file, specifying that all pages be exported to a single file. For the last step in this portion we generated a report using the Create Report option in the attribute table and exported that to a PDF file as well.
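
For what it's worth, that same single-file export can also be scripted. Here is a minimal sketch using arcpy.mapping (ArcGIS 10.x); the map document and output paths are placeholder assumptions.

import arcpy

# Placeholder paths; the real map document and output location would differ
mxd = arcpy.mapping.MapDocument(r"C:\data\zoning\MarionCounty.mxd")

# Export every Data Driven Page into one combined PDF
mxd.dataDrivenPages.exportToPDF(r"C:\data\zoning\ZoningMapBook.pdf", "ALL")

del mxd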

The second scenario had us doing a lot of editing. We had to merge a couple of parcels and update their attribute information, then split a portion out of that to create a new parcel and update its attribute information. This was actually a lot of fun because it included drawing lines. Next we had to do some data manipulation and searches to report on suitable sites for a new Extension office to be built in Gulf County. The requirements were that the land had to be county-owned, 20 acres or larger, and vacant. Gathering this data required using Query Builder to limit the records to parcels of 20 acres or more, and joining tables. Again, we used the Create Report feature of the attribute table to report our results.
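
A hedged sketch of that kind of attribute query in arcpy is below; the layer path and field names (ACRES, OWNER, VACANT) are assumptions, since the actual query was built in ArcMap's Query Builder.

import arcpy

# Hypothetical parcels feature class and field names, for illustration only
arcpy.MakeFeatureLayer_management(r"C:\data\gulf.gdb\parcels", "parcels_lyr")

where = "ACRES >= 20 AND OWNER = 'GULF COUNTY' AND VACANT = 'YES'"
arcpy.SelectLayerByAttribute_management("parcels_lyr", "NEW_SELECTION", where)

print "Suitable parcels found: " + str(arcpy.GetCount_management("parcels_lyr"))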

Wednesday, July 13, 2016

You Can't Get There From Here

GIS Programming - Participation Assignment #2

You can’t get there from here. It’s a comment frequently heard. Often it’s said with a grin or a smirk, sometimes with chagrin. But when it’s a matter of life or death or the loss of all you own, it’s not so funny. Narrow streets and curvy roads can be a serious impediment for large emergency vehicles like fire engines. Even wider streets with cars parked on both sides can be difficult to navigate. These are issues that concern fire departments around the world. How you get there from here could save a life, or lose it.

The article I found on sciencedirect.com, published in Procedia - Social and Behavioral Sciences from the 11th International Conference of the International Institute for Infrastructure Resilience and Reconstruction (I3R2): Complex Disasters and Disaster Risk Management, and titled “Models and Applications of Firefighting Vulnerability,” discusses the many investigations researchers have done into the use of GIS in firefighting, and argues that because they weren’t comprehensive enough, another study was called for.

This study focused on the scope of firefighting models with regard to applications in the system, showing how much more GIS can be used in new information technology environments with state-of-the-art devices and more in-depth information from fire scenes. The four models were based on fire vulnerability types, considering firefighting vulnerabilities and critical factors through the different stages of a fire, from notification at the fire station, to the fire scene, through the firefight, and back to the station. The study was conducted in four cities in South Korea.

This is the link for the article if you’d like to read it in detail:

Tuesday, July 12, 2016

Working With Geometries

GIS Programming - Mod 8

This week, in working with geometries, we created a text file and wrote lines of data to it from a rivers polyline shapefile. This text file is the result. Each vertex of each line object is listed individually. The first number is the object ID, the second number is a vertex ID showing which part of the object it belongs to, next are the X and Y coordinates, and last is the river name. This data was gathered using a SearchCursor and a couple of for loops, then written to the text file with a .write() statement.
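
A rough sketch of how that might look with arcpy.da.SearchCursor is below; the shapefile path, the NAME field, and the output file are placeholder assumptions.

import arcpy

rivers = r"C:\data\mod8\rivers.shp"                       # placeholder path
output = open(r"C:\data\mod8\rivers_vertices.txt", "w")   # placeholder path

# Loop through each river feature, each part, and each vertex,
# writing one line per vertex: OID, vertex ID, X, Y, river name
with arcpy.da.SearchCursor(rivers, ["OID@", "SHAPE@", "NAME"]) as cursor:
    for oid, shape, name in cursor:
        vertex_id = 0
        for part in shape:
            for point in part:
                vertex_id += 1
                output.write(str(oid) + " " + str(vertex_id) + " " +
                             str(point.X) + " " + str(point.Y) + " " +
                             str(name) + "\n")

output.close()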

Also as part of the assignment was writing the pseudocode and a flowchart for the script.



Sunday, July 10, 2016

Location Decisions

Applications in GIS - Mod 8

This week's assignment was to make a recommendation on places to live for a couple: a doctor who will be working at the North Florida Regional Medical Center, and her husband, who will be teaching at the University of Florida, in Alachua County, Florida.
 The criteria set by the couple were nearness to work for both of them, a neighborhood with a high percentage of people 40 to 49 years old, and a neighborhood with high house values. This map was created with four data frames, one for each of the criteria. For the distance frames we ran the proximity analysis tool Euclidean Distance to generate a raster showing distances out from the points of interest in 3-mile increments. To allow for easier interpretation and analysis we then reclassified them. Next we added fields to our census tract attribute table to calculate the percentage of people 40-49 and of homeowners, then converted these feature classes to rasters and reclassified them so they could be compared to the distance rasters, and recommended a few locations for each criterion.
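
A minimal sketch of the distance-and-reclassify step using arcpy's Spatial Analyst module is shown below; the input point feature class, the break values (3 miles is roughly 4,828 meters, assuming a meter-based coordinate system), and the class values are illustrative assumptions.

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\alachua.gdb"   # placeholder workspace

# Straight-line distance outward from the hospital point
dist_hospital = EucDistance("NFRMC_point")

# Reclassify the distances into a few classes so they can be compared
# with the other criteria rasters; break values are placeholders
dist_reclass = Reclassify(dist_hospital, "VALUE",
                          RemapRange([[0, 4828, 3],
                                      [4828, 9656, 2],
                                      [9656, 48280, 1]]))

dist_reclass.save("dist_hospital_reclass")
arcpy.CheckInExtension("Spatial")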

 The next map is where our comparisons were made. We added our four reclassified layers to the map, then created a model using them as inputs to the Spatial Analyst Weighted Overlay tool. For the first data frame we gave each of the criteria equal weight, but for the second we responded to the clients' request to focus more on proximity to workplaces and restricted some of the scale values in the distance layers, placing them within the least favorable values so they wouldn't be influenced by the weight of the other criteria. Next the percent of influence was adjusted to give more weight to the distances and less to the population and home ownership. With this new information we were then able to recommend a few areas evenly spaced between the two locations, with the rest of the criteria accounted for.
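
As a sketch of how that weighted overlay could be expressed in arcpy (the tool was actually run through a model in ArcMap), with placeholder raster names, weights, and remap values:

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\alachua.gdb"   # placeholder workspace

# Each row: reclassified raster, percent influence, value field, remap.
# Weights here lean toward the two distance layers, as the clients
# requested; all numbers are illustrative, not the assignment's values.
wo_table = WOTable([["dist_hospital_reclass", 30, "VALUE",
                     RemapValue([[1, 1], [2, 5], [3, 9]])],
                    ["dist_university_reclass", 30, "VALUE",
                     RemapValue([[1, 1], [2, 5], [3, 9]])],
                    ["pct_40_49_reclass", 20, "VALUE",
                     RemapValue([[1, 1], [2, 5], [3, 9]])],
                    ["home_value_reclass", 20, "VALUE",
                     RemapValue([[1, 1], [2, 5], [3, 9]])]],
                   [1, 9, 1])   # evaluation scale from 1 to 9 by 1

suitability = WeightedOverlay(wo_table)
suitability.save("weighted_suitability")
arcpy.CheckInExtension("Spatial")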




Sunday, July 3, 2016

Homeland Security - MEDS Protect

Applications in GIS - Mod 7

Critical infrastructure includes not only those infrastructures vital to our defense and security, but also the people, places, and things necessary for everyday living and working. Certain key resources, such as large gathering places, may be deemed critical at times when circumstances can cause them to become a target. One such example is the Boston Marathon bombing of 2013.

Our lesson this week took us back to that incident as though we were preparing for it. We started with the MEDS data we had put together for last week's assignment. The Boston Marathon was determined to be at risk, so security needed to be stepped up. We were tasked with identifying critical infrastructure in the vicinity of the marathon finish line and implementing protective measures, such as securing the perimeter of a protective buffer zone around targeted sites by locating surveillance posts at ingress and egress points and siting optimal observation positions in and around the event site.

We started with the Buffer tool to create a 3-mile buffer around the finish line. Our next step was to locate the key resources within this buffer zone that could potentially be secondary targets. For this we used our GNIS (Geographic Names Information System) layer to run a select by location query, isolating the features that fell within our zone of interest. For this lesson we focused our attention on hospitals, selecting those nearest to the finish line using the Generate Near Table tool and then, from those, manually selecting the ones that met our criteria. We then created a 500' buffer around those hospitals and the finish line to emphasize the areas that required stepped-up security. Extra security precautions were required at the finish line, so we used the Intersect tool to determine appropriate locations for security checkpoints along the ingress and egress routes at the border of our finish line buffer zone.
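
Roughly, those geoprocessing steps could be scripted like this; the distances mirror the write-up, but the workspace, feature class names, and layer names are assumptions.

import arcpy

arcpy.env.workspace = r"C:\data\boston.gdb"   # placeholder workspace

# 3-mile buffer around the finish line
arcpy.Buffer_analysis("finish_line", "finish_buffer_3mi", "3 Miles")

# Select GNIS features that fall inside the buffer zone
arcpy.MakeFeatureLayer_management("GNIS_points", "gnis_lyr")
arcpy.SelectLayerByLocation_management("gnis_lyr", "WITHIN",
                                       "finish_buffer_3mi")

# Distance from each selected feature to the finish line,
# for picking out the nearest hospitals
arcpy.GenerateNearTable_analysis("gnis_lyr", "finish_line", "near_table")

# 500-foot buffers to highlight the areas needing stepped-up security
arcpy.Buffer_analysis("selected_hospitals", "hospital_buffer_500ft", "500 Feet")
arcpy.Buffer_analysis("finish_line", "finish_buffer_500ft", "500 Feet")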

Safety and security at large events like the Boston Marathon depend upon surveillance. Since the finish line for this event was considered a high-value target area, security needed to be heightened with the use of surveillance equipment providing line-of-sight coverage of the finish line for the entire block. Fifteen sites were chosen to provide the greatest amount of coverage without obstruction. The height of the equipment ranges from 10 to 15 meters in elevation. Determining the placement of these cameras started with converting LiDAR data to a raster, then generating a hillshade effect to see how the shadows would fall at the time and date of the event. Next we created a new feature class of surveillance points strategically placed around the finish line to provide the most unobstructed view possible. To verify the placement of these points we ran the Viewshed tool to show us any obstructed areas between the surveillance points and the finish line. With this information we were able to determine which points needed adjustment, either through relocation or increased height.
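
A hedged sketch of the raster portion of that workflow is below (the LiDAR-to-raster conversion step is omitted); the raster and feature class names are placeholders, and the sun azimuth and altitude stand in for the event's actual date and time.

import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\boston.gdb"   # placeholder workspace

# Hillshade to see how shadows would fall; azimuth/altitude are placeholders
hillshade = Hillshade("elevation_from_lidar", 135, 45)
hillshade.save("event_hillshade")

# Viewshed from the proposed surveillance points; obstructed cells show
# where camera locations or heights may need adjustment
visibility = Viewshed("elevation_from_lidar", "surveillance_points")
visibility.save("surveillance_viewshed")

arcpy.CheckInExtension("Spatial")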

With that all set, we then used the Create Line of Sight tool, which allowed us to run lines from our surveillance points to the finish line, taking the elevation of each point into account, to determine where there might be sections of obstructed view. We created a graph of one of the lines that had an obstructed view, and the graph shows where that line of sight is blocked.

It would have been nice to see how all this showed up in a 3D view, but I was never able to get the line-of-sight lines copied over to ArcScene to see them.