Sunday, March 27, 2016

Geocoding, Network Analyst & ModelBuilder

Intro to GIS - Week 12

This week was a three-part assignment. In Section 1: Geocoding, we downloaded a TIGER All Lines shapefile for Lake County, Florida. This file included streets, water features, railroads and trolleys. We then imported a table of EMS site addresses from our student assignment files into the same geodatabase. The information in the two files was then standardized using the Address Locator function. With this new feature class we were able to geocode our data, matching addresses from the table to geographic locations on the map. This process successfully matched most of our addresses, but our unmatched rate was over the general guideline of 5% or less, so we had to manually rematch some of the unmatched addresses. We did this in two steps. For some of the unmatched addresses there were possible candidates listed. We zoomed to these candidates and compared them to the location of the unmatched address in Bing Maps. If a candidate seemed a close enough match we selected it. Otherwise, we handled it as in the next step, rematching unmatched addresses that didn't have any candidates. Using Bing Maps we found the location, used a cross street as a reference to zoom to with Select By Attributes, and determined the most likely location on our map. Once we were reasonably sure where the address was located we selected the Pick Address from Map option and linked the address to its geographic location.
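The 5% guideline is easy to check directly from the geocoding summary. A minimal sketch of that arithmetic (the counts below are hypothetical, not my actual results):

```python
# Check an address-locator result against the "unmatched <= 5%" guideline.
# The record counts here are hypothetical, just to illustrate the math.
def unmatched_rate(matched, tied, unmatched):
    """Return the unmatched percentage of all geocoded records."""
    total = matched + tied + unmatched
    return 100.0 * unmatched / total

rate = unmatched_rate(matched=88, tied=2, unmatched=10)
print(f"unmatched: {rate:.1f}%")          # 10.0%
print("needs manual rematch:", rate > 5)  # True
```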

In Section 2: Network Analyst, we used the geocoded data from Section 1 to create a simple route map of three of the EMS sites using the Network Analyst tool. This was a pretty simple process. We started by creating a route analysis layer with the New Route tool, then added stops to the route using the Create Network Location tool. Once our stops were selected we set up the parameters for analysis. There were lots of options to choose from here, including Impedance, Use Start Time, Use Time Window, Restrictions and several more. After setting all the parameters we clicked Solve and it created a route through our three stops, as shown in the enlargement of this map.
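Under the hood the solver minimizes total impedance (travel time or distance) over the street network. Just as a toy illustration of the objective, here is the straight-line length of a stop sequence; a real solve routes along network edges, and the stop coordinates below are made up:

```python
import math

# Toy illustration only: total straight-line length of a stop sequence.
# A real Network Analyst solve minimizes impedance along street edges,
# not straight-line distance; the stops are hypothetical (x, y) pairs.
def route_length(stops):
    return sum(math.dist(a, b) for a, b in zip(stops, stops[1:]))

stops = [(0, 0), (3, 4), (3, 8)]   # three hypothetical EMS sites
print(route_length(stops))          # 9.0  (5 + 4)
```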

For Section 3: ModelBuilder, we got most of our instructions from the ESRI Virtual Campus training. For this exercise we used a model that had already been created and just made a few adjustments to it. The model was designed to locate areas where gas mains run close enough to schools that the gas utility company should regularly check them for leaks. We opened this model through the edit feature, then explored some of the elements. The first half of the model was in color and so ready to run, but the second half was still white, indicating it was missing a piece of data. One of our variables was missing a value, so we had to enter that. Once we did, the rest of the model switched to color and was ready to run. With the model now ready, we chose Run Entire Model from the Model menu.

When a model has been successfully run, the model tool and output data elements will be shadowed. Mine weren't. I went back through the instructions step by step and knew I had run it correctly, but I couldn't figure out what I had done wrong. Since I didn't know where the problem was, I deleted everything, recopied all the files for the assignment and started again from scratch. And failed again. After the third failure, before starting over yet again, I went back to the beginning of the lab instructions before the ESRI instructions and reread them... a few times. Finally the last line of instruction clicked and I realized that while I had "explored" all of the elements in the model, I had never gone in and reset the output location of anything to my own working folder. So I went back in and checked every element again to verify whether anything in each one required changes, fixed what needed fixing, then ran the model again. Finally, success.

Basically what this model did was create a buffer around the schools and gas mains, intersect the buffers so only areas where gas mains fell within a certain distance of a school would be selected, then dissolve the borders between the intersected buffer polygons so each contiguous area could be treated as one individual feature, creating a layer of Gas Leak Areas like this.
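The buffer, intersect, dissolve chain can be sketched in one dimension, where a buffer is just an interval around a point and dissolve merges overlapping intervals. All positions and distances here are made up:

```python
# 1-D sketch of the model's buffer -> intersect -> dissolve chain.
# Points stand in for schools and gas mains; all numbers are hypothetical.
def buffer_1d(points, dist):
    return [(p - dist, p + dist) for p in points]

def intersect_1d(a_ivals, b_ivals):
    out = []
    for a0, a1 in a_ivals:
        for b0, b1 in b_ivals:
            lo, hi = max(a0, b0), min(a1, b1)
            if lo < hi:            # the two intervals overlap
                out.append((lo, hi))
    return out

def dissolve_1d(ivals):
    ivals = sorted(ivals)
    merged = [list(ivals[0])]
    for lo, hi in ivals[1:]:
        if lo <= merged[-1][1]:    # touches/overlaps previous: merge
            merged[-1][1] = max(merged[-1][1], hi)
        else:
            merged.append([lo, hi])
    return [tuple(m) for m in merged]

schools = buffer_1d([10, 30], dist=5)      # [(5, 15), (25, 35)]
mains   = buffer_1d([12, 14, 60], dist=3)  # [(9, 15), (11, 17), (57, 63)]
leak_areas = dissolve_1d(intersect_1d(schools, mains))
print(leak_areas)  # [(9, 15)]
```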


Monday, March 21, 2016

Dot Mapping

Cartographic Skills - Mod 10

Mod 10 was on Dot Mapping. Dot mapping is basically a very simple form of thematic map. It uses a dot to represent a certain amount of the phenomenon being mapped. A dot representing a certain value is placed in the relative geographic location where the phenomenon is most likely to occur.

Dot maps are a very easy concept to understand, but it can be a little complicated getting the computer to place the dots correctly.  I struggled with this for a while before I remembered this problem had come up in the discussion boards.  When I went back and checked, the answer I was looking for was right there.  Once I implemented that correction everything fell into place.  My problem was that the layer I was basing the placements of the dots on, my control layer, needed to be above the layer I was placing them on.  Once I shifted that control layer to the top everything worked right.

Another difficulty with this map is getting the dots large enough to show up, but not so large that in the more densely populated areas they all end up merged into one large blob.  One of our requirements was to make sure there was at least one dot in every county.  In order to do that I had to set my value to represent the size of the least populated county.  With that being 10,000, I had to keep the size of my symbol rather small so in the more populated areas like Miami I didn't end up with a blob.  That made the dots so small they're a little difficult to see in some areas.
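Picking the dot value off the least populated county is what guarantees every county earns at least one dot. A sketch of that arithmetic, with hypothetical county populations rather than the actual Florida figures:

```python
# Choose the dot value so the smallest county still gets one dot.
# Populations are hypothetical, not the real census numbers.
def dot_counts(populations, dot_value):
    return {name: max(1, round(pop / dot_value))
            for name, pop in populations.items()}

pops = {"Lafayette": 10_000, "Orange": 1_200_000, "Miami-Dade": 2_500_000}
dot_value = min(pops.values())   # 10,000 people per dot
print(dot_counts(pops, dot_value))
# Lafayette gets 1 dot; Miami-Dade gets 250 -- hence the "blob" risk.
```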

But otherwise it was a pretty simple assignment that was pretty quick to complete.

Saturday, March 19, 2016

Vector Analysis 2

Intro to GIS - Week 10

Week 10's lab was on Vector Analysis using modeling tools in ArcGIS such as buffer and overlay, creating a script in ArcPy to run the buffer tool, and spatial queries. To perform these analyses we started with layers for roads, lakes & rivers, and conservation areas. Our goal was to create a map showing possible camping sites in De Soto National Forest in Mississippi that would be within 300 meters of a road, and either within 150 meters of a lake or 500 meters of a river, but not within any conservation areas. This was the resulting map:

To create this we first needed to create buffer zones of our parameters for roads and water. We did this first for roads by using the Proximity Analysis tool Buffer, creating a buffer of 300 meters on all the roads. Next, since rivers and lakes were on the same layer and we wanted a different distance for rivers than for lakes, we first created a new field in the attribute table, then used Select By Attributes to assign values for the variable buffer distance. Once that was done we were able to run the Buffer tool using this new field.
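The variable-distance step boils down to writing a per-feature buffer distance into the new field. A minimal sketch of that attribute logic (the feature names and the `BUF_DIST` field name are hypothetical; the 150 m / 500 m values come from the lab specs above):

```python
# Assign a variable buffer distance per feature type, as done with
# Add Field + Select By Attributes. Feature names are made up.
BUFFER_DIST = {"lake": 150, "river": 500}  # meters, per the lab specs

features = [
    {"name": "Hypothetical Creek", "ftype": "river"},
    {"name": "Hypothetical Lake",  "ftype": "lake"},
]
for f in features:
    f["BUF_DIST"] = BUFFER_DIST[f["ftype"]]

print([(f["name"], f["BUF_DIST"]) for f in features])
# [('Hypothetical Creek', 500), ('Hypothetical Lake', 150)]
```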

We also created several buffers using ArcPy mostly to see how it was done.  ArcPy makes more sense to use for multiple processes because you can run them all at once so it saves a lot of time.  
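The time saving comes from looping one tool call over many inputs. A stand-in sketch of that batching pattern (in the real script each call would be arcpy's Buffer tool; here a stub function just records what would run, so the pattern works anywhere):

```python
# Batch-buffer pattern: one loop, many tool calls. The stub below stands
# in for arcpy's Buffer geoprocessing tool; layer names are hypothetical.
def run_buffer(in_layer, dist_m):
    out = f"{in_layer}_buf{dist_m}"
    print(f"Buffer {in_layer} -> {out} ({dist_m} m)")
    return out

jobs = [("roads", 300), ("lakes", 150), ("rivers", 500)]
outputs = [run_buffer(layer, d) for layer, d in jobs]
print(outputs)  # ['roads_buf300', 'lakes_buf150', 'rivers_buf500']
```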

Once our buffering was done we needed a way to combine the information from the two datasets. For this we used the Overlay Analysis tool called Union. Again, we had to first prepare the attribute tables to provide the information required for the process. We did this by adding a field to both the buffered water and the buffered roads layers, assigning a value of 1 to features inside the respective buffer zones. After that we ran the Union tool. The output of this process was a layer with the combined data. With all this information together in one file, we were then able to run a query to select by attribute only those records that fell within the buffer zones for both water and roads, and export those features to a new feature class.
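After the union, the selection is just one attribute query on the two flag fields. A sketch with hypothetical records and field names:

```python
# After Union, each output polygon carries both flag fields; select the
# ones inside BOTH buffers (water = 1 AND roads = 1). Records are made up.
union_rows = [
    {"id": 1, "water": 1, "roads": 1},   # inside both buffers
    {"id": 2, "water": 1, "roads": 0},   # water only
    {"id": 3, "water": 0, "roads": 1},   # roads only
]
candidates = [r for r in union_rows if r["water"] == 1 and r["roads"] == 1]
print([r["id"] for r in candidates])  # [1]
```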

We now had a layer that contained only those buffer zones that fit our criteria for roads and water.  Next we added the conservation area to the mix by using the Erase overlay tool to remove from our selection set any buffered areas that fell within the boundaries of the conservation areas.  This provided us with our ultimate goal of selecting possible campsites that were within 300 meters of a road and either within 150 meters of a lake or 500 meters of a river, but not within a conservation area.  It seems a little convoluted, but it works.  
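Continuing the 1-D analogy, the Erase step subtracts one set of areas from another. A sketch with hypothetical interval coordinates:

```python
# Erase sketch in 1-D: subtract conservation intervals from candidate
# campsite intervals, keeping any left/right remainders. Numbers are
# hypothetical.
def erase_1d(keep, remove):
    result = list(keep)
    for r0, r1 in remove:
        nxt = []
        for k0, k1 in result:
            if r1 <= k0 or r0 >= k1:      # no overlap: keep whole interval
                nxt.append((k0, k1))
                continue
            if k0 < r0:                   # remainder left of the erase area
                nxt.append((k0, r0))
            if r1 < k1:                   # remainder right of the erase area
                nxt.append((r1, k1))
        result = nxt
    return result

sites = [(0, 10), (20, 30)]
conservation = [(4, 6), (25, 40)]
print(erase_1d(sites, conservation))  # [(0, 4), (6, 10), (20, 25)]
```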

Saturday, March 12, 2016

Flow Mapping

Cartographic Skills - Mod 9

This week's lesson was on Flow Mapping. We created a distributive map depicting the 2007 immigration into the US from regions all around the world. We had the choice of two different base maps to work with. One had an inset choropleth map of the US classed to display the states by the percent of immigrants who settle there. The other base map had the choropleth map overlaid on the world map. I chose to go with the second option. I had some concern the choropleth data might be a little difficult to see, but then I realized I wouldn't really be able to make the image any bigger in the inset map if I was going to have room to display everything else at a reasonable proportion.


In the end I think the choropleth map showed up just fine. The scale bar, however, seemed obnoxiously large to me, so I ended up cutting it down from 10,000 km to 5,000 km.

This whole lab was in AI, which was rather intimidating to me until I opened the layers up and saw they were fairly well laid out. It was easy enough to gather all the components of the scale bar and group them together under the continents layer. The only layer issue I ran into was when I wanted to expand the map to fill in more of the frame and had to delete some of the little islands that really had no effect on the purpose of the map. Finding and dragging them, element by element, to the next layer up in order to separate them out of the groups was, again, one of the most time consuming aspects of the whole lab.

I was not looking forward to drawing the curved lines because I had tried that a few times before and had no luck, but after watching a couple of the videos that had been posted I felt a lot more confident tackling them. I had a few false starts, but then I got the hang of it and was able to get through them all well enough. My biggest problem was getting the two lines for North America to line up well enough with the straight line and arrow that they gave a smooth, continuous appearance. Another problem there was trying to figure out how to divide the line up. Originally I had subtracted Canada from the rest of the NA total and set the line weight proportionate to that, but then I realized Canada wasn't the only country north of the US included in the count, so I just decided to divide the line 50/50 and left it at that. I also wasn't really thrilled with the line weight for the line running from Oceania, but I didn't want to increase the line weight for Asia, with the largest amount of immigrants, because I didn't want to crowd out the rest of the lines that had to fit into that limited space.
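Flow-line weights are conventionally scaled in proportion to the quantity each line carries, with the widest line capped so it doesn't crowd its neighbors. A sketch of that scaling (the counts are placeholders, not the actual 2007 immigration figures):

```python
# Scale flow-line stroke widths in proportion to each region's count,
# capping the widest line. Counts are hypothetical placeholders.
def line_widths(counts, max_width_pt=20.0):
    biggest = max(counts.values())
    return {region: round(max_width_pt * v / biggest, 1)
            for region, v in counts.items()}

counts = {"Asia": 400_000, "North America": 300_000, "Oceania": 6_000}
print(line_widths(counts))
# {'Asia': 20.0, 'North America': 15.0, 'Oceania': 0.3}
# The Oceania line comes out hairline-thin -- the tradeoff described above.
```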

Labeling the continents was simple enough since they were all large enough to select a large font size without running into any overlap issues.  I had already decided I was going to label my flow lines rather than create a separate legend since I didn't want to lose the space.  I tried selecting one of the lines to use the Type on a Path Tool, but the text kept coming in upside down.  I spent a bit of time trying to figure that out, but finally decided to just enter the text straight and rotate it to the line.  I placed all the text close to the continent it represented because I thought that made it clearer to the map reader.  

After this the rest of the map elements went in quick and easy, and I got to the effects stage. I had already decided I would like to raise the US choropleth map up just a bit to give it a little more emphasis, but when I did, it offset it so much from the world map underneath that I found it somewhat distracting. I tried making a few adjustments to see if I could minimize the effect, but AI was moving so slowly and locked up a few times, so in the end I gave up on that idea. Though I think it would have looked pretty cool. So my next step was to try to implement some effects on the flow lines. I went round and round with this, changing the color from black to matching the color of the regions they represented, then finally settled on matching colors. I think the black stood out a lot more, but I thought it was too much, that it kind of washed out the rest of the map. I decided to go for an inner glow and a shadow. I wanted a brighter center, fading out to the darker color along the edge, but when I tried it that way it either didn't show up or it took over the whole width of the line. When I tried the glow on the edge it was a similar experience, but at the lower setting, where it was just around the edge, it did seem to show up a bit, so I left it at that. With that done I added shadows to the lines and called it done.

Overall it was a rather enjoyable assignment.  I had a bit more time with this one to play with some of the AI tools and try different things and feel like I learned quite a bit with this.

Sunday, March 6, 2016

Isarithmic Mapping

Cartographic Skills - Mod 8

This week's lesson was on Isarithmic Mapping, the most widely used thematic mapping method after the choropleth map, and also one of the oldest, dating back to the 18th century. The most common form of an isarithmic map is a contour map. The appropriate data type for isarithmic mapping is presumed to be continuous and smooth, to exist throughout the entire geographic region of interest, and to change gradually. The two types of isarithmic maps are isometric maps, which are produced using true point data, and isopleth maps, which are produced with conceptual point data. Isopleth maps require the data to be standardized.

For our lab we created three isarithmic maps with data downloaded from the USDA Geospatial Data Gateway. This data is of the average annual precipitation in Washington State for a 30-year period. The data was prepared using the PRISM (Parameter-elevation Relationships on Independent Slopes Model) interpolation method. This method weights the data collected from a dense monitoring network to account for physiographic factors that influence climate patterns. We created these maps using three different symbolization approaches: continuous tones, hypsometric tints, and hypsometric tints with contour lines. The sample above is the third approach, hypsometric tints with contour lines.

To create the continuous tone map, all we really needed to do was change the symbology of the layer by changing the color ramp from Graphic View to Precipitation. We also added the hillshade effect, which added a sense of depth to the image, and adjusted the labels to present rounded numbers instead of the raw data values. The only difficulty came with the legend. Because Washington is wider than it is tall, it made more sense to have a horizontal legend. But because ArcMap doesn't follow the convention of left-to-right reading in its defaults for the horizontal legend distribution, we had to do some manual manipulation to make that come out right.

The second map used the same data, but symbolized in hypsometric tints. For this map we needed to use the Spatial Analyst tool Int to convert the raster values from floating point to integer. This allowed us to create crisp contours, since the cell values were truncated to whole numbers. Once that was done we manually classified the data into 10 classes, again changed the color ramp to Precipitation, and selected the hillshade effect.
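The truncate-then-classify step can be sketched in plain Python: Int-style truncation followed by a manual classification against break values (the breakpoints below are hypothetical precipitation classes, not the ones from the lab):

```python
import bisect

# Mimic Spatial Analyst's Int (truncation) plus a manual 10-class
# classification. The break values are hypothetical.
def to_int(value):
    return int(value)            # truncates toward zero, like the Int tool

breaks = [10, 20, 40, 60, 80, 100, 120, 140, 160]  # 9 breaks -> 10 classes

def classify(value):
    # Class 1 for values up to the first break, class 10 above the last.
    return bisect.bisect_right(breaks, to_int(value)) + 1

print(to_int(37.8), classify(37.8))   # 37 3
print(classify(5), classify(165.2))   # 1 10
```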

The last map, above, was the Hypsometric Tints map with Contour Lines added.  Again we turned to the Spatial Analyst Toolbox for the tool Contour List.  With this tool we simply specified our contour values and it placed the contours on the map.  Since our contour intervals matched the hypsometric step values we did not need to notate them on this map.

Thursday, March 3, 2016

Data Search

Intro to GIS - Weeks 7&8

This week's lesson was our mid-term project.  Even with two weeks it was tough to get it all done in time.  This was a very time consuming assignment, but in the end, very enjoyable.  For this lesson we had to hunt down and download all the files we needed ourselves.  I did this through Labins for my aerial, FGDL for all five of my vector files and for my two environmental rasters.  The DEMs were both downloaded from a USGS site.

A map of Polk County set in four data frames to display the digital elevation, wetlands habitats, and habitats and land covers of the county.
Since I was working on a county level I decided to go with a State Plane projection.  This process went quite well and I was done with it before I knew it.  It was the next step that took so much of my time.  The water dataset had every drop of water my county had ever seen in it.  When I added that file to my county it obscured everything else that needed to be on there.  I spent quite a bit of time creating and deleting selections trying to find a nice balance.  The same was true for the parks, roads and cities, though cities weren't quite as tough.  Finally I got my selections down to proportions I felt made a good representation of their fields without overwhelming the map.

After that I moved on to my rasters and decided one file just wasn't going to be enough for my DEM, so I went back and downloaded and projected a second one. I used the raster clip tool for all my raster images, but that didn't quite get it, so I had to use the data frame clip as well.

Once all of that was done I set about laying my map out and decided on one map with four data frames. Two of my data frames were either too dark or too busy to display anything else with them, but I felt that with the other two data frames having the roads and cities on them for reference so close by, it wouldn't hurt to have them display their data alone.

I'm still going back and forth about having the projection information displayed via the map's dynamic text option for Coordinate System. On the one hand it's kind of cool to have all that information there, but on the other, it's not really aesthetically pleasing. I don't think I'll do it that way again, but I think it was worth doing once.