
Thursday, April 28, 2016

Bobwhite - Manatee Transmission Line Project Analysis

Intro to GIS - Final Project

Our final project was long, involved, and quite challenging.  We were tasked with analyzing key factors that were originally considered in gaining right-of-way access for a real transmission line project for Florida Power and Light Company (FPL).  We started with the background phase, which involved creating a cartographic model, basically a plan for how we would go about the project, what data we would need, and where we would get it.  The next phase was an analysis of the data.  There were four objectives to analyze, and for each one we had to create maps with datasets that would let us examine the data and run analysis tools on it.  Most of the data was provided, some of it we had to download, and some we needed to create ourselves.  This process was a real challenge that drew on skills from pretty much the entire course.
The last phase was the presentation.  We had to create a slide show and transcript as though we were presenting the data to a high-school-level audience.

Here are links to my slide presentation and the transcripts for it:

Tuesday, April 5, 2016

Georeferencing, Editing and ArcScene

Intro to GIS - Week 13


This lab started with georeferencing two raster images of the UWF campus.  Georeferencing is the method used to tell a raster dataset that has no geographic coordinate information built into it where it belongs geographically.  This is done by linking the target layer (the unreferenced layer, such as the raster image) to a referenced layer, using control points to match up common points between the two.  In this exercise we matched buildings in the raster to buildings in a referenced building layer.  Several points are usually required to create an accurate reference, and they should be spread evenly throughout the image.  I found the Image Viewer Tool very helpful in placing the control points.

Once the links are placed with the control points, they can be viewed in the links table.  This table shows the residual value, which tells you how well each link agrees with how the layer is currently displayed.  The lower the value, the more accurately the control point is georeferenced.  I had two control points with residuals considerably higher than the rest, and when I zoomed in to look at them more closely I saw they weren't well aligned, and neither were a couple of other buildings around them.  I deleted those control points and set a couple of others.  This made for a much better visual alignment, and when I checked the table those residual values were much more in line with the rest.  Once my residuals were in line and I had examined my image more closely, I felt I had a good match.

I went back to the table to check the RMS (Root Mean Square) Error and saw it was well below 15.  RMSE is frequently used in GIS as an indicator of the accuracy of spatial analysis and/or remote sensing.  It is a measure of the differences between calculated values and actual (observed/measured) values; the difference between a calculated and an actual value is called the residual.  The RMSE aggregates the residuals into a single value: square each residual, add the squares together, divide by the total number of values, and take the square root of the result.  My RMSE for the north raster of UWF was 4.743 with a 1st Order Polynomial Transformation.  My second raster, for the south portion of UWF, was much better at 1.66888 with a 2nd Order Polynomial.
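To make the RMSE calculation concrete, here's a tiny pure-Python sketch of the square / average / square-root steps described above.  The residual values are invented for the example, not my actual control points:

```python
import math

def rmse(residuals):
    """Root Mean Square Error: square each residual, average the
    squares, then take the square root of the result."""
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# Hypothetical link residuals (in map units) -- made up for illustration.
residuals = [3.2, 5.1, 4.0, 6.3, 4.9]
print(round(rmse(residuals), 3))  # about 4.816
```

Notice that one large residual pulls the RMSE up quickly because it gets squared, which is exactly why deleting my two worst control points improved the overall value so much.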

The next section of our lab was editing.  In this section we used editing tools to digitize two new features: a building and a road.  To digitize a new feature you must first start an editing session.  This can be done either by clicking the Editor menu on the Editor toolbar and selecting Start Editing, or by right-clicking any layer in the table of contents and selecting Edit Features, then Start Editing.  Selecting Create Features opens a window with the map's templates in the top panel and the tools available to create that type of feature in the bottom panel.  We used the straight segment and endpoint arc tools for both the building and the road, using the snap options on the road to make sure the new road lines met existing road lines and that endpoints were placed properly.  Edits are not automatically saved, not even by saving the map.  Before ending the editing session with Stop Editing you must first select Save Edits, or everything will be lost.

In the third section we had to create a multiple ring buffer around an eagle's nest on campus property to show the location of a new conservation easement to protect the nest and the eagle.  First we hyperlinked a picture of the eagle nest to the attribute data through the HTML Popup tab of the layer properties, choosing to show content as a URL using the picture field.  Next we used Identify to select the point feature in the drawing and clicked on the lightning bolt to verify the link worked.  Then we used the Multiple Ring Buffer tool to create two rings around the protected nest.

Section 4 had us create a 3D view of UWF in ArcScene.  We started with a DEM file of UWF, which had elevation data, then draped the other layers over it by going to the Base Heights tab of each layer's properties, choosing "Floating on a custom surface:", and selecting the UWF_DEM to use its elevation data.  Next we used the height field of the buildings layer as our z-value to extend those features above ground, by checking the "Extrude features in layer" option on the Extrusion tab of the buildings layer properties.  We then set the vertical exaggeration in the scene properties to accentuate the building heights.  With that done, we exported the scene to a 2D image to finish off in ArcMap, since ArcScene does not allow the addition of map elements.  I had to go back and forth between ArcMap and ArcScene a couple of times to adjust my angle so the digitized road would show in the image while still leaving enough relief to show the height of the buildings.  I ended up having to increase the line weight of the road symbol to make sure it showed up.

I really enjoyed the editing portion of this assignment and am looking forward to doing more such work in later courses.

Sunday, March 27, 2016

Geocoding, Network Analyst & ModelBuilder

Intro to GIS - Week 12

This week was a three-part assignment.  In Section 1: Geocoding, we downloaded a TIGER All Lines shapefile for Lake County, Florida.  This file included streets, water features, railroads, and trolleys.  We also imported a table of EMS site addresses from our student assignment files into the same geodatabase.  The street information was then standardized by creating an Address Locator.  With this new address locator we were able to geocode our data, matching addresses from the table to geographic locations on the map.  This process successfully matched most of our addresses, but our unmatched rate was over the general guideline of 5% or less, so we then had to manually rematch some of the unmatched addresses.  We did this in two steps.  For some of the unmatched addresses there were possible candidates listed.  We zoomed to these candidates and compared them to the location of the unmatched address in Bing Maps.  If a candidate seemed a close enough match we selected it.  Otherwise, we handled it as we did in the next step, rematching unmatched addresses that didn't have any candidates: using Bing Maps we found the location, used a cross street as a reference to zoom to with Select by Attributes, and determined the most likely location on our map.  Once we were reasonably sure we knew where the address was located, we selected the Pick Address from Map option and linked the address to its geographical location.
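The 5% guideline check is just simple arithmetic; here's a quick sketch of it in Python (the match counts are invented for illustration, not my actual Lake County results):

```python
def unmatched_rate(matched, unmatched):
    """Percent of geocoded addresses left unmatched."""
    total = matched + unmatched
    return 100.0 * unmatched / total

# Hypothetical counts -- made up for the example.
rate = unmatched_rate(matched=55, unmatched=5)
print(rate)         # about 8.33
print(rate <= 5.0)  # False, so manual rematching would be needed
```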

In Section 2:  Network Analyst, we used the geocoded data from Section 1 to create a simple route map of three of the EMS sites using the Network Analyst tool.  This was a fairly simple process.  We started by creating a route analysis layer using the New Route tool, then added stops to the route using the Create Network Location tool.  Once our stops were selected we set up the parameters for the analysis.  There were lots of options to choose from, including Impedance, Use Start Time, Use Time Window, Restrictions, and several more.  After setting all the parameters we selected Solve, and it created a route through our three stops, as shown in the enlargement on this map.

For Section 3:  ModelBuilder, we got most of our instructions from the ESRI Virtual Campus training.  For this exercise we used a model that had already been created and just made a few adjustments to it.  The model was designed to locate areas around schools where the gas utility company will regularly check for gas leaks that are too close to the school.  We opened this model for editing and explored some of its elements.  The first half of the model was in color and so ready to run, but the second half was still white, indicating it was missing a piece of data.  One of our variables was missing a value, so we had to enter it.  Once we did, the rest of the model switched to color and was ready to run.  With the model now ready, we chose Run Entire Model from the Model menu.

When a model has been run successfully, the model tool and output data elements are shadowed.  Mine weren't.  I went back through the instructions step by step and was sure I had run it correctly, but I couldn't figure out what I had done wrong.  Since I didn't know where the problem was, I deleted everything, recopied all the files for the assignment, and started again from scratch.  And failed again.  After the third failure, before starting over yet again, I went back to the beginning of the lab instructions, before the ESRI instructions, and reread them... a few times.  Finally the last line of instruction clicked, and I realized that while I had "explored" all of the elements in the model, I had never gone in and reset the output locations to my own working folder.  So I went back in and checked every element again to verify whether anything in it required changes, fixed what needed fixing, then ran the model again.  Finally, success.

Basically, what this model did was create buffers around the schools and gas mains, intersect them so only those schools that had gas mains within a certain distance would be selected, then dissolve the borders between the intersected buffer polygons so each polygon could be treated as one individual feature, creating a layer of Gas Leak Areas like this.
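Out of curiosity, that buffer → intersect → dissolve chain could also be scripted directly in ArcPy instead of ModelBuilder.  This is only a rough sketch of the idea: the workspace path, layer names, and distances below are placeholders I made up, not the values from the ESRI exercise, and it only runs inside an ArcGIS Python environment.

```python
import arcpy

# Placeholder workspace and layer names -- not the actual exercise data.
arcpy.env.workspace = r"C:\GIS\GasLeaks.gdb"
arcpy.env.overwriteOutput = True

# Buffer the schools and the gas mains (distances are illustrative).
arcpy.Buffer_analysis("Schools", "Schools_buf", "500 Meters")
arcpy.Buffer_analysis("GasMains", "GasMains_buf", "100 Meters")

# Intersect keeps only the areas where the two buffers overlap,
# i.e. schools with a gas main within the critical distance.
arcpy.Intersect_analysis(["Schools_buf", "GasMains_buf"], "LeakCheck_intersect")

# Dissolve merges the overlapping intersect polygons so each
# gas leak area can be treated as a single feature.
arcpy.Dissolve_management("LeakCheck_intersect", "GasLeakAreas")
```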


Saturday, March 19, 2016

Vector Analysis 2

Intro to GIS - Week 10

Week 10's lab was on vector analysis using modeling tools in ArcGIS such as buffer and overlay, creating a script in ArcPy to run the Buffer tool, and spatial queries.  To perform these analyses we started with layers for roads, lakes & rivers, and conservation areas.  Our goal was to create a map showing possible camping sites in De Soto National Forest in Mississippi that would be within 300 meters of a road, and either within 150 meters of a lake or 500 meters of a river, but not within any conservation areas.  This was the resulting map:

To create this we first needed to create buffer zones matching our parameters for roads and water.  We did this first for the roads, using the Buffer tool from the Proximity toolset to create a 300-meter buffer on all the roads.  Next, since rivers and lakes were in the same layer and we wanted different distances for rivers than for lakes, we first created a new field in the attribute table, then used Select by Attributes to assign values for the variable buffer distance.  Once that was done we were able to run the Buffer tool using this new field.

We also created several buffers using ArcPy, mostly to see how it was done.  ArcPy makes more sense for multiple processes because you can run them all at once, which saves a lot of time.
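The ArcPy version of this week's buffering looked roughly like the sketch below.  The geodatabase path and layer/field names are placeholders; only the distances (300 m for roads, variable lake/river distances stored in a field) come from the lab criteria, and it only runs where ArcPy is installed.

```python
import arcpy

# Placeholder geodatabase -- substitute your own workspace.
arcpy.env.workspace = r"C:\GIS\Week10.gdb"
arcpy.env.overwriteOutput = True

# Fixed-distance buffer: 300 meters around every road.
arcpy.Buffer_analysis("roads", "roads_buf", "300 Meters")

# Variable-distance buffer: the water layer's BUF_DIST field holds
# 150 for lakes and 500 for rivers, so we pass the field name
# instead of a literal distance.
arcpy.Buffer_analysis("water", "water_buf", "BUF_DIST")
```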

Once our buffering was done we needed a way to combine the information from the two datasets.  For this we used the overlay tool called Union.  Again, to prepare the data to be combined, we first had to set up the attribute tables to provide the information required for the process.  We did this by adding fields to both the buffered water and buffered roads layers, assigning a value of 1 to the fields for features inside the water and road buffer zones.  After that we ran the Union tool.  The output of this process was a layer with the combined data.  Once all this information was together in one file, we were able to run a query to select by attribute only those records that fell within the buffer zones for both water and roads, and export those features to a new feature class.

We now had a layer that contained only those buffer zones that fit our criteria for roads and water.  Next we added the conservation areas to the mix, using the Erase overlay tool to remove from our selection set any buffered areas that fell within the boundaries of the conservation areas.  This gave us our ultimate goal: possible campsites within 300 meters of a road and either within 150 meters of a lake or 500 meters of a river, but not within a conservation area.  It seems a little convoluted, but it works.

Thursday, March 3, 2016

Data Search

Intro to GIS - Weeks 7&8

This week's lesson was our mid-term project.  Even with two weeks it was tough to get it all done in time.  This was a very time-consuming assignment but, in the end, very enjoyable.  For this lesson we had to hunt down and download all the files we needed ourselves.  I did this through LABINS for my aerial, and FGDL for all five of my vector files and my two environmental rasters.  The DEMs were both downloaded from a USGS site.

A map of Polk County set in four data frames to display the digital elevation, wetlands habitats, and habitats and land covers of the county.
Since I was working on a county level I decided to go with a State Plane projection.  This process went quite well and I was done with it before I knew it.  It was the next step that took so much of my time.  The water dataset had every drop of water my county had ever seen in it.  When I added that file to my county it obscured everything else that needed to be on there.  I spent quite a bit of time creating and deleting selections trying to find a nice balance.  The same was true for the parks, roads and cities, though cities weren't quite as tough.  Finally I got my selections down to proportions I felt made a good representation of their fields without overwhelming the map.

After that I moved on to my rasters and decided one file just wasn't going to be enough for my DEM, so I went back and downloaded and projected a second one.  I used the raster Clip tool for all my raster images, but that didn't quite do it, so I had to use the data frame clip as well.

Once all of that was done I set about laying out my map and decided on one map with four data frames.  Two of my data frames were either too dark or too busy to display anything else with them, but I felt that with the other two data frames having the roads and cities on them for reference so close by, it wouldn't hurt to have the first two display their data alone.

I'm still going back and forth about having the projection information displayed via the map's dynamic text option for Coordinate System.  On the one hand it's kind of cool to have all that information there, but on the other, it's not really aesthetically pleasing.  I don't think I'll do it that way again, but I think it was worth doing once.

Thursday, February 18, 2016

Projections Part 2

Intro to GIS - Week 6

Whew!  It's done.  This one wasn't so much difficult as time consuming.  There were a lot of steps involved and a number of inconsistencies that caused confusion and delay.  Our assignment was to download data from a number of outside sources and bring it all together in a map.  Sounds simple enough, until you learn that all these files are in different projections, or none at all, and have to be defined and/or reprojected to match one another.

Petroleum Storage Tank Contamination Monitoring (STCM) sites in Escambia County with an aerial image of Walnut Hill, FL.
It's not a pretty picture and thank goodness it's not required to be.  The point of this exercise was to get all the data to match up.  Had there been more time I would have cleaned it up a bit, but this is what was required so here it is.

This assignment was a bit of information overload.  I think it's going to take me several days to put the whole thing in perspective.  At this point my mind is still spinning.

Wednesday, February 10, 2016

Projections - Week 5

This week's assignment was very interesting.  We had to make a map with three data frames containing the same image, but displayed in different coordinate systems.

The State of Florida in three projections.  The table shows the square miles of four counties within each data frame.  Check out the difference from one projection to another.
The map also had to contain a table showing the square mileage of four counties in each of the three projections.

For the first data frame we started in an Albers Conical Equal Area projection.  From there we used the Data Management Tools and the Projections and Transformations toolset to access the Project tool.  With this we projected spatial data from one coordinate system to another.  We then created two more data frames and used the same method to assign coordinate systems to them.  In the end we had data frames for Albers Conical Equal Area, UTM, and State Plane N.

All three of these data frames contained a map of Florida's county boundaries.  Within the county boundary layer we opened the attribute table, added a field called "Area", and used Calculate Geometry to populate it with the square mileage of each county.  From this we created a selection of four counties for comparison and saved them to a separate layer.  With this information we were able to create a table to display on the map showing the variations in square mileage for the same counties in different projections.

I think I understand projections a bit better after this exercise, but I can see it will be a while before I'm not easily confused by them.

Friday, February 5, 2016

Sharing GIS Maps and Data - Week 4

This week's assignment was to create three different map outputs to share.  I chose to base mine on my Top 10 Favorite Restaurants and started by listing them in an Excel spreadsheet.  Once I had listed all the places in order of preference, I looked up and added their addresses and URLs showing a dish from each restaurant.  Once the spreadsheet was done I saved it as a Text (Tab Delimited) file.
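For anyone scripting this rather than using Excel, the same kind of tab-delimited file can be produced with Python's csv module.  The restaurant names, addresses, and URLs here are made up for the example:

```python
import csv

# Invented sample rows -- not my actual Top 10 list.
rows = [
    ("Rank", "Name", "Address", "URL"),
    (1, "Example Diner", "123 Main St, Pensacola, FL 32501", "http://example.com/dish1.jpg"),
    (2, "Sample Bistro", "456 Oak Ave, Pensacola, FL 32502", "http://example.com/dish2.jpg"),
]

# delimiter="\t" makes this equivalent to Excel's "Text (Tab Delimited)" save option.
with open("restaurants.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerows(rows)
```

A file like this, with a header row and one record per line, is exactly what the geocoding step in ArcGIS Online expects to upload.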

After that things got confusing.  The next step was to geocode the zip codes and addresses.  This was done by uploading the text file in ArcGIS Online, then downloading the layer to open in ArcMap.  Part of the process in ArcGIS Online allowed for making some design choices, but they were very limited.  The first output from this process was this public web map:

http://arcg.is/1Ra5sw8

This method was quick and easy, but I didn't care much for the results.  I didn't feel I had any control over the symbols and how they were displayed.  It didn't seem to matter whether I zoomed in or out; there was only one view where I didn't have a lot of symbol overlap.  I actually changed my preferred restaurants and their rankings just to get enough space in between so each individual symbol could be seen.  Even then one overlapped another, though not so much that you couldn't tell there were two there and select each.  I found this quite annoying.

I think this first output was supposed to be the Map Package and the basis for the next map, the ArcGIS Online map, but since the last one was mostly done in ArcGIS Online I'm rather fuzzy on the difference.  Here, the next step is to open the "map package", which launches ArcMap, though a couple of steps later the task is to "Create a Map Package".  Hence my confusion.  But at this point I did a bit of work, mostly creating a shapefile and adding that and a World Streets layer.

After verifying that the World Streets layer and the geocoded list were both displayed in the same coordinate system, I started working on the scale to make sure the symbols didn't look like a jumbled pile as I zoomed in and out.  The next step was to clean up the attribute table, because new fields were added during the geocoding stage.  Once that was done I shared the map as a Map Package, even though I had used a map package as the basis for this map.  I shared the Map Package to my ArcGIS Online account, analyzing it in the process to make sure nothing would prevent the map from being published or cause problems with its performance.  Once this process was through, the Map Package was created and could be pulled up from ArcGIS Online.

Last was making a Google Earth map.  This was quick and easy since most of the work had already been done for the first two output maps.  Here a Layer to KML conversion was done, allowing the map to be viewed in Google Earth.  Clicking on the KML file launched Google Earth.

This lab was very intensive.  There was a lot of new terminology to learn and a lot of bouncing around between programs, which made things confusing.  I think I'll need a bit more time to absorb this lesson than I have any others.

Thursday, January 28, 2016

GIS Cartography

This map is a representation of the population of Mexico by State.  It is color coded to show at a glance the difference in population from state to state. 

This week's assignment had us exploring, editing, and managing data with a multitude of tools, attributes, and layouts.  To achieve this we had to create three maps.  The first was of the population of the states of Mexico.  This was the easiest one, yet it gave me the most trouble.  Somehow I got my neatline messed up, and when I tried to print the map the top and bottom of the border were cut off.  I spent a fair amount of time trying to understand what I did wrong and fix it, but finally got to the point where I realized it would be faster to just start again from scratch.  One thing this lesson taught me was to not only save early and save often, but also to check my print preview prior to saving.

The second map was of Central Mexico and its urban areas, railroads, highways, and rivers.  This map was fun.  It started out so busy and crowded it was overwhelming.  But little by little we changed the symbology to cut down the clutter and make the map more readable.  Step by step the map became more and more legible until finally a quick glance and a peek at the legend told the whole story of what this map was about.

Last was a map of the topography of Mexico.  This map included a couple of firsts.  In it we used a raster image of Mexico and showed the elevation using a color ramp with stretched symbology.  This portion of the assignment came with fewer directions, so we had to work a few more things out on our own.  This one was also a lot of fun.

Thursday, January 21, 2016

GIS4043 - Intro to GIS

Week 2 - Own Your Map

 

In Week 2 of GIS4043 - Own Your Map, we determined what factors needed to be considered before designing a map layout.  Some of those things are:
  • The purpose of the map
  • The intended audience
  • If the map will stand alone or be part of a series
  • How many data frames will be required
  • What size paper will the map be printed on
  • Will it contain graphs or reports
  • How will the page be oriented
  • What elements will be displayed and how will they be organized. 

With these considerations in mind I created this map depicting the location of UWF's main campus in Escambia County, FL.  An inset map of all the counties in Florida was added to give perspective on where in the state Escambia County is located.  The cities and towns displayed were limited to just a couple in the general vicinity of the school.

In adding elements to the map, all essential elements as defined by UWF GIS Online were included.  These elements are:  Title, Scale, Legend, North Arrow, Border, Date, Data Source, and Cartographer Name.


Friday, January 15, 2016

GIS4043 - ArcGIS Overview

GIS4043 - ArcGIS Overview Lab Parts I & II



Process Summary:
ArcGIS Overview


PART I: ArcMap Overview

Part I of the Week 1 ArcGIS Overview lab covered basic ArcGIS tools and views and adding data in ArcMap, in order to learn how to:
  • Locate and launch ArcGIS/ArcMap
  • Review the individual file components of a shapefile
  • Locate and use commonly used tools in ArcMap
  • Navigate ArcHelp (online version)
  • Construct a basic map in ArcMap
  • Identify the MXD map file and export map images to JPG or PNG format


1. The Part I lab prompted an examination of the metadata in ArcCatalog for the following files.  This information was gathered from the metadata for each of the layers using the methods provided in the Part I lab.
      
Layer: Cities
Data Type: Point
Publication Information (Who Created The Data?): USGS
Time Period Data Is Relevant: 2008
Spatial Horizontal Coordinate System: GCS_WGS_1984
Data Summary / Description: World Cities represents the locations of major cities of the world.

Layer: World_Countries
Data Type: Polygon
Publication Information (Who Created The Data?): USGS
Time Period Data Is Relevant: 2008
Spatial Horizontal Coordinate System: GCS_WGS_1984
Data Summary / Description: World Countries (Generalized) represents generalized boundaries for the countries of the world as they existed in January 2008.

 


2. The following file paths define the location of all files used or created by this project:

Working data folder filepath:
S:\My Documents\GIS4043\Week1Orientation\Data\OverviewArcGIS
Working MXD(s):
S:\My Documents\GIS4043\Week1Orientation\Document\mr_overview_map.mxd
Other Working Documents:
S:\My Documents\GIS4043\Week1Orientation\Document\mr_overview_map.jpg
S:\My Documents\GIS4043\Week1Orientation\Document\mr_overview_ps.doc

PART II: ArcGIS Overview
Part II of the Week 1 ArcGIS Overview lab covered using ArcGIS Help via:
  • The ArcGIS Resource Center site through the internet browser
  • ArcGIS Help within ArcMap
  • The Search tool within ArcMap


PART III: ArcGIS Overview Process Summary

Week 1 – ArcGIS Overview started with setting up folders on network drives to extract project files to and store files created during the lab.  I ran into my first problem here by not being sure what would be required and adding more folders than necessary, creating a very long path.  By the time I realized my mistake, I already had a path set within my map that required me to either maintain my current setup or restart the whole process to enter a new path.

In Step 1: Launch ArcMap and Prepare Data, when I checked the data files I had extracted to my S drive, I realized I was missing my Cities.sbx file.  I viewed the source files on the R drive, thinking that somehow the file had been dropped during the extract process, but it wasn't there either.  Since this file wasn't one of the three files listed as mandatory, I posted a query regarding it on the Discussion Board and moved on with my lab.  The TA responded that she had checked my S drive and saw the file there with the others, but it's still not visible to me.  I still don't know why, but it didn't seem to interfere with my lab at all.  Everything else in Step 1 went fine.


Step 2: Explore ArcMap and Add Data brought my second challenge.  When I clicked on the Add Data button and then Folder Connections in the dropdown menu, nothing happened.  I tried exiting the program and reentering, and clicking other options in the dropdown menu before going back to Folder Connections, but couldn't get it to do anything.  I skimmed through the Discussion Board and luckily found a post by another student who had had the same problem, knew a workaround, and had posted it.  I went back to my Add Data box and saw the little icon of a folder with a plus sign on it.  When I clicked that Connect to Folder icon, it allowed me to navigate the drives and select the files I needed.

Everything else went fine until I got to #7 of Step 3: Explore the Data.  When I opened the Cities.xml file I couldn't make much sense of it.  I was able to find the information required for this assignment, but only because I knew specifically what to look for.  I'm looking forward to learning more about metadata in later labs so I can better understand what I was looking at.