Saturday, February 27, 2016

Choropleth and Proportional Symbol Mapping

Cartographic Skills - Mod 7

This week's lesson was to show population densities as well as wine consumption in Europe by creating a choropleth map and using either graduated or proportional symbols.  The lesson started in ArcMap by adding the shapefile for the data set and classifying it.  We started by excluding some of the outliers so our data wouldn't be so skewed.  After perusing the different classification options I chose to go with Natural Breaks because it seemed to most accurately represent the data.  I do wish now that I had gone with 6 classes instead of 5, but it's too late to go back.

The next step was to add the shapefile again as a second layer for wine consumption.  For this data set I had wanted to go with proportional symbols, but when I tried I was not at all happy with the results.  I tried changing the number of classes and adjusting the symbol size through the Min Value, but everything I tried still left the largest symbol too large, and the distribution of the data by symbol just did not seem accurate.  Eventually I settled on graduated symbols with a Natural Breaks classification and 5 classes.  Again I played around with the classes a bit because I didn't care for the size of the smallest symbol, but I felt the data was accurately represented so I stuck with what I had.
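If I ever need to tweak those classes again without clicking back through the Layer Properties dialog, arcpy.mapping can adjust an existing graduated renderer.  This is only a rough sketch; the layer name, field name, and break values are made up, and it assumes the layer in the open map document already has graduated symbology applied:

```python
import arcpy

# Assumes ArcMap is open with the finished map document; the layer name,
# field name, and break values below are hypothetical.
mxd = arcpy.mapping.MapDocument("CURRENT")
df = arcpy.mapping.ListDataFrames(mxd)[0]
lyr = arcpy.mapping.ListLayers(mxd, "wine_consumption", df)[0]

if lyr.symbologyType in ("GRADUATED_SYMBOLS", "GRADUATED_COLORS"):
    lyr.symbology.valueField = "WINE_L_PER_CAPITA"      # hypothetical field
    # Manually chosen class limits (first value is the minimum); the number
    # of classes follows from the list of breaks.
    lyr.symbology.classBreakValues = [0, 2, 5, 8, 12, 16]

arcpy.RefreshActiveView()
```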

I chose not to add any text in ArcMap because I knew it would get clumped together in AI and it would be a pain separating it out to adjust.  So I switched over to layout view to add my inset and map elements.  For my inset I created a new data frame, added the data as two layers, and repeated the process of classifying them.  Once I finished with that I remembered I could have just copied the layers from the original data frame and they would have been good to go.  After banging my head on the desk for a while I moved on to adding my legend.  I puttered about with this for a while trying to find a layout that would make use of this week's lesson, but in the end the best legend style for the layout of the map was just a plain old ordinary one.  I did decide not to put a border around it this time, though, because I was working around the image and didn't want to block anything out, but I still wanted the legend large enough to read.  With the boring old legend in place I added the rest of my map elements, then exported to a JPG file just to make sure everything looked right before exporting to an AI file.
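As a side note, those export steps can also be scripted against the open map document; a minimal sketch with placeholder output paths:

```python
import arcpy

# Export the current layout to JPEG for a quick check, then to AI for
# Illustrator work.  Output paths are placeholders.
mxd = arcpy.mapping.MapDocument("CURRENT")
arcpy.mapping.ExportToJPEG(mxd, r"C:\temp\europe_wine_check.jpg")
arcpy.mapping.ExportToAI(mxd, r"C:\temp\europe_wine.ai")
del mxd
```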

Once in AI I checked my print preview to make sure everything had come over right, then exported to another JPG file just to be sure.  And of course it didn't look right.  I was really surprised to find a couple inches of white space around the top and right side of my map.  I spent a considerable amount of time searching for an explanation and tried multiple fixes, but nothing worked.  Finally I posted a question about it on the discussion board, then started the long, arduous process of cleaning up the layers.  It's getting easier to recognize what's what, but it's still a pain to separate all the individual elements from the groups AI clumps them into.  That was probably the most time-consuming part of the whole lab, but I had to separate out the wine symbols so I could move them to more aesthetic locations and out of the way of the labels.  Once finished, I turned that layer off and proceeded to enter all the labels for the countries and get them placed correctly.  With them in place I turned the symbol layer back on and moved the symbols around to fit in with the labels.

At this point I was pretty much done and was going to look into substituting my circle symbols with a symbol that represented wine, but first I wanted to make sure what I had was safe in case I messed something up beyond repair, and to figure out my white space issue.  I went through and exported it several times after trying different things, but nothing seemed to make any difference.  I kept checking the discussion board for suggestions and tried what I could, then finally, on my last effort before giving up for the day, I noticed there was a little box on the export page about using the artboard.  I checked that box and voilà!  It finally worked.  But at that point it was already so late in the day that I decided to forgo the possible .5 points of extra credit and just left my symbols as circles.  I really had wanted to play with that for a bit, but after 10 hours of working on this today alone I decided I'd had enough and moved on to the Process Summary and this blog.


Sunday, February 21, 2016

Data Classification

Cartographic Skills - Mod 6

This week's lesson was on data classification.  We learned about several methods of classification but focused on only four for our assignment.  Our assignment was to create two maps, each containing four data frames depicting the same information, displayed using four different data classification methods: Natural Breaks, Equal Interval, Quantile, and Standard Deviation.

Both maps were of the distribution of the senior population in Miami-Dade County, FL.  The first map was by percentage of population over 65 and the second was of the actual population of seniors 65 and up per square mile.  This is my map of the population of seniors 65 and up per square mile:

This map is of the distribution of the population of seniors 65 and up per square mile in Miami-Dade County, FL.

I like the map by actual population by area better because I feel it is more honest.  The percentage map doesn't tell you how many seniors are there, just what percentage of the population in a certain location is seniors compared to the rest of the population.  If 75% of the population is seniors, you don't know if that means three of the four people in that area are seniors or three hundred thousand of four hundred thousand.  With the population by area you at least have a range of actual numbers to go by.

Of the four classifications, Standard Deviation is the one I don't understand at all.  Standard Deviation classes are formed by adding or subtracting the standard deviation from the mean of your dataset.  Data needs to be normally distributed to use this method, and if it is, the method gives clear dividing points for the classes.  It groups data that is similar and avoids grouping data that is not, and identical values cannot fall into two different classes.  By its definition it sounds like the perfect method, until you see the results.  The way the data is presented requires at least a basic understanding of statistics.  The color scheme makes no sense until you compare it to the legend, but without that understanding of statistics the legend makes no sense either.

Equal Interval classifies data into equal ranges by dividing the total data range by the number of classes.  Each class covers an equal range of values, but the number of observations in each class can be very unequal.  This method works best on data that is spread fairly evenly across the entire range.  It is used by many institutions and government agencies to map complex spatial patterns.  These maps are easy to understand and interpret, but may contain empty classes; it appears there are two, maybe three empty classes in this map.

Quantile classifies data into a set number of categories with an equal number of values in each category.  This is done by dividing the total number of observations by the number of classes, with class assignment based on rank order.  With the quantile method there are no empty classes, but a single class could contain distinctly unlike data values.

The Natural Breaks method minimizes the difference between data values in the same class and maximizes the difference between classes by looking for the natural groupings within the dataset.
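Just to make the differences concrete for myself, here is a rough sketch (outside of ArcMap, using Python and a small made-up set of values) of how the class breaks for three of these methods could be computed.  It isn't the lab workflow, just my own illustration; Natural Breaks needs the Jenks optimization, which I only note in a comment.

```python
import numpy as np

# Made-up values standing in for the senior-population counts per square mile.
values = np.array([3, 8, 15, 22, 40, 55, 61, 78, 90, 120, 150, 400], dtype=float)
n_classes = 4

# Equal Interval: upper limits are equal-width steps across the full data range.
width = (values.max() - values.min()) / n_classes
equal_interval = [values.min() + width * i for i in range(1, n_classes + 1)]

# Quantile: upper limits are percentiles, so each class holds roughly the same count.
quantile = [np.percentile(values, 100.0 * i / n_classes) for i in range(1, n_classes + 1)]

# Standard Deviation: breaks placed a whole standard deviation apart around the mean
# (ArcMap also offers 1/2, 1/3, and 1/4 standard-deviation intervals).
mean, std = values.mean(), values.std()
std_dev = [mean - std, mean, mean + std]

print("Equal Interval:", equal_interval)
print("Quantile:      ", quantile)
print("Std Deviation: ", std_dev)

# Natural Breaks would instead minimize within-class variance, e.g. with
# jenkspy.jenks_breaks(values, 4) if the third-party jenkspy package is installed.
```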

This assignment was pretty easy as far as the mapping portion went.  The analysis portion was a different story.  I think never having had a statistics class makes a lot of this difficult to understand.  I do feel I have learned a lot from this assignment though and hope I can solidify what I have learned before it all goes away.

Thursday, February 18, 2016

Projections Part 2

Intro to GIS - Week 6

Whew!  It's done.  This one wasn't so much difficult as time consuming.  There were a lot of steps involved and a number of inconsistencies that caused confusion and delay.  Our assignment was to download data from a number of outside sources and bring it all together in a map.  Sounds simple enough, until you learn that all these files are in different projections, or have none at all, and have to be defined and/or reprojected to match one another.
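For my own notes, the define-then-reproject steps boil down to two geoprocessing tools, which could also be run from Python.  This is only a rough sketch with made-up file paths, assuming the undefined data really is in NAD 1983 UTM Zone 16N:

```python
import arcpy

# Hypothetical inputs; the real lab data came from several different sources.
undefined_fc = r"C:\data\tanks_no_prj.shp"      # data with no coordinate system defined
source_fc    = r"C:\data\roads_geographic.shp"  # data in a different coordinate system
out_fc       = r"C:\data\roads_utm16n.shp"

utm_16n = arcpy.SpatialReference(26916)  # NAD 1983 UTM Zone 16N

# Define Projection only labels the data with the system it is already in...
arcpy.DefineProjection_management(undefined_fc, utm_16n)

# ...while Project actually transforms the coordinates into a new system.
arcpy.Project_management(source_fc, out_fc, utm_16n)
```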

Petroleum Storage Tank Contamination Monitoring (STCM) sites in Escambia County with an aerial image of Walnut Hill, FL.
It's not a pretty picture and thank goodness it's not required to be.  The point of this exercise was to get all the data to match up.  Had there been more time I would have cleaned it up a bit, but this is what was required so here it is.

This assignment was a bit of information overload.  I think it's going to take me several days to put the whole thing in perspective.  At this point my mind is still spinning.

Sunday, February 14, 2016

Spatial Statistics

Cartographic Skills - Mod 5

This week's assignment on spatial statistics was an Esri web course.  We learned how to do an in-depth examination of our data's characteristics using spatial statistics tools and the ArcGIS Geostatistical Analyst tools.

Map of Weather Monitoring Stations in most of Western and Central Europe


This is a map of weather monitoring stations throughout most of Europe.  We used the Mean Center and Median Center tools to calculate the mean and median centers of our temperature data, then employed the Directional Distribution (Standard Deviational Ellipse) tool to create a directional distribution ellipse.  These tools are all located in the Spatial Statistics toolbox under Measuring Geographic Distributions.
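For reference, those same three tools can be run from Python as well.  A minimal sketch with placeholder paths and a hypothetical temperature field name:

```python
import arcpy

# Hypothetical paths and weight field; the lesson ran these tools from the toolbox.
stations = r"C:\data\europe_stations.shp"

# Mean Center and Median Center of the stations, weighted by a temperature field.
arcpy.MeanCenter_stats(stations, r"C:\data\mean_center.shp", "TEMP")
arcpy.MedianCenter_stats(stations, r"C:\data\median_center.shp", "TEMP")

# Directional Distribution (Standard Deviational Ellipse) at one standard deviation.
arcpy.DirectionalDistribution_stats(
    stations, r"C:\data\std_ellipse.shp", "1_STANDARD_DEVIATION", "TEMP")
```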

This map was a little difficult to work with because of all the background noise.  Calculating the median and mean centers and the directional distribution was very simple with the ArcGIS tools, but placing the map elements was a little more difficult because there was so little clear space.  To keep my elements from being washed out I added background fill where I could: in the legend, the scale bar, and the text box with my explanation of the map.  I was a little surprised to see the background fill didn't completely fill the legend frame, and I couldn't find any setting where I had caused that effect, but I actually liked the way it looked.  The north arrow, author, and date I put in as clean a spot as I could find, but it was difficult finding places for them where they would be visible yet balanced.

I chose to go with a landscape view to provide more room for my elements while zoomed in as much as possible on the area of interest.  I did end up shifting the area of interest off center in order to make room for the descriptive textbox and the elements, but I felt the textbox and legend on the same side helped to keep the map in balance.

The source data text kept shifting around on me when I zoomed in and out trying to place some of my elements in its vicinity.  Eventually I decided to move my author and date text over by my legend instead, because even after adding a halo that text was not standing out against the background near the source data text.  I didn't feel the source data text was very legible where it was either, but there was nothing I could do about that.

I chose to go with kilometers on my scale bar since the map was of European weather stations and I felt the target audience would most likely be European.

Wednesday, February 10, 2016

Projections

Intro to GIS - Week 5

This week's assignment was very interesting.  We had to make a map with three data frames containing the same data, but displayed in different coordinate systems.

The State of Florida in three projections.  The table shows the square miles of four counties within each data frame.  Check out the difference from one projection to another.
The map also had to contain a table showing the square mileage of four counties in each of the three projections.

For the first data frame we started in an Albers Conical Equal Area projection.  From there we used the Data Management Tools and the Projections and Transformations toolset to access the Project tool, which projects spatial data from one coordinate system to another.  We then created two more data frames and used the same method to assign coordinate systems to them.  In the end we had data frames for Albers Conical Equal Area, UTM, and State Plane N.

All three of these data frames contained a map of the county boundaries for the State of Florida.  Within the county boundary layer we opened the attribute table, added a field called "Area", and used "Calculate Geometry" to populate it with the square mileage of each county.  From this we created a selection of four counties for comparison and saved them to a separate layer.  With this information we were able to create a table to display on the map showing the variations in square mileage for the same counties in different projections.
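Scripted, that add-a-field-and-Calculate-Geometry step would look roughly like this (the shapefile path is a placeholder, and the expression assumes the data is in a projected coordinate system so square miles make sense):

```python
import arcpy

# Hypothetical feature class of Florida county boundaries.
counties = r"C:\data\florida_counties.shp"

# Add a numeric field, then populate it with each polygon's area in square miles.
arcpy.AddField_management(counties, "Area", "DOUBLE")
arcpy.CalculateField_management(
    counties, "Area", "!shape.area@squaremiles!", "PYTHON_9.3")
```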

I think I understand projections a bit better after this exercise, but I can see it will be a while before I'm not easily confused by them.

Sunday, February 7, 2016

Mod 4 - Cartographic Design

This week's assignment was to create a map featuring the schools of Ward 7 of Washington D.C. in ArcMap using the Gestalt principles.  This lab was pretty fun.  It started with data obtained from District of Columbia Open Data being imported into a new map in ArcMap.

A map of the schools of Washington D.C.'s Ward 7 using the Gestalt principles

From there it was a matter of manipulating the layers so everything would show properly and turning off anything that was just plain clutter.  This required employing the Gestalt principles.

I kept the background clutter to a minimum by shutting off layers that did not emphasize the theme of the map.  After that I applied contrast to emphasize the area of interest by darkening the background of Ward 7 and lightening the background of the surrounding area.

Then I used visual hierarchy to emphasize the schools by breaking them down into three categories and sizing them by rank so a quick glance would indicate a difference.

Next I employed figure-ground distinction by lightening the roads in the background so the symbols would stand out more clearly.

After this was all set up I went into the Neighborhoods attribute table and selected several neighborhood names to display and created a new layer of this selection.  In order to move them around into more appropriate positions I converted the labels to annotations.  I had a bit of a problem with this step.  The conversion process appeared to work, but for some reason I was unable to move the text.  I could select it and change the font and size, but I couldn't move it.  After numerous failed attempts I just deleted the original selection layer and recreated it then ran the conversion again.  The second time worked fine.

Having my main map complete, I switched to layout view, added another data frame for an inset map, and set that up with very sparse information.  Once that was set I added all my map elements and shifted them around until my map felt balanced.  I exported to a .jpg file, saved my map, then did a Save As to another map document and tried shifting it to landscape view to see how that would look.  I didn't like it as much; I didn't feel it utilized the space as well as portrait did, so I stuck with portrait.


Friday, February 5, 2016

Sharing GIS Maps and Data - Week 4

This week's assignment was to create three different map outputs to share.  I chose to base mine on my top 10 favorite restaurants and started by listing them in an Excel spreadsheet.  Once I listed all the places in order of preference I looked up and added their addresses and URLs showing a dish from each restaurant.  Once the spreadsheet was done I saved it as a Text (Tab Delimited) file.

After that things got confusing.  The next step was to geocode the zip codes and addresses.  This was done by uploading the text file to ArcGIS Online, then downloading the layer to open in ArcMap.  Part of the process in ArcGIS Online allowed for making some design choices, but they were very limited.  The first output from this process was this public web map:

http://arcg.is/1Ra5sw8

This method was quick and easy, but I didn't care much for the results.  I didn't feel I had any control over the symbols and how they were displayed.  It didn't seem to matter if I zoomed in or out; there was only one view where I didn't have a lot of symbol overlap.  I actually changed which restaurants I listed and their rankings just to get enough space in between so each individual symbol could be seen.  Even then one overlapped another, but not so much that you couldn't tell there were two there and select each.  I found this quite annoying.

I think this first output was supposed to be the Map Package and the basis for the next map, the ArcGIS Online map, but since the last one was mostly done in ArcGIS Online I'm rather fuzzy on the difference.  Here, the next step is to open the "map package", which launches ArcMap, though a couple steps later the task is to "Create a Map Package".  Hence my confusion.  But at this point I did a bit of work, mostly creating a shapefile and adding it along with a World Streets layer.

After verifying the World Streets layer and the geocoded list were both displayed in the same coordinate system, I started working on the scale to make sure the symbols didn't look like a jumbled pile as I zoomed in and out.  The next step was to clean up the attribute table because new fields had been added during the geocoding stage.  Once that was done I shared the map as a Map Package, even though I had used a map package as the basis for this map.  I shared the Map Package to my ArcGIS Online account, analyzing it in the process to make sure nothing would prevent the map from being published or cause problems with its performance.  Once the process was through, the Map Package was created and could be pulled up from ArcGIS Online.
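For my notes, the packaging step itself boils down to a single geoprocessing call; a rough sketch with placeholder paths:

```python
import arcpy

# Placeholder paths; the .mxd is the finished restaurant map.
mxd_path = r"C:\maps\top10_restaurants.mxd"
out_mpk  = r"C:\maps\top10_restaurants.mpk"

# Package Map bundles the map document and all of its data into a single .mpk
# file that can then be uploaded to ArcGIS Online.
arcpy.PackageMap_management(mxd_path, out_mpk)
```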

Last was to make a Google Earth map.  This was quick and easy since most of the work had already been done for the first two output maps.  Here a Layer to KML conversion was done, allowing the map to be viewed in Google Earth; clicking on the KML file launched Google Earth.
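The conversion is a single tool as well; a minimal sketch with placeholder names, assuming the restaurant layer has been saved out as a layer file:

```python
import arcpy

# Placeholder layer file and output; Layer To KML writes a compressed .kmz
# that Google Earth opens directly.
arcpy.LayerToKML_conversion(r"C:\maps\top10_restaurants.lyr",
                            r"C:\maps\top10_restaurants.kmz")
```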
This lab was very intensive.  There was a lot of new terminology to learn and a lot of bouncing around between programs, which made things confusing.  I think I'll need a bit more time to absorb this lesson than any of the others.