Tuesday, March 28, 2017

Conducting a Distance Azimuth Survey

Introduction:


Figure 1: Survey Stations along Putnam Trail
While surveying with a grid-based system can be useful for mapping smaller plots, technological advancements such as GPS have made field surveying an easier, faster, and more accurate means of obtaining and displaying field data at a variety of scales. However, the use of any type of technological equipment comes with the risk of equipment failure while the surveyor is out in the field. Still, the job must get done! For this week's lab, students used a variety of tools at locations along Putnam Trail to survey a series of selected trees using the old-school Distance-Azimuth survey method. This method requires measuring the distance and compass azimuth between several surveyed points (10 trees) and one pinpoint location tied to a latitude and longitude coordinate pair (the data collection station), to be used later for mapping. The surveyed area and stations are pictured in the reference map (Figure 1).



Methods:


Data Collection:


To familiarize students with the variety of equipment that can be used to obtain distance and azimuth data in the field, students divided into three groups and worked as teams to collect 10 data points at each of the three stations, using the various tools provided as they rotated through.


Figure 2: Image of TruPulse 360 
At Station 1, a TruPulse 360 laser rangefinder was used to determine the distance between the survey point and its selected surrounding data points, as seen in Figure 2. Other necessary tools included a standard compass for measuring the azimuth of the surveyed trees and a basic GPS unit to record the coordinates of the central point, or data collector's position. At the data collection point, the latitude and longitude pair read 44.796 deg. N and -91.5016 deg. W. While collecting the data, students alternated roles operating the TruPulse 360 viewer and compass, measuring the diameter of selected trees, and recording the data until reaching the collection total of 10 points. This method was especially accurate in measuring the distances between the collection point and each tree. Also noteworthy, the measuring units displayed in the reading scope could easily be changed in the equipment's settings to read in imperial units, metric units (as used here), or degrees. One challenge that arose when using this method, however, was the sensitivity of the tool's reading. In some instances, the tool measured the distance to a small branch that intercepted the scope's line of sight on the way to the intended tree. For this reason, many of the data points collected at this station had to be double- and triple-checked to ensure the recorded distance belonged to the intended target.
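For later mapping, each distance-azimuth pair can be resolved into a coordinate with simple trigonometry. The sketch below is a minimal illustration only: it uses a flat-earth approximation and hypothetical field values (apart from the Station 1 coordinates quoted above), not the exact workflow used in the lab.

```python
import math

def tree_position(lat, lon, distance_m, azimuth_deg):
    """Offset a station coordinate by a measured distance and azimuth.

    Uses a flat-earth approximation, which is reasonable for the short
    (tens of meters) shots taken in a survey like this one.
    """
    az = math.radians(azimuth_deg)        # compass azimuth, clockwise from north
    north = distance_m * math.cos(az)     # northing offset in meters
    east = distance_m * math.sin(az)      # easting offset in meters
    dlat = north / 111_320.0              # ~meters per degree of latitude
    dlon = east / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# Hypothetical shot from Station 1: a tree 25 m away at an azimuth of 130 degrees
print(tree_position(44.796, -91.5016, 25.0, 130.0))
```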

Figure 3: Measuring Diameter of Trees at Breast Level
Station 2 was the most time consuming because, aside from the GPS needed to denote a specific coordinate point, this station made do without technology entirely. Instead, students used a tape measure and compass to determine the distances between the data collection point at 44.79585 deg. N and -91.50033 deg. W and its surrounding trees. For this reason, the group did not travel nearly as far for plotted points, and the second station appears as the most clustered data collection group of the three methods. While this method can be especially handy in case of equipment failure, it was also the least accurate of the three. Again, students alternated roles between holding and leading the tape measure, reading the compass azimuth, measuring tree diameters, and recording the resulting data. Figure 3 shows a fellow group mate taking the circumference of a tree at standard breast height in order to find the diameter. One complication the group faced in gathering data was the struggle to pick trees that were far enough away to appear significant when plotted on a map, but not so far that another tree would block the tape measure's route, causing a curve in the tape and skewing the reading.

The last station used a range reader and receiver combination to record distance. The data collector held the range reader gun at 44.795383 deg. N and -91.499388 deg. W while the person measuring tree diameters held the receiving end of the pair. The reader measured and displayed the distance between itself and the signal picked up from the receiving device. One challenge that arose with this method was the occasional inability of the reader to pick up the signal sent out by the receiver. This usually required only minor positioning adjustments so that signals would send properly without interruption.


Data Normalization and Mapping:


Once all of the data had been collected in the field, it was entered into an Excel file and normalized into a format compatible with ArcMap. A sample snapshot of the final data sheet is displayed below in Figure 4.

Figure 4: Normalized Data Table in Excel


Figure 5: "Bearing Distance to Line"
and "Feature Vertices to Point" 
Commands Location
After creating the table, the routes and distances measured between the three central data collection points and their corresponding trees were imported and plotted onto the map using the Bearing Distance to Line command, found in ArcToolbox under Data Management >> Features as shown in Figure 5. Figure 5 also references the location of the tool used to place points where the surveyed trees stood at the end of each measured distance line; this tool is found in the same section of ArcToolbox, labeled Feature Vertices to Point. Finally, a topographic image was placed beneath the resulting plotted points for reference. Figure 6 illustrates the results generated after the use of both tools, overlaid on the basemap.


Figure 6: Tool's Resulting Map Image and Plotted Data
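For those who script their workflows, the same two geoprocessing steps can also be run from Python with arcpy. The snippet below is a hedged sketch: the workspace, table, and field names are hypothetical, but the two tool calls are the ones named above.

```python
import arcpy

arcpy.env.workspace = r"C:\labs\azimuth_survey.gdb"   # hypothetical geodatabase

# Draw a line from each station along the measured azimuth for the measured
# distance (one line per surveyed tree).
arcpy.BearingDistanceToLine_management(
    in_table="normalized_survey",        # hypothetical table and field names
    out_featureclass="survey_lines",
    x_field="station_lon",
    y_field="station_lat",
    distance_field="distance_m",
    distance_units="METERS",
    bearing_field="azimuth",
    bearing_units="DEGREES",
    line_type="GEODESIC",
    spatial_reference=arcpy.SpatialReference(4326))   # WGS 1984

# Place a point at the far end of each line -- the surveyed tree.
arcpy.FeatureVerticesToPoints_management("survey_lines", "tree_points", "END")
```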


Results:


From the data points and features plotted in the image above, the following map (Map 1) was constructed to further showcase the resulting distribution of each of the three methods and the corresponding trees selected when collecting data from each of the central points.

Map 1: Putnam Drive Survey Stations, Methods and Tree Data Points

The first station's TruPulse 360 was the most effective of the three tools for collecting distance data. It was quick, user friendly, and accurate in its measurements, and it required only one operator. It also allowed the surveyor to collect data points without actually having to approach any of the trees. Considering the steep uphill slope of the terrain south of the trail, some of these trees were difficult to reach by foot when necessary, as was the case when measurements were being taken at the other two stations. Station 3, for instance, had the convenience of technology for measuring distances as well, but the surveyor still needed a second person to hold the receiver at the tree's location. For this reason, most of the data points at the last two stations were collected north of the trail to avoid any uphill hikes.




Conclusions:


Learning the Distance-Azimuth surveying method is an important skill to keep as a backup in case of equipment or technology failure. Though the results produced are less accurate than those that may be obtained through the use of technology, the method still does a fairly good job of displaying the overall locations of collected data points.

The Distance-Azimuth method can also be applied in the Point-Quarter sampling method used for determining the relative concentration of a species in a given habitat, especially habitats with a less defined shape, as is the case along Putnam Trail. During this type of sampling, the same relative technique is used to estimate the overall number of species within a given area. To perform the sampling, a number of individuals in the area (in this case, trees) are sampled at random from a central point. Their corresponding data are recorded and the trees are each assigned an identifying number, just as performed in the lab detailed here. The methods begin to differ from there. In the Point-Quarter surveying method, once points are collected, a compass is used to determine and lay out four individual quadrants. The total sampled number of trees observed is multiplied by four (for the four quadrants) to get the relative density of the area. This number is multiplied by the total density (calculated from the tree diameters) in order to obtain the absolute density of a species within an area; in this case, the absolute density of each tree species along Putnam Trail.
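As a rough illustration of where those numbers come from, here is a minimal sketch of the textbook point-centered quarter calculation in its mean point-to-tree distance form. Note that this is the standard formulation rather than the exact procedure from this lab, and the species names and distances are invented.

```python
import statistics

# Hypothetical point-quarter data: the distance (m) from a sample point to the
# nearest tree in each of the four compass quadrants, with the tree's species.
observations = [
    ("oak", 3.2), ("maple", 4.1), ("oak", 2.8), ("pine", 5.0),
    ("oak", 3.6), ("maple", 4.4), ("pine", 2.9), ("oak", 3.1),
]

mean_d = statistics.mean(d for _, d in observations)

# Total density: roughly one tree per (mean distance)^2 of ground area.
total_density = 1.0 / mean_d ** 2          # trees per square meter

# Relative density of a species is its share of the sampled trees;
# absolute density = relative density * total density.
for species in sorted(set(s for s, _ in observations)):
    share = sum(1 for s, _ in observations if s == species) / len(observations)
    print(species, share * total_density)
```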

Overall, this lab equipped students with the knowledge necessary to overcome potentially critical situations in the field and still get the job done! Despite living in an age of ever-advancing technology, learning the basics of the trade and the "old-school" methods used to collect location-based data is a handy skill set to have stowed away for the occasional instances in which it just might be needed.





Tuesday, March 14, 2017

Pix4D Demo

Introduction:


This lab demonstration was designed to introduce students to Pix4DMapper, a program that transforms a series of aerial images into a single compiled mosaic capable of rendering both 2D and 3D surface models.

An online Pix4D Software Manual was provided to familiarize students with the capabilities and limitations of the program's functions. Through exploration of this manual, students were prompted to answer the following questions related to the software:


-What is the overlap needed for Pix4D to process imagery?
        70% Frontal, 60% Side
-What if the user is flying over sand/snow, or uniform fields?
        More overlapped imagery is needed in these instances: 85% Frontal, 70% Side
-What is Rapid Check?
        Rapid Check allows processing to be completed at a much quicker pace, but with a lower accuracy and resulting resolution.
-Can Pix4D process multiple flights? What does the pilot need to maintain if so?
        The software IS able to process multiple flights, provided that there is a thorough amount of overlap between photos and that the photographs were taken under the same conditions (same relative time of day, weather, ground layout, etc.).
-Can Pix4D process oblique images? What type of data do you need if so?
        Oblique images CAN be processed, provided there is enough overlap in and between datasets. 
-Are GCPs necessary for Pix4D? When are they highly recommended?
        GCPs are NOT necessary for Pix4D. They are, however, highly recommended during tunnel reconstruction. 
-What is the quality report?
        The quality report works to identify errors within the processing of the given data imagery. 



Methods: 


Following an in-class demo, students began by creating a new project in the program, importing the series of 68 provided images titled "Flight 1", and running Initial Processing before completing the final two steps: Point Cloud and Mesh processing, and DSM, Orthomosaic and Index processing.

The initial processing stage generated a quality report (Figure 1) and a single mosaic (Figure 2) that could be used for 3D rendering after completion of the final two processing steps (Figure 3). All 68 of the photos in the series were successfully processed.

Figure 1: Quality Report

Figure 2: Flight 1 Mosaic of Litchfield Mine, Eau Claire, WI



Figure 3: 3D Rendering of Mine Site



Upon the successful completion of 3D rendering, students were able to use the program to produce a fly-by video of the 3D terrain surface examined in Figure 3. The resulting video, revealing a glimpse of the 3D model from each of the four directional perspectives, is provided in Video 1 below:




Video 1: Flyover of Mine Site




Data Discussion: 


The overall results of processing the original 68 aerial photos proved successful in stitching together a single mosaic, which was later used for 3D rendering. Contributing largely to the success of this data processing was the high amount of image overlap in the data set. The diagram in Figure 4 illustrates where these areas of overlap occurred, as well as the degree of overlap. The bright green area occupying the majority of the figure represents areas with five or more overlapping images for every pixel. Contrastingly, the red areas at the fringes of the study area depict a scarcity of image overlap per pixel, likely because they lie outside of the mining site.

Figure 4: Overlap


Completion of the final two processing stages resulted in a total of two rasters, one a DEM and the other a mosaic, as depicted in the map below. The DEM symbolizes the elevation of the terrain in the area of interest, while the mosaic provides the actual visual framework. Areas of high elevation are pictured in red on the DEM image, while areas of low elevation are pictured in blue.




Final Critique:


In all, the results of this lab demonstrate the functions and capabilities of Pix4DMapper as well as the potential benefits of its use. While this demonstration revealed only a small fraction of the software's potential, it is evident that this program could prove powerful in the geospatial world, analyzing everything from land use and agriculture to architectural and structural projects in urban planning.



Tuesday, March 7, 2017

Using Survey123

Introduction:


There are a number of survey sets which geographers often consult for the purposes of collecting data, the gold standard and frequent go-to being the US Census records. Sometimes, however, the surveys that are popularly consulted do not contain the information sought in the data collection process. In these instances, developing personalized surveys is a key way for geographers to reach out to the community and collect data on a particular topic of interest. Survey123 provides a user-friendly platform that allows students to quickly formulate and customize online surveys for a community. This lab provided students with a tutorial (through https://learn.arcgis.com/en/gallery/) on how to use Survey123 and manipulate details within the program for use in future labs.



Methods: 


The tutorial was formatted as step-by-step exercises for students to replicate directly and, ultimately, produce a final product which, in the case of the example, would serve a homeowners association in evaluating the disaster preparedness of community homes in a given study area. The tutorial was divided into a series of four lessons: "Create a survey," "Complete and submit the survey," "Analyze survey data," and "Share your survey data."

During the first lesson, "Create a survey," students worked at constructing the overall survey form, focusing on the questions the survey would ask and the standardized types of responses a respondent may use to answer. The platform is set up in an easy-to-manage format that allows the survey builder to simply "Add" (Figure 1) a question with presets for each response type (numerical, multiple choice (single- or multi-answer), text), and then use "Edit" (Figure 2) to formulate the actual question and modify any necessary response options.

Figure 1: Add a Question Type

Figure 2: Edit the Question and Associated Response Options


Once the survey was built and complete, students were able to move on to the "Complete and submit the survey" lesson of the tutorial. In this section, students performed a number of runs through their survey's final form and submitted them for use. In the case of this lab, the survey was completed a total of eight times before evaluation.

The third lesson in the tutorial, "Analyze survey data," allowed students to visualize the statistical results of their collected survey responses. The Survey123 platform lets students visualize the collective results for each question through the use of column charts (Figure 3), bar charts (Figure 4), pie charts (Figure 5), and proportional symbol mapping (Figure 6). One of each of these methods is shown for the people-per-household question in the four figures below. Each question also hosts statistical result tables posted below the visual illustrations of the data collected; some include a number of statistics while others include simple percentages for certain responses.

Figure 3: Survey Question Results- Column Bar Graph


Figure 4: Survey Question Results- Bar Graph Chart


Figure 5: Survey Question Results- Pie Chart Graph


Figure 6: Survey Question Results-Proportional Mapping Graph




The last lesson, "Share your survey data," walked students through the process of publicly publishing their survey to be taken by the target community through the web. The final tutorial illustrates how the program can generate maps with pop-up configurations, linking data collection points to the text of their associated surveys, as shown below in Figure 7. The survey creator can choose which survey elements remain in the pop-up configurations within the maps so that personal information is not revealed to the public eye.

Figure 7: Data Collection Points and Pop-Up Survey Configurations



Conclusion:


Overall, experimenting with Survey123 through the provided tutorial was effective in helping students learn the potential of the program and how to navigate its layout and manipulate its functions. Survey123 will prove useful in conducting personalized research that requires a self-generated survey to obtain new data about a population.




Sources:


Survey123
https://learn.arcgis.com/en/projects/get-started-with-survey123/lessons/create-a-survey.htm
ESRI


Development of a Field Navigation Map

Introduction:


The process of navigating a given territory often relies on grid systems and sight-based location points in order to be successful. There are a number of grid systems that can be utilized for referencing location during navigation. In generating a pair of field navigation maps for the Priory of Eau Claire, Wisconsin, students will demonstrate two of these grid systems in projection, one using a UTM projected coordinate system and the other using a traditional world geographic coordinate system measured in decimal degrees. The resulting maps and their use will be expanded on in later labs. 


Methods/Results:


For the first map, illustrated in Figure 1, I decided to set both my data frame properties and my individual feature projections to the UTM Zone 15N projected coordinate system (the zone in which the city of Eau Claire falls), which is based on a Transverse Mercator projection. I then formatted the grid in the layout properties to mark latitude and longitude points as reference points for the mapped area. 
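Which UTM zone applies can be worked out directly from longitude, since zones are 6 degrees wide starting at 180° W. A quick sketch of the arithmetic (the longitude here is an approximate value for Eau Claire):

```python
import math

def utm_zone(lon_deg):
    """UTM zones are 6-degree slices numbered 1-60 starting at 180 W."""
    return int(math.floor((lon_deg + 180.0) / 6.0)) + 1

print(utm_zone(-91.5))  # Eau Claire, WI -> 15 (Zone 15N in the northern hemisphere)
```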

The second map denoted as Figure 2, conversely, uses a world Geographic Coordinate System (GCS_WGS_1984) to map the area of focus and provides its units as decimal degrees when translated onto a grid system.

While the map in Figure 1 is beneficial for preserving the shape of the illustrated area, the map in Figure 2 is better for preserving direction, which could make it the optimal choice between the two for the purpose of navigation. The resulting maps are posted below: 




Figure 1: UTM Coordinate System





Figure 2: World Geographic Coordinate System




Sources:


ESRI and ArcMap
Geodatabase info provided in class from Priory











Tuesday, February 21, 2017

Map Making Fundamentals

Introduction: 


Understanding cartographic fundamentals is an essential skill set to maintain in the field of geography. To begin, it is important to note that there are at least five mapping elements which should be incorporated into all generated maps, including:

  1. North Arrow
  2. Scale Bar
  3. Locator Map
  4. Watermark (map illustrator)
  5. Data Sources
      (and a Title, of course)


Other elements to consider in cartographic design could include:

  • A Legend
  • Data Frame
  • Company or Client Logo
  • Labels and Annotations
  • Disclaimers
  • the Date
  • etc. 

This lab is designed to ensure that each of these key map components, along with all other proper mapping techniques, is included and utilized by students for the remainder of this course, as well as for other projects extending well beyond its duration. 

Methods & Results:


Part One: Creating A Map Series of Sandbox Terrain Survey-


In order to put these cartographic fundamentals into practice, students were asked to utilize their sandbox survey imagery from the previous two labs and generate a series of five maps using an interpolation method of their choosing. The first map would be a hillshade view of the sandbox survey terrain, and the final four would be oblique maps of the terrain viewed from four different angles. 

For the purposes of this lab, the interpolation method chosen for the map series was the Kriging interpolation method. The four oblique maps' angles were arranged so that, when organized together, the series would allow each sandbox corner to take a turn at the forefront of the map. North arrows help the map reader visualize the sandbox's orientation within each of the maps. The final series of maps for the Sandbox Terrain Survey is presented below:

Map Series 1: Sandbox Terrain Survey, Kriging Interpolation Method

Metadata: The tools utilized in collecting data points for the survey model included a meter stick, tacks, and string used for gridding. Data points were collected from the far sandbox across the street from Phillips Hall between 3:00 and 4:30 pm on Monday, January 30th.




Part Two: Hadleyville Cemetery, Mapping with Attributes-


The second section of the lab required students to construct another series of maps concerning the features of a number of graves located in the Hadleyville Cemetery in Eau Claire County, Wisconsin. Students were to generate a total of four maps, including three nominal maps: one denoting the year of death on each grave, another the last names associated with the graves, and the last involving a color-coded scheme indicating whether or not the graves are still standing. The final map required a numeric ranking, again based on the years of death written upon each grave. The resulting maps are posted below:


Map 2: Nominal Year of Death Map

The first map of this series reveals the dates associated with each of the graves within the Hadleyville Cemetery. The years range from as early as 1859 to as recent as 2006. Many graves from the same year occur in a row, but overall there is no progressive number series to indicate any order to how the burials were laid out.



Map 3: Nominal Last Names Map

The next map in the series reveals the last names associated with each of the graves. It is common to find matching names buried near one another across the cemetery. It is safe to assume that these clusters of common last names belonged to individual families or married couples. Some of the more commonly occurring names include "Beardsley" at the easternmost section of the map, "Folley" to the north, and "Higley" and "Hadley" to the northwest.


Map 4: Nominal Color Coded Conditions Map

This map is another example of a nominal map, though it has been color-coded to indicate whether or not each grave is still standing. In this instance, it is appropriate to include a legend within the map in addition to the original five map elements. The legend helps the reader decipher why each of the graves is color-coded as such. The blue dots symbolize graves still standing, while the red dots indicate graves which have fallen. The "Null" yellow dots most likely account for graves which already lie parallel to the ground, or graves which perhaps no longer have a stone to mark them at all. Overall, it seems most of the graves in Hadleyville Cemetery are standing in good condition. 


Map 5: Numeric Proportional Ranking Map based on Years of Deaths

The last map in the series was a proportional interval map ranking the dated ages of the graves found within the cemetery. Each of the three point sizes represents a time span of approximately 50 years. The smallest dots account for the oldest set of graves, with 'year of death' dates ranging from 1859 through the remainder of the century. The middle-sized dots pick up the dates beginning in 1900 and continue through the first half of the twentieth century to 1950. The largest dots represent the newest graves found in the cemetery, with 'year of death' listings ranging from 1951 to 2006. 



Conclusion:


Each of the maps within the two series showcased above includes all five essential mapping elements: a north arrow, scale bar, locator map, watermark, and list of data sources. When useful, additional mapping elements can be added, such as the legends featured within the last two maps of the second series. Overall, this lab was a useful exercise in reviewing essential mapping elements and cartographic fundamentals. 



Sources:


Hadleyville Cemetery Geodatabase (provided by professor)
ESRI Software

Tuesday, February 14, 2017

Sandbox Survey (part 2): Visualizing and Refining the Terrain Survey

Introduction: 


In the previous lab, students were assigned to groups of three and asked to construct and survey an elevation model featuring a ridge, hill, depression, valley, and plain using the terrain in an approximate square meter sandbox. Each group was supplied with tape, string, thumb tacks and some meter sticks in order to construct a grid system pattern and survey the terrain. Groups used various tactics of sampling in order to collect elevation data points from their terrain. The data points collected were then normalized by entering the coordinate points into an Excel document as individual x-, y-, and z-value attributes.

Data normalization is the process of organizing data into a table, with each column representing one attribute of the data presented. In this case, normalizing the data meant that, in the table produced in Excel, each x-value needed to be entered into one continuous column labeled 'X-Values,' all y-values into a continuous column marked 'Y-Values,' and so on, so that each row comprised four columns (the OID, x-, y-, and z-values). When looked at as a whole, each row within the Excel file would make up one coordinate point (this group had 400 coordinate points, and therefore 400 rows of data).
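The same reshaping can be scripted. The sketch below is a hedged example using pandas: it assumes a hypothetical raw worksheet laid out as a 20 x 20 grid of elevation readings with grid positions as row and column headers, and melts it into the one-point-per-row layout described above.

```python
import pandas as pd

# Hypothetical raw sheet: rows and columns are grid positions, each cell is
# the measured elevation (inches relative to the string "sea level").
raw = pd.read_excel("sandbox_raw.xlsx", index_col=0)   # hypothetical filename
raw.index.name = "y"                                   # rows are y positions

# Melt the 20 x 20 grid into one row per coordinate point.
points = raw.reset_index().melt(id_vars="y", var_name="x", value_name="z")
points.insert(0, "OID", range(1, len(points) + 1))
points = points[["OID", "x", "y", "z"]]                # 400 normalized rows
points.to_excel("sandbox_normalized.xlsx", index=False)
```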

When done correctly, the X, Y (and Z) coordinates of an Excel file may be uploaded as data points into ArcMap as a grid system. From here, students would be able to utilize tools in ArcMap to manipulate the data points and generate various 2D depictions of 3D projections of the terrain, which could then be imported into ArcScene to ultimately generate 3D digital models of the terrain created in the sandbox during the previous lab.



Methods:


Once the data has been normalized in Excel, the file containing these points is ready to be uploaded into the geodatabase in ArcMap using the 'Add XY Data' option. After the table has been imported, it can be converted into a point feature class, which reveals on the map the grid-like pattern students previously constructed in their sandboxes. These points can be symbolized to represent the individual values of each data point collected, but the image does not provide a good visual of the gradations between data point values.

To obtain the 3D visual, one can use the tools under the 3D Analyst toolbox in ArcToolbox, where a variety of interpolation methods can be selected to produce the desired 2D output. This image is stored in the geodatabase in raster format and can be loaded into ArcScene to be viewed in 3D, with the option to rotate it and view various angles. For the purposes of this lab, all 3D models were oriented so that they would be viewed in the same direction as the sandbox landscape was while being constructed in the field.
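These steps can also be scripted with arcpy. The sketch below is a hedged outline only: the workspace, table, and field names are hypothetical, a 3D Analyst license is assumed, and default tool parameters are used throughout.

```python
import arcpy

arcpy.CheckOutExtension("3D")                        # 3D Analyst license
arcpy.env.workspace = r"C:\labs\sandbox.gdb"         # hypothetical geodatabase

# Load the normalized table as XY points, then save a real feature class.
arcpy.MakeXYEventLayer_management("normalized_points", "x", "y",
                                  "pts_layer", in_z_field="z")
arcpy.CopyFeatures_management("pts_layer", "survey_points")

# Run the interpolation methods compared below.
arcpy.Idw_3d("survey_points", "z", "idw_surface", power=2)
arcpy.NaturalNeighbor_3d("survey_points", "z", "nn_surface")
arcpy.Spline_3d("survey_points", "z", "spline_surface")
arcpy.Kriging_3d("survey_points", "z", "kriging_surface", "Spherical")
arcpy.CreateTin_3d("survey_tin",
                   in_features="survey_points z Mass_Points <None>")
```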

Once the models had been generated and the symbology and orientation set, these 3D views could be exported as .png images and returned to ArcMap as raster features for mapping and scaling. Since the files re-added to ArcMap were rasters, a scale bar could not be automatically generated for the maps as is traditionally done. Instead, scale was indicated using the 'Drawing' toolbar in ArcMap to illustrate the general size of the actual sandbox being mapped (1 x 1 meter).

Students were to explore 3D analysis using the following five interpolation methods:


1. IDW (Inverse Distance Weighted)-


The IDW interpolation method uses the assumption of Tobler's First Law of Geography (things that are nearer are more similar to one another than things farther away) to fill in the gaps between the sampled data points. The points collected in the field are weighted with the greatest amount of influence, and that influence diminishes directly with distance. The resulting 3D model resembles a somewhat "lumpy" image, with the collected weighted points at the peaks of the lumps, as seen in Figure 1 below. This method tends to be useful only if the data being modeled has minimal variation in the range between collected points, and it is normally not the "go-to" method in 3D modeling.
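A minimal sketch of the idea, as a generic inverse-distance-weighted estimate rather than ArcMap's exact implementation, with made-up sample values:

```python
import math

def idw(samples, x0, y0, power=2):
    """Estimate z at (x0, y0) from (x, y, z) samples, weighting each
    sample by 1 / distance**power so near points dominate."""
    num = den = 0.0
    for x, y, z in samples:
        d = math.hypot(x - x0, y - y0)
        if d == 0:
            return z                      # exactly on a sampled point
        w = 1.0 / d ** power
        num += w * z
        den += w
    return num / den

samples = [(0, 0, -2.0), (1, 0, -1.5), (0, 1, -3.0), (1, 1, -2.5)]  # made-up inches
print(idw(samples, 0.25, 0.25))
```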

2. Natural Neighbors-

Also known as "area-stealing" interpolation, the natural neighbors interpolation method uses the same weighted distribution technique as the previous method, but generally does a better job of smoothing out the transitions between weighted data points, generating a less "lumpy" surface, as can be seen in Figure 2.

3. Kriging-

The kriging interpolation model is constructed using a formula based on the weighted average of all nearby collected data points to fill in the unsampled points within the grid. This method is ideal in many instances since it prioritizes smooth transitions between sampled data points while still providing the best unbiased prediction of the values in between. An example of the sandbox terrain using the kriging interpolation model is pictured in Figure 3 below. 

4.  Spline-

Like the previous kriging method, spline interpolation uses a mathematical formula, one that aims to minimize the overall curvature of the surface within the sampled area. As illustrated in Figure 4, this again results in a much smoother surface in the transitions between collected data points.

5. TIN (Triangular Irregular Networks)-

The last interpolation method, TIN interpolation, creates a series of edges connecting the collected data points to form a network of triangles and reveal a general outline of the sampled surface. In areas where the surface varies more drastically, a TIN provides higher resolution than in areas with little variance in values. A drawback to this technique, however, is its limited popularity as a result of its cost to build and process. 



Results:


The first interpolation method used, IDW, produced the "lumpy" 3D image shown in Figure 1, as promised in its description under the methods section. The 400 data points collected during the original survey can somewhat be seen at the peaks and troughs created by the weighted points. Though it may occasionally be beneficial to see the original data beneath the created surface, this method is neither the most visually appealing nor the most accurate in predicting the unsampled values between data points, and it is therefore not a commonly selected option in 3D mapping.



Figure 1: IDW Interpolation Model


The natural neighbors interpolation method illustrated in Figure 2 generates the more plausible surface of the first two methods. The weighted distribution at collected data points is still visible, but the transition between data points is much smoother than the first option provided.

Figure 2: Natural Neighbors Interpolation


Figure 3 reveals the kriging interpolation method in use. Here, again, the surface is generally smooth. This option is supposed to be the least biased in generating values for unsampled points; perhaps this is why it seems to reveal the least color variance across the surface, from its highest points down to the depths of the area. 

Figure 3: Kriging Interpolation


The fourth figure utilizes the spline interpolation method. As stated in its description, this interpolation type provides the smoothest outcome of all five methods: it limits the curvature of the surface and manages to create the most uniform result. Some errors can be seen in the image produced, however, in both hill areas of the surveyed display. The etched blending in the sides of the hills may indicate error in measurements taken in the field. In a follow-up survey, these measurements could be taken with more care for accuracy and a smaller rounding value than the quarter inch this group had decided upon. 

Figure 4: Spline Interpolation


The last interpolation method, TIN, is the most distinct of the five options. The method utilizes edges and resolution to illustrate the surface model. The triangular network pattern that results from these edges connecting the data points can be clearly seen in Figure 5 below, and is especially evident in the caldera-like feature at the lower left corner of the sampled area. 

Figure 5: TIN Interpolation



Conclusion:


Interpolation can be used for a multitude of purposes, including the mapping of rainfall, water tables, chemical concentrations, noise frequencies, and soil-type distribution. There are also many more interpolation methods to choose from beyond the five previewed in this week's lab, including PointInterp, Trend, and Density. 

The last few weeks of lab allowed students to familiarize themselves with the practice of field sampling, data normalization, and 3D modeling with Interpolation. 



Sources:


http://www.spatialanalysisonline.com/HTML/index.html?kriging_interpolation.htm

http://pro.arcgis.com/en/pro-app/help/analysis/geostatistical-analyst/how-inverse-distance-weighted-interpolation-works.htm

http://resources.arcgis.com/en/help/main/10.1/index.html#//005v00000027000000



Tuesday, February 7, 2017

Sandbox Survey (part 1): Creating a Digital Elevation Surface Model

Introduction:


Sampling is an effective time- and resource-saving tool that is often utilized in the process of data collection. Many times, a study area may realistically be too large to undertake a thorough and detailed collection of all the existing data beneficial to the study. In these instances, sampling allows the data collector to focus on a small-scale representation of the study interest in order to make generalizations about the larger picture as a whole. For geographers, this is a familiar and frequently exercised skill set: identifying key themes and patterns at local scales and analyzing them in relation to other themes and patterns found elsewhere, in an overall attempt to make better sense of our world. 

The process of sampling can be executed in a variety of ways. The three main types of sampling are random, systematic, and stratified. Somewhat self-explanatory, random sampling involves the spontaneous selection of data, where all available data points have an equal chance of being selected. This method is useful because it is the least biased of the three sampling techniques and can be applied to large sample populations. Systematic sampling is done according to a predetermined strategy or system, with data collected at even intervals across the study area. This strategy is useful since it allows for thorough coverage of the area in study. Lastly, stratified sampling is used when the study area is composed of smaller areas of a standard size. These smaller areas are each individually a smaller representation of the larger area, and the sample should therefore reflect that. One example of this might be blocks within a given neighborhood: the neighborhood is the overarching study area, but each block may be a miniature representation of what the neighborhood looks like as a whole.
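To make the distinction concrete, here is a minimal sketch contrasting the three sampling schemes on a hypothetical 20 x 20 grid of candidate cells (the sample sizes are arbitrary):

```python
import random

cells = [(row, col) for row in range(20) for col in range(20)]  # 400 candidates

# Random: every cell has an equal chance of selection.
random_sample = random.sample(cells, 48)

# Systematic: take every 8th cell in a fixed traversal order.
systematic_sample = cells[::8]

# Stratified: split the grid into four quadrants (the strata) and draw
# an equal number of random cells from each one.
def quadrant(cell):
    row, col = cell
    return (row >= 10, col >= 10)

strata = {}
for cell in cells:
    strata.setdefault(quadrant(cell), []).append(cell)
stratified_sample = [cell for group in strata.values()
                     for cell in random.sample(group, 12)]
```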

For this lab, students were placed into groups of three and asked to construct an elevation surface using terrain in a sandbox approximately one square meter in size. The terrain needed to include a ridge, hill, depression, valley, and plain. Students were given tape, string, thumb tacks, and a few meter sticks in order to construct a grid system and survey the terrain, to be digitized in ArcMap during the following week's lab.


Methods


In beginning the project, the group determined that the best method for sampling the sandbox's terrain was systematic sampling. This method uses the intersections of a standardized grid with uniform intervals as the data collection points across the sandbox terrain. The group felt this was the best choice for good overall coverage of the sandbox terrain, coverage that would clearly outline the terrain features required in its construction (a ridge, hill, depression, valley, and plain).

Figure 1: Group Surveying and Recording Data
Of the two sandboxes located across Roosevelt Street from Phillips Hall, the group chose to construct its terrain in the furthermost sandbox. Once each terrain feature had been constructed, the group focused on building the grid, pinning string to the sandbox's wooden frame at equal intervals measured by a meter stick. The intervals were predetermined by the sandbox frame's size: the approximately one square meter sandbox could be split into a 20x20 grid with each cell approximately 2x2 inches in size, allowing a total of 400 data points to be collected, a sizable amount that could still be gathered in reasonable time. One person in the group was responsible for recording the data while the other two alternated between rows, measuring the elevation of the sand at the southwest corner of each grid line intersection.



Figure 2: Taking Measurements from Terrain Grid


In total, the operation took just over an hour and a half to complete. The string line was designated as the surface (or sea level) of the terrain model, so most of the data collected was negative in value. Due to the cold weather, the group decided to first transcribe these data points in a notebook, to be transferred into an Excel file later on. After transferring the data to Excel, a color scheme was added to the values in the table so that the group could get a glimpse of what the terrain would look like once digitized in ArcMap. The group then also charted the data in a format that would be useful for transferring the points into ArcMap during the following week's lab. Fragments of the resulting Excel data tables can be found below:

 
Chart 1: Excel Data and Normalization

Results/Discussion


Overall, the group managed to collect a total of 400 data points within the 20x20 grid. The minimum value in the collection was -8 inches, while the maximum value was +4 inches. Since the string line was established as sea level, most of the data points fell below the line at negative values, with the most commonly occurring value being -2 inches. Given the cold temperatures and tedious measurement requirements of the lab, this sampling method proved useful and seemingly effective. 

Some issues did arise during the lab. To begin, the freezing temperatures had frozen much of the sand within the sandbox, making digging the terrain for surveying difficult, as only limited areas were soft enough to be molded. Secondly, as the group went on collecting data points, the string began to slack in certain areas. It may have been more beneficial to double-tack alternating string lines to improve their security. A third improvement could have been greater precision in the measurements; looking at the final data set, it seems the group did a lot of rounding to quarter-inch marks. A better method would have been to conduct the measurements in centimeters to promote more accurate readings across the survey sample.

Conclusion:

Sampling is an effective tool to utilize in spatial settings, as are commonly found in the field of geography, because it allows larger areas to be analyzed at smaller, sampled scales, conserving both time and resources in the process. This activity relates well to the system used by the Public Land Survey System, which also maps out land plots into squares, but at a much larger scale. Overall, the survey system utilized was a decent one given the constraints of the weather, but of course, more data is always better. Perhaps creating more rows and columns within the grid system would have benefited the group, as it would have resulted in the collection of more data points. Also, as noted previously, the group would have done better conducting the measurements in centimeters rather than inches for better accuracy. 



Sources


http://www.rgs.org/OurWork/Schools/Fieldwork+and+local+learning/Fieldwork+techniques/Sampling+techniques.htm