Monday, December 17, 2018

Lab 8


Introduction
The purpose of this lab is to analyze corridor LiDAR point clouds and extract features. The lab also introduces reprojecting LAS files, practical skills in using terrestrial laser scanning (TLS) data for corridor mapping and asset assessment, and extracting building footprints for LOMA (Letter of Map Amendment) analysis.

Methods
Part 1: Projecting an Unprojected Point Cloud
The first section of this lab was to define the projection for a LAS dataset covering Algoma, Wisconsin. This was completed by creating a new toolbox in ArcMap that contained the Define LAS File Projection, Reproject LAS Files, Scale LAS Files, and Shift LAS Files tools. Using the Define LAS File Projection tool, the XY coordinate system of the Algoma LAS tiles was defined as NAD_1983_HARN_WISCRS_Kewaunee_County_Feet (WISCRS) and the Z coordinate system was defined as NAVD 88 (US Feet). Once the projection was defined, the LAS files were reprojected into the same coordinate systems using the Reproject LAS Files tool (Figure 1).
Figure 1. Projected LAS file in LP360
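Outside LP360, the define step can be approximated with arcpy. The sketch below is only an illustration, assuming the tiles are referenced through a LAS dataset; the path and the WKID standing in for NAD 1983 HARN WISCRS Kewaunee County (US Feet) are placeholders that would need to be verified.

# Hedged sketch: define the horizontal CRS on a LAS dataset referencing the Algoma tiles.
# The path and WKID are placeholders -- look up the exact WISCRS Kewaunee County (US Feet) code.
import arcpy

algoma_lasd = r"C:\Lab8\Algoma.lasd"              # hypothetical LAS dataset path
wiscrs_kewaunee = arcpy.SpatialReference(103419)  # placeholder WKID; verify before use

arcpy.management.DefineProjection(algoma_lasd, wiscrs_kewaunee)
print(arcpy.Describe(algoma_lasd).spatialReference.name)  # confirm the definition took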
Part 2: Transport and Transmission Corridor Asset Management
Once the LAS files were projected into the correct coordinate system, they were brought into LP360 for a corridor asset inventory. To aid the inventory assessment, the boundary of the study area was brought into Google Earth as a KMZ file so that features could be cross-checked against high-resolution imagery. Within the dataset, features such as street poles and street signs were counted, and power lines were checked for potential encroachment from the vegetation of surrounding trees (Figure 2).

Figure 2. 3D viewer used to determine if there was encroachment from vegetation on power lines
Part 3: Building Feature Extraction  
Once the dataset was checked for features, building footprints were extracted in LP360 from the Lab 4 dataset for Lake County, Illinois, using a Point Group Tracing and Squaring task. This task created two shapefiles: a building footprint and a building footprint square. The building footprint shapefile traced the outline of the classified buildings, whereas the building footprint square shapefile created squared edges for the classified buildings. The footprint square shapefile was first conflated with the Summarize Z method, using building points as the source points, to determine building heights. It was then conflated with the Pure Drape method, using ground points as the source points, to create the ground-level building footprint. Finally, to determine the minimum elevation each building would need in order to sit above the floodplain and be a candidate for FEMA's LOMA, the footprint square shapefile was conflated with Summarize Z using Minimum Z, with ground points as the source points. Once the building footprints were conflated, the shapefiles were brought into ArcMap (Figure 3). Buildings between 800 and 810 feet above sea level were then selected in ArcMap to determine which buildings could be removed from the floodplain map if FEMA were hypothetically to change its policy.
Figure 3. Building footprint shapefile created in LP360 and displayed in ArcMap
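The final elevation selection can be approximated with a simple attribute query in arcpy. This is a sketch only; the shapefile path and the MIN_Z field name are assumptions about what the Summarize Z (Minimum Z) conflation wrote to the footprint attributes.

# Hypothetical sketch: select footprints whose minimum ground elevation is 800-810 ft.
import arcpy

footprints = r"C:\Lab8\building_footprint_square.shp"   # assumed output name
arcpy.management.MakeFeatureLayer(footprints, "footprints_lyr")
arcpy.management.SelectLayerByAttribute(
    "footprints_lyr", "NEW_SELECTION", '"MIN_Z" >= 800 AND "MIN_Z" <= 810')
print(arcpy.management.GetCount("footprints_lyr"))       # count of candidate LOMA buildings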

Results
Map created in ArcMap displaying the buildings that could be removed from the floodplain map.
Sources
Data for this lab was provided by Dr. Cyril Wilson of the University of Wisconsin-Eau Claire. 

Lab 7


Introduction
The purpose of this lab was to utilize land cover derived from optical imagery to pinpoint the spatial distribution of tree species, use empirical models to extract various vegetation metrics for tree species in a given study area, and create a recommendation on the carbon sink potential of a forest.

Methods
Part 1: Canopy Height Model
The first section of this lab was to generate a canopy surface model and a ground model from a LAS dataset for Eau Claire County. The canopy surface model was created using vegetation points filtered to first returns only, with a 3-foot cell size. A DTM was then created for the ground surface. Once these models were created in LP360, they were brought into ArcMap. Using ModelBuilder, a model was created that subtracted the ground model from the canopy surface model, producing a canopy height raster named EC_veg_CH. This raster was then converted to a 32-bit signed integer raster named EC_veg_CH_c so that an attribute table could be built for it.
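The raster math inside that model is straightforward to sketch with arcpy's Spatial Analyst. This is a minimal illustration, assuming the two surfaces were exported from LP360 as rasters; the file names are placeholders, and the lab itself performed these steps in ModelBuilder.

# Minimal sketch of the ModelBuilder logic, assuming a Spatial Analyst license.
import arcpy
from arcpy.sa import Raster, Int

arcpy.CheckOutExtension("Spatial")

canopy = Raster(r"C:\Lab7\EC_canopy_first_returns.tif")  # 3 ft canopy surface model (assumed name)
ground = Raster(r"C:\Lab7\EC_ground_dtm.tif")            # ground DTM (assumed name)

ec_veg_ch = canopy - ground                  # canopy height above the ground surface
ec_veg_ch.save(r"C:\Lab7\EC_veg_CH.tif")

ec_veg_ch_c = Int(ec_veg_ch)                 # integer copy so an attribute table can be built
ec_veg_ch_c.save(r"C:\Lab7\EC_veg_CH_c.tif")
arcpy.management.BuildRasterAttributeTable(ec_veg_ch_c, "Overwrite")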

Part 2: Above Ground Biomass Estimation
In the next section of the lab, above-ground biomass (AGB) was estimated for five different tree species. To complete this task, the EC_veg_CH_c raster and the wiscland2_L3 land cover layer were brought into ArcMap. The wiscland2_L3 layer was reclassified into five classes (hardwood, red maple, pine, oak, and aspen). Then, using this newly reclassified image along with the chart and equation below (Figure 1), a new model was created to calculate AGB for each of the aforementioned species (Figure 2).

Figure 1. Values and equation used to create model in ArcMap
Figure 2. Model used in ArcMap to calculate AGB
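Because the gain and offset values live in the Figure 1 chart (not reproduced here), only the general shape of the model can be sketched. The snippet below assumes a simple per-species linear form (gain × height + offset) and made-up species codes and coefficients; it is not the lab's exact equation.

# Hedged sketch of a per-species AGB calculation; all coefficients and codes are placeholders.
import arcpy
from arcpy.sa import Raster, Con

arcpy.CheckOutExtension("Spatial")

height  = Raster(r"C:\Lab7\EC_veg_CH_c.tif")             # canopy height raster
species = Raster(r"C:\Lab7\wiscland2_L3_reclass.tif")    # reclassified species raster (assumed name)

# assumed codes: 1=hardwood, 2=red maple, 3=pine, 4=oak, 5=aspen; (gain, offset) are placeholders
coeffs = {1: (1.0, 0.0), 2: (1.0, 0.0), 3: (1.0, 0.0), 4: (1.0, 0.0), 5: (1.0, 0.0)}

agb = None
for code, (a, b) in coeffs.items():
    piece = Con(species == code, a * height + b, 0)   # apply each species' equation only where it occurs
    agb = piece if agb is None else agb + piece

agb.save(r"C:\Lab7\EC_AGB.tif")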

Part 3: Calculation of Additional Metrics
For the final step of the lab, we were asked to calculate stem, branch, and foliage biomass for each of the tree species using the equation below. Inputs for the equations were found in a set of articles provided by Dr. Wilson.
where a is the gain, b is the offset, and H is the derived tree height.
The values for each of the variables were pulled from the cited articles.
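Since the component equation is only shown in the hand-out, the sketch below assumes the same linear gain/offset form (a × H + b) applied per component; the coefficient values are placeholders to be replaced with those from the cited articles.

# Hedged sketch: per-component biomass from the canopy height raster; coefficients are placeholders.
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")
height = Raster(r"C:\Lab7\EC_veg_CH_c.tif")

components = {"stem": (1.0, 0.0), "branch": (1.0, 0.0), "foliage": (1.0, 0.0)}  # (gain, offset)
for name, (a, b) in components.items():
    (a * height + b).save(r"C:\Lab7\EC_{}_biomass.tif".format(name))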
Results
Part 1: Canopy Height Model

Part 2: Above Ground Biomass Estimation

Part 3: Calculation of Additional Metrics

Sources
Data for this lab was provided by Dr. Cyril Wilson of the University of Wisconsin-Eau Claire. 

Friday, December 14, 2018

Lab 6

Introduction
The purpose of this lab is to perform QA/QC on a topobathymetric lidar point cloud, create and conflate a breakline for a shoreline, and configure the enforcement of that breakline for topobathy deliverables.

Methods
The first process in this lab was to correct a topobathy lidar point cloud dataset in Delta County, Michigan. This dataset had many classification errors, both errors of commission and errors of omission. These areas were highlighted in LP360 by digitizing polygons around the dataset errors and labeling them as either omission or commission using the create feature class tool. For example, many areas in Lake Michigan had been wrongly classified as ground points due to the reflection of LiDAR pulses off suspended sediments. There were also many inland water bodies that had been classified as ground points. Omission errors included large areas along the coastline that had been left unclassified.

Once these areas were identified, the errors on the coastline were corrected using manual classification. This was done to ensure that the breaklines would have enough ground points to conflate the polyline feature accurately. After this process was completed, a polyline feature class was created for the shoreline. A filter was then applied to the data to display only ground and class 31 (reserved) points, and the LAS dataset was displayed as a TIN. Using the conflation manager, a new task was created using the drape method for the breaklines, with vertices created every 5 map units. The polyline was drawn tracing the shoreline over the ground points. Following the creation of the polyline, a polygon feature class was created using the same parameters; instead of tracing only the shoreline, the polygon feature traced the rest of the dataset along with the shoreline.
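Outside LP360, an equivalent display filter can be sketched with a LAS dataset layer restricted to the two classes. The path is a placeholder; class 2 is the standard ASPRS ground code, and class 31 follows the lab's "reserved" class.

# Hypothetical sketch of the ground / class 31 display filter.
import arcpy

arcpy.management.MakeLasDatasetLayer(
    r"C:\Lab6\DeltaCounty.lasd", "ground_31_lyr", class_code=[2, 31])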

Once the polyline and polygon were created, they were brought into LP360, where they were conflated over the LAS tiles (Figure 1).
Figure 1. Breaklines created in ArcMap used to create DTM in LP360
Once the LAS files and newly created breaklines were brought into LP360, the surface was exported as a DTM. The source points for the DTM were ground, road, and class 31 (reserved) points. The surface method was a TIN, and breaklines were enforced using the newly created polyline and polygon. Using these parameters, a new DTM was created for the Delta County dataset.
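A rough ArcMap-side analogue of that export is sketched below. It only shows the class-filtered surface export; LP360's breakline enforcement step has no one-line arcpy equivalent here. Class 11 is assumed for the road points, and the paths are placeholders.

# Hedged sketch: class-filtered DTM export (no breakline enforcement).
import arcpy

arcpy.management.MakeLasDatasetLayer(
    r"C:\Lab6\DeltaCounty.lasd", "dtm_source_lyr", class_code=[2, 11, 31])
arcpy.conversion.LasDatasetToRaster("dtm_source_lyr", r"C:\Lab6\delta_dtm.tif", "ELEVATION")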

Results
DTM created in LP360 from the breaklines created in ArcMap 
Sources
The materials for this lab were provided by Dr. Cyril Wilson of the University of Wisconsin-Eau Claire.

Wednesday, December 5, 2018

Lab 5


Introduction
The purpose of this lab is to create breaklines, identify and fix topological errors in them, conflate breaklines, inspect breakline elevation data, and configure surface features based upon breaklines.

Methods
In the first section of the lab, I was given a shapefile that outlined all of the water bodies in the study area. Topological errors in this shapefile were found and corrected in ArcMap. This was done to ensure that there would not be errors in breakline enforcement between the breaklines and the LAS point cloud. Areas with topological errors would lead to undesirable results when hydro-flattening the water bodies in the study area.
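As a first pass before the full topology review, a quick geometry validity check can be scripted; this catches self-intersections and null geometries rather than the full set of topology rules inspected in ArcMap, and the paths are placeholders.

# Hedged sketch: quick geometry validity check on the water-body shapefile.
import arcpy

waterbodies = r"C:\Lab5\waterbodies.shp"
arcpy.management.CheckGeometry(waterbodies, r"C:\Lab5\geometry_errors.dbf")  # list problems found
arcpy.management.RepairGeometry(waterbodies)                                 # fix what it can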

Once the topology of the lines was corrected, the breaklines could be conflated. To begin, the LAS file was displayed as a TIN surface. Then a new conflation point cloud task was created using the topologically corrected shapefile from the first section of the lab. The ground and water points were set as the source points for the conflation. The conflation method used for ponds and lakes was Summarize Z. This was used to ensure that the Z values of the conflated breaklines would create a homogeneous elevation so that the water bodies could be hydro-flattened. For islands and rivers, the Drape conflation method was used. The conflation task was then run. Once the breaklines had been conflated, they were brought into ArcMap, where a new Z elevation field was created and the Z value was calculated from the geometry, giving the elevation values for the breaklines. The breaklines were then used with Breakline Enforcement to hydro-flatten the water bodies. Once this was done, smoothed contours were generated for the image at an interval of 5 map units. This was done to verify that the hydro-flattened water bodies were in fact flat and did not have elevation errors. A digital terrain model of the LAS tiles was then generated using the classified ground points.
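One common way to script the elevation-attribute step on the conflated breaklines is sketched below: add a Z field and populate it from the feature geometry. The field and file names are assumptions, and the first-vertex Z is used only as an illustration of reading Z values from the geometry.

# Hedged sketch: add and populate a Z field from the breakline geometry (ArcMap Python parser).
import arcpy

breaklines = r"C:\Lab5\conflated_breaklines.shp"   # assumed name
arcpy.management.AddField(breaklines, "Z_ELEV", "DOUBLE")
arcpy.management.CalculateField(
    breaklines, "Z_ELEV", "!Shape!.firstPoint.Z", "PYTHON_9.3")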

Once this was completed, we were given a LAS dataset for the city of Eau Claire. This LAS dataset was imported into ArcMap, where new polygon and polyline features were created to serve as conflated breaklines for the Chippewa River. In each of these features, fields were created for the water type, minimum Z, mean Z, and maximum Z values. Once the features were created, they were digitized over a TIN surface created from the LAS file. The polygon outlined the river, and the polyline was digitized down the center of the river so that a downstream constraint could be calculated. Once the breaklines were created, the River Flattening tool was used to hydro-flatten the Chippewa River.