Monday, September 15, 2014

Measuring Intercepted Solar Radiation

Overview     

     Many factors influence the amount of solar radiation that actually reaches the planet's surface, including the orientation of the Earth's surface, sky obstructions, and surrounding topographic features. Because so many factors can alter how much radiation is intercepted, P.M. Rich, W.A. Hetrick, S.C. Saving, and R.O. Dubayah developed an algorithm that allows for rapid calculation of the amount of intercepted radiation. Each of the factors that affect interception was given a formula to account for its role in the interception. 



Two different projections for intercepted solar radiation. 
Top: Hemispherical coordinate system.
Bottom: Equiangular coordinate system.




Process

     In order to measure radiation, the sky is projected onto a plane and then divided into sectors. Dividing the sky into sectors makes it easier to build a table of values that can be compared and analyzed. Each sector of the sky, depending on the amount of radiation it contributes, is assigned a calculated "irradiance" value. Irradiance is the amount of electromagnetic radiation received per unit area on a specific surface. Having an irradiance value for each sector allows the influential factors to be accounted for with cosine corrections. After accounting for the influencing factors, the values are put into a table for analysis. 
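
To make the sector-by-sector bookkeeping concrete, here is a minimal sketch of how intercepted radiation might be summed over sky sectors with a cosine correction for a tilted surface. This is not the authors' exact algorithm; the function name, its inputs, and the handling of obstructed sectors are assumptions for illustration.

```python
import numpy as np

def intercepted_radiation(sector_irradiance, sector_altitude_deg, sector_azimuth_deg,
                          surface_slope_deg, surface_aspect_deg, sky_obstructed):
    """Sum the radiation intercepted by a tilted surface over all sky sectors.

    sector_irradiance   : irradiance arriving from each sky sector (W/m^2)
    sector_altitude_deg : altitude angle of each sector's center above the horizon
    sector_azimuth_deg  : azimuth of each sector's center (degrees from north)
    surface_slope_deg   : slope of the receiving surface
    surface_aspect_deg  : aspect (facing direction) of the receiving surface
    sky_obstructed      : True where topography or vegetation blocks the sector
    """
    irr = np.asarray(sector_irradiance, dtype=float)
    alt = np.radians(sector_altitude_deg)
    az = np.radians(sector_azimuth_deg)
    slope = np.radians(surface_slope_deg)
    aspect = np.radians(surface_aspect_deg)

    # Standard cosine correction for a tilted plane: cosine of the angle between
    # each sector's direction and the surface normal.
    cos_incidence = (np.cos(slope) * np.sin(alt) +
                     np.sin(slope) * np.cos(alt) * np.cos(az - aspect))
    cos_incidence = np.clip(cos_incidence, 0.0, None)  # sectors behind the plane contribute nothing

    visible = ~np.asarray(sky_obstructed)               # obstructed sectors contribute nothing
    return float(np.sum(irr * cos_incidence * visible))
```

The visibility mask plays the role of the sky obstruction described above: any sector blocked by topography or other features is simply dropped from the sum.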




Examples of how the sky can be divided. 
Top: A more evenly divided sky provides calculations from all sky directions.
Bottom: The unevenly divided sky produces better calculations when dealing with one specific spot and one particular sky direction.

Evaluation & Conclusion


     The calculated values were compared to topographic maps in order to find out which obstructions were causing some areas to collect less radiation than others. The division of the sky played a key role in the results. For example, a more finely and equally divided sky provided better results when measuring radiation from all directions, while unequally divided sectors produced better results when determining the radiation collected at one particular spot from one particular direction.

This process is viable, flexible, and applicable to a wide array of uses. The algorithm could help decide which areas would make the best use of solar panels for better energy yields. Because the Sun provides such a large share of Earth's energy, and because different influential factors can always be given their own cosine corrections, this technique for measuring intercepted solar radiation is flexible enough to be a contender in future environmental energy problems and endeavors.


Rich, P.M., R. Dubayah, W.A. Hetrick, and S.C. Saving. 1994. Using viewshed models to calculate intercepted
solar radiation: applications in ecology. American Society for Photogrammetry and Remote Sensing Technical
Papers. pp 524–529. 

http://professorpaul.com/publications/rich_et_al_1994_asprs.pdf


Could Katrina have done more damage without Lidar?

A long time has gone by since the devastating Hurricane Katrina. Nevertheless, New Orleans still carries the scars of that dark day, August 29, 2005. What if we went back to August 2005 to learn a little more about what happened behind the scenes of that event?

At the time, a mapping tool, Lidar, was used to collect information about the topography of the inundation in New Orleans. Based on “a high level of spatial detail and vertical accuracy of elevation measurements, light detection and ranging remote sensing is an excellent mapping technology for use in low-relief hurricane-prone coastal areas”, according to Dean Gesch. This high-resolution, high-accuracy elevation data is thus more than useful when it comes to determining flooding risk in cities, especially coastal ones. It is also useful for studying responses to the impacts of storms; indeed, such elevation data proved essential to hurricane response and recovery activities because it can accurately map the different land-surface elevations within the city. Even though Lidar was still relatively new in the world of remote-sensing technology in 2005, the U.S. Geological Survey had already used it for its National Elevation Dataset. The New Orleans elevation data had been updated in June 2005, so it was already available for the response to Katrina.
Figure 1: Land-surface elevations of New Orleans, with red the highest and blue the lowest.

Immediately after the levee breaches, there was a demand for a map showing the extent and magnitude of the flood waters in the city. The National Elevation Dataset proved to be a great help for mapping the extent and depth of the inundation because no aerial imagery of the area was available at the time. Since the flood waters had equalized with the level of Lake Pontchartrain, the water depth could be calculated from a lake-level gage on the lake and the land elevations provided by Lidar. Because such a map could be produced quickly, officials could also project the length of time required to remove the water from the city.
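
The depth calculation itself is simple raster arithmetic: subtract the lidar-derived land elevation from the equalized water-surface elevation and keep only positive values. Here is a minimal sketch of that idea, assuming a hypothetical DEM file name and gage reading; it is not the USGS workflow itself.

```python
import numpy as np
import rasterio

LAKE_LEVEL_FT = 2.0  # hypothetical water-surface elevation from the Lake Pontchartrain gage

# "nola_lidar_dem.tif" is a hypothetical lidar-derived elevation grid of New Orleans.
with rasterio.open("nola_lidar_dem.tif") as src:
    dem = src.read(1).astype(float)
    profile = src.profile

# Flood depth is the water surface minus the land surface; dry ground gets zero.
depth = np.clip(LAKE_LEVEL_FT - dem, 0.0, None)

profile.update(dtype="float32")
with rasterio.open("nola_flood_depth.tif", "w", **profile) as dst:
    dst.write(depth.astype("float32"), 1)
```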


Figure 2: Map of the relative water depth for the New Orleans area.

If we compare the two figures, we can see that the most devastated part of the city could have been predicted, since the lowest land-surface elevations on the first map correspond to the deepest flood waters on the second.

What if we change?
Unfortunately, Dean Gesch, the author of the article, does not say whether this tool was used to prepare for environmental hazards such as hurricanes. In any case, Lidar probably helped New Orleans's mayor, Ray Nagin, evacuate the riskiest parts of the city. However, we could also question the use of the supposedly safe levees. Are they as safe as we think they are? Do they really prevent flooding? Apparently that is not even the right question when we see what happened in New Orleans. Of course levees provide an important source of safety, and category 5 hurricanes are not that common, but you have to be ready for the worst at any time. Mother Nature is not going to tell you months in advance what she is up to. We know now that the levees were not strong enough to resist this hazard, and when you look at a cross-section of New Orleans, you understand how it ended up like a giant swimming pool.
Figure 3: Map of the New Orleans area and its levee elevations.


To conclude, we could ask ourselves why people have to put themselves in such dangerous positions. What if we stopped trying to change nature and instead adjusted to her? That is probably one of the longest-running debates that governments never find enough time to talk about.

Reference: Gesch, D. (2005). Topography-based analysis of Hurricane Katrina inundation of New Orleans.
For Figure 3: http://en.wikipedia.org/wiki/New_Orleans#mediaviewer/File:New_Orleans_Levee_System.svg

Neogeography: A fusion of art and mapping

                Neogeography is a modern take on geography, combining the science of geography and GIS with digital art.  “The term ‘neogeography’ is taken to engulf traditional geography as well as all forms of personal, intuitive, absurd or artistic explorations and representations of geographical space, aided by new technologies associated with the Geospatial Web” (Papadimitriou 2013).  As this quote illustrates, neogeography is a fascinating hybrid of two seemingly incompatible fields, made possible by the powerful software of modern GIS. 
                                          Figure 1: Technologies that contribute to neogeography
                
                Geotagging and georeferencing are two ways that modern, intuitive software enables neogeography.  In the process, mappers are able to add personal flair such as “snapshots, texts, music, random sounds and noises and even video clips” (Papadimitriou 2013).  With neogeography, mapping becomes much more of an individual, grassroots phenomenon rather than an administrative, governmental one. 
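
As a concrete, purely hypothetical illustration of geotagging, the snippet below builds a small GeoJSON feature that ties a photo, a caption, and a sound clip to a point on a map. The coordinates and URLs are placeholders, not anything from the article.

```python
import json

# A hypothetical geotagged "snapshot": a GeoJSON Feature attaching personal media to a point.
geotagged_snapshot = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-76.65, 40.27]},  # lon, lat (hypothetical)
    "properties": {
        "title": "Main Street at dusk",
        "photo": "https://example.com/photos/main_street.jpg",   # placeholder URL
        "audio": "https://example.com/sounds/street_noise.mp3",  # placeholder URL
        "note": "A personal annotation attached to this spot.",
    },
}

# Write a one-feature layer that any web map can display.
with open("my_neogeography_layer.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": [geotagged_snapshot]}, f, indent=2)
```
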
                Another phenomenon that accompanies neogeography is the increasing accessibility and usability of GIS software.  Extensive knowledge of GIS software is no longer required to make a map.  “Some packages, such as those provided by Yahoo, require no prior GIS skills to produce interesting and aesthetically pleasant output in neogeography” (Papadimitriou 2013). 
                In conclusion, neogeography has helped change mapping into a form of art.  “Geographical education may well open a new chapter in response to these developments, possibly called ‘neogeographical education’, whose aim would be to foster educational activities worldwide in order to build the newly emerging geospatially enabled Web 3.0” (Papadimitriou 2013).

Citation:

Papadimitriou, Fivos. "A "Neographical Education"? The Geospatial Web, GIS and Digital Art in Adult Education." International Research in Geographical and Environmental Education 19.1 (2010): 71-74. Routledge. Web. 19 Feb. 2013.

The Times Are Changing and So's the Land!


We as humans have dramatically changed the earth’s surface in the last 150 years. Changes in global land use have led to more and larger urban centers, thousands of square miles of subsistence and commercial agriculture, the loss of millions of square miles of forest, and much more. These land-use changes can have, and have had, detrimental effects on the environment and its ability to supply and service our constantly growing population. Among these detrimental effects are changes to the atmospheric composition, disruption of ecosystems, biodiversity loss, and the degradation of soil and water.
Food production is one of the biggest environmental concerns because of the large amount of land it uses (croplands and pastures cover about 40% of the earth’s surface) and the resource-intensive nature of modern agriculture. Agriculture has changed and grown substantially in the last 40 years. Cropland area has grown by about 12% while fertilizer use has increased by 700%. This can be at least partially attributed to the “Green Revolution,” which promoted the use of fertilizers and machinery to increase crop yield.  While these changes have led to increased crop yields (global grain production has roughly doubled in recent decades), they have also had devastating effects on the environment.
Land use also greatly affects the hydrologic cycle. The overuse of fertilizers leads to damaging runoff that degrades water quality both locally and downstream and causes algal blooms and “dead zones”: when the fertilizer reaches the ocean it causes a boom in algae growth, the algae use up the oxygen in the water, nothing else can survive there, and thousands of fish wash up dead on the shore during these blooms. Agriculture accounts for 85% of global water consumption, and much of that water is being pumped unsustainably from underground sources. Some of these sources contain salt, which ends up in the soil when the crops are irrigated. This salinizes (think salt) the soil and makes it impossible to grow anything there. Deforestation, increased impervious surfaces like roads and parking lots, and urbanization also degrade water quality and disrupt the hydrologic cycle.

In the last 300 years humans have cleared around 7-11 million square kilometers of forest for agriculture or timber harvesting. While reforestation projects are helping build back the forests, the biodiversity and some of the ecological services lost in the original clearing will likely never fully recover. These changes to vegetation mass, land use, and the hydrologic cycle have impacts on our atmosphere and climate as well. With increased human development have come increased emissions. The earth has natural systems that regulate the composition of gases in the atmosphere, one of which is forests, which act as natural carbon sinks. However, these systems are not capable of handling the added weight of our emissions, and we have already cut down 7-11 million km2 of our carbon-absorbing forests. The earth is warming, and it is at least mostly, if not entirely, because of how humans have used and reshaped the land.
Modern land-use practices are sacrificing the long-term health of the environment for short-term rewards. Human actions need to take a sharp turn toward sustainability if we want to continue to enjoy the ecological services that our environment provides. Sustainable land use would not only preserve ecological services for future generations, but would also seek to increase the resilience of those services. For example, a cropland plan designed to deliver environmental, social, and economic benefits would seek to increase the yield per unit of fertilizer, land, and water input, thus reducing the environmental impact. Increasing green spaces in urban places can reduce runoff and the “heat-island” effect while providing parks to play in and gardens to harvest. The ultimate goal is to find a way to coexist with natural ecosystems and leave them as unchanged as possible. Life was around for billions of years before humans, and it had a pretty good thing going before we threw a wrench in the works. Working with the natural systems the environment already has in place (using ladybugs to control aphids instead of insecticide, for example) usually gives the best results.
What about GIS?
            GIS was probably used to gather most of these statistics and learn the full extent of these land-use changes. GIS can also be used to predict their progression and to map potential threats or areas of concern. GIS is a powerful tool for anyone interested in how the surface of the earth has changed, is changing, and will likely change in the future.

Works Cited
Foley, J. A., DeFries, R., Asner, G. P., Barford, C., Bonan, G., Carpenter, S. R., ... & Snyder, P. K. (2005). Global consequences of land use. Science, 309(5734), 570-574.

  
The Benefits of Improved National Elevation Data
Elevation data is useful for many reasons, but most current US elevation data is at least 30 years old. The National Enhanced Elevation Assessment (NEEA) was done in 2011 to assess the need for new elevation data to be collected.

Lidar (light detection and ranging) can be used to survey elevations. Improved elevation data can reduce the time it takes to update maps, detect fault ruptures to help avoid catastrophes, make surveying and aviation safer, improve the precision of farming, reduce the time needed for flood-risk analysis, collect forest information to assess environmental concerns, detect variation in farm fields so farmers apply the right amounts of chemicals and waste less, locate efficient wind-farm sites, survey land for oil and gas companies, and help locate the best routes for roads, which can save gas and make driving safer.
After reviewing the benefits and costs of collecting elevation data at different accuracy levels and data-collection cycles, the assessment determined that every collection scenario except quality level one collected annually would result in savings. Different areas require different levels of accuracy and different collection cycles for optimal savings.
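
A back-of-the-envelope sketch of that kind of screening is below. The dollar figures are invented purely for illustration (they are not the NEEA's estimates); the point is simply that a scenario yields net savings when its annualized benefits exceed its annualized costs.

```python
# Hypothetical illustration of a benefit-cost screening across quality levels and cycles.
# All dollar figures below are made up for the example; they are NOT the NEEA's numbers.
scenarios = [
    # (quality level, collection cycle in years, annual benefit $M, annual cost $M)
    ("QL1", 1, 690, 1200),
    ("QL2", 8, 690, 150),
    ("QL3", 8, 480, 110),
]

for level, cycle_years, benefit, cost in scenarios:
    net = benefit - cost
    verdict = "net savings" if net > 0 else "net loss"
    print(f"{level}, {cycle_years}-year cycle: {net:+d} $M/year ({verdict})")
```
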
Availability of elevation data and technology available for collecting this data are changing. Improvements in laser and satellite technologies are increasing accuracy and density of Lidar surveys.
The NEEA found that upgrading the nation's elevation data would benefit business and government operations at all levels, that collecting data over larger areas leads to greater savings, and that there are no drawbacks to implementing a national program. The USGS developed the 3DEP initiative in response to these findings. Elevation data will be collected for the US on an eight-year cycle, and IFSAR (interferometric synthetic aperture radar) will be used to collect data for Alaska. In addition to the benefits already stated, 3DEP will create new jobs and transform the geospatial community.
Citation:
Snyder, Gregory I. (2013). The Benefits of Improved National Elevation Data. Photogrammetric Engineering and Remote Sensing.

Sunday, September 14, 2014

GIS Tool, LIDAR, Helps Responders See Flood Levels in New Orleans after Hurricane Katrina

According to Dean Gesch, a government research physical scientist, a new remote-sensing technology called LIDAR, an acronym for light detection and ranging, proved its worth in 2005 when Hurricane Katrina struck New Orleans. There was a lack of aerial photo data available for judging flood levels, so LIDAR, which uses light to measure elevation, was called upon to provide elevation information. This would later come into play when responders had to figure out the levels of flooding in different areas of town. The elevation data gathered with LIDAR was paired with measurements taken from a lake-level flood gauge. Once the water had settled, the LIDAR data and the gauge readings were combined to give an indication of what the flood volume looked like in different areas of New Orleans.
As you can imagine, this is very important information for responders, who must distribute resources appropriately and know which areas are in urgent need of help. This real-world scenario shows how GIS tools such as LIDAR can be more important in our lives than just being some “complicated technology” that nobody wants to pay attention to or take the time to learn. It also suggests that if more people were working in this industry and keeping data about human environments up to date, we could be better prepared to deal with natural disasters, because we would have more accurate representations of our environment and therefore a better understanding of how to protect ourselves.


Works Cited

Gesch, D. (2005). Topography-based analysis of Hurricane Katrina inundation of New Orleans. In Science and the Storms: The USGS Response to the Hurricanes of 2005.

Monday, September 8, 2014

Finding Success in a Soft Economy

For retail businesses to be successful, it is important for them to understand location information.  Businesses need to understand their market's needs, and they need to understand the dynamics of different locations in order to succeed.

ArcGIS Business Analyst allows retailers to analyze the market of specific locations. Markets are changing and retailers are finding it harder to succeed. Business Analyst lets a business see the patterns behind successful businesses and copy them.
Markets are changing due to the declining economy. Young adults, however, who aren’t affected by the real estate, retirement, and investment markets, continue to spend at the same level. ArcGIS Business Analyst can analyze sales records and the customer base, giving retailers information about where to locate businesses and what types of merchandise to sell.
United Properties, which owns shopping centers in the Midwest, decided it needed to use GIS data to give its tenants information about the market so they could succeed. It chose Esri ArcGIS Server, ArcGIS Mapping for SharePoint, and the Business Analyst Online API. These tools create interactive maps that report demographic data, helping tenants find the right location. Users can create reports comparing the retail value of different locations. 

 Nike licensed GIS software to understand where the market for its shoes was. It also uses the software to decide where shoes from its Reuse-A-Shoe program should be distributed. GIS maps save time because a single map can be made and then applied to many different retailers.
When the tourist town of Hershey, Pennsylvania began experiencing a downturn, a GIS consulting firm was called in. The firm found that highway systems were directing tourists away from downtown Hershey and decided the downtown area needed to be revitalized. It used the Huff gravity model in Business Analyst to figure out whether people would be willing to drive farther to get downtown, and based on the Huff model it created a design to fit the market.
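
For readers unfamiliar with it, the Huff gravity model estimates the probability that a consumer patronizes a given destination from the destination's attractiveness and its distance away. The sketch below is a generic, minimal implementation, not Esri's; the exponents and the example numbers are assumptions for illustration.

```python
import numpy as np

def huff_probabilities(attractiveness, distances, alpha=1.0, beta=2.0):
    """Huff gravity model for a single consumer origin.

    The probability of patronizing destination j is
        P_j = (A_j**alpha / d_j**beta) / sum_k (A_k**alpha / d_k**beta)

    attractiveness : size or appeal of each destination (e.g., retail floor space)
    distances      : travel distance (or time) from the consumer to each destination
    alpha, beta    : attractiveness and distance-decay exponents (assumed values)
    """
    a = np.asarray(attractiveness, dtype=float)
    d = np.asarray(distances, dtype=float)
    utility = a**alpha / d**beta
    return utility / utility.sum()

# Hypothetical example: a large but distant downtown versus two closer outlying centers.
probs = huff_probabilities(attractiveness=[50_000, 20_000, 15_000],
                           distances=[8.0, 3.0, 2.5])
print(probs.round(2))  # share of trips captured by each center
```
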
Esri Business Analyst Online helped real estate owners figure out what type of restaurant would be most successful in the space left by a recently closed barbecue restaurant. The local market was mapped out in Business Analyst, and the area fit the demographic profile that Old Spaghetti Factory restaurants typically serve.
The success of the shopping centers owned by Evans and Avant can be attributed to the market research they have done. They use Esri Business Analyst software so that their clients can choose the best locations for their businesses. Business Analyst characterizes neighborhoods so retailers understand their potential markets.
Citation:
Esri. (2012). Improving Retail Performance with Location Analytics.

Mexican Americans at increased risk for obesity and diabetes!

In the study Socioeconomic Status and Prevalence of Obesity and Diabetes in a Mexican American Community, Cameron County, Texas, 2004-2007, led by Susan P. Fisher-Hoch and her colleagues, we learn about some characteristics of the Mexican American community of Cameron County. When it comes to obesity and diabetes, this community on the US-Mexico border appears to differ from the national population. In fact, the research found that these Mexican Americans are more likely to be obese or to develop diabetes than others.
The study was based on a cohort on the US-Mexico border in the city of Brownsville, Texas. Susan P. Fisher-Hoch and her research group wanted to discover whether even minor socioeconomic advantages would affect the risk of obesity and diabetes for the Mexican American population of Brownsville. On the basis of 2000 census data, they divided the Mexican American population of Brownsville into four strata differing by annual household income, then began inviting all households from the selected census blocks to participate in the study. The selected census blocks came only from the first stratum, the “lower income” ($17,830 or less), and the third stratum, the “higher income” ($24,067 to $31,747). Finally, they randomly picked one person from each household to participate in the study.
Following the selection process, the participants were asked to take a battery of tests such as blood analysis, blood pressure, blood glucose level, insulin level, height and weight measurements, body mass index (BMI) and waist circumference.
Once they had all the data they needed, they ranked the participants by household income and selected the top and bottom quartiles in order to obtain a wider difference in household income than the full census data alone provided. The comparison was therefore made between the top 202 and the bottom 202 participants by household income.
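
A minimal sketch of that quartile selection is below, using pandas on a made-up participant table; the column names and income values are hypothetical, not the study's data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical participant table; column names and incomes are invented for this sketch.
participants = pd.DataFrame({
    "participant_id": np.arange(1, 809),
    "household_income": rng.uniform(10_000, 60_000, size=808).round(),
})

# Rank by household income and keep the top and bottom quartiles, mirroring the
# study's comparison between its highest- and lowest-income participants.
q1 = participants["household_income"].quantile(0.25)
q3 = participants["household_income"].quantile(0.75)
low_group = participants[participants["household_income"] <= q1]
high_group = participants[participants["household_income"] >= q3]
print(len(low_group), len(high_group))  # roughly 202 each, as in the study
```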


What about GIS?
It gets interesting when they visualized the spatial distribution of households by income with a geographic information system. They collected the longitude and latitude of each household with a Global Positioning System and geocoded the points in ArcMap 8.3.
The result is telling: the map shows two well-defined clusters. The “lower income” households tend to be close to the border, while the “higher income” households lie farther from the border and are more spread out. We can easily imagine land cost being the cause of this spatial distribution.
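
To picture what such a map looks like, here is a small matplotlib sketch that plots household points colored by income stratum. The coordinates are randomly generated around Brownsville purely for illustration; they are not the study's data.

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical household coordinates for the two income strata (not the study's data).
low_lon, low_lat = rng.normal(-97.49, 0.01, 200), rng.normal(25.89, 0.01, 200)
high_lon, high_lat = rng.normal(-97.46, 0.02, 200), rng.normal(25.93, 0.02, 200)

plt.scatter(low_lon, low_lat, s=8, label="lower income")
plt.scatter(high_lon, high_lat, s=8, label="higher income")
plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.legend()
plt.title("Households by income stratum (hypothetical)")
plt.show()
```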



Obesity at its best
The results of the research provide important information about Mexican Americans' health. First, and surprisingly, they found no significant difference in the prevalence of obesity and diabetes between the two socioeconomic strata.  On the other hand, the numbers do indicate a serious health issue that must be addressed.
More than 50% of the participants are obese and 8% of them are morbidly obese.  That is 1.4 times higher than numbers reported nationally for Mexican Americans.
A quarter of the participants have diabetes, and nearly one in ten participants in the “lower income” stratum learned through this study that they have diabetes. In fact, 78% of the participants did not have health insurance, most likely because their incomes were too low to afford it.
We could ask: where is the equitable health care system? Should health care be accessible to everyone? Especially when you know that "In 2006, more than 20 million Americans were estimated to have type 2 diabetes and by 2050, the number of US patients with diagnosed diabetes is projected to rise to 39 million." We need to recognize the seriousness of this health issue affecting the Mexican American community, especially in Brownsville, Texas.



Reference: Fisher-Hoch SP, Rentfro AR, Salinas JJ, Pérez A, Brown HS, Reininger BM, et al. Socioeconomic status and prevalence of obesity and diabetes in a Mexican American community, Cameron County, Texas, 2004-2007. Prev Chronic Dis 2010;7(3). http://www.cdc.gov/pcd/issues/2010/may/09_0170.htm. Accessed 9/7/2014.