Sunday, October 5, 2014

White Rock Lake and Micro Urban Heat Islands




Cathy Aniello, Ken Morgan, Arthur Busbey, and Leo Newland used Landsat TM imagery and GIS to map micro-urban heat islands in Dallas, TX. Specifically, the researchers looked at the White Rock Lake area, which has diverse land cover including impervious surfaces, bare soil, grass, trees, and apartment buildings. Micro-urban heat islands are different from heat islands: heat islands are areas generalized as having higher temperatures than the surrounding rural areas, while micro-urban heat islands (MUHIs) are hot spots within the city's urban heat island. The researchers believed that increased tree cover would offset the effects of these MUHIs.

Looking at satellite temperature readings from Landsat TM, they found that areas with trees were not only cooler but also had a radiative cooling effect that extended well beyond the tree canopy, while the MUHIs had a corresponding radiative heating effect. Interestingly, older apartment and housing areas were significantly cooler than newer ones because of their greater tree cover. The hottest areas around White Rock Lake were land uses associated with impervious cover, such as a warehouse district, asphalt parking lots and roads, and the new apartment complexes on the west side of the lake; large expanses of bare soil and grass around the lake were also hot spots. The coolest areas were those with the most tree cover, such as the heavily forested area to the north of the lake and the older apartments and residential areas around White Rock Lake. On average, the MUHIs were 5 to 11 degrees Celsius warmer than their surroundings.

This data reinforces the idea that increased tree cover cools the surrounding area and could be used to combat the heat island effect. Increasing tree cover in urban areas would not only help reduce temperatures but would also help sequester more carbon emissions and other pollutants (which are abundant in urban settings), help prevent runoff and soil erosion, and create visually pleasing green spaces.




Aniello, C., Morgan, K., Busbey, A., & Newland, L. (1995). Mapping micro-urban heat islands using Landsat TM and a GIS. Computers & Geosciences, 21(8), 965–969.

Measuring Insolation and Soil Temperature in the Rocky Mountains

Insolation, or incoming solar radiation, is essential for life on Earth and is integral to physical, chemical, and biological processes in our world. Insolation directly affects water and energy balances and therefore indirectly affects evapotranspiration, photosynthesis, wind conditions, snowmelt, and air and soil temperature. In this study the main focus was soil temperature. Pinde Fu and Paul M. Rich used digital elevation models (DEMs) together with insolation models, accounting for a variety of variables including elevation, atmospheric conditions, and varied topography, to model insolation for an area near the Rocky Mountain Biological Laboratory in Colorado.
Digital Elevation Model for the study area
Most interpolation methods to date are designed for broad scales such as a country or continent; finer methods for smaller areas are less common. Variables such as elevation, surface orientation (slope and aspect), and vegetation cover create a gradient of insolation that changes with the topography. Most methods of interpolating insolation require tremendous data input and computation, which in turn require expensive and sophisticated software; other methods tend to be inaccurate and don't account for all the aforementioned variables.

The goal of this study was to create high-resolution temperature maps for the study area using a few measurements combined with high-resolution insolation models. The authors used Solar Analyst to derive average solar conditions and insolation for the study area, then combined physical soil temperature samples with their temperature model to calculate temperature gradients based on elevation, topography, and vegetation cover. The result was an accurate, high-resolution temperature map of the study area. The temperature and insolation data have applications in both agriculture and forestry: understanding the levels and distribution of insolation over different topographies could help determine the best areas to plant crops or which areas of forest are at risk for fires.
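To give a feel for the kind of terrain-driven calculation involved, here is a minimal sketch, not Solar Analyst itself, of estimating relative direct-beam insolation from a DEM for a single sun position. The cell size, sun angles, and the random stand-in DEM are assumptions for illustration; atmospheric effects and horizon shading are ignored.

```python
import numpy as np

def slope_aspect(dem, cellsize):
    """Slope and aspect (radians) from a DEM with square cells of `cellsize` meters."""
    dz_dy, dz_dx = np.gradient(dem, cellsize)   # elevation change per meter
    slope = np.arctan(np.hypot(dz_dx, dz_dy))   # steepness of each cell
    aspect = np.arctan2(-dz_dx, dz_dy)          # downslope direction (one common convention)
    return slope, aspect

def direct_beam_factor(dem, cellsize, sun_alt_deg, sun_az_deg):
    """Cosine of the sun-surface incidence angle per cell (1 = sun square-on)."""
    slope, aspect = slope_aspect(dem, cellsize)
    zen = np.radians(90.0 - sun_alt_deg)        # solar zenith angle
    az = np.radians(sun_az_deg)
    cos_i = (np.cos(zen) * np.cos(slope) +
             np.sin(zen) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(cos_i, 0.0, None)            # faces tilted away from the sun get 0

dem = np.random.rand(50, 50) * 100.0            # stand-in for a real 30 m DEM
rel_insolation = direct_beam_factor(dem, 30.0, sun_alt_deg=45.0, sun_az_deg=180.0)
```

Multiplying this factor by a clear-sky beam irradiance and summing over sun positions through the day is the basic idea behind topographic insolation maps.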

 Finished soil temperature map

Fu, P., & Rich, P. M. (2002). A geometric solar radiation model with applications in agriculture and forestry. Computers and Electronics in Agriculture, 37(1), 25–35.

Wednesday, October 1, 2014

The Île d'Yeu, a Coveted Space: Development and Planning

As with my last post, we jump back nearly 20 years to lift the veil on this study. This time it concerns a charming little island off the French Atlantic coast, the Île d'Yeu. Like much of the French Atlantic coast, this island has had a fishing-based economy for years. The changes the island has undergone over the past forty or so years have been very significant, which is why Patrick Pottier and Marc Robin found it interesting to map those changes using GIS.

They then had to take a large number of components into account to build a simple model of spatial organization, of how the territory is organized. The two main components are separated into an internal and an external sphere, where the internal sphere is the urban landscape, agriculture, and urban vegetation, while the external sphere represents the physical environment: topography, elevation, and anthropogenic control. This information was collected across the years in order to create one representative map for 1951 and one for 1990.
In the end, a simple polygon delineation is used to map the areas occupied by urban and agricultural land.

Evolution of urban space

Evolution of agriculture


These two maps are not especially complicated to produce. What is more complex is the problem they reveal: when we analyze them, we see the loss of agricultural land to the benefit of urban land. Year after year, urban space has consumed agricultural space, starting with the island's urban explosion in 1951 and continuing until 1995, by which point 30% of the territory was occupied. These statistics also help us understand the reasons for the changes on the Île d'Yeu: the island has taken advantage of its situation, favorable to tourism, to the point that 51% of its dwellings are second homes.

In conclusion, the geographic information system helps demonstrate that the island has benefited from a tourism economy rather than concentrating on its natural resources, which explains such a large urban expansion at the expense of agricultural land.

Tuesday, September 30, 2014

Fuzzy expert systems and GIS for cholera health risk

Cholera is listed as an internationally quarantinable disease by the World Health Organization, and it is one of the most researched communicable diseases, yet it is still wreaking havoc on countries in southern and eastern Africa. Outbreaks in 2000 were traced to the uMhlathuze River in the northern part of the KwaZulu-Natal Province. Risk factors for cholera outbreaks include a hot and humid climate and socio-economic factors. The Council for Scientific and Industrial Research (CSIR) has used GIS tools to assess likely locations for outbreaks. Their models rest on the assumption that environmental conditions like algal blooms trigger the growth of Vibrio, the bacteria that cause cholera. If there is Vibrio in the water, the spread of the disease then depends on human access to safe water. This risk potential model was designed to predict cholera outbreaks and hopefully prevent them in the future.

By researching the environments in which cholera outbreaks occur and assessing the risk of outbreaks, they hope to reduce the spread of cholera through well-planned resource allocation. The model below describes how a cholera outbreak can be caused by an algal bloom.

The cholera outbreak potential model takes into account average annual rainfall, mean maximum daily temperature on a monthly basis, and the 'month of first rains' per pixel (salts from the first rain run into the river, affecting its salinity). Results from the model show long-term cholera outbreak risk; they do not, however, show the location and timing of outbreaks. An expanded model will incorporate remote sensing data to supply inputs such as phytoplankton levels and the spread of algal blooms, while field data will still need to be collected for variables like temperature, daily rainfall, dissolved oxygen, salinity, oxidation-reduction potential, presence of bacteria, and pH. The model will take into account the weather data around the times of past cholera outbreaks so that predictions of future outbreaks can be made. Funding has been granted to the project to make the remote sensing component possible.
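To make the "fuzzy expert system" idea concrete, here is a minimal sketch, not the CSIR's actual rule base: each environmental input is mapped to a membership value between 0 and 1, and rules are combined with fuzzy AND (minimum). The variable names and thresholds below are illustrative assumptions only.

```python
def ramp(x, low, high):
    """Membership value rising linearly from 0 at `low` to 1 at `high`."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def outbreak_potential(mean_max_temp_c, annual_rainfall_mm, months_since_first_rains):
    hot = ramp(mean_max_temp_c, 25, 35)            # warm water favors Vibrio growth
    wet = ramp(annual_rainfall_mm, 400, 1000)      # rainfall drives runoff into rivers
    first_rains = ramp(3 - months_since_first_rains, 0, 3)  # salinity pulse after first rains
    # Rule: potential is high when it is hot AND wet AND shortly after the first rains.
    return min(hot, wet, first_rains)

# One pixel's inputs -> a 0-1 outbreak potential for the long-term risk map.
print(round(outbreak_potential(32, 850, 1), 2))    # 0.67: elevated potential
```

Evaluating a rule like this per pixel over climate rasters yields the kind of long-term risk surface described above, without predicting where or when an outbreak will occur.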
Fleming, G., van der Merwe, M., & McFerren, G. (2006). Fuzzy expert systems and GIS for cholera health risk prediction in southern Africa. ScienceDirect.

Monday, September 29, 2014

Big Data

     With so many new technological innovations, it has become increasingly common to gather data about users and examine it in a geographical context. "Big data" refers to the databases that belong to large corporations such as telephone companies and even media application developers (Foursquare, Twitter, etc.).

This map represents the movement of Twitter users. These patterns can be studied to learn more about how information spreads in a geographical context.

     Big data is a powerful tool for business analysts as it allows them to study their consumers and their locations, to an extent. This in turn helps companies gain a more comprehensive view of their targeted audience. Some of the data that is collected by these companies is considered to be VGI, or Volunteered Geographic Information. This information is called volunteered because the user agrees to allow the company to collect information about the consumer's use of the product. An example of this symbiotic relationship between business and consumer can be seen in products like MapShare and Google MapMaker.

The map above depicts tourist density. The information was gathered through a photo sharing website / application called Flickr.

    While gathering information about users and consumers by collecting VGI can be useful, many individuals do not approve of data collection by third parties, so the practice is also a source of mistrust and irritation among users. In addition to mistrust, VGI is not a substitute for a random population sample: little is known about a user beyond the fact that they are contributing information to a third-party database. Without knowledge of contributors' economic status, context, and motives, it is difficult to make generalizations about the population contributing the data.

     It has been determined that for this method of data collection to be effective, more emphasis must be placed on where the data comes from and on the contextual conditions of the data. It has also been recommended that the practice be viewed as communication between two participating parties, not just a sender-recipient partnership. In such a relationship, more in-depth data will likely be put forward, because a relationship between company and user built on trust and communication will, in theory, yield more insight about the consumer.


Fischer, F. (2012, April 1). VGI as Big Data. A New but Delicate Geographic Data-Source. Retrieved September 29, 2014. 

Volunteered Geographic Information: Pros and cons

                VGI (Volunteered Geographic Information) is an up-and-coming form of big data.  What is "big data," you ask? "In recent years databases in enterprises have grown bigger and bigger. Mobile phones tracking and logging their users' behavior, social media applications and an increasing number of interconnected sensors, create more and more data in increasingly shorter periods of time. This valuable data is called big data."  VGI can be incredibly useful in that it lets users create a great deal of shareable, valuable data.
Tourist density and flows calculated from Flickr database
               

                However, there are drawbacks to VGI data.  One such problem is that "VGI datasets rather reflect the characteristics of specific online communities of interest but do not necessarily fulfill the qualities of a random population sample."  VGI is not well distributed across socioeconomic status, physical location, or virtually any other variable, and this is its biggest problem. The future of VGI data must reconcile this lack of representativeness with its enormous potential.



Citation:
Fischer, Florian.  "A New but Delicate Geographic Data-Source: VGI as Big Data."  GEOInformatics, 2012.

Sunday, September 28, 2014

Research Using GIS Gives Insight to Extent of Local Food Flows in Philadelphia

Peleg Kremer and Tracey L. DeLiberty compiled statistical and geographic research, using geographic information systems (GIS) methods such as remote sensing, to look into how locally produced Philadelphia's food is. As they discuss, the industrialized and urbanized food system, with its long travel distances from producer to consumer and its use of pesticides to preserve food quality over those distances, contributes to negative health effects for both humans and the environment. This travel distance is often referred to as "food miles," and as a general rule, the more food miles produce must travel, the more negative effects it has on the human consumer and the environment.
Kremer and DeLiberty used GIS techniques to compile maps expressing the distance from producers, in this case farms, to consumers, whose location of consumption was represented by farmers' markets, the end of the travel route for the produce. According to their data, the average number of "food miles" on these produce travel routes is sixty-one. Here is a visual representation of these routes, compiled from Kremer and DeLiberty's research, that reveals the extent of food miles in Philadelphia's local food system:

Kremer and DeLiberty also compiled helpful information, using GIS techniques, on the capacity of residential land to produce food. This was done with infrared remote sensing that separated types of vegetation, making it possible to see the food-production potential of residential areas. They highlighted the fact that relying on residential land would introduce difficulties, owing to factors such as land quality and residents' willingness to participate; however, they estimated that if even five percent of that land were used, around 9.9 million pounds of food could be produced at the most local level, benefiting both consumers and the environment.
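Separating vegetation from other cover in infrared imagery is classically done with a vegetation index. Here is a hedged sketch using NDVI, the most common such index; the authors' exact classification procedure may differ, and the band values below are stand-ins, not study data.

```python
import numpy as np

# Reflectance in the near-infrared and red bands for a tiny 2x2 raster.
nir = np.array([[0.45, 0.50],
                [0.10, 0.30]])
red = np.array([[0.08, 0.10],
                [0.09, 0.25]])

# NDVI: healthy vegetation reflects strongly in NIR and absorbs red light.
ndvi = (nir - red) / (nir + red + 1e-9)   # small epsilon avoids divide-by-zero

vegetated = ndvi > 0.3                    # rule-of-thumb cutoff for vegetation
print(ndvi.round(2))                      # [[0.7  0.67] [0.05 0.09]]
print(vegetated)                          # top row: candidate growing space
```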
Kremer and DeLiberty’s research gives insight to the flow of food in Philadelphia, and can be very helpful to the food-localization movement. To see their full research article, go to: http://www.sciencedirect.com/science/article/pii/S0143622811000087
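As an aside, the sixty-one-mile average mentioned above is, at heart, a distance computation between farm and market. Below is a minimal sketch of one common approach, great-circle distance via the haversine formula; the coordinates are approximate and illustrative, not taken from the study, which measured actual travel routes.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two latitude/longitude points."""
    r = 3958.8                                  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# A Lancaster County farm to a Philadelphia farmers' market (rough coordinates).
print(round(haversine_miles(40.04, -76.31, 39.95, -75.17), 1))   # about 61 miles
```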
Works Cited
Kremer, P., & DeLiberty, T. L. (2011). Local food practices and growing potential: Mapping the case of Philadelphia. Applied Geography, 31(4), 1252–1261.

Tuesday, September 23, 2014

A Disaster Scenario for the City of Quito, Ecuador

In 1995 it was difficult, if not impossible, to predict an earthquake far enough in advance to avoid a catastrophe. It was, however, possible to determine the socio-economic consequences a tremor could have on a given region. That is what Jean-Luc Chatelain and his colleagues did for the region that particularly interested them: Quito, the capital of Ecuador. Subject to intense seismic activity, Quito was a perfect city for developing a seismic scenario, all the more so since the city's government authorities had to protect themselves against this kind of danger.

They then needed to collect enough data to carry out the project. Thanks to the specialized urban database managed by the SAVANE geographic information system, made available by the Direction de la Planification of the municipality of Quito and ORSTOM in 1991, Chatelain and his peers had a privileged working tool. The data managed by the GIS gave access to the information needed to build a seismic scenario: historical seismicity (the earthquakes that struck in the past), the distribution of wave intensities across the city, and the city's topography and geology. To this they added information on the city's buildings and infrastructure. These data, along with a matrix, a standard calculation evaluating the risk of destruction, were then imported into the geographic information system to produce a map showing the most at-risk zones of the city.


Figure 1: Distribution of the seismic intensities produced by the coastal earthquake (Quito), and distribution of construction types (Quito)

To arrive at this final map, the first step was to simulate the propagation of the waves and their intensities across the city. Three historical seismic events were selected, each with a different hypocenter; coupled with the city's topographic and geological data, they yielded a seismic intensity distribution map of the city for each of the three earthquakes. The second step was to map the data on the city's buildings and infrastructure, categorized by construction type.

Figure 2: Final map showing the percentage of infrastructure damage in the event of a seismic episode.

For the final touch, these maps were combined in the GIS, the data were cross-referenced, and standard damage-evaluation matrices, relating construction type to the seismic intensity of each zone, were added. These matrices are the same ones used for the seismic scenarios of California and Japan. This is how they obtained the map estimating the percentage of damage per zone.
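The damage-matrix step is essentially a lookup table crossed with the building inventory. Here is a minimal sketch of the idea; the damage percentages, building categories, and intensity values below are invented for illustration and are not the California/Japan matrices the study imported.

```python
# Mean damage ratio (%) by construction type and local seismic intensity.
DAMAGE_MATRIX = {
    "adobe":                {6: 15, 7: 35, 8: 60},
    "unreinforced_masonry": {6: 10, 7: 25, 8: 50},
    "self_built":           {6: 12, 7: 30, 8: 55},
    "reinforced_concrete":  {6: 2,  7: 8,  8: 20},
}

def zone_damage(building_counts, intensity):
    """Expected damage (%) for a zone, weighted by its building inventory."""
    total = sum(building_counts.values())
    return sum(DAMAGE_MATRIX[btype][intensity] * count
               for btype, count in building_counts.items()) / total

# Old city center: mostly adobe and unreinforced masonry.
old_town = {"adobe": 120, "unreinforced_masonry": 80, "reinforced_concrete": 10}
print(round(zone_damage(old_town, 8), 1))   # 54.3 -> hardest-hit zone on the map
```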

In conclusion, the final map is indeed representative of both the intensity of the waves and the types of buildings and infrastructure. In the city center (the rectangle), the oldest part of the city, the "adobe", "unreinforced masonry", and "self-built" buildings would be the hardest hit in an earthquake, whereas the rest of the city, the newest part, is essentially built of "reinforced concrete", which makes the buildings far more resistant to earthquakes.
Finally, it is important to underline the value of GIS in this kind of region, where politicians are not always attentive to the risks around them and, in particular, to the socio-economic problems those risks can cause.

Monday, September 22, 2014

U.S. Census Data, GIS, and the Preston Medical Library.

     While GIS can be a valuable asset on its own, when paired with other forms of gathering information it can truly be a limitless tool for data collection, analysis, and presentation. In 2011, the Preston Medical Library in Tennessee used a combination of U.S. census data and GIS programs to better help its customers through an outreach program called the Consumer and Patient Health Information Service, or CAPHIS.

     Because more medical risks have been associated with lower literacy rates, the library's goal was to provide CAPHIS customers with information they could understand. The program works as follows: when customers or patients call the library seeking medical information, the library uses Microsoft Access to locate the caller through a query by ZIP code, then uses the census data and GIS programs to examine literacy rates at the caller's location. These literacy rates are used to estimate the caller's socioeconomic status, and the library can then provide the consumer with medical advice and information written at a reading level matched to that estimate.
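The flow described above reduces to a pair of lookups. A hedged sketch is below; the ZIP codes, literacy figures, and the 15% threshold are invented placeholders, not values from the census or the article.

```python
# Share of adults below basic literacy, keyed by ZIP code (placeholder values).
ZIP_BELOW_BASIC_LITERACY = {
    "37920": 0.18,
    "37996": 0.07,
}

def materials_reading_level(zip_code):
    """Choose a reading level for health materials from the caller's ZIP code."""
    below_basic = ZIP_BELOW_BASIC_LITERACY.get(zip_code)
    if below_basic is None:
        return "standard"            # no census estimate: send default materials
    if below_basic > 0.15:
        return "easy-to-read"        # plain-language handouts for this area
    return "standard"

print(materials_reading_level("37920"))   # easy-to-read
print(materials_reading_level("37996"))   # standard
```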

   The benefits of having this combination of GIS and census information are numerous. One key insight gathered from this methodology was a new picture of which diseases and medical conditions are prevalent in certain counties. Another bonus of the system was that all of the information could be placed on a map: maps were created to display how many calls took place per 100,000 people and the rate of ambulatory disabilities per 100,000 people.



     All of this information could have been gathered by recording the numbers presented above. However, by using GIS, the spread of disease, callers, and their proximity to under-served areas are easier to grasp and understand. GIS helps to paint a picture of a large, complex amount of information and present it in a streamlined and manageable format.

Socha, Y. M., Oelschlegel, S., Vaught, C. J., & Earl, M. (2012). Improving an outreach service by analyzing the relationship of health information disparities to socioeconomic indicators using geographic information systems. J Med Libr Assoc, 100(3).  https://lms.southwestern.edu/file.php/4373/Literature/Socha-2012-HealthOutreach.pdf

The emergence of GIS qualifications and their effects on GIS jobs

Careers in GIS are an interesting phenomenon because the field is still relatively new, yet the demand for GIS work is high.  As such, there are heated deliberations over what constitutes a “GIS professional”.  DiBiase thinks that a “GIS Professional [is] someone who makes a living through learned professional work (see table below) that requires advanced knowledge of geographic information systems and related geospatial technologies, data, and methods” (1).








Bill Huxhold was a GIS professional who, in the nineties, promoted creating a system of qualifications for the title of GIS professional.  Huxhold's protests were heard: he eventually convinced the Urban and Regional Information Systems Association (URISA) to establish a certification committee to study the problem, and his ideas became popular in the GIS community as well.







 As you can see in the table above, GIS employment is growing rapidly and is in high demand, partially due to the establishment of qualifications for GIS professionals.  "In 2010, DOLETA issued a Geospatial Technology Competency Model (GTCM) that identifies the specialized knowledge and abilities that successful geospatial professionals possess" (4).  Employers, students, and educators can all use the tool for their own purposes.  There are now multiple certifications a GIS professional can acquire, such as the GISP, that signify to employers that the professional has the correct skill level.



Citation:
DiBiase, David.  2012. “Strengthening the GIS Profession” ArcNews.
Rebecca Huteson
Mapping could help stop Ebola’s Spread

The spread of the Ebola virus through West Africa isn't thoroughly understood. Lars Skog has researched the spread of past epidemics, like the Black Death, the Russian flu pandemic of 1889, the Asiatic influenza of 1957, and the swine flu, to better understand how these diseases spread. The spread of the Black Death in the fourteenth century bears a resemblance to the spread of the Ebola virus because both are spread by small mammals. Based on our current level of knowledge, Ebola is spread by fruit bats: some rural West Africans hunt them, and the disease can also be spread by their droppings.

Answering questions about which habitats the bats prefer, what factors change those habitats, and how the virus affects the bats' health will help us understand how the virus spreads. Geoinformation technology is already available to public health response organizations, but collecting more data about these bats and the spread of Ebola could help stop it.

Callahan, D. (2014). Mapping could help stop Ebola's spread. Directions Magazine.


Saturday, September 20, 2014

Looking at the University of Texas’s Solar-Power Sustainability Potential


There has been a recent push to integrate more solar-power technology into the University of Texas campus. As you may know, solar panels are often placed on the roofs of buildings, where they convert energy from the sun into electrical power for the building's interior. Solar power is meant to be a more sustainable and environmentally friendly alternative, but there are some obstacles to putting up roof panels and increasing the university's solar-power percentage. First, some buildings at the University of Texas feature red tiled roofs that are held with sentiment within the community; many people would not be too fond of the idea of solar panels being mounted on top of them, adding large steel frames to the roofs' historical image. Second, integrating solar-power technology would not be the most economically sound idea for the university's budget: the University of Texas currently has a power generation system on campus that is working well, and adding solar-power technology is not the highest priority.

            However, it is important to note that the University of Texas has great potential for solar technology. The extent of this potential was mapped out with geographic information systems (GIS) technology by parties affiliated with the solar-power movement. Here is a diagram of the university's rooftop solar-power potential, constructed from a digital elevation model, which portrays the elevation of objects from an aerial view, with light detection and ranging (LIDAR) technology shaping out the buildings:
*The closer to red the color is, the more potential that area of rooftop has for solar-power generation. Areas represented by colors closer to blue would be less ideal for placing solar panels as they are more often in the shade.

            Finally, it is important to note that the price of solar-power technology is going down. Because of the information gathered by Moulay Anwar Sounny-Slitine, a former master's student in the University of Texas's Department of Geography and the Environment, and others involved in the push for more solar power on campus, the ideal places to put solar panels will already be known when the economically appropriate time comes. As the diagram suggests, the university has many rooftops that could hold solar panels and provide environmentally friendly technology, and plenty of them are not covered in the prized red tiles that carry aesthetic sentiment within the university community. According to Sounny-Slitine, this is a fair amount of roof area that can be utilized in the future, an exciting prospect for the University of Texas's sustainability.
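To see why rooftop area matters, here is a back-of-envelope yield estimate. Every number below is an assumption for illustration (typical magnitudes, not figures from Sounny-Slitine's analysis).

```python
roof_area_m2 = 2000          # usable, unshaded roof on one hypothetical building
insolation = 5.3             # kWh per m^2 per day, roughly typical for Austin, TX
panel_efficiency = 0.18      # fraction of sunlight converted to electricity
performance_ratio = 0.8      # losses from wiring, inverters, heat, dust

kwh_per_year = (roof_area_m2 * insolation * panel_efficiency
                * performance_ratio * 365)
print(f"{kwh_per_year:,.0f} kWh per year")   # about 557,000 kWh per year
```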

Works Cited

Sounny-Slitine, M. (2011, October 21). Solar Power Potential on the University of Texas Campus. Retrieved September 20, 2014.

Monday, September 15, 2014

Measuring Intercepted Solar Radiation

Overview     

     Many factors influence the true amount of solar radiation that actually reaches the planet's surface, including the orientation of the Earth's surface, sky obstructions, and surrounding topographic features. Because there are so many factors that can alter how much radiation is being intercepted, P.M. Rich, W.A. Hetrick, S.C. Saving, and R.O. Dubayah developed an algorithm that allows for rapid calculation of the amount of intercepted radiation. Each of the factors that affect interception was given a formula to account for its role.



Two different projections for intercepted solar radiation.
Top: hemispherical coordinate system.
Bottom: equiangular coordinate system.




Process

     In order to measure radiation, the sky is projected onto a plane and then divided into sections; dividing the sky this way makes it easier to build a table of values that can be compared and analyzed. Each section of the sky, depending on the amount of radiation received, is assigned a calculated "irradiance" value, irradiance being the amount of electromagnetic radiation per unit area falling on a specific surface. Having the value for the full amount of radiation (irradiance) allows influential factors to be accounted for with the help of cosine algorithms. After accounting for those factors, the values are put into a table for analysis.
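Here is a minimal sketch of the sky-division idea, not the authors' exact algorithm: split the sky hemisphere into zenith-by-azimuth sectors, assign each sector an irradiance, zero out sectors blocked by obstructions, and weight each sector's contribution by the cosine of its zenith angle. The sector counts, uniform sky, and the blocked region are all illustrative assumptions.

```python
import numpy as np

n_zen, n_az = 8, 16                          # 8 zenith rings x 16 azimuth wedges
zen_centers = np.radians((np.arange(n_zen) + 0.5) * (90.0 / n_zen))

sector_irradiance = np.ones((n_zen, n_az))   # uniform sky for simplicity
visible = np.ones((n_zen, n_az), dtype=bool)
visible[5:, :4] = False                      # say a ridge blocks part of the low sky

# Cosine correction: sectors near the horizon contribute less radiation to a
# horizontal surface than sectors near the zenith.
weights = np.cos(zen_centers)[:, None]

received = (sector_irradiance * weights * visible).sum()
open_sky = (sector_irradiance * weights).sum()
print(f"fraction of open-sky irradiance received: {received / open_sky:.2f}")  # ~0.96
```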




Examples of how the sky can be divided. 
Top: A more evenly divided sky provides calculations from all sky directions.
Bottom: The unevenly divided sky produces better calculations when dealing with one specific spot and one particular sky direction.

Evaluation & Conclusion


     The calculated values were compared to topographic maps in order to find out which obstructions were causing some areas to collect less radiation than others. The division of the sky played a key role in the results: a more finely and equally divided sky provided better results when measuring radiation from all directions, while unequally divided sections produced better results when trying to determine the radiation collected at one particular spot from a particular direction.

This process is viable, flexible, and applicable to a wide array of uses. The algorithm could help decide which areas would make the best use of solar panels for better energy yields. Because the Sun provides such a large amount of Earth's energy, and because each influential factor can always be given its own cosine equation, this algorithmic technique for measuring solar radiation is flexible enough to be a contender in future environmental energy problems and endeavors.


Rich, P.M., R. Dubayah, W.A. Hetrick, and S.C. Saving. 1994. Using viewshed models to calculate intercepted
solar radiation: applications in ecology. American Society for Photogrammetry and Remote Sensing Technical
Papers. pp 524–529. 

http://professorpaul.com/publications/rich_et_al_1994_asprs.pdf


Could Katrina Have Done More Damage Without Lidar?

A long time has gone by since the devastating Hurricane Katrina; nevertheless, New Orleans still carries the scars of that dark day of August 29, 2005. What if we went back to August 2005 to learn a little more about what happened behind the scenes of that event?

At the time, a mapping tool, lidar, was used to collect information about the topography of the inundated areas of New Orleans. With "a high level of spatial detail and vertical accuracy of elevation measurements, light detection and ranging remote sensing is an excellent mapping technology for use in low-relief hurricane-prone coastal areas," according to Dean Gesch. This high-resolution, high-accuracy elevation data is thus more than useful for determining flooding risk in cities, especially coastal ones, and for studying responses to the impacts of storms. Indeed, such elevation data proved essential to hurricane response and recovery activities, since it can accurately map the different land-surface elevations within a city. Even though lidar was still relatively new in the world of remote sensing in 2005, its advanced technology led the U.S. Geological Survey to use it for the National Elevation Dataset. New Orleans' elevation data had been updated in June 2005, so it was already available for the response to Katrina.
Figure 1: The different elevations of New Orleans; red is the highest and blue the lowest.

Immediately after the levee breaches, there was a demand for a map showing the extent and magnitude of the flood waters in the city. The National Elevation Dataset proved a great help for mapping the extent and depth of the inundation, because no aerial imagery of the area was available at the time. Basically, knowing that the level of Lake Pontchartrain and the flood waters had equalized, the water depth was calculated from a lake-level gage on the lake and the elevation data provided by lidar. Because the map could be produced quickly and effectively, officials could project the length of time required to remove the water from the city.
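The depth calculation itself is strikingly simple once the equalization assumption is made: subtract the lidar-derived ground elevation from the lake level and clip dry ground to zero. A sketch with invented numbers:

```python
import numpy as np

lake_level_ft = 2.0                          # reading from the lake-level gage
dem_ft = np.array([[ 4.0,  1.0, -2.0],
                   [ 0.5, -4.5, -6.0]])      # lidar land-surface elevations (ft)

# Water depth = lake level minus ground elevation; cells above the water
# line (negative differences) are dry, so clip them to zero.
depth_ft = np.clip(lake_level_ft - dem_ft, 0.0, None)
print(depth_ft)   # deepest water sits over the lowest-lying neighborhoods
```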


Figure 2: Map of the relative water depth for the New Orleans area.

If we compare the two figures, we can see that the most devastated parts of the city could have been predicted: the lowest land-surface elevations on the first map correspond to the deepest flood waters on the second.

What if we changed our approach?
Unfortunately, Dean Gesch, the author of the article, does not say whether this tool has been used to prevent environmental hazards such as hurricanes. In any case, lidar probably helped New Orleans' mayor, Ray Nagin, evacuate the riskiest parts of the city. We could, however, question the supposedly safe levees. Are they as safe as we think? Do they really prevent flooding? Given what happened in New Orleans, that may not even be the right question. Of course levees provide an important measure of safety, and category 5 hurricanes are not that common, but you have to be ready for the worst at any time; Mother Nature is not going to announce months in advance what she is up to. We now know that the levees were not strong enough to resist this hazard, and when you look at a cross-section of New Orleans, you understand how it ended up like a vast swimming pool.
Figure 3: Map of the New Orleans area and the elevation of its levees.


To conclude, we might ask ourselves why people keep putting themselves in dangerous positions. What if we stopped trying to change nature and adjusted to her instead? That is probably one of the longest-running debates that governments cannot find enough time to address.

Reference: Gesch, D. (2005). Topography-based analysis of Hurricane Katrina inundation of New Orleans.
For Figure 3: http://en.wikipedia.org/wiki/New_Orleans#mediaviewer/File:New_Orleans_Levee_System.svg