Friday, December 11, 2015
GTography: Race and City Council Members in Georgetown
This map shows the distribution of the Hispanic population in Georgetown alongside the city council districts. Georgetown has 7 city council districts and a sizable Hispanic population (21.8%). Cities are typically racially segregated, as seen in GIS lab #18. Here it is apparent that some council districts (like that of Councilman Gonzalez) have large Hispanic populations, while other districts (like that of Councilman Hesser) have very small ones. Issues faced by Hispanic residents are therefore likely to fall most heavily on the council members whose districts contain those heavy clusters. The city council person for Southwestern University is Rachael Jonrowe, District 6. Data for this map were obtained from the City of Georgetown and Williamson County GIS databases.
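Mapping demographics onto council districts comes down to a point-in-polygon test: each census block (or resident point) gets assigned to the district polygon that contains it. A minimal ray-casting sketch, using a made-up square district rather than real Georgetown geometry:

```python
# A minimal point-in-polygon test (ray casting), the core operation behind
# assigning census blocks to council districts. The district polygon and
# test points below are invented coordinates, not real Georgetown data.
def point_in_polygon(pt, poly):
    """Return True if pt (x, y) lies inside poly, a list of (x, y) vertices."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count crossings of a horizontal ray extending right from pt.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

district = [(0, 0), (4, 0), (4, 4), (0, 4)]  # toy square district
print(point_in_polygon((2, 2), district))    # inside
print(point_in_polygon((5, 5), district))    # outside
```

GIS packages do this (plus demographic aggregation) at scale, but the geometric test underneath is this simple.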
Tuesday, December 8, 2015
This article discusses the geographic features of Florida. The natural landscape of southern Florida is the Everglades. These areas are wet and unpredictable, which makes it especially difficult to build structures. The slow, constant flow of water from Lake Okeechobee is a natural phenomenon that disrupts human development plans. To make the land usable for construction, the Everglades Drainage District (EDD) and the Internal Improvement Fund (IIF) spent about $18 million on draining and maintaining the land. People took advantage of the altered water flow and now grow crops in these areas. Despite the immense economic growth, these actions are disturbing the region's natural ecosystems, and the long-lasting effects remain uncertain.
Walker, R., & Solecki, W. (2004). Theorizing land-cover and land-use change: The case of the Florida Everglades and its degradation. Annals of the Association of American Geographers, 94(2), 311-328.
Monday, November 16, 2015
Friday, November 13, 2015
Vegetation mapping methods have evolved dramatically over the years. Instead of field mapping and photo-interpretation, there are now more accurate and efficient ways to map vegetation across a landscape. One of the most common is to observe and analyze the spatial distribution of certain vegetation together with specific environmental variables; this is called predictive vegetation modeling.

To build such a vegetation model accurately, one needs maps of environmental variables and spatial information about the vegetation of interest. The relationship between environment and vegetation can either be observed directly or analyzed statistically. The resulting model is typically a static, or equilibrium, model. Static models are the most prevalent and the easiest to construct. They use the temperature, precipitation, elevation, elevation-derived terrain variables, and surface composition of the area being studied. These models are useful for drawing conclusions about where different types of vegetation are most likely to be located.
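As an illustration of how a static model turns environmental variables into a prediction, the sketch below scores grid cells with a logistic function. The coefficients and cell values are invented for illustration, not fitted to Miller and Franklin's data:

```python
import math

# Toy static predictive vegetation model: each grid cell carries environmental
# variables, and hand-picked coefficients (illustrative, not fitted) map them
# to a probability of presence for one vegetation type via a logistic function.
def presence_probability(cell, coeffs, intercept):
    z = intercept + sum(coeffs[k] * cell[k] for k in coeffs)
    return 1 / (1 + math.exp(-z))

coeffs = {"elevation_km": -1.2, "precip_m": 2.0, "temp_c": 0.1}  # assumed values
cells = [
    {"elevation_km": 0.5, "precip_m": 1.1, "temp_c": 15},  # low, wet, warm
    {"elevation_km": 2.8, "precip_m": 0.3, "temp_c": 2},   # high, dry, cold
]
for cell in cells:
    p = presence_probability(cell, coeffs, intercept=-1.0)
    print(f"P(presence) = {p:.2f}")
```

In a real study the coefficients would be fitted to field observations (e.g., with a generalized linear model, as in the cited paper) and the cells would come from environmental raster layers.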
Miller, J., & Franklin, J. (2002). Modeling the distribution of four vegetation alliances using generalized linear models and classification trees with spatial dependence. Ecological Modelling.
Monday, November 9, 2015
Wednesday, November 4, 2015
GTography: Bakeries in Georgetown
This map shows the bakeries that can be found in Georgetown. There are 6 depicted on this map, and the majority are on the west side of town.
Tuesday, September 29, 2015
Voter Migration and the Geographic Sorting of the American Electorate
In the United States, most citizens can be divided into two categories: Republican or Democrat. We can determine this by looking at majority votes for politicians and which ones hold office. States can be divided into red (Republican) or blue (Democrat) states. In the map below, we can see how the United States is divided by political party preference, with colors indicating the party affiliation of each governor. However, the party affiliation of an area can sometimes be at odds with an individual's preferences, which can cause migration to an area that the individual finds more suitable for his or her political views.
Party Control of Governors' Offices (December 2014). Blue: Democratic Governor; Red: Republican Governor; Yellow: Independent Governor. (Areas in grey boxes at the bottom left are US territories.)
In the study by Cho, Gimpel, and Hui, the migration patterns of United States citizens were examined by party affiliation. In 2004, 2006, and 2008, seven states were examined to determine how this migration affected the "political landscape" of each state: New Jersey, Maryland, Delaware, and Pennsylvania in the East, and California, Oregon, and Nevada in the West. These states were chosen "for their adjacency, because they register voters by political party, and, importantly, because they maintain accessible, high-quality voter registration records" (Cho, Gimpel, and Hui, 2014). Using these records, the migration patterns of voters could be followed. The individuals tended to migrate to areas that were more politically favorable to them: Republicans moved to where Republicans would benefit, and Democrats moved to where Democrats would benefit. Many factors come into play when defining a favorable area for an individual and his or her party, such as "racial composition, income, population density, and age" (Cho, Gimpel, and Hui, 2014). According to the study, income and economic status were the strongest incentives. Many other factors come into play as well, but they are harder to gauge because they can be personal to an individual, that is to say, not entirely political. While the study shows definite results, it does not represent all of America, only seven states. Because of this, the data must be taken with a grain of salt.
Resources: Cho, W., Gimpel, J., & Hui, I. (2014). Voter migration and the geographic sorting of the American electorate. Annals of the Association of American Geographers, 856-870. Retrieved September 29, 2015.
GIS and Earthquakes
GIS Mapping of Earthquake-Related Deaths and Hospital Admissions from the 1994 Northridge, California, Earthquake
Earthquakes pose a serious risk to human health and public safety. They have the potential to destroy entire cities and kill thousands of people in a matter of minutes. Peek-Asa et al. (2000) studied the 1994 Northridge, California earthquake that devastated the city. The history of that deadly earthquake is described below for a better understanding of the authors' study.
Some basic background on earthquakes helps in understanding the methods used in this study. Earthquakes are tremors and shaking of the earth's crust caused by seismic activity, the sudden release of energy.
The Northridge quake struck an earthquake-prone area of California. While it lasted only 10-20 seconds, it had a moment magnitude of 6.7 and produced the highest ground acceleration ever instrumentally recorded in a North American urban area. The tremors were felt as far away as Las Vegas, Nevada, about 220 miles from the epicenter, which was located in the San Fernando Valley, about 20 miles northwest of downtown Los Angeles.
There were several thousand aftershocks after the main quake, some of them still quite large. The death toll was 57 people, with more than 5,000 injured. Furthermore, the Northridge quake caused approximately $13-$40 billion in property damage.
First and foremost, earthquakes are extremely unpredictable, and there is little warning before one occurs. The authors set out to study the spatial relationship between the injuries people sustained and the seismic characteristics and location of the earthquake. Because earthquakes pose such a massive health threat, the authors saw value in relating seismic hazards and building damage to a person's risk of injury.
To accomplish this, fatalities and those injured and admitted to hospitals were identified and pinpointed. All injury locations were then plotted on a map of the area using GIS methods and software. Injuries were subsequently analyzed with respect to distance from the epicenter, as well as other factors such as the proportion of damaged buildings in the area and peak ground acceleration.
Peek-Asa et al. (2000) found that injury severity was inversely related to distance from the epicenter (more injuries occurred in areas closer to the epicenter, and fewer occurred farther away) and increased with cumulative ground motion and building damage. However, the study did not show that injury severity and incidence were completely predicted by building damage and seismic hazard.
They also suggested that outside factors, such as a person's age and activity during the earthquake (for example, driving a car), could have affected injury severity. The figure below shows injury locations in relation to shaking intensity in each area, as well as the proportion of damaged residential structures. Furthermore, Peek-Asa et al. (2000) found that injuries of all severities occurred over a wide range of distances from the epicenter, and they concluded that rescue efforts cannot be focused solely on the immediate damage zone.
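The core spatial step in such a study is measuring each injury site's distance from the epicenter so cases can be grouped into distance bands. A sketch using the standard haversine formula; the epicenter is approximately the Northridge one, but the injury coordinates are invented:

```python
import math

# Sketch of the spatial step in the analysis: compute each injury site's
# great-circle distance from the epicenter so injuries can be grouped by
# distance band. The injury coordinates are illustrative, not study records.
def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

epicenter = (34.213, -118.537)  # approximate Northridge epicenter
injury_sites = [(34.22, -118.55), (34.05, -118.25), (36.17, -115.14)]  # made up
for lat, lon in injury_sites:
    d = haversine_km(*epicenter, lat, lon)
    print(f"({lat}, {lon}): {d:.0f} km from epicenter")
```

With distances in hand, injuries can be binned (e.g., 0-10 km, 10-25 km, ...) and severity compared across bins, which is the shape of the inverse relationship the study reports.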
Reference:
Peek-Asa,
C., Ramirez, M. R., Shoaf, K., Seligson, H., & Kraus, J. F. (2000). GIS
mapping of earthquake-related deaths and hospital admissions from the 1994
Northridge, California, earthquake. Annals of Epidemiology, 10(1),
5-13.
Web Access:
http://www.researchgate.net/publication/12655552_Peek-Asa_C_Ramirez_MR_Shoaf_K_Seligson_H_and_Kraus_JF_GIS_mapping_of_earthquake-related_deaths_and_hospital_admissions_from_the_1994_Northridge_California_earthquake_Ann_Epidemiol10_5-13
https://en.wikipedia.org/wiki/1994_Northridge_earthquake
https://en.wikipedia.org/wiki/Earthquake
Monday, September 28, 2015
The Benefits of Improved National Elevation Data
National elevation data is extremely useful in areas such as flood hazard mitigation, agricultural productivity, infrastructure and energy development, resource conservation, and national security. The National Digital Elevation Program (NDEP) was created to meet government and industry needs for digital elevation models. The program includes numerous federal agencies, such as the USGS, the Census Bureau, and several agencies within the Department of the Interior. In general, elevation data for a given area is updated only about every 30 years, while the technology advances at a much faster pace. At the time of this writing, the elevation data needs of the United States were not being met, so a task force was created to assess the potential for improving the national elevation data. The National Enhanced Elevation Assessment (NEEA) was conducted in 2011 to assess current needs for improved elevation data, weigh the costs and benefits of improving the data, and evaluate new models.
The benefits of improved data are many, and their significance cannot always be captured in dollar terms. For example, improved elevation data can eliminate the need for survey crews when constructing new roads, preventing the surveyor deaths that occur every year. A larger-scale example occurred in Washington, where improved elevation modeling helped discover a fault near the Tacoma Narrows that led to an over $700 million bridge repair. As recently as 2014, President Obama declared that the National Digital Elevation Program would be used as part of the Climate Action Plan to identify which areas will be most affected by climate change. Improved data can also be used for siting wind farms, directing agricultural runoff, and planning efficient oil and water pipeline routes. The NEEA's research also showed that the technology has matured to the point where updating the digital elevation models makes sense from a cost standpoint.
The assessment determined that the benefits of improving the national elevation models outweigh the costs by a large factor. There are several different quality levels of elevation data, however, and each comes with a corresponding level of benefits. Every quality level except the very highest yields a net benefit to the US, at ratios greater than 4:1. Figure 1 shows the relative image quality of the three highest quality levels, and Figure 2 shows the cost/benefit analysis of quality levels ranging from the highest to the lowest improvements. The assessment ultimately led to the creation of the 3D Elevation Program (3DEP), which is now being implemented. Federal and state agencies work together with others to improve elevation data using light detection and ranging (lidar) and, specifically for Alaska, interferometric synthetic aperture radar (IFSAR). Data will be collected on 8-year cycles, and annual benefits from a fully funded program would be $690 million. 3DEP currently receives $50 million annually and needs an additional $96 million annually to be fully implemented. This relatively small investment could lead to huge savings over time, especially in the case of disasters. Improved elevation data leads to better emergency flood mitigation plans, better preparedness for the impacts of climate change, and increased operating efficiency and capacity. Watch for annual improvements from 3DEP in the coming years. The program's website is http://nationalmap.gov/3DEP/.
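The funding figures above imply the program's benefit-cost ratio directly, assuming full funding means the current $50 million plus the additional $96 million per year:

```python
# Quick check of the benefit-cost ratio implied by the figures in the post.
annual_benefit = 690_000_000    # fully funded 3DEP benefit, per the assessment
current_funding = 50_000_000
additional_needed = 96_000_000

total_annual_cost = current_funding + additional_needed
ratio = annual_benefit / total_annual_cost
print(f"Benefit-cost ratio: {ratio:.1f}:1")
```

The result, roughly 4.7:1, is consistent with the assessment's claim of ratios greater than 4:1.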
Snyder, G. I. (2013). The benefits of improved national elevation data. Photogrammetric Engineering and Remote Sensing, 79(2).
The deforestation of land affects the state of streams and the amount of water in the atmosphere. Over half of the native vegetation has been removed in the watershed of the Araguaia River in east-central Brazil.
With less vegetation there is less evapotranspiration, which means more moisture stays in the ground rather than entering the air. Most of the deforestation is driven by high demand for agricultural land; land is more useful to someone trying to make a living when they can grow crops and raise cattle on it. Despite the economic advantages of agricultural land use, the ecosystem evolved with dense vegetation, and removing one of its most identifiable and important characteristics is unnatural. Water runoff, river discharge, erosion, and sediment fluxes are the most common hydrological, geomorphological, and biochemical problems arising from the mass deforestation.
Coe, Latrubesse, Ferreira, & Amsler. (2011). The effects of deforestation and climate variability on the streamflow of the Araguaia River, Brazil. Springer Science+Business Media.
Friday, September 25, 2015
GIS as a Disaster Management Tool
In 2010, Haiti was struck by a magnitude 7.0 earthquake that killed between 220,000 and 316,000 people and caused tremendous damage to homes and businesses on the island, making it the deadliest natural disaster of the last decade. In the immediate aftermath of the earthquake, Haiti's communication network was destroyed, and actionable information was not being communicated effectively.
The USGS, branches of the U.S. Military, and FEMA created maps of the earthquake using GIS imagery to show where the strongest effects were felt and, later, where the greatest casualties occurred.
The graphics below illustrate how GIS can help decision makers allocate resources during emergencies as efficiently as possible.
Sources:
http://earthquake.usgs.gov/earthquakes/pager/events/us/2010rja6/index.html
http://voices.nationalgeographic.com/2012/07/02/crisis-mapping-haiti/
http://www.esri.com/news/releases/10_1qtr/haiti.html
GIS is making leaps in big data: APIs from popular apps like Flickr provide big data with geographic context. This data, known as Volunteered Geographic Information (VGI), can be a valuable basis for real-time geodemographics and user profiling. It also comes with obstacles of validity and reliability that require further testing to overcome. What is big data? Along with mobile phones tracking their users, social media applications such as Facebook, Twitter, and Flickr collect large amounts of data about their consumers; this is big data. Advances in mobile media have enabled the collection of big locational data about anyone, anywhere, at any time. Paired with GIS databases, companies can use geodemographics to analyze and visualize their target consumers and create lucrative sales regions for their goods.
This is a map of the tourist density and flows calculated from the Flickr Database.
VGI is created outside the professional practices of the GIS sector but builds on GIS technology. Because VGI is relatively new, critics, including many GIS practitioners, are concerned about certainty, accuracy, and inferior map quality. The clear potential of VGI, however, has led many practitioners to accept it. Hopefully VGI continues to develop and can further help businesses and the community.
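A density map like the Flickr example starts by binning geotagged points into grid cells. A minimal sketch, with invented photo coordinates rather than real Flickr data:

```python
from collections import Counter

# Bin VGI point data (e.g., geotagged photo coordinates) into a regular
# lat/lon grid, the first step toward a density map. Points are invented.
def density_grid(points, cell_size):
    """Count points per (row, col) cell of a regular lat/lon grid."""
    counts = Counter()
    for lat, lon in points:
        cell = (int(lat // cell_size), int(lon // cell_size))
        counts[cell] += 1
    return counts

photos = [(48.858, 2.294), (48.861, 2.336), (48.853, 2.349), (40.690, -74.045)]
print(density_grid(photos, cell_size=0.1))
```

Real workflows add map projection, smoothing, and filtering of unreliable points, but per-cell counting is the core of any density surface.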
Fischer, F. (2012). VGI as Big Data: A new but delicate geographic data-source. GeoInformatics, 15(3), 46-47.
Topography-based Analysis of Hurricane Katrina Inundation of New Orleans
During the Katrina relief efforts in 2005, response teams in low-lying New Orleans relied on geospatial data to identify the most inundated parts of the city. Lidar, the geospatial technology used to obtain the inundation data, provides high-resolution, high-accuracy elevation data that proved valuable for developing topography-based products crucial in the days immediately following the storm. Because of its high level of spatial detail and the vertical accuracy of its elevation measurements, USGS scientists were able to estimate floodwater volume, identify areas of extreme flooding, and more.
Because of its high detail and accuracy, lidar is an excellent mapping technology for low-relief, hurricane-prone coastal areas, and it could be applicable to other disaster relief efforts with a geospatial component.
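The floodwater volume estimate that lidar enables can be sketched very simply: with a gridded elevation surface and an observed water level, the volume is the summed depth over inundated cells times cell area. The tiny DEM below is invented, not New Orleans data:

```python
# Simplified flood-volume estimate from a gridded elevation surface (DEM).
# Cells below the water level are flooded; their depths, times cell area,
# sum to the floodwater volume. The tiny DEM is invented for illustration.
def flood_volume(dem, water_level, cell_area_m2):
    """Return (flooded cell count, volume in cubic meters)."""
    depths = [water_level - z for row in dem for z in row if z < water_level]
    return len(depths), sum(depths) * cell_area_m2

dem = [
    [1.0, 0.5, 0.2],
    [0.8, 0.1, 0.0],
    [1.5, 0.9, 0.4],
]  # elevations in meters
cells, volume = flood_volume(dem, water_level=0.6, cell_area_m2=25.0)
print(f"{cells} cells flooded, {volume:.1f} m^3 of water")
```

This is why vertical accuracy matters so much: an error of a few tens of centimeters in the DEM shifts which cells count as flooded and by how much.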
Gesch, D. (2005). Topography-based analysis of Hurricane Katrina inundation of New Orleans. Science and the Storms: The USGS Response to the Hurricanes of 2005.
When a powerful and deadly hurricane makes landfall, geospatial data can be very useful. In 2005, Hurricane Katrina, which reached category five strength, made landfall near New Orleans and affected much of the city's infrastructure and people. Lidar data is used to determine the land surface elevation of a place, and it had been collected in Louisiana in 2002 after an oil spill. As a result, high-resolution elevation data for New Orleans was already available when Katrina made landfall three years later, and it was extremely helpful in shaping the response to the aftermath. The lidar data was needed to determine the magnitude of the floodwaters in specific areas around the city. It also allowed estimates of floodwater volume, which were needed to anticipate how long it would take to remove the floodwater from the city. With this data, responders were better informed about how to respond to a flooded area. In addition, this information can be used to determine how flooding might impact an urban environment, and it will help in planning infrastructure and reconstruction so the city can be better prepared for these types of disasters in the future.
Gesch, D. (2005). Topography-based analysis of Hurricane Katrina inundation of New Orleans. Science and the Storms: The USGS Response to the Hurricanes of 2005.
Thursday, September 24, 2015
This study reviews the latest developments in the use of the Normalized Difference Vegetation Index (NDVI) in ecology. NDVI data have been used to explain the distribution and abundance of herbivores and non-herbivores. Because the satellite record extends back to about 1981, the importance of different temporal and spatial lags for population performance can be assessed through an understanding of population dynamics. NDVI was previously thought to be most useful in temperate environments. Models can also use it to reconstruct past vegetation patterns and to project the effects of future environmental change on biodiversity. The NDVI has since become an essential tool for assessing past and future population and biodiversity consequences of change in climate, vegetation phenology, and primary productivity.
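NDVI itself is a simple band ratio, (NIR − Red) / (NIR + Red), ranging from −1 to 1, with dense green vegetation near the high end. A sketch with illustrative reflectance values:

```python
# NDVI from near-infrared (NIR) and red reflectance: healthy vegetation
# reflects strongly in NIR and absorbs red, pushing the ratio toward 1.
# The reflectance values below are illustrative, not from any sensor.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

print(f"Dense forest: {ndvi(nir=0.50, red=0.08):.2f}")
print(f"Sparse grass: {ndvi(nir=0.30, red=0.20):.2f}")
print(f"Bare soil:    {ndvi(nir=0.25, red=0.23):.2f}")
```

In practice the computation runs per pixel over satellite imagery (e.g., the AVHRR record behind the 1981 time series), but the per-pixel formula is exactly this.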
Pettorelli, N., Ryan, S. J., Mueller, T., Bunnefeld, N., Jedrzejewsk, B., Lima, M., & Kausrud, K. (2011). The Normalized Difference Vegetation Index (NDVI): unforeseen successes in animal ecology. Climate Research, (46), 15-27.
Wednesday, September 23, 2015
Solar Radiation Models and Temperature Data
Pinde Fu and Paul Rich's article concerns solar radiation. They made insolation maps from digital elevation models in order to apply the model for spatial interpolation across different kinds of topography. The maps are best utilized in forestry and agriculture. Readily available geographic information is often not accurate. These solar radiation models are cost-effective: they do not cost much to build and do not require an insolation monitoring station. The models made by Rich and Fu were developed for the ARC/INFO GIS platform. Temperature is part of the information used by the solar radiation models. Weather stations are not very accurate for this purpose because there are not enough of them per square mile to capture temperature everywhere all of the time. Using a solar radiation model, a more accurate temperature can be estimated. Many other systems have been used to interpolate temperature; however, they often miscalculate because they do not factor geographic features into their computations. Rich and Fu set out to show that high-spatial-resolution maps can provide better temperature predictions. They derive temperature maps for a study area based on an insolation model and then outline how this process is completed.
To begin the research process, Fu and Rich chose to work at the Rocky Mountain Biological Laboratory in Gunnison County, Colorado. Eleven Hobo soil temperature sensors were buried at different locations around the county, logging temperature every hour. After the data were collected, Fu and Rich created insolation and temperature maps to display their findings; the insolation data were fed into the TopoView model. Most weather stations do not record soil temperature, which makes this study valuable. The results showed that soil temperature varies with position in the topographic landscape. During the summer, temperature was well represented at high spatial resolution; when snow is present, the effective spatial resolution drops. Calculating soil temperature is just one application of solar radiation models: the study mentions that water balance can also be studied with insolation data.
This model displays soil temperature (in degrees Celsius), with a legend relating color to temperature: the lighter the color, the higher the temperature at 20 cm depth. These soil temperatures were collected daily, with minima and maxima.
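The kind of relationship Fu and Rich exploit, soil temperature tracking modeled insolation, can be sketched with an ordinary least-squares fit. The readings below are invented, not the Gunnison County sensor data:

```python
# Ordinary least-squares fit of soil temperature against modeled insolation,
# sketching the insolation-temperature relationship the study relies on.
# Both data columns are invented for illustration.
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error of y ~ slope*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

insolation = [2.0, 3.5, 5.0, 6.5, 8.0]    # kWh/m^2/day (illustrative)
soil_temp = [6.1, 9.0, 12.2, 15.1, 17.9]  # deg C at depth (illustrative)
slope, intercept = fit_line(insolation, soil_temp)
print(f"temp ~ {slope:.2f} * insolation + {intercept:.2f}")
```

Once fitted, the line lets a high-resolution insolation map (from a DEM) be converted into a high-resolution temperature map, which is the study's central move.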
Fu, P., & Rich, P. M. (2002). A geometric solar radiation model with applications in agriculture and forestry. Computers and Electronics in Agriculture, 37(1), 25-35.
Tuesday, September 22, 2015
GIS Data Being Used to Determine How Population Density in a Local Stream Fish Aggregation Relates to the Intra-Annual Environmental Niche Variability
“Research was conducted in Labarque Creek, a second-order tributary of the Meramac River in Jefferson County, Missouri” (Anderson, Caruso, et al., 2011). Sampling was done four times over the course of a year, once per season: 30 June–2 July 2007 (shown in Fig. 1), 29–30 October 2007, 14–15 January 2008, and 26–27 April 2008. Seasonal sampling was used because the environment changes with temperature, weather, and water flow. The data obtained included stream flow rate, dissolved oxygen, and fish species. Stream flow rate varied between seasons: low in July and October, high in January and April. Dissolved oxygen, while at a sufficient level, varied from day to day, because natural factors such as temperature affect dissolved oxygen levels. Twenty-five different species of fish were caught, but only eleven were caught during every sampling period, so only those eleven species' data were used, as seen in Table 1.
The data gathered in the study suggest that the effects of the changing seasons on habitat availability matter for determining the “variation in population abundance among species” (Anderson, Caruso, et al., 2011). The results of the study were also found to contradict previous findings. In particular, “the extent and distribution of available habitat is a strong predictor of variation in population density among species, but only during colder periods within a seasonally variable environment, with the understanding that our results are based on a single location” (Anderson, Caruso, et al., 2011). This is most likely because the study took place in a small system, in this case a creek, rather than a widespread area like a river. Competition most likely increased during the colder seasons, lowering population density (Anderson, Caruso, et al., 2011). Also, predation was not taken into account as a factor; increased or decreased predation across seasons could have altered the results if tested for. Regardless, the study was a useful step toward using habitat availability as a predictor of variation in population density (Anderson, Caruso, et al., 2011).
Source:
Anderson, K., Caruso, N., Dupre, P., Knouft, J., Puccinelli, J., & Trumbo, D. (2011). Using fine-scale GIS data to assess the relationship between intra-annual environmental niche variability and population density in a local stream fish assemblage. Methods in Ecology and Evolution, (2), 303-311. doi:10.1111