News Articles, Information and Commentary on the
21st Biennial Conference of the Coastal and Estuarine Research Federation
6 - 10 November 2011,
Ocean Center,
Daytona Beach, FL USA
Thursday, September 22, 2011
Northern Gulf of Mexico (Brain-)dead Zone
In an op-ed published on 2 September 2011, coastal scientist Don Scavia argues that it is time to decide on a new approach to combat human-induced hypoxia in the northern Gulf of Mexico: "The definition of insanity is repeating the same thing over and over and expecting different results."
This post relates to Topic 3: Management applications and Topic 6: Management problems to be discussed during the Synthesis Sessions at CERF 2011.
Tuesday, September 20, 2011
DPSIR - Building Ecosystem Models of Everything, Including the Kitchen Sink
William Nuttle, Organizer for CERF 2011 Synthesis Sessions
wnuttle@eco-hydrology.com
[Figure: The "acorn" to the mighty DPSIR - Odum's Silver Springs model]
Ecosystem-based management represents a new stage in the development of ecosystem models. H.T. Odum’s ecosystem model of Silver Springs is the acorn from which mighty oaks have grown. Odum created this, the first ecosystem model, in the 1950s as a tool to synthesize information from disparate types of data and to illustrate the underlying processes at work in ecosystems. Later, beginning in the 1980s, ecosystem models found wide application in risk analyses related to the implementation of the Clean Water Act. Today, ecosystem models provide the comprehensive framework that managers use to assess environmental problems, often spanning large regions, and to evaluate proposed solutions.
With maturity and widespread application, ecosystem models have followed an arc of increasing scope and complexity. As the acorn is to the oak, Odum’s Silver Springs model is tiny in scale and shares only the most rudimentary elements with today's models. Ecosystem-based management defines an ecosystem as “a geographically specified system of organisms (including humans), the environment, and the processes that control its dynamics.” Where Odum’s model described the Silver Springs ecosystem simply, in terms of material and energy budgets, today's ecosystem models must describe everything in a region - including people - and their kitchen sinks.
The DPSIR framework is the latest stage in the continuing growth and development of ecosystem models. DPSIR stands for Driver-Pressure-State-Impact-Response; a sketch of the chain as a simple data structure follows the list below.
- Drivers are factors that result in pressures that in turn cause changes in the system.
- Pressures include factors such as coastal pollution, habitat loss and degradation, and fishing effort that can be mapped to specific drivers.
- State variables are indicators of the condition of the ecosystem (including physical, chemical, and biotic factors).
- Impacts comprise measures of the effect of change in these state variables such as loss of biodiversity, declines in productivity and yield, etc.
- Responses are the actions (regulatory and otherwise) that are taken in response to predicted impacts.
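To make the chain concrete, here is a minimal sketch of DPSIR as a data structure in Python. Every name and number in it is hypothetical, chosen only to illustrate how the five elements link together; a real assessment would attach monitoring datasets and models to each link.

```python
# Minimal sketch of the DPSIR causal chain as a data structure.
# All names and values are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class DPSIRChain:
    driver: str                               # e.g., agricultural expansion
    pressure: str                             # e.g., nutrient loading
    state: dict = field(default_factory=dict) # indicator name -> value
    impact: str = ""                          # e.g., loss of biodiversity
    response: str = ""                        # e.g., numeric nutrient criteria

# One chain, for coastal eutrophication:
chain = DPSIRChain(
    driver="agricultural expansion in the watershed",
    pressure="nitrogen loading to coastal waters",
    state={"bottom-water dissolved oxygen (mg/L)": 1.8,
           "chlorophyll-a (ug/L)": 22.0},
    impact="seasonal hypoxia over the inner shelf",
    response="state-level nutrient reduction targets",
)

print(chain.pressure, "->", chain.impact)
```

The point of the structure is traceability: each response can be traced back through impacts and state indicators to the pressure and driver it is meant to address.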
Atkins et al. (2011) argue that the needs of ecosystem-based management can be met by incorporating the concept of ecosystem services into ecosystem models constructed around the DPSIR framework. The approach described by Atkins et al. brings people more fully into the picture by addressing management responses directly and by using ecosystem services to evaluate impacts. DPSIR has been applied broadly in environmental assessments of terrestrial and aquatic ecosystems, especially in Europe. But the jury is still out on whether ecosystem models built around the DPSIR framework are yet sturdy enough to support regional management of coastal ecosystems.
This post relates to Topic 3: Management applications, Topic 5: Dynamic ecosystems and Topic 6: Management problems to be discussed during the Synthesis Sessions at CERF 2011.
Reference:
Atkins, J.P., D. Burdon, M. Elliott, and A.J. Gregory, 2011. Management of the marine environment: integrating ecosystem services and societal benefits with the DPSIR framework in a systems approach. Marine Pollution Bulletin 62:215-226 (doi:10.1016/j.marpolbul.2010.12.012)
Thursday, September 15, 2011
Spatial Planning for New Energy Development on the Oregon Coast
Robert Bailey, Oregon Coastal Management Program (bob.bailey@state.or.us)
Science plays a big role in finding a place for new “hydrokinetic energy” projects in Oregon’s crowded coastal waters. In late 2007, coastal communities and ocean fishermen were up in arms over plans to put wave energy generating facilities smack dab in the middle of crabbing and other valuable fishing areas. In response, the Governor charged the Oregon Coastal Management Program with the task of working with scientists, stakeholders, agencies, interest groups and others to ensure that new ocean energy devices avoid impacts on ocean fisheries, recreation, and other uses, and protect valuable ecological areas. This kind of effort has today become known as “marine spatial planning.” After more than three years, Oregon is now rounding the corner headed for the homestretch of this effort.
A lot of factors have come into play to create conditions that enable us to incorporate a high level of scientific data into a marine planning process. First, our state ocean policies explicitly require it. Second is the widespread availability of high-powered, low-cost information technologies, such as ArcGIS, Google Earth, on-line information resources such as the Oregon Coastal Atlas, and creation of “decision-support tools” using Open Source software. These are enabling us, along with scientists, stakeholders, and the public, to use desktop computers at home, public meetings, and in the office to find, view, and assess a variety of data about Oregon’s nearshore marine environment and its uses.
A key factor is the people involved and the fact that, over time, an informal network of scientists and other data providers has emerged along with a similar network of data managers and users within state and federal agencies and NGOs. A principal task has been to find and acquire relevant datasets and then create interactive geospatial databases that allow various data to be used together in a spatially-explicit format. Fortunately, a lot of smart (mostly young) people and some terrific technology, including some we have helped to advance, are enabling us to build a credible scientific database and decision-support tools to support this planning process.
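As a rough illustration of what such a decision-support tool does under the hood, the Python sketch below combines raster data layers into a weighted suitability score. The layers, weights, and threshold are invented for the example; they are not part of Oregon's actual tools or datasets.

```python
# Toy suitability overlay of the kind a marine spatial planning
# decision-support tool performs. Layers and weights are hypothetical.
import numpy as np

# Each layer scores every grid cell of nearshore ocean from 0 (avoid)
# to 1 (suitable). In practice these would come from GIS datasets;
# here they are random placeholders.
rng = np.random.default_rng(0)
wave_energy_resource = rng.random((100, 100))   # physical potential
crabbing_intensity   = rng.random((100, 100))   # existing use to avoid
ecological_value     = rng.random((100, 100))   # habitat to protect

# Weighted overlay: favor the energy resource, penalize conflicts.
suitability = (0.5 * wave_energy_resource
               - 0.3 * crabbing_intensity
               - 0.2 * ecological_value)

# Candidate siting cells: top 5% of suitability scores.
threshold = np.quantile(suitability, 0.95)
candidates = suitability >= threshold
print(f"{candidates.sum()} candidate cells out of {candidates.size}")
```

The hard part, as the rest of this post explains, is not the overlay arithmetic but assembling layers that are current, compatible, and credible to stakeholders.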
Even with these supportive conditions, it is still not as easy as one would think to use scientific data to support marine spatial planning decisions. Despite vast amounts of data collected over the years about the marine environment (and believe me, we do know a lot!), pulling the various kinds of data into decision-support frameworks is daunting and time-consuming. The marine environment is vast, complex in at least four dimensions, and, as we all know, highly mutable over many spatial and temporal scales. While these existing data can be (and have been) used to frame a broad understanding of how the marine environment (in this case off Oregon) functions over time and across ocean space, surprisingly little of it is directly useful to making spatially explicit planning or management decisions (“here, not there”) for a point in time (“now”).
Add to this the fact that new field studies and even simple observations of the seafloor with high-definition video constantly reveal to us how little we truly know about even the ocean within just the first few miles from shore. Add in the reluctance of scientists in one discipline to use data collected from another and the demands of stakeholders and agency decision-makers for certainty, and we have a very complex situation for ensuring scientific integrity of our final plan.
Fortunately, a plan is just a plan. It is a guide, not reality. It will frame where energy development should not go and suggest where it might. We don’t need to know everything now. Many key questions about potential environmental effects of placing one or many wave-energy devices in our nearshore environment can be answered once we know the specific size, shape, and function of the technology involved and the exact location in which it will be placed. Policies already adopted require significant monitoring and rigorous assessment of potential effects such as physical alteration of the wave regime on shoreline processes, changes in sediment transport, creation of new habitat structures where none exist, electromagnetic field effects on sharks and rays, contamination from paints and lubricating fluids, entanglement of whales and pinnipeds in a network of anchoring lines, and effects of lights on birds.
Our job as the agency charged with adopting the plan is to make sure the plan is scientifically defensible and is accepted by stakeholders, the public, and agencies as being the best we could do with what we know. The plan cannot outrun the science or faith in that science. So that will likely lead us, when all is said and done, to be fairly cautious about designating “ecological exclusion areas,” “areas important to fisheries,” and identifying areas where energy development will be allowed. But over the past three years we have created the conditions for incorporating science into the planning and…ultimately…the decision-making process. Science is, after all, the best way for us “decision-makers” to account for the complexities and uncertainties of our marine environment so that we don’t end up making decisions that subsequent generations will look upon and ask “what the heck were they thinking!”
This post relates to Topic 6: Management problems to be discussed during the Synthesis Sessions at CERF 2011.
[Figure: Schematic of the OPT wave energy system]
(Figure credit: http://nenmore.blogspot.com/2010/04/doe-grant-for-wave-energy-project.html)
Tuesday, September 13, 2011
Maryland BayStat - Fighting Crime, Restoring Ecosystems, and Connecting People
William Nuttle, Organizer for CERF 2011 Synthesis Sessions
wnuttle@eco-hydrology.com
Before becoming governor of Maryland, Martin O’Malley served two terms as mayor of Baltimore. When he entered office in 2000, Baltimore’s murder rate was five times the rate in New York City. During the 1990s New York served as a proving ground for a systematic approach to fighting crime that was credited with reducing the crime rate. The approach, known as CompStat, implements concepts of quality control and systems management borrowed from business and industry. O’Malley brought a similar program to Baltimore and expanded it into a general approach for management and accountability in government.
O’Malley’s program for restoring the Chesapeake Bay, established in 2007, goes by the name of BayStat. It’s touted as a “tool designed to assess, coordinate and target Maryland’s Bay restoration programs, and to inform our citizens on progress.” Underlying this approach is an extensive program of environmental monitoring, modeling, and analysis by state agencies and academic scientists. Managers meet frequently with scientists and political leaders to assess progress toward restoration goals and adapt management actions based on results obtained.
One is tempted to say that there is really nothing new to BayStat. There are direct parallels between elements of O’Malley’s program and elements of ecosystem-based management and adaptive management as described in numerous reports by the National Research Council, Council for Environmental Quality, and large environmental NGOs. Quantitative ecosystem indicators, performance measures, restoration targets, and a report card – they are all here.
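At bottom, a report card of this kind reduces to a simple calculation: how far has each indicator moved from its baseline toward its restoration target? The Python sketch below shows that calculation with hypothetical indicators and values; these are not actual BayStat data.

```python
# Toy report-card calculation: score each indicator as fractional
# progress from its baseline toward its restoration target.
# Indicator names, baselines, and targets are hypothetical.

indicators = {
    # name: (baseline, current, target)
    "nitrogen load (M lbs/yr)":  (60.0, 52.0, 41.0),   # lower is better
    "oyster biomass index":      (10.0, 14.0, 25.0),   # higher is better
    "underwater grasses (k ac)": (60.0, 68.0, 114.0),
}

def progress(baseline, current, target):
    """Fraction of the distance from baseline to target achieved.

    Direction is encoded by the baseline->target orientation, so the
    same formula handles lower-is-better and higher-is-better cases.
    """
    if target == baseline:
        return 1.0
    return max(0.0, min(1.0, (current - baseline) / (target - baseline)))

for name, (b, c, t) in indicators.items():
    print(f"{name}: {progress(b, c, t):.0%} of the way to target")
```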
What makes Maryland's BayStat so special?
Well, for one thing, it is remarkable to see a political leader take the reins and offer a reasoned assessment of current conditions and progress toward restoration, including a summary analysis of the data on half a dozen indicators. This shows a reassuring commitment at the political level. But there is a bit of magic at work here as well. Somehow, in the politician’s hands the bare concepts behind ecosystem-based management, which lie inert on the pages of so many technical reports, become a means to connect people with each other and people with the ecosystems in which they live. Maybe, the magic is in the ability to articulate what fighting crime and restoring ecosystems have in common.
The information in this post relates to Topic 3: Management applications and Topic 6: Management challenges of the CERF 2011 synthesis sessions.
Monday, August 29, 2011
Who, What, When, Why and How of Louisiana's 2012 Coastal Master Plan
Alaina Owens (Brown and Caldwell)
Who: The plan is being developed by the Louisiana Coastal Protection and Restoration Authority. In all, over 100 people are involved in the planning effort. The state has established a variety of ways for government staff, industry representatives, non governmental organizations, members of academia, citizens, and other stakeholders to participate in the process.
[Figure: Levee/floodwall construction]
What: The 2012 Master Plan will offer a comprehensive approach to coastal restoration and risk reduction in coastal Louisiana. The Plan focuses on the following five overarching objectives:
- Reduce economic losses from storm-based flooding to residential, public, industrial, and commercial infrastructure.
- Promote a sustainable coastal ecosystem by harnessing the processes of the natural system.
- Provide habitats suitable to support an array of commercial and recreational activities coast-wide.
- Sustain, to the extent practicable, the unique cultural heritage of coastal Louisiana by protecting historic properties and traditional living cultures and their ties and relationships to the natural environment.
- Promote a viable working coast to support regionally and nationally important business and industry.
[Figure: Barrier island restoration]
When: The plan will be submitted to the Louisiana State Legislature for approval in the spring of 2012. By legislative mandate, the plan must be updated every five years so the state can respond to changes on the ground as well as innovations in science, engineering, and policy. Louisiana’s 2012 Coastal Master Plan is the second of what will be an ongoing series of master plans, each one improving on work done before.
[Figure: Marsh creation]
Why: The wetlands of coastal Louisiana help protect communities and critical oil and gas infrastructure from storm surge, support waterborne commerce for the nation, and provide a substantial portion of the nation’s commercial fisheries landings. The coast’s expanse of natural habitats makes it one of the nation’s most distinctive and valuable landscapes. Unfortunately, Louisiana’s wetlands are being lost at an alarming rate, with an estimated loss of nearly 2,000 square miles since the 1930s.
How: The 2012 Coastal Master Plan is grounded in a coastal “vision.” The vision specifies levels of flood risk reduction for communities and targets for ecosystem services across the coast. A suite of seven predictive models is being used to predict how far various restoration and/or protection projects can move the state toward achieving its vision. Model output, combined with a set of decision criteria (factors that reflect what is important to the state), feed a decision support tool that compares the effects of projects and groups of projects. This information will help state decision makers identify projects that provide the most benefit.
Example project types include levee / floodwall construction, barrier island restoration, and marsh creation. River diversions, hydrologic restoration, shoreline protection and bank stabilization projects are also being analyzed as part of the 2012 Master Plan update.
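The actual analysis rests on the suite of seven coupled predictive models, but the basic logic of the decision-support comparison can be sketched in a few lines of Python. The projects, predicted effects, and criterion weights below are entirely hypothetical, chosen only to show how model output and decision criteria combine into a comparable score.

```python
# Toy version of a decision-support comparison: combine model-predicted
# project effects with weighted decision criteria into a single score.
# All projects, effect values, and weights here are hypothetical.

# Model output per project: predicted effect on each decision criterion.
projects = {
    "marsh creation A":  {"land_built_km2": 12, "risk_reduction": 0.2, "cost_musd": 150},
    "barrier island B":  {"land_built_km2": 8,  "risk_reduction": 0.5, "cost_musd": 300},
    "levee/floodwall C": {"land_built_km2": 0,  "risk_reduction": 0.9, "cost_musd": 500},
}

# Criterion weights reflecting what matters to the state
# (cost is weighted negatively so cheaper projects score higher).
weights = {"land_built_km2": 1.0, "risk_reduction": 100.0, "cost_musd": -0.2}

def score(effects):
    return sum(weights[k] * v for k, v in effects.items())

# Rank projects from most to least beneficial under these weights.
for name, effects in sorted(projects.items(), key=lambda p: -score(p[1])):
    print(f"{name}: {score(effects):.1f}")
```

In practice the interesting arguments are over the weights, since they encode the trade-offs among land building, risk reduction, and cost.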
The use of predictive models in formulating Louisiana's coastal Master Plans relates to Topic 5: Dynamic ecosystems to be discussed during the Synthesis Sessions at CERF 2011. You can hear more on the specific topic of this post in session SCI-010 - Challenges and Innovative Methods Integrating Science and Coastal Decision Making.
For more information on the Louisiana Master Plan go to www.coastalmasterplan.la.gov, or email – masterplan@la.gov.
Monday, August 15, 2011
Coastal Scientists and Managers in a Three-legged Race to Set Nutrient Criteria
William Nuttle, Organizer for CERF 2011 Synthesis Sessions
wnuttle@eco-hydrology.com
In the ideal partnership between coastal science and management, the job of scientists is to discover underlying causes and describe possible solutions to a problem, and the managers’ job is to implement the solution. This requires that scientists and managers coordinate their actions, much like teammates in a three-legged race. So, if one partner makes a change in direction, it’s bound to affect the other’s game.
The US EPA is the lead management agency in the effort to combat the growing “dead zone” in the northern Gulf of Mexico. Nutrients from farm fields in Midwestern states, carried into the Gulf by the Mississippi River, feed an annual algal bloom in nearshore shelf waters. The death and decay of bloom organisms deplete oxygen in the stratified bottom waters over a very large area of the coast.
The solution to this all too familiar problem of coastal eutrophication is to reduce the amount of nutrients carried to the Gulf by the Mississippi. Managers rely on scientists to provide data and analyses needed to establish defensible nutrient concentration thresholds and loading rates that protect against the negative effects of eutrophication. The key to success is to be able to link the problems caused by excess nutrients in coastal waters directly to the various processes that introduce nutrients into the river up in the watershed.
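The arithmetic linking a concentration threshold to a loading rate is simple in principle: load equals concentration times discharge. The Python sketch below runs the numbers with an illustrative threshold and a rough mean discharge; neither figure is an actual criterion, and the hard scientific work lies in choosing thresholds that actually protect coastal waters.

```python
# Back-of-the-envelope link between a concentration threshold and an
# annual loading rate: load = concentration x discharge.
# Both input numbers are illustrative, not actual criteria.

discharge_m3_per_s = 18_000       # rough mean Mississippi River discharge
concentration_mg_per_L = 1.5      # hypothetical total-N threshold

seconds_per_year = 365.25 * 24 * 3600
litres_per_year = discharge_m3_per_s * 1000 * seconds_per_year

# 1 tonne = 1e9 mg
load_tonnes_per_year = concentration_mg_per_L * litres_per_year / 1e9
print(f"Implied load: {load_tonnes_per_year:,.0f} tonnes N per year")
```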
Scientists and managers have already run this three-legged race in the Chesapeake Bay and its watershed. On December 29, 2010, EPA established a “pollution diet” for the District of Columbia and large sections of Delaware, Maryland, New York, Pennsylvania, Virginia, and West Virginia. These limits were established across the entire watershed of the bay, all at once, based on state-of-the-art modeling tools, extensive monitoring data, and peer-reviewed science.
However, EPA is taking a different direction in addressing the problem of coastal eutrophication in the northern Gulf of Mexico. In July 2011, EPA reiterated its intent to follow a proposed Framework for State Nutrient Reductions in setting numeric nutrient criteria for the Mississippi River watershed. The proposed framework lays out a states-based approach, instead of the watershed-based approach used for the Chesapeake Bay. The first step is to establish, separately for each of the 31 states in the Mississippi-Atchafalaya River Basin, priorities for nutrient reductions among hydrologic basins within the state.
What does the decision by EPA mean for the scientists who will be called on to provide the essential information needed to establish these criteria? How can water managers in Missouri factor in the impacts of eutrophication in coastal Louisiana when setting water quality criteria for rivers and streams in their state? What data will be required, and what approach can be taken to perform such an analysis? Will water managers in Kansas use the same approach, or a different one that they might prefer? Who can say?
These questions relate to Topic 3: Management applications and Topic 6: Management problems to be discussed during the Synthesis Sessions at CERF 2011.
Figure credit: http://www.georgekrevskygallery.com/dynamic/artwork_detail.asp?ArtworkID=651
Friday, August 5, 2011
PCAST Recommends National Ecosystem Assessments, Better Science
William Nuttle, Organizer for CERF 2011 Synthesis Sessions
wnuttle@eco-hydrology.com
In July, advisors to the President called for the US to begin tracking the state of its ecosystems and evaluating the economic benefits they provide. The report by the President's Council of Advisors on Science and Technology recommends a quadrennial, across-the-board assessment of the state of US ecosystems and the services they provide.
Right now, the US federal government spends about $10 billion per year on ecosystem restoration and preservation. This figure does not include investments made by state and local governments and by private groups. One of the concerns raised by the report is that data from government-supported environmental monitoring programs are not available to assess the efficacy of these investments, or indeed whether more work is needed.
“The Nation has an urgent need for more complete monitoring systems in order to inform policy, as a basis for development of predictive capabilities, and to address issues of compliance, assessment, and management.” The committee recommends that the federal government conduct a comprehensive assessment of the nation’s ecosystems every four years, the Quadrennial EcoSystems Trends (QuEST) Assessment.
Managing the data in such an effort would be challenging, and this requires attention to data accessibility and the innovative use of information technologies. Other challenges relate to the science underlying ecosystem monitoring and analysis. Questions that must be addressed include:
- “Are current modeling methods adequate to predict the consequences of human-ecosystem dynamics for biodiversity preservation, for ecosystem services, and for biosecurity?
- “What is the scope for using socio-economic data in modeling anthropogenic environmental change?
- “How can existing monitoring systems be augmented to include such data?”
The recommendations lack the force of regulation or policy. But they do indicate recognition at high levels of the need for regular reporting on the nation’s ecological health and, implicitly, for the science underlying ecological monitoring and assessment. Details remain to be worked out, but implementation must keep in sight that national ecological assets are not simply numbers in a ledger; they are where people live and the resources we depend on.
The information in this post relates to Topics 1 and 5 of the CERF 2011 synthesis sessions.
Figure credit: http://www.nefsc.noaa.gov/ecosys/background.html