From the Anthropocene to the Applet: A tale of two workshops

Network News Fall 2012, Vol. 25 No. 3
ASM Reports
NOTE: Includes a minor correction to reflect the correct order of author names.

As we looked at the Anthropocene during the recent Long Term Ecological Research (LTER) All Scientists Meeting (ASM) in Estes Park, Colorado, we may have overlooked how rates of change in human activity vary and how such variation affects what humans must do to cope with change. The picture of change presented by Erle Ellis and like-minded colleagues suggests that anthropogenic changes to the relationship between humans and natural systems may take generations, centuries, or even millennia to run their course. This presents a real challenge to the LTER Network because only a few members of our community study changes on such grand time scales, despite our mission to study changes in ecological processes that take longer than normal grant-funded research cycles.

At the same time, the technologies we employ change faster than many of the ecological processes we study. In this context of constant change, information managers must work hard to balance opposing priorities in their contributions to Network and site science. They must weigh possible increases in efficiency and cost-savings by new technologies against the need to adapt and preserve scientific practices that have (in many cases) created and preserved a record of scientific data for three decades or more.

Two working groups at ASM were especially relevant to the question of how today’s research practices can support the study of ecology in the rapidly changing Anthropocene. “What needs to be done today to support ecological analysis in 2100?” was the driving question behind one working group, while another complementary group asked the question, “Could the efficiency of LTER data management be improved by centralizing some activities? If so, which ones and how can we do it?” Below we summarize these two sessions.

Ecological research from a truly long-term perspective

The LTER Network was created with the understanding that some ecological phenomena occur on a temporal scale too large to be effectively studied with 3-5 year grants (or even 10-year grants). But optimizing today's efforts to support long-term research on a decadal scale is not a trivial challenge, especially if the goal is to support hypothesis-driven investigation, not just data accumulation.

To better understand this problem, organizers challenged participants to speculate about the ecology of the future and what LTER needs to do now so that a researcher in 2050 could use LTER data to ask questions (for example) about profound changes in North American ecology hypothesized to have happened between 2000 and 2025 (see The future of ecological research: Quantum chaos and the Metadata Effect). To do this, the researcher would have to determine whether or not the LTER data holdings had information that was 1) conceptually useful, 2) technically and scientifically suitable for analysis, and 3) digitally available for use.

The workshop included Susan Stafford’s presentation of the problem, short presentations by six speakers (Bill Michener, Randy Butler, Nancy Huntly, Emily Stanley, Wade Sheldon, and Bob Robbins), and a general discussion. Several key points emerged from these presentations and discussions:

  1. The LTER Network has a dual mission to conduct quality science today while collecting data for others to use tomorrow.
  2. Resource allocation problems are real. It is important that LTER support future third-party research, but it cannot compromise its current scientific mission to do so.
  3. Strategic partnerships with other “big data projects” will be critical going forward.
  4. Although predicting the future of technology is difficult, we can be certain that many things that are difficult today will be easier tomorrow.
  5. Socio-cultural challenges may prove more formidable than technical challenges, especially finding resources to support new technologies and the will to embrace the pattern of constant change required to make the most of new capabilities.
  6. Data must be accompanied with appropriate metadata to be fit for third-party use. Metadata benefits third-party users especially, but its collection burdens data providers. This asymmetry creates problems.
  7. Historically, the preparation of metadata has been difficult and time consuming. We must move into a world where metadata capture enables, rather than inhibits, scientific data archiving. The key will be the development of methods for transparent metadata capture, in which technology automatically captures both the data and the appropriate metadata, including the history of data transformations (provenance). Automated metadata capture will enable rapid advances in software functionality and user experience with no additional effort by data providers.
  8. Practical and social issues surround the allocation of resources for scientific infrastructure vs. scientific superstructure. Conducting research is science; developing long-term data resources is infrastructure. The ideal funding vehicles for science and infrastructure differ in critical ways, but the ultimate goal is to get the maximum bang for the buck. Where would small additional funding produce the most gain? Go there. Where would large additional spending produce only marginal gain? Avoid that.
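The transparent provenance capture described in point 7 can be sketched in a few lines: wrap each data transformation so that a provenance record is logged automatically, with no extra effort by the data provider. The decorator, log structure, and example conversion below are illustrative assumptions, not an existing LTER tool:

```python
import functools
import json
from datetime import datetime, timezone

def capture_provenance(log):
    """Decorator that appends a provenance entry each time the wrapped
    data-transformation function runs; the caller does nothing extra."""
    def wrap(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            result = func(*args, **kwargs)
            log.append({
                "step": func.__name__,
                "when": datetime.now(timezone.utc).isoformat(),
                "inputs": [repr(a) for a in args],
            })
            return result
        return inner
    return wrap

provenance = []

@capture_provenance(provenance)
def to_celsius(fahrenheit):
    """A hypothetical transformation: convert a temperature reading."""
    return (fahrenheit - 32) * 5 / 9

reading_c = to_celsius(68.0)          # 20.0
print(json.dumps(provenance, indent=2))
```

Because the provenance log is built as a side effect of normal processing, the metadata stays in step with the data — the "transparent" property the workshop called for.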

Centralized information management?

The second forward-looking workshop, “Centralized LTER Information Management?” was a facilitated discussion exploring the opportunities and challenges of centralizing LTER information management (IM). Site Information Managers (IMs) and Principal Investigators (PIs) were encouraged to fill out a survey on current IM practices and to participate in discussions on the advantages and disadvantages of centralizing some or all aspects of IM over the next 5-10 years.

The survey had a 48 percent response rate. Responses showed agreement on IM practice, but also some divergent perceptions. IMs reported doing tasks that some PIs may not view as requiring IM time. IMs and PIs also varied in their estimates of time allocated to specific tasks, especially long-term data management, to which IMs reported allocating more time than PIs believed they did.

In discussions, participants classified tasks according to whether they were best centralized or done locally. After classification, they looked for general patterns. The patterns revealed that tasks most associated with providing support for local science were almost always seen as best done locally, but when economies of scale are important, such as in large-scale software tool development, tasks were often classified as candidates for centralization.

In discussing challenges, many IMs pointed out that local IM staff would likely be held responsible for the quality of IT services used by LTER scientists, whether centralized or not. This led to a strong concern that any effort to centralize IM activities, regardless of the potential efficiencies to be gained, would require a carefully designed plan for governance and accountability. Furthermore, since LTER research activities are often highly localized, any effort to build standard solutions to LTER IM challenges must involve a philosophical commitment to developing standard methods for solving custom problems.

Based on an analysis of group responses, the workshop organizers reached six conclusions:

  1. Models for centralization must be based on a consensus understanding of current cost-benefit structures for IM practice. At present, no such understanding exists.
  2. Some IM tasks can be effectively centralized, others cannot.
  3. The principal criterion distinguishing these tasks is proximity to science: IM tasks that directly support local science must stay local, while IM tasks that primarily support third-party data use can be centralized.
  4. One argument for centralization is improved cost-effectiveness, but this must be traded off against local responsiveness. Effective, on-going, longer-term movement of local IM activities into a centralized model must include new approaches to governance, reporting, and dispute resolution that ensure local responsiveness and effectiveness are maintained while striving for centralized efficiency.
  5. Cost-benefit models outlining the structure of existing IM practices are a starting point for IM centralization and should be used as a foundation for these analyses.
  6. The LTER Network would be best served by internally developed, service-oriented IM centralization plans that emphasize service to both site and Network science.


Today’s science supports future science in two ways: (1) the publication of solid research papers and (2) the collection, annotation, management, and distribution of comprehensive and comparable data sets that provide current measurements for future analyses. Evolving technology, especially transparent metadata collection, will greatly facilitate the provisioning of data useful for third-party 21st-century ecological research. Some useful technology may arise from the file-sharing world of social media. Some of the automated metadata collection (e.g., EXIF data) and annotation (automated tagging) now occurring in consumer photography may also provide an important model for LTER.
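The EXIF analogy can be made concrete: just as a camera embeds capture metadata in every image without photographer effort, a data pipeline could write a "sidecar" metadata record alongside every data file it touches. The sketch below uses only the Python standard library; the file name and metadata fields are hypothetical, not an LTER standard:

```python
import hashlib
import json
import os
import tempfile
from datetime import datetime, timezone

def write_sidecar(data_path):
    """Automatically capture basic metadata for a data file (size,
    checksum, capture time) and save it as a sidecar JSON record,
    analogous to EXIF data embedded by a camera."""
    with open(data_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    meta = {
        "file": os.path.basename(data_path),
        "bytes": os.path.getsize(data_path),
        "sha256": digest,
        "captured": datetime.now(timezone.utc).isoformat(),
    }
    with open(data_path + ".meta.json", "w") as f:
        json.dump(meta, f, indent=2)
    return meta

# Demonstrate on a throwaway data file.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "soil_temp.csv")
    with open(path, "w") as f:
        f.write("date,temp_c\n2012-09-01,14.2\n")
    meta = write_sidecar(path)
    print(meta["file"], meta["bytes"])
```

The point of the sketch is that the metadata is generated by the infrastructure, not by the scientist — the asymmetry of burden noted in the first workshop disappears when capture is automatic.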

The biggest challenges will be socio-cultural, especially those associated with resource allocation and constantly changing roles and expectations. Scientists and information managers can expect significant technical and cultural change in the coming years. It is easy to envision a future with vast amounts of readily available data; it is quite another thing to find the resources today to fund the collection and storage of those data. There will be (and indeed already is) a constant need to rethink information management problems, and to change the model when technological innovation again alters the associated cost structures. Such movements imply the need for some rapid (and frequent) changes in approach to information management, at least some of which involve the movement of some IM activities into centralized services. These changes must be carefully considered, involve all stakeholders, and serve the needs of LTER science.

All fields experience technological change in which something that once could only be hand-crafted increasingly becomes automated and ultimately disappears into a service provided by someone else. With activities that are not related to information technology (IT), these changes often occur slowly, barely perceptible over a lifetime. In IT, the changes are so rapid that several generations of change must be accommodated within a single professional career. Allowing LTER to take advantage of rapidly changing technology will require significant management attention. To deliver the full potential of LTER for 21st-century ecological research, everyone involved with LTER leadership will need to become proficient at change management.

By Robert J. Robbins (UCSD), Susan G. Stafford (AND/SGS), John F. Chamblee (CWT), and Emery R. Boose (HFR)