
Minutes

Curators of Marine & Lacustrine Geological Samples Meeting
18-20 September 2000

Hosted By
College of Oceanic and Atmospheric Sciences
Oregon State University
Corvallis, OR

Minutes compiled by Bobbi Conard, OSU

2000 MEETING ATTENDEES AND AFFILIATIONS

Meeting Summary

The Curators of Marine and Lacustrine Geological Samples group convened at the College of Oceanic and Atmospheric Sciences, Oregon State University, September 18-20, 2000. Fourteen core curators attended, representing eight facilities in the US, UK, and Canada as well as the US National Geophysical Data Center/World Data Center for Geophysics & Marine Geology, Boulder. Alan Mix and Martin Fisk, OSU curators, served as joint meeting chairs.

Presentations and discussions included overviews of curatorial facilities, non-destructive core logging systems, the status of digital photography as a tool for core curation versus scientific investigation, the Index to Marine & Lacustrine Geological Samples database, the Eu-seased databases, facilities' web sites, the Global Lake Drilling program (GLAD), general curatorial issues, development of an educational site for teaching core curation, and the need for a trained core curator at sea on every coring expedition. A new sediment classification scheme was proposed, and a committee was formed to study changing the type of sediment information entered into the database.

Introduction and Action Item Status

Alan Mix welcomed the attendees and asked for further additions to the agenda. Status of action items from 1998 was addressed:

  1. Regarding NSF participation: A representative was invited but was unable to attend. Most US facilities get some or all of their funding from NSF; current proposals are held over to the November panel.

  2. Interaction of the Core Curators group with IMAGES: Nick Pisias and Alan Mix, who are on the IMAGES board of directors, reported that IMAGES is currently looking for a new database manager. IMAGES is involved with EUROCORE and has submitted data from some cruises to NGDC. Concern was expressed over the scattered storage locations of the IMAGES cores. Guy Rothwell will meet with the IMAGES group in France in October to discuss this matter.

  3. Curators Database User Interface: Not as much feedback to Carla Moore as hoped. Updates at NGDC have limited the search parameters for the time being.

  4. Curators Database Modifications:
    • The codes in the database were replaced with plain English equivalents, but the entry program still uses codes. Pull-down menus allow one to choose English descriptions and the entry program then enters the appropriate code.

    • Point samples: No one has submitted point-sample data; this is less critical now that qualifier notes are available.

    • Cruise readme files can now be linked to the core database. IMAGES has also provided links into their website. Carla Moore urged the curators to forward to NGDC any information related to cores, for example: cruise reports, photos, descriptions.

  5. Curators List Server: An updated list was sent to all members in September 2000.

  6. Core Storage Survey: In September 2000, Bobbi Conard sent an email to Tobias Moertz's last known address requesting information about his study of core aging and core storage techniques. No reply has been received yet.

  7. Multisensor data calibrations: on agenda.

  8. Smear slide CD-ROM and Web site: Paula Worstell sent smear slides to Guy Rothwell. The CD-ROM and Web site are to be discussed as an agenda item.

  9. Reference Collections: Not many smear slides were exchanged

  10. Sediment Classification: On agenda

  11. Data Sharing with Eurocore: On agenda

  12. GEOTIMES article: A summary of the 1998 meeting was not submitted. A summary of the 2000 meeting is to be submitted to GEOTIMES and to the online newsletter Seabed News.

  13. Paula Worstell's report on the 1998 meeting was reviewed and accepted.

Curatorial Facilities

Representatives from the University of Rhode Island, Scripps Institution of Oceanography, the Limnological Research Center (University of Minnesota), the Ocean Drilling Program, the Geological Survey of Canada (GSC), the British Ocean Sediment Core Repository (BOSCOR), and Oregon State University presented overviews of their curatorial operations. Attendees exchanged information about the history and development of their collections, existing facilities and equipment, types of collections and the regions from which they were amassed, staffing and funding sources, types of data routinely collected, development of and access to databases, and educational outreach programs.

Common themes of the discussions included shortage of space for future acquisitions and lack of personnel as a result of limited budgets. Frustration was expressed with the variable quality of descriptions returned from cruises.

Strong Recommendation: To ensure optimum curation of materials collected at sea, a trained curator should be available on every research cruise where cores will be opened. Since core facilities are underfunded and cannot provide technicians, a pool of NSF-supported curatorial assistants (similar to the coring technician program) should be included in any coring proposal. This could be presented to NSF as a cost-effective measure: less work would need to be done on shore and better data would result. Consider the cost of a person aboard ship versus the cost to a core facility to curate; if curation is not done properly and the core samples cannot be used again, the cost is enormous. To be effective, we need standardized procedures and common-use equipment, including computer software, core cutters, and photography systems.

Bobbi Conard and June Wilson conducted a tour of the OSU core repository including the describing room, the walk-in refrigerator, and MST data logger housed in a dedicated seagoing van.

Sediment Classification

A sediment classification scheme should be easy to understand, avoid cumbersome and uninformative terms, be largely consistent with the existing ODP classification scheme, and be useful to database users.

There is a lack of consensus as to the extent to which sediment classification should be devoid of genetic interpretations, whether the neritic sediment class should be retained, and the scope of the potential data entry. In an attempt to resolve these questions, two classification schemes were proposed:

  1. Ternary sediment classification: completely descriptive; data can be entered with increasing levels of detail, and component listings are based on those recognizable in smear slides and gross identifications. Principal classes of sediment are based on a ternary diagram with 100% biogenic, 100% glass, and 100% mineral/lithic as the endpoints. If a single fossil group comprises at least 30%, the sediment is classified as biogenic. For data entry, the percentages of the three major classes (biogenic, glass, mineral/lithic) are needed. A major modifier denotes >25% of a component; a minor modifier, 10-25% (see the sketch following this list).

  2. Simple sediment classification: retains the four major sediment classes of the ODP scheme, eliminates mixed sediment, minimizes the number of component data inputs needed to classify the sediment, and reduces the amount of data entry for the database.
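To make the ternary proposal concrete, the following minimal sketch (Python) derives a name from the rules stated above: a single fossil group of at least 30% makes the sediment biogenic, a major modifier denotes >25% of a component, and a minor modifier 10-25%. The function and argument names are illustrative assumptions, not an agreed standard.

    # Sketch of the proposed ternary classification logic. Thresholds
    # follow the description above; names are illustrative only.
    def classify(biogenic_pct, glass_pct, lithic_pct, fossil_groups=None):
        """Return (principal class, major modifiers, minor modifiers).

        The three percentages are the ternary endpoints and should sum
        to ~100; fossil_groups optionally maps fossil group to percent,
        e.g. {"foram": 35, "nanno": 10}.
        """
        total = biogenic_pct + glass_pct + lithic_pct
        if not 95 <= total <= 105:
            raise ValueError(f"components sum to {total}%, expected ~100%")

        components = {"biogenic": biogenic_pct, "glass": glass_pct,
                      "mineral/lithic": lithic_pct}
        # Biogenic if any single fossil group reaches the 30% cutoff;
        # otherwise the dominant ternary endpoint names the sediment.
        if fossil_groups and max(fossil_groups.values()) >= 30:
            principal = "biogenic"
        else:
            principal = max(components, key=components.get)

        major = [k for k, v in components.items()
                 if v > 25 and k != principal]
        minor = [k for k, v in components.items()
                 if 10 <= v <= 25 and k != principal]
        return principal, major, minor

For example, classify(40, 10, 50, {"foram": 35}) returns ("biogenic", ["mineral/lithic"], ["glass"]): the name is derived from component inputs rather than asked of the describer, which is the direction urged in the database discussion below.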

A long discussion ensued covering the following points:

  1. The majority of sediment is classified according to the ODP classification. Whatever we do must be compatible with ODP's scheme.

  2. Naming conventions: Should the name describe components or should it imply a process? For example: foram ooze implies a process while foram sand denotes texture. Does nomenclature order imply abundance? Terms must be defined so that all use the same name for the same classification.

  3. Quantification of sediment components: Describers vary in their procedures for estimating components seen in smear slides. Sometimes absolute percentages are given; sometimes qualitative descriptors are used (e.g., abundant, common, rare) that represent a range of percentages. It was agreed that it is important not to degrade data and that the database should be able to store both types of data.

  4. Database: The objective is to build a consistent searchable database. What we want in the database is the fundamental components in the sediment. If the data input structure is changed, then the sediment classification could be derived from the components using the ODP scheme for nomenclature.

  5. Lake sediments: volcanic glass is fairly rare in lakes; mostly what are seen are mineral precipitates and biogenic components. Maybe two input pages are needed, one for lakes and one for ocean sediments.

  6. Searches: Carla reported that currently 90% of all searches use the parameters latitude, longitude, and water depth. The remainder search on the basis of an individual component, e.g., rads (radiolarians). Specific components are clearly more important than a sediment classification system.

  7. Currently the database has no method of specifying whether a component was actually looked for and not found or not looked for.

  8. Data input: There are separate entry forms (Excel templates) for dredges and sediments. Excel templates are a good method, but they need a better error-checking system; at this point, Carla has to QC everyone's submissions. An example line could be added to show users what the input should look like.

  9. Changes in data input: Carla cautioned that we must decide what we want and then design the database, and that we must be able to map old inputs to new if we change the fundamental input. In the current scheme, the user classifies the sediment, which implies percentages of components; instead, users should input percentages of components, from which a classification can be derived. The input program currently presents an overwhelming list of components to select from; perhaps the most important fields should be presented to the end user, with a hierarchical system for more detailed information.

Action item: Dave Gunn and Carla Moore will add error checking to the Excel templates for inputting data (see the sketch following these action items).

Action item: Develop a comprehensive list of new data inputs for the database, considering especially how data for components are to be entered, i.e., options of Trace, Common, Abundant, or Absent, or percentages. To be undertaken by a committee headed by Steve Carey, with Doug Schnurrenberger, Carla Moore, Paula Worstell, Guy Rothwell, June Wilson, and possibly Tom Janecek.
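As a rough illustration of the error checking these action items call for, the sketch below (Python with pandas) validates rows exported from an entry template: range checks on latitude, longitude, and water depth (the fields most searches rely on), plus component values that may be either a percentage or one of the qualitative codes under discussion. All column names here are hypothetical; the real templates may differ.

    # Illustrative validation of a submission exported from an Excel
    # entry template; column names are hypothetical.
    import pandas as pd

    REQUIRED = ["core_id", "lat", "lon", "water_depth_m"]
    QUALITATIVE = {"Trace", "Common", "Abundant", "Absent"}

    def component_ok(value):
        """Accept a percentage (0-100) or an agreed qualitative code."""
        try:
            return 0 <= float(value) <= 100
        except (TypeError, ValueError):
            return str(value).strip() in QUALITATIVE

    def check_submission(path, component_columns=()):
        """Return a list of human-readable problems found in one file."""
        df = pd.read_excel(path)
        problems = [f"missing column: {c}" for c in REQUIRED
                    if c not in df.columns]
        if problems:
            return problems
        for i, row in df.iterrows():
            line = i + 2  # spreadsheet row number; header occupies row 1
            if not -90 <= row["lat"] <= 90:
                problems.append(f"row {line}: latitude out of range")
            if not -180 <= row["lon"] <= 180:
                problems.append(f"row {line}: longitude out of range")
            if row["water_depth_m"] < 0:
                problems.append(f"row {line}: negative water depth")
            for col in component_columns:
                if not component_ok(row[col]):
                    problems.append(f"row {line}: bad value in {col}")
        if df["core_id"].duplicated().any():
            problems.append("duplicate core_id values")
        return problems

Checks of this kind, run by submitters before files are sent, could reduce the hand QC currently done at NGDC.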

GLAD: Global Lake Drilling

The goal of the GLAD program is to obtain long continental records. Most lake cores are a maximum of 20 m of sediment; in Minnesota this represents about 10,000 years of record (roughly 2 mm of accumulation per year).

The drilling rig, GLAD800, is a modular platform placed on a barge. It is a mining drill rig that takes 3 m cores via hydraulic piston coring (to depths of 60-70 m) or rotary drilling, and it can alternate between methods. DOSECC, located in Salt Lake City, owns and operates the GLAD800. Drilling projects are funded by the International Continental Drilling Program (ICDP). NSF has funded the science programs for Lake Titicaca in 2001 and Lake Malawi in 2002. Proposals have been submitted to drill in Lake Bosumtwi (Ghana) and Lake El'gygytgyn in northeast Siberia.

GLAD1, the testing program, took place in summer 2000 at Great Salt Lake and Bear Lake. About 600 m of core were acquired during the testing program. The deepest hole in Great Salt Lake, which has a maximum depth of 9 m, reached 120.01 m. A 120 m core was also drilled from Bear Lake, which has a maximum depth of 80 m. Stability of the drilling platform is a problem and will be addressed before the next deployment.

It was suggested that the MST data logger is an essential tool for determining whether whole-core coverage of the borehole has been obtained; a similar situation exists for ODP drilling. A system like the OSU MST data logger, self-contained in a 20-ft container van, could be set up on the lake shore for timely logging of the cores.

Common Curatorial Issues

Shipboard vs. shore-based curation: At OSU most cores are opened at sea. Advantages include more personnel available, fresh sediment, scientists leaving the ship with core data, and lower costs. Disadvantages include lack of consistency in descriptions, difficulty taking photos (eased by advances in digital photography), and describer backlogs when transit time between stations is short. Some cores still come back unopened.

Core Curation Consistency:

  1. The measurement of a core section is a fundamental issue. When a core is split at sea, with half going to the MST logger and half to the describers, the human measurement varies from the logger measurement. With MST loggers, the length of the liner is used as the length of the core. Remarks were made as to whether this measurement should include endcaps, some of which create a gap beyond the end of the core liner. At BOSCOR, endcaps are removed when possible. GSC cores are generally in clear liner, so the MST can be lined up with the top of the sediment.

  2. Consistency of descriptions varies with describer bias. Certain components in a smear slide are often emphasized according to the current project (e.g., when looking for ash layers, radiolarians, or forams), while the absence of other components goes unnoted. It was noted that ODP uses a specialized version of AppleCORE for shipboard descriptions. No common handbook is used for training describers; some facilities use the ODP manual or give in-house lessons.

Educational CD-ROM: Consistent descriptions begin with proper training. Guy plans to set up an educational web site that would eventually be turned into an educational CD-ROM. This web site could also serve as a curatorial reference site, with a library of images contributed by the whole curators group.

Consensus: We should set up the reference web site first, with examples of common and representative sediments and smear slides, and photos of cores demonstrating particular features (e.g., "this is a turbidite"). Photos of structures could also be tied to the classification scheme. Guy has some images and is set up to take photomicrographs. We can also link to other sites with photo data (e.g., www.radiolaria.org).

Questions arose as to how to promote contributions to the reference web site. Suggestions included advertising and directly requesting contributions.

Sulfide cores: ODP stores sulfide cores in a nitrogen atmosphere. The cores are encased in oxygen-impervious sleeves, which are sealed after adding nitrogen. H2S is a safety issue when opening cores in an enclosed area; the usual practice is to vent sulfide cores prior to sampling. Gas sniffers should be used with this type of sediment.

Preserving core for museum use: The Smithsonian has expertise in this area and should be consulted. SIO uses heat-shrink material around their show-and-tell cores; as the core warms up, condensation disappears. ODP shrink-wraps basalt cores to hold them in place.

Rock archives: There is no standard methodology for handling individual rocks from dredge collections. Sometimes individual rocks are numbered; sometimes only dredge numbers are given; sometimes a subsample of a dredge haul is numbered and referenced.

Currently, rocks are not stored or handled in a manner appropriate for ultratrace sampling or microbiology research. A subset needs to be set aside from each dredge and treated aseptically from the beginning; i.e., a curator who knows how to do this is needed on board ship.

Sediment trap material: GSC and OSU archive trap samples. GSC stores only dry material, while OSU has wet splits preserved in formalin. These samples are not listed with NGDC and are distributed through the PIs who collected them. If they are to be available to the scientific community, they should be included in the NGDC database.

Decommission, deaccession: With the reality of limited archival space and funding comes the question of whether repositories should keep everything. Current policies vary: cold storage for seven years, then a move to a warehouse; accepting "cores of opportunity"; accepting "orphan" collections when the PI is finished with the samples. Arguments against disposal: sampling requests continue to be received for old material; materials become more valuable once measurements have been made on them; it is expensive to return to a site to recollect samples; and old samples may contain unique information that new technologies can measure.

It may be possible to put together oceanographic kits for education, donating extra samples to schools, though this could take more time and personnel than is available.

Consensus: We must support each other and reaffirm that there is no basis for throwing samples out. All are valuable, and we cannot say when old material is no longer useful. As a group we should be willing to turn samples down; however, if we cannot accept them, we should contact the other repositories and find out if anyone else wants the material.

Handiwrap: Handiwrap is no longer being sold in the US. The manufacturer claims that Saran Cling Plus is the same material. Iris will test whether it has the same optical properties as Handiwrap.

MST Data Logging

MST data loggers are fast becoming standard equipment for core curation (GEOTEK has sold more than 40) and attention needs to be focused on the quality of data output. Discussion focused on the following areas:

  1. Quality of sediment core: It is important to be aware of variations in core quality, for example core disturbance (flow-in) or aging (drying out, oxidation/chemical changes, color changes). Steps taken at BOSCOR to ensure data quality include: adding distilled water to resaturate the core, especially sands that drain easily; removing endcaps if the core is not soupy; maintaining good acoustic coupling (rollers work better than the original design, but large changes in surface topography are still a problem for the transducers); using a calibration piece; counting long enough with the gamma source to reduce noise (usually 5 s); and allowing cores to warm to room temperature, since the magnetic susceptibility measurement is very sensitive to temperature changes. When measuring whole cores, it is best to do so with the gamma detector in the horizontal position, which results in better p-wave and gamma density data.

  2. Calibrations: Calibrations are essential and must be tracked with the raw core data. For gamma analysis, aluminum calibration sections are used; for p-wave measurements, the travel time through water is measured; for magnetic susceptibility, a calibration source is included with each loop. Be aware that the zero in gamma calibrations is very important, and that the attenuation coefficient differs from mineral to mineral. If you measure a water core, the density calculates out to 1.1; ODP densities are calibrated with a water density of 1.1. (A sketch of the calibration arithmetic appears at the end of this section.)

  3. Standardization of procedures: To compare data between labs, reference standards are needed. As a group we need to agree on what form the standards should take, and they should then come from the same manufacturer. Suggestions for possible core standards include the water-aluminum telescope (as suggested by GEOTEK), a water core, a real sediment core in a standard liner, a synthetic core in a standard liner, and an aluminum stepped bar for split cores. Different core diameters and liner materials will require their own sets of standards. Calibration cores could be produced as traveling cores, circulated between labs to see whether all are producing consistent information.

  4. Software: The GEOTEK software for the MST data logger cannot store multiple calibration files, which is a problem when measuring cores of varying diameters. Consequently, the raw data must be processed using other software (e.g., Excel or MATLAB).

  5. MST length measurements may vary from describers' choices. Being consistent becomes more important than being right.

Consensus: Whatever the MST measures should be the length of the core. Recognize that all depth measurements are relative.

Dave Gunn showed evidence of consistency in MST data measurements by comparing an MST data log measured on a core with the same core remeasured seven years later. The magnetic susceptibility did not change; the gamma trace showed bulk density dropping slightly due to drying of the core. As the surface dries, the p-wave measurement is lost.
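For reference, the sketch below (Python with NumPy) shows the arithmetic such gamma calibrations typically reduce to, on the standard assumption that counts fall off exponentially with density times thickness: fit the calibration steps (e.g., the water-aluminum telescope), then invert the fit for unknown cores. This is a sketch of the general method, not GEOTEK's documented procedure; function names are illustrative.

    # Gamma-density calibration sketch: ln(counts) = a - b*(rho*d), so a
    # linear fit over calibration steps of known rho*d lets density be
    # recovered from counts and core thickness. Illustrative only.
    import numpy as np

    def fit_gamma_calibration(rho_d, counts):
        """Fit ln(counts) = a - b*(rho*d) and return (a, b).

        rho_d  -- density * thickness for each calibration step (g/cm^2)
        counts -- gamma counts per step, taken with equal counting times
        """
        slope, a = np.polyfit(rho_d, np.log(counts), 1)
        return a, -slope  # slope is negative; b is the positive decay rate

    def density_from_counts(counts, thickness_cm, a, b):
        """Invert the fit: rho = (a - ln(counts)) / (b * thickness)."""
        return (a - np.log(counts)) / (b * thickness_cm)

Keeping the fitted constants with the raw core data, as urged in point 2 above, is what makes later reprocessing possible.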

Digital Imaging

There is great interest in digital imaging, both as a scientific tool and an aid to core curation.

Curation: Images from digital cameras are fast and inexpensive and are approaching archival quality. Questions remain regarding the permanence of digital media, whether today's digital formats will be readable in the future, and how to obtain and print images with true colors. New inkjet inks are being produced with lifetimes similar to those of conventional photographs. Digital images may replace photographs by default due to lack of personnel, since photography takes more time.

Consensus: Printed photographs are still the most permanent archival record and their continued use is encouraged.

Scientific tool: Quantifying the red, green, and blue channels of the image shows great promise. The GEOTEK line-scan camera, with its 1000-pixel line CCD (25% red, 25% green, 50% blue), eliminates longitudinal distortion. The f/1.7 lens is very fast, but for dark cores the depth of field is so small that focus is lost. A faster CCD would yield higher sensitivity and greater depth of field. The camera is also very sensitive to vibration, which may preclude its use at sea.

Researchers at Oregon State University have used an Epson digital camera, generating numerical data by importing the JPEG files into MATLAB and writing out text files. The red and green channels were stable, but the blue component of the CCD array drifted too much to generate usable data. This camera is too automated; scientific work requires more manual control.

GEOTOP uses a small handheld spectrophotometer, with a measurement typically taken every 5 cm; if the core is regular and flat on the track, more measurements can be taken. The Minolta spectrophotometer has its own software that downloads spectral data from which Munsell colors can be derived. Readings can be taken through cling film, which is transparent, but the white balance should not be set with cling film covering the standard. A major disadvantage is that the core surface must be absolutely flat or errant readings result.

For all digital imaging, the calibration of black and white is key to reproducible data. Ideally, a scheme for maintaining a white calibration for each camera shot should be used (e.g., a beam splitter). The zero calibration is equivalent to black, but is this the ambient light surrounding the core or the total absence of light? Calibrating black is not a trivial matter. A minimal sketch of the normalization step follows.
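The sketch below (Python with NumPy) shows the basic normalization implied here: rescale each channel so the black reference maps to 0 and the white standard to 1. The reference frames and array shapes are assumptions for illustration, not any particular camera's documented procedure.

    # Black/white calibration sketch for digital core images: map the
    # black reference to 0.0 and the white standard to 1.0 per channel.
    import numpy as np

    def calibrate_rgb(raw, black, white):
        """Normalize raw RGB data against black and white references.

        raw, black, white -- arrays of shape (..., 3): the core image
        and reference frames taken with the same exposure and lighting.
        """
        raw = raw.astype(float)
        span = white.astype(float) - black.astype(float)
        span[span == 0] = np.nan      # avoid dividing by dead pixels
        return np.clip((raw - black) / span, 0.0, 1.0)

However black is defined, the same references must be applied to every shot for colors to be comparable between cores.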

Web pages and Databases

NGDC: The NGDC's mission is to promote the free exchange of data (the Index to Marine & Lacustrine Geological Samples can be accessed at www.ngdc.noaa.gov/mgg/curator/curator.html). Carla Moore presented the state of the database at NGDC. Over the last twelve months there have been 25,011 hits on the curators' pages during 14,300 user sessions by 4,896 unique hosts, averaging 69 hits per day. Identifiable US addresses constitute slightly over 51% of users, and unidentified IP and network addresses roughly 32%, with the rest coming from 33 different countries. There were 101,904 samples in the Index to Marine & Lacustrine Geological Samples database as of September 15, 2000, with an additional 2,863 samples in review. An unusually large amount of data has been received and processed during the last two years: 5,826 samples were received in 1999 and 10,476 in the first eight months of 2000, compared with 709 in 1998.

Discussion regarding the NGDC database:

  1. Information on cores attempted, but not recovered should be included in the database.

  2. Change the layout of the search page to a simpler format with a button for more advanced search parameters.

  3. The group would like to augment the database with references, i.e. a core would reference back to a program, thesis, publication, actual physical location, etc. Some repositories already reference cores to publications. No consensus was reached regarding the level of detail that should be included in the database.

  4. Concern was expressed over the potential loss of data to the scientific community when IMAGES is completed as it is a program and not a physical institute. Since cores taken within the IMAGES program are housed at multiple repositories, these facilities should submit data pertaining to their cores to NGDC, and the NGDC database should reference the different storage locations.

Eu-seased: (www.eu-seased.net) The Marine Science and Technology Programme (MAST) was funded through the European Commission and was expected to coordinate across all of Europe. EUROCORE, one of the last MAST programs still funded, came out of the realization that existing cores were underutilized. A consortium was formed and a private company hired to develop the database and server, and personnel were sent to all the repositories to collect data for the project. The objective was to build a searchable central Internet database of seabed data from ocean basins held in European institutions, providing greater networking and accessibility for the deep-sea cores. Concurrently, the European geological surveys formed a central database, Eu-marsin, consisting of all the shallow-water cores from European seas and the continental shelf at less than 200 m water depth. The two projects have now been merged into one, Eu-seased, which encompasses all major repositories in Europe and all European geological surveys. Currently Eu-seased has information on 150,000 cores, samples, or grabs.

Eu-seased is searchable by map or text. Seabed News, a new online newsletter, is accessed from Eu-seased. Seabed News encourages contributions, including pictures in JPEG format, to make it a global newsletter.

Geological Survey of Canada: The GSC comprises ten sites across Canada. Iris presented an overview of a data collection program, Ed@sea, which is based on Personal Oracle. The program is loaded on a portable PC for use at sea to collect information in a standard format, ready to be downloaded by the curator and forwarded to NGDC.

Web pages: Most repositories have web pages linked to NGDC, but many lack the resources to keep them current. Some search interfaces are platform dependent; this needs to be addressed if we are to provide information to everyone.

General Discussion: NGDC should provide a link to Eu-seased and vice versa; links to other large databases are also encouraged. Databases must be kept up to date to maintain their usefulness, and the pressure to make and keep them current must come from the scientific communities; the best scenario is for those communities to contribute data to the various databases directly. The current Eu-seased database was built by about 50 people covering 14 geological surveys and 7 repositories. Proactive input is important, with someone in each country visiting the institutions and collecting data, since requests for volunteered information are rarely honored. The next step could be making Eu-seased a global effort. The European Union is looking at eastward expansion, with new research programs to link with the former Soviet Bloc. Perhaps a US-European workshop meeting could be scheduled in two years. New sources of funding were considered, including advertising and writing proposals that expand databases to include maps, sonar, point-source data, and photos. This would be a move away from core databases toward building marine libraries.

Consensus: All curators' groups should continue to explore how to capitalize on collaborations.

Consensus: Anyone who finds a pertinent database will send the link to the curators' list for possible inclusion on the NGDC web site.

Consensus: All institutions will start sending barrel sheets as JPEG, TIFF, or GIF files to be linked to core information.

Strong Recommendation: All database managers should be communicating with each other about possible links and collaborations; in particular, we should be talking to PANGAEA.

Action Items

  1. A meeting summary will be submitted to Geotimes and Seabed News.

  2. Guy Rothwell will inquire about the physical location of IMAGES cores.

  3. Marty Fisk will check on status of archives of rocks collected under the RIDGE program.

  4. Sediment classification: develop a comprehensive list of new data inputs for the database, especially how data for components are to be entered, i.e., options of Trace, Common, Abundant, or Absent, or percentages.

  5. The above to be undertaken by a committee headed by Steve Carey, with Doug Schnurrenberger, Carla Moore, Paula Worstell, Guy Rothwell, June Wilson, and possibly Tom Janecek.

  6. Committee to draft statement asking for NSF support for seagoing curatorial assistants, similar to coring tech program: Alan Mix and Nick Pisias

  7. Dave Gunn to set up a curatorial reference web site using outline submitted by Guy. Rest of curators group to contribute slides and photos.

  8. Dave Gunn to produce calibration cores for MST logger to share with other groups to test consistency between analyzers.

  9. Carla Moore and Guy Rothwell to insert active links from EUROCORE to searches of the Index to Marine & Lacustrine Geological Samples and from the Index to EUROCORE.

  10. Dave Gunn and Carla Moore to add more error checking to the Excel data input templates for NGDC.

  11. Carla Moore to make a new search page with simple and more advanced options. Bobbi Conard will remotely test Mac compatibility.

  12. All curators to test Carla's pages; be more responsive and try out Carla's fixes. Also all need to test links from their web pages to other repositories. If anyone finds a pertinent database, forward link to Carla for possible inclusion at NGDC.

  13. Carla to check whether Personal Oracle will run on Macs.

  14. Carla to contact PMEL to see if we can add more of their cores to NGDC database.

  15. All Curators to send graphics (barrel sheets, photographs, descriptions) of cores for inclusion in the database.

  16. PAC subcommittee: Alan Mix will post a message to the curators group regarding the formation of a subcommittee to explore how to better communicate with funding agencies, including the possibility of a meeting in Washington D.C.

Next Meeting

Doug Schnurrenberger offered to host the next meeting in fall 2002 at the Limnological Research Center. An alternate proposal was made to hold the meeting in Washington D.C. so that NSF program directors might attend. This could be a regular three-day working meeting with one afternoon devoted to an interagency briefing on the long-term problems of maintaining repositories. A final decision will be made following further discussion (via email) within the curators group.