The first year's effort ran from approximately April 1995 through March 1996. This summary is based on information I received in June 1996.
This is a brief summary of the objectives, accomplishments, and plans for each of the 9 tasks of this project. It was taken from information I received from the leaders of the tasks, who responded so enthusiastically to my query that the full responses take 23 pages! That is too long to send to each participant, so I have summarized the highlights for each task in this message, and have made (or will, within the next day or two, make) the full responses available on the project's WWW page.
If you do not have access to or interest in getting the information from the WWW page, I would be happy to FAX or email the full text on request. (Leigh House, Los Alamos Nat'l Lab; Tel: 505-667-1912; FAX: 505-667-8487 or by email).
This summary is intended to help keep our participants aware of what the project has accomplished in its first full year or so of work, and what the various tasks are planning to do when new funding is received.
Funding for the second year of project work is still uncertain. I am told it is likely, but that is what I have been told for many months, and we do not yet have it. I think that we will get new funding (FY 1997), although I do not know exactly when. When we do receive the funding, it is likely to be less than what we received for our first year. Thus, we may have to make some difficult decisions about how much funding each of the project's tasks receives, or whether we need to suspend work on some tasks to help assure that others are able to complete their work.
To help us make informed, meaningful, and productive funding decisions, I will be sending a survey to all project participants tomorrow. I will be looking to assemble the responses soon after I return from the Society of Exploration Geophysicists meeting next week.
Quick jumps to tasks:
A. Modeling Components
Task 1. Elastic Modeling of the SEG/EAGE Structures for Comparison With the Acoustic Results
Task 2. Time Parallel Algorithms
Task 3. Physical Model
B. Imaging and Inversion Components
Task 4. Imaging the SEG/EAGE Dataset Using Approximate Imaging Methods
Task 5. Evaluation of New Cost-Effective Prestack Imaging Methods
Task 6. Seismic Imaging and Inversion via Multi-Level Distributed Computing
Task 7. Event Picking and Tracking for Velocity Estimation Using Neural Networks
Task 8. Fast Global Optimization
Task 9. Seismic Holographic Data Storage and Visualization for Oil and Gas Exploration
Task 1. Elastic Modeling of the SEG/EAGE Structures for Comparison With the Acoustic Results

Task Leaders: Walter Kessinger, Houston Advanced Research Center (HARC), email: email@example.com and Shawn Larsen, Lawrence Livermore National Laboratory (LLNL), email: firstname.lastname@example.org
Develop and test a 3-D elastic code for calculating synthetic seismic data to evaluate the effects of wave conversions (P-SV) on imaging in complex structures.
The SEG/EAGE salt model, which was an acoustic model, was extended to an elastic model in consultation with industry participants. Simple 2-D modeling, using a pseudo-spectral code, and imaging experiments provided test results that were compared to acoustic modeling and imaging. A robust 3-D finite-difference elastic modeling code (E3D) was written, and testing of it has begun. Notable features of the code include the ability to model realistic earth properties, including attenuation and topography; a variable-density computational grid; efficient parallelization; and implementation on two MPPs, the Meiko CS-2 and the Cray T3D. This code and the MPP hardware will allow simulation of geophysical models 1 to 2 orders of magnitude larger than has previously been possible.
Full 3-D modeling using the pseudo-spectral code will be carried out to identify the importance of converted-mode events and multiples, along with AVO studies of base-of-salt and subsalt events. Acoustic imaging methods will be applied to converted-mode energy, to allow imaging of a single converted-mode path through the salt. An elastic subset of the 3-D acoustic data computed by the SEG-EAGE model project will be generated and compared to the acoustic data to help identify the importance of converted-wave energy in 3-D imaging. The effects of attenuation will also be studied.
Task 2. Time Parallel Algorithms

Task Leaders: Jacob Barhen, Oak Ridge National Laboratory (ORNL), email: email@example.com, and Amir Fijany, Jet Propulsion Laboratory (JPL), email: Amir.Fijany@jpl.nasa.gov
Exploit the concepts of time-parallelism and of implicit finite-difference methods to speed up forward seismic modeling calculations. This task encompasses two related sub-tasks that are pursuing slightly different approaches to solving the acoustic wave equation, which may be extendable to solving the elastic wave equation as well.
A new local absorbing boundary condition operator was designed and tested in the IFP 3-D acoustic modeling code. This was the code that was used to compute the SEG-EAGE model datasets. This promises to speed up the calculations that involve boundary conditions. A pilot parallel algorithm based on a splitting method for solving the acoustic wave equation was formulated and initially tested. A novel algorithm for solving the acoustic wave equation was developed that combines the computational efficiency of explicit algorithms with the unconditional stability of implicit algorithms. The new algorithm is expected to be particularly well suited for efficient implementation on a wide range of parallel computing platforms. While initially implemented for solving the acoustic wave equation, it should be well suited for solving the elastic wave equation as well, and is also being investigated for possible use in reservoir modeling applications.
The new boundary condition operator will be further validated and its performance benchmarked. It will then be incorporated into a parallelizable algorithm. The pilot splitting algorithm will be validated against conventional solutions to the acoustic wave equation, and extended to solve the elastic wave equation. The novel, unconditionally stable explicit finite difference algorithm will be validated and benchmarked against the IFP 3-D code. It will also be extended to solving the elastic wave equation. Its usefulness for reservoir simulation will be more fully determined.
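As background for that explicit/implicit trade-off, a minimal 1-D explicit finite-difference acoustic solver can illustrate what is at stake. This is a generic textbook sketch, not the project's IFP or pilot codes, and all grid parameters are illustrative:

```python
import numpy as np

# Generic 1-D acoustic wave equation, 2nd-order explicit finite differences.
# Explicit updates are cheap per step, but only conditionally stable: the
# CFL number c*dt/dx must stay <= 1, which limits the usable time step.
# Removing that limit at explicit cost is the point of the new algorithm.
nx, nt = 201, 500
dx, dt, c = 10.0, 0.001, 3000.0       # grid spacing (m), time step (s), velocity (m/s)
cfl = c * dt / dx                     # 0.3 here, so the scheme is stable

p_prev = np.zeros(nx)                 # pressure at time step n-1
p_curr = np.zeros(nx)                 # pressure at time step n
p_curr[nx // 2] = 1.0                 # impulsive source in the middle

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = p_curr[2:] - 2.0 * p_curr[1:-1] + p_curr[:-2]
    p_next = 2.0 * p_curr - p_prev + cfl ** 2 * lap
    p_prev, p_curr = p_curr, p_next   # advance one step; boundaries held at zero
```

An implicit scheme removes the CFL restriction on the time step but requires a linear solve at each step; the algorithm described above aims at unconditional stability while retaining this kind of cheap explicit update.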
Task 3. Physical Model

Task Leaders: K.K. Sekharan, University of Houston, email: firstname.lastname@example.org, and Mike Fehler, Los Alamos National Laboratory (LANL), email: email@example.com
Collect a large volume of seismic data from a scaled, 3D model of the SEG-EAGE salt structure to allow independent evaluation of the synthetic data set being generated by the SEG-EAGE modeling project.
The physical model was constructed; it measures 0.9 m x 0.9 m in horizontal dimensions and 0.25 m in depth, with a spatial scale of 1:30,000. Data were collected using a geometry that simulates a typical marine survey, with a total of about 19,400,000 traces recorded, for a total data volume of more than 80 Gbytes. The time scaling of the data is 1:18,000. The scaled sampling interval of the data is 7.2 msec, which some participants could not easily work with, so the data were resampled to an 8.0 msec interval for them. A total of 12 companies actively participated in this model data collection task, and all have received copies of the data set. Several participants have begun processing the model dataset.
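For orientation, the scale factors quoted above translate to field dimensions as follows. This is a simple arithmetic sketch; the 0.4 microsecond laboratory sampling interval is inferred from the 7.2 msec scaled interval and the 1:18,000 time scale, not stated directly in the task report:

```python
# Model-to-field unit conversion for the physical salt model.
SPATIAL_SCALE = 30_000          # 1:30,000 spatial scale
TIME_SCALE = 18_000             # 1:18,000 time scale

model_x_m, model_z_m = 0.9, 0.25
field_x_km = model_x_m * SPATIAL_SCALE / 1000.0    # 27 km survey extent
field_z_km = model_z_m * SPATIAL_SCALE / 1000.0    # 7.5 km depth extent

lab_dt_us = 0.4                                    # inferred lab sampling, microseconds
field_dt_ms = lab_dt_us * 1e-6 * TIME_SCALE * 1e3  # 7.2 msec at field scale
```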
Additional data will be collected from the salt model using non-traditional acquisition geometries to help identify strengths and weaknesses of those approaches. In particular, there has been much interest by participants in collecting data using other source-receiver geometries, such as a vertical cable receiver geometry, or a strike direction survey (the initial data set was acquired in the dip direction).
Task 4. Imaging the SEG/EAGE Dataset Using Approximate Imaging Methods

Investigate the use of alternative approaches to 3D imaging, such as prestack time migration and two-pass 3D depth migration, that may be much faster and less costly than full 3D prestack depth migration. The SEG-EAGE model dataset will be used as the test data for evaluating the alternative approaches.
This project task has been delayed in starting, so there are no accomplishments or plans to report.
Task 5. Evaluation of New Cost-Effective Prestack Imaging Methods

Task Leaders: Biondo Biondi, Stanford, email: firstname.lastname@example.org, and Mike Fehler, LANL, email: email@example.com
Devise and implement a new approach to 3-D prestack depth migration, termed "common azimuth prestack migration". Common-azimuth prestack migration should be considerably faster than traditional migration approaches because it involves partially stacking the data before migration.
The Azimuth Moveout (AMO) algorithm was formulated and tested, as was an algorithm to carry out common-azimuth migration. AMO is intended to allow partial prestacking of the data before the migration, thereby speeding up the migration. A major concern is how the resulting image compares to that from a full 3-D prestack migration. Because the synthetic data from phase C of the SEG-EAGE/GONII calculations are not yet available, a North Sea data set is being used to test the algorithm that combines AMO and common-azimuth migration. Initial results from the combined AMO and common-azimuth migration are expected at the end of September, with final results to be presented at the SEG meeting in November. A paper presenting the theory of AMO and its application to the North Sea data set is in preparation for the journal Geophysics.
The AMO-Common Azimuth migration algorithm will be tested on the SEG-EAGE/GONII synthetic seismic data (SSD) set when the Phase C data are available from the salt model. The Common Azimuth migration algorithm will be extended to 3-D and tested with the two data sets.
Task 6. Seismic Imaging and Inversion via Multi-Level Distributed Computing

Task Leaders: John Scales, Colorado School of Mines (CSM), email: firstname.lastname@example.org, and Leigh House, LANL, email: email@example.com
Develop scalable, parallel implementations of the important processing steps that run on workstation clusters or small-scale parallel computing systems. Compare run-times and resulting images with those from traditional serial processing and from massively parallel supercomputers.
Initial development of the Distributed Seismic Unix (DSU) was completed. The primary goal of DSU is to demonstrate one approach for exploiting the large-scale parallel computing resources that are becoming increasingly common in the oil and gas industry. Another goal is to develop a common suite of processing programs that the project can use for benchmark comparisons to participants' own processing programs. Also, an existing serial 3-D prestack depth migration program was adapted to run in parallel, on both a collection of workstations and the Cray T3D. The parallelization of this program demonstrates how effective a simple approach can be: using a 128-CPU partition of the T3D gave a 100-fold speedup in the time required to migrate a test 3-D data set, compared to the time on a single workstation.
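The simple data-parallel strategy behind that speedup can be sketched as follows: independent shots are farmed out to workers, and the partial images are stacked. This is a toy illustration, not the DSU code; migrate_shot is a hypothetical stand-in for a real migration kernel:

```python
from multiprocessing import Pool

def migrate_shot(shot_id):
    """Hypothetical stand-in: migrate one shot gather into a partial image."""
    # A real kernel would read the shot gather and do the wavefield
    # computation; here we just return a toy 4-sample partial image.
    return [shot_id * 0.1] * 4

def migrate_parallel(shot_ids, nworkers=4):
    """Distribute shots across workers, then stack the partial images."""
    with Pool(nworkers) as pool:
        partials = pool.map(migrate_shot, list(shot_ids))
    # Stack (sum) the partial images sample by sample.
    return [sum(samples) for samples in zip(*partials)]

if __name__ == "__main__":
    image = migrate_parallel(range(8))
```

Because each shot migrates independently, the only communication is the final stack, which is why even a simple master/worker scheme scales well across workstation clusters or T3D partitions.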
Continued development of DSU, with new parallel implementations of several compute-intensive algorithms, such as modeling, migration, inversion, and DMO. The load-balancing within DSU will be improved. Portions of the SEG-EAGE/GONII salt data set will be migrated using a 3-D prestack depth migration algorithm to provide a sample 3-D image for other tasks to compare to.
Task 7. Event Picking and Tracking for Velocity Estimation Using Neural Networks

Task Leaders: Benny Toomarian, JPL, email: firstname.lastname@example.org, and Mike Glinsky, LLNL, email: email@example.com
Develop and apply neural network analysis techniques to several specific problems, including lithology estimation, velocity model estimation, and event tracking. Use neural networks to reduce the dimensionality of seismic data through use of a nonlinear version of principal component analysis.
This task consists of two related sub-tasks, which are developing neural networks to help automate seismic processing. Sub-task 1 focuses on picking and tracking events in seismic data; sub-task 2 focuses on estimating rock parameters from seismic data and well logs.
The most promising 2-D features for event tracking were identified (coherence, amplitude moments, Gabor wavelet transforms), as well as proximity features, which will allow using a limited number of human picks. Preliminary work on the use of Constant False Alarm Rate (CFAR) preprocessing showed that it can improve the signal-to-noise ratio of the picks from the neural network. Classifying events in common reflection gathers using a probabilistic neural network (PNN) yielded an optimized probability of correct classification of 92%. Results of this work will be presented at the SEG meeting in November.
The goal of the parameter estimation sub-task is to use Artificial Neural Networks (ANNs) to combine the detailed petrophysical information available from well logs with information from the larger-scale seismic data, to yield detailed characterization of reservoir properties at a field-wide scale. ANN algorithms were developed to obtain functional relationships between variations of rock properties (e.g., bed thickness, porosity, sand-to-clay ratio, saturation) and seismic response. Application of the ANN algorithms to a synthetic test data set produced surprisingly accurate estimates of reservoir rock parameters, which were deemed a success by an industry participant.
Application of the CFAR algorithm will be completed and further tested. The PNN will be further refined and optimized. The use of proximity features in the PNN classification will be tested and evaluated. Ways to track events from one common reflection point gather to another will be explored.
The ANN algorithms will be applied to a real data set to compare the predictions to measured reservoir properties. This application will use a subset of both the available well log and seismic data to train the ANN algorithms. After training, the algorithms will operate solely on the remaining seismic data to carry out the parameter estimation.
Task 8. Fast Global Optimization

Task Leader: Jacob Barhen, ORNL, email: firstname.lastname@example.org
This task aims to develop and exploit a fast and robust approach to the global optimization problem to facilitate solving larger problems faster than currently used techniques can. In particular, it aims to apply a newly developed code, TRUST (Terminal Repeller Unconstrained Subenergy Tunneling) to the problem of estimating residual statics in seismic data.
Three test data sets were defined and used to test the algorithm. Starting from a simple initial model in which the statics were zero, the test data sets were made increasingly complicated. Two data sets contained 24 shots and 50 receivers, yielding a total of 74 free parameters, while the third set contained 77 shots and 77 receivers, with a total of 154 parameters to solve for. Approaches to help constrain the ill-conditioned nature of the data sets were devised. The TRUST algorithm was successful with the first data set, and produced a result that was more physically plausible than that of a conventional statics estimation technique (it provided no abnormally large corrections). Application of the new TRUST algorithm to the third data set is still in progress.
New codes were developed to improve the internal calculations required by TRUST (calculation of the total energy of the stacked seismic traces, and its gradient - its derivatives with respect to the statics corrections). These codes will be used to refine the TRUST algorithm for the statics problem and improve its performance.
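The objective those internal calculations support can be sketched as follows: shot and receiver statics shift each trace, and the quality of a trial solution is measured by the total energy of the stack. All names here are illustrative, and this is not the TRUST code itself:

```python
import numpy as np

def stack_power(traces, shot_of, recv_of, shot_statics, recv_statics):
    """Total energy of the stacked traces for a trial set of statics.

    traces: (ntraces, nsamples) array; statics are in integer samples.
    Each trace k is advanced by the sum of the static of its shot
    (shot_of[k]) and the static of its receiver (recv_of[k]).
    """
    stack = np.zeros(traces.shape[1])
    for k, trace in enumerate(traces):
        shift = int(shot_statics[shot_of[k]] + recv_statics[recv_of[k]])
        stack += np.roll(trace, -shift)   # circular shift, for brevity
    return float(np.sum(stack ** 2))
```

With 24 shot statics and 50 receiver statics, this gives the 74-parameter search space mentioned above. The energy surface is highly multimodal in those parameters, which is what motivates a global optimizer such as TRUST rather than a purely local gradient method.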
Task 9. Seismic Holographic Data Storage and Visualization for Oil and Gas Exploration

Task Leaders: H.K. Liu, University of South Alabama, email: Liuhk@aol.com, and Jacob Barhen, ORNL, email: email@example.com
Develop and test techniques that are effective for displaying, visualizing, and processing large, 3-D seismic data volumes.
The concept of using optical holography for true 3-D display of seismic data has been clearly demonstrated through experimental results in optical holography, optical data capture, and computer-generated holographic storage and display. This task involves development of both data processing techniques and advanced optoelectronic device instrumentation. Despite modest funding, the goals for the first year have been mostly met.
The holographic display will be tested with simulated seismic data, perhaps also with a sub-set of the full SEG-EAGE data set. This demonstration will require additional processing of the seismic data to prepare it for the optical holographic display system.
Last Edited: Nov 22, 1996