  Subjects -> ENGINEERING (Total: 2417 journals)
    - CHEMICAL ENGINEERING (207 journals)
    - CIVIL ENGINEERING (202 journals)
    - ELECTRICAL ENGINEERING (111 journals)
    - ENGINEERING (1267 journals)
    - ENGINEERING MECHANICS AND MATERIALS (400 journals)
    - HYDRAULIC ENGINEERING (56 journals)
    - INDUSTRIAL ENGINEERING (76 journals)
    - MECHANICAL ENGINEERING (98 journals)

ENGINEERING (1267 journals)

Showing 1 - 200 of 1205 Journals sorted alphabetically
3 Biotech     Open Access   (Followers: 8)
3D Research     Hybrid Journal   (Followers: 21)
AAPG Bulletin     Hybrid Journal   (Followers: 8)
AASRI Procedia     Open Access   (Followers: 14)
Abstract and Applied Analysis     Open Access   (Followers: 3)
Aceh International Journal of Science and Technology     Open Access   (Followers: 2)
ACS Nano     Full-text available via subscription   (Followers: 273)
Acta Geotechnica     Hybrid Journal   (Followers: 7)
Acta Metallurgica Sinica (English Letters)     Hybrid Journal   (Followers: 7)
Acta Polytechnica : Journal of Advanced Engineering     Open Access   (Followers: 3)
Acta Scientiarum. Technology     Open Access   (Followers: 3)
Acta Universitatis Cibiniensis. Technical Series     Open Access  
Active and Passive Electronic Components     Open Access   (Followers: 7)
Adaptive Behavior     Hybrid Journal   (Followers: 11)
Adıyaman Üniversitesi Mühendislik Bilimleri Dergisi     Open Access  
Adsorption     Hybrid Journal   (Followers: 4)
Advanced Engineering Forum     Full-text available via subscription   (Followers: 7)
Advanced Journal of Graduate Research     Open Access  
Advanced Science     Open Access   (Followers: 5)
Advanced Science Focus     Free   (Followers: 5)
Advanced Science Letters     Full-text available via subscription   (Followers: 10)
Advanced Science, Engineering and Medicine     Partially Free   (Followers: 7)
Advanced Synthesis & Catalysis     Hybrid Journal   (Followers: 18)
Advances in Calculus of Variations     Hybrid Journal   (Followers: 2)
Advances in Catalysis     Full-text available via subscription   (Followers: 5)
Advances in Complex Systems     Hybrid Journal   (Followers: 7)
Advances in Engineering Software     Hybrid Journal   (Followers: 27)
Advances in Fuel Cells     Full-text available via subscription   (Followers: 17)
Advances in Fuzzy Systems     Open Access   (Followers: 5)
Advances in Geosciences (ADGEO)     Open Access   (Followers: 13)
Advances in Heat Transfer     Full-text available via subscription   (Followers: 21)
Advances in Human Factors/Ergonomics     Full-text available via subscription   (Followers: 22)
Advances in Magnetic and Optical Resonance     Full-text available via subscription   (Followers: 9)
Advances in Natural Sciences: Nanoscience and Nanotechnology     Open Access   (Followers: 29)
Advances in Operations Research     Open Access   (Followers: 12)
Advances in OptoElectronics     Open Access   (Followers: 6)
Advances in Physics Theories and Applications     Open Access   (Followers: 13)
Advances in Polymer Science     Hybrid Journal   (Followers: 43)
Advances in Porous Media     Full-text available via subscription   (Followers: 5)
Advances in Remote Sensing     Open Access   (Followers: 44)
Advances in Science and Research (ASR)     Open Access   (Followers: 6)
Aerobiologia     Hybrid Journal   (Followers: 3)
African Journal of Science, Technology, Innovation and Development     Hybrid Journal   (Followers: 6)
AIChE Journal     Hybrid Journal   (Followers: 35)
Ain Shams Engineering Journal     Open Access   (Followers: 5)
Akademik Platform Mühendislik ve Fen Bilimleri Dergisi     Open Access   (Followers: 1)
Alexandria Engineering Journal     Open Access   (Followers: 1)
AMB Express     Open Access   (Followers: 1)
American Journal of Applied Sciences     Open Access   (Followers: 26)
American Journal of Engineering and Applied Sciences     Open Access   (Followers: 10)
American Journal of Engineering Education     Open Access   (Followers: 9)
American Journal of Environmental Engineering     Open Access   (Followers: 16)
American Journal of Industrial and Business Management     Open Access   (Followers: 24)
Analele Universitatii Ovidius Constanta - Seria Chimie     Open Access  
Annals of Combinatorics     Hybrid Journal   (Followers: 4)
Annals of Pure and Applied Logic     Open Access   (Followers: 2)
Annals of Regional Science     Hybrid Journal   (Followers: 7)
Annals of Science     Hybrid Journal   (Followers: 7)
Antarctic Science     Hybrid Journal   (Followers: 1)
Applicable Algebra in Engineering, Communication and Computing     Hybrid Journal   (Followers: 2)
Applicable Analysis: An International Journal     Hybrid Journal   (Followers: 1)
Applied Catalysis A: General     Hybrid Journal   (Followers: 6)
Applied Catalysis B: Environmental     Hybrid Journal   (Followers: 18)
Applied Clay Science     Hybrid Journal   (Followers: 6)
Applied Computational Intelligence and Soft Computing     Open Access   (Followers: 11)
Applied Magnetic Resonance     Hybrid Journal   (Followers: 4)
Applied Nanoscience     Open Access   (Followers: 8)
Applied Network Science     Open Access   (Followers: 3)
Applied Numerical Mathematics     Hybrid Journal   (Followers: 5)
Applied Physics Research     Open Access   (Followers: 5)
Applied Sciences     Open Access   (Followers: 3)
Applied Spatial Analysis and Policy     Hybrid Journal   (Followers: 4)
Arabian Journal for Science and Engineering     Hybrid Journal   (Followers: 5)
Archives of Computational Methods in Engineering     Hybrid Journal   (Followers: 5)
Archives of Foundry Engineering     Open Access  
Archives of Thermodynamics     Open Access   (Followers: 8)
Arkiv för Matematik     Hybrid Journal   (Followers: 1)
ASEE Prism     Full-text available via subscription   (Followers: 3)
Asia-Pacific Journal of Science and Technology     Open Access  
Asian Engineering Review     Open Access  
Asian Journal of Applied Science and Engineering     Open Access   (Followers: 1)
Asian Journal of Applied Sciences     Open Access   (Followers: 2)
Asian Journal of Biotechnology     Open Access   (Followers: 8)
Asian Journal of Control     Hybrid Journal  
Asian Journal of Current Engineering & Maths     Open Access  
Asian Journal of Technology Innovation     Hybrid Journal   (Followers: 8)
Assembly Automation     Hybrid Journal   (Followers: 2)
at - Automatisierungstechnik     Hybrid Journal   (Followers: 1)
ATZagenda     Hybrid Journal  
ATZextra worldwide     Hybrid Journal  
Australasian Physical & Engineering Sciences in Medicine     Hybrid Journal   (Followers: 1)
Australian Journal of Multi-Disciplinary Engineering     Full-text available via subscription   (Followers: 2)
Autonomous Mental Development, IEEE Transactions on     Hybrid Journal   (Followers: 9)
Avances en Ciencias e Ingeniería     Open Access  
Balkan Region Conference on Engineering and Business Education     Open Access   (Followers: 1)
Bangladesh Journal of Scientific and Industrial Research     Open Access  
Basin Research     Hybrid Journal   (Followers: 5)
Batteries     Open Access   (Followers: 6)
Bautechnik     Hybrid Journal   (Followers: 1)
Bell Labs Technical Journal     Hybrid Journal   (Followers: 28)
Beni-Suef University Journal of Basic and Applied Sciences     Open Access   (Followers: 4)
BER : Manufacturing Survey : Full Survey     Full-text available via subscription   (Followers: 1)
BER : Motor Trade Survey     Full-text available via subscription  
BER : Retail Sector Survey     Full-text available via subscription   (Followers: 1)
BER : Retail Survey : Full Survey     Full-text available via subscription   (Followers: 1)
BER : Survey of Business Conditions in Manufacturing : An Executive Summary     Full-text available via subscription   (Followers: 3)
BER : Survey of Business Conditions in Retail : An Executive Summary     Full-text available via subscription   (Followers: 3)
Beyond : Undergraduate Research Journal     Open Access  
Bhakti Persada : Jurnal Aplikasi IPTEKS     Open Access  
Bharatiya Vaigyanik evam Audyogik Anusandhan Patrika (BVAAP)     Open Access   (Followers: 1)
Bilge International Journal of Science and Technology Research     Open Access  
Biofuels Engineering     Open Access   (Followers: 1)
Biointerphases     Open Access   (Followers: 1)
Biomaterials Science     Full-text available via subscription   (Followers: 11)
Biomedical Engineering     Hybrid Journal   (Followers: 15)
Biomedical Engineering and Computational Biology     Open Access   (Followers: 13)
Biomedical Engineering Letters     Hybrid Journal   (Followers: 5)
Biomedical Engineering, IEEE Reviews in     Full-text available via subscription   (Followers: 21)
Biomedical Engineering, IEEE Transactions on     Hybrid Journal   (Followers: 37)
Biomedical Engineering: Applications, Basis and Communications     Hybrid Journal   (Followers: 5)
Biomedical Microdevices     Hybrid Journal   (Followers: 9)
Biomedical Science and Engineering     Open Access   (Followers: 4)
Biomedizinische Technik - Biomedical Engineering     Hybrid Journal   (Followers: 1)
Biomicrofluidics     Open Access   (Followers: 4)
BioNanoMaterials     Hybrid Journal   (Followers: 2)
Biotechnology Progress     Hybrid Journal   (Followers: 39)
Bitlis Eren University Journal of Science and Technology     Open Access  
Boletin Cientifico Tecnico INIMET     Open Access  
Botswana Journal of Technology     Full-text available via subscription   (Followers: 1)
Boundary Value Problems     Open Access   (Followers: 1)
Brazilian Journal of Science and Technology     Open Access   (Followers: 2)
Broadcasting, IEEE Transactions on     Hybrid Journal   (Followers: 12)
Bulletin of Canadian Petroleum Geology     Full-text available via subscription   (Followers: 13)
Bulletin of Engineering Geology and the Environment     Hybrid Journal   (Followers: 14)
Bulletin of the Crimean Astrophysical Observatory     Hybrid Journal  
Cahiers, Droit, Sciences et Technologies     Open Access  
Calphad     Hybrid Journal   (Followers: 2)
Canadian Geotechnical Journal     Hybrid Journal   (Followers: 31)
Canadian Journal of Remote Sensing     Full-text available via subscription   (Followers: 42)
Case Studies in Engineering Failure Analysis     Open Access   (Followers: 6)
Case Studies in Thermal Engineering     Open Access   (Followers: 5)
Catalysis Communications     Hybrid Journal   (Followers: 6)
Catalysis Letters     Hybrid Journal   (Followers: 2)
Catalysis Reviews: Science and Engineering     Hybrid Journal   (Followers: 7)
Catalysis Science and Technology     Free   (Followers: 8)
Catalysis Surveys from Asia     Hybrid Journal   (Followers: 3)
Catalysis Today     Hybrid Journal   (Followers: 7)
CEAS Space Journal     Hybrid Journal   (Followers: 2)
Cellular and Molecular Neurobiology     Hybrid Journal   (Followers: 3)
Central European Journal of Engineering     Hybrid Journal  
Chaos : An Interdisciplinary Journal of Nonlinear Science     Hybrid Journal   (Followers: 2)
Chaos, Solitons & Fractals     Hybrid Journal   (Followers: 3)
Chinese Journal of Catalysis     Full-text available via subscription   (Followers: 2)
Chinese Journal of Engineering     Open Access   (Followers: 2)
Chinese Science Bulletin     Open Access   (Followers: 1)
Ciencia e Ingenieria Neogranadina     Open Access  
Ciencia en su PC     Open Access   (Followers: 1)
Ciencias Holguin     Open Access   (Followers: 3)
CienciaUAT     Open Access   (Followers: 1)
Cientifica     Open Access  
CIRP Annals - Manufacturing Technology     Full-text available via subscription   (Followers: 11)
CIRP Journal of Manufacturing Science and Technology     Full-text available via subscription   (Followers: 13)
City, Culture and Society     Hybrid Journal   (Followers: 21)
Clay Minerals     Full-text available via subscription   (Followers: 10)
Clean Air Journal     Full-text available via subscription   (Followers: 1)
Clinical Science     Full-text available via subscription   (Followers: 9)
Coal Science and Technology     Full-text available via subscription   (Followers: 3)
Coastal Engineering     Hybrid Journal   (Followers: 11)
Coastal Engineering Journal     Hybrid Journal   (Followers: 6)
Coatings     Open Access   (Followers: 4)
Cogent Engineering     Open Access   (Followers: 2)
Cognitive Computation     Hybrid Journal   (Followers: 4)
Color Research & Application     Hybrid Journal   (Followers: 2)
COMBINATORICA     Hybrid Journal  
Combustion Theory and Modelling     Hybrid Journal   (Followers: 14)
Combustion, Explosion, and Shock Waves     Hybrid Journal   (Followers: 14)
Communications Engineer     Hybrid Journal   (Followers: 1)
Communications in Numerical Methods in Engineering     Hybrid Journal   (Followers: 2)
Components, Packaging and Manufacturing Technology, IEEE Transactions on     Hybrid Journal   (Followers: 28)
Composite Interfaces     Hybrid Journal   (Followers: 7)
Composite Structures     Hybrid Journal   (Followers: 277)
Composites Part A : Applied Science and Manufacturing     Hybrid Journal   (Followers: 208)
Composites Part B : Engineering     Hybrid Journal   (Followers: 249)
Composites Science and Technology     Hybrid Journal   (Followers: 193)
Comptes Rendus Mécanique     Full-text available via subscription   (Followers: 2)
Computation     Open Access   (Followers: 1)
Computational Geosciences     Hybrid Journal   (Followers: 16)
Computational Optimization and Applications     Hybrid Journal   (Followers: 7)
Computational Science and Discovery     Full-text available via subscription   (Followers: 2)
Computer Applications in Engineering Education     Hybrid Journal   (Followers: 8)
Computer Science and Engineering     Open Access   (Followers: 19)
Computers & Geosciences     Hybrid Journal   (Followers: 31)
Computers & Mathematics with Applications     Full-text available via subscription   (Followers: 8)
Computers and Electronics in Agriculture     Hybrid Journal   (Followers: 5)
Computers and Geotechnics     Hybrid Journal   (Followers: 11)
Computing and Visualization in Science     Hybrid Journal   (Followers: 7)
Computing in Science & Engineering     Full-text available via subscription   (Followers: 33)
Conciencia Tecnologica     Open Access  
Concurrent Engineering     Hybrid Journal   (Followers: 3)
Continuum Mechanics and Thermodynamics     Hybrid Journal   (Followers: 8)


Computers & Geosciences
Journal Prestige (SJR): 1.35
Citation Impact (citeScore): 3
Number of Followers: 31  
 
  Hybrid Journal (can contain Open Access articles)
ISSN (Print) 0098-3004
Published by Elsevier  [3163 journals]
  • An investigation into preserving spatially-distinct pore systems in
           multi-component rocks using a fossiliferous limestone example
    • Authors: Zeyun Jiang; Gary D. Couples; Helen Lewis; Alessandro Mangione
      Pages: 1 - 11
      Abstract: Publication date: July 2018
      Source:Computers & Geosciences, Volume 116
      Author(s): Zeyun Jiang, Gary D. Couples, Helen Lewis, Alessandro Mangione
      Limestones containing abundant disc-shaped fossil Nummulites can form significant hydrocarbon reservoirs, but they have a distinctly heterogeneous distribution of pore shapes, sizes and connectivities, which makes it particularly difficult to calculate petrophysical properties and consequent flow outcomes. The severity of the problem rests on the wide length-scale range, from the millimetre scale of the fossil's pore space to the micron scale of rock-matrix pores. This work develops a technique to incorporate multi-scale void systems into a pore network, which is used to calculate the petrophysical properties for subsequent flow simulations at different stages in the limestone's petrophysical evolution. While rock pore size, shape and connectivity can be determined, with varying levels of fidelity, using techniques such as X-ray computed tomography (CT) or scanning electron microscopy (SEM), this work represents a more challenging class where the rock of interest is insufficiently sampled or, as here, has been overprinted by extensive chemical diagenesis. The main challenge is integrating multi-scale void structures, derived from both SEM and CT images, into a single model or pore-scale network while still honouring the nature of the connections across these length scales. Pore network flow simulations are used to illustrate the technique and, of equal importance, to demonstrate how supportable earlier-stage petrophysical property distributions can be used to assess the viability of several potential geological event sequences. The results of our flow simulations on generated models highlight the requirement for correct determination of the dominant pore scales (one or more of nm, μm, mm, cm), the spatial correlation and the cross-scale connections.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.04.004
      Issue No: Vol. 116 (2018)
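      The cross-scale integration step described above can be sketched in miniature (an illustrative toy, not the authors' implementation): hold the two pore networks as adjacency maps, join them with explicit cross-scale throats, then check flow connectivity by breadth-first search.

```python
from collections import deque

def merge_networks(coarse, fine, cross_links):
    """Union two pore networks (symmetric adjacency dicts keyed by pore id)
    and add the throats that connect pores across the two length scales."""
    merged = {}
    for net in (coarse, fine):
        for pore, nbrs in net.items():
            merged.setdefault(pore, set()).update(nbrs)
    for a, b in cross_links:          # cross-scale throats
        merged.setdefault(a, set()).add(b)
        merged.setdefault(b, set()).add(a)
    return merged

def connected(network, source, target):
    """Breadth-first search: does a flow path exist from source to target?"""
    seen, queue = {source}, deque([source])
    while queue:
        pore = queue.popleft()
        if pore == target:
            return True
        for nbr in network.get(pore, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return False
```

      Without the cross-scale throats the fossil-scale and matrix-scale networks stay disconnected, which is exactly the property the paper's technique is designed to honour.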
       
  • Semantics-informed geological maps: Conceptual modeling and
           knowledge encoding
    • Authors: Vincenzo Lombardo; Fabrizio Piana; Dario Mimmo
      Pages: 12 - 22
      Abstract: Publication date: July 2018
      Source:Computers & Geosciences, Volume 116
      Author(s): Vincenzo Lombardo, Fabrizio Piana, Dario Mimmo
      This paper introduces a novel, semantics-informed geologic mapping process, whose application domain is the production of a synthetic geologic map of a large administrative region. Approaches that express geologic knowledge through UML schemata and ontologies have been around for more than a decade, yielding resources for specific domains such as lithology. We develop a conceptual model that aims at building a digital encoding of several domains of geologic knowledge, in order to support the interoperability of the sources. We apply the devised terminological base to the classification of the elements of a geologic map of the Italian Western Alps and northern Apennines (Piemonte region). The digitally encoded knowledge base is a merged set of ontologies, called OntoGeonous. The encoding process identifies the objects of the semantic encoding, the geologic units; gathers the relevant information about such objects from authoritative resources, such as GeoSciML (giving priority to the application schemata reported in the INSPIRE Encoding Cookbook); and expresses the statements by means of axioms encoded in the Web Ontology Language (OWL). To support interoperability, OntoGeonous interlinks the general concepts by referring to the upper level of the SWEET ontology (developed by NASA), and imports knowledge that is already encoded in ontological format (e.g., the Simple Lithology ontology). Machine-readable knowledge allows for consistency checking and for classification of the geological map data through automatic reasoning algorithms.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.04.001
      Issue No: Vol. 116 (2018)
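      The kind of machine reasoning the paper relies on, classification over SubClassOf axioms, can be illustrated with a toy subsumption check; the class names below are hypothetical stand-ins, not terms from OntoGeonous or GeoSciML.

```python
# Toy taxonomy: each class maps to its direct superclasses (hypothetical names).
SUBCLASS_OF = {
    "Granite":      {"PlutonicRock"},
    "PlutonicRock": {"IgneousRock"},
    "IgneousRock":  {"Rock"},
    "Rock":         set(),
    "GeologicUnit": set(),
}

def ancestors(cls, taxonomy):
    """All superclasses reachable via SubClassOf (transitive closure)."""
    result = set()
    stack = list(taxonomy.get(cls, ()))
    while stack:
        parent = stack.pop()
        if parent not in result:
            result.add(parent)
            stack.extend(taxonomy.get(parent, ()))
    return result

def classify(cls, taxonomy, query):
    """Does `cls` fall under `query`? (the subsumption test a DL reasoner runs)"""
    return query == cls or query in ancestors(cls, taxonomy)
```

      A full OWL reasoner does far more (consistency checking, property axioms), but every classification query of a map element ultimately reduces to this kind of transitive subsumption test.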
       
  • An algorithm for fast elastic wave simulation using a vectorized finite
           difference operator
    • Authors: Ajay Malkoti; Nimisha Vedanti; Ram Krishna Tiwari
      Pages: 23 - 31
      Abstract: Publication date: July 2018
      Source:Computers & Geosciences, Volume 116
      Author(s): Ajay Malkoti, Nimisha Vedanti, Ram Krishna Tiwari
      Modern geophysical imaging techniques exploit full-wavefield information, which can be simulated numerically. These numerical simulations are computationally expensive due to several factors, such as a large number of time steps and nodes, large derivative stencils and large model sizes. Beyond these constraints, it is also important to reformulate the numerical derivative operator for improved efficiency. In this paper, we introduce a vectorized derivative operator over the staggered grid with shifted coordinate systems. The operator increases the efficiency of simulation by exploiting the fact that each variable can be represented in the form of a matrix. It updates all nodes of a variable defined on the staggered grid, in a manner similar to the collocated-grid scheme, and thereby reduces the computational run-time considerably. Here we demonstrate an application of this operator to simulate seismic wave propagation in elastic media (the Marmousi model), by discretizing the equations on a staggered grid. We compared the performance of this operator in three programming languages, which reveals that it can increase execution speed by a factor of at least 2–3 for FORTRAN and MATLAB, and nearly 100 for Python. We further carried out various tests in MATLAB to analyze the effect of model size and number of time steps on total simulation run-time. We find that there is an additional, though small, computational overhead for each step, which depends on the total number of time steps used in the simulation. A MATLAB code package, 'FDwave', implementing the proposed simulation scheme is available upon request.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.04.002
      Issue No: Vol. 116 (2018)
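      The key idea, updating every node of a staggered-grid variable with one array expression instead of nested loops, can be sketched in NumPy (an illustrative analogue of the vectorized operator, not the 'FDwave' package itself):

```python
import numpy as np

def dx_half(f, h):
    """Second-order staggered derivative along axis 0: a single vectorized
    expression evaluates (f[i+1] - f[i]) / h at every half-node i + 1/2,
    so all nodes are updated at once with no explicit loop over the grid."""
    return (f[1:, :] - f[:-1, :]) / h
```

      On a smooth test function the half-node result matches the analytic derivative to second order in h, which is how such an operator is typically verified before being used in a wave simulation.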
       
  • Global tectonic reconstructions with continuously deforming and evolving
           rigid plates
    • Authors: Michael Gurnis; Ting Yang; John Cannon; Mark Turner; Simon Williams; Nicolas Flament; R. Dietmar Müller
      Pages: 32 - 41
      Abstract: Publication date: Available online 21 April 2018
      Source:Computers & Geosciences
      Author(s): Michael Gurnis, Ting Yang, John Cannon, Mark Turner, Simon Williams, Nicolas Flament, R. Dietmar Müller
      Traditional plate reconstruction methodologies do not allow for plate deformation to be considered. Here we present software to construct and visualize global tectonic reconstructions with deforming plates within the context of rigid plates. Both deforming and rigid plates are defined by continuously evolving polygons. The deforming regions are tessellated with triangular meshes such that either strain rate or cumulative strain can be followed. The finite strain history, crustal thickness and stretching factor of points within the deformation zones are tracked as Lagrangian points. Integrating these tools within the interactive platform GPlates enables specialized users to build and refine deforming plate models and integrate them with other models in time and space. We demonstrate the integrated platform with regional reconstructions of Cenozoic western North America, the Mesozoic South American Atlantic margin, and Cenozoic southeast Asia, embedded within global reconstructions, using different data and reconstruction strategies.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.04.007
      Issue No: Vol. 116 (2018)
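      Tracking a Lagrangian point's stretching factor through a deformation history amounts to integrating the strain rate in time. A minimal sketch, assuming pure shear and volume conservation (not GPlates code):

```python
import math

def track_point(thickness0, strain_rates, dt):
    """Integrate crustal thinning for one Lagrangian point: at each time step
    the horizontal strain rate (extension > 0) thins the crust by a factor
    exp(-rate * dt), assuming volume conservation. Returns the final
    thickness and the cumulative stretching factor beta = initial / final."""
    thickness = thickness0
    for rate in strain_rates:
        thickness *= math.exp(-rate * dt)
    return thickness, thickness0 / thickness
```

      With a constant strain rate the accumulated beta is simply exp(total strain), which makes the sketch easy to check against the closed form.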
       
  • Estimation of the displacements among distant events based on parallel
           tracking of events in seismic traces under uncertainty
    • Authors: Samuel G. Huamán Bustamante; Marco A. Cavalcanti Pacheco; Juan G. Lazo Lazo
      Pages: 81 - 90
      Abstract: Publication date: Available online 28 April 2018
      Source:Computers & Geosciences
      Author(s): Samuel G. Huamán Bustamante, Marco A. Cavalcanti Pacheco, Juan G. Lazo Lazo
      The method we propose in this paper estimates interface displacements among strata related to reflection seismic events, in comparison to the interfaces at other reference points. To do so, we search for reflection events in the reference point of a second seismic trace taken from the same 3D survey and close to a well. However, the nature of the seismic data introduces uncertainty into the results, so we perform an uncertainty analysis using the standard deviations obtained from several experiments with cross-correlation of signals. To estimate the displacements of events in depth between two seismic traces, we create a synthetic seismic trace from an empirical wavelet and the sonic log of the well close to the second seismic trace. We then relate the events of the seismic traces to the depth of the sonic log. Finally, we test the method with data from the Namorado Field in Brazil. The results show that the accuracy of the estimated event depth depends on the results of parallel cross-correlation, primarily those from the procedures used to integrate the seismic data with the well data. The proposed approach can correctly identify several similar events in two seismic traces without requiring all seismic traces between two distant points of interest in order to correlate strata in the subsurface.

      PubDate: 2018-04-30T11:09:23Z
      DOI: 10.1016/j.cageo.2018.04.011
      Issue No: Vol. 116 (2018)
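      The core of the parallel tracking is a cross-correlation lag estimate. A minimal NumPy sketch (illustrative, not the authors' code) recovers the sample shift between two traces from the position of the correlation peak:

```python
import numpy as np

def estimate_shift(ref, trace):
    """Sample shift of `trace` relative to `ref`: the lag at which the full
    cross-correlation peaks. A positive result means `trace` is delayed."""
    corr = np.correlate(trace, ref, mode="full")
    return int(np.argmax(corr)) - (len(ref) - 1)
```

      In the paper's setting the same idea is applied in parallel across many events, and the spread of lags over repeated experiments feeds the uncertainty analysis.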
       
  • Who cares about impact factor?
    • Authors: Gregoire Mariethoz; Derek Karssenberg; Dario Grana
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Gregoire Mariethoz, Derek Karssenberg, Dario Grana


      PubDate: 2018-04-30T11:09:23Z
      DOI: 10.1016/s0098-3004(18)30374-1
      Issue No: Vol. 115 (2018)
       
  • Conditioning 3D object-based models to dense well data
    • Authors: Yimin C. Wang; Michael J. Pyrcz; Octavian Catuneanu; Jeff B. Boisvert
      Pages: 1 - 11
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Yimin C. Wang, Michael J. Pyrcz, Octavian Catuneanu, Jeff B. Boisvert
      Object-based stochastic simulation models are used to generate categorical variable models with a realistic representation of complicated reservoir heterogeneity. A limitation of object-based modeling is the difficulty of conditioning to dense data. One method to achieve data conditioning is to apply optimization techniques. Optimization algorithms can utilize an objective function measuring the conditioning level of each object while also considering the geological realism of the object. Here, an objective function is optimized with implicit filtering which considers constraints on object parameters. Thousands of objects conditioned to data are generated and stored in a database. A set of objects are selected with linear integer programming to generate the final realization and honor all well data, proportions and other desirable geological features. Although any parameterizable object can be considered, objects from fluvial reservoirs are used to illustrate the ability to simultaneously condition multiple types of geologic features. Channels, levees, crevasse splays and oxbow lakes are parameterized based on location, path, orientation and profile shapes. Functions mimicking natural river sinuosity are used for the centerline model. Channel stacking pattern constraints are also included to enhance the geological realism of object interactions. Spatial layout correlations between different types of objects are modeled. Three case studies demonstrate the flexibility of the proposed optimization-simulation method. These examples include multiple channels with high sinuosity, as well as fragmented channels affected by limited preservation. In all cases the proposed method reproduces input parameters for the object geometries and matches the dense well constraints. The proposed methodology expands the applicability of object-based simulation to complex and heterogeneous geological environments with dense sampling.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.02.006
      Issue No: Vol. 115 (2018)
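      The final selection step, choosing a set of stored objects that honours every well, is solved in the paper with linear integer programming. A brute-force toy version (hypothetical object names, illustration only) conveys the idea of selecting the smallest feasible combination:

```python
from itertools import combinations

def select_objects(candidates, wells):
    """Toy stand-in for the integer-programming step: pick the smallest set
    of stored objects whose combined well intersections honour every well.
    candidates: {object_id: set of wells that object conditions to}."""
    ids = list(candidates)
    for k in range(1, len(ids) + 1):
        for combo in combinations(ids, k):
            covered = set().union(*(candidates[o] for o in combo))
            if wells <= covered:
                return set(combo)
    return None
```

      A real solver also scores proportions, stacking patterns and geometry in the objective; the brute-force search here only shows the feasibility constraint.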
       
  • Ontology-driven data integration and visualization for exploring regional
           geologic time and paleontological information
    • Authors: Chengbin Wang; Xiaogang Ma; Jianguo Chen
      Pages: 12 - 19
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Chengbin Wang, Xiaogang Ma, Jianguo Chen
      Open-data initiatives promote the online publication and sharing of large amounts of geologic data. How to retrieve information and discover knowledge from such big data is an ongoing challenge. In this paper, we developed an ontology-driven data integration and visualization pilot system for exploring information on regional geologic time, paleontology, and fundamental geology. The pilot system (http://www2.cs.uidaho.edu/~max/gts/) implements the following functions: modeling and visualization of a geologic time scale ontology of North America, interactive retrieval and display of fossil information, and geologic map information query and comparison with fossil information. A few case studies were carried out in the pilot system, querying fossil occurrence records from the Paleobiology Database and comparing them with information from the USGS geologic map services. The results show that, to improve the compatibility between local and global geologic standards, bridge gaps between different data sources, and create smart geoscience data services, it is necessary to further extend and improve the existing geoscience ontologies and use them to support functions that explore the open data.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.004
      Issue No: Vol. 115 (2018)
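      At its simplest, placing a fossil occurrence on a geologic time scale is an interval-containment lookup. The sketch below uses three Late Cretaceous stages as an example (boundary ages in Ma, for illustration only, not data from the pilot system):

```python
# Hypothetical slice of a geologic time scale: stage -> (base Ma, top Ma).
STAGES = {
    "Maastrichtian": (72.1, 66.0),
    "Campanian":     (83.6, 72.1),
    "Santonian":     (86.3, 83.6),
}

def stage_of(age_ma, scale=STAGES):
    """Return the stage whose interval contains the occurrence age (Ma)."""
    for name, (base, top) in scale.items():
        if top <= age_ma < base:
            return name
    return None
```

      The ontology-driven version replaces this flat dictionary with time-scale concepts and relations, so that regional and global scales can be aligned rather than hard-coded.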
       
  • FabricS: A user-friendly, complete and robust software for particle
           shape-fabric analysis
    • Authors: G. Moreno Chávez; F. Castillo Rivera; D. Sarocchi; L. Borselli; L.A. Rodríguez-Sedano
      Pages: 20 - 30
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): G. Moreno Chávez, F. Castillo Rivera, D. Sarocchi, L. Borselli, L.A. Rodríguez-Sedano
      Shape-fabric is a textural parameter related to the spatial arrangement of elongated particles in geological samples. Its usefulness spans a range from sedimentary petrology to igneous and metamorphic petrology. Independently of the process being studied, when a material flows, elongated particles become oriented with their major axes in the direction of flow. In sedimentary petrology this information has been used to study the paleo-flow direction of turbidites, the origin of quartz sediments, and the location of ignimbrite vents, among others. In addition to flow direction and its polarity, the method enables flow rheology to be inferred. The use of shape-fabric has been limited by the difficulty of automatically measuring particles and analyzing them with reliable circular statistics programs, which dampened interest in the method for a long time. Shape-fabric measurement has grown in popularity since the 1980s thanks to the development of new image analysis techniques and circular statistics software. However, the programs currently available are unreliable or outdated, are incompatible with newer operating systems, or require programming skills. The goal of our work is to develop a user-friendly MATLAB program with a graphical user interface that can process images, includes editing functions and thresholds (elongation and size) for selecting a particle population, and analyzes that population with reliable circular statistics algorithms. The program also produces rose diagrams, orientation vectors, and a complete series of statistical parameters. All these requirements are met by our new software. In this paper, we briefly explain the methodology, from the collection of oriented samples in the field to the minimum number of particles needed to obtain reliable fabric data. We obtained the data using specific statistical tests, taking into account the degree of iso-orientation of the samples and the required degree of reliability. The program has been verified by means of several simulations using appropriately designed features and by analyzing real samples.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.02.005
      Issue No: Vol. 115 (2018)
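      The circular statistics behind shape-fabric analysis must treat orientations as axial data, since an elongated particle at θ and at θ + 180° defines the same fabric direction. The standard angle-doubling trick, shown here as a generic sketch rather than FabricS code, handles this:

```python
import math

def mean_orientation(angles_deg):
    """Mean orientation of axial (undirected) data: double each angle so
    that θ and θ + 180° coincide, average the resulting unit vectors,
    then halve the mean angle. Result lies in [0, 180)."""
    x = sum(math.cos(math.radians(2 * a)) for a in angles_deg)
    y = sum(math.sin(math.radians(2 * a)) for a in angles_deg)
    return (math.degrees(math.atan2(y, x)) / 2.0) % 180.0
```

      Without the doubling step, a sample split between, say, 10° and 170° would average to a meaningless 90° instead of the true fabric direction near 0°.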
       
  • A trace map comparison algorithm for the discrete fracture network models
           of rock masses
    • Authors: Shuai Han; Gang Wang; Mingchao Li
      Pages: 31 - 41
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Shuai Han, Gang Wang, Mingchao Li
      Discrete fracture networks (DFN) are widely used to build refined geological models. However, validating whether a refined model matches reality is a crucial problem, since it determines whether the model can be used for analysis. Current validation methods are numerical or graphical, and graphical validation, which estimates the similarity between a simulated trace map and the real trace map by visual observation, is subjective. In this paper, an algorithm for the graphical validation of DFNs is set up. Four main indicators are presented to assess the similarity between two trace maps: total gray, gray-grade curve, characteristic direction and gray-density distribution curve. A modified Radon transform and a loop cosine similarity are presented, based on the Radon transform and cosine similarity respectively. In addition, the use of Bézier curves to reduce the edge effect is described. Finally, a case study shows that the new algorithm can effectively distinguish which simulated trace map is more similar to the real one.
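      The cosine-similarity indicator can be sketched as follows; `loop_cosine_similarity`, which takes the maximum over cyclic shifts, is one plausible reading of the paper's "loop cosine similarity" (an assumption, since the abstract does not define it):

```python
import numpy as np

def cosine_similarity(a, b):
    """Plain cosine similarity between two feature vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def loop_cosine_similarity(a, b):
    """Maximum cosine similarity over all cyclic shifts of b, making the
    comparison insensitive to a rotational offset between the two curves
    (an assumed interpretation of the paper's term)."""
    return max(cosine_similarity(a, np.roll(b, k)) for k in range(len(b)))
```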

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.002
      Issue No: Vol. 115 (2018)
       
  • Acoustic reverse-time migration using GPU card and POSIX thread based on
           the adaptive optimal finite-difference scheme and the hybrid absorbing
           boundary condition
    • Authors: Xiaohui Cai; Yang Liu; Zhiming Ren
      Pages: 42 - 55
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Xiaohui Cai, Yang Liu, Zhiming Ren
      Reverse-time migration (RTM) is a powerful tool for imaging geologically complex structures such as steep dips and subsalt bodies. However, its implementation is computationally expensive. Recently, as a low-cost solution, the graphics processing unit (GPU) was introduced to improve the efficiency of RTM. In this paper, we develop three strategies to improve the implementation of RTM on a GPU card. First, given the high accuracy and efficiency of the adaptive optimal finite-difference (FD) method based on least squares (LS) on the central processing unit (CPU), we study the optimal LS-based FD method on the GPU. Second, we adapt the CPU-based hybrid absorbing boundary condition (ABC) to the GPU, addressing the two issues that arise when it is ported to a GPU card: time consumption and chaotic threads. Third, for large-scale data, a combined strategy of optimal checkpointing and efficient boundary storage is introduced to trade memory against recomputation. To save communication time between host and disk, a portable operating system interface (POSIX) thread is used to occupy an additional CPU core at the checkpoints. Applications of the three strategies, implemented in RTM on the GPU with the compute unified device architecture (CUDA) programming language, demonstrate their efficiency and validity.
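      The checkpointing trade-off behind the third strategy can be sketched language-agnostically (shown here in Python; `step` stands in for one forward time step of the wave solver):

```python
def forward_with_checkpoints(state0, nsteps, step, every):
    """Run the forward simulation, storing the state only every `every`
    steps.  Memory cost is roughly nsteps/every stored states."""
    ckpts = {0: state0}
    s = state0
    for t in range(1, nsteps + 1):
        s = step(s)
        if t % every == 0:
            ckpts[t] = s
    return ckpts

def restore(ckpts, t, step, every):
    """Recover the state at an arbitrary time t during the backward pass:
    start from the nearest stored checkpoint at or before t and recompute.
    Recomputation cost is at most `every` forward steps."""
    t0 = (t // every) * every
    s = ckpts[t0]
    for _ in range(t - t0):
        s = step(s)
    return s
```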

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.02.001
      Issue No: Vol. 115 (2018)
       
  • Quantitative X-ray Map Analyser (Q-XRMA): A new GIS-based statistical
           approach to Mineral Image Analysis
    • Authors: Gaetano Ortolano; Roberto Visalli; Gaston Godard; Rosolino Cirrincione
      Pages: 56 - 65
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Gaetano Ortolano, Roberto Visalli, Gaston Godard, Rosolino Cirrincione
      We present a new ArcGIS®-based tool developed in the Python programming language for calibrating EDS/WDS X-ray element maps, with the aim of acquiring quantitative information of petrological interest. The calibration procedure is based on a multiple linear regression technique that takes into account interdependence among elements and is constrained by the stoichiometry of minerals. The procedure requires an appropriate number of spot analyses for use as internal standards and provides several test indexes for a rapid check of calibration accuracy. The code is based on an earlier image-processing tool designed primarily for classifying minerals in X-ray element maps; the original Python code has now been enhanced to yield calibrated maps of mineral end-members or the chemical parameters of each classified mineral. The semi-automated procedure can be used to extract a dataset that is automatically stored within queryable tables. As a case study, the software was applied to an amphibolite-facies garnet-bearing micaschist. The calibrated images obtained for both anhydrous (i.e., garnet and plagioclase) and hydrous (i.e., biotite) phases show a good fit with corresponding electron microprobe analyses. This new GIS-based tool package can thus find useful application in petrology and materials science research. Moreover, the huge quantity of data extracted opens new opportunities for the development of a thin-section microchemical database that, using a GIS platform, can be linked with other major global geoscience databases.
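      The calibration step can be sketched as an ordinary multiple linear regression from count rates to concentrations, with off-diagonal coefficients capturing inter-element interference. The data below are synthetic, and the sketch omits the stoichiometric constraints and test indexes of the actual tool:

```python
import numpy as np

# Hypothetical data: X-ray count rates for three elements at 50 spot-analysis
# locations (internal standards) with known wt% values; the true coefficient
# matrix includes off-diagonal (interdependence) terms.
rng = np.random.default_rng(0)
true_coef = np.array([[0.02, 0.001, 0.0],
                      [0.0005, 0.015, 0.002],
                      [0.0, 0.001, 0.03]])
counts = rng.uniform(100, 2000, size=(50, 3))
wt_pct = counts @ true_coef          # noise-free for the sketch

# Multiple linear regression: solve counts @ B = wt_pct in the least-squares
# sense; B's off-diagonal terms model inter-element interference.
B, *_ = np.linalg.lstsq(counts, wt_pct, rcond=None)
```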

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.001
      Issue No: Vol. 115 (2018)
       
  • Interoperability challenges in river discharge modelling: A cross domain
           application scenario
    • Authors: Mattia Santoro; Volker Andres; Simon Jirka; Toshio Koike; Ulrich Looser; Stefano Nativi; Florian Pappenberger; Manuela Schlummer; Adrian Strauch; Michael Utech; Ervin Zsoter
      Pages: 66 - 74
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Mattia Santoro, Volker Andres, Simon Jirka, Toshio Koike, Ulrich Looser, Stefano Nativi, Florian Pappenberger, Manuela Schlummer, Adrian Strauch, Michael Utech, Ervin Zsoter
      River discharge is a critical water cycle variable, as it integrates all the processes (e.g. runoff and evapotranspiration) occurring within a river basin and provides a hydrological output variable that can be readily measured. Its prediction is of invaluable help for many water-related tasks including water resources assessment and management, flood protection, and disaster mitigation. Observations of river discharge are important to calibrate and validate hydrological or coupled land, atmosphere and ocean models. This requires using datasets from different scientific domains (Water, Weather, etc.). Typically, such datasets are provided using different technological solutions. This complicates the integration of new hydrological data sources into application systems. Therefore, a considerable effort is often spent on data access issues instead of the actual scientific question. This paper describes the work performed to address multidisciplinary interoperability challenges related to river discharge modeling and validation. This includes definition and standardization of domain specific interoperability standards for hydrological data sharing and their support in global frameworks such as the Global Earth Observation System of Systems (GEOSS). The research was developed in the context of the EU FP7-funded project GEOWOW (GEOSS Interoperability for Weather, Ocean and Water), which implemented a “River Discharge” application scenario. This scenario demonstrates the combination of river discharge observations data from the Global Runoff Data Centre (GRDC) database and model outputs produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) predicting river discharge based on weather forecast information in the context of the GEOSS.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.008
      Issue No: Vol. 115 (2018)
       
  • Ambient occlusion – A powerful algorithm to segment shell and skeletal
           intrapores in computed tomography data
    • Authors: J. Titschack; D. Baum; K. Matsuyama; K. Boos; C. Färber; W.-A. Kahl; K. Ehrig; D. Meinel; C. Soriano; S.R. Stock
      Pages: 75 - 87
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): J. Titschack, D. Baum, K. Matsuyama, K. Boos, C. Färber, W.-A. Kahl, K. Ehrig, D. Meinel, C. Soriano, S.R. Stock
      During the last decades, X-ray (micro-)computed tomography has gained increasing attention for the description of porous skeletal and shell structures of various organism groups. Quantitative analysis of such structures, however, is often hampered by the difficulty of discriminating cavities and pores within the object from the surrounding region. Herein, we test the ambient occlusion (AO) algorithm, and newly implemented optimisations, for the segmentation of cavities (implemented in the software Amira). Segmentation accuracy is evaluated as a function of (i) changes in the ray-length input variable and (ii) the use of the AO (scalar) field versus other AO-derived (scalar) fields. The results clearly indicate that the AO field itself outperforms all AO-derived fields in segmentation accuracy and in robustness against variations in the ray-length input variable. The newly implemented optimisations improved the AO-field-based segmentation only slightly, while the segmentations based on AO-derived fields improved considerably. Additionally, we evaluated the potential of the AO field and AO-derived fields for the separation and classification of cavities and skeletal structures by comparing them with commonly used distance-map-based segmentations. For this, we tested the zooid separation within a bryozoan colony, the stereom classification of an ophiuroid tooth, the separation of bioerosion traces within a marble block, and the calice (central cavity)-pore separation within a dendrophyllid coral. The results indicate that the ideal input field depends on the three-dimensional morphology of the object of interest. Segmentations based on AO-derived fields often provided cavity separations and skeleton classifications that were superior to, or impossible to obtain with, commonly used distance-map-based segmentations. The combined use of various AO-derived fields in supervised or unsupervised segmentation algorithms is a promising target for future research to further improve this kind of high-end data segmentation and classification. Furthermore, the developed segmentation algorithm is not restricted to X-ray (micro-)computed tomography data but may be useful for segmenting 3D volume data from other sources.
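      A 2D toy version of the AO idea: for every empty pixel, cast rays in several directions and record the fraction that hit material within the ray length, so cavity interiors score high and open space scores low. This is a didactic re-implementation, not the optimized Amira module:

```python
import numpy as np

def ambient_occlusion(solid, ray_length=10, n_rays=16):
    """2D toy AO scalar field: for each empty pixel, the fraction of rays
    (n_rays directions, up to ray_length pixels) that hit material."""
    h, w = solid.shape
    angles = np.linspace(0, 2 * np.pi, n_rays, endpoint=False)
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    ao = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            if solid[y, x]:
                continue
            hits = 0
            for dy, dx in dirs:
                for step in range(1, ray_length + 1):
                    yy = int(round(y + dy * step))
                    xx = int(round(x + dx * step))
                    if not (0 <= yy < h and 0 <= xx < w):
                        break              # ray escapes the volume: no hit
                    if solid[yy, xx]:
                        hits += 1          # ray is occluded by material
                        break
            ao[y, x] = hits / n_rays
    return ao
```

Thresholding this field then separates enclosed cavities (AO near 1) from the open exterior (AO near 0).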

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.007
      Issue No: Vol. 115 (2018)
       
  • SeisFlows—Flexible waveform inversion software
    • Authors: Ryan T. Modrak; Dmitry Borisov; Matthieu Lefebvre; Jeroen Tromp
      Pages: 88 - 95
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Ryan T. Modrak, Dmitry Borisov, Matthieu Lefebvre, Jeroen Tromp
      SeisFlows is an open-source Python package that provides a customizable waveform inversion workflow and a framework for research in oil and gas exploration, earthquake tomography, medical imaging, and other areas. New methods can be rapidly prototyped in SeisFlows by inheriting from default inversion or migration classes, and code can be tested on 2D examples before application to more expensive 3D problems. Wave simulations must be performed using an external software package such as SPECFEM3D. The ability to interface with external solvers lends flexibility, and the choice of SPECFEM3D as a default option provides optional GPU acceleration and other useful capabilities. Through support for massively parallel solvers and interfaces for high-performance computing (HPC) systems, inversions with thousands of seismic traces and billions of model parameters can be performed. So far, SeisFlows has run on clusters managed by the Department of Defense, Chevron Corp., Total S.A., Princeton University, and the University of Alaska Fairbanks.
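      The inheritance-based customization the abstract describes can be sketched as follows; the class and method names here are illustrative stand-ins, not SeisFlows' actual API:

```python
class Inversion:
    """Toy stand-in for a default inversion workflow class."""
    niter = 20

    def initial_model(self):
        return 0.0

    def evaluate_gradient(self, m):
        return 2.0 * (m - 3.0)        # gradient of the misfit (m - 3)^2

    def update_model(self, m, g):
        return m - 0.1 * g            # default: fixed-step gradient descent

    def run(self):
        m = self.initial_model()
        for _ in range(self.niter):
            m = self.update_model(m, self.evaluate_gradient(m))
        return m

class DampedInversion(Inversion):
    """Prototype a new method by overriding a single workflow step."""
    def update_model(self, m, g):
        return m - 0.05 * g           # smaller, 'damped' step length
```

The same pattern lets a researcher swap the update rule, the misfit, or the solver interface while reusing the rest of the workflow unchanged.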

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.02.004
      Issue No: Vol. 115 (2018)
       
  • A novel tree-based algorithm to discover seismic patterns in earthquake
           catalogs
    • Authors: E. Florido; G. Asencio–Cortés; J.L. Aznarte; C. Rubio-Escudero; F. Martínez–Álvarez
      Pages: 96 - 104
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): E. Florido, G. Asencio–Cortés, J.L. Aznarte, C. Rubio-Escudero, F. Martínez–Álvarez
      A novel methodology to detect seismic precursors is introduced in this study. Building on an existing approach, it searches for patterns in historical data; such patterns may contain statistical or soil-dynamics information. It improves the original version in several respects. First, new seismicity indicators are used to characterize earthquakes. Second, a machine-learning clustering algorithm is applied in a very flexible way, allowing the discovery of new data groupings. Third, a novel search strategy is proposed to obtain non-overlapping patterns. Fourth, patterns of arbitrary length are searched for, thus discovering long- and short-term behaviors that may influence the occurrence of medium-to-large earthquakes. The methodology has been applied to seven datasets from three regions: the Iberian Peninsula, Chile and Japan. Reported results show a remarkable improvement over the former version in all evaluated quality measures. In particular, the number of false positives decreased and the positive predictive values increased, both to a remarkable degree.
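      The clustering step can be illustrated with a minimal k-means over a table of seismicity indicators; the paper's actual clustering algorithm is not specified in the abstract, so this is a generic stand-in:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal k-means with farthest-point initialization (a generic
    stand-in for the paper's machine-learning clustering algorithm)."""
    rng = np.random.default_rng(seed)
    centers = [X[rng.integers(len(X))]]
    for _ in range(1, k):
        d2 = np.min(((X[:, None, :] - np.array(centers)) ** 2).sum(-1), axis=1)
        centers.append(X[np.argmax(d2)])      # spread initial centers apart
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):           # keep old center if cluster empty
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

Each row of `X` would hold the seismicity indicators for one event window; the resulting groupings are the raw material for the pattern search.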

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.005
      Issue No: Vol. 115 (2018)
       
  • Water Residence Time estimation by 1D deconvolution in the form of a
           l2-regularized inverse problem with smoothness, positivity and causality
           constraints
    • Authors: Alina G. Meresescu; Matthieu Kowalski; Frédéric Schmidt; François Landais
      Pages: 105 - 121
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Alina G. Meresescu, Matthieu Kowalski, Frédéric Schmidt, François Landais
      The Water Residence Time distribution is the equivalent of the impulse response of a linear system describing the propagation of water through a medium, e.g. the propagation of rain water from the top of a mountain towards the aquifers. We consider the output aquifer levels as the convolution of the input rain levels with the Water Residence Time, starting from an initial aquifer base level. Estimating the Water Residence Time is important for a better understanding of hydro-bio-geochemical processes and the mixing properties of wetlands used as filters in ecological applications, as well as for protecting fresh-water sources for wells from pollutants. Common estimation methods rely on cross-correlation, parameter fitting and non-parametric deconvolution. Here we propose a 1D full-deconvolution, regularized, non-parametric inverse-problem algorithm that enforces smoothness and uses causality and positivity constraints to estimate the Water Residence Time curve. Compared to Bayesian non-parametric deconvolution approaches, it has a fast runtime per test case; compared to the popular and fast cross-correlation method, it produces a more precise Water Residence Time curve, even for noisy measurements. The algorithm needs only one regularization parameter to balance the smoothness of the Water Residence Time against the accuracy of the reconstruction, and we propose an approach for automatically finding a suitable value of this parameter from the input data alone. Tests on real data illustrate the potential of this method for analyzing hydrological datasets.
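      An l2-regularized, positivity-constrained deconvolution of this kind can be sketched as a single non-negative least-squares problem: minimize ||Ah − y||² + λ||Dh||² subject to h ≥ 0, where A is the causal convolution matrix built from the rain series and D a second-difference (smoothness) operator. This is a generic formulation of the stated constraints, not the authors' exact algorithm:

```python
import numpy as np
from scipy.optimize import nnls

def deconvolve(rain, aquifer, lam=1.0, n=None):
    """Estimate a water-residence-time curve h (h >= 0, smooth, causal)
    assuming aquifer ~ rain * h, via one stacked NNLS solve."""
    n = n or len(aquifer)
    # Causal convolution matrix: (A h)[t] = sum_k rain[t - k] * h[k]
    A = np.array([[rain[t - k] if 0 <= t - k < len(rain) else 0.0
                   for k in range(n)] for t in range(len(aquifer))])
    # Second-difference operator penalizing roughness of h
    D = (-2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1))[1:-1]
    A_aug = np.vstack([A, np.sqrt(lam) * D])
    y_aug = np.concatenate([aquifer, np.zeros(len(D))])
    h, _ = nnls(A_aug, y_aug)          # positivity enforced by NNLS
    return h
```

The single parameter `lam` plays the role of the paper's one regularization parameter, trading smoothness of h against fidelity to the data.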

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.009
      Issue No: Vol. 115 (2018)
       
  • AqSo_NaCl: Computer program to calculate p-T-V-x properties in the
           H2O-NaCl fluid system applied to fluid inclusion research and pore fluid
           calculation
    • Authors: Ronald J. Bakker
      Pages: 122 - 133
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Ronald J. Bakker
      The program AqSo_NaCl has been developed to calculate pressure - molar volume - temperature - composition (p-V-T-x) properties, enthalpy, and heat capacity of the binary H2O-NaCl system. The algorithms are written in BASIC within the Xojo programming environment, and the program runs as a stand-alone application on Macintosh-, Windows-, and Unix-based operating systems. A series of ten self-instructive interfaces (modules) is provided to calculate fluid-inclusion properties and pore-fluid properties. The modules may be used to calculate properties of pure NaCl, the halite liquidus, the halite vapourus, dew-point and bubble-point curves (liquid-vapour), the critical point, and solid-liquid-vapour (SLV) curves at temperatures above 0.1 °C (with halite) and below 0.1 °C (with ice or hydrohalite). Isochores of homogeneous fluids and of unmixed fluids in a closed system can be calculated and exported to a .txt file. Isochores calculated for fluid inclusions can be corrected according to the volumetric properties of quartz. Microthermometric data, i.e. dissolution temperatures and homogenization temperatures, can be used to calculate bulk fluid properties of fluid inclusions. Alternatively, in the absence of a total homogenization temperature, the volume fraction of the liquid phase in fluid inclusions can be used to obtain bulk properties.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.003
      Issue No: Vol. 115 (2018)
       
  • Automatic contouring of geologic fabric and finite strain data on the unit
           hyperboloid
    • Authors: Frederick W. Vollmer
      Pages: 134 - 142
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Frederick W. Vollmer
      Fabric and finite strain analysis, an integral part of studies of geologic structures and orogenic belts, is commonly done by analyzing particles whose shapes can be approximated as ellipses. Given a sample of such particles, the mean and confidence intervals of particular parameters can be calculated; however, taking the extra step of plotting and contouring the density distribution can identify asymmetries or modes related to sedimentary fabrics or other factors. A common graphical strain-analysis technique is to plot final ellipse ratios, Rf, versus orientations, φf, on polar Elliott or Rf/φ plots to examine the density distribution. The plot may be contoured; however, it is desirable to have a contouring method that is rapid, reproducible, and based on the underlying geometry of the data. The unit hyperboloid, H², gives a natural parameter space for two-dimensional strain, and various projections, including equal-area and stereographic, have useful properties for examining density distributions for anisotropy. An index, Ia, is given to quantify the magnitude and direction of anisotropy. Elliott and Rf/φ plots can be understood by applying hyperbolic geometry and recognizing them as projections of H². Both distort area, however, so the equal-area projection is preferred for examining density distributions. The algorithm presented here gives fast, accurate, and reproducible contours of density distributions calculated directly on H². It back-projects the data onto H², where the density calculation is done at regular nodes using a weighting value based on the hyperboloid distribution, which is then contoured. It is implemented as an Octave-compatible MATLAB function that plots ellipse data using a variety of projections and calculates and displays contours of their density distribution on H².
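      The mapping of ellipse data onto an equal-area projection of the unit hyperboloid can be sketched as below. The formulas follow the standard construction (hyperbolic distance ρ = ln Rf, doubled azimuth, equal-area radius 2 sinh(ρ/2)); consult the paper for the program's exact conventions:

```python
import numpy as np

def ellipse_to_equal_area(Rf, phi_deg):
    """Project ellipse data (axial ratio Rf, long-axis orientation phi) onto
    an equal-area projection of the unit hyperboloid H^2 (sketch; the
    published function supports several projections and conventions)."""
    rho = np.log(np.asarray(Rf, float))       # hyperbolic distance from origin
    r = 2.0 * np.sinh(rho / 2.0)              # equal-area radial coordinate
    theta = 2.0 * np.deg2rad(np.asarray(phi_deg, float))  # doubled azimuth
    return r * np.cos(theta), r * np.sin(theta)
```

A circular particle (Rf = 1) maps to the origin regardless of orientation, as it should for axial data with no preferred direction.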

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.006
      Issue No: Vol. 115 (2018)
       
  • A method for automatic grain segmentation of multi-angle cross-polarized
           microscopic images of sandstone
    • Authors: Feng Jiang; Qing Gu; Huizhen Hao; Na Li; Bingqian Wang; Xiumian Hu
      Pages: 143 - 153
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Feng Jiang, Qing Gu, Huizhen Hao, Na Li, Bingqian Wang, Xiumian Hu
      Automatic grain segmentation of sandstone partitions mineral grains into separate regions in a thin section; it is the first step in computer-aided mineral identification and sandstone classification. Sandstone microscopic images contain a large number of mixed mineral grains whose differences, i.e., among adjacent quartz, feldspar and lithic grains, are usually ambiguous, which makes grain segmentation difficult. In this paper, we take advantage of multi-angle cross-polarized microscopic images and propose a method for grain segmentation with high accuracy. The method consists of two stages. In the first stage, we enhance the SLIC (Simple Linear Iterative Clustering) algorithm, naming the result MSLIC, to make use of multi-angle images and segment them into boundary-adherent superpixels. In the second stage, we propose a region-merging technique that combines coarse- and fine-merging algorithms. The coarse merging merges adjacent superpixels with less evident boundaries, and the fine merging merges the remaining ambiguous superpixels using spatially enhanced fuzzy clustering. Experiments are designed on 9 sets of multi-angle cross-polarized images taken from the three major types of sandstone. The results demonstrate both the effectiveness and the potential of the proposed method compared with available segmentation methods.
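      The coarse-merging stage can be sketched with a union-find over the superpixel adjacency graph, merging neighbours whose mean intensities differ by less than a threshold. The criterion here (mean-intensity difference) is a simplification of the paper's boundary-evidence test, and the fine merging by spatially enhanced fuzzy clustering is omitted:

```python
def merge_superpixels(means, adjacency, thresh):
    """Merge adjacent superpixels whose mean intensities are within thresh.

    means:      mean intensity per superpixel id
    adjacency:  list of (i, j) pairs of adjacent superpixels
    Returns a representative label per superpixel after merging.
    """
    parent = list(range(len(means)))

    def find(i):                       # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i, j in adjacency:
        if abs(means[i] - means[j]) < thresh:
            parent[find(i)] = find(j)  # weak boundary: merge the regions
    return [find(i) for i in range(len(means))]
```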

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.010
      Issue No: Vol. 115 (2018)
       
  • ClimateSpark: An in-memory distributed computing framework for big climate
           data analytics
    • Authors: Fei Hu; Chaowei Yang; John L. Schnase; Daniel Q. Duffy; Mengchao Xu; Michael K. Bowen; Tsengdar Lee; Weiwei Song
      Pages: 154 - 166
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Fei Hu, Chaowei Yang, John L. Schnase, Daniel Q. Duffy, Mengchao Xu, Michael K. Bowen, Tsengdar Lee, Weiwei Song
      The unprecedented growth of climate data creates new opportunities for climate studies, yet big climate data pose a grand challenge to climatologists, who must manage and analyze them efficiently. The complexity of climate data content and of analytical algorithms increases the difficulty of implementing algorithms on high-performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big-data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index built over the chunks avoids unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address the variety of big climate data by integrating the processing components of the climate-data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to provide a web portal that facilitates interaction among climatologists, climate data, analytic operations and computing resources (e.g., via SQL queries and Scala/Python notebooks). Experimental results show that ClimateSpark conducts different spatiotemporal data queries and analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
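      The chunking-plus-index idea can be sketched as a mapping from a spatiotemporal bounding box to the set of chunk keys it overlaps, so a query reads only those chunks. The chunk sizes and key layout here are invented for illustration and are not ClimateSpark's actual scheme:

```python
# Illustrative chunk sizes: 10 time steps x 30 deg lat x 30 deg lon per chunk.
CHUNK = {"t": 10, "lat": 30, "lon": 30}

def chunk_key(t, lat, lon):
    """Chunk containing one (time, lat, lon) sample."""
    return (t // CHUNK["t"], int(lat) // CHUNK["lat"], int(lon) // CHUNK["lon"])

def chunks_for_query(t0, t1, lat0, lat1, lon0, lon1):
    """Enumerate chunk keys overlapping a spatiotemporal bounding box, so
    only those chunks need to be read from storage."""
    keys = set()
    for t in range(t0 // CHUNK["t"], t1 // CHUNK["t"] + 1):
        for la in range(int(lat0) // CHUNK["lat"], int(lat1) // CHUNK["lat"] + 1):
            for lo in range(int(lon0) // CHUNK["lon"], int(lon1) // CHUNK["lon"] + 1):
                keys.add((t, la, lo))
    return keys
```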

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.011
      Issue No: Vol. 115 (2018)
       
  • A review of numerical techniques approaching microstructures of
           crystalline rocks
    • Authors: Yahui Zhang; Louis Ngai Yuen Wong
      Pages: 167 - 187
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): Yahui Zhang, Louis Ngai Yuen Wong
      The macro-mechanical behavior of crystalline rocks, including strength, deformability and failure pattern, is dominantly influenced by their grain-scale structures. Numerical techniques are commonly used to help understand the complicated mechanisms from a microscopic perspective, and each numerical method has its respective strengths and limitations. This review paper elucidates how numerical techniques take the geometrical aspects of grains into consideration. Four categories of numerical methods are examined: particle-based methods, block-based methods, grain-based methods, and node-based methods. Focusing on grain-scale characteristics, specific issues, including the increasing complexity of microstructures, the deformation and breakage of model elements, and fracturing and fragmentation processes, are described in more detail. The intrinsic capabilities and limitations of the different numerical approaches in accounting for the micro-mechanics of crystalline rocks and their resulting macroscopic mechanical behavior are thereby explicitly presented.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.012
      Issue No: Vol. 115 (2018)
       
  • The Simple Concurrent Online Processing System (SCOPS) - An open-source
           interface for remotely sensed data processing
    • Authors: M.A. Warren; S. Goult; D. Clewley
      Pages: 188 - 197
      Abstract: Publication date: June 2018
      Source:Computers & Geosciences, Volume 115
      Author(s): M.A. Warren, S. Goult, D. Clewley
      Advances in technology allow remotely sensed data to be acquired with ever higher spatial and spectral resolutions. These data may then be used to inform government decision making and to answer a range of research- and application-driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. The software required to process the data is often varied, highly technical and too advanced for novice users to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain and allows users to submit jobs and process data remotely over a web interface. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band-ratio products. The final processed data products are retrieved via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network through a simple-to-use web interface.
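      A band-ratio product of the kind SCOPS generates can be sketched with the generic normalized-difference formula, (a − b)/(a + b); this is the standard construction, not a specific SCOPS product:

```python
import numpy as np

def band_ratio(band_a, band_b):
    """Normalized difference of two spectral bands, guarding against
    division by zero where both bands are empty."""
    a = np.asarray(band_a, float)
    b = np.asarray(band_b, float)
    with np.errstate(divide="ignore", invalid="ignore"):
        out = (a - b) / (a + b)
    return np.where(a + b == 0, 0.0, out)
```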

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.03.013
      Issue No: Vol. 115 (2018)
       
  • Multi-scale segmentation algorithm for pattern-based partitioning of large
           categorical rasters
    • Abstract: Publication date: September 2018
      Source:Computers & Geosciences, Volume 118
      Author(s): Jarosław Jasiewicz, Tomasz Stepinski, Jacek Niesterowicz
      Analyzing large Earth Observation (EO) data at broad spatial scales frequently involves the regionalization of patterns. To automate this process we present a segmentation algorithm designed specifically to delineate segments containing quasi-stationary patterns. The algorithm works with patterns of a categorical variable, which makes it possible to analyze very large spatial datasets (for example, a global land cover) in their entirety. An input categorical raster is first tessellated into small square tiles to form a new, coarser grid of tiles. The mosaic of categories within each tile forms a local pattern, and the segmentation algorithm partitions the grid of tiles while maintaining the cohesion of the pattern in each segment. The algorithm is based on the principle of seeded region growing (SRG) but also includes segment merging and other enhancements to segmentation quality. Our key contribution is an extension of the concept of segmentation to grids in which each cell has a non-negligible size and contains a complex data structure (histograms of pattern features). Specific modifications of a standard SRG algorithm include: working in a distance space over complex data objects, introducing a six-connected “brick wall” topology of the grid to decrease artifacts associated with the tessellation of geographical space, constructing the SRG priority queue of seeds on the basis of the local homogeneity of patterns, and using a content-dependent value of the segment-growing threshold. A detailed description of the algorithm is given, followed by an assessment of its performance on test datasets representing three pertinent themes: land cover, topography, and a high-resolution image. The pattern-based segmentation algorithm will find applications in ecology, forestry, geomorphology, land management, and agriculture. The algorithm is implemented as a module of GeoPAT, an existing open-source toolbox for pattern-based analysis of categorical rasters.
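      The distance between two tiles, each summarized by a histogram of its pattern features, can be sketched with any histogram distance; Jensen-Shannon is one common choice, used here as an assumption since the abstract does not name the measure:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

# Each tile is a histogram over pattern categories (counts; jensenshannon
# normalizes them to probability distributions internally).
h1 = np.array([8, 1, 1, 0], float)   # tile dominated by category 0
h2 = np.array([7, 2, 1, 0], float)   # similar mosaic
h3 = np.array([0, 0, 2, 8], float)   # very different mosaic

d_similar = jensenshannon(h1, h2)
d_different = jensenshannon(h1, h3)
```

In the SRG loop, a tile would be attached to the growing segment only while such a distance to the segment's aggregate histogram stays below the (content-dependent) growing threshold.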

      PubDate: 2018-06-18T14:53:15Z
       
  • A fast approach for unsupervised karst feature identification using GPU
    • Abstract: Publication date: Available online 15 June 2018
      Source:Computers & Geosciences
      Author(s): Luis C.S. Afonso, Mateus Basso, Michelle C. Kuroda, Alexandre C. Vidal, João P. Papa
      Among geological features, karst has received special attention in oil and gas exploration for being a strong indicator of the potential existence of hydrocarbon reservoirs. The integration of automatic pattern-recognition methods and Graphics Processing Units (GPU) provides a powerful tool to help the geological interpretation of seismic data. In order to provide insightful information for interpreters, this work investigates the use of GPUs, together with image segmentation by means of unsupervised classification, for the identification of karst features in 3D seismic data. For this purpose, an implementation of the robust Self-Organizing Map for GPUs (SOM/GPU) is provided, and a comparison against a Central Processing Unit (CPU)-based SOM (SOM/CPU) is performed to assess the speed-up provided by the GPU. Experiments have shown promising results for geological interpretation using seismic data.
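      The underlying SOM update rule (the paper's contribution is its GPU implementation, which this CPU sketch does not reproduce) fits in a few lines:

```python
import numpy as np

def train_som(data, grid=(4, 4), iters=500, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal CPU Self-Organizing Map: for each sample, find the
    best-matching unit (BMU) and pull it and its grid neighbours toward
    the sample, with learning rate and neighbourhood width decaying."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(1))    # best-matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(1)       # grid distances to BMU
        frac = t / iters
        lr = lr0 * (1 - frac)                           # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5               # shrinking neighbourhood
        nbh = np.exp(-d2 / (2 * sigma ** 2))            # neighbourhood kernel
        weights += lr * nbh[:, None] * (x - weights)    # pull toward sample
    return weights
```

Applied to seismic attributes, each trained unit becomes one unsupervised class; the GPU version parallelizes the BMU search and update over units.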

      PubDate: 2018-06-18T14:53:15Z
       
  • Plumetrack: Flux calculation software for UV cameras
    • Abstract: Publication date: September 2018
      Source:Computers & Geosciences, Volume 118
      Author(s): Nial Peters, Clive Oppenheimer
      Ultraviolet (UV) cameras are increasingly employed to map and measure SO2 abundances in volcanic emissions to the atmosphere. The main purpose is to estimate mass fluxes of SO2, which requires estimating the transport velocity of the plume. In this paper, we present Plumetrack, open-source, Python-based software for computing SO2 fluxes from calibrated UV camera images. Designed to be the final component in UV camera processing toolchains, Plumetrack provides velocity estimation using optical flow, flux calculation and error estimates. It can be used interactively via a graphical user interface or for batch processing via a command-line interface. We discuss the features and implementation details of Plumetrack, describe a new flux calculation algorithm in detail, and demonstrate its performance on a set of synthetic UV camera images. The new algorithm is found to outperform the established flux calculation method, especially for highly spatiotemporally variable plumes. Furthermore, we show that Plumetrack may be used successfully with data from other imaging systems such as standard video cameras.
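      The flux computation itself reduces to a discrete integral along a transect across the plume: column density times plume speed (normal to the transect) times pixel width, summed over pixels. This is the textbook formula, not Plumetrack's specific algorithm, which additionally derives the speeds via optical flow:

```python
import numpy as np

def so2_flux(column_densities, speeds, pixel_size):
    """SO2 flux through a transect.

    column_densities: SO2 column density per transect pixel [kg m^-2]
    speeds:           plume speed normal to the transect per pixel [m s^-1]
    pixel_size:       pixel width along the transect [m]
    Returns the flux in kg s^-1.
    """
    cd = np.asarray(column_densities, float)
    v = np.asarray(speeds, float)
    return float(np.sum(cd * v * pixel_size))
```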

      PubDate: 2018-06-07T09:54:31Z
       
  • Context-dependent image quality assessment of JPEG compressed Mars Science
           Laboratory Mastcam images using convolutional neural networks
    • Abstract: Publication date: Available online 6 June 2018
      Source:Computers & Geosciences
      Author(s): Hannah R. Kerner, James F. Bell, Heni Ben Amor
      The Mastcam color imaging system on the Mars Science Laboratory Curiosity rover acquires images that are often JPEG compressed before being downlinked to Earth. Depending on the context of the observation, this compression can result in image artifacts that might introduce problems in the scientific interpretation of the data and might require the image to be retransmitted losslessly. We propose to streamline the tedious process of manually analyzing images using context-dependent image quality assessment, a process wherein the context and intent behind the image observation determine the acceptable image quality threshold. We propose a neural network solution for estimating the probability that a Mastcam user would find the quality of a compressed image acceptable for science analysis. We also propose an automatic labeling method that avoids the need for domain experts to label thousands of training examples. We performed multiple experiments to evaluate the ability of our model to assess context-dependent image quality, the efficiency a user might gain when incorporating our model, and the uncertainty of the model given different types of input images. We compare our approach to the state of the art in no-reference image quality assessment. Our model correlates well with the perceptions of scientists assessing context-dependent image quality and could result in significant time savings when included in the current Mastcam image review process.

      PubDate: 2018-06-07T09:54:31Z
       
  • Improved workflow for unguided multiphase image segmentation
    • Abstract: Publication date: Available online 6 June 2018
      Source:Computers & Geosciences
      Author(s): Brendan A. West, Taylor S. Hodgdon, Matthew D. Parno, Arnold J. Song
      Quantitative image analysis often depends on accurate classification of pixels through a segmentation process. However, imaging artifacts such as the partial volume effect and sensor noise complicate the classification process. These effects increase the pixel intensity variance of each constituent class, causing intensity values of one class to overlap with another. This increased variance makes threshold-based segmentation methods insufficient due to ambiguous overlap regions in the pixel intensity distributions. The class ambiguity becomes even more complex for systems with more than two constituents, such as unsaturated moist granular media. In this paper, we propose an image processing workflow that improves segmentation accuracy for multiphase systems. First, the ambiguous transition regions between classes are identified and removed, which allows for global thresholding of single-class regions. Then the transition regions are classified using a distance function, and finally both segmentations are combined into one classified image. This workflow includes three methodologies for identifying transition pixels and we demonstrate on a variety of synthetic images that these approaches are able to accurately separate the ambiguous transition pixels from the single-class regions. One of these methods does not require any parameter tuning and is entirely unguided, whereas the other two require manual threshold specifications and are thus user-guided. For situations with typical amounts of image noise, misclassification errors and area differences calculated between each class of the synthetic images and the resultant segmented images range from 0.69–1.48% and 0.01–0.74%, respectively, showing the segmentation accuracy of this approach. We demonstrate that we are able to accurately segment x-ray microtomography images of moist granular media using these computationally efficient methodologies.
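      The three-stage workflow can be illustrated with a small NumPy/SciPy sketch; the gradient-magnitude test used here to flag transition pixels is one simple stand-in for the three identification methods described in the paper.

```python
import numpy as np
from scipy import ndimage

def segment_multiphase(image, thresholds, grad_cut):
    """
    Sketch of the three-stage workflow: (1) flag ambiguous transition pixels by
    a simple gradient-magnitude test, (2) globally threshold the remaining
    single-class pixels, (3) assign each transition pixel the label of the
    nearest single-class pixel via a distance transform.
    """
    grad = ndimage.gaussian_gradient_magnitude(image.astype(float), sigma=1.0)
    transition = grad > grad_cut
    # Stage 2: global thresholding applied to unambiguous pixels only.
    labels = np.digitize(image, thresholds).astype(np.int32)
    labels[transition] = -1
    # Stage 3: nearest-single-class-pixel classification of transition pixels.
    idx = ndimage.distance_transform_edt(labels == -1, return_distances=False,
                                         return_indices=True)
    return labels[tuple(idx)]
```

      For a two-phase image with a smooth ramp between the phases, the ramp pixels are flagged, removed from the thresholding step, and then filled in from the nearest confidently labeled pixels.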

      PubDate: 2018-06-07T09:54:31Z
       
  • E-wave software: EBSD-based dynamic wave propagation model for studying
           seismic anisotropy
    • Abstract: Publication date: Available online 4 June 2018
      Source:Computers & Geosciences
      Author(s): Xin Zhong, Marcel Frehner
      The study of seismic anisotropy has benefited from the wide application of the electron backscatter diffraction (EBSD) technique that provides complete information on the crystallographic and shape preferred orientations in 2D sections. Classical effective medium theory statistically approximates the seismic anisotropy based on the crystallographic preferred orientation, but the shape preferred orientation is often idealized as, e.g., parallel layering or oriented inclusions. Due to higher demands in precisely quantifying seismic anisotropy in natural rocks and taking full advantage of the EBSD technique, dynamic wave propagation methods have received broad attention. This paper presents the MATLAB program E-Wave based on a novel approach to directly use EBSD data for 2D numerical wave propagation simulation. The complete mechanical formulation and numerical benchmarks with simple model setups are presented. The E-Wave program allows straightforward EBSD data import, one-click finite-difference simulations, and automatic result analysis. The E-Wave program can be a helpful and independent tool in future work to shed light on the relationship between microstructures and seismic anisotropy, and contribute from the modelling perspective to studies in seismology, geodynamics and rock physics.

      PubDate: 2018-06-04T09:52:47Z
       
  • Corrigendum to 'Gamra: Simple meshing for complex earthquakes' [Comput.
           Geosci. 90PA (2016) 49–63]
    • Abstract: Publication date: August 2018
      Source:Computers & Geosciences, Volume 117
      Author(s): Walter Landry, Sylvain Barbot


      PubDate: 2018-06-01T09:43:50Z
       
  • Sparse regression interaction models for spatial prediction of soil
           properties in 3D
    • Abstract: Publication date: September 2018
      Source:Computers & Geosciences, Volume 118
      Author(s): Milutin Pejović, Mladen Nikolić, Gerard B.M. Heuvelink, Tomislav Hengl, Milan Kilibarda, Branislav Bajat
      An approach for using lasso (Least Absolute Shrinkage and Selection Operator) regression in creating sparse 3D models of soil properties for spatial prediction at multiple depths is presented. Modeling soil properties in 3D benefits from interactions of spatial predictors with soil depth and its polynomial expansion, which yields a large number of model variables (and corresponding model parameters). Lasso is able to perform variable selection, hence reducing the number of model parameters and making the model more easily interpretable. This also prevents overfitting, which makes the model more accurate. The presented approach was tested using four variable selection approaches – none, stepwise, lasso and hierarchical lasso – on four kinds of models – standard linear model, linear model with polynomial expansion of depth, linear model with interactions of covariates with depth and linear model with interactions of covariates with depth and its polynomial expansion. This framework was used to predict Soil Organic Carbon (SOC) in three contrasting study areas: Bor (Serbia), Edgeroi (Australia) and the Netherlands. Results show that lasso yields substantial improvements in accuracy over standard and stepwise regression — up to 50% of total variance. It yields models which contain up to five times fewer nonzero parameters than the full models and are usually sparser (by up to a factor of three) than models obtained by stepwise regression. Extension of the standard linear model by including interactions typically improves the accuracy of models produced by lasso, but is detrimental to standard and stepwise regression. Regarding computation time, it was demonstrated that lasso is several orders of magnitude more efficient than stepwise regression for models with tens or hundreds of variables (including interactions). Proper model evaluation is emphasized. Because lasso requires meta-parameter tuning, standard cross-validation does not suffice for adequate model evaluation; hence, nested cross-validation was employed. The presented approach is implemented as the publicly available sparsereg3D R package.
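      The role of lasso in such an interaction model can be sketched without the sparsereg3D package: the snippet below builds a covariate-by-depth design matrix and runs lasso via iterative soft thresholding (ISTA). All sizes and the regularization weight are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Lasso by iterative soft thresholding: min ||y - Xb||^2 / (2n) + lam * ||b||_1."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        b = soft_threshold(b - grad / L, lam / L)
    return b

def depth_interactions(C, d, degree=2):
    """Design matrix with covariates C, polynomial depth terms and their interactions."""
    cols = [C]
    for k in range(1, degree + 1):
        dk = d ** k
        cols.append(dk[:, None])               # depth polynomial term
        cols.append(C * dk[:, None])           # covariate-by-depth interactions
    return np.hstack(cols)
```

      Even on simulated data, the soft-thresholding step drives most interaction coefficients to exactly zero, which is the sparsity property the abstract refers to.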

      PubDate: 2018-05-28T17:03:57Z
       
  • Enhanced IT2FCM algorithm using object-based triangular fuzzy set modeling
           for remote-sensing clustering
    • Abstract: Publication date: September 2018
      Source:Computers & Geosciences, Volume 118
      Author(s): Tao Jiang, Dan Hu, Xianchuan Yu
      Object-based fuzzy clustering methods have been widely used in remote-sensing clustering analysis. Mean and interval spectral signatures are typically used to describe an object's features. However, accurately distinguishing two objects with the same mean or interval values and different internal distributions is difficult. Focusing on this problem, we developed triangular fuzzy set modeling to describe objects and designed an interval distance metric to measure the dissimilarities between triangular fuzzy sets. Furthermore, constructing interval type-2 fuzzy c-means (IT2FCM) clustering methods from a variation of the fuzzifier (two fuzzifiers) introduces uncertainty and subjectivity, because these methods are sensitive to the choice of fuzzifier. Thus, an enhanced IT2FCM clustering algorithm that directly adopts the interval distance metric rather than the variation of fuzzifier is proposed for high-resolution remote-sensing clustering. We performed land-cover classification experiments for three study areas by utilizing remote-sensing images from the SPOT-5 and Gaofen-2 satellite sensors, whose spatial resolutions are approximately 10 m and 1 m, respectively. Visual and numerical results, including Kappa coefficients and the confusion matrix, were utilized to verify the classification results. The experimental results indicated that triangular fuzzy set modeling is appropriate for extracting features from ground objects; moreover, it limits the classification errors caused by objects of the same class with different spectral features. Compared with the object-based interval-valued fuzzy c-means (IV-FCM) method reported in the literature, the proposed algorithm results in improved classification quality and accuracy.
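      A minimal sketch of the object-modeling idea: each object is summarized per band by a triangular fuzzy set (min, mean, max), so two objects with equal means but different internal spreads remain distinguishable. The RMS distance used here is an illustrative choice, not the interval metric defined in the paper.

```python
import numpy as np

def triangular_model(pixels):
    """Per-band triangular fuzzy set (min, mean, max) for an image object's pixels."""
    pixels = np.asarray(pixels, dtype=float)   # shape (n_pixels, n_bands)
    return np.stack([pixels.min(0), pixels.mean(0), pixels.max(0)])  # (3, n_bands)

def triangular_distance(A, B):
    """A simple dissimilarity between triangular fuzzy sets: RMS difference of
    the (min, mean, max) parameters across bands. This is an illustrative
    stand-in for the paper's interval distance metric."""
    return float(np.sqrt(np.mean((A - B) ** 2)))
```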

      PubDate: 2018-05-28T17:03:57Z
       
  • Modular implementation of magnetotelluric 2D forward modeling with general
           anisotropy
    • Abstract: Publication date: September 2018
      Source:Computers & Geosciences, Volume 118
      Author(s): Zeqiu Guo, Gary D. Egbert, Wenbo Wei
      We present a general framework for two-dimensional finite difference modeling of magnetotelluric data in the presence of general anisotropy. Our approach is modular, allowing differential operators for a range of formulations of the governing equations, defined on several possible discrete grids, to be constructed from a basic set of first difference and averaging operators. We specifically consider two formulations of the two-dimensional anisotropic problem, one with Maxwell's equations reduced to a second order system in terms of three coupled electric components, and one in terms of coupled electric and magnetic x-components. Both formulations are discretized on a staggered grid; the second (coupled electric and magnetic) system is also implemented on a grid with fixed nodes (i.e., not staggered). The three implementations are validated and compared using a range of test models, including a half-space with general anisotropy, an infinite fault with axial anisotropy and a simple dyke model. Comparisons to analytic results (for half-space and fault models), and to results from other anisotropic codes, combined with grid-refinement convergence tests, demonstrate that our algorithms are accurate and capable of routine modeling of two-dimensional general anisotropy. These finite difference codes, demonstrating the flexibility of our numerical discretization approach, can be readily applied to other problems.
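      The modular idea, building 2-D operators from 1-D difference and averaging pieces, can be sketched with Kronecker products (a generic construction with SciPy sparse matrices, not the authors' code):

```python
import numpy as np
import scipy.sparse as sp

def ddx(n, h):
    """1-D first-difference operator mapping n node values to n-1 cell values."""
    return sp.diags([-np.ones(n - 1), np.ones(n - 1)], [0, 1],
                    shape=(n - 1, n)) / h

def avg(n):
    """1-D averaging operator mapping n node values to n-1 cell values."""
    return sp.diags([0.5 * np.ones(n - 1), 0.5 * np.ones(n - 1)], [0, 1],
                    shape=(n - 1, n))

def grad_2d(ny, nz, hy, hz):
    """2-D gradient assembled from the 1-D pieces with Kronecker products,
    in the modular spirit described above (a generic sketch)."""
    Gy = sp.kron(sp.eye(nz), ddx(ny, hy))   # d/dy applied on each z-row
    Gz = sp.kron(ddx(nz, hz), sp.eye(ny))   # d/dz applied on each y-column
    return Gy, Gz
```

      Because every higher-dimensional operator is a product of the same small set of 1-D pieces, swapping formulations or grids only changes how the pieces are combined.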

      PubDate: 2018-05-28T17:03:57Z
       
  • Focal beam analysis for 3D acquisition geometries in complex media with
           GPU implementation
    • Abstract: Publication date: September 2018
      Source:Computers & Geosciences, Volume 118
      Author(s): Jun Su, Li-Yun Fu, Wei Wei, Junhua Hu, Weijia Sun
      Seismic acquisition geometries have a significant influence on the quality of seismic data in the oil and gas exploration process. Therefore, prior analyses are beneficial to the design of acquisition geometries before implementation of seismic acquisition. The focal beam analysis method can provide quantitative insights into the combined influence of acquisition geometries and subsurface structures. This approach involves a large calculation burden concerning 3D wavefield extrapolation in the case of complex media, thus inhibiting the practical application of focal beam analysis in complex media when using regular CPUs. Therefore, using a graphics processing unit (GPU) to accelerate focal beam analysis becomes imperative. We have developed a fast parallel algorithm to speed up the focal beam analysis for 3D acquisition geometries in complex media on GPUs. Three-dimensional numerical examples show that the GPU-based focal beam analysis runs about 17 times faster than a serial CPU-based one. We also demonstrate the validity and scalability of the proposed approach with numerical examples. The boost in performance afforded by the GPU architecture allows us to analyse 3D acquisition geometries in complex media with less time and at lower cost of hardware.

      PubDate: 2018-05-28T17:03:57Z
       
  • Identification of geochemical anomalies through combined sequential
           Gaussian simulation and grid-based local singularity analysis
    • Abstract: Publication date: September 2018
      Source:Computers & Geosciences, Volume 118
      Author(s): Jian Wang, Renguang Zuo
      Local singularity analysis (LSA) has been proven to be an effective tool for identifying weak geochemical anomalies. The common practice of grid-based LSA is to first interpolate irregularly distributed observations onto a raster map by using either kriging or inverse distance weighting (IDW). The inherent weighted moving averaging of these methods typically subjects the interpolated map to a smoothing effect. Additionally, the traditional procedure does not account for uncertainties in the values of geochemical attributes at unsampled locations. As such, these two aspects might affect LSA results. This paper presents a hybrid method, which combines sequential Gaussian simulation and grid-based LSA to identify geochemical anomalies. A case study of processing soil samples collected from the Jilinbaolige district, Inner Mongolia, China, further illustrates the hybrid method and helps compare the results with those from kriging-based LSA. The findings indicate that (1) the uncertainties of values at unsampled locations could affect the results of grid-based LSA, (2) singularity exponents from kriging-based LSA roughly represent the trend (median) of singularity exponent distributions from simulation-based LSA, but the latter can also provide a measure of the uncertainty of the singularity exponent propagated from the uncertain values at unsampled locations, and (3) the procedure combining simulation-based LSA and analysis of distance is a feasible way of identifying geochemical anomalies while accounting for uncertainty. The anomaly probability map obtained can provide a more generalized perspective than interpolation-based LSA to delineate anomalous areas.
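      Grid-based LSA itself is compact: for each cell, window averages over growing window sizes are regressed in log-log space, and the slope gives the singularity exponent. A minimal sketch (window sizes and edge handling are illustrative choices):

```python
import numpy as np

def singularity_exponents(grid, radii=(1, 2, 3, 4)):
    """
    Window-based local singularity analysis: for each cell, average the field
    over square windows of increasing half-width r and fit
    log(mean_r) = c + (alpha - 2) * log(size); alpha is the singularity
    exponent (alpha < 2 flags positive anomalies in 2-D).
    """
    grid = np.asarray(grid, dtype=float)
    ny, nx = grid.shape
    sizes = np.log(np.array(radii, dtype=float) * 2 + 1)  # window side lengths
    alpha = np.full((ny, nx), np.nan)
    for i in range(ny):
        for j in range(nx):
            means = []
            for r in radii:
                w = grid[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1]
                means.append(w.mean())
            slope = np.polyfit(sizes, np.log(np.maximum(means, 1e-12)), 1)[0]
            alpha[i, j] = slope + 2.0
    return alpha
```

      In the hybrid method, this computation is repeated on each sequential-Gaussian realization, so every cell receives a distribution of exponents rather than a single value.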

      PubDate: 2018-05-28T17:03:57Z
       
  • Advancing interoperability of geospatial data provenance on the web: Gap
           analysis and strategies
    • Abstract: Publication date: August 2018
      Source:Computers & Geosciences, Volume 117
      Author(s): Liangcun Jiang, Peng Yue, Werner Kuhn, Chenxiao Zhang, Changhui Yu, Xia Guo
      Geospatial data provenance is a fundamental issue in distributing spatial information on the Web. In the geoinformatics domain, provenance is often referred to as lineage. While the ISO 19115 lineage model is used widely in spatial data infrastructures, W3C has recommended the W3C provenance (PROV) data model for capturing and sharing provenance information on the Web. The use of these two separate efforts needs to be harmonized so that geospatial information does not remain an isolated area on the Web. Motivated by several domain use cases, we synthesize a list of provenance questions and analyze gaps between the two models in addressing these questions. Our strategy is to enrich W3C PROV with domain semantics from the ISO 19115 lineage model by suggesting ways to bridge them. A semantic mapping between the ISO lineage model and the W3C PROV model is formalized, and key issues involved are discussed. Use cases illustrate the applicability of the approach.

      PubDate: 2018-05-28T17:03:57Z
       
  • Fibonacci lattices for the evaluation and optimization of map projections
    • Abstract: Publication date: August 2018
      Source:Computers & Geosciences, Volume 117
      Author(s): Sergio Baselga
      Latitude-longitude grids are frequently used in geosciences for global numerical modelling although they are remarkably inhomogeneous due to meridian convergence. In contrast, Fibonacci lattices are highly isotropic and homogeneous, so that the area represented by each lattice point is virtually the same. In the present paper we show the higher performance of Fibonacci versus latitude-longitude lattices for evaluating distortion coefficients of map projections. In particular, we first obtain a typical distortion for the Lambert Conformal Conic projection with its currently defined parameters and geographic boundaries for Europe, as adopted as standard by the INSPIRE directive. Further, we optimize the defining parameters of this projection, the lower and upper standard parallel latitudes, so that the typical distortion for Europe is reduced by 10% when they are set to 36° and 61.5°, respectively. We also apply the optimization procedure to the determination of the best standard parallels for using this projection in Spain, values which were left unspecified by the national decree that mandated its official adoption, and obtain optimum values of 37° and 42° and a resulting typical distortion of 828 ppm.
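      One common construction of a spherical Fibonacci lattice is shown below (a generic sketch; the paper's exact parameterization may differ). Latitudes follow an equal-area spacing and longitudes advance by the golden angle, which is what makes the area per point nearly uniform.

```python
import numpy as np

def fibonacci_lattice(n):
    """
    Spherical Fibonacci lattice: n points with near-uniform area per point.
    Returns latitude and longitude in degrees.
    """
    golden = (1 + 5 ** 0.5) / 2
    i = np.arange(n)
    lat = np.degrees(np.arcsin(2 * (i + 0.5) / n - 1))          # equal-area latitudes
    lon = np.degrees((2 * np.pi * i / golden) % (2 * np.pi)) - 180
    return lat, lon
```

      Evaluating a projection's distortion coefficient at these points weights every region of the globe (or of a clipped domain) almost equally, unlike a latitude-longitude grid.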

      PubDate: 2018-05-28T17:03:57Z
       
  • Reconstruction of seismic data with missing traces based on optimized
           Poisson Disk sampling and compressed sensing
    • Abstract: Publication date: August 2018
      Source:Computers & Geosciences, Volume 117
      Author(s): Yuan-Yuan Sun, Rui-Sheng Jia, Hong-Mei Sun, Xing-Li Zhang, Yan-Jun Peng, Xin-Ming Lu
      In seismic exploration, challenges posed by the collection environment often lead to incomplete or irregular seismic data. Fortunately, Compressed Sensing theory provides the possibility of reconstructing under-sampled data. One of the most important aspects of applying this theory is selection of an appropriate sampling method. In this paper, we propose a seismic data reconstruction method based on optimized Poisson Disk sampling under Compressed Sensing. First, K-Singular Value Decomposition is used to train the seismic sample data and obtain an overcomplete dictionary which can be used to sparsely represent the missing seismic data. Then, after using the optimized Poisson Disk sampling method to compress the seismic data, the missing seismic data are restored by the orthogonal matching pursuit algorithm. The results of experiments show that, unlike the traditional Gaussian random sampling and Poisson Disk sampling methods, the proposed method preserves the uniformity of the sampling points while maintaining the sampling randomness, resulting in improved reconstruction results.
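      The essence of Poisson disk sampling, randomness subject to a minimum spacing, can be sketched with simple dart throwing along a line of traces (the paper's optimized variant is more elaborate):

```python
import numpy as np

def poisson_disk_1d(n_traces, min_gap, target, seed=0):
    """
    Dart-throwing Poisson disk sampling of trace positions: candidates are
    accepted only if they lie at least `min_gap` traces away from every
    previously accepted position, which keeps the selection random but
    spatially even.
    """
    rng = np.random.default_rng(seed)
    accepted = []
    for cand in rng.permutation(n_traces):
        if all(abs(int(cand) - a) >= min_gap for a in accepted):
            accepted.append(int(cand))
        if len(accepted) == target:
            break
    return np.sort(np.array(accepted))
```

      The retained traces then serve as the compressed measurements from which the missing data are recovered, e.g. by orthogonal matching pursuit over a learned dictionary.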

      PubDate: 2018-05-28T17:03:57Z
       
  • Computation of analytical sensitivity matrix for the frequency-domain EM
           data: MATLAB code
    • Abstract: Publication date: August 2018
      Source:Computers & Geosciences, Volume 117
      Author(s): Jide Nosakare Ogunbo
      Among the many applications of the sensitivity matrix is its use for performing nonlinear least-squares inversion of geophysical data. The mainstream finite-difference approach is approximate. Some previous works have discussed the accurate analytical computation approach for the sensitivity matrix of 1D geophysical problems, but the codes are not publicly accessible. Using the basic differentiation rules (product and quotient) and logarithmic differentiation, the MATLAB code for analytical computation of the 1D recursive transverse electric frequency-domain electromagnetic forward response is presented. Because only a single call to the sensitivity matrix subroutine is needed for analytical computation, the computation time is a linear function of the number of parameters. However, the computation time for the sensitivity matrix computation via the approximate method is a quadratic function of the number of model parameters. Experimentation results show that the analytical method increasingly gains speed over the approximate method (for instance, the analytical route is a million times faster than the approximate method when the number of parameters is a million) with no loss of accuracy.

      PubDate: 2018-05-28T17:03:57Z
       
  • DOUBLE FIT: Optimization procedure applied to lattice strain model
    • Abstract: Publication date: August 2018
      Source:Computers & Geosciences, Volume 117
      Author(s): Célia Dalou, Julien Boulon, Kenneth T. Koga, Robert Dalou, Robert L. Dennen
      Modeling trace element partition coefficients using the lattice strain model is a powerful tool for understanding the effects of P-T conditions and mineral and melt compositions on partition coefficients, thus significantly advancing the geochemical studies of trace element distributions in nature. In this model, partition coefficients describe the strain caused by a volume change upon cation substitution in the crystal lattice. In some mantle minerals, divalent, trivalent, and tetravalent trace element cations are mainly substituted in one specific site. Lattice strain model parameters, for instance in olivine and plagioclase, are thus fit for one crystal site. However, trace element cations can be substituted in two sites in the cases of pyroxenes, garnets, amphiboles, micas, or epidote-group minerals. To thoroughly study element partitioning in those minerals, one must consider the lattice strain parameters of the two sites. In this paper, we present a user-friendly executable program, running on Windows, Linux, and macOS, to fit a lattice strain model by an error-weighted differential-evolution-constrained algorithm (Storn, R., and Price, K. 1997. Differential evolution - A simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization 11, 341–359). This optimization procedure is called DOUBLE FIT and is available for download on http://celiadalou.wixsite.com/website/double-fit-program. DOUBLE FIT generates single or double parabolas fitting experimentally determined trace element partition coefficients using a very limited amount of data (at minimum six experimental data points) and accounting for data uncertainties. It is the fastest calculation available to obtain the best-fit lattice strain parameters while accounting for the elastic response of two different sites to trace element substitution in various minerals.
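      The single-site model underlying such fits is the Blundy & Wood (1994) lattice strain equation. The sketch below fits its three parameters with SciPy least squares rather than the differential-evolution scheme used by DOUBLE FIT; the temperature and the initial guess are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

NA, R = 6.022e23, 8.314          # Avogadro [1/mol], gas constant [J/(mol K)]

def log_lattice_strain(r, D0, r0, E, T=1573.0):
    """log of the Blundy & Wood (1994) lattice strain model for a single site:
    D(r) = D0 * exp(-4*pi*E*NA*(r0/2*(r-r0)**2 + (r-r0)**3/3)/(R*T)).
    r, r0 : ionic and ideal site radius [Angstrom]; E : site Young's modulus
    [GPa]; D0 : strain-free partition coefficient; T : temperature [K]."""
    dr = (r - r0) * 1e-10                           # to metres
    strain = (r0 * 1e-10) / 2 * dr**2 + dr**3 / 3   # volume-strain term [m^3]
    return np.log(D0) - 4 * np.pi * E * 1e9 * NA * strain / (R * T)

def fit_one_site(radii, D, p0=(1.0, 1.0, 100.0)):
    """Least-squares fit of (D0, r0, E); fitting log(D) balances the weights
    across the orders of magnitude spanned by partition coefficients."""
    popt, _ = curve_fit(log_lattice_strain, radii, np.log(D), p0=p0, maxfev=20000)
    return popt
```

      The double-site case handled by DOUBLE FIT sums two such parabolas, one per crystallographic site, which is why more than three parameters (and hence more data points) are required.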

      PubDate: 2018-05-28T17:03:57Z
       
  • GPU acceleration of time gating based reverse time migration using the
           pseudospectral time-domain algorithm
    • Abstract: Publication date: August 2018
      Source:Computers & Geosciences, Volume 117
      Author(s): Jiangang Xie, Zichao Guo, Hai Liu, Qing Huo Liu
      We present a Graphics Processing Unit (GPU) implementation of time gating based reverse time migration (TG-RTM) which uses the pseudospectral time-domain (PSTD) algorithm to solve the acoustic wave equation. TG-RTM adopts prior information about the surrounding media to strengthen the correlation between the wavefields, and thus has advantages in locating targets over traditional reverse time migration (RTM) methods. The PSTD algorithm adopts the fast Fourier transform (FFT) to obtain the spatial derivatives and a perfectly matched layer as an absorbing boundary condition to eliminate the wraparound effect introduced by the FFT periodicity assumption. Under the Nyquist sampling theorem, the spatial sampling density of the PSTD algorithm requires only two points per minimum wavelength. Thus, the PSTD algorithm can solve the time-dependent partial differential equations efficiently and save substantial computer memory. Compared with traditional RTM based on the finite difference time domain (FDTD) algorithm, the proposed RTM based on the PSTD algorithm can be implemented on a memory-limited GPU and can solve much larger models. To secure better performance and generality of the FFT on the GPU, we present a scheme which combines 1D FFTs with matrix transpositions instead of using a 3D FFT directly. The matrix transpositions use shared memory to improve memory access efficiency. We also apply an efficient FFT scheme which replaces an even-sized R2C FFT with a half-sized C2C FFT. To keep memory transfers between host and GPU small and balanced, we save the boundary wavefields in lieu of a checkpointing scheme when we propagate the source wavefield forward and backward. The proposed RTM achieves a speed-up of about 80 times on a Tesla K20X GPU card in a desktop computer. The simulation results of 2D and 3D models demonstrate that the proposed RTM is fast and inexpensive.
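      The decomposition of a 3-D FFT into 1-D FFTs plus transpositions can be illustrated in NumPy (on the GPU the transpositions go through shared memory; here they are plain array transposes):

```python
import numpy as np

def fft3_via_1d(u):
    """
    3-D FFT assembled from 1-D FFTs and transpositions, mirroring the GPU
    scheme described above: each pass brings one axis into the contiguous
    position, transforms along it, and transposes back.
    """
    v = np.fft.fft(u, axis=2)                                    # along axis 2
    v = np.fft.fft(v.transpose(0, 2, 1), axis=2).transpose(0, 2, 1)  # axis 1
    v = np.fft.fft(v.transpose(2, 1, 0), axis=2).transpose(2, 1, 0)  # axis 0
    return v
```

      Transforming only along the contiguous axis is exactly what makes batched 1-D FFTs efficient on a GPU, at the price of the intermediate transposes.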

      PubDate: 2018-05-28T17:03:57Z
       
  • A mesh-free finite-difference method for elastic wave propagation in the
           frequency-domain
    • Abstract: Publication date: Available online 24 May 2018
      Source:Computers & Geosciences
      Author(s): Junichi Takekawa, Hitoshi Mikada
      We developed an innovative finite-difference method for elastic wave propagation in the frequency domain. The method belongs to the class of mesh-free methods, which can discretize the governing equations without requiring a regular lattice grid structure. We investigated the performance of the proposed method using dispersion analyses and numerical experiments. The dispersion analyses for regular and irregular grid distributions show that a quasi-uniform grid distribution can reduce the dispersion error. For a regular grid distribution, the proposed method has accuracy equivalent to the average-derivative finite-difference method at the same resolution. Numerical experiments demonstrate that an adaptive grid distribution can be used to simulate elastic wave propagation. Since the irregular grid distribution requires no mesh generation process, this feature reduces the time needed for initial pre-processing. We also investigate the computational cost by comparing the calculation time of the proposed method with that of the average-derivative finite-difference method. The comparison indicates that the proposed method can provide efficient calculation results by adapting the resolution to the velocity structure.

      PubDate: 2018-05-28T17:03:57Z
       
  • A structural rank reduction operator for removing artifacts in
           least-squares reverse time migration
    • Abstract: Publication date: Available online 3 May 2018
      Source:Computers & Geosciences
      Author(s): Min Bai, Juan Wu, Shaohuan Zu, Wei Chen
      Least-squares reverse time migration (LSRTM) has been widely accepted because of its exceptional performance in mitigating migration artifacts and preserving the reflection amplitude. Due to the ill-posedness of the inverse problem, regularization methods or constraints must be applied to the reflectivity model. In this paper, we propose a novel iterative LSRTM framework that is regularized by a low-rank constraint. The low-rank constraint is applied along the geological structure of the subsurface reflectivity image and thus can also be called a structural low-rank constraint. It is enforced by iteratively applying a rank reduction operator based on low-rank approximation theory. The rank reduction operator applied along the structure direction can effectively remove artifacts caused by sparse shot/receiver sampling or other factors. Compared with the traditionally used smoothness-based constraint, the low-rank constraint is more capable of removing noise while preserving edge details. Since the constraint is applied to the post-stack seismic image, the extra computational cost caused by the singular value decomposition (SVD) of the rank reduction operator is negligible compared with the computational cost of the migration operator. Numerical examples with different levels of structural complexity are used to demonstrate the effectiveness and validity of the proposed algorithm.
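      The rank reduction operator at the heart of the constraint is a truncated SVD; a minimal 2-D sketch is shown below (the paper applies it along the structural direction of the image, which is omitted here):

```python
import numpy as np

def rank_reduce(image, rank):
    """Truncated-SVD rank reduction: keep only the `rank` largest singular
    values, which suppresses incoherent artifacts while preserving the
    dominant (structurally aligned) components."""
    U, s, Vt = np.linalg.svd(image, full_matrices=False)
    s[rank:] = 0.0
    return (U * s) @ Vt
```

      On a low-rank image contaminated with random noise, the truncated reconstruction sits much closer to the clean image than the noisy input does, which is the denoising behavior exploited inside each LSRTM iteration.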

      PubDate: 2018-05-28T17:03:57Z
       
  • Arc-Malstrøm: A 1D hydrologic screening method for stormwater assessments
           based on geometric networks
    • Authors: Thomas Balstrøm; David Crawford
      Abstract: Publication date: Available online 23 April 2018
      Source:Computers & Geosciences
      Author(s): Thomas Balstrøm, David Crawford
      A method is presented to perform a first screening of high-resolution digital terrain models to detect the extents and capacities of local landscape sinks. Once sinks are identified, their capacities and the rain volumes contributed by their local catchments during a rainstorm are saved as attributes of the pour points. Next, the downstream paths from the pour points are saved as junctions and edges in a geometric network, leading to a final calculation of the accumulated spillover in the topological data structure, determined by a custom trace tool. Although the screening method is based on a representation of the overland surface as a 1D network without any hydrodynamic components, it is well suited to provide a quick first overview of a landscape's overall drainage basins, the location of sinks, their contributing watersheds and the accumulated downstream flow when the sinks spill over. The exemplified study assumes Hortonian flow and a uniform rain event, but if spatial variations in precipitation or infiltration capacities are available at the local sinks' catchment level, they may easily be added to the workflow to produce first risk map approximations for residential areas threatened by future stormwater incidents.
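      The downstream spill accumulation can be sketched as a single upstream-to-downstream pass over the network (a toy illustration of the trace logic, not the Arc-Malstrøm implementation):

```python
def accumulated_spill(network, capacity, inflow):
    """
    Propagate spill through a 1-D downstream network: each sink fills to its
    capacity and passes the surplus to its downstream neighbour (None marks
    an outlet). `network` maps sink id -> downstream sink id and is assumed
    to be ordered upstream-first (i.e., topologically sorted).
    """
    received = {s: float(inflow.get(s, 0.0)) for s in network}
    spill = {}
    for s in network:                      # upstream-first traversal
        surplus = max(received[s] - capacity[s], 0.0)
        spill[s] = surplus
        down = network[s]
        if down is not None and surplus > 0.0:
            received[down] += surplus      # spillover feeds the next sink
    return spill
```

      For a chain of sinks, an upstream surplus cascades downstream exactly as the custom trace tool accumulates spillover through the geometric network.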

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.04.010
       
  • Efficient 3D inversions using the Richards equation
    • Authors: Rowan Cockett; Lindsey J. Heagy; Eldad Haber
      Abstract: Publication date: Available online 23 April 2018
      Source:Computers & Geosciences
      Author(s): Rowan Cockett, Lindsey J. Heagy, Eldad Haber
      Fluid flow in the vadose zone is governed by the Richards equation; it is parameterized by hydraulic conductivity, which is a nonlinear function of pressure head. Investigations in the vadose zone typically require characterizing distributed hydraulic properties. Water content or pressure head data may include direct measurements made from boreholes. Increasingly, proxy measurements from hydrogeophysics are being used to supply more spatially and temporally dense data sets. Inferring hydraulic parameters from such datasets requires the ability to efficiently solve and optimize the nonlinear time-domain Richards equation. This is particularly important as the number of parameters to be estimated in a vadose zone inversion continues to grow. In this paper, we describe an efficient technique to invert for distributed hydraulic properties in 1D, 2D, and 3D. Our technique does not store the Jacobian matrix, but rather computes its product with a vector. Existing literature on Richards equation inversion explicitly calculates the sensitivity matrix using finite differences or automatic differentiation; however, for large-scale problems these methods are constrained by computation and/or memory. Using an implicit sensitivity algorithm enables large-scale inversion problems for any distributed hydraulic parameters in the Richards equation to become tractable on modest computational resources. We provide an open source implementation of our technique based on the SimPEG framework, and show it in practice for a 3D inversion of saturated hydraulic conductivity using water content data through time.

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.04.006
       
  • Computation of continuum percolation threshold for pore systems composed
           of vugs and fractures
    • Authors: Evgeny Pervago; Aleksandr Mousatov; Elena Kazatchenko; Mikhail Markov
      Abstract: Publication date: Available online 18 April 2018
      Source:Computers & Geosciences
      Author(s): Evgeny Pervago, Aleksandr Mousatov, Elena Kazatchenko, Mikhail Markov
      In this research, we study the connectivity of a network composed of pores with different shapes, including combinations of vugs and fractures. For this purpose, we have developed a numerical simulation technique to determine the dependence of the continuum percolation threshold on the pore-shape distribution for isotropic porous 2D and 3D networks composed of elliptical and spheroidal elements, respectively. This technique is based on the following new algorithms: (1) analytical estimation of overlapping between inclusions; (2) partial discretization schemes (elements of a discrete pixel base with one continuous dimension) for numerical calculation of connected-cluster porosity; and (3) determination of the percolation-threshold porosity by using Monte Carlo simulations for different relative pore sizes. By approximating the pore shapes by ellipses (2D) and spheroids (3D) and varying their aspect ratios, we can model different types of pores, from vugs (spheres) to fractures (oblate spheroids) and channels (prolate spheroids). We have calculated the critical percolation porosity for the following models: (1) a network consisting of elements with constant shapes; (2) a network composed of elements with a uniform logarithmic distribution of aspect ratios; and (3) a network containing elements of two different shapes. To validate the simulation technique, we have compared the modeling results for the first model with a threshold-aspect ratio relationship published previously. Based on the modeling results, we have found simple and explicit equations for 2D and 3D models to determine the percolation threshold for pore networks with a bimodal distribution of shapes. The equations use only the element concentrations and the percolation-threshold values for each element shape.
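      A minimal 2D version of the Monte Carlo procedure can be sketched with the simplest pore shape, overlapping disks (the circular limit of the paper's ellipses), using union-find for cluster connectivity and two virtual nodes for the left/right spanning test. Disk count, radius, and trial count below are illustrative, not the paper's parameters.

```python
import random

class DSU:
    """Union-find with path halving, for connected-cluster bookkeeping."""
    def __init__(self, n):
        self.p = list(range(n))
    def find(self, x):
        while self.p[x] != x:
            self.p[x] = self.p[self.p[x]]
            x = self.p[x]
        return x
    def union(self, a, b):
        self.p[self.find(a)] = self.find(b)

def percolates(n_disks, radius, rng):
    """Drop n_disks of a given radius in the unit square; two disks connect
    when they overlap. Virtual nodes L and R make the spanning test O(1)."""
    centers = [(rng.random(), rng.random()) for _ in range(n_disks)]
    L, R = n_disks, n_disks + 1
    dsu = DSU(n_disks + 2)
    for i, (x, y) in enumerate(centers):
        if x - radius <= 0.0:
            dsu.union(i, L)   # touches the left edge
        if x + radius >= 1.0:
            dsu.union(i, R)   # touches the right edge
        for j in range(i):
            dx, dy = x - centers[j][0], y - centers[j][1]
            if dx * dx + dy * dy <= (2 * radius) ** 2:
                dsu.union(i, j)
    return dsu.find(L) == dsu.find(R)

rng = random.Random(0)
trials = 200
hits = sum(percolates(150, 0.08, rng) for _ in range(trials))
print(hits / trials)  # estimated spanning probability at this density
```

      Repeating this over a range of densities and locating where the spanning probability jumps from 0 to 1 gives the percolation threshold; the paper extends the same idea to ellipses and spheroids of varying aspect ratio.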

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.04.008
       
  • An active monitoring method for flood events
    • Authors: Zeqiang Chen; Nengcheng Chen; Wenying Du; Jianya Gong
      Abstract: Publication date: Available online 18 April 2018
      Source:Computers & Geosciences
      Author(s): Zeqiang Chen, Nengcheng Chen, Wenying Du, Jianya Gong
      Timely, active detection and monitoring of flood events are critical for a quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the framework's concrete implementation. The framework consists of two core components—active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements—scheme, model, data, sensor, and auxiliary information—as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 is conducted to test the proposed framework and model. The results show that 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done in advance to take full advantage of the active service framework and the active model.
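      The publish-subscribe mechanism behind the active warning component can be sketched with a minimal in-process broker. The broker class, topic name, and water-level threshold below are illustrative stand-ins for the Sensor Event Service and its subscription filters, not the paper's actual interfaces.

```python
from collections import defaultdict

class EventBroker:
    """Minimal publish-subscribe broker: subscribers register callbacks per
    topic, and every published event is pushed to matching subscribers."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        for callback in self.subscribers[topic]:
            callback(event)

alerts = []
broker = EventBroker()

# The planning side subscribes to water-level warnings; only exceedances
# of an (illustrative) threshold trigger active planning.
def on_water_level(event):
    if event["level_m"] > 21.5:
        alerts.append(event)

broker.subscribe("water_level", on_water_level)
broker.publish("water_level", {"station": "Liangzi Lake", "level_m": 21.8})
broker.publish("water_level", {"station": "Liangzi Lake", "level_m": 20.9})
print(len(alerts))  # -> 1: only the exceedance reaches the planner
```

      In the framework, the callback would hand off to the Sensor Planning Service to task sensors and schedule model runs, rather than just appending to a list.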

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.04.009
       
  • Estimating habitat volume of living resources using three-dimensional
           circulation and biogeochemical models
    • Authors: Katharine A. Smith; Zachary Schlag; Elizabeth W. North
      Abstract: Publication date: Available online 7 April 2018
      Source:Computers & Geosciences
      Author(s): Katharine A. Smith, Zachary Schlag, Elizabeth W. North
      Coupled three-dimensional circulation and biogeochemical models predict changes in water properties that can be used to define fish habitat, including physiologically important parameters such as temperature, salinity, and dissolved oxygen. Yet methods for calculating the volume of habitat defined by the intersection of multiple water properties are not well established for coupled three-dimensional models. The objectives of this research were to examine multiple methods for calculating habitat volume from three-dimensional model predictions, select the most robust approach, and provide an example application of the technique. Three methods were assessed: the "Step", "Ruled Surface", and "Pentahedron" methods, the latter of which was developed as part of this research. Results indicate that the analytical Pentahedron method is exact, computationally efficient, and preserves continuity in water properties between adjacent grid cells. As an example application, the Pentahedron method was implemented within the Habitat Volume Model (HabVol) using output from a circulation model with an Arakawa C-grid and physiological tolerances of juvenile striped bass (Morone saxatilis). This application demonstrates that the analytical Pentahedron method can be successfully applied to calculate habitat volume using output from coupled three-dimensional circulation and biogeochemical models, and it indicates that the Pentahedron method has wide application to aquatic and marine systems for which these models exist and physiological tolerances of organisms are known.
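      The habitat-volume idea can be sketched with the simplest of the three compared approaches, a "Step"-style calculation: a cell contributes its full volume when every water property lies inside the organism's tolerance window. (The paper's Pentahedron method instead integrates analytically within cells; the flattened toy grid and tolerance values below are assumptions for illustration.)

```python
import numpy as np

def step_habitat_volume(temp, sal, oxy, cell_volumes, limits):
    """'Step'-style habitat volume: sum the volumes of cells where every
    property (temperature, salinity, dissolved oxygen) falls inside its
    (lo, hi) tolerance window."""
    ok = np.ones_like(temp, dtype=bool)
    for field, (lo, hi) in zip((temp, sal, oxy), limits):
        ok &= (field >= lo) & (field <= hi)
    return float(cell_volumes[ok].sum())

# Four flattened grid cells with illustrative property values.
temp = np.array([20.0, 26.0, 22.0, 28.0])   # degrees C
sal  = np.array([10.0, 12.0, 30.0, 11.0])   # psu
oxy  = np.array([ 6.0,  5.0,  7.0,  2.0])   # mg/L
vol  = np.array([ 1.0,  1.0,  1.0,  1.0])   # cell volumes, km^3

# Hypothetical tolerance windows, one (lo, hi) pair per property.
limits = [(18.0, 27.0), (5.0, 25.0), (3.0, 9.0)]
print(step_habitat_volume(temp, sal, oxy, vol, limits))  # -> 2.0
```

      The Step method's weakness, which motivated the Pentahedron method, is the all-or-nothing cell test: a cell that only partly satisfies the tolerances contributes either its full volume or none, whereas the analytical method resolves the fractional volume within the cell.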

      PubDate: 2018-04-23T17:31:36Z
      DOI: 10.1016/j.cageo.2018.04.005
       
 
 
JournalTOCs
School of Mathematical and Computer Sciences
Heriot-Watt University
Edinburgh, EH14 4AS, UK
Email: journaltocs@hw.ac.uk
Tel: +00 44 (0)131 4513762
Fax: +00 44 (0)131 4513327
JournalTOCs © 2009-