  Subjects -> ENGINEERING (Total: 2270 journals)
    - CHEMICAL ENGINEERING (191 journals)
    - CIVIL ENGINEERING (183 journals)
    - ELECTRICAL ENGINEERING (99 journals)
    - ENGINEERING (1199 journals)
    - ENGINEERING MECHANICS AND MATERIALS (390 journals)
    - HYDRAULIC ENGINEERING (55 journals)
    - INDUSTRIAL ENGINEERING (64 journals)
    - MECHANICAL ENGINEERING (89 journals)

ENGINEERING (1199 journals)

Showing 1 - 200 of 1205 Journals sorted alphabetically
3 Biotech     Open Access   (Followers: 7)
3D Research     Hybrid Journal   (Followers: 19)
AAPG Bulletin     Full-text available via subscription   (Followers: 5)
AASRI Procedia     Open Access   (Followers: 15)
Abstract and Applied Analysis     Open Access   (Followers: 3)
Aceh International Journal of Science and Technology     Open Access   (Followers: 2)
ACS Nano     Full-text available via subscription   (Followers: 217)
Acta Geotechnica     Hybrid Journal   (Followers: 6)
Acta Metallurgica Sinica (English Letters)     Hybrid Journal   (Followers: 5)
Acta Polytechnica : Journal of Advanced Engineering     Open Access   (Followers: 2)
Acta Scientiarum. Technology     Open Access   (Followers: 3)
Acta Universitatis Cibiniensis. Technical Series     Open Access  
Active and Passive Electronic Components     Open Access   (Followers: 7)
Adaptive Behavior     Hybrid Journal   (Followers: 10)
Adıyaman Üniversitesi Mühendislik Bilimleri Dergisi     Open Access  
Adsorption     Hybrid Journal   (Followers: 4)
Advanced Engineering Forum     Full-text available via subscription   (Followers: 4)
Advanced Science     Open Access   (Followers: 4)
Advanced Science Focus     Free   (Followers: 3)
Advanced Science Letters     Full-text available via subscription   (Followers: 5)
Advanced Science, Engineering and Medicine     Partially Free   (Followers: 7)
Advanced Synthesis & Catalysis     Hybrid Journal   (Followers: 17)
Advances in Artificial Neural Systems     Open Access   (Followers: 4)
Advances in Calculus of Variations     Hybrid Journal   (Followers: 2)
Advances in Catalysis     Full-text available via subscription   (Followers: 5)
Advances in Complex Systems     Hybrid Journal   (Followers: 7)
Advances in Engineering Software     Hybrid Journal   (Followers: 25)
Advances in Fuel Cells     Full-text available via subscription   (Followers: 14)
Advances in Fuzzy Systems     Open Access   (Followers: 5)
Advances in Geosciences (ADGEO)     Open Access   (Followers: 9)
Advances in Heat Transfer     Full-text available via subscription   (Followers: 19)
Advances in Human Factors/Ergonomics     Full-text available via subscription   (Followers: 23)
Advances in Magnetic and Optical Resonance     Full-text available via subscription   (Followers: 8)
Advances in Natural Sciences: Nanoscience and Nanotechnology     Open Access   (Followers: 28)
Advances in Operations Research     Open Access   (Followers: 11)
Advances in OptoElectronics     Open Access   (Followers: 5)
Advances in Physics Theories and Applications     Open Access   (Followers: 12)
Advances in Polymer Science     Hybrid Journal   (Followers: 40)
Advances in Porous Media     Full-text available via subscription   (Followers: 4)
Advances in Remote Sensing     Open Access   (Followers: 35)
Advances in Science and Research (ASR)     Open Access   (Followers: 6)
Aerobiologia     Hybrid Journal   (Followers: 1)
African Journal of Science, Technology, Innovation and Development     Hybrid Journal   (Followers: 4)
AIChE Journal     Hybrid Journal   (Followers: 28)
Ain Shams Engineering Journal     Open Access   (Followers: 5)
Akademik Platform Mühendislik ve Fen Bilimleri Dergisi     Open Access  
Alexandria Engineering Journal     Open Access   (Followers: 1)
AMB Express     Open Access   (Followers: 1)
American Journal of Applied Sciences     Open Access   (Followers: 27)
American Journal of Engineering and Applied Sciences     Open Access   (Followers: 11)
American Journal of Engineering Education     Open Access   (Followers: 9)
American Journal of Environmental Engineering     Open Access   (Followers: 16)
American Journal of Industrial and Business Management     Open Access   (Followers: 23)
Analele Universitatii Ovidius Constanta - Seria Chimie     Open Access  
Annals of Combinatorics     Hybrid Journal   (Followers: 3)
Annals of Pure and Applied Logic     Open Access   (Followers: 2)
Annals of Regional Science     Hybrid Journal   (Followers: 7)
Annals of Science     Hybrid Journal   (Followers: 7)
Applicable Algebra in Engineering, Communication and Computing     Hybrid Journal   (Followers: 2)
Applicable Analysis: An International Journal     Hybrid Journal   (Followers: 1)
Applied Catalysis A: General     Hybrid Journal   (Followers: 6)
Applied Catalysis B: Environmental     Hybrid Journal   (Followers: 8)
Applied Clay Science     Hybrid Journal   (Followers: 4)
Applied Computational Intelligence and Soft Computing     Open Access   (Followers: 12)
Applied Magnetic Resonance     Hybrid Journal   (Followers: 3)
Applied Nanoscience     Open Access   (Followers: 7)
Applied Network Science     Open Access  
Applied Numerical Mathematics     Hybrid Journal   (Followers: 5)
Applied Physics Research     Open Access   (Followers: 3)
Applied Sciences     Open Access   (Followers: 2)
Applied Spatial Analysis and Policy     Hybrid Journal   (Followers: 4)
Arabian Journal for Science and Engineering     Hybrid Journal   (Followers: 5)
Archives of Computational Methods in Engineering     Hybrid Journal   (Followers: 4)
Archives of Foundry Engineering     Open Access  
Archives of Thermodynamics     Open Access   (Followers: 7)
Arkiv för Matematik     Hybrid Journal   (Followers: 1)
ASEE Prism     Full-text available via subscription   (Followers: 2)
Asian Engineering Review     Open Access  
Asian Journal of Applied Science and Engineering     Open Access   (Followers: 1)
Asian Journal of Applied Sciences     Open Access   (Followers: 2)
Asian Journal of Biotechnology     Open Access   (Followers: 7)
Asian Journal of Control     Hybrid Journal  
Asian Journal of Current Engineering & Maths     Open Access  
Asian Journal of Technology Innovation     Hybrid Journal   (Followers: 8)
Assembly Automation     Hybrid Journal   (Followers: 2)
at - Automatisierungstechnik     Hybrid Journal   (Followers: 1)
ATZagenda     Hybrid Journal  
ATZextra worldwide     Hybrid Journal  
Australasian Physical & Engineering Sciences in Medicine     Hybrid Journal   (Followers: 1)
Australian Journal of Multi-Disciplinary Engineering     Full-text available via subscription   (Followers: 2)
Autonomous Mental Development, IEEE Transactions on     Hybrid Journal   (Followers: 7)
Avances en Ciencias e Ingeniería     Open Access  
Balkan Region Conference on Engineering and Business Education     Open Access   (Followers: 1)
Bangladesh Journal of Scientific and Industrial Research     Open Access  
Basin Research     Hybrid Journal   (Followers: 3)
Batteries     Open Access   (Followers: 3)
Bautechnik     Hybrid Journal   (Followers: 1)
Bell Labs Technical Journal     Hybrid Journal   (Followers: 23)
Beni-Suef University Journal of Basic and Applied Sciences     Open Access   (Followers: 3)
BER : Manufacturing Survey : Full Survey     Full-text available via subscription   (Followers: 2)
BER : Motor Trade Survey     Full-text available via subscription   (Followers: 1)
BER : Retail Sector Survey     Full-text available via subscription   (Followers: 2)
BER : Retail Survey : Full Survey     Full-text available via subscription   (Followers: 2)
BER : Survey of Business Conditions in Manufacturing : An Executive Summary     Full-text available via subscription   (Followers: 3)
BER : Survey of Business Conditions in Retail : An Executive Summary     Full-text available via subscription   (Followers: 3)
Bharatiya Vaigyanik evam Audyogik Anusandhan Patrika (BVAAP)     Open Access   (Followers: 1)
Biofuels Engineering     Open Access  
Biointerphases     Open Access   (Followers: 1)
Biomaterials Science     Full-text available via subscription   (Followers: 9)
Biomedical Engineering     Hybrid Journal   (Followers: 16)
Biomedical Engineering and Computational Biology     Open Access   (Followers: 13)
Biomedical Engineering Letters     Hybrid Journal   (Followers: 5)
Biomedical Engineering, IEEE Reviews in     Full-text available via subscription   (Followers: 16)
Biomedical Engineering, IEEE Transactions on     Hybrid Journal   (Followers: 31)
Biomedical Engineering: Applications, Basis and Communications     Hybrid Journal   (Followers: 5)
Biomedical Microdevices     Hybrid Journal   (Followers: 8)
Biomedical Science and Engineering     Open Access   (Followers: 3)
Biomedizinische Technik - Biomedical Engineering     Hybrid Journal  
Biomicrofluidics     Open Access   (Followers: 4)
BioNanoMaterials     Hybrid Journal   (Followers: 1)
Biotechnology Progress     Hybrid Journal   (Followers: 39)
Boletin Cientifico Tecnico INIMET     Open Access  
Botswana Journal of Technology     Full-text available via subscription  
Boundary Value Problems     Open Access   (Followers: 1)
Brazilian Journal of Science and Technology     Open Access   (Followers: 2)
Broadcasting, IEEE Transactions on     Hybrid Journal   (Followers: 10)
Bulletin of Canadian Petroleum Geology     Full-text available via subscription   (Followers: 14)
Bulletin of Engineering Geology and the Environment     Hybrid Journal   (Followers: 3)
Bulletin of the Crimean Astrophysical Observatory     Hybrid Journal  
Cahiers, Droit, Sciences et Technologies     Open Access  
Calphad     Hybrid Journal  
Canadian Geotechnical Journal     Full-text available via subscription   (Followers: 13)
Canadian Journal of Remote Sensing     Full-text available via subscription   (Followers: 40)
Case Studies in Engineering Failure Analysis     Open Access   (Followers: 7)
Case Studies in Thermal Engineering     Open Access   (Followers: 3)
Catalysis Communications     Hybrid Journal   (Followers: 6)
Catalysis Letters     Hybrid Journal   (Followers: 2)
Catalysis Reviews: Science and Engineering     Hybrid Journal   (Followers: 8)
Catalysis Science and Technology     Free   (Followers: 6)
Catalysis Surveys from Asia     Hybrid Journal   (Followers: 3)
Catalysis Today     Hybrid Journal   (Followers: 5)
CEAS Space Journal     Hybrid Journal  
Cellular and Molecular Neurobiology     Hybrid Journal   (Followers: 3)
Central European Journal of Engineering     Hybrid Journal   (Followers: 1)
CFD Letters     Open Access   (Followers: 6)
Chaos : An Interdisciplinary Journal of Nonlinear Science     Hybrid Journal   (Followers: 2)
Chaos, Solitons & Fractals     Hybrid Journal   (Followers: 3)
Chinese Journal of Catalysis     Full-text available via subscription   (Followers: 2)
Chinese Journal of Engineering     Open Access   (Followers: 2)
Chinese Science Bulletin     Open Access   (Followers: 1)
Ciencia e Ingenieria Neogranadina     Open Access  
Ciencia en su PC     Open Access   (Followers: 1)
Ciencias Holguin     Open Access   (Followers: 1)
CienciaUAT     Open Access  
Cientifica     Open Access  
CIRP Annals - Manufacturing Technology     Full-text available via subscription   (Followers: 11)
CIRP Journal of Manufacturing Science and Technology     Full-text available via subscription   (Followers: 14)
City, Culture and Society     Hybrid Journal   (Followers: 21)
Clay Minerals     Full-text available via subscription   (Followers: 9)
Clean Air Journal     Full-text available via subscription   (Followers: 2)
Coal Science and Technology     Full-text available via subscription   (Followers: 3)
Coastal Engineering     Hybrid Journal   (Followers: 11)
Coastal Engineering Journal     Hybrid Journal   (Followers: 4)
Coatings     Open Access   (Followers: 2)
Cogent Engineering     Open Access   (Followers: 2)
Cognitive Computation     Hybrid Journal   (Followers: 4)
Color Research & Application     Hybrid Journal   (Followers: 1)
COMBINATORICA     Hybrid Journal  
Combustion Theory and Modelling     Hybrid Journal   (Followers: 13)
Combustion, Explosion, and Shock Waves     Hybrid Journal   (Followers: 13)
Communications Engineer     Hybrid Journal   (Followers: 1)
Communications in Numerical Methods in Engineering     Hybrid Journal   (Followers: 2)
Components, Packaging and Manufacturing Technology, IEEE Transactions on     Hybrid Journal   (Followers: 23)
Composite Interfaces     Hybrid Journal   (Followers: 6)
Composite Structures     Hybrid Journal   (Followers: 252)
Composites Part A : Applied Science and Manufacturing     Hybrid Journal   (Followers: 176)
Composites Part B : Engineering     Hybrid Journal   (Followers: 222)
Composites Science and Technology     Hybrid Journal   (Followers: 164)
Comptes Rendus Mécanique     Full-text available via subscription   (Followers: 2)
Computation     Open Access  
Computational Geosciences     Hybrid Journal   (Followers: 12)
Computational Optimization and Applications     Hybrid Journal   (Followers: 7)
Computational Science and Discovery     Full-text available via subscription   (Followers: 2)
Computer Applications in Engineering Education     Hybrid Journal   (Followers: 6)
Computer Science and Engineering     Open Access   (Followers: 17)
Computers & Geosciences     Hybrid Journal   (Followers: 25)
Computers & Mathematics with Applications     Full-text available via subscription   (Followers: 5)
Computers and Electronics in Agriculture     Hybrid Journal   (Followers: 4)
Computers and Geotechnics     Hybrid Journal   (Followers: 8)
Computing and Visualization in Science     Hybrid Journal   (Followers: 6)
Computing in Science & Engineering     Full-text available via subscription   (Followers: 25)
Conciencia Tecnologica     Open Access  
Concurrent Engineering     Hybrid Journal   (Followers: 3)
Continuum Mechanics and Thermodynamics     Hybrid Journal   (Followers: 6)
Control and Dynamic Systems     Full-text available via subscription   (Followers: 8)
Control Engineering Practice     Hybrid Journal   (Followers: 41)
Control Theory and Informatics     Open Access   (Followers: 7)
Corrosion Science     Hybrid Journal   (Followers: 24)
CT&F Ciencia, Tecnologia y Futuro     Open Access  
CTheory     Open Access  


Journal: Computers & Geosciences
  [SJR: 1.268]   [H-I: 78]   [25 followers]
  Hybrid journal (may contain Open Access articles)
  ISSN (Print): 0098-3004
  Published by Elsevier
  • JMorph: Software for performing rapid morphometric measurements on digital
           images of fossil assemblages
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Peter G. Lelièvre, Melissa Grey
      Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses, but little is available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and the staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Over the course of several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.

      PubDate: 2017-05-22T09:57:50Z
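The abstract does not detail JMorph's internal algorithms; as an illustration of the kind of measurement a digitized outline supports, the sketch below (an assumption, not JMorph's code) computes area and perimeter from an ordered list of outline vertices using the shoelace formula.

```python
# Illustrative sketch (not JMorph itself): size measurements of a fossil
# sample from a manually digitized outline polygon.
import math

def outline_measurements(points):
    """Compute area (shoelace formula) and perimeter of a closed outline.

    points: list of (x, y) vertices in digitization order.
    """
    n = len(points)
    area2 = 0.0       # twice the signed area
    perimeter = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the outline
        area2 += x1 * y2 - x2 * y1
        perimeter += math.hypot(x2 - x1, y2 - y1)
    return abs(area2) / 2.0, perimeter

# A 2 x 1 rectangle digitized counter-clockwise:
area, perim = outline_measurements([(0, 0), (2, 0), (2, 1), (0, 1)])
print(area, perim)  # 2.0 6.0
```

The same vertex list can feed further shape descriptors (centroid, elongation, Fourier outline analysis) once the outline has been digitized.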
       
  • GPU based contouring method on grid DEM data
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Liheng Tan, Gang Wan, Feng Li, Xiaohui Chen, Wenlong Du
      This paper presents a novel method to generate contour lines from grid DEM data based on the programmable GPU pipeline. Previous contouring approaches often use the CPU to construct a finite element mesh from the raw DEM data and then extract contour segments from the elements. They also need a tracing or sorting strategy to generate the final continuous contours. These approaches can be heavily CPU-intensive and time-consuming, and the generated contours can be unsmooth if the raw data are sparsely distributed. Unlike the CPU approaches, we employ the GPU's vertex shader to generate a triangular mesh with arbitrary user-defined density, in which the height of each vertex is calculated through a third-order Cardinal spline function. Then, in the same frame, segments are extracted from the triangles by the geometry shader and transferred to the CPU side, with an internal order, in the GPU's transform feedback stage. Finally, we propose a "Grid Sorting" algorithm that achieves continuous contour lines by traversing the segments only once. Our method makes use of multiple stages of the GPU pipeline for computation, generates smooth contour lines, and is significantly faster than the previous CPU approaches. The algorithm can be easily implemented with the OpenGL 3.3 API or higher on consumer-level PCs.

      PubDate: 2017-05-22T09:57:50Z
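The per-triangle operation that the geometry shader performs can be sketched on the CPU: given the heights at a triangle's three vertices, find where a contour level crosses its edges by linear interpolation. This is an illustrative reconstruction, not the paper's shader code.

```python
# CPU sketch of the per-triangle step the geometry shader performs:
# extract the segment where a contour level crosses a triangle with
# known vertex heights, by linear interpolation along the edges.
def triangle_contour_segment(verts, heights, level):
    """verts: three (x, y) positions; heights: three scalars.
    Returns the two crossing points of the contour, or None."""
    crossings = []
    for i in range(3):
        j = (i + 1) % 3
        h1, h2 = heights[i], heights[j]
        if (h1 < level) != (h2 < level):        # edge straddles the level
            t = (level - h1) / (h2 - h1)        # interpolation parameter
            x = verts[i][0] + t * (verts[j][0] - verts[i][0])
            y = verts[i][1] + t * (verts[j][1] - verts[i][1])
            crossings.append((x, y))
    return tuple(crossings) if len(crossings) == 2 else None

seg = triangle_contour_segment([(0, 0), (1, 0), (0, 1)], [0.0, 1.0, 1.0], 0.5)
print(seg)  # ((0.5, 0.0), (0.0, 0.5))
```

In the paper's pipeline these per-triangle segments are then ordered into continuous lines by the "Grid Sorting" pass.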
       
  • A new method for geochemical anomaly separation based on the distribution
           patterns of singularity indices
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Yue Liu, Kefa Zhou, Qiuming Cheng
      Singularity analysis is one of the most important models in the fractal/multifractal family and has been demonstrated to be an efficient tool for identifying hybrid distribution patterns in geochemical data, such as normal and multifractal distributions. However, the question of how to appropriately separate these patterns using reasonable thresholds has not been well answered. In the present study, a new method termed singularity-quantile (S-Q) analysis is proposed to separate multiple geochemical anomaly populations by integrating singularity analysis with quantile-quantile plot (QQ-plot) analysis. The new method characterizes the frequency distribution of singularity indices by plotting singularity index quantiles against standard normal quantiles. From the perspective of geochemical element enrichment processes, the distribution patterns of singularity indices can then be separated into three groups, corresponding to element enrichment, element generality and element depletion, respectively. A case study of chromitite exploration based on geochemical data from the western Junggar region (China) was used to examine the potential application of the new method. The results revealed that the method is very sensitive to changes in the singularity indices, resolving three distinct segments when applied to characterize geochemical element enrichment processes. Hence, the S-Q method can be considered an efficient and powerful tool for separating hybrid geochemical anomalies on the basis of their statistical and inherent fractal/multifractal properties.

      PubDate: 2017-05-22T09:57:50Z
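The quantile-quantile pairing at the heart of S-Q analysis can be sketched with the standard library: sort the singularity indices and pair them with standard normal quantiles; breaks in the slope of the resulting plot mark the thresholds between populations. The data values below are made up for illustration.

```python
# Sketch of the QQ-plot pairing underlying S-Q analysis: sorted
# singularity indices vs. standard normal quantiles.
from statistics import NormalDist

def sq_pairs(singularity_indices):
    """Return (normal_quantile, index_quantile) pairs for a QQ-plot."""
    data = sorted(singularity_indices)
    n = len(data)
    nd = NormalDist()
    # Plotting positions (i + 0.5)/n mapped through the inverse normal CDF.
    return [(nd.inv_cdf((i + 0.5) / n), data[i]) for i in range(n)]

# Hypothetical singularity indices; real ones come from a moving-window
# fit of log-measure vs. log-scale over the geochemical map.
pairs = sq_pairs([2.1, 1.9, 2.0, 2.3, 1.7, 2.0, 1.4, 2.6])
# Slope breaks in these pairs would separate the enrichment, generality
# and depletion populations described in the abstract.
```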
       
  • Performance prediction of finite-difference solvers for different computer
           architectures
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Mathias Louboutin, Michael Lange, Felix J. Herrmann, Navjot Kukreja, Gerard Gorman
      The life-cycle of a partial differential equation (PDE) solver is often characterized by three development phases: the development of a stable numerical discretization; the development of a correct (verified) implementation; and the optimization of the implementation for different computer architectures. Often it is only after significant time and effort have been invested that the performance bottlenecks of a PDE solver are fully understood, and the precise details vary between computer architectures. One way to mitigate this issue is to establish a reliable performance model that allows a numerical analyst to predict how well a numerical method will perform on a given computer architecture before embarking upon potentially long and expensive implementation and optimization phases. A reliable performance model also saves developer effort, as it both informs the developer of which optimizations are beneficial and indicates when the maximum expected performance has been reached and optimization work should stop. We show how the discretization of a wave equation can be studied theoretically to understand the performance limitations of the method on modern computer architectures. We focus on the roofline model, now broadly used in the high-performance computing community, which expresses the achievable performance in terms of a computer's peak memory bandwidth and peak floating-point performance with respect to algorithmic choices. A first-principles analysis of operational intensity for key time-stepping finite-difference algorithms is presented. With this information available at the time of algorithm design, the expected performance on target computer systems can be used as a driver for algorithm design.

      PubDate: 2017-05-22T09:57:50Z
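The roofline estimate itself is a one-liner: attainable performance is the minimum of the compute roof and the memory roof (operational intensity times peak bandwidth). The stencil counts and machine numbers below are illustrative assumptions, not figures from the paper.

```python
# Minimal roofline estimate for a finite-difference stencil
# (illustrative numbers, not measurements from the paper).
def roofline_gflops(operational_intensity, peak_gflops, peak_bw_gbs):
    """Attainable performance = min(compute roof, memory roof)."""
    return min(peak_gflops, operational_intensity * peak_bw_gbs)

# A low-order stencil: ~8 flops per grid point, ~4 single-precision
# values (4 bytes each) moved per point.
flops_per_point = 8
bytes_per_point = 4 * 4
oi = flops_per_point / bytes_per_point          # 0.5 flop/byte
print(roofline_gflops(oi, peak_gflops=1000, peak_bw_gbs=100))  # 50.0
```

At 0.5 flop/byte this hypothetical machine is memory-bound: only 50 of its 1000 peak GFLOP/s are attainable, which is exactly the kind of conclusion the paper argues should inform algorithm design up front.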
       
  • A geophone wireless sensor network for investigating glacier stick-slip
           motion
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Kirk Martinez, Jane K. Hart, Philip J. Basford, Graeme M. Bragg, Tyler Ward, David S. Young
      We have developed an innovative passive borehole geophone system, as part of a wireless environmental sensor network, to investigate glacier stick-slip motion. The new geophone nodes use an ARM Cortex-M3 processor with a low-power design capable of running on battery power while embedded in the ice. Only data from seismic events were stored, held temporarily on a micro-SD card until retrieved by systems on the glacier surface connected to the internet. The sampling rates and the detection and filtering levels were determined from a field trial using a standard commercial passive seismic system. The new system was installed on the Skalafellsjökull glacier in Iceland and provided encouraging results. The results showed a relationship between surface melt-water production and seismic events (ice quakes), which followed a pattern related to the melt-water-controlled velocity changes of the glacier (stick-slip motion). Three types of seismic events were identified, interpreted to reflect a pattern of till deformation (Type A), basal sliding (Type B) and hydraulic transience (Type C) associated with stick-slip motion.

      PubDate: 2017-05-17T09:52:55Z
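The abstract does not name the trigger algorithm behind the "detection and filtering levels"; a common choice for picking seismic events from continuous geophone data, sketched here purely as an assumption, is a short-term/long-term average (STA/LTA) ratio trigger.

```python
# Assumed technique (not stated in the paper): STA/LTA event trigger,
# a standard detector for picking seismic events in continuous data.
def sta_lta_trigger(samples, sta_len, lta_len, threshold):
    """Return indices where STA/LTA of |amplitude| exceeds the threshold."""
    triggers = []
    for i in range(lta_len, len(samples)):
        sta = sum(abs(s) for s in samples[i - sta_len:i]) / sta_len
        lta = sum(abs(s) for s in samples[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# Synthetic trace: quiet background with a short high-amplitude event.
quiet = [0.1] * 50
event = [5.0] * 5
print(sta_lta_trigger(quiet + event + quiet, 5, 25, 3.0))
```

Storing only the samples around such triggers is what lets a battery-powered node hold event data on a micro-SD card instead of streaming the full waveform.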
       
  • Gaussian process emulators for quantifying uncertainty in CO2 spreading
           predictions in heterogeneous media
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Liang Tian, Richard Wilkinson, Zhibing Yang, Henry Power, Fritjof Fagerlund, Auli Niemi
      We explore the use of Gaussian process emulators (GPE) in the numerical simulation of CO2 injection into a deep heterogeneous aquifer. The model domain is a two-dimensional, log-normally distributed stochastic permeability field. We first estimate the cumulative distribution functions (CDFs) of the CO2 breakthrough time and the total CO2 mass using a computationally expensive Monte Carlo (MC) simulation. We then show that we can accurately reproduce these CDF estimates with a GPE, at only a small fraction of the computational cost of traditional MC simulation. In order to build a GPE that can predict the simulator output from a permeability field consisting of thousands of values, we use a truncated Karhunen-Loève (K-L) expansion of the permeability field, which enables the application of the Bayesian functional regression approach. We perform a cross-validation exercise to give insight into the optimization of the experiment design for selected scenarios: we find that a training set of a few hundred runs is sufficient and that as few as 15 K-L components are adequate. Our work demonstrates that a GPE with a truncated K-L expansion can be effectively applied to uncertainty analysis in the modelling of multiphase flow and transport processes in heterogeneous media.

      PubDate: 2017-05-17T09:52:55Z
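The truncated K-L step can be sketched with an eigendecomposition of the field's covariance matrix: the leading eigencomponents compress a correlated permeability field into a short coefficient vector that the emulator can regress on. The grid, covariance model and range below are toy assumptions (a 1-D stand-in for the paper's 2-D field).

```python
# Sketch (toy values) of the truncated Karhunen-Loeve step: represent a
# correlated log-permeability field by its leading eigencomponents.
import numpy as np

def kl_truncate(cov, n_components):
    """Return (eigenvalues, eigenvectors) of the leading K-L components."""
    vals, vecs = np.linalg.eigh(cov)        # symmetric covariance matrix
    order = np.argsort(vals)[::-1]          # sort eigenvalues descending
    return vals[order][:n_components], vecs[:, order][:, :n_components]

# Exponential covariance on a 1-D grid standing in for the 2-D field.
x = np.linspace(0.0, 1.0, 50)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.3)
vals, vecs = kl_truncate(cov, 15)           # keep 15 components, as in the study

# One field realization from 15 independent standard normal coefficients:
xi = np.random.default_rng(0).standard_normal(15)
field = vecs @ (np.sqrt(vals) * xi)
print(field.shape)  # (50,)
```

The emulator then maps the 15 coefficients, rather than the full field, to the simulator output.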
       
  • Matching pursuit parallel decomposition of seismic data
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Chuanhui Li, Fanchang Zhang
      In order to improve the computation speed of the matching pursuit decomposition of seismic data, a matching pursuit parallel algorithm is designed in this paper. In every iteration we pick a fixed number of envelope peaks from the current signal, according to the number of compute nodes, and distribute them evenly across the nodes to search for the optimal Morlet wavelets in parallel. With the help of parallel computer systems and the Message Passing Interface, the parallel algorithm fully exploits the advantages of parallel computing, significantly improving the computation speed of the matching pursuit decomposition, and also has good scalability. Moreover, having each compute node search for only one optimal Morlet wavelet per iteration is the most efficient implementation.

      PubDate: 2017-05-17T09:52:55Z
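The kernel each compute node executes is one greedy matching-pursuit step: correlate the residual signal against candidate atoms, keep the best one, and subtract it. The serial sketch below uses plain sinusoids instead of Morlet wavelets, purely for illustration.

```python
# Serial sketch of one matching-pursuit iteration; in the paper's scheme
# each MPI compute node runs this search over its assigned candidates.
import numpy as np

def mp_step(signal, dictionary):
    """Pick the dictionary atom best correlated with the signal, subtract it.

    dictionary: 2-D array with one unit-norm atom per row."""
    corr = dictionary @ signal                 # correlation with every atom
    best = int(np.argmax(np.abs(corr)))
    residual = signal - corr[best] * dictionary[best]
    return best, corr[best], residual

# Toy dictionary of unit-norm sinusoids standing in for Morlet wavelets.
t = np.linspace(0, 1, 128)
atoms = np.array([np.sin(2 * np.pi * f * t) for f in (3, 7, 11)])
atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)
sig = 2.0 * atoms[1] + 0.1 * atoms[2]
best, coeff, res = mp_step(sig, atoms)
print(best)  # 1
```

Iterating this step on the residual yields the full decomposition; the parallel algorithm distributes the candidate search, not the subtraction.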
       
  • Software for determining the direction of movement, shear and normal
           stresses of a fault under a determined stress state
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Alejandra Álvarez del Castillo, Susana Alicia Alaniz-Álvarez, Angel Francisco Nieto-Samaniego, Shunshan Xu, Gil Humberto Ochoa-González, Luis Germán Velasquillo-Martínez
      In the oil, gas and geothermal industries, the extraction or injection of fluids induces changes in the stress field of the reservoir. If the in-situ stress state of a fault plane is sufficiently disturbed, the fault may slip and trigger fluid leakage, or the reservoir might fracture and become damaged. The goal of the SSLIPO 1.0 software is to provide data that can reduce the risk of compromising wellbore stability. The input data are the magnitudes of the three principal stresses and their orientations in geographic coordinates. The output data are the slip direction of a fracture in geographic coordinates and its normal (σn) and shear (τ) stresses resolved on a single fracture plane or on multiple fracture planes. With this information, it is possible to calculate the slip tendency (τ/σn) and the propensity of a fracture to open, which is inversely proportional to σn. The software can analyze any compressional stress system, even a non-Andersonian one. An example is given from an oilfield in southern Mexico, in a region that contains fractures formed in three deformation events; SSLIPO 1.0 was used to determine in which deformation event the oil migrated. SSLIPO 1.0 is an open-code application developed in MATLAB. The source code and SSLIPO 1.0 can be downloaded from http://www.geociencias.unam.mx/~alaniz/main_code.txt and http://www.geociencias.unam.mx/~alaniz/SSLIPO_pkg.exe.

      PubDate: 2017-05-17T09:52:55Z
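The core stress resolution the abstract describes is standard Cauchy mechanics: project the principal stress tensor onto the plane normal to obtain σn, take the in-plane remainder of the traction as τ, and form the slip tendency τ/σn. The sketch below uses illustrative stress values, not data from the paper.

```python
# Sketch of the stress resolution SSLIPO performs (illustrative values):
# normal stress, shear stress and slip tendency on a fault plane.
import numpy as np

def resolve_on_plane(principal_stresses, normal):
    """principal_stresses: (s1, s2, s3) along the coordinate axes.
    normal: unit normal of the fault plane in the same frame."""
    sigma = np.diag(principal_stresses)
    traction = sigma @ normal                  # traction vector on the plane
    sigma_n = float(normal @ traction)         # normal stress
    shear_vec = traction - sigma_n * normal    # in-plane (shear) component
    tau = float(np.linalg.norm(shear_vec))
    return sigma_n, tau, tau / sigma_n         # slip tendency = tau / sigma_n

n = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)    # plane at 45 deg to s1
sigma_n, tau, ts = resolve_on_plane((60.0, 40.0, 20.0), n)
print(round(sigma_n, 1), round(tau, 1), round(ts, 2))  # 50.0 10.0 0.2
```

The shear vector's direction (before taking its norm) is the slip direction the software reports in geographic coordinates.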
       
  • RINGMesh: A programming library for developing mesh-based geomodeling
           applications
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Jeanne Pellerin, Arnaud Botella, François Bonneau, Antoine Mazuyer, Benjamin Chauvin, Bruno Lévy, Guillaume Caumon
      RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor a meshing software package. RINGMesh implements functionality to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined either by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows the development of new geomodeling methods and the integration of external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.

      PubDate: 2017-05-17T09:52:55Z
       
  • PyRQA—Conducting recurrence quantification analysis on very long
           time series efficiently
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Tobias Rawald, Mike Sips, Norbert Marwan
      PyRQA is a software package that efficiently conducts recurrence quantification analysis (RQA) on time series consisting of more than one million data points. RQA is a method from non-linear time series analysis that quantifies the recurrent behaviour of systems. Existing implementations of RQA either cannot analyse such very long time series at all or require large amounts of time to calculate the quantitative measures. PyRQA overcomes their limitations by conducting the RQA computations in a highly parallel manner. Building on the OpenCL framework, PyRQA leverages the computing capabilities of a variety of parallel hardware architectures, such as GPUs. The underlying computing approach partitions the RQA computations, making it possible to employ multiple compute devices at the same time. The goal of this publication is to demonstrate the features and runtime efficiency of PyRQA. For this purpose we employ a real-world example, comparing the dynamics of two climatological time series, and a synthetic example, in which the runtime for analysing a series of over one million data points is reduced from almost eight hours using state-of-the-art RQA software to roughly 69 seconds using PyRQA.

      PubDate: 2017-05-17T09:52:55Z
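The computation PyRQA parallelizes can be shown serially on a small series: build the recurrence matrix (which pairs of points lie within a radius of each other) and derive quantitative measures from it. The sketch below computes only the simplest measure, the recurrence rate, and is not PyRQA's API.

```python
# Small serial sketch of the computation PyRQA partitions across
# OpenCL devices: recurrence matrix and recurrence rate.
import numpy as np

def recurrence_rate(series, radius):
    """Fraction of point pairs closer than `radius` (the recurrence rate)."""
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])     # pairwise distances
    recur = dist < radius                      # recurrence matrix
    return recur.mean()

rr = recurrence_rate(np.sin(np.linspace(0, 8 * np.pi, 200)), 0.1)
print(0.0 < rr < 1.0)  # True
```

The O(n²) pairwise matrix is exactly why million-point series overwhelm serial tools: PyRQA's contribution is partitioning this matrix across parallel compute devices.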
       
  • Extending R packages to support 64-bit compiled code: An illustration with
           spam64 and GIMMS NDVI3g data
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Florian Gerber, Kaspar Mösinger, Reinhard Furrer
      Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user-interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is—besides performance—the limited access to low level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.

      PubDate: 2017-05-17T09:52:55Z
       
  • Correlation confidence limits for unevenly sampled data
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Jason Roberts, Mark Curran, Samuel Poynter, Andrew Moy, Tas van Ommen, Tessa Vance, Carly Tozer, Felicity S. Graham, Duncan A. Young, Christopher Plummer, Joel Pedro, Donald Blankenship, Martin Siegert
      Estimating correlation, with appropriate uncertainty limits, for scientific data that are potentially serially correlated is a common problem, made especially challenging when the data are sampled unevenly in space and/or time. Here we present a new, robust method for estimating correlation with uncertainty limits between autocorrelated series that requires neither resampling nor interpolation of the data. The technique employs the Gaussian kernel method with a bootstrap resampling approach to derive the probability density function and the resulting uncertainties. The method is validated using an example from radar geophysics: autocorrelation and error bounds are estimated for an airborne radio-echo profile of ice sheet thickness, and the computed limits are robust when withholding 10%, 20%, and 50% of the data. As a further example, the method is applied to two time series of methanesulphonic acid in Antarctic ice cores from different sites. We show how the method allows evaluation of the significance of correlation where the signal-to-noise ratio is low, and reveals that the two ice cores exhibit a significant common signal.
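The general idea of bootstrapping a correlation estimate can be sketched as follows. This is a plain pairwise bootstrap, not the authors' Gaussian-kernel method for unevenly sampled, autocorrelated data; it only shows how resampling yields a confidence interval around a correlation coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_corr_ci(x, y, n_boot=2000, alpha=0.05):
    """Bootstrap the Pearson correlation of paired samples and return the
    point estimate with a (1 - alpha) percentile interval."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    reps = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)               # resample pairs with replacement
        reps[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    lo, hi = np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return np.corrcoef(x, y)[0, 1], (lo, hi)

x = rng.normal(size=300)
y = 0.7 * x + rng.normal(scale=0.5, size=300)     # correlated toy series
r, (lo, hi) = bootstrap_corr_ci(x, y)
```

For serially correlated series, naive pairwise resampling like this understates the uncertainty, which is the gap the paper's kernel-based approach addresses.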

      PubDate: 2017-05-17T09:52:55Z
       
  • Visual exploration and analysis of ionospheric scintillation monitoring
           data: The ISMR Query Tool
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Bruno César Vani, Milton Hirokazu Shimabukuro, João Francisco Galera Monico
      Ionospheric scintillations are rapid variations in the phase and/or amplitude of a radio signal as it passes through ionospheric plasma irregularities. The ionosphere is a layer of the Earth's atmosphere located approximately between 50km and 1000km above the Earth's surface. As Global Navigation Satellite Systems (GNSS) – such as GPS, Galileo, BDS and GLONASS – use radio signals, these variations degrade their positioning service quality. Due to its location, Brazil is one of the places most affected by scintillation in the world. For that reason, ionosphere monitoring stations have been deployed over Brazilian territory since 2011 through cooperative projects between several institutions in Europe and Brazil. These monitoring stations form a network that generates a large amount of monitoring data every day. GNSS receivers deployed at the stations – named Ionospheric Scintillation Monitor Receivers (ISMR) – provide scintillation indices and related signal metrics for the available satellites dedicated to satellite-based navigation and positioning services. With this monitoring infrastructure, more than ten million observation values are generated and stored every day. Extracting relevant information from this huge amount of data is a demanding process that requires expertise from both computer science and the geosciences. This paper describes the concepts, design and implementation aspects of the software that has been supporting research on ISMR data – the so-called ISMR Query Tool. Usability and other aspects are also presented through application examples. This web-based software has been designed and developed to enable insights into the huge amount of ISMR data fetched every day on an integrated platform. The software applies and adapts time series mining and information visualization techniques to extend the possibilities of exploring and analyzing ISMR data. The software is available to the scientific community through the World Wide Web, therefore constituting an analysis infrastructure that complements the monitoring one and supports research on ionospheric scintillation in the GNSS context. Interested researchers can access the functionalities without cost at http://is-cigala-calibra.fct.unesp.br/, upon online request to the Space Geodesy Study Group of UNESP – Univ Estadual Paulista at Presidente Prudente.

      PubDate: 2017-05-17T09:52:55Z
       
  • SwathProfiler and NProfiler: Two new ArcGIS Add-ins for the automatic
           extraction of swath and normalized river profiles
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): J.V. Pérez-Peña, M. Al-Awabdeh, J.M. Azañón, J.P. Galve, G. Booth-Rea, D. Notti
      The great present-day availability of high-resolution Digital Elevation Models has improved tectonic geomorphology analyses in both their methodological aspects and their geological meaning. Analyses based on topographic profiles are valuable for exploring the short- and long-term landscape response to tectonic activity and climate changes, and swath and river longitudinal profiles are two of the most used analyses for this purpose. Most of these morphometric analyses are conducted in GIS software, which has become a standard tool for analyzing drainage network metrics. In this work we present two ArcGIS Add-Ins that automatically delineate swath and normalized river profiles. Both tools are programmed in Visual Basic .NET and use the ArcObjects library architecture to directly access vector and raster data. The SwathProfiler Add-In allows analyzing the topography within a swath or band by representing maximum, minimum and mean elevations, first and third quartiles, local relief and hypsometry. We have defined a new transverse hypsometric integral index (THi) that analyzes hypsometry along the swath and offers valuable information in this kind of graphic. The NProfiler Add-In allows representing longitudinal normalized river profiles and their related morphometric indices, such as normalized concavity (CT), maximum concavity (Cmax) and length of maximum concavity (Lmax). Both tools facilitate the spatial analysis of topography and drainage networks directly in a GIS environment such as ArcMap and provide graphical outputs. To illustrate how these tools work, we analyzed two study areas: the Sierra Alhamilla mountain range (Betic Cordillera, SE Spain) and the eastern margin of the Dead Sea (Jordan). The first study area has recently been studied from a morphotectonic perspective, and these new tools add value to the previous studies. The second study area has not been analyzed by quantitative tectonic geomorphology, and the results suggest a landscape in a transient state due to a continuous base-level fall produced by the formation of the Dead Sea basin.

      PubDate: 2017-05-17T09:52:55Z
       
  • Antarctic Mapping Tools for Matlab
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Chad A. Greene, David E. Gwyther, Donald D. Blankenship
      We present the Antarctic Mapping Tools package, an open-source Matlab toolbox for analysis and plotting of Antarctic geospatial datasets. This toolbox is designed to streamline scientific workflow and maximize repeatability through functions which allow fully scripted data analysis and mapping. Data access is facilitated by several dataset-specific plugins which are freely available online. An open architecture has been chosen to encourage users to develop and share plugins for future Antarctic geospatial datasets. This toolbox includes functions for coordinate transformations, flight line or ship track analysis, and data mapping in georeferenced or projected coordinates. Each function is thoroughly documented with clear descriptions of function syntax alongside examples of data analysis or display using Antarctic geospatial data. The Antarctic Mapping Tools package is designed for ease of use and allows users to perform each step of data processing including raw data import, data analysis, and creation of publication-quality maps, wholly within the numerical environment of Matlab.

      PubDate: 2017-05-17T09:52:55Z
       
  • Slicken 1.0: Program for calculating the orientation of shear on
           reactivated faults
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Hong Xu, Shunshan Xu, Ángel F. Nieto-Samaniego, Susana A. Alaniz-Álvarez
      The slip vector on a fault is an important parameter in the study of the movement history of a fault and its faulting mechanism. Although many graphical programs exist to represent shear stress (or slickenline) orientations on faults, programs that quantitatively calculate the orientation of fault slip under a given stress field are scarce. We therefore developed Slicken 1.0, a software program to rapidly calculate the orientation of maximum shear stress on any fault plane. For this direct method of calculating the resolved shear stress on a planar surface, the input data are the unit vector normal to the involved plane, the unit vectors of the three principal stress axes, and the stress ratio. The advantage of this program is that vertical or horizontal principal stresses are not necessarily required. Owing to its lightweight design in Java SE 8, it runs on most operating systems with the corresponding Java VM. The program will be practical for geoscience students, geologists and engineers, and helps fill a gap in field, structural and engineering geology.
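The underlying mechanics can be sketched directly: the traction on the plane is the stress tensor applied to the plane normal, and the maximum shear direction is the traction's tangential component. A minimal NumPy sketch follows (Slicken 1.0 itself is written in Java; the stress values and plane orientation below are illustrative):

```python
import numpy as np

def resolved_shear(sigma, n):
    """Direction and magnitude of the maximum shear stress on a plane:
    traction = sigma @ n; subtracting its normal component leaves the
    tangential (shear) part, whose direction is the predicted slip line."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    t = sigma @ n                    # traction vector on the plane
    shear = t - (t @ n) * n          # remove the plane-normal component
    mag = np.linalg.norm(shear)
    return shear / mag, mag

# Illustrative stress field with principal axes along x, y, z (MPa)
sigma = np.diag([30.0, 20.0, 10.0])            # sigma1 > sigma2 > sigma3
n = np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0)   # plane at 45 deg to sigma1
direction, tau = resolved_shear(sigma, n)      # tau is about 10 MPa here
```

Because the stress tensor can be assembled from arbitrary principal axis orientations, no principal stress needs to be vertical or horizontal, which mirrors the flexibility the abstract highlights.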

      PubDate: 2017-05-17T09:52:55Z
       
  • Spatio-ecological complexity measures in GRASS GIS
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Duccio Rocchini, Vaclav Petras, Anna Petrasova, Yann Chemin, Carlo Ricotta, Alessandro Frigeri, Martin Landa, Matteo Marcantonio, Lucy Bastin, Markus Metz, Luca Delucchi, Markus Neteler
      Good estimates of ecosystem complexity are essential for a number of ecological tasks: from biodiversity estimation, to forest structure variable retrieval, to feature extraction by edge detection and the generation of multifractal surfaces as neutral models for, e.g., feature change assessment. Hence, measuring ecological complexity over space becomes crucial in macroecology and geography. Many geospatial tools have been advocated in spatial ecology to estimate ecosystem complexity and its changes over space and time. Among these tools, free and open source options especially offer opportunities to guarantee the robustness of algorithms and reproducibility. In this paper we summarize the most straightforward measures of spatial complexity available in the Free and Open Source Software GRASS GIS, relating them to key ecological patterns and processes.

      PubDate: 2017-05-17T09:52:55Z
       
  • Bayesian inference of spectral induced polarization parameters for
           laboratory complex resistivity measurements of rocks and soils
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Charles L. Bérubé, Michel Chouteau, Pejman Shamsipour, Randolph J. Enkin, Gema R. Olivo
      Spectral induced polarization (SIP) measurements are now widely used to infer mineralogical or hydrogeological properties from the low-frequency electrical properties of the subsurface in both mineral exploration and environmental sciences. We present an open-source program that performs fast multi-model inversion of laboratory complex resistivity measurements using Markov chain Monte Carlo simulation. Using this stochastic method, SIP parameters and their uncertainties may be obtained from the Cole-Cole and Dias models, or from the Debye and Warburg decomposition approaches. The program is tested on synthetic and laboratory data to show that the posterior distribution of a multiple Cole-Cole model is multimodal in particular cases, whereas the Warburg and Debye decomposition approaches yield unique solutions in all cases. It is shown that an adaptive Metropolis algorithm performs faster and is less dependent on the initial parameter values than the Metropolis-Hastings step method when inverting SIP data through the decomposition schemes; there is no advantage in using an adaptive step method for well-defined Cole-Cole inversion. Finally, the influence of measurement noise on the recovered relaxation time distribution is explored. We provide the geophysics community with an open-source platform that can serve as a base for further developments in stochastic SIP data inversion and that may be used to perform parameter analysis with various SIP models.
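The Markov chain Monte Carlo machinery referred to here can be illustrated with a plain, non-adaptive Metropolis sampler on a toy one-parameter posterior. The program in the abstract fits full Cole-Cole/Debye/Warburg models with an adaptive step; this sketch makes no attempt at that and every name in it is illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def metropolis(log_post, x0, step, n=5000):
    """Plain Metropolis sampling with a Gaussian random-walk proposal;
    the adaptive variant discussed in the abstract would tune `step`."""
    x, lp = x0, log_post(x0)
    chain = np.empty(n)
    for i in range(n):
        cand = x + rng.normal(scale=step)
        lp_cand = log_post(cand)
        if np.log(rng.uniform()) < lp_cand - lp:   # accept/reject step
            x, lp = cand, lp_cand
        chain[i] = x
    return chain

# Toy unimodal target: Gaussian "posterior" centred on parameter value 2.5
log_post = lambda p: -0.5 * ((p - 2.5) / 0.3) ** 2
chain = metropolis(log_post, x0=0.0, step=0.5)
est = chain[1000:].mean()   # posterior-mean estimate after burn-in
```

With a multimodal posterior, as reported for the multiple Cole-Cole model, a single random-walk chain like this can get stuck in one mode, which motivates the paper's comparison of step methods.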

      PubDate: 2017-05-06T21:34:07Z
       
  • Fractal generator for efficient production of random planar patterns and
           symbols in digital mapping
    • Abstract: Publication date: Available online 5 May 2017
      Source:Computers & Geosciences
      Author(s): Qiyu Chen, Gang Liu, Xiaogang Ma, Xinchuan Li, Zhenwen He
      In digital cartography, the automatic generation of random planar patterns and symbols is still an ongoing challenge. Such random patterns and symbols have randomly varied configurations and boundaries, and their generating algorithms are constrained by shape features, cartographic standards and many other conditions. Fractal geometry offers favorable solutions to simulate random boundaries and patterns. In the work presented in this paper, we used both fractal theory and random Iterated Function Systems (IFS) to develop a method for the automatic generation of random planar patterns and symbols. The marshland and the trough cross-bedding patterns were used as two case studies for the implementation of the method. We first analyzed the morphological characteristics of those two planar patterns. Then we designed algorithms and implementation schemes addressing the features of each pattern. Finally, we ran the algorithms to generate the patterns and symbols, and compared them with the requirements of a few digital cartographic standards. The method presented in this paper has already been deployed in a digital mapping system for practical uses. The flexibility of the method also allows it to be reused and/or adapted in various software platforms for digital mapping.
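The IFS idea can be sketched with the classic chaos game. The paper's marshland and trough cross-bedding symbols rely on purpose-designed map sets; the three contractions below simply generate a Sierpinski triangle as a stand-in example of an attractor drawn by random map iteration:

```python
import numpy as np

rng = np.random.default_rng(4)

def ifs_points(maps, probs, n=20000):
    """Chaos-game rendering of an IFS: repeatedly apply a randomly chosen
    affine map p -> A @ p + b and record the visited points."""
    p = np.zeros(2)
    pts = np.empty((n, 2))
    for i in range(n):
        A, b = maps[rng.choice(len(maps), p=probs)]
        p = A @ p + b
        pts[i] = p
    return pts

# Sierpinski triangle: three half-scale contractions toward the corners
half = np.eye(2) * 0.5
maps = [(half, np.array([0.0, 0.0])),
        (half, np.array([0.5, 0.0])),
        (half, np.array([0.25, 0.5]))]
pts = ifs_points(maps, [1 / 3] * 3)
```

Swapping in different affine maps and selection probabilities changes the attractor, which is the degree of freedom a cartographic symbol generator exploits to match pattern shape constraints.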

      PubDate: 2017-05-06T21:34:07Z
       
  • Large Crater Clustering Tool
    • Abstract: Publication date: Available online 3 May 2017
      Source:Computers & Geosciences
      Author(s): Jason Laura, James A. Skinner Jr., Marc A. Hunter
      In this paper we present the Large Crater Clustering (LCC) tool set, an ArcGIS plugin that supports the quantitative approximation of a primary impact location from user-identified locations of possible secondary impact craters or the long axes of clustered secondary craters. The identification of primary impact craters directly supports planetary geologic mapping and topical science studies where the chronostratigraphic age of some geologic units may be known, but more distant features have questionable geologic ages. Previous works (e.g., McEwen et al., 2005; Dundas and McEwen, 2007) have shown that the location of a primary impact can be estimated from its secondary craters. This work adapts those methods into a statistically robust tool set. We describe the four individual tools within the LCC tool set, which support: (1) processing individually digitized point observations (craters), (2) estimating the directional distribution of a clustered set of craters and back projecting the potential flight paths (crater clusters or linearly approximated catenae or lineaments), (3) intersecting projected paths, and (4) intersecting back-projected trajectories to approximate the location of potential source primary craters. We present two case studies using secondary impact features mapped in two regions of Mars. We demonstrate that the tool set is able to quantitatively identify primary impacts and supports the improved qualitative interpretation of potential secondary crater flight trajectories.

      PubDate: 2017-05-06T21:34:07Z
       
  • Reconstructing daily clear-sky land surface temperature for cloudy regions
           from MODIS data
    • Abstract: Publication date: Available online 29 April 2017
      Source:Computers & Geosciences
      Author(s): Liang Sun, Zhongxin Chen, Feng Gao, Martha Anderson, Lisheng Song, Limin Wang, Bo Hu, Yun Yang
      Land surface temperature (LST) is a critical parameter in environmental studies and resource management. The MODIS LST data product has been widely used in various studies, such as drought monitoring, evapotranspiration mapping, soil moisture estimation and forest fire detection. However, cloud contamination affects thermal band observations and leads to inconsistent LST results. In this study, we present a new Remotely Sensed DAily land Surface Temperature reconstruction (RSDAST) model that recovers clear-sky LST for pixels covered by cloud using only clear-sky neighboring pixels from nearby dates. The reconstructed LST was validated against the original LST pixels. The model shows high accuracy for reconstructing one masked pixel, with R2 of 0.995, bias of −0.02K and RMSE of 0.51K. Extended spatial reconstruction results show better accuracy for flat areas, with R2 of 0.72–0.89, bias of −0.02–0.21K, and RMSE of 0.92–1.16K, than for mountain areas, with R2 of 0.81–0.89, bias of −0.35 to −1.52K, and RMSE of 1.42–2.24K. The reconstructed areas show spatial and temporal patterns that are consistent with the clear neighboring areas. In the reconstructed LST/NDVI triangle feature space, which is controlled by soil moisture, LST values are distributed reasonably and correspond well to real soil moisture conditions. Our approach shows great potential for reconstructing clear-sky LST under cloudy conditions and provides consistent daily LST, which is critical for daily drought monitoring.

      PubDate: 2017-04-30T21:05:50Z
       
  • StackSplit - a plugin for multi-event shear wave splitting analyses in
           SplitLab
    • Abstract: Publication date: Available online 29 April 2017
      Source:Computers & Geosciences
      Author(s): Michael Grund
      SplitLab is a powerful and widely used tool for analysing seismological shear wave splitting from single event measurements. However, in many cases, especially for temporary station deployments close to the noisy seaside or on the ocean bottom, or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. In order to extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows the user to process individual multi-event splitting measurements for a single recording station with great flexibility. Beyond the provided functions of the plugin, no external program is needed for the multi-event analyses, since StackSplit operates within the available SplitLab structure, which is based on MATLAB. The effectiveness and use of this plugin is demonstrated with data examples from a long-running seismological recording station in Finland.

      PubDate: 2017-04-30T21:05:50Z
       
  • A 3D forward stratigraphic model of fluvial meander-bend evolution for
           prediction of point-bar lithofacies architecture
    • Abstract: Publication date: Available online 28 April 2017
      Source:Computers & Geosciences
      Author(s): Na Yan, Nigel P. Mountney, Luca Colombera, Robert M. Dorrell
      Although fundamental types of fluvial meander-bend transformations – expansion, translation, rotation, and combinations thereof – are widely recognised, the relationship between the migratory behaviour of a meander bend, and its resultant accumulated sedimentary architecture and lithofacies distribution remains relatively poorly understood. Three-dimensional data from both currently active fluvial systems and from ancient preserved successions known from outcrop and subsurface settings are limited. To tackle this problem, a 3D numerical forward stratigraphic model – the Point-Bar Sedimentary Architecture Numerical Deduction (PB-SAND) – has been devised as a tool for the reconstruction and prediction of the complex spatio-temporal migratory evolution of fluvial meanders, their generated bar forms and the associated lithofacies distributions that accumulate as heterogeneous fluvial successions. PB-SAND uses a dominantly geometric modelling approach supplemented by process-based and stochastic model components, and is constrained by quantified sedimentological data derived from modern point bars or ancient successions that represent suitable analogues. The model predicts the internal architecture and geometry of fluvial point-bar elements in three dimensions. The model is applied to predict the sedimentary lithofacies architecture of ancient preserved point-bar and counter-point-bar deposits of the middle Jurassic Scalby Formation (North Yorkshire, UK) to demonstrate the predictive capabilities of PB-SAND in modelling 3D architectures of different types of meander-bend transformations. PB-SAND serves as a practical tool with which to predict heterogeneity in subsurface hydrocarbon reservoirs and water aquifers.

      PubDate: 2017-04-30T21:05:50Z
       
  • Fast hyperbolic Radon transform represented as convolutions in log-polar
           coordinates
    • Abstract: Publication date: Available online 28 April 2017
      Source:Computers & Geosciences
      Author(s): Viktor V. Nikitin, Fredrik Andersson, Marcus Carlsson, Anton A. Duchkov
      The hyperbolic Radon transform is a commonly used tool in seismic processing, for instance in seismic velocity analysis, data interpolation and multiple removal. A direct implementation by summation of traces with different moveouts is computationally expensive for large data sets. In this paper we present a new method for fast computation of the hyperbolic Radon transform. It is based on a log-polar sampling with which the main computational parts reduce to computing convolutions, allowing fast implementations by means of the FFT. In addition to the FFT operations, interpolation procedures are required for switching between coordinates in the time-offset, Radon, and log-polar domains. Graphics Processing Units (GPUs) are a suitable computational platform for this purpose, due to their hardware-supported interpolation routines as well as optimized FFT routines. Performance tests show large speed-ups for the proposed algorithm. Hence, it is suitable for use in iterative methods, and we provide examples of data interpolation and multiple removal using this approach.
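For contrast with the fast log-polar/FFT formulation, the direct (slow) hyperbolic Radon summation that it replaces can be sketched as follows. The grids and the nearest-sample time lookup are illustrative choices, not the paper's implementation:

```python
import numpy as np

def hyperbolic_radon(data, t, x, tau, v):
    """Direct hyperbolic Radon transform by summation over offsets:
    m(tau, v) = sum_x d(sqrt(tau^2 + (x / v)^2), x), with nearest-sample
    lookup in time. This brute-force loop is what the log-polar/FFT
    formulation accelerates."""
    dt = t[1] - t[0]
    m = np.zeros((len(tau), len(v)))
    for i, t0 in enumerate(tau):
        for j, vel in enumerate(v):
            th = np.sqrt(t0 ** 2 + (x / vel) ** 2)      # hyperbolic moveout
            k = np.round((th - t[0]) / dt).astype(int)  # nearest time sample
            ok = k < len(t)
            m[i, j] = data[k[ok], np.nonzero(ok)[0]].sum()
    return m

# Synthetic gather: a single hyperbolic event at tau0 = 0.4 s, v0 = 2000 m/s
t = np.linspace(0.0, 1.0, 101)
x = np.linspace(0.0, 1000.0, 21)
dt = t[1] - t[0]
data = np.zeros((len(t), len(x)))
t0, v0 = 0.4, 2000.0
for j, xx in enumerate(x):
    data[int(np.round(np.sqrt(t0 ** 2 + (xx / v0) ** 2) / dt)), j] = 1.0

tau = np.array([0.3, 0.4, 0.5])
v = np.array([1500.0, 2000.0, 2500.0])
m = hyperbolic_radon(data, t, x, tau, v)   # energy focuses at (tau0, v0)
```

The nested loops over intercept times and velocities make the direct transform scale poorly with data size, which is the motivation for recasting the summation as convolutions in log-polar coordinates.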

      PubDate: 2017-04-30T21:05:50Z
       
  • An efficient regularization method for a large scale ill-posed geothermal
           problem
    • Abstract: Publication date: Available online 27 April 2017
      Source:Computers & Geosciences
      Author(s): Fredrik Berntsson, Chen Lin, Tao Xu, Dennis Wokiyi
      The inverse geothermal problem consists of estimating the temperature distribution below the Earth's surface using measurements on the surface. The problem is important since temperature governs a variety of geologic processes, including the generation of magmas and the deformation style of rocks. Since the thermal properties of rocks depend strongly on temperature, the problem is non-linear. The problem is formulated as an ill-posed operator equation, where the right-hand side is the heat flux at the surface level. Since the problem is ill-posed, regularization is needed. In this study we demonstrate that Tikhonov regularization can be implemented efficiently for solving the operator equation. The algorithm is based on having a code for solving a well-posed problem related to the above-mentioned operator, and is designed in such a way that it can deal with both 2D and 3D calculations. Numerical results, for 2D domains, show that the algorithm works well and that the inverse problem can be solved accurately with a realistic noise level in the surface data.
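The Tikhonov step itself can be sketched in a few lines for a linear toy operator. The paper's operator is the non-linear geothermal forward problem, which this sketch does not model; the ill-conditioned matrix, noise level and damping parameter below are all illustrative:

```python
import numpy as np

def tikhonov(A, b, lam):
    """Solve min ||A x - b||^2 + lam^2 ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ b)

rng = np.random.default_rng(1)
# A mildly ill-conditioned forward operator (monomial basis on [0, 1])
A = np.vander(np.linspace(0.0, 1.0, 40), 8, increasing=True)
x_true = rng.normal(size=8)
b = A @ x_true + rng.normal(scale=1e-3, size=40)   # noisy "surface" data
x_reg = tikhonov(A, b, lam=1e-3)
```

The damping term trades a small increase in data misfit for stability against the noise, which is exactly the role regularization plays in the ill-posed heat-flux equation of the abstract.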

      PubDate: 2017-04-30T21:05:50Z
       
  • Computational time reduction for sequential batch solutions in GNSS
           precise point positioning technique
    • Abstract: Publication date: Available online 27 April 2017
      Source:Computers & Geosciences
      Author(s): Angel Martín, Ana Belén Anquela Julián, Alejandro Dimas-Pages, Fernando Cos-Gayón
      Precise point positioning (PPP) is a well-established Global Navigation Satellite System (GNSS) technique that only requires information from the receiver (or rover) to obtain high-precision position coordinates. This is a very interesting and promising technique because it eliminates the need for a reference station near the rover receiver, or for a network of reference stations, thus reducing the cost of a GNSS survey. From a computational perspective, there are two ways to solve the system of observation equations produced by static PPP: in a single step (so-called batch adjustment) or with a sequential adjustment/filter. The results of each should be the same if both are well implemented. However, if a sequential solution is needed (that is, not only the final coordinates, but also those of previous GNSS epochs), as in convergence studies, computing a batch solution becomes very time consuming owing to the matrix inversions that accumulate with each consecutive epoch. This is not a problem for the filter solution, which uses information computed in the previous epoch to obtain the solution for the current epoch. Filter implementations, however, need extra consideration of user dynamics and parameter state variations between observation epochs, with appropriate stochastic updates of the parameter variances from epoch to epoch. These filtering considerations are not needed in batch adjustment, which makes it attractive. The main objective of this research is to significantly reduce the computation time required to obtain sequential results using batch adjustment. The new method we implemented in the adjustment process led to a mean reduction in computational time of 45%.
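The cost issue can be illustrated on a linear toy adjustment: re-solving the full batch at every epoch versus accumulating the normal equations incrementally, which yields identical per-epoch solutions. This is a generic least-squares sketch under made-up data, not the authors' method:

```python
import numpy as np

def batch_solutions(A, y):
    """Naive sequential batch: re-solve the full least-squares problem at
    every epoch, so the cost of each solve grows over the session."""
    return [np.linalg.lstsq(A[: k + 1], y[: k + 1], rcond=None)[0]
            for k in range(A.shape[1] - 1, A.shape[0])]

def incremental_solutions(A, y):
    """Accumulate the normal equations instead: each epoch adds one
    rank-one term, and only a small n x n system is solved per epoch."""
    n = A.shape[1]
    N, u, out = np.zeros((n, n)), np.zeros(n), []
    for k in range(A.shape[0]):
        a = A[k]
        N += np.outer(a, a)   # update normal matrix
        u += a * y[k]         # update right-hand side
        if k >= n - 1:
            out.append(np.linalg.solve(N, u))
    return out

rng = np.random.default_rng(3)
A = rng.normal(size=(50, 4))                      # 50 epochs, 4 parameters
y = A @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=0.01, size=50)
sols_batch = batch_solutions(A, y)
sols_inc = incremental_solutions(A, y)
```

Both routines return the same coordinate history epoch by epoch; the incremental one simply avoids repeating work already done, which is the flavour of saving the paper quantifies at 45%.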

      PubDate: 2017-04-30T21:05:50Z
       
  • Ensemble predictive model for more accurate soil organic carbon
           spectroscopic estimation
    • Abstract: Publication date: Available online 26 April 2017
      Source:Computers & Geosciences
      Author(s): Radim Vašát, Radka Kodešová, Luboš Borůvka
      A myriad of signal pre-processing strategies and multivariate calibration techniques have been explored over the last few decades in an attempt to improve the spectroscopic prediction of soil organic carbon (SOC). Coming up with a novel, more powerful, and more accurate predictive approach has therefore become a challenging task. One promising route is to combine several individual predictions into a single final one (following ensemble learning theory). As this approach performs best when it combines inherently different predictive algorithms calibrated with structurally different predictor variables, we tested predictors of two kinds: 1) reflectance values (or transforms thereof) at each wavelength and 2) absorption feature parameters. We then applied four different calibration techniques, two per type of predictor: a) partial least squares regression and support vector machines for type 1, and b) multiple linear regression and random forest for type 2. The weights assigned to the individual predictions within the ensemble model (constructed as a weighted average) were determined by an automated procedure that selects the best solution among all possible combinations. The approach was tested on soil samples taken from the surface horizon at four sites differing in the prevailing soil units. Employing the ensemble predictive model improved the prediction accuracy of SOC at all four sites. The coefficient of determination in cross-validation (R2cv) increased from 0.849, 0.611, 0.811 and 0.644 (the best individual predictions) to 0.864, 0.650, 0.824 and 0.698 for Sites 1, 2, 3 and 4, respectively. In general, the ensemble model reduced the maximal deviations of predicted vs. observed values relative to the individual predictions, so the correlation cloud became thinner, as desired.
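The weighted-average ensemble with an automated weight search can be sketched generically. The exhaustive grid search below is an assumption made for illustration, not necessarily the authors' procedure, and all data are synthetic:

```python
import numpy as np
from itertools import product

def best_weighted_ensemble(preds, y, grid=np.linspace(0.0, 1.0, 11)):
    """Exhaustively search convex weights for combining individual
    predictions so that the ensemble RMSE against y is minimal."""
    best_w, best_rmse = None, np.inf
    for w in product(grid, repeat=len(preds)):
        s = sum(w)
        if s == 0:
            continue
        w = np.array(w) / s                          # normalise to a weighted mean
        rmse = np.sqrt(np.mean((w @ preds - y) ** 2))
        if rmse < best_rmse:
            best_w, best_rmse = w, rmse
    return best_w, best_rmse

rng = np.random.default_rng(2)
y = rng.normal(size=100)                              # "observed" SOC values
preds = np.vstack([y + rng.normal(scale=s, size=100)  # three imperfect models
                   for s in (0.3, 0.5, 0.9)])
w, rmse = best_weighted_ensemble(preds, y)
```

Because the search space includes the degenerate weight vectors that select a single model, the ensemble can never score worse than the best individual prediction on the calibration data, which matches the improvements reported at all four sites.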

      PubDate: 2017-04-30T21:05:50Z
       
  • A training image evaluation and selection method based on minimum data
           event distance for multiple-point geostatistics
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Wenjie Feng, Shenghe Wu, Yanshu Yin, Jiajia Zhang, Ke Zhang
      A training image (TI) can be regarded as a database of spatial structures and their low- to higher-order statistics used in multiple-point geostatistics (MPS) simulation. Presently, there are a number of methods to construct a series of candidate TIs (CTIs) for MPS simulation based on a modeler's subjective criteria. The spatial structures of different CTIs often vary, meaning that their compatibility with the conditioning data differs. Therefore, evaluation and optimal selection of CTIs before MPS simulation is essential. This paper proposes a CTI evaluation and optimal selection method based on minimum data event distance (MDevD). In the proposed method, a set of MDevD properties is established by calculating the MDevD of the conditioning data events in each CTI. CTIs are then evaluated and ranked according to the mean value and variance of the MDevD properties: the smaller the mean value and variance of an MDevD property, the more compatible the corresponding CTI is with the conditioning data. In addition, data events with low compatibility in the conditioning data grid can be located to help modelers select a set of complementary CTIs for MPS simulation. The MDevD property can also help to narrow the range of the distance threshold for MPS simulation. The proposed method was evaluated using three examples: a 2D categorical example, a 2D continuous example, and an actual 3D oil reservoir case study. To illustrate the method, a C++ implementation is attached to the paper.
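The notion of a minimum data event distance can be sketched for a 2D categorical TI: scan every data event with the same geometry as the conditioning event and keep the smallest mismatch fraction. This is a simplified reading of the concept; the names, the distance definition and the toy image are illustrative, not the paper's C++ implementation:

```python
import numpy as np

def min_data_event_distance(ti, event, offsets):
    """Minimum distance between a conditioning data event and all data
    events of the same geometry scanned from a training image. The
    distance used here is the fraction of mismatched categorical nodes."""
    h, w = ti.shape
    rows = [dr for dr, _ in offsets]
    cols = [dc for _, dc in offsets]
    best = 1.0
    for r in range(-min(rows), h - max(rows)):
        for c in range(-min(cols), w - max(cols)):
            vals = np.array([ti[r + dr, c + dc] for dr, dc in offsets])
            best = min(best, float(np.mean(vals != event)))
            if best == 0.0:
                return 0.0            # perfect replicate found
    return best

ti = np.zeros((30, 30), dtype=int)
ti[10:20, 5:25] = 1                   # a simple channel-like facies body
offsets = [(0, 0), (0, 1), (1, 0)]    # an L-shaped three-node data event
hit = min_data_event_distance(ti, np.array([1, 1, 1]), offsets)
miss = min_data_event_distance(ti, np.array([2, 2, 2]), offsets)
```

Averaging such minimum distances over all conditioning data events gives a per-CTI score in the spirit of the MDevD property: a TI that contains replicates of every conditioning event scores near zero.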

      PubDate: 2017-04-17T11:27:20Z
       
  • Analysis of Training Sample Selection Strategies for Regression-Based
           Quantitative Landslide Susceptibility Mapping Methods
    • Abstract: Publication date: Available online 11 April 2017
      Source:Computers & Geosciences
      Author(s): Arzu Erener, A. Abdullah Sivas, A. Sevtap Selcuk-Kestel, H. Sebnem Düzgün
      All quantitative landslide susceptibility mapping (QLSM) methods require two basic data types: a landslide inventory and factors that influence landslide occurrence (landslide influencing factors, LIF). The accuracy of QLSM methods differs depending on the type of landslides and the nature of the triggers and LIF. Moreover, how to balance the numbers of 0's (non-occurrence) and 1's (occurrence) in the training set obtained from the landslide inventory, and how to select which 1's and 0's to include in QLSM models, play a critical role in the accuracy of QLSM. Although the performance of various QLSM methods has been widely investigated in the literature, the challenge of training set construction has not been adequately investigated. To tackle this challenge, in this study three different training set selection strategies, along with the original data set, are used to test the performance of three regression methods: Logistic Regression (LR), Bayesian Logistic Regression (BLR) and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which applies a weighted selection of landslide occurrences in the sample set. The second, non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances to capture the impact of clustering. The third, selective nearby sampling (SNS), concentrates on the group of 1's and their surrounding neighborhood; a randomly selected group of landslide sites and their neighborhood are considered in the analyses, with parameters similar to NNS. It is found that the LR-PRS, FLR-PRS and BLR-whole-data set-ups, in that order, yield the best fits among the alternatives. The results indicate that in QLSM based on regression models, avoiding spatial correlation in the data set is critical for model performance.

      PubDate: 2017-04-17T11:27:20Z
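      The proportional random sampling (PRS) strategy described above balances 1's and 0's in the training set. A minimal pure-Python sketch of such a weighted draw (the function name, weighting scheme and toy inventory are ours, not the authors' implementation):

```python
import random

def proportional_random_sample(cells, n, weight_1=0.5, seed=42):
    """Draw a training set of size n from (cell_id, label) pairs,
    where label 1 = landslide occurrence and 0 = non-occurrence.
    weight_1 controls the target share of occurrences in the sample."""
    rng = random.Random(seed)
    ones = [c for c in cells if c[1] == 1]
    zeros = [c for c in cells if c[1] == 0]
    n_ones = min(len(ones), round(n * weight_1))
    sample = rng.sample(ones, n_ones) + rng.sample(zeros, n - n_ones)
    rng.shuffle(sample)
    return sample

# toy inventory: 20 occurrences among 200 cells
cells = [(i, 1 if i < 20 else 0) for i in range(200)]
train = proportional_random_sample(cells, 40)
print(sum(label for _, label in train))  # 20 occurrences at weight_1=0.5
```

      The balanced sample could then be fed to any of the three regression models (LR, BLR, FLR) in place of the raw inventory.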
       
  • Seismic traveltime inversion based on tomographic equation without
           integral terms
    • Abstract: Publication date: Available online 6 April 2017
      Source:Computers & Geosciences
      Author(s): Guangnan Huang, Bing Zhou, Hongxing Li, David C. Nobes
      The Jacobian matrix in seismic traveltime tomographic equations usually contains several integral terms. These integral expressions not only greatly increase the computational complexity of seismic traveltime tomography, but also make the expressions difficult to program. If they can be eliminated, the tomography program can be greatly simplified. To reduce the computational complexity of traditional seismic traveltime tomography, we derived an anisotropic seismic traveltime tomographic equation that contains no integral expressions, and then degenerated it into an isotropic one. To verify the effectiveness of this tomographic equation based on the node network, a program was coded to perform seismic traveltime inversion. For a crosswell checkerboard velocity model, the proposed tomographic method and the traditional method (with integral terms) yield the same results. In addition, two velocity models with undulating topography were used as test models; numerical simulation results show that the proposed method achieves good tomograms. Finally, the method was used to investigate the near-surface velocity distribution near a power plant. The tomogram indicates that contaminated liquid diffuses and aggregates along strata at a certain depth, and that the velocity is lower near the pollutant source than away from it.

      PubDate: 2017-04-10T09:32:56Z
       
  • A framework for interactive visual analysis of heterogeneous marine data
           in an integrated problem solving environment
    • Abstract: Publication date: Available online 5 April 2017
      Source:Computers & Geosciences
      Author(s): Shuai Liu, Ge Chen, Shifeng Yao, Fenglin Tian, Wei Liu
      This paper presents a novel integrated marine visualization framework that focuses on processing and analyzing multi-dimensional spatiotemporal marine data in one workflow. Effective marine data visualization is needed for extracting useful patterns, recognizing changes, and understanding physical processes in oceanographic research. However, the multi-source, multi-format, multi-dimensional character of marine data poses a challenge for interactive and timely analysis and visualization in a single workflow. A global multi-resolution virtual terrain environment is also needed to give oceanographers and the public a real geographic background reference and to help them identify the geographical variation of ocean phenomena. This paper introduces a data integration and processing method to efficiently visualize and analyze heterogeneous marine data. Based on the processed data, several GPU-based visualization methods are explored to demonstrate marine data interactively. GPU-tessellated global terrain rendering using ETOPO1 data is realized, with video memory usage controlled to ensure high efficiency. A modified ray-casting algorithm for the uneven multi-section Argo volume data is also presented, and the transfer function is designed to analyze the 3D structure of ocean phenomena. Based on this framework, an integrated visualization system is realized, and its effectiveness and efficiency are demonstrated. The system is expected to make a significant contribution to the demonstration and understanding of marine physical processes in a virtual global environment.

      PubDate: 2017-04-10T09:32:56Z
       
  • Impact of mineralogical heterogeneity on reactive transport modelling
    • Abstract: Publication date: Available online 2 April 2017
      Source:Computers & Geosciences
      Author(s): Min Liu, Mehdi Shabaninejad, Peyman Mostaghimi
      The impact of the mineralogical heterogeneity of rocks on reactive transport modelling is investigated by applying a pore-scale model based on the Lattice Boltzmann and Finite Volume Methods. Mass transport, chemical reaction and solid structure modification are included in the model. A two-dimensional mineral map of a sandstone rock is acquired using QEMSCAN SEM imaging with Energy-Dispersive X-ray Spectroscopy (EDS). The mineralogical heterogeneity is explored by performing multi-mineral reaction simulations on images containing various minerals. The results are then compared with the predictions of single-mineral dissolution modelling. Dissolution patterns and permeability variations of multi-mineral and single-mineral reactions are presented, and the errors of single-mineral reaction modelling are estimated. Numerical results show that mineralogical heterogeneity can cause significant errors in permeability predictions if a uniform mineral distribution is assumed; the magnitude of these errors depends on the flow regime, and for this sample they are smaller in high Péclet regimes than in low Péclet regimes.

      PubDate: 2017-04-03T11:40:51Z
       
  • A MATLAB based 3D modeling and inversion code for MT data
    • Abstract: Publication date: Available online 27 March 2017
      Source:Computers & Geosciences
      Author(s): Arun Singh, Rahul Dehiya, Pravin K. Gupta, M. Israil
      The development of a MATLAB based computer code, AP3DMT, for modeling and inversion of 3D Magnetotelluric (MT) data is presented. The code comprises two independent components: a grid generator code and a modeling/inversion code. The grid generator performs model discretization and acts as an interface by generating various I/O files. The inversion code performs the core computations in modular form – forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to similar inverse problems such as Controlled-Source EM (CSEM). The modular structure of the code provides a framework for implementing new applications and inversion algorithms, and the use of MATLAB and its libraries makes it compact and user-friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities, the results of inversion for two complex models are presented.

      PubDate: 2017-04-03T11:40:51Z
       
  • Augmenting comprehension of geological relationships by integrating 3D
           laser scanned hand samples within a GIS environment
    • Abstract: Publication date: June 2017
      Source:Computers & Geosciences, Volume 103
      Author(s): A.S. Harvey, G. Fotopoulos, B. Hall, K. Amolins
      Geological observations can be made on multiple scales, including micro- (e.g. thin section), meso- (e.g. hand-sized to outcrop) and macro- (e.g. outcrop and larger) scales. Types of meso-scale samples include, but are not limited to, rocks (including drill cores), minerals, and fossils. The spatial relationship among samples paired with physical (e.g. granulometric composition, density, roughness) and chemical (e.g. mineralogical and isotopic composition) properties can aid in interpreting geological settings, such as paleo-environmental and formational conditions as well as geomorphological history. Field samples are collected along traverses in the area of interest based on characteristic representativeness of a region, predetermined rate of sampling, and/or uniqueness. The location of a sample can provide relative context in seeking out additional key samples. Beyond labelling and recording of geospatial coordinates for samples, further analysis of physical and chemical properties may be conducted in the field and laboratory. The main motivation for this paper is to present a workflow for the digital preservation of samples (via 3D laser scanning) paired with the development of cyber infrastructure, which offers geoscientists and engineers the opportunity to access an increasingly diverse worldwide collection of digital Earth materials. This paper describes a Web-based graphical user interface developed using Web AppBuilder for ArcGIS for digitized meso-scale 3D scans of geological samples to be viewed alongside the macro-scale environment. Over 100 samples of virtual rocks, minerals and fossils populate the developed geological database and are linked explicitly with their associated attributes, characteristic properties, and location. Applications of this new Web-based geological visualization paradigm in the geosciences demonstrate the utility of such a tool in an age of increasing global data sharing.

      PubDate: 2017-03-27T12:35:58Z
       
  • Towards semi-automatic rock mass discontinuity orientation and set
           analysis from 3D point clouds
    • Abstract: Publication date: June 2017
      Source:Computers & Geosciences, Volume 103
      Author(s): Jiateng Guo, Shanjun Liu, Peina Zhang, Lixin Wu, Wenhui Zhou, Yinan Yu
      Obtaining accurate information on rock mass discontinuities is important for deformation analysis and the evaluation of rock mass stability, but measurements of high and steep zones are difficult to obtain with the traditional compass method. Photogrammetry, three-dimensional (3D) laser scanning and other remote sensing methods have therefore gradually become mainstream. In this study, a method based on a 3D point cloud is proposed to semi-automatically extract rock mass structural plane information. The original data are pre-treated prior to segmentation by removing outlier points, and the point cloud is then segmented into different point subsets. Various parameters, such as the normal, dip direction and dip, can be calculated for each point subset after obtaining the equation of its best-fit plane. A cluster analysis (grouping point subsets that satisfy certain conditions into clusters) is performed on the normal vectors by introducing the firefly algorithm (FA) and the fuzzy c-means (FCM) algorithm. Finally, clusters that belong to the same discontinuity sets are merged and coloured for visualization purposes. A prototype system based on this method is developed to extract the points of rock discontinuities from a 3D point cloud. A comparison with existing software shows that the method is feasible, and it can provide a reference for rock mechanics, 3D geological modelling and other related fields.

      PubDate: 2017-03-27T12:35:58Z
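      The per-subset step — fitting a best-fit plane and reading dip and dip direction off its gradient — can be sketched as follows. This is a simplified least-squares illustration under our own conventions (plane z = ax + by + c, azimuth clockwise from north), not the paper's code:

```python
import math

def fit_plane(points):
    """Least-squares plane z = a*x + b*y + c through 3-D points
    (a stand-in for the per-subset best-fit plane in the paper)."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points); syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points); syz = sum(p[1] * p[2] for p in points)
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    r = [sxz, syz, sz]
    det = lambda m: (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                   - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                   + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(A)
    def col(m, j, v):  # replace column j of m with v (Cramer's rule)
        return [[v[i] if k == j else m[i][k] for k in range(3)] for i in range(3)]
    a, b, c = (det(col(A, j, r)) / d for j in range(3))
    return a, b, c

def dip_and_direction(a, b):
    """Convert the plane gradient (a, b) to dip angle and dip direction
    (azimuth of steepest descent, degrees clockwise from north/+y)."""
    dip = math.degrees(math.atan(math.hypot(a, b)))
    direction = math.degrees(math.atan2(-a, -b)) % 360.0
    return dip, direction

# plane dipping 45 degrees toward east (+x): z = -x
pts = [(x, y, -x) for x in range(5) for y in range(5)]
a, b, c = fit_plane(pts)
print(dip_and_direction(a, b))  # ~ (45.0, 90.0)
```

      The normal vector (a, b, -1), normalized, is what the FA/FCM clustering would operate on.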
       
  • A machine learning approach to the potential-field method for implicit
           modeling of geological structures
    • Abstract: Publication date: June 2017
      Source:Computers & Geosciences, Volume 103
      Author(s): Ítalo Gomes Gonçalves, Sissa Kumaira, Felipe Guadagnin
      Implicit modeling has experienced a rise in popularity over the last decade due to its advantages in speed and reproducibility compared with manual digitization of geological structures. The potential-field method consists of interpolating a scalar function that indicates to which side of a geological boundary a given point belongs, based on cokriging of point data and structural orientations. This work proposes a vector potential-field solution from a machine learning perspective, recasting the problem as multi-class classification, which relaxes some of the original method's assumptions. The potentials related to each geological class are interpreted in a compositional data framework. Variogram modeling is avoided by training the model with maximum likelihood, and an uncertainty measure is introduced. The methodology was applied to the modeling of a sample dataset provided with the software Move™. The calculations were implemented in the R language and 3D visualizations were prepared with the rgl package.

      PubDate: 2017-03-27T12:35:58Z
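      The multi-class recasting can be illustrated by mapping per-class potentials to probabilities and attaching an entropy-based uncertainty measure. The softmax/entropy choice below is our illustrative assumption, not necessarily the authors' exact construction:

```python
import math

def softmax(potentials):
    """Turn per-class potential values into class probabilities."""
    m = max(potentials)  # subtract max for numerical stability
    exps = [math.exp(p - m) for p in potentials]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(probs):
    """Shannon entropy in bits: 0 when one class is certain,
    log2(K) when all K classes are equally likely."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# potentials of three geological classes at a query point
probs = softmax([2.0, 0.5, 0.1])
print(probs, entropy(probs))  # highest potential gets highest probability
```

      The entropy value plays the role of the per-point uncertainty measure: it is highest where the classifier cannot separate the geological units.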
       
  • A transfer learning method for automatic identification of sandstone
           microscopic images
    • Abstract: Publication date: June 2017
      Source:Computers & Geosciences, Volume 103
      Author(s): Na Li, Huizhen Hao, Qing Gu, Danru Wang, Xiumian Hu
      Classification of sandstone microscopic images is an essential task in geology, and the classical method is either subjective or time-consuming. Computer-aided automatic classification has proved useful, but it seldom considers the situation where sandstone images are collected from separate regions. In this paper, we provide a method called Festra, which uses transfer learning to handle the problem of interregional sandstone microscopic image classification. The method contains two parts: feature selection, which aims to screen out features that differ greatly between the regions, and instance transfer using an enhanced TrAdaBoost, whose objective is to mitigate the differences among thin section images collected from the regions. Experiments are conducted on sandstone images taken from four regions in Tibet to study the performance of Festra. The experimental results demonstrate the effectiveness and validity of Festra, which provides competitive prediction performance on all four regions with few labeled target instances, making it suitable for field use.

      PubDate: 2017-03-20T03:59:36Z
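      The instance-transfer part can be illustrated with the classic TrAdaBoost weight update (Dai et al.'s formulation, not necessarily the enhanced variant used in Festra): misclassified source instances are down-weighted while misclassified target instances are up-weighted:

```python
import math

def tradaboost_update(weights, errors, n_source, epsilon_t, n_rounds=10):
    """One TrAdaBoost weight update. `errors[i]` is |h(x_i) - y_i| in [0, 1];
    the first n_source entries are source-domain instances, the rest are
    target-domain instances; epsilon_t is the weighted target error (< 0.5)."""
    beta_src = 1.0 / (1.0 + math.sqrt(2.0 * math.log(n_source) / n_rounds))
    beta_tgt = epsilon_t / (1.0 - epsilon_t)
    new_w = []
    for i, (w, e) in enumerate(zip(weights, errors)):
        if i < n_source:
            new_w.append(w * beta_src ** e)      # shrink bad source instances
        else:
            new_w.append(w * beta_tgt ** (-e))   # boost bad target instances
    total = sum(new_w)
    return [w / total for w in new_w]

w = tradaboost_update([0.25] * 4, [1, 0, 1, 0], n_source=2, epsilon_t=0.2)
# misclassified source instance loses weight; misclassified target gains most
```

      Repeating this update over boosting rounds gradually filters out source-region images that conflict with the target region, which is the intuition behind the instance-transfer step.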
       
  • Rule-based topology system for spatial databases to validate complex
           geographic datasets
    • Abstract: Publication date: Available online 18 March 2017
      Source:Computers & Geosciences
      Author(s): J. Martinez-Llario, E. Coll, M. Núñez-Andrés, C. Femenia-Ribera
      A rule-based topology software system providing a highly flexible and fast procedure to enforce integrity in spatial relationships among datasets is presented. This improved topology rule system is built over the spatial extension Jaspa. Both projects are open source, freely available software developed by the corresponding author of this paper. Currently, no spatial DBMS implements a rule-based topology engine (considering that the topology rules are designed and performed in the spatial backend). If topology rules are applied in the frontend (as in many GIS desktop programs), ArcGIS is the most advanced solution. The system presented in this paper has several major advantages over the ArcGIS approach: it can be extended with new topology rules, it has a much wider set of rules, and it can mix feature attributes with topology rules as filters. In addition, the topology rule system can work with various DBMSs, including PostgreSQL, H2 or Oracle, with the logic performed in the spatial backend. The proposed topology system allows users to check the complex spatial relationships among features (from one or several spatial layers) that complex cartographic datasets require, such as the data specifications proposed by INSPIRE in Europe and the Land Administration Domain Model (LADM) for cadastral data.

      PubDate: 2017-03-20T03:59:36Z
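      A topology rule of the kind such a system enforces — say, "every point must fall inside some polygon of another layer" — can be sketched in a few lines; the rule and data layout here are illustrative, not Jaspa's API:

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is pt inside the simple polygon
    given as a list of (x, y) vertices?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray's level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def check_rule_points_within(points, polygons):
    """Topology rule 'must be within': return the points violating it."""
    return [p for p in points if not any(point_in_polygon(p, pg) for pg in polygons)]

parcels = [[(0, 0), (4, 0), (4, 4), (0, 4)]]
violations = check_rule_points_within([(1, 1), (5, 5)], parcels)
print(violations)  # [(5, 5)]
```

      In a rule-based engine, many such predicates (within, touches, not-overlaps, …) would be registered and evaluated in the backend, optionally filtered by feature attributes.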
       
  • Quasi-equal area subdivision algorithm for uniform points on a sphere with
           application to any geographical data distribution
    • Abstract: Publication date: Available online 18 March 2017
      Source:Computers & Geosciences
      Author(s): Sanghyun Lee, Daniele Mortari
      This paper describes a quasi-equal area subdivision algorithm based on equal-area spherical subdivision to obtain approximate solutions to the problem of uniformly distributing points on a 2-dimensional sphere, better known as Smale's seventh problem. The algorithm provides quasi-equal area triangles, starting by splitting the faces of a Platonic solid into subsequent spherical triangles of identical areas. The main feature of the proposed algorithm is that adjacent final triangles share common vertices that can be merged, and reshaping is applied to the final triangles to remove obtuse ones. The proposed algorithm is fast and efficient at generating a large number of points, and is consequently suitable for various applications requiring a large number of distributed points. The algorithm is then applied to two geographical data distributions that are modeled by a quasi-uniform distribution of weighted points.

      PubDate: 2017-03-20T03:59:36Z
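      The recursive construction can be illustrated with the classic midpoint (geodesic) subdivision of an octahedron, which yields only approximately equal spherical triangles — the paper's algorithm splits faces into identical areas, so this is a simpler stand-in for the idea:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def midpoint(a, b):
    """Chord midpoint projected back onto the unit sphere."""
    return normalize(tuple((x + y) / 2.0 for x, y in zip(a, b)))

def subdivide(tri):
    """Split one spherical triangle into four via edge midpoints."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def spherical_area(tri):
    """Area of a spherical triangle on the unit sphere via the
    spherical excess (l'Huilier's theorem)."""
    def arc(u, v):
        return math.acos(max(-1.0, min(1.0, sum(x * y for x, y in zip(u, v)))))
    a, b, c = arc(tri[1], tri[2]), arc(tri[0], tri[2]), arc(tri[0], tri[1])
    s = (a + b + c) / 2.0
    t = (math.tan(s / 2) * math.tan((s - a) / 2)
         * math.tan((s - b) / 2) * math.tan((s - c) / 2))
    return 4.0 * math.atan(math.sqrt(max(t, 0.0)))

# start from the 8 faces of a regular octahedron inscribed in the sphere
poles = [(0, 0, 1), (0, 0, -1)]
ring = [(1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0)]
tris = [(p, ring[i], ring[(i + 1) % 4]) for p in poles for i in range(4)]
for _ in range(2):  # two refinement levels -> 128 triangles
    tris = [t for tri in tris for t in subdivide(tri)]
total = sum(spherical_area(t) for t in tris)
print(len(tris), round(total, 4))  # 128, ~12.5664 (= 4*pi)
```

      The triangles tile the sphere exactly (their areas sum to 4π), and the shared edge midpoints illustrate the merging of common vertices exploited by the paper's algorithm.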
       
  • The Application of Artificial Intelligence for the Identification of the
           Maceral Groups and Mineral Components of Coal
    • Abstract: Publication date: Available online 16 March 2017
      Source:Computers & Geosciences
      Author(s): Mariusz Mlynarczuk, Marta Skiba
      The correct and consistent identification of the petrographic properties of coal is an important issue for researchers in the fields of mining and geology. As part of the study described in this paper, the application of artificial intelligence methods to the identification of these characteristics was investigated. The methods in question were used to identify the maceral groups of coal, i.e. vitrinite, inertinite, and liptinite; additionally, an attempt was made to identify some non-organic minerals. The analyses were performed using pattern recognition techniques (NN, kNN) as well as artificial neural networks (a multilayer perceptron – MLP). The classification process was carried out on microscopy images of polished sections of coal. A multidimensional feature space was defined, which made it possible to classify the discussed structures automatically based on pattern recognition methods and artificial neural network algorithms. The authors also assessed the impact of the methods' parameters on the final outcome of the classification procedure. The analyses yielded a high percentage (over 97%) of correct classifications of maceral groups and mineral components. The paper also discusses an attempt to analyze particular macerals of the inertinite group; it was demonstrated that artificial neural networks can classify these macerals correctly in over 91 percent of cases. It was thus shown that artificial intelligence methods can be successfully applied to the identification of selected petrographic features of coal.

      PubDate: 2017-03-20T03:59:36Z
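      One of the pattern recognition techniques mentioned, kNN, is easy to sketch. The two features and the class labels below are toy values, not the paper's multidimensional feature space:

```python
import math
from collections import Counter

def knn_classify(train, x, k=3):
    """Classify feature vector x by majority vote among the k
    nearest training samples (Euclidean distance)."""
    nearest = sorted(train, key=lambda s: math.dist(s[0], x))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# toy 2-D features (e.g. mean grey level, a texture measure) per maceral group
train = [((0.90, 0.10), "vitrinite"), ((0.85, 0.15), "vitrinite"),
         ((0.20, 0.80), "inertinite"), ((0.25, 0.75), "inertinite"),
         ((0.50, 0.50), "liptinite"), ((0.55, 0.45), "liptinite")]
print(knn_classify(train, (0.88, 0.12)))  # vitrinite
```

      In the study itself the feature space is multidimensional and derived from the microscopy images; the MLP alternative replaces the vote with a learned decision boundary.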
       
  • Development of a data-driven forecasting tool for hydraulically fractured,
           horizontal wells in tight-gas sands
    • Abstract: Publication date: Available online 12 March 2017
      Source:Computers & Geosciences
      Author(s): B. Kulga, E. Artun, T. Ertekin
      Tight-gas sand reservoirs are considered one of the major unconventional resources. Due to the strong heterogeneity and very low permeability of the formation, and the complexity of well trajectories with multiple hydraulic fractures, there are challenges associated with performance forecasting and optimum exploitation of these resources using conventional modeling approaches. This study aims to develop a data-driven forecasting tool for tight-gas sands, based on artificial neural networks, that can complement the physics-driven modeling approach of numerical simulation. The tool is designed to predict horizontal-well performance as a proxy to the numerical model, once the initial conditions, operational parameters and reservoir/hydraulic-fracture characteristics are provided. The underlying data-driven model is validated with blind cases, estimating the cumulative gas production after 10 years with an average error of 3.2%. A graphical-user-interface application is developed that allows the practicing engineer to use the tool in a practical manner by visualizing the estimated performance for a given reservoir within a fraction of a second. The practicality of the tool is demonstrated with a case study of the Williams Fork Formation, assessing the performance of various well designs and incorporating known uncertainties through Monte Carlo simulation. P10, P50 and P90 estimates of horizontal-well performance are quickly obtained within acceptable accuracy levels.

      PubDate: 2017-03-13T02:45:54Z
       
  • Development of a GIS-based integrated framework for coastal seiches
           monitoring and forecasting: A North Jiangsu shoal case study
    • Abstract: Publication date: Available online 11 March 2017
      Source:Computers & Geosciences
      Author(s): Rufu Qin, Liangzhao Lin
      Coastal seiches have become an increasingly important issue in coastal science and present many challenges, particularly when attempting to provide warning services. This paper presents the methodologies, techniques and integrated services adopted for the design and implementation of a Seiches Monitoring and Forecasting Integration Framework (SMAF-IF). The SMAF-IF integrates different types of sensors and numerical models and incorporates Geographic Information System (GIS) and web techniques, focusing on the detection of and early warning for coastal seiche events in the North Jiangsu shoal, China. The in situ sensors perform automatic and continuous monitoring of the marine environment, while the numerical models provide meteorological and physical oceanographic parameter estimates. Model output processing software was developed in C# using ArcGIS Engine functions, providing the capability to automatically generate visualization maps and warning information. Leveraging the ArcGIS Flex API and ASP.NET web services, a web-based GIS framework was designed to facilitate quasi-real-time data access, interactive visualization and analysis, and the provision of early warning services for end users. The integrated framework enables decision-makers and the public to respond quickly to emergency coastal seiche events and can easily be adapted to other regional and scientific domains involving real-time monitoring and forecasting.

      PubDate: 2017-03-13T02:45:54Z
       
  • EdgeDetectPFI: an algorithm for automatic edge detection in potential
           field anomaly images – application to dike-like magnetic structures
    • Abstract: Publication date: Available online 10 March 2017
      Source:Computers & Geosciences
      Author(s): Saulo P. Oliveira, Francisco J.F. Ferreira, Jeferson de Souza
      We propose an algorithm to automatically locate the spatial position of anomalies in potential field images, which can be used to estimate the depth and width of causative sources. The magnetic anomaly is first enhanced using an edge detection filter based on a simple transformation (the Signum transform), which retains only the signs of the anomalous field. The theoretical edge positions can be recognized from the locations where one of the spatial field derivatives (or a function of them) changes its sign: the zero crossover points. These points are easily identified from the Signum-transformed spatial derivatives. The actual source depths and widths are then estimated using the widths of the positive plateaus obtained from two different Signum-transformed datasets: one based on the vertical derivative and the other on the vertical derivative minus the absolute value of the horizontal derivative. Our algorithm finds these widths automatically by computing the radius of the largest circles inscribed in the positive plateaus. Numerical experiments with synthetic data show that the proposed approach provides reliable estimates of the target parameters. Additional testing is carried out with aeromagnetic data from Santa Catarina, Southern Brazil, and the resulting parameter maps are compared with Euler deconvolution results.

      PubDate: 2017-03-13T02:45:54Z
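      A one-dimensional toy analogue of the Signum transform and the plateau-width measurement (the paper works on 2-D grids and measures inscribed circles; the synthetic profile below is ours):

```python
def signum_transform(values):
    """Keep only the sign of the anomalous field (+1 / -1)."""
    return [1 if v >= 0 else -1 for v in values]

def plateau_width(signs, dx):
    """Width of the longest run of +1, the 1-D analogue of the
    largest circle inscribed in a positive plateau."""
    best = run = 0
    for s in signs:
        run = run + 1 if s == 1 else 0
        best = max(best, run)
    return best * dx

# synthetic derivative profile: positive exactly over a 20 m wide source
dx = 0.5
xs = [i * dx - 50 for i in range(201)]
profile = [100 - x * x if abs(x) <= 10 else -1.0 for x in xs]
signs = signum_transform(profile)
print(plateau_width(signs, dx))  # 20.5 (true width 20 m; half-sample error)
```

      In 2-D, the run-length measurement is replaced by the radius of the largest circle fitting inside each positive plateau, which is what relates the transformed map to source width and depth.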
       
  • Fractal parameters and well-logs investigation using automated
           well-to-well correlation
    • Abstract: Publication date: Available online 7 March 2017
      Source:Computers & Geosciences
      Author(s): Seyyed Mohammad Amin Partovi, Saeid Sadeghnejad
      The aim of well-to-well correlation is to detect similar geological boundaries in two or more wells across a formation, a task that is usually done manually. Constructing such a correlation by hand for a field with several wells is complex and time-consuming. The aim of this study is to speed up the well-to-well correlation process with an automated approach. The input to our algorithm is the depths of all geological boundaries in a reference well; the algorithm then automatically searches for the depths associated with those geological boundaries in the other (observation) wells. Fractal parameters of the well-logs, such as the wavelet exponent (Hw), the wavelet standard deviation exponent (Hws), and the Hausdorff dimension (Ha), all calculated by wavelet transform, are used as pattern recognition dimensions during the correlation. Finding the fractal dimensions that bring the automatic correlation closest to the geological depths obtained by manual interpretation is one of the prime aims of this research. To validate the proposed technique, it is applied to well-log data from an Iranian onshore oil field. Moreover, the capability of the gamma ray, density, and sonic logs for automatic detection of geological boundaries by this approach is analyzed in detail. The outcome of this approach shows promising results.

      PubDate: 2017-03-08T13:40:06Z
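      One common wavelet-based recipe for such an exponent — not necessarily the authors' exact estimator — regresses the log-variance of Haar detail coefficients against scale:

```python
import math
import random

def haar_wavelet_exponent(series, max_level=6):
    """Estimate a wavelet exponent H from the slope of log2 detail-coefficient
    variance against scale (Var(d_j) ~ 2^(j*(2H+1)) for fractional Brownian
    motion with orthonormal Haar wavelets)."""
    a = list(series)
    levels, logvars = [], []
    for j in range(1, max_level + 1):
        d = [(a[2 * k] - a[2 * k + 1]) / math.sqrt(2) for k in range(len(a) // 2)]
        a = [(a[2 * k] + a[2 * k + 1]) / math.sqrt(2) for k in range(len(a) // 2)]
        levels.append(j)
        logvars.append(math.log2(sum(x * x for x in d) / len(d)))
    # least-squares slope of log2-variance versus level
    n = len(levels)
    mj, mv = sum(levels) / n, sum(logvars) / n
    slope = (sum((j - mj) * (v - mv) for j, v in zip(levels, logvars))
             / sum((j - mj) ** 2 for j in levels))
    return (slope - 1.0) / 2.0

# synthetic "log": a random walk, whose theoretical Hurst exponent is 0.5
rng = random.Random(7)
walk, x = [], 0.0
for _ in range(4096):
    x += rng.gauss(0.0, 1.0)
    walk.append(x)
print(haar_wavelet_exponent(walk))  # close to the theoretical 0.5
```

      Windows of a real log sliding past a boundary would show a shift in such exponents, which is what makes them usable as pattern recognition dimensions.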
       
  • Mechanical Properties and Energy Conversion of 3D Close-packed Lattice
           Model for Brittle Rocks
    • Abstract: Publication date: Available online 3 March 2017
      Source:Computers & Geosciences
      Author(s): Chun Liu, Qiang Xu, Bin Shi, Shang Deng, Honghu Zhu
      Numerical simulations using the 3D discrete element method can reproduce mechanical and dynamic behaviors similar to those of rocks and granular materials. In the model, rock is represented by bonded elements arranged on a tetrahedral lattice. Conversion formulas between inter-element parameters and rock mechanical properties were derived; using these formulas, inter-element parameters can be determined from the mechanical properties of the model, including Young's modulus, Poisson's ratio, tensile strength (Tu), compressive strength (Cu) and the coefficient of internal friction. The energy conversion rules of the model are also proposed. Based on these methods, a Matlab code, "MatDEM", was developed, and numerical models of quartzite were used to validate the formulas. The tested mechanical properties of a single unit correspond reasonably well with the values of quartzite, while the tested Tu and Cu of multi-element models are lower than the values predicted by the formulas. In simulations of rock failure processes, mechanical energy converts between different forms and heat is generated, but the sum of mechanical energy and heat always remains constant. Variations of breaking heat and frictional heat provide clues to the fracturing and slipping behaviors in the Tu and Cu tests. The model may be applied to a wide range of geological structures that involve breakage at multiple scales, heat generation and dynamic processes.

      PubDate: 2017-03-08T13:40:06Z
       
  • Optimal Ordering of Realizations for Visualization and Presentation
    • Abstract: Publication date: Available online 2 March 2017
      Source:Computers & Geosciences
      Author(s): George de Barros, Clayton V. Deutsch
      In geostatistical simulation, a realization represents one possible outcome of the spatial uncertainty model. Tens to hundreds of realizations are generated in order to understand response property variation. There are ways to summarize local uncertainty, but visualizing all realizations is important to understand joint uncertainty between multiple locations. There is no straightforward manner to visualize all realizations at the same time or in sequence. This paper presents a new method to sequentially display multiple geostatistical realizations. The proposed algorithm performs an ordering of the visible portion of the realizations (images), according to the distance between realizations. The concept of distance corresponds to the differences computed cell by cell for every realization pair or to the differences computed from a moving window filtering applied to each realization. To define an optimal sequence of realizations, the shortest path route through the realizations is established by a simulated annealing technique. The gradual transition between realizations is enhanced by an image morphing technique where intermediate images are introduced between the original images. The final result consists of an animation that shows the sequence of realizations and allows better understanding of the uncertainty model.

      PubDate: 2017-03-08T13:40:06Z
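      The core of the method — simulated annealing over a pairwise distance matrix to find a short path through the realizations — can be sketched as follows; the 2-opt move, linear cooling schedule and scalar toy distances are our illustrative choices, not the paper's exact setup:

```python
import math
import random

def order_realizations(dist, n_iter=20000, temp0=1.0, seed=1):
    """Find a short open path visiting every realization once, using
    simulated annealing on the pairwise distance matrix `dist`."""
    rng = random.Random(seed)
    n = len(dist)
    def length(p):
        return sum(dist[p[i]][p[i + 1]] for i in range(n - 1))
    cur = list(range(n))
    cur_len = length(cur)
    best, best_len = cur[:], cur_len
    for it in range(n_iter):
        t = temp0 * (1.0 - it / n_iter) + 1e-9  # linear cooling
        i, j = sorted(rng.sample(range(n), 2))
        cand = cur[:i] + cur[i:j + 1][::-1] + cur[j + 1:]  # 2-opt reversal
        cand_len = length(cand)
        if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / t):
            cur, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = cur[:], cur_len
    return best, best_len

# toy "realizations": scalar summaries whose distance is |a - b|
vals = [0.9, 0.1, 0.5, 0.3, 0.7]
dist = [[abs(a - b) for b in vals] for a in vals]
order, total = order_realizations(dist)
# the shortest open path visits the values in sorted order (total 0.8)
```

      In the paper, `dist` would come from cell-by-cell (or filtered) differences between realization images, and morphing would be applied between consecutive images of the resulting order.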
       
  • An integrated workflow for stress and flow modelling using outcrop-derived
           discrete fracture networks
    • Abstract: Publication date: Available online 2 March 2017
      Source:Computers & Geosciences
      Author(s): K. Bisdom, H.M. Nick, G. Bertotti
      Fluid flow in naturally fractured reservoirs is often controlled by subseismic-scale fracture networks. Although the fracture network can be partly sampled in the direct vicinity of wells, the inter-well scale network is poorly constrained in fractured reservoir models. Outcrop analogues can provide data for populating domains of the reservoir model where no direct measurements are available. However, extracting relevant statistics from large outcrops representative of inter-well scale fracture networks remains challenging. Recent advances in outcrop imaging provide high-resolution datasets that can cover areas of several hundred by several hundred meters, i.e. the domain between adjacent wells, but even then, data from the high-resolution models are often upscaled to reservoir flow grids, resulting in loss of accuracy. We present a workflow that uses photorealistic georeferenced outcrop models to construct geomechanical and fluid flow models containing thousands of discrete fractures covering sufficiently large areas, without requiring upscaling to model permeability. This workflow seamlessly integrates geomechanical Finite Element models with flow models that take into account stress-sensitive fracture permeability and matrix flow to determine the full permeability tensor. The applicability of this workflow is illustrated using an outcropping carbonate pavement in the Potiguar basin in Brazil, from which 1082 fractures are digitised. The permeability tensor for a range of matrix permeabilities shows that conventional upscaling to effective grid properties can underestimate the true permeability and misrepresent the orientation of the principal permeabilities. The presented workflow yields the full permeability tensor model of discrete fracture networks with stress-induced apertures, instead of relying on effective properties as most conventional flow models do.

      PubDate: 2017-03-08T13:40:06Z
       
  • Numerical simulation of electro-osmotic consolidation coupling non-linear
           variation of soil parameters
    • Abstract: Publication date: Available online 2 March 2017
      Source:Computers & Geosciences
      Author(s): Wu Hui, Liming Hu, Qingbo Wen
      Electro-osmotic consolidation is an effective method for soft ground improvement. A main limitation of previous numerical models of this technique is that they ignore the non-linear variation of soil parameters. In the present study, a multi-field numerical model is developed that accounts for the non-linear variation of soil parameters during the electro-osmotic consolidation process. Numerical simulations on an axisymmetric model indicate that this non-linear variation has a remarkable impact on the development of excess pore water pressure and the degree of consolidation. A field experiment with complex geometry, boundary conditions, electrode configuration and voltage application was further simulated with the developed model. The comparison between field and numerical data indicates that coupling the non-linear variation of soil parameters gives more reasonable results. The developed numerical model is capable of analyzing engineering cases with complex operating conditions.

      PubDate: 2017-03-08T13:40:06Z
       
  • Statistical Modeling of Geopressured Geothermal Reservoirs
    • Abstract: Publication date: Available online 21 February 2017
      Source:Computers & Geosciences
      Author(s): Esmail Ansari, Richard Hughes, Christopher White
      Identifying attractive candidate reservoirs for producing geothermal energy requires predictive models. In this work, inspectional analysis and statistical modeling are used to create simple predictive models for a line-drive design. Inspectional analysis of the partial differential equations governing this design yields a minimum of fifteen dimensionless groups required to describe the physics of the system. These dimensionless groups are explained and confirmed using models with similar dimensionless groups but different dimensional parameters. The study models dimensionless production temperature and thermal recovery factor as the responses of a numerical model, obtained from a Box-Behnken experimental design. An uncertainty plot is used to segment the dimensionless time and develop a model for each segment. The important dimensionless numbers for each segment of the dimensionless time are identified using the Boosting method and used as predictors in the regression models. The developed models are reduced to a minimum number of predictors and interactions, then presented and assessed using testing runs. Finally, applications of these models are offered. The presented workflow is generic and can be used to translate the output of a numerical simulator into simple predictive models in other research areas involving numerical simulation.
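      The design-then-reduce workflow the abstract outlines can be sketched in two steps: generate a Box-Behnken design in coded units for the dimensionless groups, run the simulator at those points, and fit a regression keeping only the selected predictors. The three-factor design and the synthetic response below are illustrative stand-ins (the paper uses fifteen groups and a numerical reservoir simulator).

```python
import numpy as np
from itertools import combinations

def box_behnken(k, n_center=3):
    """Box-Behnken design for k factors in coded units (-1, 0, +1):
    all +/-1 combinations for each pair of factors with the remaining
    factors held at 0, plus replicated centre points."""
    runs = []
    for i, j in combinations(range(k), 2):
        for si in (-1, 1):
            for sj in (-1, 1):
                row = [0] * k
                row[i], row[j] = si, sj
                runs.append(row)
    runs += [[0] * k] * n_center
    return np.array(runs, dtype=float)

X = box_behnken(3)                          # 12 edge runs + 3 centre points
# hypothetical response: dimensionless production temperature vs. three groups
y = 1.0 - 0.3 * X[:, 0] + 0.1 * X[:, 1] ** 2
# reduced model with only the retained terms: intercept, x1, x2^2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
```

      Dropping the unselected terms before fitting, as in the last two lines, mirrors the paper's reduction of the models to a minimum number of predictors and interactions.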

      PubDate: 2017-02-23T14:01:04Z
       
  • Modification of the random forest algorithm to avoid statistical
           dependence problems when classifying remote sensing imagery
    • Abstract: Publication date: Available online 20 February 2017
      Source:Computers & Geosciences
      Author(s): Fulgencio Cánovas-García, Francisco Alonso-Sarría, Francisco Gomariz-Castillo, Fernando Oñate-Valdivieso
      Random forest is a classification technique widely used in remote sensing. One of its advantages is that it produces an estimate of classification accuracy based on the so-called out-of-bag cross-validation method. It is usually assumed that this estimate is unbiased and may be used instead of validation against an external dataset or a cross-validation external to the algorithm. In this paper we show that this is not necessarily the case when classifying remote sensing imagery using training areas comprising several pixels or objects. According to our results, out-of-bag cross-validation clearly overestimates accuracy, both overall and per class. The reason is that, within a training patch, pixels or objects are not statistically independent of each other; yet bootstrapping splits them into in-bag and out-of-bag sets as if they were. We believe that assigning whole patches, rather than individual pixels or objects, to one set or the other would produce a less biased out-of-bag cross-validation. To address the problem, we propose a modification of the random forest algorithm that splits training patches instead of the pixels (or objects) that compose them. The modified algorithm does not overestimate accuracy, and its predictive capability is no lower than that of the original: when its results are validated against an external dataset, the accuracy does not differ from that obtained with the original algorithm. We analysed three remote sensing images with different classification approaches (pixel- and object-based); in all three cases, the proposed modification produces a less biased accuracy estimate.
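      The core of the proposed modification is a patch-level bootstrap: patches, not pixels, are sampled with replacement, so all pixels of a patch end up entirely in-bag or entirely out-of-bag. The sketch below shows only that partition for one tree's bag (the full modification would also feed the in-bag pixels of repeatedly sampled patches to the tree multiple times); the function name and toy data are illustrative, not the authors' code.

```python
import numpy as np

def patchwise_bootstrap(patch_ids, rng):
    """Draw a bootstrap sample at the patch level: whole training patches
    go in-bag or out-of-bag together, so spatially correlated pixels of
    one patch never straddle the two sets."""
    patches = np.unique(patch_ids)
    sampled = rng.choice(patches, size=len(patches), replace=True)
    in_bag = np.isin(patch_ids, sampled)
    return np.where(in_bag)[0], np.where(~in_bag)[0]

rng = np.random.default_rng(0)
patch_ids = np.repeat([0, 1, 2, 3, 4], 6)   # 5 training patches x 6 pixels each
in_idx, oob_idx = patchwise_bootstrap(patch_ids, rng)
# every patch is entirely in-bag or entirely out-of-bag
```

      With the standard pixel-level bootstrap, near-duplicate neighbours of in-bag pixels leak into the out-of-bag set and inflate the accuracy estimate; the patch-level split above removes that leakage.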

      PubDate: 2017-02-23T14:01:04Z
       
 
 
JournalTOCs
School of Mathematical and Computer Sciences
Heriot-Watt University
Edinburgh, EH14 4AS, UK
Email: journaltocs@hw.ac.uk
Tel: +00 44 (0)131 4513762
Fax: +00 44 (0)131 4513327
 

JournalTOCs © 2009-2016