  Subjects -> ENGINEERING (Total: 2284 journals)
    - CHEMICAL ENGINEERING (192 journals)
    - CIVIL ENGINEERING (184 journals)
    - ELECTRICAL ENGINEERING (102 journals)
    - ENGINEERING (1208 journals)
    - HYDRAULIC ENGINEERING (55 journals)
    - INDUSTRIAL ENGINEERING (65 journals)
    - MECHANICAL ENGINEERING (89 journals)

ENGINEERING (1208 journals)

Showing 1 - 200 of 1205 Journals sorted alphabetically
3 Biotech     Open Access   (Followers: 7)
3D Research     Hybrid Journal   (Followers: 19)
AAPG Bulletin     Hybrid Journal   (Followers: 5)
AASRI Procedia     Open Access   (Followers: 15)
Abstract and Applied Analysis     Open Access   (Followers: 3)
Aceh International Journal of Science and Technology     Open Access   (Followers: 2)
ACS Nano     Full-text available via subscription   (Followers: 227)
Acta Geotechnica     Hybrid Journal   (Followers: 7)
Acta Metallurgica Sinica (English Letters)     Hybrid Journal   (Followers: 5)
Acta Polytechnica : Journal of Advanced Engineering     Open Access   (Followers: 2)
Acta Scientiarum. Technology     Open Access   (Followers: 3)
Acta Universitatis Cibiniensis. Technical Series     Open Access  
Active and Passive Electronic Components     Open Access   (Followers: 7)
Adaptive Behavior     Hybrid Journal   (Followers: 11)
Adıyaman Üniversitesi Mühendislik Bilimleri Dergisi     Open Access  
Adsorption     Hybrid Journal   (Followers: 4)
Advanced Engineering Forum     Full-text available via subscription   (Followers: 6)
Advanced Science     Open Access   (Followers: 5)
Advanced Science Focus     Free   (Followers: 3)
Advanced Science Letters     Full-text available via subscription   (Followers: 6)
Advanced Science, Engineering and Medicine     Partially Free   (Followers: 7)
Advanced Synthesis & Catalysis     Hybrid Journal   (Followers: 17)
Advances in Artificial Neural Systems     Open Access   (Followers: 4)
Advances in Calculus of Variations     Hybrid Journal   (Followers: 2)
Advances in Catalysis     Full-text available via subscription   (Followers: 5)
Advances in Complex Systems     Hybrid Journal   (Followers: 7)
Advances in Engineering Software     Hybrid Journal   (Followers: 25)
Advances in Fuel Cells     Full-text available via subscription   (Followers: 14)
Advances in Fuzzy Systems     Open Access   (Followers: 5)
Advances in Geosciences (ADGEO)     Open Access   (Followers: 10)
Advances in Heat Transfer     Full-text available via subscription   (Followers: 20)
Advances in Human Factors/Ergonomics     Full-text available via subscription   (Followers: 25)
Advances in Magnetic and Optical Resonance     Full-text available via subscription   (Followers: 9)
Advances in Natural Sciences: Nanoscience and Nanotechnology     Open Access   (Followers: 28)
Advances in Operations Research     Open Access   (Followers: 11)
Advances in OptoElectronics     Open Access   (Followers: 5)
Advances in Physics Theories and Applications     Open Access   (Followers: 12)
Advances in Polymer Science     Hybrid Journal   (Followers: 40)
Advances in Porous Media     Full-text available via subscription   (Followers: 4)
Advances in Remote Sensing     Open Access   (Followers: 37)
Advances in Science and Research (ASR)     Open Access   (Followers: 6)
Aerobiologia     Hybrid Journal   (Followers: 1)
African Journal of Science, Technology, Innovation and Development     Hybrid Journal   (Followers: 4)
AIChE Journal     Hybrid Journal   (Followers: 29)
Ain Shams Engineering Journal     Open Access   (Followers: 5)
Akademik Platform Mühendislik ve Fen Bilimleri Dergisi     Open Access  
Alexandria Engineering Journal     Open Access   (Followers: 1)
AMB Express     Open Access   (Followers: 1)
American Journal of Applied Sciences     Open Access   (Followers: 28)
American Journal of Engineering and Applied Sciences     Open Access   (Followers: 11)
American Journal of Engineering Education     Open Access   (Followers: 9)
American Journal of Environmental Engineering     Open Access   (Followers: 16)
American Journal of Industrial and Business Management     Open Access   (Followers: 23)
Analele Universitatii Ovidius Constanta - Seria Chimie     Open Access  
Annals of Combinatorics     Hybrid Journal   (Followers: 3)
Annals of Pure and Applied Logic     Open Access   (Followers: 2)
Annals of Regional Science     Hybrid Journal   (Followers: 7)
Annals of Science     Hybrid Journal   (Followers: 7)
Applicable Algebra in Engineering, Communication and Computing     Hybrid Journal   (Followers: 2)
Applicable Analysis: An International Journal     Hybrid Journal   (Followers: 1)
Applied Catalysis A: General     Hybrid Journal   (Followers: 6)
Applied Catalysis B: Environmental     Hybrid Journal   (Followers: 9)
Applied Clay Science     Hybrid Journal   (Followers: 4)
Applied Computational Intelligence and Soft Computing     Open Access   (Followers: 12)
Applied Magnetic Resonance     Hybrid Journal   (Followers: 3)
Applied Nanoscience     Open Access   (Followers: 7)
Applied Network Science     Open Access  
Applied Numerical Mathematics     Hybrid Journal   (Followers: 5)
Applied Physics Research     Open Access   (Followers: 3)
Applied Sciences     Open Access   (Followers: 2)
Applied Spatial Analysis and Policy     Hybrid Journal   (Followers: 4)
Arabian Journal for Science and Engineering     Hybrid Journal   (Followers: 5)
Archives of Computational Methods in Engineering     Hybrid Journal   (Followers: 4)
Archives of Foundry Engineering     Open Access  
Archives of Thermodynamics     Open Access   (Followers: 7)
Arid Zone Journal of Engineering, Technology and Environment     Open Access  
Arkiv för Matematik     Hybrid Journal   (Followers: 1)
ASEE Prism     Full-text available via subscription   (Followers: 3)
Asian Engineering Review     Open Access  
Asian Journal of Applied Science and Engineering     Open Access   (Followers: 1)
Asian Journal of Applied Sciences     Open Access   (Followers: 2)
Asian Journal of Biotechnology     Open Access   (Followers: 8)
Asian Journal of Control     Hybrid Journal  
Asian Journal of Current Engineering & Maths     Open Access  
Asian Journal of Technology Innovation     Hybrid Journal   (Followers: 8)
Assembly Automation     Hybrid Journal   (Followers: 2)
at - Automatisierungstechnik     Hybrid Journal   (Followers: 1)
ATZagenda     Hybrid Journal  
ATZextra worldwide     Hybrid Journal  
Australasian Physical & Engineering Sciences in Medicine     Hybrid Journal   (Followers: 1)
Australian Journal of Multi-Disciplinary Engineering     Full-text available via subscription   (Followers: 2)
Autonomous Mental Development, IEEE Transactions on     Hybrid Journal   (Followers: 8)
Avances en Ciencias e Ingeniería     Open Access  
Balkan Region Conference on Engineering and Business Education     Open Access   (Followers: 1)
Bangladesh Journal of Scientific and Industrial Research     Open Access  
Basin Research     Hybrid Journal   (Followers: 3)
Batteries     Open Access   (Followers: 4)
Bautechnik     Hybrid Journal   (Followers: 1)
Bell Labs Technical Journal     Hybrid Journal   (Followers: 23)
Beni-Suef University Journal of Basic and Applied Sciences     Open Access   (Followers: 3)
BER : Manufacturing Survey : Full Survey     Full-text available via subscription   (Followers: 2)
BER : Motor Trade Survey     Full-text available via subscription   (Followers: 1)
BER : Retail Sector Survey     Full-text available via subscription   (Followers: 2)
BER : Retail Survey : Full Survey     Full-text available via subscription   (Followers: 2)
BER : Survey of Business Conditions in Manufacturing : An Executive Summary     Full-text available via subscription   (Followers: 3)
BER : Survey of Business Conditions in Retail : An Executive Summary     Full-text available via subscription   (Followers: 3)
Bharatiya Vaigyanik evam Audyogik Anusandhan Patrika (BVAAP)     Open Access   (Followers: 1)
Biofuels Engineering     Open Access  
Biointerphases     Open Access   (Followers: 1)
Biomaterials Science     Full-text available via subscription   (Followers: 9)
Biomedical Engineering     Hybrid Journal   (Followers: 16)
Biomedical Engineering and Computational Biology     Open Access   (Followers: 13)
Biomedical Engineering Letters     Hybrid Journal   (Followers: 5)
Biomedical Engineering, IEEE Reviews in     Full-text available via subscription   (Followers: 17)
Biomedical Engineering, IEEE Transactions on     Hybrid Journal   (Followers: 32)
Biomedical Engineering: Applications, Basis and Communications     Hybrid Journal   (Followers: 5)
Biomedical Microdevices     Hybrid Journal   (Followers: 8)
Biomedical Science and Engineering     Open Access   (Followers: 3)
Biomedizinische Technik - Biomedical Engineering     Hybrid Journal  
Biomicrofluidics     Open Access   (Followers: 4)
BioNanoMaterials     Hybrid Journal   (Followers: 2)
Biotechnology Progress     Hybrid Journal   (Followers: 39)
Boletin Cientifico Tecnico INIMET     Open Access  
Botswana Journal of Technology     Full-text available via subscription  
Boundary Value Problems     Open Access   (Followers: 1)
Brazilian Journal of Science and Technology     Open Access   (Followers: 2)
Broadcasting, IEEE Transactions on     Hybrid Journal   (Followers: 10)
Bulletin of Canadian Petroleum Geology     Full-text available via subscription   (Followers: 14)
Bulletin of Engineering Geology and the Environment     Hybrid Journal   (Followers: 3)
Bulletin of the Crimean Astrophysical Observatory     Hybrid Journal  
Cahiers, Droit, Sciences et Technologies     Open Access  
Calphad     Hybrid Journal  
Canadian Geotechnical Journal     Hybrid Journal   (Followers: 14)
Canadian Journal of Remote Sensing     Full-text available via subscription   (Followers: 41)
Case Studies in Engineering Failure Analysis     Open Access   (Followers: 7)
Case Studies in Thermal Engineering     Open Access   (Followers: 3)
Catalysis Communications     Hybrid Journal   (Followers: 6)
Catalysis Letters     Hybrid Journal   (Followers: 2)
Catalysis Reviews: Science and Engineering     Hybrid Journal   (Followers: 8)
Catalysis Science and Technology     Free   (Followers: 6)
Catalysis Surveys from Asia     Hybrid Journal   (Followers: 3)
Catalysis Today     Hybrid Journal   (Followers: 5)
CEAS Space Journal     Hybrid Journal  
Cellular and Molecular Neurobiology     Hybrid Journal   (Followers: 3)
Central European Journal of Engineering     Hybrid Journal   (Followers: 1)
CFD Letters     Open Access   (Followers: 6)
Chaos : An Interdisciplinary Journal of Nonlinear Science     Hybrid Journal   (Followers: 2)
Chaos, Solitons & Fractals     Hybrid Journal   (Followers: 3)
Chinese Journal of Catalysis     Full-text available via subscription   (Followers: 2)
Chinese Journal of Engineering     Open Access   (Followers: 2)
Chinese Science Bulletin     Open Access   (Followers: 1)
Ciencia e Ingenieria Neogranadina     Open Access  
Ciencia en su PC     Open Access   (Followers: 1)
Ciencias Holguin     Open Access   (Followers: 1)
CienciaUAT     Open Access  
Cientifica     Open Access  
CIRP Annals - Manufacturing Technology     Full-text available via subscription   (Followers: 11)
CIRP Journal of Manufacturing Science and Technology     Full-text available via subscription   (Followers: 14)
City, Culture and Society     Hybrid Journal   (Followers: 21)
Clay Minerals     Full-text available via subscription   (Followers: 9)
Clean Air Journal     Full-text available via subscription   (Followers: 2)
Coal Science and Technology     Full-text available via subscription   (Followers: 3)
Coastal Engineering     Hybrid Journal   (Followers: 11)
Coastal Engineering Journal     Hybrid Journal   (Followers: 4)
Coatings     Open Access   (Followers: 3)
Cogent Engineering     Open Access   (Followers: 2)
Cognitive Computation     Hybrid Journal   (Followers: 4)
Color Research & Application     Hybrid Journal   (Followers: 1)
COMBINATORICA     Hybrid Journal  
Combustion Theory and Modelling     Hybrid Journal   (Followers: 13)
Combustion, Explosion, and Shock Waves     Hybrid Journal   (Followers: 13)
Communications Engineer     Hybrid Journal   (Followers: 1)
Communications in Numerical Methods in Engineering     Hybrid Journal   (Followers: 2)
Components, Packaging and Manufacturing Technology, IEEE Transactions on     Hybrid Journal   (Followers: 26)
Composite Interfaces     Hybrid Journal   (Followers: 6)
Composite Structures     Hybrid Journal   (Followers: 256)
Composites Part A : Applied Science and Manufacturing     Hybrid Journal   (Followers: 179)
Composites Part B : Engineering     Hybrid Journal   (Followers: 227)
Composites Science and Technology     Hybrid Journal   (Followers: 197)
Comptes Rendus Mécanique     Full-text available via subscription   (Followers: 2)
Computation     Open Access  
Computational Geosciences     Hybrid Journal   (Followers: 13)
Computational Optimization and Applications     Hybrid Journal   (Followers: 7)
Computational Science and Discovery     Full-text available via subscription   (Followers: 2)
Computer Applications in Engineering Education     Hybrid Journal   (Followers: 6)
Computer Science and Engineering     Open Access   (Followers: 17)
Computers & Geosciences     Hybrid Journal   (Followers: 28)
Computers & Mathematics with Applications     Full-text available via subscription   (Followers: 5)
Computers and Electronics in Agriculture     Hybrid Journal   (Followers: 4)
Computers and Geotechnics     Hybrid Journal   (Followers: 10)
Computing and Visualization in Science     Hybrid Journal   (Followers: 5)
Computing in Science & Engineering     Full-text available via subscription   (Followers: 29)
Conciencia Tecnologica     Open Access  
Concurrent Engineering     Hybrid Journal   (Followers: 3)
Continuum Mechanics and Thermodynamics     Hybrid Journal   (Followers: 6)
Control and Dynamic Systems     Full-text available via subscription   (Followers: 8)
Control Engineering Practice     Hybrid Journal   (Followers: 42)
Control Theory and Informatics     Open Access   (Followers: 7)
Corrosion Science     Hybrid Journal   (Followers: 25)
CT&F Ciencia, Tecnologia y Futuro     Open Access  


Computers & Geosciences
  [SJR: 1.268]   [H-I: 78]   [28 followers]
   Hybrid Journal (may contain Open Access articles)
   ISSN (Print): 0098-3004
   Published by Elsevier  [3042 journals]
  • An adjoint-free method to determine conditional nonlinear optimal perturbations
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Aleid Oosterwijk, Henk A. Dijkstra, Tristan van Leeuwen
      The analysis of the growth of initial perturbations in dynamical systems is an important aspect of predictability theory because it provides information on error growth. The Conditional Nonlinear Optimal Perturbation (CNOP) method determines the nonlinear growth of perturbations over a certain lead time. CNOPs can be found by solving a nonlinear constrained optimisation problem, typically with sequential quadratic programming (SQP), a routine that requires an adjoint model. Such adjoint models are not always available, and hence we study here the performance of an adjoint-free optimisation method (COBYLA), in combination with a dimension reduction technique, to determine CNOPs. The new technique is applied to a quasi-geostrophic model of the wind-driven ocean circulation. We find that COBYLA is able to find good approximations of CNOPs, albeit at a higher computational cost than conventional adjoint-based methods.

      PubDate: 2017-07-09T04:53:19Z
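The adjoint-free idea above can be illustrated with a toy two-variable system: maximise the nonlinear growth of an initial perturbation of fixed norm over a lead time, using only forward model runs. This is a minimal sketch, not the paper's method; the dynamical system, the direct search over the constraint boundary (a crude stand-in for COBYLA), and all parameters are illustrative assumptions.

```python
import math

def forward(x, y, steps=100, dt=0.05):
    """Toy nonlinear system (illustrative, not the quasi-geostrophic model):
    forward-Euler integration of x' = y - x**3, y' = -x."""
    for _ in range(steps):
        x, y = x + dt * (y - x**3), y + dt * (-x)
    return x, y

def growth(dx, dy, base=(0.5, 0.5)):
    """Norm of the final difference between perturbed and reference runs."""
    bx, by = forward(*base)
    px, py = forward(base[0] + dx, base[1] + dy)
    return math.hypot(px - bx, py - by)

def cnop_direct_search(delta=0.1, n_angles=360):
    """Adjoint-free search for the conditional nonlinear optimal perturbation:
    the optimum of the norm-constrained problem typically lies on the
    constraint boundary, so scan perturbation directions there using
    forward runs only."""
    def candidate(k):
        theta = 2 * math.pi * k / n_angles
        return delta * math.cos(theta), delta * math.sin(theta)
    return max((candidate(k) for k in range(n_angles)),
               key=lambda dp: growth(*dp))
```

Because only forward evaluations are used, the same pattern applies when no adjoint model is available, at the cost of many more model runs.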
  • Estimating the surface relaxivity as a function of pore size from NMR T2
           distributions and micro-tomographic images
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Francisco Benavides, Ricardo Leiderman, Andre Souza, Giovanna Carneiro, Rodrigo Bagueira
      In the present work, we formulate and solve an inverse problem to recover the surface relaxivity as a function of pore size. The input data for our technique are the measured T2 distribution and the micro-tomographic image of the rock sample under investigation. We simulate the NMR relaxation signal for a given surface relaxivity function using the random walk method and rank different surface relaxivity functions according to the correlation of the resulting simulated T2 distributions with the measured T2 distribution. The optimization is performed using genetic algorithms and determines the surface relaxivity function whose corresponding simulated T2 distribution best matches the measured T2 distribution. In the proposed methodology, pore size is associated with the number of collisions in the random walk simulations. We illustrate the application of the proposed method by performing inversions from synthetic and laboratory input data and compare the obtained results with those obtained using the uniform relaxivity assumption.

      PubDate: 2017-07-09T04:53:19Z
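The random-walk simulation of NMR relaxation mentioned above can be sketched in one dimension: walkers diffuse inside a pore and are absorbed at the walls with a probability that plays the role of surface relaxivity, so the surviving fraction is a proxy for the magnetization decay. All parameters here are illustrative assumptions, not the authors' settings.

```python
import random

def simulate_decay(kill_prob, pore_size=20, walkers=2000, steps=200, seed=42):
    """Random-walk sketch of NMR surface relaxation (1-D): walkers diffuse
    over `pore_size` lattice sites; on hitting a wall they relax (are
    absorbed) with probability `kill_prob`, which plays the role of the
    surface relaxivity. Returns the surviving fraction of walkers, a proxy
    for the remaining magnetization."""
    rng = random.Random(seed)
    alive = 0
    for _ in range(walkers):
        pos = rng.randrange(pore_size)
        survived = True
        for _ in range(steps):
            pos += rng.choice((-1, 1))
            if pos < 0 or pos >= pore_size:        # collision with a wall
                if rng.random() < kill_prob:       # surface relaxation event
                    survived = False
                    break
                pos = min(max(pos, 0), pore_size - 1)  # reflect back inside
        if survived:
            alive += 1
    return alive / walkers
```

Running such a forward simulation for each candidate relaxivity function and comparing the resulting decays against the measured T2 distribution is the step a genetic algorithm would rank.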
  • Automatic fracture detection based on Terrestrial Laser Scanning data: A
           new method and case study
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Ting Cao, Ancheng Xiao, Lei Wu, Liguang Mao
      Terrestrial Laser Scanning (TLS), widely known as light detection and ranging (LiDAR) technology, is increasingly used to rapidly obtain three-dimensional (3-D) geometry or highly detailed digital terrain models with millimetric point precision and accuracy. In this contribution, we propose a simple and unbiased approach to identify fractures directly from a 3-D surface model of natural outcrops generated from TLS data and thus acquire surface density, which can provide important supplementary data for fracture-related research. One outcrop from the Shizigou anticline in the Qaidam Basin (NW China) is taken as a case study to validate the method and obtain optimal parameters, using reference surface densities measured in the field and from high-resolution photographs. The results show that with suitable parameters the proposed method can identify most structural fractures quickly, providing a way to extract structural fractures from virtual outcrops based on TLS data. Furthermore, it can aid the analysis of fracture development and related problems.

      PubDate: 2017-07-09T04:53:19Z
  • Comments on “computation of the gravity field and its
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Dailei Zhang, Danian Huang

      PubDate: 2017-07-09T04:53:19Z
  • WASS: an Open-Source Pipeline for 3D Stereo Reconstruction of Ocean Waves
    • Abstract: Publication date: Available online 6 July 2017
      Source:Computers & Geosciences
      Author(s): Filippo Bergamasco, Andrea Torsello, Mauro Sclavo, Francesco Barbariol, Alvise Benetazzo
      Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community and industry. Indeed, recent advances in both computer vision algorithms and computer processing power now allow the study of the spatio-temporal wave field with unprecedented accuracy, especially at small scales. Even if simple in theory, many details are difficult to master in practice, so that the implementation of a sea-wave 3D reconstruction pipeline is in general considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the reconstruction process from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS, an open-source stereo processing pipeline for sea-wave 3D reconstruction. Our tool completely automates all the steps required to estimate dense point clouds from stereo images. Namely, it computes the extrinsic parameters of the stereo rig so that no delicate calibration has to be performed in the field. It implements a fast 3D dense stereo reconstruction procedure based on the consolidated OpenCV library and, lastly, it includes a set of filtering techniques, both on the disparity map and on the produced point cloud, to remove the vast majority of the erroneous points that naturally arise while analyzing the optically complex nature of the water surface. In this paper, we describe the architecture of WASS and the internal algorithms involved. The pipeline workflow is shown step by step and demonstrated on real datasets acquired at sea.

      PubDate: 2017-07-09T04:53:19Z
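The core geometric step of any such stereo pipeline, turning a disparity into a depth for a rectified camera pair, is Z = f·B/d. The sketch below uses an idealised pinhole model with hypothetical parameters; it is not WASS's actual code.

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth from disparity for a rectified stereo pair: Z = f * B / d.
    `focal_px` is the focal length in pixels, `baseline_m` the camera
    separation in metres; larger disparity means a closer point."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

def pixel_to_point(u, v, disparity_px, focal_px, baseline_m, cx, cy):
    """Back-project pixel (u, v) with known disparity into a 3-D point in
    the left-camera frame (simple pinhole model, no lens distortion)."""
    z = disparity_to_depth(disparity_px, focal_px, baseline_m)
    return ((u - cx) * z / focal_px, (v - cy) * z / focal_px, z)
```

Applying `pixel_to_point` over a whole disparity map is what produces the dense point cloud that the pipeline's later filtering stages operate on.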
  • A framework for simulation and inversion in electromagnetics
    • Abstract: Publication date: Available online 3 July 2017
      Source:Computers & Geosciences
      Author(s): Lindsey J. Heagy, Rowan Cockett, Seogi Kang, Gudni K. Rosenkjaer, Douglas W. Oldenburg
      Simulations and inversions of electromagnetic geophysical data are paramount for discerning meaningful information about the subsurface from these data. Depending on the nature of the source, electromagnetic experiments may be classified as time-domain or frequency-domain. Multiple heterogeneous, and sometimes anisotropic, physical properties, including electrical conductivity and magnetic permeability, may need to be considered in a simulation. Depending on what one wants to accomplish in an inversion, the parameters one inverts for may be a voxel-based description of the earth or some parametric representation that must be mapped onto a simulation mesh. Each of these permutations of the electromagnetic problem has implications for a numerical implementation of the forward simulation, as well as for the computation of the sensitivities, which are required in gradient-based inversions. This paper proposes a framework for organizing and implementing electromagnetic simulations and gradient-based inversions in a modular, extensible fashion. We take an object-oriented approach for defining and organizing each of the necessary elements in an electromagnetic simulation, including the physical properties, sources, formulation of the discrete problem to be solved, the resulting fields and fluxes, and the receivers used to sample the electromagnetic responses. A corresponding implementation is provided as part of the open-source simulation and parameter estimation project SimPEG. The application of the framework is demonstrated through two synthetic examples and one field example. The first example shows the application of the common framework for 1D time-domain and frequency-domain inversions. The second is a field example that demonstrates a 1D inversion of electromagnetic data collected over the Bookpurnong Irrigation District in Australia. The final example is a 3D example which shows how the modular implementation is used to compute the sensitivity for a parametric model where a transmitter is positioned inside a steel-cased well.

      PubDate: 2017-07-09T04:53:19Z
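The modular organisation described, a mapping from inversion parameters to mesh properties, a simulation producing predicted data, and sensitivities assembled by the chain rule, can be sketched abstractly. The class and method names below are illustrative only, not SimPEG's actual API, and a trivial linear forward problem stands in for the electromagnetic solve.

```python
import math

class ExpMap:
    """Maps inversion parameters m (log-conductivity) onto the physical
    property used by the simulation (conductivity): sigma = exp(m)."""
    def apply(self, m):
        return [math.exp(mi) for mi in m]
    def deriv(self, m):
        # Diagonal of d(sigma)/d(m); the map acts element-wise.
        return [math.exp(mi) for mi in m]

class LinearSimulation:
    """Trivial forward problem d = G * sigma standing in for an EM solve."""
    def __init__(self, G, mapping):
        self.G, self.mapping = G, mapping
    def dpred(self, m):
        """Predicted data for inversion parameters m."""
        sigma = self.mapping.apply(m)
        return [sum(g * s for g, s in zip(row, sigma)) for row in self.G]
    def sensitivity(self, m):
        """Chain rule: J = (dF/dsigma) (dsigma/dm); here dF/dsigma = G and
        dsigma/dm is diagonal, so scale each column of G."""
        dm = self.mapping.deriv(m)
        return [[g * d for g, d in zip(row, dm)] for row in self.G]
```

Swapping `ExpMap` for another mapping (e.g. a parametric representation) changes the sensitivity only through `deriv`, which is the kind of extensibility point the framework argues for.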
  • Wind wave analysis in depth limited water using OCEANLYZ, A MATLAB toolbox
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Arash Karimpour, Qin Chen
      There are a number of well-established methods in the literature describing how to assess and analyze measured wind wave data. However, obtaining reliable results from these methods requires adequate knowledge of their behavior, strengths and weaknesses. A proper implementation of these methods requires a series of procedures, including pretreatment of the raw measurements and adjustment and refinement of the processed data, to provide quality assurance of the outcomes; otherwise they can lead to untrustworthy results. This paper discusses potential issues in these procedures, explains which parameters are influential for the outcomes, and suggests practical solutions to avoid and minimize errors in the wave results. The procedures of converting water pressure data into water surface elevation data, treating high-frequency data with a low signal-to-noise ratio, partitioning swell energy from wind sea, and estimating the peak wave frequency from a weighted integral of the wave power spectrum are described. Conversion and recovery of data acquired by a pressure transducer, particularly in depth-limited water such as estuaries and lakes, are explained in detail. To provide researchers with tools for reliable estimation of wind wave parameters, the Ocean Wave Analyzing toolbox, OCEANLYZ, is introduced. The toolbox contains a number of MATLAB functions for estimation of wave properties in the time and frequency domains. The toolbox has been developed and tested during a number of field study projects in Louisiana's estuaries.

      PubDate: 2017-06-28T12:16:45Z
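One refinement mentioned above, estimating the peak wave frequency from a weighted integral of the power spectrum rather than from the single largest bin, can be sketched as follows. The exponent and the synthetic spectrum are illustrative assumptions, not OCEANLYZ's internals, and the sketch is in Python rather than MATLAB for self-containment.

```python
import math

def weighted_peak_frequency(freqs, spectrum, q=5):
    """Peak frequency as a weighted integral of the power spectrum:
    fp = sum(f * S**q) / sum(S**q). Raising S to the exponent q concentrates
    the weight near the spectral peak, which is less noisy than taking the
    argmax over bins; q = 5 is a common but here assumed choice."""
    num = sum(f * s**q for f, s in zip(freqs, spectrum))
    den = sum(s**q for s in spectrum)
    return num / den

# Synthetic single-peaked spectrum centred at 0.1 Hz (illustrative only).
freqs = [0.02 * k for k in range(1, 26)]                  # 0.02 .. 0.50 Hz
spectrum = [math.exp(-(((f - 0.1) / 0.03) ** 2)) for f in freqs]
```

On a noisy measured spectrum, the integral estimate varies far less between realisations than the location of the single highest bin.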
  • Automatic identification of watercourses in flat and engineered landscapes
           by computing the skeleton of a LiDAR point cloud
    • Abstract: Publication date: Available online 13 June 2017
      Source:Computers & Geosciences
      Author(s): Tom Broersen, Ravi Peters, Hugo Ledoux
      Drainage networks play a crucial role in protecting land against floods. It is therefore important to have an accurate map of the watercourses that form the drainage network. Previous work on the automatic identification of watercourses was typically based on grids, focused on natural landscapes, and used mostly the slope and curvature of the terrain. We focus in this paper on areas characterised by low-lying, flat, and engineered landscapes, such as are characteristic of the Netherlands. We propose a new methodology to identify watercourses automatically from elevation data; it uses solely a raw classified LiDAR point cloud as input. We show that by computing a skeleton of the point cloud twice, once in 2D and once in 3D, and by using the properties of the two skeletons, we can identify most of the watercourses. We have implemented our methodology and tested it for three different soil types around Utrecht, the Netherlands. We were able to detect 98% of the watercourses for one soil type, and around 75% in the worst case, when compared to a reference dataset that was obtained semi-automatically.

      PubDate: 2017-06-15T07:48:41Z
  • Subsetting hyperspectral core imaging data using a
           graphic-identification-based IDL program
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Jun-Ting Qiu, Chuan Zhang, Zhang-Fa Yu, Qing-Jun Xu, Ding Wu, Wei-Wei Li, Jia-Lei Yao
      This study presents an IDL program to subset hyperspectral drill core imagery automatically based on graphic identification. A HySpex SWIR-320m-e imager and drill cores from the Xiangshan uranium deposit were used in an application test. Based on the HySpex images, we found that a ratio variation tolerance of 75%, a minimum marker size of 37 pixel × 37 pixel (28 mm × 28 mm), and a wavelength of 1141.3 nm (band #30) are the preferred settings for the IDL program. The results indicate that the IDL program subsets hyperspectral images with high accuracy and efficiency without consuming additional time during the scanning process. Additionally, deformation of the core box, the material from which the core box is made, and variation in the diameter of the drill core do not significantly affect the quality of the results.

      PubDate: 2017-06-11T07:43:44Z
  • Applicability of computer-aided comprehensive tool (LINDA: LINeament
           Detection and Analysis) and shaded digital elevation model for
           characterizing and interpreting morphotectonic features from lineaments
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Alaa Masoud, Katsuaki Koike
      Detection and analysis of linear features related to surface and subsurface structures are necessary in natural resource exploration and earth surface instability assessment. Subjectivity in choosing the control parameters required by conventional methods of lineament detection may cause unreliable results. To reduce this ambiguity, we developed LINDA (LINeament Detection and Analysis), an integrated tool with a graphical user interface in Visual Basic. This tool automates the detection and analysis of linear features from grid data of topography (digital elevation model; DEM), gravity and magnetic surfaces, as well as data from remote sensing imagery. A simple interface with five display windows forms a user-friendly interactive environment. The interface facilitates grid data shading, detection and grouping of segments, lineament analyses for calculating strike and dip and estimating fault type, and interactive viewing of lineament geometry. Density maps of the center and intersection points of linear features (segments and lineaments) are also included. A systematic analysis of test DEM and Landsat 7 ETM+ imagery datasets in the North and South Eastern Deserts of Egypt is presented to demonstrate the capability of LINDA and the correct use of its functions. Linear features from the DEM are superior to those from the imagery in terms of frequency, but both agree with the locations and directions of V-shaped valleys and dykes and with reference fault data. Through the case studies, the applicability of LINDA to highlighting dominant structural trends is demonstrated, which can aid understanding of geodynamic frameworks in any region.

      PubDate: 2017-06-11T07:43:44Z
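The strike calculation such a lineament tool performs can be sketched from segment endpoints; folding the azimuth into [0, 180) degrees reflects that a lineament has no preferred direction. The conventions (north = +y, clockwise azimuth) and the rose-diagram binning are illustrative assumptions, not LINDA's internals.

```python
import math

def strike_deg(x1, y1, x2, y2):
    """Strike (azimuth) of a lineament segment, measured clockwise from
    north (+y), folded into [0, 180) because a lineament is undirected."""
    az = math.degrees(math.atan2(x2 - x1, y2 - y1)) % 360.0
    return az % 180.0

def rose_bins(segments, bin_width=10):
    """Histogram of strikes for a rose diagram; keys are bin lower edges.
    `segments` is a list of (x1, y1, x2, y2) tuples."""
    counts = {}
    for seg in segments:
        b = int(strike_deg(*seg) // bin_width) * bin_width
        counts[b] = counts.get(b, 0) + 1
    return counts
```

Peaks in the resulting histogram correspond to the dominant structural trends the abstract refers to.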
  • A Relevancy Algorithm for Curating Earth Science Data around Phenomenon
    • Abstract: Publication date: Available online 10 June 2017
      Source:Computers & Geosciences
      Author(s): Manil Maskey, Rahul Ramachandran, Xiang Li, Amanda Weigel, Kaylin Bugbee, Patrick Gatlin, J.J. Miller
      Earth science data are collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different centers for distribution and stewardship, which makes data discovery difficult. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate the search and selection of data around specific phenomena. The components of the methodology, including the assumptions, the process, and the relevancy ranking algorithm, are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, it describes a novel methodology for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a stand-alone web service that is used to augment the search and usability of data in a variety of tools.

      PubDate: 2017-06-11T07:43:44Z
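A minimal sketch of curating records around a phenomenon: score each metadata record by its overlap with the topic's terms and keep the best matches. The scoring scheme and the record field names are illustrative placeholders, not the paper's relevancy algorithm.

```python
def relevancy(record_keywords, topic_terms):
    """Fraction of the topic's terms that appear among a record's keywords.
    A deliberately simple score; the paper's ranking algorithm is richer."""
    present = {k.lower() for k in record_keywords}
    hits = sum(1 for t in topic_terms if t.lower() in present)
    return hits / len(topic_terms)

def curate(records, topic_terms, threshold=0.5):
    """Rank records by relevancy to the topic and keep those above the
    threshold; `records` are dicts with hypothetical 'id'/'keywords' keys."""
    scored = sorted(((relevancy(r["keywords"], topic_terms), r["id"])
                     for r in records), reverse=True)
    return [rid for score, rid in scored if score >= threshold]
```

Exposing `curate` behind a web endpoint is the shape of the stand-alone service described in the abstract.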
  • A Feature Selection Approach towards Progressive Vector Transmission over
           the Internet
    • Abstract: Publication date: Available online 8 June 2017
      Source:Computers & Geosciences
      Author(s): Ru Miao, Jia Song, Min Feng
      WebGIS has been widely applied for visualizing and sharing geospatial information over the Internet. To improve the efficiency of client applications, a web-based progressive vector transmission approach is proposed. Important features should be selected and transferred first, so methods for measuring the importance of features must be considered in progressive transmission. However, studies on progressive transmission of large-volume vector data have mostly focused on map generalization in the field of cartography and have rarely addressed the quantitative selection of geographic features. This paper applies information theory to measuring the feature importance of vector maps. A model measuring the amount of information carried by a vector feature is defined to deal with the feature selection problem; it involves a geometry factor, a spatial distribution factor and a thematic attribute factor. Moreover, a real-time transport protocol (RTP)-based progressive transmission method is presented to improve the transmission of vector data. To demonstrate the essential methodology and key techniques, a prototype for web-based progressive vector transmission is presented, and an experiment on progressive selection and transmission of vector features is conducted. The experimental results indicate that our approach clearly improves the performance and end-user experience of delivering and manipulating large vector data over the Internet.

      PubDate: 2017-06-11T07:43:44Z
  • WFCatalog: a catalogue for seismological waveform data
    • Abstract: Publication date: Available online 8 June 2017
      Source:Computers & Geosciences
      Author(s): Luca Trani, Mathijs Koymans, Malcolm Atkinson, Reinoud Sleeman, Rosa Filgueira
      This paper reports advances in seismic waveform description and discovery leading to a new seismological service and presents the key steps in its design, implementation and adoption. This service, named WFCatalog (for waveform catalogue), accommodates features of seismological waveform data. It thereby meets seismologists' need to select waveform data based on seismic waveform features as well as sensor geolocations and temporal specifications. We describe the collaborative design methods and the technical solution, showing the central role of seismic feature catalogues in framing the technical and operational delivery of the new service. We also provide an overview of the complex environment in which this endeavour is scoped and discuss the related challenges. As multi-disciplinary, multi-organisational and global collaboration is necessary to address today's challenges, canonical representations can provide a focus for collaboration and conceptual tools for agreeing on directions. Such collaborations can be fostered and formalised by rallying intellectual effort into the design of novel scientific catalogues and the services that support them. This work offers an example of the benefits generated by involving cross-disciplinary skills (e.g. data and domain expertise) from the early stages of design, and by sustaining engagement with the target community throughout the delivery and deployment process.

      PubDate: 2017-06-11T07:43:44Z
  • Optimal estimation of areal values of near-land-surface temperatures for
           testing global and local spatio-temporal trends
    • Abstract: Publication date: Available online 6 June 2017
      Source:Computers & Geosciences
      Author(s): Hong Wang, Eulogio Pardo-Igúzquiza, Peter A. Dowd, Yongguo Yang
      This paper provides a solution to the problem of estimating the mean value of near-land-surface temperature over a relatively large area (here, by way of example, applied to mainland Spain covering an area of around half a million square kilometres) from a limited number of weather stations covering a non-representative (biased) range of altitudes. As evidence mounts for altitude-dependent global warming, this bias is a significant problem when temperatures at high altitudes are under-represented. We correct this bias by using altitude as a secondary variable and using a novel clustering method for identifying geographical regions (clusters) that maximize the correlation between altitude and mean temperature. In addition, the paper provides an improved regression kriging estimator, which is optimally determined by the cluster analysis. The optimal areal values of near-land-surface temperature are used to generate time series of areal temperature averages in order to assess regional changes in temperature trends. The methodology is applied to records of annual mean temperatures over the period 1950-2011 across mainland Spain. The robust non-parametric Theil-Sen method is used to test for temperature trends in the regional temperature time series. Our analysis shows that, over the 62-year period of the study, 78% of mainland Spain has had a statistically significant increase in annual mean temperature.
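      The trend test described at the end of the abstract can be sketched with SciPy's Theil-Sen estimator, paired here with Kendall's tau as a Mann-Kendall-style significance check. This is a simplified stand-in for the paper's procedure; the synthetic data and variable names are illustrative.

```python
import numpy as np
from scipy.stats import theilslopes, kendalltau

def trend_test(years, temps, alpha=0.05):
    """Robust Theil-Sen slope plus a non-parametric significance test
    (Kendall's tau) on an annual temperature series."""
    slope, intercept, lo, hi = theilslopes(temps, years, alpha=1 - alpha)
    tau, p = kendalltau(years, temps)
    return slope, p < alpha

# Synthetic warming series: 0.02 degC/yr plus noise (illustrative)
rng = np.random.default_rng(0)
years = np.arange(1950, 2012)
temps = 14.0 + 0.02 * (years - 1950) + rng.normal(0.0, 0.1, years.size)
slope, significant = trend_test(years, temps)
```

Applied per region, such a test yields the kind of "statistically significant increase" statements the abstract reports.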

      PubDate: 2017-06-11T07:43:44Z
  • Unsupervised feature learning for autonomous rock image classification
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Lei Shu, Kenneth McIsaac, Gordon R. Osinski, Raymond Francis
      Autonomous rock image classification can enhance the capability of robots for geological detection and increase the scientific return, both in investigations on Earth and in planetary surface exploration on Mars. Since rock textural images are usually inhomogeneous and manually hand-crafted features are not always reliable, we propose an unsupervised feature learning method to autonomously learn the feature representation for rock images. In our tests, rock image classification using the learned features shows that they can outperform manually selected features. Self-taught learning is also proposed to learn the feature representation from a large database of unlabelled rock images of mixed classes. The learned features can then be used repeatedly for classification of any subclass. This takes advantage of the large dataset of unlabelled rock images and learns a general feature representation for many kinds of rocks. We show experimental results supporting the feasibility of self-taught learning on rock images.

      PubDate: 2017-06-06T07:57:39Z
  • Stochastic simulation by image quilting of process-based geological models
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Júlio Hoffimann, Céline Scheidt, Adrian Barfod, Jef Caers
      Process-based modeling offers a way to represent realistic geological heterogeneity in subsurface models. The main limitation lies in conditioning such models to data. Multiple-point geostatistics can use these process-based models as training images and address the data conditioning problem. In this work, we further develop image quilting as a method for 3D stochastic simulation capable of mimicking the realism of process-based geological models with minimal modeling effort (i.e. parameter tuning) and at the same time condition them to a variety of data. In particular, we develop a new probabilistic data aggregation method for image quilting that bypasses traditional ad-hoc weighting of auxiliary variables. In addition, we propose a novel criterion for template design in image quilting that generalizes the entropy plot for continuous training images. The criterion is based on the new concept of voxel reuse—a stochastic and quilting-aware function of the training image. We compare our proposed method with other established simulation methods on a set of process-based training images of varying complexity, including a real-case example of stochastic simulation of the buried-valley groundwater system in Denmark.

      PubDate: 2017-06-06T07:57:39Z
  • Porosity estimation by semi-supervised learning with sparsely available
           labeled samples
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Luiz Alberto Lima, Nico Görnitz, Luiz Eduardo Varella, Marley Vellasco, Klaus-Robert Müller, Shinichi Nakajima
      This paper addresses the problem of estimating porosity from seismic impedance volumes and porosity samples located in a small group of exploratory wells. Regression methods, trained on the impedance as input and the porosity as output labels, generally suffer from extremely expensive (and hence sparsely available) porosity samples. To make optimal use of the valuable porosity data, a semi-supervised machine learning method, Transductive Conditional Random Field Regression (TCRFR), was previously proposed and shown to perform well (Görnitz et al., 2017). TCRFR, however, still requires more labeled data than are usually available, which creates a gap when applying the method to porosity estimation in realistic situations. In this paper, we aim to fill this gap by introducing two graph-based preprocessing techniques that adapt the original TCRFR to extremely weakly supervised scenarios. Our new method outperforms previous automatic estimation methods on synthetic data and provides results comparable to the labor-intensive, time-consuming geostatistics approach on real data, proving its potential as a practical industrial tool.

      PubDate: 2017-06-06T07:57:39Z
  • Computationally efficient variable resolution depth estimation
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): B.R. Calder, G. Rice
      A new algorithm for data-adaptive, large-scale, computationally efficient estimation of bathymetry is proposed. The algorithm uses a first pass over the observations to construct a spatially varying estimate of data density, which is then used to predict the achievable estimate sample spacing for robust depth estimation across the area of interest. A low-resolution estimate of depth is also constructed during the first pass as a guide for further work. A piecewise-regular grid is then constructed following the sample spacing estimates, and accurate depth is finally estimated using the composite refined grid and an extended, re-implemented version of the CUBE algorithm. Resource-efficient data structures allow the algorithm to operate over large areas and large datasets without excessive compute resources; the modular design allows more complex spatial representations to be included if required. The proposed system is demonstrated on a pair of hydrographic datasets, illustrating the adaptation of the algorithm to different depth- and sensor-driven data densities. Although the algorithm was designed for bathymetric estimation, it could readily be used on other two-dimensional scalar fields where variable data density is a driver.

      PubDate: 2017-06-06T07:57:39Z
  • Spatial coding-based approach for partitioning big spatial data in Hadoop
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Xiaochuang Yao, Mohamed F. Mokbel, Louai Alarabi, Ahmed Eldawy, Jianyu Yang, Wenju Yun, Lin Li, Sijing Ye, Dehai Zhu
      Spatial data partitioning (SDP) plays a powerful role in distributed storage and parallel computing for spatial data. However, the skewed distribution of spatial data and the varying volumes of spatial vector objects make it challenging to ensure both optimal performance of spatial operations and data balance in the cluster. To tackle this problem, we propose a spatial coding-based approach for partitioning big spatial data in Hadoop. This approach first compresses the whole of the big spatial data based on a spatial coding matrix to create a sensing information set (SIS), including spatial code, size, count and other information. The SIS is then employed to build a spatial partitioning matrix, which is finally used to split all spatial objects into different partitions in the cluster. Based on our approach, neighbouring spatial objects can be partitioned into the same block, while data skew in the Hadoop distributed file system (HDFS) is minimized. The presented approach is compared, in a case study, against random-sampling-based partitioning on three measures: spatial index quality, data skew in HDFS, and range query performance. The experimental results show that our spatial coding technique can improve the query performance of big spatial data as well as the data balance in HDFS. We implemented and deployed this approach in Hadoop, and it can also efficiently support other distributed big spatial data systems.
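      One concrete way to realize a "spatial code" of this kind is bit interleaving, i.e. a Z-order (Morton) code. The sketch below is an illustrative simplification of the partitioning idea, not the paper's exact matrix-based scheme: sorting by Morton code keeps spatial neighbours adjacent, and cutting the sorted run into equal slices balances partition sizes.

```python
def morton_code(x, y, bits=16):
    """Interleave the bits of non-negative grid coordinates x and y
    into a single Z-order (Morton) code."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)
        code |= ((y >> i) & 1) << (2 * i + 1)
    return code

def partition(objects, n_parts):
    """Sort (x, y) objects by the Morton code of their grid cell and
    cut the sorted run into contiguous, size-balanced partitions, so
    neighbouring objects tend to land in the same partition."""
    keyed = sorted(objects, key=lambda o: morton_code(o[0], o[1]))
    size = -(-len(keyed) // n_parts)  # ceiling division
    return [keyed[i:i + size] for i in range(0, len(keyed), size)]
```

A distributed system would then assign each slice to one HDFS block, giving both spatial locality and load balance.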

      PubDate: 2017-06-06T07:57:39Z
  • Learning Characteristic Natural Gamma Shale Marker Signatures in Iron Ore
    • Abstract: Publication date: Available online 4 June 2017
      Source:Computers & Geosciences
      Author(s): D. Nathan, P. Duuring, E.J. Holden, D. Wedge, T. Horrocks
      Uncertainty in the location of stratigraphic boundaries in stratiform deposits has a direct impact on the uncertainty of resource estimates. The interpretation of stratigraphic boundaries in banded iron formation (BIF)-hosted deposits in the Hamersley province of Western Australia is made by recognizing shale markers which have characteristic signatures from natural gamma wireline logs. This paper presents a novel application of a probabilistic sequential model, named a continuous profile model, which is capable of jointly modelling the uncertainty in the amplitude and alignment of characteristic signatures. We demonstrate the accuracy of this approach by comparing three models that incorporate varying intensities of distortion and alignment in their ability to correctly identify a shale band of the West Angelas member of the Wittenoom Formation which overlies the Marra Mamba Iron Formation in the Hamersley Basin. Our experiments show that the proposed approach recovers 98.72% of interpreted shale band intervals and importantly quantifies the uncertainty in scale and alignment that contribute to probabilistic interpretations of stratigraphic boundaries.

      PubDate: 2017-06-06T07:57:39Z
  • Modelling the interaction of aeolian and fluvial processes with a combined
           cellular model of sand dunes and river systems
    • Abstract: Publication date: September 2017
      Source:Computers & Geosciences, Volume 106
      Author(s): Baoli Liu, Tom J. Coulthard
      Aeolian and fluvial processes are important agents for shaping the surface of the Earth, but they are largely studied in isolation despite there being many locations where both processes act together and influence each other. Using field data to investigate fluvial-aeolian interactions is, however, hampered by the short length of records and the low temporal resolution of observations. Here we use numerical modelling to investigate, for the first time, the interplay between aeolian (sand dune) and fluvial (river channel) processes. This modelling is carried out by combining two existing cellular models of aeolian and fluvial processes, which requires careful consideration of the different process representations and time stepping used. The result is a fully coupled (in time and space) sand dune – river model. Over a thousand-year simulation the model shows how the migration of sand dunes is readily blocked by rivers, yet aeolian processes can push the channel downwind. Over time, cyclic channel avulsions develop, indicating that aeolian action on fluvial systems may play an important part in governing avulsion frequency, and thus alluvial architecture.

      PubDate: 2017-05-26T18:16:29Z
  • Stochastic simulation of channelized sedimentary bodies using a
           constrained L-system
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Guillaume Rongier, Pauline Collon, Philippe Renard
      Simulating realistic sedimentary bodies while conditioning on all the available data is a major topic of research. We present a new method to simulate the channel morphologies resulting from deposition processes. It relies on a formal grammar system, the Lindenmayer system, or L-system. The L-system puts together channel segments based on user-defined rules and parameters. The succession of segments is then interpreted to generate non-rational uniform B-splines representing straight to meandering channels. Constraints attract or repel the channel from the data during channel development. They make it possible to condition on various data types, from well data to probability cubes or a confinement. The application to a synthetic case highlights the method's ability to manage various data while best preserving the channel morphology.
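      The grammar machinery at the core of the method can be illustrated in a few lines. The rewriting rule and symbol meanings below are toy choices, not the paper's channel grammar: `F` stands for a channel segment and `+`/`-` for turns, which an interpreter would later convert into B-spline control points.

```python
def l_system(axiom, rules, n):
    """Iteratively rewrite the axiom string n times using the given
    production rules; symbols without a rule are copied unchanged."""
    s = axiom
    for _ in range(n):
        s = "".join(rules.get(c, c) for c in s)
    return s

# Toy grammar: each segment is replaced by segment-turn-segment-turn-segment
rules = {"F": "F+F-F"}
```

Conditioning, as in the paper, would then be imposed while interpreting the resulting string, attracting or repelling the growing channel from data points.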

      PubDate: 2017-05-26T18:16:29Z
  • JMorph: Software for performing rapid morphometric measurements on digital
           images of fossil assemblages
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Peter G. Lelièvre, Melissa Grey
      Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.

      PubDate: 2017-05-22T09:57:50Z
  • GPU-based contouring method on grid DEM data
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Liheng Tan, Gang Wan, Feng Li, Xiaohui Chen, Wenlong Du
      This paper presents a novel method to generate contour lines from grid DEM data based on the programmable GPU pipeline. Previous contouring approaches often use the CPU to construct a finite element mesh from the raw DEM data and then extract contour segments from the elements; they also need a tracing or sorting strategy to generate the final continuous contours. These approaches can be CPU-intensive and time-consuming, and the generated contours can be unsmooth if the raw data are sparsely distributed. Unlike the CPU approaches, we employ the GPU's vertex shader to generate a triangular mesh with arbitrary user-defined density, in which the height of each vertex is calculated through a third-order Cardinal spline function. Then, in the same frame, segments are extracted from the triangles by the geometry shader and transferred to the CPU side in an internal order in the GPU's transform feedback stage. Finally, we propose a “Grid Sorting” algorithm to achieve continuous contour lines by traversing the segments only once. Our method makes use of multiple stages of the GPU pipeline for computation, generates smooth contour lines, and is significantly faster than the previous CPU approaches. The algorithm can be easily implemented with the OpenGL 3.3 API or higher on consumer-level PCs.

      PubDate: 2017-05-22T09:57:50Z
  • A new method for geochemical anomaly separation based on the distribution
           patterns of singularity indices
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Yue Liu, Kefa Zhou, Qiuming Cheng
      Singularity analysis is one of the most important models in the fractal/multifractal family and has been demonstrated to be an efficient tool for identifying hybrid distribution patterns of geochemical data, such as normal and multifractal distributions. However, the question of how to appropriately separate these patterns using reasonable thresholds has not been well answered. In the present study, a new method termed singularity-quantile (S-Q) analysis is proposed to separate multiple geochemical anomaly populations by integrating singularity analysis with quantile-quantile plot (QQ-plot) analysis. The new method characterizes the frequency distribution patterns of singularity indices by plotting singularity index quantiles vs. standard normal quantiles. From the perspective of geochemical element enrichment processes, the distribution patterns of singularity indices can be clearly separated into three groups by means of the new method, corresponding to element enrichment, element generality and element depletion, respectively. A case study of chromitite exploration based on geochemical data in the western Junggar region (China) was employed to examine the potential application of the new method. The results revealed that the proposed method is very sensitive to changes in singularity indices, separating them into three segments, when applied to characterize geochemical element enrichment processes. Hence, the S-Q method can be considered an efficient and powerful tool for separating hybrid geochemical anomalies on the basis of statistical and inherent fractal/multifractal properties.
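      The point set underlying such an S-Q plot is straightforward to compute. The sketch below is a minimal version (the function name and plotting positions are illustrative choices): straight-line segments on the resulting plot suggest separate populations, and their break points give the separation thresholds.

```python
import numpy as np
from scipy.stats import norm

def sq_points(alpha_indices):
    """Return (standard normal quantiles, sorted singularity indices),
    i.e. the point set of a singularity-quantile (S-Q) plot."""
    a = np.sort(np.asarray(alpha_indices, dtype=float))
    n = a.size
    probs = (np.arange(1, n + 1) - 0.5) / n   # Hazen plotting positions
    return norm.ppf(probs), a
```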

      PubDate: 2017-05-22T09:57:50Z
  • Performance prediction of finite-difference solvers for different computer architectures
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Mathias Louboutin, Michael Lange, Felix J. Herrmann, Navjot Kukreja, Gerard Gorman
      The life-cycle of a partial differential equation (PDE) solver is often characterized by three development phases: the development of a stable numerical discretization; the development of a correct (verified) implementation; and the optimization of the implementation for different computer architectures. Often it is only after significant time and effort has been invested that the performance bottlenecks of a PDE solver are fully understood, and the precise details vary between computer architectures. One way to mitigate this issue is to establish a reliable performance model that allows a numerical analyst to make reliable predictions of how well a numerical method will perform on a given computer architecture, before embarking upon potentially long and expensive implementation and optimization phases. The availability of a reliable performance model also saves developer effort, as it both informs the developer of what kinds of optimizations are beneficial and indicates when the maximum expected performance has been reached and optimization work should stop. We show how the discretization of a wave equation can be theoretically studied to understand the performance limitations of the method on modern computer architectures. We focus on the roofline model, now broadly used in the high-performance computing community, which considers the achievable performance in terms of the peak memory bandwidth and peak floating-point performance of a computer with respect to algorithmic choices. A first-principles analysis of operational intensity for key time-stepping finite-difference algorithms is presented. With this information available at the time of algorithm design, the expected performance on target computer systems can be used as a driver for algorithm design.
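      The roofline model itself reduces to one line: attainable performance is capped either by the machine's peak compute rate or by memory bandwidth times the kernel's operational intensity. The numbers in the example are illustrative, not measurements from the paper.

```python
def roofline(oi, peak_gflops, bandwidth_gbs):
    """Attainable performance (GFLOP/s) under the roofline model for a
    kernel with operational intensity oi (FLOPs per byte moved)."""
    return min(peak_gflops, bandwidth_gbs * oi)

# A low-order stencil with oi = 0.5 FLOP/byte on a machine with
# 1000 GFLOP/s peak and 100 GB/s bandwidth is memory-bound at 50 GFLOP/s.
```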

      PubDate: 2017-05-22T09:57:50Z
  • A geophone wireless sensor network for investigating glacier stick-slip motion
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Kirk Martinez, Jane K. Hart, Philip J. Basford, Graeme M. Bragg, Tyler Ward, David S. Young
      We have developed an innovative passive borehole geophone system, as part of a wireless environmental sensor network, to investigate glacier stick-slip motion. The new geophone nodes use an ARM Cortex-M3 processor with a low-power design capable of running on battery power while embedded in the ice. Only data from seismic events were stored, held temporarily on a micro-SD card until retrieved by systems on the glacier surface that are connected to the internet. The sampling rates, detection and filtering levels were determined from a field trial using a standard commercial passive seismic system. The new system was installed on the Skalafellsjökull glacier in Iceland and provided encouraging results. These showed a relationship between surface melt-water production and seismic events (ice quakes), which occurred in a pattern related to the glacier's surface melt-water-controlled velocity changes (stick-slip motion). Three types of seismic events were identified, interpreted to reflect a pattern of till deformation (Type A), basal sliding (Type B) and hydraulic transience (Type C) associated with stick-slip motion.

      PubDate: 2017-05-17T09:52:55Z
  • Gaussian process emulators for quantifying uncertainty in CO2 spreading
           predictions in heterogeneous media
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Liang Tian, Richard Wilkinson, Zhibing Yang, Henry Power, Fritjof Fagerlund, Auli Niemi
      We explore the use of Gaussian process emulators (GPE) in the numerical simulation of CO2 injection into a deep heterogeneous aquifer. The model domain is a two-dimensional, log-normally distributed stochastic permeability field. We first estimate the cumulative distribution functions (CDFs) of the CO2 breakthrough time and the total CO2 mass using a computationally expensive Monte Carlo (MC) simulation. We then show that we can accurately reproduce these CDF estimates with a GPE, using only a small fraction of the computational cost required by traditional MC simulation. In order to build a GPE that can predict the simulator output from a permeability field consisting of thousands of values, we use a truncated Karhunen-Loève (K-L) expansion of the permeability field, which enables the application of the Bayesian functional regression approach. We perform a cross-validation exercise to give insight into the optimization of the experimental design for selected scenarios: we find that a training set of a few hundred samples is sufficient and that as few as 15 K-L components are adequate. Our work demonstrates that a GPE with a truncated K-L expansion can be effectively applied to uncertainty analysis associated with the modelling of multiphase flow and transport processes in heterogeneous media.
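      The dimension-reduction step can be sketched as an empirical (data-driven) truncated K-L expansion, computed here via an SVD of a centred ensemble of flattened fields. This is a generic sketch under that assumption, not the paper's exact construction, and the function names are illustrative.

```python
import numpy as np

def kl_truncate(fields, k):
    """Empirical truncated Karhunen-Loeve expansion from an ensemble of
    flattened permeability fields (one field per row): returns the mean
    field, the k leading modes and each field's k coefficients."""
    X = np.asarray(fields, dtype=float)
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centred ensemble yields the empirical K-L modes
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    modes = Vt[:k]              # shape (k, n_cells)
    coeffs = Xc @ modes.T       # shape (n_fields, k)
    return mean, modes, coeffs

def reconstruct(mean, modes, coeffs):
    """Rebuild (approximate) fields from the truncated expansion."""
    return mean + coeffs @ modes
```

An emulator is then trained on the low-dimensional coefficient vectors instead of the full fields, which is what makes the Bayesian regression tractable.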

      PubDate: 2017-05-17T09:52:55Z
  • Matching pursuit parallel decomposition of seismic data
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Chuanhui Li, Fanchang Zhang
      In order to improve the computation speed of matching pursuit decomposition of seismic data, a matching pursuit parallel algorithm is designed in this paper. In every iteration, we pick a fixed number of envelope peaks from the current signal, according to the number of compute nodes, and distribute them evenly among the compute nodes to search for the optimal Morlet wavelets in parallel. With the help of parallel computer systems and the Message Passing Interface, the parallel algorithm exploits the advantages of parallel computing to significantly improve the computation speed of matching pursuit decomposition, and it also scales well. Moreover, having each compute node search for only one optimal Morlet wavelet per iteration is the most efficient implementation.
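      The step that the paper parallelizes is the candidate search within one matching-pursuit iteration. The sketch below shows a serial version of that step; the Morlet-type atom parameterization and function names are illustrative assumptions, and in the paper each node would evaluate its own share of the candidate list.

```python
import numpy as np

def morlet(t, t0, f, scale):
    """Real-valued Morlet-type atom centred at t0 (an illustrative
    parameterization, not necessarily the paper's exact wavelet)."""
    u = (t - t0) / scale
    return np.exp(-u**2 / 2) * np.cos(2 * np.pi * f * (t - t0))

def mp_step(signal, t, candidates):
    """One matching-pursuit iteration: project the signal onto each
    candidate atom and subtract the best-matching one.  The candidate
    loop is what would be distributed across compute nodes."""
    best_params, best_c, best_atom = None, 0.0, None
    for (t0, f, scale) in candidates:
        atom = morlet(t, t0, f, scale)
        atom = atom / np.linalg.norm(atom)
        c = float(signal @ atom)          # projection coefficient
        if abs(c) > abs(best_c):
            best_params, best_c, best_atom = (t0, f, scale), c, atom
    return best_params, signal - best_c * best_atom
```

Iterating `mp_step` on the residual until its energy is small yields the full decomposition.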

      PubDate: 2017-05-17T09:52:55Z
  • Software for determining the direction of movement, shear and normal
           stresses of a fault under a determined stress state
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Alejandra Álvarez del Castillo, Susana Alicia Alaniz-Álvarez, Angel Francisco Nieto-Samaniego, Shunshan Xu, Gil Humberto Ochoa-González, Luis Germán Velasquillo-Martínez
      In the oil, gas and geothermal industry, the extraction or injection of fluids induces changes in the stress field of the reservoir. If the in-situ stress state of a fault plane is sufficiently disturbed, the fault may slip and trigger fluid leakage, or the reservoir might fracture and become damaged. The goal of the SSLIPO 1.0 software is to obtain data that can reduce the risk of affecting the stability of wellbores. The input data are the magnitudes of the three principal stresses and their orientations in geographic coordinates. The output data are the slip direction of a fracture in geographic coordinates and its normal (σn) and shear (τ) stresses resolved on one or more fracture planes. With this information, it is possible to calculate the slip tendency (τ/σn) and the propensity to open a fracture, which is inversely proportional to σn. The software can analyze any compressional stress system, even non-Andersonian ones. An example is given from an oilfield in southern Mexico, in a region that contains fractures formed in three deformation events; SSLIPO 1.0 was used to determine in which deformation event the oil migrated. SSLIPO 1.0 is an open-code application developed in MATLAB, distributed as source code and as an installer (SSLIPO_pkg.exe).
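      The core computation can be sketched directly from Cauchy's formula: the traction on a plane is the stress tensor applied to the plane's unit normal, from which the normal stress, shear stress and slip tendency follow. This is a minimal version in Python (SSLIPO itself is a MATLAB application and also handles geographic coordinate transforms); the function name is illustrative.

```python
import numpy as np

def fault_stresses(stress, normal):
    """For a stress tensor S and plane with unit normal n, return the
    normal stress sigma_n, shear stress magnitude tau, and the slip
    tendency tau / sigma_n.  The slip direction is the in-plane
    component of the traction, shear_vec / tau."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    t = np.asarray(stress, dtype=float) @ n   # traction vector (Cauchy)
    sigma_n = float(n @ t)                    # normal component
    shear_vec = t - sigma_n * n               # in-plane component
    tau = float(np.linalg.norm(shear_vec))
    return sigma_n, tau, tau / sigma_n
```

For principal stresses (50, 30, 20) and a plane at 45° between the sigma1 and sigma3 axes, this gives sigma_n = 35 and tau = 15, matching (sigma1 + sigma3)/2 and (sigma1 - sigma3)/2.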

      PubDate: 2017-05-17T09:52:55Z
  • RINGMesh: A programming library for developing mesh-based geomodeling applications
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Jeanne Pellerin, Arnaud Botella, François Bonneau, Antoine Mazuyer, Benjamin Chauvin, Bruno Lévy, Guillaume Caumon
      RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models; it is neither a geomodeler nor a meshing software. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows the development of new geomodeling methods and the integration of external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at

      PubDate: 2017-05-17T09:52:55Z
  • PyRQA—Conducting recurrence quantification analysis on very long
           time series efficiently
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Tobias Rawald, Mike Sips, Norbert Marwan
      PyRQA is a software package that efficiently conducts recurrence quantification analysis (RQA) on time series consisting of more than one million data points. RQA is a method from non-linear time series analysis that quantifies the recurrent behaviour of systems. Existing implementations of RQA are either incapable of analysing such very long time series at all or require large amounts of time to calculate the quantitative measures. PyRQA overcomes their limitations by conducting the RQA computations in a highly parallel manner. Building on the OpenCL framework, PyRQA leverages the computing capabilities of a variety of parallel hardware architectures, such as GPUs. The underlying computing approach partitions the RQA computations and makes it possible to employ multiple compute devices at the same time. The goal of this publication is to demonstrate the features and the runtime efficiency of PyRQA. For this purpose we employ a real-world example, comparing the dynamics of two climatological time series, and a synthetic example, reducing the runtime of analysing a series of over one million data points from almost eight hours using state-of-the-art RQA software to roughly 69 s using PyRQA.
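      What RQA quantifies can be illustrated with a naive recurrence-rate computation. This is only the O(n²) textbook definition on a scalar series, which becomes infeasible for very long series; PyRQA's own API is not reproduced here.

```python
import numpy as np

def recurrence_rate(series, eps):
    """Fraction of pairs (i, j) whose values lie closer than eps:
    the simplest RQA measure, from the full recurrence matrix R."""
    x = np.asarray(series, dtype=float)
    R = np.abs(x[:, None] - x[None, :]) < eps   # n x n recurrence matrix
    return R.mean()
```

The memory and time costs of materializing this n × n matrix are precisely what motivate PyRQA's partitioned, multi-device computation.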

      PubDate: 2017-05-17T09:52:55Z
  • Extending R packages to support 64-bit compiled code: An illustration with
           spam64 and GIMMS NDVI3g data
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Florian Gerber, Kaspar Mösinger, Reinhard Furrer
      Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is—besides performance—the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. On the R side, users do not need to change existing code and may not even notice the extension; interfacing 64-bit compiled code efficiently, on the other hand, is challenging. The extension is demonstrated with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
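As a toy illustration of the general interfacing problem (not the specific mechanism used by spam64), an interface limited to 32-bit integers can still carry a 64-bit index by splitting it into two 32-bit halves and reassembling it on the compiled side:

```python
def split64(x):
    # split a 64-bit unsigned integer into (high, low) 32-bit halves
    assert 0 <= x < 2 ** 64
    return (x >> 32) & 0xFFFFFFFF, x & 0xFFFFFFFF

def join64(hi, lo):
    # reassemble the original 64-bit value on the other side of the interface
    return (hi << 32) | lo
```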

      PubDate: 2017-05-17T09:52:55Z
  • Bayesian inference of spectral induced polarization parameters for
           laboratory complex resistivity measurements of rocks and soils
    • Abstract: Publication date: August 2017
      Source:Computers & Geosciences, Volume 105
      Author(s): Charles L. Bérubé, Michel Chouteau, Pejman Shamsipour, Randolph J. Enkin, Gema R. Olivo
      Spectral induced polarization (SIP) measurements are now widely used to infer mineralogical or hydrogeological properties from the low-frequency electrical properties of the subsurface, in both mineral exploration and environmental sciences. We present an open-source program that performs fast multi-model inversion of laboratory complex resistivity measurements using Markov-chain Monte Carlo simulation. Using this stochastic method, SIP parameters and their uncertainties may be obtained from the Cole-Cole and Dias models, or from the Debye and Warburg decomposition approaches. The program is tested on synthetic and laboratory data to show that the posterior distribution of a multiple Cole-Cole model is multimodal in particular cases, whereas the Warburg and Debye decomposition approaches yield unique solutions in all cases. It is shown that an adaptive Metropolis algorithm performs faster and is less dependent on the initial parameter values than the Metropolis-Hastings step method when inverting SIP data through the decomposition schemes; there is no advantage in using an adaptive step method for well-defined Cole-Cole inversion. Finally, the influence of measurement noise on the recovered relaxation time distribution is explored. We provide the geophysics community with an open-source platform that can serve as a base for further developments in stochastic SIP data inversion and that may be used to perform parameter analysis with various SIP models.
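A minimal sketch of the approach, assuming a single Cole-Cole dispersion and a basic (non-adaptive) Metropolis sampler; parameter names, step sizes and priors here are illustrative, not those of the published program:

```python
import math
import random

def cole_cole(freqs, rho0, m, tau, c):
    # complex resistivity of a single Cole-Cole dispersion
    return [rho0 * (1 - m * (1 - 1 / (1 + (2j * math.pi * f * tau) ** c)))
            for f in freqs]

def log_likelihood(params, freqs, data, sigma):
    model = cole_cole(freqs, *params)
    return -sum(abs(d - v) ** 2 for d, v in zip(data, model)) / (2 * sigma ** 2)

def metropolis(freqs, data, start, step, n_iter=2000, sigma=0.1, seed=0):
    # basic Metropolis sampler over (rho0, m, tau, c)
    rng = random.Random(seed)
    chain = [start]
    ll = log_likelihood(start, freqs, data, sigma)
    for _ in range(n_iter):
        cand = tuple(p + rng.gauss(0.0, s) for p, s in zip(chain[-1], step))
        # reject physically meaningless parameter values outright
        if cand[0] <= 0 or cand[1] < 0 or cand[2] <= 0 or not 0 < cand[3] <= 1:
            continue
        ll_cand = log_likelihood(cand, freqs, data, sigma)
        if rng.random() < math.exp(min(0.0, ll_cand - ll)):
            chain.append(cand)
            ll = ll_cand
    return chain
```

The adaptive variant discussed in the abstract would additionally tune the proposal covariance from the accepted samples; histograms of the chain then give the parameter uncertainties.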

      PubDate: 2017-05-06T21:34:07Z
  • Fractal generator for efficient production of random planar patterns and
           symbols in digital mapping
    • Abstract: Publication date: Available online 5 May 2017
      Source:Computers & Geosciences
      Author(s): Qiyu Chen, Gang Liu, Xiaogang Ma, Xinchuan Li, Zhenwen He
      In digital cartography, the automatic generation of random planar patterns and symbols is still an ongoing challenge. These random patterns and symbols have randomly varied configurations and boundaries, and their generating algorithms are constrained by shape features, cartographic standards and many other conditions. Fractal geometry offers favorable solutions for simulating random boundaries and patterns. In the work presented in this paper, we used both fractal theory and random Iterated Function Systems (IFS) to develop a method for the automatic generation of random planar patterns and symbols. Marshland and trough cross-bedding patterns were used as two case studies for the implementation of the method. We first analyzed the morphological characteristics of these two planar patterns. Then we designed algorithms and implementation schemes addressing the features of each pattern. Finally, we ran the algorithms to generate the patterns and symbols, and compared them against the requirements of several digital cartographic standards. The method presented in this paper has already been deployed in a digital mapping system for practical use. The flexibility of the method also allows it to be reused and/or adapted in various software platforms for digital mapping.
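The random-IFS idea can be illustrated with the classic "chaos game", here with the Sierpinski triangle maps standing in for a cartographic pattern (the paper's marshland and cross-bedding symbols use their own fitted map coefficients):

```python
import random

def run_ifs(maps, weights, n_points=5000, seed=0):
    # "chaos game": repeatedly apply a randomly chosen affine contraction
    # (a, b, c, d, e, f) meaning (x, y) -> (a*x + b*y + e, c*x + d*y + f)
    rng = random.Random(seed)
    x, y = 0.0, 0.0
    pts = []
    for i in range(n_points + 20):           # first iterations are transient
        a, b, c, d, e, f = rng.choices(maps, weights)[0]
        x, y = a * x + b * y + e, c * x + d * y + f
        if i >= 20:
            pts.append((x, y))
    return pts

# three contractions whose attractor is the Sierpinski triangle
sierpinski = [(0.5, 0.0, 0.0, 0.5, 0.0, 0.0),
              (0.5, 0.0, 0.0, 0.5, 0.5, 0.0),
              (0.5, 0.0, 0.0, 0.5, 0.25, 0.5)]
pts = run_ifs(sierpinski, [1, 1, 1])
```

Varying the map coefficients and weights randomly within bounds is what produces the "randomly varied configurations" the abstract describes.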

      PubDate: 2017-05-06T21:34:07Z
  • Large Crater Clustering Tool
    • Abstract: Publication date: Available online 3 May 2017
      Source:Computers & Geosciences
      Author(s): Jason Laura, James A. Skinner Jr., Marc A. Hunter
      In this paper we present the Large Crater Clustering (LCC) tool set, an ArcGIS plugin that supports the quantitative approximation of a primary impact location from user-identified locations of possible secondary impact craters or the long axes of clustered secondary craters. The identification of primary impact craters directly supports planetary geologic mapping and topical science studies where the chronostratigraphic age of some geologic units may be known, but more distant features have questionable geologic ages. Previous works (e.g., McEwen et al., 2005; Dundas and McEwen, 2007) have shown that the location of a primary impact can be estimated from its secondary impact craters. This work adapts those methods into a statistically robust tool set. We describe the four individual tools within the LCC tool set, which support: (1) processing individually digitized point observations (craters), (2) estimating the directional distribution of a clustered set of craters, (3) back-projecting the potential flight paths (from crater clusters or linearly approximated catenae or lineaments), and (4) intersecting the back-projected trajectories to approximate the location of potential source primary craters. We present two case studies using secondary impact features mapped in two regions of Mars. We demonstrate that the tool is able to quantitatively identify primary impacts and supports the improved qualitative interpretation of potential secondary crater flight trajectories.
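The key geometric step, intersecting back-projected trajectories, can be sketched as a least-squares intersection of lines in the plane (a simplified flat-map version for illustration; the actual tool works on projected coordinates inside ArcGIS):

```python
import math

def intersect_rays(points, angles_deg):
    # least-squares point minimizing the summed squared distance to each
    # back-projected line; line i passes through points[i] with direction
    # angle angles_deg[i] (degrees counter-clockwise from the x-axis)
    sxx = sxy = syy = bx = by = 0.0
    for (x0, y0), a in zip(points, angles_deg):
        t = math.radians(a)
        nx, ny = -math.sin(t), math.cos(t)   # unit normal of the line
        sxx += nx * nx
        sxy += nx * ny
        syy += ny * ny
        d = nx * x0 + ny * y0
        bx += nx * d
        by += ny * d
    det = sxx * syy - sxy * sxy              # singular if all lines parallel
    return ((syy * bx - sxy * by) / det, (sxx * by - sxy * bx) / det)
```

With many noisy secondary-crater rays, this single point is the quantitative approximation of the primary impact location.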

      PubDate: 2017-05-06T21:34:07Z
  • Reconstructing daily clear-sky land surface temperature for cloudy regions
           from MODIS data
    • Abstract: Publication date: Available online 29 April 2017
      Source:Computers & Geosciences
      Author(s): Liang Sun, Zhongxin Chen, Feng Gao, Martha Anderson, Lisheng Song, Limin Wang, Bo Hu, Yun Yang
      Land surface temperature (LST) is a critical parameter in environmental studies and resource management. The MODIS LST data product has been widely used in various studies, such as drought monitoring, evapotranspiration mapping, soil moisture estimation and forest fire detection. However, cloud contamination affects thermal band observations and leads to inconsistent LST results. In this study, we present a new Remotely Sensed DAily land Surface Temperature reconstruction (RSDAST) model that recovers clear-sky LST for pixels covered by cloud using only clear-sky neighboring pixels from nearby dates. The reconstructed LST was validated against the original LST pixels. The model shows high accuracy for reconstructing one masked pixel, with R2 of 0.995, bias of −0.02 K and RMSE of 0.51 K. Extended spatial reconstruction results show better accuracy for flat areas, with R2 of 0.72‒0.89, bias of −0.02‒0.21 K, and RMSE of 0.92‒1.16 K, than for mountain areas, with R2 of 0.81‒0.89, bias of −0.35‒−1.52 K, and RMSE of 1.42‒2.24 K. The reconstructed areas show spatial and temporal patterns that are consistent with the clear neighboring areas. In the reconstructed LST-NDVI triangle feature space, which is controlled by soil moisture, the LST values are reasonably distributed and correspond well to the real soil moisture conditions. Our approach shows great potential for reconstructing clear-sky LST under cloudy conditions and provides consistent daily LST, which is critical for daily drought monitoring.
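The neighborhood-difference idea behind this kind of reconstruction can be sketched as follows (a toy version under simplifying assumptions, not the published RSDAST algorithm): the cloudy pixel is estimated from its own value on a clear date plus the mean temporal change observed at clear neighboring pixels.

```python
def fill_pixel(target_xy, cloudy_day, clear_day, grid):
    # grid[day][(x, y)] -> LST value, or None where cloud-covered
    # estimate = clear-day value at the target plus the mean cloudy-minus-clear
    # difference observed at the clear 8-neighbours
    x, y = target_xy
    ref = grid[clear_day][(x, y)]
    diffs = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if (dx, dy) == (0, 0):
                continue
            a = grid[cloudy_day].get((x + dx, y + dy))
            b = grid[clear_day].get((x + dx, y + dy))
            if a is not None and b is not None:
                diffs.append(a - b)
    return ref + sum(diffs) / len(diffs) if diffs else None
```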

      PubDate: 2017-04-30T21:05:50Z
  • StackSplit - a plugin for multi-event shear wave splitting analyses in SplitLab
    • Abstract: Publication date: Available online 29 April 2017
      Source:Computers & Geosciences
      Author(s): Michael Grund
      SplitLab is a powerful and widely used tool for analysing seismological shear wave splitting from single-event measurements. However, in many cases, especially for temporary station deployments close to the noisy seaside, on the ocean bottom, or for recordings affected by strong anthropogenic noise, only multi-event approaches provide stable and reliable splitting results. To extend the original SplitLab environment for such analyses, I present the StackSplit plugin, which can easily be implemented within the well-accepted main program. StackSplit grants easy access to several different analysis approaches within SplitLab, including a new multiple-waveform-based inversion method as well as the most established standard stacking procedures. The possibility to switch between different analysis approaches at any time allows the user to process individual multi-event splitting measurements for a single recording station with maximum flexibility. Beyond the functions provided by the plugin, no external program is needed for the multi-event analyses, since StackSplit operates within the available SplitLab structure, which is based on MATLAB. The effectiveness and use of this plugin are demonstrated with data examples from a long-running seismological recording station in Finland.
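One common multi-event strategy, stacking single-event error surfaces after normalizing each by its minimum and then picking the joint minimum, can be sketched as follows (an illustrative toy in Python; StackSplit implements several such schemes in MATLAB, and this is not its code):

```python
def stack_surfaces(surfaces):
    # normalize each single-event error surface by its (positive) minimum,
    # sum them, and pick the grid node with the smallest stacked misfit
    rows, cols = len(surfaces[0]), len(surfaces[0][0])
    stacked = [[0.0] * cols for _ in range(rows)]
    for s in surfaces:
        m = min(min(row) for row in s)
        for i in range(rows):
            for j in range(cols):
                stacked[i][j] += s[i][j] / m
    best = min(((i, j) for i in range(rows) for j in range(cols)),
               key=lambda ij: stacked[ij[0]][ij[1]])
    return stacked, best
```

The grid axes would correspond to the trial splitting parameters (fast axis, delay time); stacking suppresses the noise-driven minima of individual events.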

      PubDate: 2017-04-30T21:05:50Z
  • A 3D forward stratigraphic model of fluvial meander-bend evolution for
           prediction of point-bar lithofacies architecture
    • Abstract: Publication date: Available online 28 April 2017
      Source:Computers & Geosciences
      Author(s): Na Yan, Nigel P. Mountney, Luca Colombera, Robert M. Dorrell
      Although fundamental types of fluvial meander-bend transformations – expansion, translation, rotation, and combinations thereof – are widely recognised, the relationship between the migratory behaviour of a meander bend, and its resultant accumulated sedimentary architecture and lithofacies distribution remains relatively poorly understood. Three-dimensional data from both currently active fluvial systems and from ancient preserved successions known from outcrop and subsurface settings are limited. To tackle this problem, a 3D numerical forward stratigraphic model – the Point-Bar Sedimentary Architecture Numerical Deduction (PB-SAND) – has been devised as a tool for the reconstruction and prediction of the complex spatio-temporal migratory evolution of fluvial meanders, their generated bar forms and the associated lithofacies distributions that accumulate as heterogeneous fluvial successions. PB-SAND uses a dominantly geometric modelling approach supplemented by process-based and stochastic model components, and is constrained by quantified sedimentological data derived from modern point bars or ancient successions that represent suitable analogues. The model predicts the internal architecture and geometry of fluvial point-bar elements in three dimensions. The model is applied to predict the sedimentary lithofacies architecture of ancient preserved point-bar and counter-point-bar deposits of the middle Jurassic Scalby Formation (North Yorkshire, UK) to demonstrate the predictive capabilities of PB-SAND in modelling 3D architectures of different types of meander-bend transformations. PB-SAND serves as a practical tool with which to predict heterogeneity in subsurface hydrocarbon reservoirs and water aquifers.

      PubDate: 2017-04-30T21:05:50Z
  • Fast hyperbolic Radon transform represented as convolutions in log-polar coordinates
    • Abstract: Publication date: Available online 28 April 2017
      Source:Computers & Geosciences
      Author(s): Viktor V. Nikitin, Fredrik Andersson, Marcus Carlsson, Anton A. Duchkov
      The hyperbolic Radon transform is a commonly used tool in seismic processing, for instance in seismic velocity analysis, data interpolation and multiple removal. A direct implementation by summation of traces with different moveouts is computationally expensive for large data sets. In this paper we present a new method for fast computation of hyperbolic Radon transforms. It is based on a log-polar sampling with which the main computational parts reduce to computing convolutions. This allows for fast implementations by means of the FFT. In addition to the FFT operations, interpolation procedures are required for switching between coordinates in the time-offset, Radon, and log-polar domains. Graphics Processing Units (GPUs) are a suitable computational platform for this purpose, owing to their hardware-supported interpolation routines as well as optimized FFT routines. Performance tests show large speed-ups for the proposed algorithm, making it suitable for use in iterative methods; we provide examples of data interpolation and multiple removal using this approach.
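For reference, the direct (slow) hyperbolic Radon transform that the paper accelerates is just a summation of each trace along moveout curves t = sqrt(tau^2 + (x/v)^2); a nearest-neighbour sketch:

```python
import math

def hyperbolic_radon(data, dt, offsets, velocities, nt):
    # data[ix][it]: input traces; output m[iv][itau] sums each trace along the
    # hyperbola t = sqrt(tau**2 + (x / v)**2), nearest-neighbour in time
    out = [[0.0] * nt for _ in velocities]
    for iv, v in enumerate(velocities):
        for itau in range(nt):
            tau = itau * dt
            s = 0.0
            for ix, x in enumerate(offsets):
                it = round(math.hypot(tau, x / v) / dt)
                if it < nt:
                    s += data[ix][it]
            out[iv][itau] = s
    return out
```

The cost is O(n_v n_tau n_x) per transform, which is what motivates reformulating the sums as convolutions in log-polar coordinates evaluated by FFT.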

      PubDate: 2017-04-30T21:05:50Z
  • An efficient regularization method for a large scale ill-posed geothermal problem
    • Abstract: Publication date: Available online 27 April 2017
      Source:Computers & Geosciences
      Author(s): Fredrik Berntsson, Chen Lin, Tao Xu, Dennis Wokiyi
      The inverse geothermal problem consists of estimating the temperature distribution below the Earth's surface using measurements made on the surface. The problem is important since temperature governs a variety of geologic processes, including the generation of magmas and the deformation style of rocks. Since the thermal properties of rocks depend strongly on temperature, the problem is non-linear. The problem is formulated as an ill-posed operator equation, where the right-hand side is the heat flux at the surface level. Since the problem is ill-posed, regularization is needed. In this study we demonstrate that Tikhonov regularization can be implemented efficiently for solving the operator equation. The algorithm is based on having a code for solving a well-posed problem related to the above-mentioned operator, and it is designed so that it can handle both 2D and 3D calculations. Numerical results for 2D domains show that the algorithm works well and that the inverse problem can be solved accurately with a realistic noise level in the surface data.
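In the linear, finite-dimensional case, Tikhonov regularization replaces the ill-posed system Ax = b with the well-posed normal equations (AᵀA + λI)x = Aᵀb; a small self-contained sketch (the paper works with a non-linear operator and solves related well-posed PDE problems instead, so this is only the basic principle):

```python
def tikhonov_solve(A, b, lam):
    # solve (A^T A + lam * I) x = A^T b by Gaussian elimination
    n = len(A[0])
    M = [[sum(A[k][i] * A[k][j] for k in range(len(A)))
          + (lam if i == j else 0.0) for j in range(n)] for i in range(n)]
    rhs = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(n)]
    # forward elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            rhs[r] -= f * rhs[col]
    # back substitution
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (rhs[i] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x
```

The regularization parameter λ trades data fit against stability; λ → 0 recovers the unstable least-squares solution.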

      PubDate: 2017-04-30T21:05:50Z
    • Abstract: Publication date: Available online 27 April 2017
      Source:Computers & Geosciences
      Author(s): Angel Martín, Ana Belén Anquela Julián, Alejandro Dimas-Pages, Fernando Cos-Gayón
      Precise point positioning (PPP) is a well established Global Navigation Satellite System (GNSS) technique that only requires information from the receiver (or rover) to obtain high-precision position coordinates. This is a very interesting and promising technique because it eliminates the need for a reference station near the rover receiver or a network of reference stations, thus reducing the cost of a GNSS survey. From a computational perspective, there are two ways to solve the system of observation equations produced by static PPP: either in a single step (so-called batch adjustment) or with a sequential adjustment/filter. The results of each should be the same if both are well implemented. However, if a sequential solution is needed (that is, not only the final coordinates but also those at previous GNSS epochs, as in convergence studies), finding a batch solution becomes very time consuming owing to the matrix inversions that accumulate with each consecutive epoch. This is not a problem for the filter solution, which uses information computed in the previous epoch to obtain the solution for the current epoch. Filter implementations, however, need extra consideration of user dynamics and parameter state variations between observation epochs, with appropriate stochastic updates of the parameter variances from epoch to epoch. These filtering considerations are not needed in batch adjustment, which makes it attractive. The main objective of this research is to significantly reduce the computation time required to obtain sequential results using batch adjustment. The new method we implemented in the adjustment process reduced the computational time by a mean of 45%.
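The batch-versus-sequential distinction can be illustrated with the simplest possible case, estimating a single constant parameter from noisy observations: the recursive (filter-style) estimate at the final epoch equals the batch least-squares solution, but the recursion also yields the solution at every intermediate epoch for free (a pedagogical sketch, not the paper's PPP adjustment):

```python
def batch_estimate(obs):
    # batch least squares for one constant parameter reduces to the mean
    return sum(obs) / len(obs)

def sequential_estimates(obs):
    # recursive (filter-style) update yielding the solution at every epoch;
    # for a static state the gain at epoch k is simply 1/k
    est, out = 0.0, []
    for k, y in enumerate(obs, start=1):
        est += (y - est) / k
        out.append(est)
    return out
```

The paper's contribution is to obtain such per-epoch solutions from the batch formulation itself, avoiding the repeated full matrix inversions that make the naive batch approach slow.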

      PubDate: 2017-04-30T21:05:50Z
  • Ensemble predictive model for more accurate soil organic carbon
           spectroscopic estimation
    • Abstract: Publication date: Available online 26 April 2017
      Source:Computers & Geosciences
      Author(s): Radim Vašát, Radka Kodešová, Luboš Borůvka
      A myriad of signal pre-processing strategies and multivariate calibration techniques have been explored over the last few decades in an attempt to improve the spectroscopic prediction of soil organic carbon (SOC). Coming up with a novel, more powerful and accurate predictive approach that beats the established ones has therefore become a challenging task. One promising route is to combine several individual predictions into a single final one (following ensemble learning theory). As this approach performs best when it combines inherently different predictive algorithms calibrated with structurally different predictor variables, we tested predictors of two different kinds: 1) reflectance values (or transforms thereof) at each wavelength, and 2) absorption feature parameters. Accordingly, we applied four calibration techniques, two per type of predictor: a) partial least squares regression and support vector machines for type 1, and b) multiple linear regression and random forest for type 2. The weights assigned to the individual predictions within the ensemble model (constructed as a weighted average) were determined by an automated procedure that selected the best solution among all possible ones. The approach was tested on soil samples taken from the surface horizons of four sites differing in their prevailing soil units. Employing the ensemble predictive model improved the prediction accuracy of SOC at all four sites. The coefficient of determination in cross-validation (R2 cv) increased from 0.849, 0.611, 0.811 and 0.644 (the best individual predictions) to 0.864, 0.650, 0.824 and 0.698 for Sites 1, 2, 3 and 4, respectively. Generally, the ensemble model reduced the maximal deviations of predicted vs. observed values relative to the individual predictions, so that the correlation cloud became thinner, as desired.
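The weighted-average ensemble with an automated weight search can be sketched for two component predictions (a simplified grid-search version of the idea; the paper combines four predictions and selects weights by its own automated procedure):

```python
def best_weight(p1, p2, y, n_grid=100):
    # grid-search the weight w in [0, 1] minimizing the squared error of the
    # weighted-average ensemble w * p1 + (1 - w) * p2 against targets y
    def sse(w):
        return sum((w * a + (1 - w) * b - t) ** 2
                   for a, b, t in zip(p1, p2, y))
    return min((i / n_grid for i in range(n_grid + 1)), key=sse)
```

With more than two component models the same search runs over a weight simplex; in practice the weights are chosen in cross-validation to avoid overfitting.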

      PubDate: 2017-04-30T21:05:50Z
  • A training image evaluation and selection method based on minimum data
           event distance for multiple-point geostatistics
    • Abstract: Publication date: July 2017
      Source:Computers & Geosciences, Volume 104
      Author(s): Wenjie Feng, Shenghe Wu, Yanshu Yin, Jiajia Zhang, Ke Zhang
      A training image (TI) can be regarded as a database of spatial structures and their low- to higher-order statistics, used in multiple-point geostatistics (MPS) simulation. Presently, there are a number of methods to construct a series of candidate TIs (CTIs) for MPS simulation based on a modeler's subjective criteria. The spatial structures of TIs often vary considerably, meaning that different CTIs are compatible with the conditioning data to different degrees. Therefore, evaluation and optimal selection of CTIs before MPS simulation is essential. This paper proposes a CTI evaluation and optimal selection method based on the minimum data event distance (MDevD). In the proposed method, a set of MDevD properties is established by calculating the MDevD of the conditioning data events in each CTI. CTIs are then evaluated and ranked according to the mean value and variance of the MDevD properties: the smaller the mean value and variance of an MDevD property, the more compatible the corresponding CTI is with the conditioning data. In addition, data events with low compatibility in the conditioning data grid can be located to help modelers select a set of complementary CTIs for MPS simulation. The MDevD property can also help to narrow the range of the distance threshold for MPS simulation. The proposed method was evaluated using three examples: a 2D categorical example, a 2D continuous example, and an actual 3D oil reservoir case study. To illustrate the method, a C++ implementation is attached to the paper.
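The minimum data event distance for a categorical TI can be sketched as a scan over all centre positions (a simplified 2D illustration of the concept, not the paper's C++ implementation; the mismatch fraction stands in for the distance measure):

```python
def min_data_event_distance(ti, event):
    # event: list of ((di, dj), value) node offsets relative to a centre node
    # returns the minimum mismatch fraction over all centre positions where
    # the whole event fits inside the training image ti
    rows, cols = len(ti), len(ti[0])
    best = None
    for i in range(rows):
        for j in range(cols):
            mismatches, inside = 0, True
            for (di, dj), v in event:
                ii, jj = i + di, j + dj
                if not (0 <= ii < rows and 0 <= jj < cols):
                    inside = False
                    break
                if ti[ii][jj] != v:
                    mismatches += 1
            if inside:
                d = mismatches / len(event)
                best = d if best is None else min(best, d)
    return best
```

A CTI whose MDevD values (over all conditioning data events) have small mean and variance reproduces the conditioning data patterns well.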

      PubDate: 2017-04-17T11:27:20Z
  • Analysis of Training Sample Selection Strategies for Regression-Based
           Quantitative Landslide Susceptibility Mapping Methods
    • Abstract: Publication date: Available online 11 April 2017
      Source:Computers & Geosciences
      Author(s): Arzu Erener, A. Abdullah Sivas, A. Sevtap Selcuk-Kestel, H. Sebnem Düzgün
      All quantitative landslide susceptibility mapping (QLSM) methods require two basic data types, namely a landslide inventory and the factors that influence landslide occurrence (landslide influencing factors, LIF). The accuracy of QLSM methods differs depending on the type of landslide, the nature of the triggers, and the LIF. Moreover, how to balance the number of 0's (non-occurrence) and 1's (occurrence) in the training set obtained from the landslide inventory, and how to select which of the 1's and 0's to include in QLSM models, play a critical role in the accuracy of the QLSM. Although the performance of various QLSM methods has been extensively investigated in the literature, the challenge of training set construction has not been adequately investigated. To tackle this challenge, in this study three different training set selection strategies, along with the original data set, are used to test the performance of three different regression methods, namely Logistic Regression (LR), Bayesian Logistic Regression (BLR) and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which uses a weighted selection of landslide occurrences in the sample set. The second method, non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances to include the impact of clustering. Selective nearby sampling (SNS) is the third method, which concentrates on the group of 1's and their surrounding neighborhood; a randomly selected group of landslide sites and their neighborhood are considered in the analyses, with parameters similar to NNS. It is found that the LR-PRS, FLR-PRS and BLR-whole-data set-ups, in that order, yield the best fits among the alternatives. The results indicate that in QLSM based on regression models, avoiding spatial correlation in the data set is critical for model performance.

      PubDate: 2017-04-17T11:27:20Z
  • Seismic traveltime inversion based on tomographic equation without
           integral terms
    • Abstract: Publication date: Available online 6 April 2017
      Source:Computers & Geosciences
      Author(s): Guangnan Huang, Bing Zhou, Hongxing Li, David C. Nobes
      The Jacobian matrix in seismic traveltime tomographic equations usually contains several integral terms. These integral expressions not only greatly increase the computational complexity of seismic traveltime tomography, but are also difficult to program. Therefore, if these integral expressions in the Jacobian matrix can be eliminated, the program for seismic traveltime tomography can be greatly simplified. To reduce the computational complexity of traditional seismic traveltime tomography, we derived an anisotropic seismic traveltime tomographic equation that does not contain integral expressions; it is then degenerated into an isotropic seismic traveltime tomographic equation. To verify the effectiveness of this tomographic equation based on the node network, a program was coded to perform seismic traveltime inversion. For a crosswell checkerboard velocity model, the proposed tomographic method and the traditional method (with integral terms) produce the same results. In addition, two velocity models with undulating topography are used as test models. Numerical simulation results show that the proposed tomographic method achieves good tomograms. Finally, the proposed method is used to investigate the near-surface velocity distribution near a power plant. The tomogram indicates that contaminated liquid diffuses and aggregates along strata at a certain depth, and that velocity is lower near the pollutant source than away from it.

      PubDate: 2017-04-10T09:32:56Z
  • A framework for interactive visual analysis of heterogeneous marine data
           in an integrated problem solving environment
    • Abstract: Publication date: Available online 5 April 2017
      Source:Computers & Geosciences
      Author(s): Shuai Liu, Ge Chen, Shifeng Yao, Fenglin Tian, Wei Liu
      This paper presents a novel integrated marine visualization framework that processes and analyzes multi-dimensional spatiotemporal marine data in one workflow. Effective marine data visualization is needed for extracting useful patterns, recognizing changes, and understanding physical processes in oceanographic research. However, the multi-source, multi-format, multi-dimensional characteristics of marine data pose a challenge for interactive and timely marine data analysis and visualization in one workflow. A global multi-resolution virtual terrain environment is also needed to give oceanographers and the public a real geographic background reference and to help them identify the geographical variation of ocean phenomena. This paper introduces a data integration and processing method to efficiently visualize and analyze heterogeneous marine data. Based on the processed data, several GPU-based visualization methods are explored to demonstrate marine data interactively. GPU-tessellated global terrain rendering using ETOPO1 data is realized, and video memory usage is controlled to ensure high efficiency. A modified ray-casting algorithm for the uneven multi-section Argo volume data is also presented, and the transfer function is designed to analyze the 3D structure of ocean phenomena. Based on this framework, an integrated visualization system is realized, and its effectiveness and efficiency are demonstrated. The system is expected to make a significant contribution to the demonstration and understanding of marine physical processes in a virtual global environment.

      PubDate: 2017-04-10T09:32:56Z
  • Impact of mineralogical heterogeneity on reactive transport modelling
    • Abstract: Publication date: Available online 2 April 2017
      Source:Computers & Geosciences
      Author(s): Min Liu, Mehdi Shabaninejad, Peyman Mostaghimi
      The impact of the mineralogical heterogeneity of rocks on reactive transport modelling is investigated by applying a pore-scale model based on Lattice Boltzmann and Finite Volume Methods. Mass transport, chemical reaction and solid structure modification are included in the model. A two-dimensional mineral map of a sandstone rock is acquired using QEMSCAN SEM imaging with Energy-Dispersive X-ray Spectroscopy (EDS). The mineralogical heterogeneity is explored by performing multi-mineral reaction simulations on images containing various minerals. The results are then compared with the predictions of single-mineral dissolution modelling. Dissolution patterns and permeability variations of multi-mineral and single-mineral reactions are presented, and the errors of single-mineral reaction modelling are estimated. Numerical results show that, if a uniform mineral distribution is assumed, mineralogical heterogeneity can cause significant errors in permeability predictions, and these errors depend on the flow regime: for this sample they are smaller in high Péclet regimes than in low Péclet regimes.

      PubDate: 2017-04-03T11:40:51Z
  • A MATLAB based 3D modeling and inversion code for MT data
    • Abstract: Publication date: Available online 27 March 2017
      Source:Computers & Geosciences
      Author(s): Arun Singh, Rahul Dehiya, Pravin K. Gupta, M. Israil
      The development of a MATLAB based computer code, AP3DMT, for modeling and inversion of 3D Magnetotelluric (MT) data is presented. The code comprises two independent components: a grid generator code and a modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs the core computations in modular form – forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to other similar inverse problems, such as Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for the implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes the code more compact and user-friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities, the results of inversion for two complex models are presented.

      PubDate: 2017-04-03T11:40:51Z
School of Mathematical and Computer Sciences
Heriot-Watt University
Edinburgh, EH14 4AS, UK
Tel: +00 44 (0)131 4513762
Fax: +00 44 (0)131 4513327

JournalTOCs © 2009-2016