Computers & Geosciences
Journal Prestige (SJR): 1.35
Citation Impact (CiteScore): 3
Number of Followers: 31
 
  Hybrid journal (it can contain Open Access articles)
ISSN (Print) 0098-3004
Published by Elsevier  [3206 journals]
  • Development of hierarchical terron workflow based on gridded data –
           A case study in Denmark
    • Abstract: Publication date: Available online 26 February 2020. Source: Computers & Geosciences. Author(s): Yannik E. Roell, Yi Peng, Amélie Beucher, Mette B. Greve, Mogens H. Greve
       
  • Crosstalk-free simultaneous-source full waveform inversion with normalized
           seismic data
    • Abstract: Publication date: Available online 26 February 2020. Source: Computers & Geosciences. Author(s): Qingchen Zhang, Weijian Mao, Jinwei Fang
       
  • A 3D sketch-based formulation to model salt bodies from seismic data
    • Abstract: Publication date: Available online 26 February 2020. Source: Computers & Geosciences. Author(s): Suellen Motta, Anselmo Montenegro, Marcelo Gattass, Deane Roehl
       
  • Ontology-driven representation of knowledge for geological maps
    • Abstract: Publication date: Available online 26 February 2020. Source: Computers & Geosciences. Author(s): Alizia Mantovani, Vincenzo Lombardo, Fabrizio Piana
       
  • Numerical stratigraphic forward models as conceptual knowledge
           repositories and experimental tools: An example using CarboCAT new version
           
    • Abstract: Publication date: Available online 24 February 2020. Source: Computers & Geosciences. Author(s): Isabella Masiero, Estanislao Kozlowski, Georgios Antonatos, Haiwei Xi, Peter Burgess
       
  • 1D geological imaging of the subsurface from geophysical data with
           Bayesian Evidential Learning. Part 2: Applications and software
    • Abstract: Publication date: Available online 24 February 2020. Source: Computers & Geosciences. Author(s): Hadrien Michel, Thomas Hermans, Thomas Kremer, Ann Elen, Frédéric Nguyen
       
  • Mapping Himalayan leucogranites using a hybrid method of metric learning
           and support vector machine
    • Abstract: Publication date: Available online 21 February 2020. Source: Computers & Geosciences. Author(s): Ziye Wang, Renguang Zuo, Yanni Dong. Abstract: Rare metals play a considerable role in the development of new materials and energy, making them key mineral resources for global competition. Widely distributed along the Himalayan orogen, the Himalayan leucogranite belt is expected to be an important rare-metal metallogenic belt in China. Thus, mapping the spatial distribution of Himalayan leucogranites is critical for prospecting rare-metal deposits. The distribution characteristics of geochemical elements are important indicators for lithological identification, and the differences in mineral composition and major-oxide content between leucogranites and the surrounding rocks facilitate lithological mapping. However, significant uncertainty can arise from the limited geochemical data available under particularly adverse working conditions and from the difficulty of handling highly similar geochemical data. In this study, a metric learning-based approach is used for mapping leucogranites from regional geochemical exploration datasets. Defined as a measure of similarity between two samples, metric learning reveals a "better distance" by converting the original data into a more suitable Mahalanobis metric space with maximum separation between the target and the background. In this approach, a local weighted metric learning method is first used to assign weights to the training samples in the neighborhood according to their reconstruction contributions in learning the local metric. Then, a discriminative local ensemble learning method is employed to integrate all learned metrics and convert the original geochemical data into a metric space. This enables more effective separation of highly similar target leucogranites from the surrounding rocks with the help of a support vector machine. The distribution of leucogranites mapped by this hybrid method shows greater consistency with the geological map, indicating that the approach is a reasonable way to delineate the signature of leucogranites in the study area. These results also provide an alternative way of identifying favorable intrusions from geochemical exploration data.
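A minimal sketch (not the authors' code) of the core idea: map geochemical samples into a Mahalanobis metric space, then separate leucogranite from background samples with a support vector machine. The single global inverse-covariance metric below stands in for the paper's locally weighted, ensemble-learned metric, and all data and variable names are illustrative.

```python
import numpy as np
from scipy.linalg import sqrtm
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))        # 200 samples x 8 geochemical elements
y = rng.integers(0, 2, size=200)     # 1 = leucogranite, 0 = background

# Learn a (global) Mahalanobis metric M = C^-1 and transform the data with
# L = M^(1/2), so Euclidean distance in the new space equals Mahalanobis
# distance in the original space.
C = np.cov(X, rowvar=False)
L = np.real(sqrtm(np.linalg.inv(C)))
X_metric = X @ L

clf = SVC(kernel="rbf").fit(X_metric, y)
print("training accuracy:", clf.score(X_metric, y))
```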
       
  • A method and software solution for classifying clast roundness based on
           the Radon transform
    • Abstract: Publication date: Available online 19 February 2020. Source: Computers & Geosciences. Author(s): G. Moreno Chávez, Jesús Villa, D. Sarocchi, Efrén González-Ramírez. Abstract: In this paper, an algorithm for clast roundness classification based on the Radon transform is presented. The degree of roundness is determined by processing the sinogram of the clast image. The algorithm consists of applying two low-pass filters to the sinogram, obtaining the inverse Radon transform, and comparing the filtered images with the original image. For rounded particles, the difference between the original image and either of the filtered images will be small. For angular clasts, the difference will be greater than for rounded clasts, owing to the presence of high-frequency components. In the comparison process, each of the two filtered images is subtracted from the original image to yield two difference images. Since the data are binary, these two images present topologically unconnected regions that correspond to the particle's edges. The percentage of non-overlapping area between the original and the difference images, and the number of regions, are used to classify the morphology of the clast. The results have been validated using a comparison chart designed for visual roundness estimation. The comparison chart, consisting of five roundness classes, was proposed by Russell, Taylor, and Pettijohn (Müller, 1967). Two cutoff frequencies were used: one to classify well-rounded, rounded, and sub-rounded clasts, and another for the angular and sub-angular classes. The proposed algorithm correctly classifies the roundness classes of the visual chart. The results provided by the algorithm were compared with the classification performed by a group of experts; the algorithm assigned 92% of the clasts to the same classes as the human experts. We also propose Gaussian models, which are useful for classifying the particles into the five classes. We have developed user-friendly software to carry out the roundness classification algorithm. The software was developed on the MATLAB platform and can be freely downloaded from a public repository.
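A hedged sketch of the pipeline described above: low-pass filter the sinogram of a binary clast image, invert, and measure how much the smoothed reconstruction disagrees with the original. The Gaussian filter and the 0.5 threshold are illustrative stand-ins for the paper's exact filters and cutoff frequencies.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from skimage.transform import radon, iradon
from skimage.measure import label

def roundness_features(clast, sigma):
    """clast: square binary image, particle inside the inscribed circle."""
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(clast.astype(float), theta=theta)
    # Low-pass filter each projection (column) of the sinogram.
    smooth = gaussian_filter1d(sinogram, sigma=sigma, axis=0)
    recon = iradon(smooth, theta=theta) > 0.5
    # Difference image: edge regions that the high-frequency-free
    # reconstruction fails to reproduce; large for angular clasts.
    diff = np.logical_xor(clast, recon)
    nonoverlap = diff.sum() / clast.sum()
    n_regions = label(diff).max()    # topologically unconnected edge regions
    return nonoverlap, n_regions
```

Running this with two different sigma values (playing the role of the two cutoff frequencies) yields the features used to separate the five roundness classes.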
       
  • Using wavelet filtering to perform seismometer azimuth calculation and
           data correction
    • Abstract: Publication date: Available online 15 February 2020. Source: Computers & Geosciences. Author(s): Penghui Wang, Yunyao Zhou, Yongqing Lv, Ya Xiang. Abstract: Since errors frequently occur in the measurement of seismometer azimuths at non-standard observation stations, it is necessary to estimate the potential azimuth deviations and correct the data. A similar problem is calculating borehole seismometer azimuths. In this paper, a new method is proposed for the rapid calculation of seismometer azimuths that differs from the current correlation-analysis method. The new method uses the relationship between the signals of two adjacent seismometers to calculate a series of parameters related to the seismometers' projection angles. A wavelet filter is used to process the dataset and obtain a reference value, and the azimuth is then estimated by applying an inverse trigonometric function to the reference value. Once an azimuth has been estimated, it can be used to correct the output signal and approximate an expected output that meets the azimuth standard. A series of tests was conducted to quantify the method's performance. The results indicate that the proposed method is feasible and, more importantly, holds an advantage in terms of calculation speed. The method can be used, within an acceptable margin of error, for the rapid estimation of seismometer azimuths and for data correction. Possible avenues for improving the method are also discussed.
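A minimal sketch (an assumed implementation, not the authors' code) of the idea: wavelet-filter the recordings, relate the two seismometers' signals by a projection ratio, and recover the angle with an inverse trigonometric function. The single-component formulation and the db4 wavelet are simplifying assumptions.

```python
import numpy as np
import pywt

def estimate_azimuth(ref, test, wavelet="db4", level=4):
    """ref, test: one horizontal component from the reference and the
    misoriented seismometer; returns the rotation angle in degrees."""
    def lowpass(x):
        # Keep only the low-frequency approximation to suppress noise.
        coeffs = pywt.wavedec(x, wavelet, level=level)
        coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(x)]

    r, t = lowpass(ref), lowpass(test)
    # Projection relationship t ~ r * cos(theta); a least-squares ratio
    # gives a stable reference value for the inverse cosine.
    ratio = np.dot(r, t) / np.dot(r, r)
    return np.degrees(np.arccos(np.clip(ratio, -1.0, 1.0)))
```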
       
  • Comparative study of landslide susceptibility mapping with different
           recurrent neural networks
    • Abstract: Publication date: Available online 15 February 2020. Source: Computers & Geosciences. Author(s): Yi Wang, Zhice Fang, Mao Wang, Ling Peng, Haoyuan Hong. Abstract: This paper uses recurrent neural networks (RNNs) to perform landslide susceptibility mapping in Yongxin County, China. The two main contributions of this study are as follows. First, the regular RNN is compared, for the first time in a landslide susceptibility mapping case study, to three of its variants: long short-term memory, the gated recurrent unit, and the simple recurrent unit. Second, a sequential data representation method is proposed to fully explore the predictive potential of RNNs. The study area contains 364 historical landslide locations, divided into two parts: 255 (70%) for training and 109 (30%) for validation; 16 landslide influencing factors were considered for spatial prediction. To validate the effectiveness of these RNN-related methods, several objective measures were used for evaluation: accuracy, recall, F-measure, the Matthews correlation coefficient, and the receiver operating characteristic. Experimental results demonstrate that very high and high susceptibility areas are concentrated in the northwest and south of Yongxin County, while landslides are less likely to occur in the central area. Quantitatively, all the RNN-related methods achieved area-under-the-curve values above 0.81 and produced the most accurate prediction results with optimized parameters. The RNN framework can therefore serve as a useful tool for landslide susceptibility mapping to help mitigate and manage landslides.
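For illustration, a minimal sketch of one of the compared variants: the 16 influencing factors of each location fed to an LSTM as a length-16 sequence, in the spirit of the sequential data representation. Layer sizes, epochs, and the random data are assumptions.

```python
import numpy as np
from tensorflow import keras

n_factors = 16
model = keras.Sequential([
    keras.layers.Input(shape=(n_factors, 1)),    # each factor as one step
    keras.layers.LSTM(32),
    keras.layers.Dense(1, activation="sigmoid"), # landslide susceptibility
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", keras.metrics.AUC()])

X = np.random.rand(364, n_factors, 1)   # 364 locations (255 train, 109 val)
y = np.random.randint(0, 2, size=364)
model.fit(X[:255], y[:255], validation_data=(X[255:], y[255:]), epochs=5)
```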
       
  • Deep learning-based method for SEM image segmentation in mineral
           characterization, an example from Duvernay Shale samples in Western Canada
           Sedimentary Basin
    • Abstract: Publication date: Available online 15 February 2020. Source: Computers & Geosciences. Author(s): Zhuoheng Chen, Xiaojun Liu, Jijin Yang, Edward Little, Yu Zhou. Abstract: Texture-based feature extraction and object segmentation are challenging in image processing. In this study, the U-Net architecture developed for biomedical image analysis was used to evaluate geologic characteristics depicted within scanning electron microscope (SEM) images of shale samples. With a revised weight function, the U-Net architecture allowed effective discrimination of clay aggregates mixed with matrix mineral particles and organic matter (OM). In training, a local-variability weight based on spatial statistics was used in the loss function to enhance the contrast between features across boundaries, thereby improving the ability of U-Net to distinguish the geologic features specific to our research needs. The TensorFlow neural network library was used to create semantic segmentation and feature extraction models for mineral identification. In the application example of the Devonian Duvernay shale study, we prepared 8000 randomly sliced image cuts (256×256 pixels) from four masked image tiles (6144×6144 pixels) with tagged feature objects, of which 6400 were used for training and the remaining 1600 for validation. In validation, the average intersection over union (IoU) reached 91.7%. The validated model was then used for clay-aggregate segmentation and mineral classification. Three hundred SEM image tiles of source-rock samples of different maturities from the Duvernay Formation were processed using the proposed workflow. The results show that the clay aggregates are clearly separated from other matrix mineral particles with acceptable boundaries, even though both exhibit indistinguishable gray-level pixels. This approach demonstrates that texture-based deep learning feature extraction is feasible, cost-effective, and timely, and can help geoscientists gain new insights by quantitatively analyzing specific geological characteristics and features.
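A hedged sketch of the key modification: a pixel-wise weight map derived from local gray-level variability that boosts the cross-entropy loss near feature boundaries. The 3×3 local-variance weight is an illustrative stand-in for the authors' spatial-statistics weight.

```python
import tensorflow as tf

def local_variability_weights(images, eps=1e-6):
    """images: (batch, H, W, 1) grayscale SEM tiles -> (batch, H, W)."""
    mean = tf.nn.avg_pool2d(images, ksize=3, strides=1, padding="SAME")
    mean_sq = tf.nn.avg_pool2d(images ** 2, ksize=3, strides=1, padding="SAME")
    var = tf.squeeze(mean_sq - mean ** 2, axis=-1)
    return 1.0 + var / (tf.reduce_mean(var) + eps)   # emphasize boundaries

def weighted_loss(y_true, y_pred, images):
    # Per-pixel cross entropy scaled by the local-variability weight map.
    ce = tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)
    return tf.reduce_mean(local_variability_weights(images) * ce)
```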
       
  • 3D rock fabric analysis using micro-tomography: An introduction to the
           open-source TomoFab MATLAB code
    • Abstract: Publication date: Available online 15 February 2020. Source: Computers & Geosciences. Author(s): Benoît Petri, Bjarne S.G. Almqvist, Mattia Pistone. Abstract: The study of rock fabric properties (orientation, planar, linear, anisotropy) is key to unravelling the geological processes that generated them. With advancements in data acquisition and treatment, X-ray micro-computed tomography (μXCT) represents a powerful method to analyse the shape preferred orientation (SPO) of rock-forming elements, including minerals, aggregates, and pores, in three-dimensional space. After reconstruction and segmentation of μXCT images, we developed a novel protocol to construct and analyse the fabric tensor, a second-rank symmetric tensor constructed using the orientation and the length of the three characteristic axes of each grain (simplified to a best-fit ellipsoid). The analysis of the fabric tensor permits calculation of mean principal directions and associated confidence ellipses, and quantifies the degree of anisotropy (P′) and the shape (T) of the fabric ellipsoid by eigenvalue and eigenvector analysis. We implement this method in the TomoFab open-source MATLAB package. The code integrates a graphical user interface (GUI) that allows visualisation of the full set of ellipsoid orientations, shapes, and sizes. Density plots and contouring can be utilised to identify fabrics graphically, and a full set of fabric parameters can be calculated based on the analysis of the fabric tensor and/or the analysis of each principal direction's orientation tensor. We demonstrate the versatility of TomoFab with synthetic datasets and a field- and laboratory-based investigation of a sample presenting a magmatic foliation and lineation, collected in the Mafic Complex within the lower crustal section of the Ivrea-Verbano Zone (North Italy). In the light of these developments, we stress that μXCT represents a pertinent tool for rock fabric analysis to characterise the SPO of rock components. This approach can be performed in parallel or as a complement to other rock fabric quantification methods (e.g., AMS, EBSD) and applied to various rock types. TomoFab is freely available for download at https://github.com/benpetri/tomofab.
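A small numpy stand-in for the fabric-tensor analysis TomoFab implements in MATLAB: build the second-rank fabric tensor from each grain's best-fit ellipsoid axes, then derive the degree of anisotropy P′ and shape parameter T by eigen-analysis. The P′ and T expressions follow Jelinek's standard definitions; the linear length weighting is an assumption.

```python
import numpy as np

def fabric_parameters(axes_dirs, axes_lens):
    """axes_dirs: (n_grains, 3, 3) unit vectors of each ellipsoid's axes;
    axes_lens: (n_grains, 3) matching axis lengths."""
    F = np.zeros((3, 3))
    for dirs, lens in zip(axes_dirs, axes_lens):
        for v, length in zip(dirs, lens):
            F += length * np.outer(v, v)  # length-weighted orientation tensor
    F /= len(axes_lens)
    lam = np.sort(np.linalg.eigvalsh(F))[::-1]   # lam1 >= lam2 >= lam3
    eta = np.log(lam)
    P_prime = np.exp(np.sqrt(2 * np.sum((eta - eta.mean()) ** 2)))
    T = (2 * eta[1] - eta[0] - eta[2]) / (eta[0] - eta[2])
    return P_prime, T
```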
       
  • Reliable Euler deconvolution estimates throughout the vertical derivatives
           of the total-field anomaly
    • Abstract: Publication date: Available online 13 February 2020. Source: Computers & Geosciences. Author(s): Felipe F. Melo, Valéria C.F. Barbosa. Abstract: We propose a novel methodology for selecting reliable Euler deconvolution estimates throughout the vertical derivatives of the total-field anomaly, grounded in the capability of this quantity to locate anomalies owing to its faster signal decay with distance. In applying Euler deconvolution to a small moving data window, we compute the standard deviation of the vertical derivatives of the total-field anomaly for each window. We then define the reliable source-location estimates as those obtained from the data windows with the largest standard deviations of the vertical derivatives. For all tentative values of the structural index (SI), the reliable estimates with tight clustering define the correct SI, and the mean of these estimates defines the source position. We compared our methodology for selecting reliable Euler source-position estimates with two methodologies available in the literature based on rejection criteria of data amplitude and of depth uncertainty. We conducted tests on synthetic noise-corrupted data to investigate the sensitivity of our method in the presence of: i) an additive nonlinear background that simulates a regional field; ii) interfering anomalies with distinct amplitudes; iii) weak-, mid-, and strong-interfering anomalies; and iv) distinct noise levels. Most tests in the sensitivity analysis show that our methodology yielded better interpretations of the simulated magnetic sources than the methodology based on the rejection criteria of data amplitude and depth uncertainty. The only exception was the tests with high noise levels, in which the reliable Euler estimates selected either by our method or by the rejection criteria yielded poor interpretations. In an application to a real aeromagnetic survey from southern Brazil, we interpreted an isolated plug intrusion over the Anitápolis anomaly and a swarm of shallow-seated dikes with a northwest-southeast trend over the Paranaguá Terrane.
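An illustrative sketch (not the authors' code) of the selection idea: solve Euler's equation in each moving window and keep only estimates from windows whose vertical derivative has the largest standard deviation.

```python
import numpy as np

def euler_window(x, y, z, T, Tx, Ty, Tz, si):
    """Least-squares Euler solution for one data window; Tx, Ty, Tz are the
    field derivatives and si the tentative structural index."""
    A = np.column_stack([Tx, Ty, Tz, si * np.ones_like(T)])
    b = x * Tx + y * Ty + z * Tz + si * T
    (x0, y0, z0, base), *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.array([x0, y0, z0]), np.std(Tz)

# After looping over all windows, retain (say) the top decile of windows by
# std(Tz); tightly clustered retained estimates indicate the correct SI, and
# their mean gives the source position.
```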
       
  • A novel cellular automata model integrated with deep learning for dynamic
           spatio-temporal land use change simulation
    • Abstract: Publication date: Available online 8 February 2020. Source: Computers & Geosciences. Author(s): Weiran Xing, Yuehui Qian, Xuefeng Guan, Tingting Yang, Huayi Wu. Abstract: Land use change (LUC) exhibits obvious spatio-temporal dependency. Previous cellular automata (CA)-based methods usually treated LUC dynamics as Markov processes and proposed a series of CA-Markov models, which, however, were intrinsically unable to capture long-term temporal dependency. Moreover, such models used only the numerical proportion of neighboring land use (LU) types to represent the neighborhood effects of LUC, which inevitably neglected the complicated spatial heterogeneity and thus caused inaccurate simulation results. To address these problems, this paper presents a novel CA model integrated with deep learning (DL) techniques to model spatio-temporal LUC dynamics. Our DL-CA model first uses a convolutional neural network to capture latent spatial features for a complete representation of neighborhood effects. A recurrent neural network then extracts historical information on LUC from time-series land use maps. A random forest is appended as a binary change predictor to avoid the imbalanced-sample problem during model training. Land use data collected from 2000 to 2014 for Dongguan City, China were used to verify the proposed DL-CA model: the data from 2000 to 2009 were used for model training, the 2010 data for model validation, and the data from 2011 to 2014 for model evaluation. In addition, four traditional CA models, based on a multilayer perceptron (MLP-CA), support vector machine (SVM-CA), logistic regression (LR-CA), and random forest (RF-CA), were developed for accuracy comparisons. The simulation results demonstrate that the proposed DL-CA model accurately captures long-term spatio-temporal dependency for more accurate LUC predictions, raising prediction accuracy by 9.3%–11.67% over 2011–2014 relative to the traditional CA models.
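A compact, assumption-laden sketch of the DL-CA building blocks: a CNN encodes each cell's neighborhood patch, and a recurrent layer consumes the encoded time series of land-use maps. Patch size, layer widths, and the sigmoid change-probability head (standing in for the paper's random-forest binary change predictor) are all illustrative.

```python
from tensorflow import keras

patch, years, lu_types = 7, 10, 6        # 7x7 neighborhood, 2000-2009 input
cnn = keras.Sequential([                 # neighborhood-effect encoder
    keras.layers.Input(shape=(patch, patch, lu_types)),
    keras.layers.Conv2D(16, 3, activation="relu"),
    keras.layers.Flatten(),
    keras.layers.Dense(32, activation="relu"),
])
model = keras.Sequential([               # temporal dependency over the series
    keras.layers.Input(shape=(years, patch, patch, lu_types)),
    keras.layers.TimeDistributed(cnn),
    keras.layers.GRU(32),
    keras.layers.Dense(1, activation="sigmoid"),  # P(cell changes LU type)
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```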
       
  • BP neural network and improved differential evolution for transient
           electromagnetic inversion
    • Abstract: Publication date: Available online 7 February 2020. Source: Computers & Geosciences. Author(s): Ruiyou Li, Huaiqing Zhang, Qiong Zhuang, Ruiheng Li, Yue Chen. Abstract: In transient electromagnetic (TEM) inversion, the back-propagation (BP) neural network method is highly efficient because it avoids the complicated forward-model calculation in every iteration. The global optimization ability of the differential evolution (DE) algorithm is adopted to remedy the BP network's sensitivity to its initial parameters, and a DE variant with chaotic mutation and crossover with a constraint factor (CCDE) is proposed to improve the global optimization ability. The performance of the CCDE-BP algorithm is validated on two typical test functions and then on the inversion of two geoelectric models. The results show that the CCDE-BP method offers better inversion speed, accuracy, and stability, along with a higher degree of fit, making it feasible for geophysical inversion applications.
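A hedged sketch of the CCDE idea: generic differential evolution whose mutation factor is driven by a logistic chaotic map. The paper's exact constraint-factor crossover is not reproduced here.

```python
import numpy as np

def ccde(objective, bounds, pop_size=30, n_gen=200, cr=0.9):
    dim = len(bounds)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = lo + np.random.rand(pop_size, dim) * (hi - lo)
    fit = np.array([objective(p) for p in pop])
    z = 0.7                                    # chaotic state
    for _ in range(n_gen):
        z = 4.0 * z * (1.0 - z)                # logistic map in (0, 1)
        F = 0.4 + 0.5 * z                      # chaotic mutation factor
        for i in range(pop_size):
            a, b, c = pop[np.random.choice(pop_size, 3, replace=False)]
            trial = np.clip(a + F * (b - c), lo, hi)
            mask = np.random.rand(dim) < cr    # binomial crossover
            trial = np.where(mask, trial, pop[i])
            f = objective(trial)
            if f < fit[i]:                     # greedy selection
                pop[i], fit[i] = trial, f
    return pop[fit.argmin()], fit.min()

# The best individual can then seed the BP network's initial weights.
```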
       
  • Semi-automated component identification of a complex fracture network
           using a mixture of von Mises distributions: Application to the Ardeche
           margin (South-East France)
    • Abstract: Publication date: Available online 6 February 2020. Source: Computers & Geosciences. Author(s): Arezki Chabani, Caroline Mehl, Isabelle Cojan, Robin Alais, Dominique Bruel. Abstract: Proposing a quantitative description of the main fracture orientations is of prime interest for reservoir modeling. Manual sorting of fracture sets is time consuming and requires individual expertise, yet semi-automated methods for determining the number of fracture sets are not well developed in structural geology despite complex fracture networks being common. This study aims to demonstrate the value of mixtures of von Mises (MvM) distributions for modeling complex fracture datasets, based on data from the Ardeche margin (7800 km², SE France). An appraisal test selects the optimal number of components, without any a priori assumption, by plotting the cumulative weights of the MvM components against their concentrations. An index of concentration (I70) is also estimated to explicitly quantify the angular range around the mean, such that the probability of falling in the interval [μ − I70/2; μ + I70/2] is 0.7. Fitting and model selection are discussed for three datasets (fractures from geological maps at 1:50,000 and 1:250,000, and lineaments from a digital elevation model (DEM)), with basement and sedimentary cover data analyzed separately. Five-component MvM distributions correspond to the best-fit models for all datasets. The modeled components from the geological maps yield six mean orientations, FA to FF, striking N010–020, N050–060, N090–100, N120, N140–150, and N170–180, respectively. The basement records all six trends, whereas the cover records all of them except FE. Except for the N090–100 trend, the modeled components from the lineaments are similar to those obtained from the geological maps, and five of the main trends are consistent with fracture trends deduced from field studies. The robustness of the estimation is validated by the good reproducibility of results from one geological map to the other. The larger dispersion of the means for components FA and FF attests to the complex loading history of the fractures corresponding to these components.
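A minimal EM sketch (illustrative, not the authors' code) for fitting a mixture of von Mises distributions to fracture strikes. Strikes are axial, so angles are doubled onto the full circle before fitting, and the concentration update uses a standard approximation.

```python
import numpy as np
from scipy.stats import vonmises

def fit_mvm(theta, k=5, n_iter=200):
    """theta: strikes in radians on [0, pi); returns weights, means, kappas."""
    x = 2.0 * theta                            # axial -> circular data
    w = np.full(k, 1.0 / k)
    mu = np.linspace(0, 2 * np.pi, k, endpoint=False)
    kap = np.full(k, 2.0)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each strike.
        p = w * vonmises.pdf(x[:, None], kap, loc=mu)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: weighted mean direction and concentration per component.
        w = r.mean(axis=0)
        C = (r * np.cos(x[:, None])).sum(axis=0)
        S = (r * np.sin(x[:, None])).sum(axis=0)
        mu = np.arctan2(S, C) % (2 * np.pi)
        Rbar = np.sqrt(C**2 + S**2) / r.sum(axis=0)
        kap = Rbar * (2 - Rbar**2) / (1 - Rbar**2)   # approx. inverse of A(kappa)
    return w, (mu / 2.0) % np.pi, kap          # means back as axial strikes
```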
       
  • Increasing the maturity of measurements of essential climate variables
           (ECVs) at Italian atmospheric WMO/GAW observatories by implementing
           automated data elaboration chains
    • Abstract: Publication date: Available online 6 February 2020. Source: Computers & Geosciences. Author(s): Luca Naitza, Paolo Cristofanelli, Angela Marinoni, Francescopiero Calzolari, Fabrizio Roccato, Maurizio Busetto, Damiano Sferlazzo, Eleonora Aruffo, Piero Di Carlo, Mariantonia Bencardino, Francesco D'Amore, Francesca Sprovieri, Nicola Pirrone, Federico Dallo, Jacopo Gabrieli, Massimiliano Vardè, Giorgio Resci, Carlo Barbante, Paolo Bonasoni, Davide Putero. Abstract: In the framework of the National Project of Interest NextData, we developed automatic procedures for flagging and formatting trace-gas, atmospheric-aerosol, and meteorological data to be submitted to the World Data Centers (WDCs) of the World Meteorological Organization's Global Atmosphere Watch program (WMO/GAW). In particular, the atmospheric Essential Climate Variables (ECVs) covered in this work are observations of near-surface trace-gas concentrations, aerosol properties, and meteorological variables, which fall under the umbrella of the World Data Center for Greenhouse Gases (WDCGG), the World Data Center for Reactive Gases (WDCRG), and the World Data Center for Aerosol (WDCA). We developed an overarching processing chain to create a number of data products (data files and reports) starting from the raw data, ultimately increasing the maturity of these measurements. To this end, we implemented specific routines for data filtering, flagging, format harmonization, and the creation of data products, useful for detecting instrumental problems and particular atmospheric events and for quick data dissemination to stakeholders and citizens. Currently, the automatic data processing is active for a subset of ECVs at five measurement sites in Italy. The system is a valuable tool that helps data originators produce data more efficiently, and our effort is expected to accelerate data submission to WMO/GAW and other reference data centers and repositories. Moreover, the adoption of automatic procedures for data flagging and correction makes it possible to track the process that led to the final validated data, and makes data evaluation and revision more efficient by improving the traceability of the data production process.
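As a tiny illustration of one element such a chain might contain, the sketch below flags spikes in a near-surface trace-gas series with a rolling median/MAD test. The window, threshold, and flag codes are assumptions, not the NextData routines.

```python
import pandas as pd

def flag_spikes(series: pd.Series, window=11, n_mad=5.0) -> pd.Series:
    """Return per-sample quality flags: 'S' for spike, 'V' for valid."""
    med = series.rolling(window, center=True, min_periods=1).median()
    mad = (series - med).abs().rolling(window, center=True,
                                       min_periods=1).median()
    spike = (series - med).abs() > n_mad * mad
    return spike.map({True: "S", False: "V"})
```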
       
  • Integrated multi-scale reservoir data representation and indexing for
           reservoir data management and characterization
    • Abstract: Publication date: Available online 4 February 2020. Source: Computers & Geosciences. Author(s): Fangyu Li, Kailang Huang, Mao Pan, Xi Chen. Abstract: A single scale cannot satisfy the requirements of all researchers, so the management of multi-scale data is important to ensure appropriate representation of geological structures and reservoir characterisation. However, existing methods generally focus on two-dimensional vector maps, and existing multi-scale models and 3D indexing methods are not fully suitable for reservoir data organization and management, as they consider neither the relationships between spatial objects nor the heterogeneity of spatial variations. To address this issue, this study aimed to derive a multi-scale reservoir model from the finest-scale reservoir model, establish the relationships between models of different scales, and integrate the multi-scale reservoir models. This paper proposes a multi-scale tree representation and generation method based on the corner-point grid (CPG) that integrates multi-scale reservoir model representation and spatial indexing. Experiments on integrated indexing and rendering were performed using the proposed method, and its performance was evaluated through comparison with existing methods. The results show that the proposed integrated indexing is significantly faster than a non-vertical indexing method. In addition, the proposed algorithm consumed much less time than a popular program when rendering geological objects, and it was also faster at operations such as translation and rotation. Overall, efficient query and visualization of arbitrary layers at arbitrary scales and in arbitrary regions could be realized.
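For illustration only: the kind of parent-child relationship a multi-scale tree encodes can be mimicked on a regular grid by averaging 2×2×2 cell blocks level by level (a real corner-point grid and the paper's index structure are far richer than this sketch).

```python
import numpy as np

def coarsen_levels(prop, n_levels=3):
    """prop: 3D property array; returns grids from finest to coarsest,
    each coarse cell the mean of its eight children."""
    levels = [prop]
    for _ in range(n_levels):
        p = levels[-1]
        nz, ny, nx = (s // 2 for s in p.shape)
        p = p[: 2 * nz, : 2 * ny, : 2 * nx].reshape(nz, 2, ny, 2, nx, 2)
        levels.append(p.mean(axis=(1, 3, 5)))
    return levels
```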
       
  • ResIPy, an intuitive open source software for complex geoelectrical
           inversion/modeling
    • Abstract: Publication date: Available online 4 February 2020. Source: Computers & Geosciences. Author(s): Guillaume Blanchy, Sina Saneiyan, James Boyd, Paul McLachlan, Andrew Binley. Abstract: Electrical resistivity tomography (ERT) and induced polarization (IP) methods are now widely used in many interdisciplinary projects. Although field surveys using these methods are relatively straightforward, ERT and IP data require the application of inverse methods prior to any interpretation. Several established non-commercial inversion codes exist, but they typically require advanced knowledge to use effectively. ResIPy was developed to provide a more intuitive, user-friendly approach to inversion of geoelectrical data, using an open-source graphical user interface (GUI) and a Python application programming interface (API). ResIPy utilizes the mature R2/cR2 inversion codes for ERT and IP, respectively. The ResIPy GUI facilitates data import, data filtering, error modeling, mesh generation, data inversion, and plotting of inverse models. Furthermore, ResIPy's easy-to-use design and built-in help make it an effective educational tool. This paper highlights the rationale and structure behind the interface before demonstrating its capabilities in a range of environmental problems. Specifically, we demonstrate the ease with which ResIPy deals with topography, advanced data processing, the ability to fix and constrain regions of known geoelectrical properties, time-lapse analysis, and the capability for forward modeling and survey design.
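The Python API mirrors the GUI workflow; a minimal inversion might look like the following (the file name and survey format are illustrative, and the calls follow the R2 workflow documented for the package around the time of publication).

```python
from resipy import R2

k = R2()                                       # create an ERT project
k.createSurvey("survey.csv", ftype="Syscal")   # import and filter field data
k.invert()                                     # mesh + inversion with defaults
k.showResults()                                # plot the inverted section
```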
       
  • 3D modeling of ground-penetrating radar data across a realistic
           sedimentary model
    • Abstract: Publication date: Available online 3 February 2020. Source: Computers & Geosciences. Author(s): Philipp Koyan, Jens Tronicke. Abstract: Ground-penetrating radar (GPR) is an established geophysical tool to explore a wide range of near-surface environments. Today, the use of synthetic GPR data is largely limited to 2D because 3D modeling is computationally more expensive; in fact, only recent developments in modeling tools and powerful hardware allow for time-efficient computation of extensive 3D datasets. Thus, 3D subsurface models and the resulting GPR datasets, which are of great interest for developing and evaluating novel approaches in data analysis and interpretation, have not been made publicly available up to now. We use a published hydrofacies dataset from an aquifer-analog study within fluvio-glacial deposits to infer a realistic 3D porosity model showing heterogeneities at multiple spatial scales. Assuming fresh-water-saturated sediments, we generate synthetic 3D GPR data across this model using the novel GPU acceleration included in the open-source software gprMax. We present a numerical approach to examine 3D wave-propagation effects in the modeled GPR data. Using the results of this examination, we conduct a spatial model decomposition to enable a computationally efficient 3D simulation of a typical GPR reflection dataset across the entire model surface. We process the resulting GPR dataset using a standard 3D structural imaging sequence and compare the results to selected input data to demonstrate the feasibility and potential of the presented modeling studies. We conclude with conceivable applications of our 3D GPR reflection dataset and the underlying porosity model, both of which are publicly available and can thus support future methodological developments in GPR and other near-surface geophysical techniques.
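For orientation, a toy gprMax run on a far smaller domain than the study's model (all values illustrative). gprMax reads a plain-text input file of #commands; GPU acceleration is requested on the command line in recent versions.

```python
import subprocess

INPUT = """#title: toy 3D GPR model
#domain: 1.0 1.0 0.6
#dx_dy_dz: 0.005 0.005 0.005
#time_window: 12e-9
#material: 9 0.005 1 0 wet_sand
#box: 0 0 0 1.0 1.0 0.5 wet_sand
#waveform: ricker 1 400e6 src
#hertzian_dipole: z 0.45 0.5 0.55 src
#rx: 0.55 0.5 0.55
"""
with open("toy.in", "w") as f:
    f.write(INPUT)

subprocess.run(["python", "-m", "gprMax", "toy.in"])  # add "-gpu" for CUDA
```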
       
  • Automated extraction of in situ contact angles from micro-computed
           tomography images of porous media
    • Abstract: Publication date: Available online 3 February 2020. Source: Computers & Geosciences. Author(s): Anelechi Ibekwe, Dubravka Pokrajac, Yukie Tanino. Abstract: We present a simple and robust algorithm for the automated measurement of in situ contact angles from segmented micro-computed tomography images of immiscible fluid pairs in porous media, and from 2D slices of them. The algorithm comprises three steps: identification of contact points, an initial coarse estimate of the contact angle, and its refinement. To obtain the coarse estimate, we identify the vectors that point into the wetting phase and are normal to the fluid/fluid and fluid/solid surfaces within the vicinity of each contact point. The coarse estimate is then refined by fitting planes across the fluid/fluid and fluid/solid surfaces to obtain the final estimate of the contact angle. The algorithm was applied to a packed bed of glass spheres using air/water as the fluid pair. A wide distribution of contact angles spanning the full range of possible values was measured for both the 2D and 3D cases. The distributions are skewed towards their respective means and are longer-tailed than a Gaussian distribution. The mean contact angles were found to be approximately 65±21°, significantly larger than the bulk contact angle of 35±3° measured on a flat glass substrate using identical test fluids. The disagreement suggests that bulk contact angles can dramatically underestimate in situ contact angles even in the simplest porous media.
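The geometric core of the refinement step, sketched with numpy: at a contact point, the contact angle follows from the angle between the fitted fluid/fluid and fluid/solid plane normals (depending on the normal orientation convention, the supplement of the returned angle may be the one required).

```python
import numpy as np

def contact_angle(n_ff, n_fs):
    """n_ff: unit normal to the fluid/fluid interface; n_fs: unit normal to
    the fluid/solid interface, both oriented into the wetting phase."""
    c = np.clip(np.dot(n_ff, n_fs), -1.0, 1.0)
    return np.degrees(np.arccos(c))
```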
       
  • A 3D model for chimney formation in sedimentary basins
    • Abstract: Publication date: Available online 3 February 2020. Source: Computers & Geosciences. Author(s): Magnus Wangen. Abstract: This paper introduces a 3D model for chimney formation in tight rocks in sedimentary basins. It is an adaptation of a model for hydraulic fracturing in an anisotropic stress field by fluid injection (fracking). The model assumes that chimney formation is triggered and sourced by overpressure build-up in permeable units, such as reservoirs or aquifers. Cells in the numerical model fracture when the fluid pressure exceeds the least compressive stress plus a random rock strength. Chimney growth is represented by chains of cells (branches) that emanate from the base of the cap rock. The branches have an enhanced permeability during ascent, because the fluid pressure in the fracture network is greater than the least compressive stress. When the branches reach the hydrostatic surface, the fluid pressure drops below the fracture pressure and the fracture network closes. The reservoir is drained by the branches of the closed fracture network that reach the seafloor. The model produces pipe-like structures and chimneys as accumulations of branches that reach the surface; the degree of randomness in rock strength controls how pipe-like the chimneys become. Chimney formation stops when the rate of fluid leakage through the chimneys surpasses the production of excess fluid by the overpressure-building process. A "low" permeability of the chimney branches produces wide chimneys with many branches, and a "high" permeability gives narrow chimneys made of just a few branches. The model is demonstrated in a setup relevant to the chimneys observed in the cap rock over the Utsira aquifer in the North Sea. Using the proposed model, the permeability of such chimneys is estimated to be of the order of 10 μD.
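A heavily simplified 2D toy version of the fracturing rule (the paper's model is 3D with evolving overpressure): a cell fractures when fluid pressure exceeds the least compressive stress plus a random strength, and branches grow upward from the cap-rock base.

```python
import numpy as np

nz, nx = 60, 80
rng = np.random.default_rng(1)
stress = np.linspace(20.0, 5.0, nz)[:, None] * np.ones((nz, nx))  # MPa, upward
strength = rng.uniform(0.0, 3.0, size=(nz, nx))   # random rock strength
pressure = stress[0] + 4.0                        # overpressured base layer

fractured = np.zeros((nz, nx), dtype=bool)
fractured[0] = pressure > stress[0] + strength[0]           # seed branches
for k in range(1, nz):                                      # upward growth
    parent = fractured[k - 1]
    # A cell may fracture only if fed by a fractured cell just below it.
    feed = parent | np.roll(parent, 1) | np.roll(parent, -1)
    fractured[k] = feed & (pressure > stress[k] + strength[k])

print("branches reaching the surface:", int(fractured[-1].sum()))
```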
       
  • Comparing RSVD and Krylov methods for linear inverse problems
    • Abstract: Publication date: Available online 1 February 2020. Source: Computers & Geosciences. Author(s): Nick Luiken, Tristan van Leeuwen. Abstract: In this work we address regularization parameter estimation for ill-posed linear inverse problems with an ℓ2 penalty. Regularization parameter selection is of utmost importance for all inverse problems and generally relies on the experience of the practitioner. For regularization with an ℓ2 penalty, many parameter selection methods exploit the fact that the solution and the residual can be written in explicit form. Such parameter selection methods are functionals of the regularization parameter whose minimizer is the desired regularization parameter, which should lead to a good solution. Evaluating these functionals still requires solving the inverse problem multiple times, but this can be done efficiently through model order reduction. Two popular model order reduction techniques are Lanczos-based methods (a Krylov subspace method) and the randomized singular value decomposition (RSVD). In this work we compare the two approaches. We derive error bounds for the parameter selection methods using the RSVD, and we compare the performance of the Lanczos process with that of the RSVD for efficient parameter selection. The RSVD algorithm we use is based on the adaptive randomized range finder, which allows easy determination of the dimension of the reduced-order model. Some parameter selection methods also require the evaluation of the trace of a large matrix; we compare a randomized trace estimator with the use of Ritz values from the Lanczos process. The examples used in our experiments are two model problems from the geosciences.
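A sketch of why model order reduction pays off here: once a (randomized) SVD of the forward operator is available, the Tikhonov solution norm and residual for any regularization parameter follow from cheap filter-factor formulas, so a selection functional can be scanned without re-solving the inverse problem. The fixed rank and the L-curve-style outputs are illustrative choices.

```python
import numpy as np
from sklearn.utils.extmath import randomized_svd

def scan_tikhonov(A, b, alphas, rank=50):
    U, s, Vt = randomized_svd(A, n_components=rank, random_state=0)
    beta = U.T @ b
    curves = []
    for a in alphas:
        f = s**2 / (s**2 + a**2)               # Tikhonov filter factors
        x_norm = np.linalg.norm(f * beta / s)  # solution (semi)norm
        res = np.linalg.norm((1 - f) * beta)   # projected residual norm
        curves.append((a, res, x_norm))        # e.g. points of an L-curve
    return curves
```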
       
  • An interactive web-based geovisual analytics platform for co-clustering
           spatio-temporal data
    • Abstract: Publication date: Available online 30 January 2020. Source: Computers & Geosciences. Author(s): Xiaojing Wu, Ate Poorthuis, Raul Zurita-Milla, Menno-Jan Kraak. Abstract: Clustering methods are useful for analyzing patterns in big spatio-temporal data. However, previous studies typically rely on traditional clustering methods to explore spatial or temporal patterns. Co-clustering methods allow the concurrent analysis of spatial and temporal patterns by identifying location- and timestamp-clusters at the same time. By combining co-clustering with coordinated multiple views (CMV) in an interactive geovisual analytics platform, we facilitate the exploratory co-clustering analysis of spatio-temporal data and its results. Further enhanced by Web 2.0 standards, our geovisual analytics platform eases access to co-clustering analysis from any web browser. More specifically, the platform allows users to upload data and explore it visually using interactive CMV, which helps with the selection of co-clustering parameters. The platform also allows users to run co-clustering and to explore the results visually and interactively. To illustrate the use of the platform, we analyze Dutch annual average temperatures for 28 stations from 1992 to 2011. The results show that the platform not only helps users gain a better understanding of the dataset but also helps them choose the co-clustering parameters. The platform likewise helps to interpret the co-clustering results, and it supports the extraction and exploration of complex patterns buried in the data. In the era of big data, our web-based platform enables the exploration of concurrent spatio-temporal patterns from large datasets by combining computing power and human interpretative capabilities.
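As a standalone illustration of the analysis the platform supports, the sketch below co-clusters a synthetic stations × years temperature matrix; scikit-learn's spectral co-clustering stands in here for the platform's own co-clustering algorithm.

```python
import numpy as np
from sklearn.cluster import SpectralCoclustering

temps = np.random.rand(28, 20)   # 28 stations x 20 years (1992-2011)
model = SpectralCoclustering(n_clusters=4, random_state=0).fit(temps)
print("station clusters:", model.row_labels_)
print("year clusters:   ", model.column_labels_)
```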
       
  • Coupling OGC WPS and W3C PROV for provenance-aware geoprocessing workflows
    • Abstract: Publication date: Available online 28 January 2020. Source: Computers & Geosciences. Author(s): Mingda Zhang, Liangcun Jiang, Jing Zhao, Peng Yue, Xuequan Zhang. Abstract: With the advancement of cyberinfrastructure, an increasing number of geoprocessing functions are available on the Web, and scientific workflows are frequently used to orchestrate distributed services to address complex geospatial problems. Geospatial data provenance in workflow systems is very important because it can help evaluate data reliability and usability and reproduce data products, especially given the heterogeneous data and computing resources in the Web environment. W3C PROV is an expressive model for provenance information in the general domain; here it is extended using OGC WPS to describe provenance in geoprocessing workflows. A conceptual model that couples OGC WPS and W3C PROV is proposed, and its XML schema definitions are implemented. The new model provides more complete provenance information, including the geospatial data and geoprocessing services used and their plans, which helps advance provenance awareness in workflow systems. Coupling OGC WPS and W3C PROV also benefits from the maturity and interoperability of the existing standards.
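A hedged sketch using the `prov` Python package of the kind of statement the coupled model captures: a WPS process execution that used an input dataset and generated an output. The namespace and names are illustrative; the paper defines its own WPS extension of PROV.

```python
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("ex", "http://example.org/geoprocessing#")

src = doc.entity("ex:landsat_scene")          # input geospatial data
run = doc.activity("ex:ndvi_wps_execution")   # WPS Execute operation
out = doc.entity("ex:ndvi_map")               # generated data product

doc.used(run, src)
doc.wasGeneratedBy(out, run)
print(doc.get_provn())                        # PROV-N serialization
```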
       
  • Identifying microseismic events in a mining scenario using a convolutional
           neural network
    • Abstract: Publication date: Available online 23 January 2020. Source: Computers & Geosciences. Author(s): Andy H. Wilkins, Andrew Strange, Yi Duan, Xun Luo. Abstract: Microseismic monitoring in mining can potentially forecast catastrophic disasters and help in the optimization of day-to-day operational and safety issues. However, it is difficult to fully utilize microseismic data because of the large quantity of small microseismic events and because automatic event detection is hampered by contamination of the microseismic signal by instrument noise and mining activity. Recent research has demonstrated that convolutional neural networks (CNNs) trained on vast seismic datasets can accurately detect seismic events. We train a CNN on hand-labeled, modestly sized, multi-channel microseismic data from a coal mine, and we make the labeled data available as part of this paper. Using a k-fold cross-validation approach, we demonstrate that the CNN surpasses the accuracy of a human microseismic expert, both in picking more true events and in eliminating more spurious (false) events. This demonstrates the feasibility of including a CNN in an automated system to detect, classify, and locate microseismic events for use in mine safety and operation.
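A minimal sketch of a multi-channel event/noise classifier of the kind described (window length, channel count, and architecture are assumptions).

```python
from tensorflow import keras

n_samples, n_channels = 1024, 6       # one windowed multi-channel trace
model = keras.Sequential([
    keras.layers.Input(shape=(n_samples, n_channels)),
    keras.layers.Conv1D(16, 15, strides=2, activation="relu"),
    keras.layers.MaxPooling1D(4),
    keras.layers.Conv1D(32, 7, activation="relu"),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(1, activation="sigmoid"),  # P(true microseismic event)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```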
       
  • An improved partitioning method for dissolving long and narrow patches
    • Abstract: Publication date: Available online 14 November 2019. Source: Computers & Geosciences. Author(s): Chengming Li, Pengda Wu, Yong Yin, Wei Wu. Abstract: Partition strategies are the only practical way to solve automatic generalization problems for massive map data. However, when partition strategies are introduced to dissolve long and narrow patches, traditional partitioning methods make it hard to balance the computational load across machines, and grid boundaries easily degrade the accuracy and stability of the dissolution process. Hence, this paper proposes an improved partitioning method for dissolving long and narrow patches. First, the vast quantity of patches is roughly partitioned using regular grids, and the grids are then partitioned more finely based on the area balance of long and narrow patches. Second, the topology is established and combined with the semantic information of the boundary arcs to correct the boundaries of the regular grids. Finally, actual data from the national geographic census at 1:10,000 scale (125,779 patches) for Chishui City, Guizhou Province are used for validation. The results demonstrate that the proposed method is highly efficient and rational.
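A small sketch of one plausible building block: flagging "long and narrow" patches by the elongation of their minimum rotated bounding rectangle. The threshold is illustrative, and the paper's workflow additionally balances partition areas and corrects grid boundaries topologically.

```python
from shapely.geometry import Point, Polygon

def is_long_and_narrow(patch: Polygon, min_elongation=5.0) -> bool:
    rect = patch.minimum_rotated_rectangle    # oriented bounding box
    c = list(rect.exterior.coords)            # closed ring of 5 points
    side_a = Point(c[0]).distance(Point(c[1]))
    side_b = Point(c[1]).distance(Point(c[2]))
    short = max(min(side_a, side_b), 1e-9)
    return max(side_a, side_b) / short >= min_elongation
```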
       
 