
J : Multidisciplinary Scientific Journal

  Open Access journal
ISSN (Online) 2571-8800
Published by MDPI  [84 journals]
  • J, Vol. 5, Pages 198-213: Examination of the Performance of a Three-Phase
           Atmospheric Turbulence Model for Line-Source Dispersion Modeling Using
           Multiple Air Quality Datasets

    • Authors: Saisantosh Vamshi Harsha Madiraju, Ashok Kumar
      First page: 198
      Abstract: One of the weaknesses of current line-source models for predicting downwind concentrations from mobile sources is accounting for the dispersion of effluents. Investigators in the field have taken different approaches over the last 50 years, ranging from the use of Pasquill–Gifford (P-G) dispersion curves to the use of equations based on atmospheric turbulence for point source dispersion. Madiraju and Kumar (2021) proposed a three-phase turbulence (TPT) model using the key features of mobile source dispersion that appear in the existing literature. This paper examines the performance of line-source models using an updated TPT model. The generic dispersion equations were considered from the SLINE 1.1, CALINE 4, ADMS, and SLSM models. Multiple air quality field datasets collected by other investigators near roadways were used during this study. These include field data collected from the Idaho Falls Tracer Experiment 2008 (used as the dataset to compare with the initial model), the CALTRANS Highway 99 Tracer experiment, and the Raleigh 2006 experiment. The predicted concentrations were grouped under unstable and stable atmospheric conditions. The evaluation of the models was performed using several statistical parameters such as FB, NMSE, R2, MG, VG, MSLE, and MAPE. The results indicate that the ADMS and SLINE 1.1 models perform better than CALINE4 and SLSM. SLINE 1.1 tends to overpredict for stable atmospheric conditions and underpredict for unstable atmospheric conditions. A trial test was performed to implement the TPT model in the basic line-source model (SLSM). The results indicate that the majority of the indicators (FB, NMSE, R2, and MSLE) improved and fall within the satisfactory range for good model performance.
      Citation: J
      PubDate: 2022-03-29
      DOI: 10.3390/j5020015
      Issue No: Vol. 5, No. 2 (2022)
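Two of the evaluation statistics named in the abstract above, FB and NMSE, have standard definitions in air-quality model evaluation; a minimal Python sketch assuming paired observed/predicted concentration arrays (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def fb(obs, pred):
    """Fractional bias: 0 for a perfect model, positive when underpredicting."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def nmse(obs, pred):
    """Normalized mean square error: 0 for a perfect model."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return ((obs - pred) ** 2).mean() / (obs.mean() * pred.mean())

obs = [1.0, 2.0, 3.0]
pred = [1.1, 1.9, 3.2]
print(f"FB = {fb(obs, pred):.3f}, NMSE = {nmse(obs, pred):.3f}")
```

A "satisfactory range" is usually quoted as, e.g., |FB| below some threshold and NMSE near zero; the paper's exact acceptance criteria are not given in this listing.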
  • J, Vol. 5, Pages 214-231: Affinity and Correlation in DNA

    • Authors: Giovanni Villani
      First page: 214
      Abstract: A statistical analysis of important DNA sequences and related proteins has been performed to study the relationships between monomers, and some general considerations about these macromolecules can be provided from the results. First, the most important relationship between sites in all the DNA sequences examined is that between two consecutive base pairs. This is an indication of an energetic stabilization due to the stacking interaction of these couples of base pairs. Secondly, the difference between human chromosome sequences and their coding parts is relevant both in the relationships between sites and in some specific compositional rules, such as the second Chargaff rule. Third, the evidence of the relationship in two successive triplets of DNA coding sequences generates a relationship between two successive amino acids in the proteins. This is obviously impossible if all the relationships between the sites are statistical evidence and do not involve causes; therefore, in this article, due to stacking interactions and this relationship in coding sequences, we will divide the concept of the relationship between sites into two concepts: affinity and correlation, the first with physical causes and the second without. Finally, from the statistical analyses carried out, it will emerge that the human genome is uniform, with the only significant exception being the Y chromosome.
      Citation: J
      PubDate: 2022-04-14
      DOI: 10.3390/j5020016
      Issue No: Vol. 5, No. 2 (2022)
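The two statistical observations in the abstract above, relationships between consecutive base pairs and the second Chargaff rule, can both be probed by simple counting along one strand; a toy sketch (sequences and function names are illustrative):

```python
from collections import Counter

def dinucleotide_counts(seq):
    """Count consecutive base pairs (dinucleotides) along one strand."""
    seq = seq.upper()
    return Counter(seq[i:i + 2] for i in range(len(seq) - 1))

def chargaff2_deviation(seq):
    """Second Chargaff rule: within a single strand, %A ~ %T and %C ~ %G.
    Returns the absolute deviation (as a fraction) for each pairing."""
    seq = seq.upper()
    n = len(seq)
    counts = Counter(seq)
    return (abs(counts['A'] - counts['T']) / n,
            abs(counts['C'] - counts['G']) / n)

print(dinucleotide_counts("ATGCGATT").most_common(2))
print(chargaff2_deviation("ATGCGCAT"))
```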
  • J, Vol. 5, Pages 232-254: Quantum Matter Overview

    • Authors: Melanie Swan, Renato P. Dos Santos, Frank Witte
      First page: 232
      Abstract: Quantum matter (novel phases of matter at zero temperature with exotic properties) is a growing field with applications in its own domain, and in providing foundational support to quantum sciences fields more generally. The ability to characterize and manipulate matter at the smallest scales continues to advance in fundamental ways. This review provides a plain-language, non-technical description of contemporary activity in quantum matter for a general science audience, and an example of these methods applied to quantum neuroscience. Quantum matter is the study of topologically governed phases of matter at absolute zero temperature that exhibit new kinds of emergent order and exotic properties related to topology and symmetry, entanglement, and electronic charge and magnetism, which may be orchestrated to create new classes of materials and computational devices (including in the areas of spintronics, valleytronics, and quantum computing). The paper is organized to discuss recent developments in quantum matter on the topics of short-range topologically protected materials (namely, topological semimetals), long-range entangled materials (quantum spin liquids and fractional quantum Hall states), and codes for characterizing and controlling quantum systems. A key finding is that a shift in the conceptualization of the field of quantum matter may be underway to expand the core focus on short-range topologically protected materials to also include geometry-based approaches and long-range entanglement as additionally important tools for the understanding, characterization, and manipulation of topological materials.
      Citation: J
      PubDate: 2022-04-20
      DOI: 10.3390/j5020017
      Issue No: Vol. 5, No. 2 (2022)
  • J, Vol. 5, Pages 255-276: Modelling of Positive Streamers in SF6 Gas under
           Non-Uniform Electric Field Conditions: Effect of Electronegativity on
           Streamer Discharges

    • Authors: Boakye-Mensah, Bonifaci, Hanna, Niyonzima, Timoshkin
      First page: 255
      Abstract: The use of SF6 in electrical insulation and fast-switching applications cannot be overemphasized. This is due to its excellent dielectric properties and high breakdown voltage, which are especially important for practical applications such as gas-insulated switchgears and pulsed power switches where pressurized SF6 is used. Breakdown in the gas occurs via streamer–leader transition; however, this transition is difficult to quantify numerically at atmospheric pressure because of the electronegativity of the gas. In the present work, streamer discharges in SF6 gas at pressures of 10 and 100 kPa were studied using a plasma fluid model implementation. Analysis of the electric field in the streamer body, streamer velocity, diameter, and the effect of the high electronegativity of the gas on streamer parameters are presented for positive polarity in a point-to-plane geometry. The streamers in SF6 for non-uniform background fields are compared to those in air, which have already been studied extensively in the literature.
      Citation: J
      PubDate: 2022-05-09
      DOI: 10.3390/j5020018
      Issue No: Vol. 5, No. 2 (2022)
  • J, Vol. 5, Pages 277-286: Improvement of the Photocatalytic Activity of
           Au/TiO2 Nanocomposites by Prior Treatment of TiO2 with Microplasma in an
           NH3 and H2O2 Solution

    • Authors: Nguyen Thi Thu Thuy, Do Hoang Tung, Le Hong Manh, Pham Hong Minh, Nguyen The Hien
      First page: 277
      Abstract: Plasmonic photocatalytic nanocomposites of TiO2 and Au nanoparticles (NPs) have recently attracted the attention of researchers, who aim to improve the photocatalytic activity of TiO2 NPs. In this study, we report photocatalytic activity enhancement for an Au/TiO2 nanocomposite prepared by the plasma–liquid interaction method using an atmospheric microplasma apparatus. The enhanced photocatalytic activity of the prepared Au/TiO2 is demonstrated by the degradation of methylene blue (MB) in water under both ultraviolet (UV) and visible light irradiation. The prior treatment of TiO2 with microplasma in an NH3 and H2O2 solution is found to strongly improve the photocatalytic activity of both the treated TiO2 NPs and the synthesized Au/TiO2 nanocomposite.
      Citation: J
      PubDate: 2022-05-19
      DOI: 10.3390/j5020019
      Issue No: Vol. 5, No. 2 (2022)
  • J, Vol. 5, Pages 287-297: The Use of Radioactive Tracers to Detect and
           Correct Feed Flowrate Imbalances in Parallel Flotation Banks

    • Authors: Felipe Henríquez, Luis Maldonado, Juan Yianatos, Paulina Vallejos, Francisco Díaz, Luis Vinnett
      First page: 287
      Abstract: This work presents the application of radioactive tracers to detect and correct feed flowrate imbalances in parallel rougher flotation banks. Several surveys were conducted at the Minera Los Pelambres concentrator, in banks consisting of 250 m3 mechanical flotation cells. The feed pulp distribution was estimated from the mean residence times, which were obtained from residence time distribution measurements. The tracer was injected into the feed distributor, and the inlet and outlet tracer signals of cells 1 and 2 were measured by on-stream sensors. The baseline condition for the pulp distribution was defined by the valve settings in the feed distributor, which led to an unbalanced condition for two parallel rougher banks, with 34% of the pulp being fed to bank A and 66% to bank B. New valve configurations were evaluated, with a fraction of the feed being directed to rougher bank C, which was not initially fed from the same distributor. The feed distribution was finally balanced, with 49% of the pulp being fed to bank A versus 51% to bank B. Thus, the radioactive tracers proved to be a powerful tool for industrially detecting and improving feed distributions in parallel flotation circuits.
      Citation: J
      PubDate: 2022-06-12
      DOI: 10.3390/j5020020
      Issue No: Vol. 5, No. 2 (2022)
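The mean residence times mentioned above are the first moments of the measured residence time distributions; a sketch of that estimate, assuming a sampled tracer signal C(t) (names and numbers are illustrative):

```python
import numpy as np

def mean_residence_time(t, c):
    """First moment of a residence time distribution:
    t_mean = sum(t * C(t) dt) / sum(C(t) dt) for a tracer signal C(t)."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    w = c * np.gradient(t)          # C(t) dt weights, works on uneven grids too
    return float((t * w).sum() / w.sum())

# Ideal perfectly mixed cell: exponential washout with tau = 10 (illustrative)
t = np.linspace(0.0, 60.0, 601)
c = np.exp(-t / 10.0)
print(mean_residence_time(t, c))    # close to (slightly below) 10
```

Comparing the mean residence times of parallel banks then gives the relative feed split, since a bank receiving more pulp empties faster for the same cell volume.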
  • J, Vol. 5, Pages 298-317: Principal Component Analysis and Related Methods
           for Investigating the Dynamics of Biological Macromolecules

    • Authors: Kitao
      First page: 298
      Abstract: Principal component analysis (PCA) is used to reduce the dimensionalities of high-dimensional datasets in a variety of research areas. For example, biological macromolecules, such as proteins, exhibit many degrees of freedom, allowing them to adopt intricate structures and exhibit complex functions by undergoing large conformational changes. Therefore, molecular simulations of and experiments on proteins generate a large number of structure variations in high-dimensional space. PCA and many PCA-related methods have been developed to extract key features from such structural data, and these approaches have been widely applied for over 30 years to elucidate macromolecular dynamics. This review mainly focuses on the methodological aspects of PCA and related methods and their applications for investigating protein dynamics.
      Citation: J
      PubDate: 2022-06-20
      DOI: 10.3390/j5020021
      Issue No: Vol. 5, No. 2 (2022)
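As a concrete reading of the PCA step described above: the covariance matrix of mean-centered coordinate snapshots is diagonalized, and the top eigenvectors give the dominant collective motions. A minimal sketch on synthetic data (shapes and names are illustrative):

```python
import numpy as np

def pca_modes(X, k=2):
    """PCA of conformational snapshots.
    X: (n_frames, n_coords) array of aligned Cartesian coordinates.
    Returns the top-k eigenvalues and eigenvectors (principal modes)
    of the coordinate covariance matrix."""
    Xc = X - X.mean(axis=0)              # subtract the mean structure
    cov = Xc.T @ Xc / (len(X) - 1)       # coordinate covariance matrix
    w, v = np.linalg.eigh(cov)           # eigenvalues in ascending order
    order = np.argsort(w)[::-1][:k]      # take the k largest
    return w[order], v[:, order]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))            # 100 snapshots of 2 atoms x 3 coords
vals, modes = pca_modes(X, k=2)
print(vals, modes.shape)
```

For real trajectories the snapshots are first superposed onto a common frame to remove overall translation and rotation, which the sketch omits.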
  • J, Vol. 5, Pages 1-14: Direct Photon Production in High-Energy Heavy Ion
           Collisions within the Integrated Hydrokinetic Model

    • Authors: Yuri Sinyukov, Volodymyr Shapoval
      First page: 1
      Abstract: Results on the description of direct photon yields, transverse momentum spectra, and flow harmonics, measured in ultrarelativistic heavy-ion collisions at the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC) for different collision centrality classes and analyzed within the Integrated Hydrokinetic Model (iHKM), are reviewed. The iHKM simulation results, corresponding to the two opposite approaches to the matter evolution treatment at the final stage of the system’s expansion within the model, namely, the chemically equilibrated and the chemically frozen evolution, are compared. The so-called “direct photon puzzle” is addressed, and its possible solution, suggesting that additional photon emission at confinement be taken into account, is considered.
      Citation: J
      PubDate: 2022-01-06
      DOI: 10.3390/j5010001
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 15-34: Efficient Color Correction Using Normalized
           Singular Value for Duststorm Image Enhancement

    • Authors: Ho-Sang Lee
      First page: 15
      Abstract: A duststorm image has a reddish or yellowish color cast. Though a duststorm image and a hazy image are obtained using the same process, a hazy image has no color distortion as it has not been disturbed by particles, but a duststorm image has color distortion owing to an imbalance in the color channels caused by sand particles. As a result, a duststorm image has a degraded color channel, which is rare in certain channels. Therefore, a color balance step is needed to enhance a duststorm image naturally. This study goes through two steps to improve a duststorm image. The first is a color balance step using singular value decomposition (SVD). The singular value reflects the image’s diversity features, such as contrast. A duststorm image has a distorted color channel, with a different singular value on each color channel. In a low-contrast image, the singular value is low, and vice versa. Therefore, using each channel’s singular value, the color channels can be balanced. Because the color-balanced image has features similar to a hazy image, a dehazing step is needed to improve the balanced image. In general, the dark channel prior (DCP) is frequently applied in the dehazing step. However, the existing DCP method has a halo effect similar to an over-enhanced image due to a dark channel and a patch image. To address this point, this study proposes an adjustable DCP (ADCP). In the experimental results, the proposed method was superior to state-of-the-art methods both subjectively and objectively.
      Citation: J
      PubDate: 2022-01-10
      DOI: 10.3390/j5010002
      Issue No: Vol. 5, No. 1 (2022)
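The color balance idea described above, using each channel's singular values to equalize the channels, can be caricatured in a few lines. This is a rough sketch of the general principle only, not the paper's exact algorithm (all names and the equalization target are illustrative):

```python
import numpy as np

def balance_channels(img):
    """Rescale each color channel so its largest singular value matches the
    mean largest singular value across channels (illustrative only)."""
    img = img.astype(float)
    s_max = [np.linalg.svd(img[..., ch], compute_uv=False)[0] for ch in range(3)]
    target = float(np.mean(s_max))
    out = np.stack([img[..., ch] * (target / s_max[ch]) for ch in range(3)],
                   axis=-1)
    return np.clip(out, 0.0, 255.0)

rng = np.random.default_rng(1)
img = rng.uniform(0.0, 1.0, size=(8, 8, 3))
img[..., 2] *= 0.3                       # simulate a degraded (weak) channel
balanced = balance_channels(img)
print(balanced.shape)
```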
  • J, Vol. 5, Pages 35-51: Formalising the R of Reduce in a Circular Economy
           Oriented Design Methodology for Pedestrian and Cycling Bridges

    • Authors: Kostas Anastasiades, Thijs Lambrechts, Jaan Mennes, Amaryllis Audenaert, Johan Blom
      First page: 35
      Abstract: The construction industry consumes over 32% of the annually excavated natural resources worldwide. Additionally, it is responsible for 25% of the annually generated solid waste. To become a more sustainable industry, a circular economy is necessary: resources are kept in use as long as possible, aiming to reduce and recirculate natural resources. In this paper, the investigation focuses on pedestrian truss bridges of the types Warren and Howe. Many pedestrian bridges currently find themselves in their end-of-life phase and most commonly these bridges are demolished and rebuilt, thus needing a lot of new materials and energy. The aim is thus first and foremost to reduce the amount of necessary new materials. For this reason, a design tool will be created, using the software ‘Matlab’, in which truss bridges can be evaluated and compared in the conceptual design stage. The tool is based on the theory of morphological indicators: the volume indicator, displacement indicator, buckling indicator and first natural frequency indicator. These allow a designer to determine the most material efficient Warren or Howe truss bridge design with user-defined constraints concerning deflection, load frequency, buckling and overall dimension. Subsequently, the tool was tested and compared to calculations made in the finite element modelling software Diamonds. In total, 72 steel bridge structures were tested. From these it could be concluded that the manual calculations in Diamonds in general confirmed the results obtained with the automated design tool based on morphological indicators. As such, it allows a designer to converge more quickly towards the best performing structure, thus saving time, materials, and corresponding costs and energy.
      Citation: J
      PubDate: 2022-01-17
      DOI: 10.3390/j5010003
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 52-63: On the Tree Gauge in Magnetostatics

    • Authors: Francesca Rapetti, Ana Alonso Rodríguez, Eduardo De Los Santos
      First page: 52
      Abstract: We recall the classical tree-cotree technique in magnetostatics. (1) We extend it in the frame of high-order finite elements in general domains. (2) We focus on its connection with the question of the invertibility of the final algebraic system arising from a high-order edge finite element discretization of the magnetostatic problem formulated in terms of the magnetic vector potential. With the same purpose of invertibility, we analyse another classically used condition, the Coulomb gauge. (3) We conclude by underlining that the two gauges can be naturally considered in a high-order framework without any restriction on the topology of the domain.
      Citation: J
      PubDate: 2022-01-21
      DOI: 10.3390/j5010004
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 64-91: Law, Socio-Legal Governance, the Internet of
           Things, and Industry 4.0: A Middle-Out/Inside-Out Approach

    • Authors: Pompeu Casanovas, Louis de Koker, Mustafa Hashmi
      First page: 64
      Abstract: The Web of Data, the Internet of Things, and Industry 4.0 are converging, and society is challenged to ensure that appropriate regulatory responses can uphold the rule of law fairly and effectively in this emerging context. The challenge extends beyond merely submitting digital processes to the law. We contend that the 20th century notion of ‘legal order’ alone will not be suitable to produce the social order that the law should bring. The article explores the concepts of rule of law and of legal governance in digital and blockchain environments. We position legal governance from an empirical perspective, i.e., as an explanatory and validation concept to support the implementation of the rule of law in the new digital environments. As a novel contribution, this article (i) progresses some of the work done on the metarule of law and complements the SMART middle-out approach with an inside-out approach to digital regulatory systems and legal compliance models; (ii) sets the state-of-the-art and identifies the way to explain and validate legal information flows and hybrid agents’ behaviour; (iii) describes a phenomenological and historical approach to legal and political forms; and (iv) shows the utility of separating enabling and driving regulatory systems.
      Citation: J
      PubDate: 2022-01-21
      DOI: 10.3390/j5010005
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 92-104: Photocatalytic H2 Production on Au/TiO2: Effect
           of Au Photodeposition on Different TiO2 Crystalline Phases

    • Authors: Stefano Andrea Balsamo, Salvatore Sciré, Marcello Condorelli, Roberto Fiorenza
      First page: 92
      Abstract: In this work, we investigated the role of the crystalline phases of titanium dioxide in the solar photocatalytic H2 production by the reforming of glycerol, focusing the attention on the influence of photodeposited gold, as a metal co-catalyst, on TiO2 surface. We correlated the photocatalytic activity of 1 wt% Au/TiO2 in anatase, rutile, and brookite phases with the structural and optical properties determined by Raman spectroscopy, N2 adsorption–desorption measurements, UV–vis Diffuse Reflectance Spectroscopy (UV–vis DRS), X-ray photoelectron spectroscopy (XPS), Photoluminescence spectroscopy (PL), and Dynamic Light scattering (DLS). The best results (2.55 mmol H2 gcat−1 h−1) were obtained with anatase and gold photodeposited after 30 min of solar irradiation. The good performance of Au/TiO2 in anatase form and the key importance of the strong interaction between gold and the peculiar crystalline phase of TiO2 can be a starting point to efficiently improve photocatalysts design and experimental conditions, in order to favor a green hydrogen production through solar photocatalysis.
      Citation: J
      PubDate: 2022-01-24
      DOI: 10.3390/j5010006
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 105-106: Acknowledgment to Reviewers of J in 2021

    • Authors: J Editorial Office
      First page: 105
      Abstract: Rigorous peer-reviews are the basis of high-quality academic publishing [...]
      Citation: J
      PubDate: 2022-01-28
      DOI: 10.3390/j5010007
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 107-113: A Comparison of Post-Operative Occlusion with
           3-D vs. 2-D Miniplate Fixation in the Management of Isolated Mandibular
           Angle Fractures

    • Authors: Anosha Mujtaba, Namrah Rafiq Malik, Muhammad Farooq Umer, Hasan Mujtaba, Shumaila Zofeen, Zahoor Ahmad Rana
      First page: 107
      Abstract: Mandibular angle fractures (MAFs) are treated in a variety of ways; however, the standard therapy is still up for debate. Despite the fact that many studies have generated evidence for the appropriate biomechanical stability of 3-D miniplates, there is an insufficient amount of data on the treatment of mandibular angle fractures with these plates. A comparative study was conducted at The Department of Oral and Maxillofacial Surgery, Pakistan Institute of Medical Sciences (PIMS), Islamabad. Patients were randomly divided into two groups of 52 patients each. Patients in group A were treated with 3-D miniplate placement on the lateral cortex following the principle of 3-D fixation proposed by Farmand and Dupoirieux, whereas patients included in group B were treated using 2-D conventional miniplate, placed according to Champy’s line of ideal osteosynthesis. A single surgical team performed the procedure. On the first and seventh post-operative days, the first month, and then the third month after surgery, regular evaluations were conducted. Assessment regarding Post Open Reduction and Internal Fixation (ORIF) occlusion was performed with the help of measuring tools. On the first day post-operative follow-up, 41 (78.8%) patients in group A and 31 (59.6%) patients in group B had satisfactory occlusion. The seventh day post-operative follow-up showed that 43 (82.7%) patients in group A and 41 (78.8%) patients in group B had satisfactory occlusion (p > 0.05). In both treatment groups, the first and third month follow-up evaluations revealed optimal occlusion. In comparison to conventional 2-D miniplate, the 3-D miniplate system produced better results and can be recommended as a better option for the management of mandibular angle fractures.
      Citation: J
      PubDate: 2022-02-02
      DOI: 10.3390/j5010008
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 114-125: Structural Stability Analysis of Proteins Using
           End-to-End Distance: A 3D-RISM Approach

    • Authors: Yutaka Maruyama, Ayori Mitsutake
      First page: 114
      Abstract: The stability of a protein is determined by its properties and the surrounding solvent. In our previous study, the total energy, as a sum of the conformational and solvation free energies, was demonstrated to be an appropriate energy function for evaluating the stability of a protein in a protein folding system. We plotted the various energies against the root mean square deviation, which requires a reference structure. Herein, we replotted the various energies against the end-to-end distance between the N- and C-termini, which requires no reference structure and is experimentally measurable. The solvation free energies for all proteins tend to be low as the end-to-end distance increases, whereas the conformational energies tend to be low as the end-to-end distance decreases. The end-to-end distance is thus one of the interesting measures for studying the behavior of proteins.
      Citation: J
      PubDate: 2022-02-14
      DOI: 10.3390/j5010009
      Issue No: Vol. 5, No. 1 (2022)
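The end-to-end distance used above is just the Euclidean distance between the first and last residues' coordinates, which is why no reference structure is needed; a minimal sketch (coordinates are illustrative):

```python
import numpy as np

def end_to_end_distance(coords):
    """Euclidean distance between the first and last entries of an
    (n, 3) chain coordinate array (N- and C-termini for a protein)."""
    coords = np.asarray(coords, float)
    return float(np.linalg.norm(coords[-1] - coords[0]))

chain = [[0.0, 0.0, 0.0], [1.0, 1.0, 0.0], [3.0, 4.0, 0.0]]
print(end_to_end_distance(chain))        # 3-4-5 triangle: prints 5.0
```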
  • J, Vol. 5, Pages 126-138: Metrics, Explainability and the European AI Act

    • Authors: Francesco Sovrano, Salvatore Sapienza, Monica Palmirani, Fabio Vitali
      First page: 126
      Abstract: On 21 April 2021, the European Commission proposed the first legal framework on Artificial Intelligence (AI) to address the risks posed by this emerging method of computation. The Commission proposed a Regulation known as the AI Act. The proposed AI Act considers not only machine learning but also expert systems and statistical models that have long been in place. Under the proposed AI Act, new obligations are set to ensure transparency, lawfulness, and fairness. Their goal is to establish mechanisms to ensure quality at launch and throughout the whole life cycle of AI-based systems, thus ensuring legal certainty that encourages innovation and investment in AI systems while preserving fundamental rights and values. A standardisation process is ongoing: several entities (e.g., ISO) and scholars are discussing how to design systems that are compliant with the forthcoming Act, and explainability metrics play a significant role. Specifically, the AI Act sets some new minimum requirements of explicability (transparency and explainability) for the AI systems labelled as “high-risk” in Annex III. These requirements include a plethora of technical explanations capable of covering the right amount of information in a meaningful way. This paper aims to investigate how such technical explanations can be deemed to meet the minimum requirements set by the law and expected by society. To answer this question, we propose an analysis of the AI Act, aiming to understand (1) what specific explicability obligations are set and who shall comply with them and (2) whether any metric for measuring the degree of compliance of such explanatory documentation could be designed. Moreover, by envisaging the legal (or ethical) requirements that such a metric should possess, we discuss how to implement them in a practical way. 
More precisely, drawing inspiration from recent advancements in the theory of explanations, our analysis proposes that metrics to measure the kind of explainability endorsed by the proposed AI Act shall be risk-focused, model-agnostic, goal-aware, intelligible, and accessible. Therefore, we discuss the extent to which these requirements are met by the metrics currently under discussion.
      Citation: J
      PubDate: 2022-02-18
      DOI: 10.3390/j5010010
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 139-149: The Good, the Bad, and the Invisible with Its
           Opportunity Costs: Introduction to the ‘J’ Special Issue on
           “the Impact of Artificial Intelligence on Law”

    • Authors: Ugo Pagallo, Massimo Durante
      First page: 139
      Abstract: Scholars and institutions have been increasingly debating the moral and legal challenges of AI, together with the models of governance that should strike the balance between the opportunities and threats brought forth by AI, its ‘good’ and ‘bad’ facets. There are more than a hundred declarations on the ethics of AI, and recent proposals for AI regulation, such as the European Commission’s AI Act, have further multiplied the debate. Still, one normative challenge of AI is mostly overlooked: the underuse, rather than the misuse or overuse, of AI from a legal viewpoint. From health care to environmental protection, from agriculture to transportation, there are many instances of how the whole set of benefits and promises of AI can be missed or exploited far below its full potential, and for the wrong reasons: business disincentives and greed among data keepers, bureaucracy and professional reluctance, or public distrust in the era of no-vax conspiracy theories. The opportunity costs that follow this technological underuse are almost terra incognita due to the ‘invisibility’ of the phenomenon, which includes the ‘shadow prices’ of the economy. This introduction provides metrics for such an assessment and relates this work to the development of new standards for the field. We must quantify how much it costs not to use AI systems for the wrong reasons.
      Citation: J
      PubDate: 2022-02-19
      DOI: 10.3390/j5010011
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 150-165: Strategies for Studying Acidification and
           Eutrophication Potentials, a Case Study of 150 Countries

    • Authors: Modeste Kameni Nematchoua
      First page: 150
      Abstract: Acidification and eutrophication are two environmental impacts that have a significant effect on air pollution and human health. The quantitative analysis of these two impacts remains hitherto unknown at the scale of new neighborhoods. The main objective of this study is to evaluate, analyze and compare the acidification and eutrophication potentials of one neighborhood initially located in Belgium. To make this comparison, this neighborhood was modeled in 149 other countries by varying four parameters: building materials, energy mix, occupants’ mobility and local climate. The environmental costs of acidification and eutrophication coming from this neighborhood were assessed over 100 years. This research, extended to the scale of several nations, will enable new researchers, and especially policy-makers, to measure the effectiveness of sustainable neighborhoods. Eutrophication and acidification potentials were assessed under different phases (construction, use, renovation and demolition) with Pleiades software. The effects of the energy mix were the most significant among the parameters considered. The results show that 72% and 65% of the acidification and eutrophication potentials, respectively, are produced during the operational phase of the neighborhood. In the case of sustainable neighborhoods, the acidification potential is 22.1% higher in the 10 top low-income countries than in the 10 top high-income countries. At the neighborhood scale, the main eutrophication potential component is water (34.2%), while the main source of acidification potential is electricity production (45.1%).
      Citation: J
      PubDate: 2022-03-01
      DOI: 10.3390/j5010012
      Issue No: Vol. 5, No. 1 (2022)
  • J, Vol. 5, Pages 166-185: A Hypothesis on How the Azolla Symbiosis
           Mitigates Nitrous Oxide Based on In Silico Analyses

    • Authors: Dilantha Gunawardana, Venura Herath
      First page: 166
      Abstract: Nitrous oxide is a long-lived greenhouse gas that exists for 114 years in the atmosphere and is 298-fold more potent than carbon dioxide in its global warming potential. Two recent studies showcased the utility of Azolla plants for a lesser footprint in nitrous oxide production from urea and other supplements to the irrigated ecosystem, which mandates exploration since there is still no clear solution to nitrous oxide in paddy fields or in other ecosystems. Here, we propose a solution based on the evolution of a single cytochrome oxidase subunit II protein (WP_013192178.1) from the cyanobiont Trichormus azollae that we hypothesize to be able to quench nitrous oxide. First, we draw attention to a domain in the candidate protein that is emerging as a sensory periplasmic Y_Y_Y domain that is inferred to bind nitrous oxide. Secondly, we draw the phylogeny of the candidate protein showcasing the poor bootstrap support of its position in the wider clade showcasing its deviation from the core function. Thirdly, we show that the NtcA protein, the apical N-effecting transcription factor, can putatively bind to a promoter sequence of the gene coding for the candidate protein (WP_013192178.1), suggesting a function associated with heterocysts and N-metabolism. Our fourth point involves a string of histidines at the C-terminal extremity of the WP_013192178.1 protein that is missing on all other T. azollae cytochrome oxidase subunit II counterparts, suggesting that such histidines are perhaps involved in forming a Cu center. As the fifth point, we showcase a unique glycine-183 in a lengthy linker region containing multiple glycines that is absent in all proximal Nostocales cyanobacteria, which we predict to be a DNA binding residue. We propose a mechanism of action for the WP_013192178.1 protein based on our in silico analyses. 
In total, we hypothesize the incomplete and rapid conversion of a likely heterocystous cytochrome oxidase subunit II protein to an emerging nitrous oxide sensing/quenching subunit based on bioinformatics analyses and past literature, which can have repercussions for climate change and, consequently, future human life.
      Citation: J
      PubDate: 2022-03-04
      DOI: 10.3390/j5010013
      Issue No: Vol. 5, No. 1 (2022)
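The C-terminal histidine string raised in the fourth point of the abstract above can be screened for with simple sequence scanning; a toy sketch (the sequence, window size, and function name are illustrative, not from the paper):

```python
def c_terminal_his_run(seq, window=10):
    """Count histidines among the last `window` residues of a protein
    sequence, a crude screen for a C-terminal His string."""
    tail = seq[-window:].upper()
    return tail.count('H')

# Hypothetical sequence with six histidines at the C-terminus
print(c_terminal_his_run("MKTAYIAKQRHHHHHH"))
```

In practice such His-rich tails are flagged because clustered histidines can coordinate metal ions such as Cu, as the abstract hypothesizes.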
  • J, Vol. 5, Pages 186-197: Applications of Time-Resolved Thermodynamics for
           Studies on Protein Reactions

    • Authors: Masahide Terazima
      First page: 186
      Abstract: Thermodynamics and kinetics are two important scientific fields in the study of chemical reactions. Thermodynamics characterizes the nature of the material. Kinetics, mostly based on spectroscopy, has been used to determine reaction schemes and identify intermediate species. They are certainly important fields, but they are almost independent. In this review, our attempts to elucidate protein reaction kinetics and mechanisms by monitoring thermodynamic properties, including diffusion, in the time domain are described. The time-resolved measurements are performed mostly using the time-resolved transient grating (TG) method. The results demonstrate the usefulness and power of time-resolved studies of protein reactions. The advantages and limitations of the TG method are also discussed.
      Citation: J
      PubDate: 2022-03-08
      DOI: 10.3390/j5010014
      Issue No: Vol. 5, No. 1 (2022)
School of Mathematical and Computer Sciences
Heriot-Watt University
Edinburgh, EH14 4AS, UK
Email: journaltocs@hw.ac.uk
Tel: +00 44 (0)131 4513762


JournalTOCs © 2009-