Freshwater crayfish are amongst the largest macroinvertebrates and play a keystone role in the ecosystems they occupy. Understanding the global distribution of these animals is often hindered by a paucity of distributional data. Additionally, non-native crayfish introductions are becoming more frequent and can cause severe environmental and economic impacts. Management decisions related to crayfish and their habitats require accurate, up-to-date distribution data and mapping tools. Such data are currently patchily distributed, of limited accessibility, and rarely up-to-date. To address these challenges, we developed a versatile e-portal to host distributional data of freshwater crayfish and their pathogens (using Aphanomyces astaci, the causative agent of the crayfish plague, as the most prominent example). Populated with expert data and operating in near real time, World of Crayfish™ is a living, publicly available database providing worldwide distributional data sourced by experts in the field. The database offers open access to the data through standard geospatial services (Web Map Service, Web Feature Service), enabling users to view, embed, and download customizable outputs for various applications. The platform is designed to accommodate future technical enhancements and additional features. This tool is a step towards a modern era of conservation planning and management of freshwater biodiversity.
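The portal above exposes its data through standard OGC geospatial services (Web Map Service, Web Feature Service). As a minimal sketch of how such services are typically consumed, and assuming a hypothetical endpoint and layer name (the portal's actual service addresses are not given here), the following Python snippet builds a standard WMS 1.3.0 GetMap request and saves the rendered map:

```python
import requests

# Hypothetical endpoint and layer name -- placeholders, since the portal's
# real WMS address is not stated in the abstract.
WMS_URL = "https://example.org/geoserver/crayfish/wms"

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "crayfish:occurrences",   # placeholder layer name
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "35.0,-10.0,70.0,40.0",     # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "width": "800",
    "height": "600",
    "format": "image/png",
}

response = requests.get(WMS_URL, params=params, timeout=30)
response.raise_for_status()

# Save the rendered map image returned by the service.
with open("crayfish_map.png", "wb") as fh:
    fh.write(response.content)
```

The same query parameters can be pasted into a browser URL or loaded in desktop GIS clients such as QGIS, which is the usual way WMS layers are embedded in other applications.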
- Keywords
- Aphanomyces astaci, Astacidae, Cambaridae, Endangered species, Invasive species, Open data, Parastacidae, Species distribution,
- MeSH
- Aphanomyces MeSH
- Databases, Factual MeSH
- Ecosystem MeSH
- Internet MeSH
- Astacoidea * microbiology MeSH
- Fresh Water * MeSH
- Animals MeSH
- Check Tag
- Animals MeSH
- Publication type
- Journal Article MeSH
The article discusses two main approaches to building semantic structures for electrophysiological metadata: the use of conventional data structures, repositories, and programming languages on the one hand, and the use of formal ontology representations known from knowledge representation, such as description logics or Semantic Web languages, on the other. Although knowledge engineering offers languages with richer semantic expressiveness and more technologically advanced approaches, conventional data structures and repositories remain popular among developers, administrators, and users because of their simplicity, overall intelligibility, and lower demands on technical equipment. The choice of conventional data resources and repositories, however, raises the question of how and where to add semantics that cannot be naturally expressed with them. As one possible solution, this semantics can be added to the structures of the programming language that accesses and processes the underlying data. To support this idea, we introduced a software prototype that enables its users to add semantically richer expressions to Java object-oriented code. This approach does not burden users with additional demands on the programming environment, since reflective Java annotations serve as the entry point for these expressions. Moreover, the additional semantics need not be written by the programmer directly in the code; it can also be collected from non-programmers through a graphical user interface. A mapping that transforms the semantically enriched Java code into the Semantic Web language OWL was proposed and implemented in a library named the Semantic Framework. The approach was validated by integrating the Semantic Framework into the EEG/ERP Portal and by the subsequent registration of the EEG/ERP Portal in the Neuroscience Information Framework.
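The prototype described above carries the extra semantics in reflective Java annotations and maps them to OWL. Purely as a language-agnostic illustration of that idea (not the Semantic Framework's actual implementation), the hedged Python sketch below attaches a made-up ontology class IRI to a domain class via a decorator and emits a minimal OWL/Turtle fragment from that metadata:

```python
# Illustrative sketch only: the paper's Semantic Framework uses Java
# annotations mapped to OWL; this mimics the idea with Python decorators.
# All IRIs below are made-up placeholders.

def owl_class(iri):
    """Attach an ontology class IRI to a domain class."""
    def decorate(cls):
        cls.__owl_class__ = iri
        return cls
    return decorate

@owl_class("http://example.org/eegerp#Experiment")
class Experiment:
    def __init__(self, title):
        self.title = title

def to_turtle(cls):
    """Emit a minimal OWL/Turtle class declaration from the attached metadata."""
    iri = getattr(cls, "__owl_class__", None)
    if iri is None:
        raise ValueError(f"{cls.__name__} carries no ontology metadata")
    return f"<{iri}> a <http://www.w3.org/2002/07/owl#Class> ."

print(to_turtle(Experiment))
# <http://example.org/eegerp#Experiment> a <http://www.w3.org/2002/07/owl#Class> .
```

The point of the pattern is that domain programmers (or a GUI on their behalf) only attach declarative metadata, while a separate mapping layer reads it reflectively and produces the ontology representation.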
- Keywords
- EEG/ERP portal, electrophysiology, object-oriented code, ontology, semantic framework, semantic web,
- Publication type
- Journal Article MeSH
Large-scale biodiversity data, for example, on species distribution and richness information, are being mobilized and becoming available at an increasing rate. Interactive web applications like atlases have been developed to visualize available datasets and make them accessible to a wider audience. Web mapping tools are changing rapidly, and different underlying concepts have been developed to visualize datasets at a high cartographic standard. Here, we introduce the Combined Atlas Framework for the development of interactive web atlases for ecological data visualization. We combine two existing approaches: the five stages of the user-centred design approach for web mapping applications and the three U approach for interface success. Subsequently, we illustrate the use of this framework by developing the Atlas of Plant Invasions based on the Global Naturalized Alien Flora (GloNAF) database. This case study illustrates how the newly developed Combined Atlas Framework with a user-centred design philosophy can generate measurable success through communication with the target user group, iterative prototyping and competitive analysis of other existing web mapping approaches. The framework is useful in creating an atlas that employs user feedback to determine usability and utility features within an interactive atlas system. Finally, this framework will enable a better-informed development process of future visualization and dissemination of biodiversity data through web mapping applications and interactive atlases.
- Keywords
- D3, GloNAF, JavaScript, atlas, cartography, framework development, invasive alien species, web mapping, workflow,
- Publication type
- Journal Article MeSH
HotSpot Wizard is a web server for automatic identification of 'hot spots' for engineering of substrate specificity, activity or enantioselectivity of enzymes and for annotation of protein structures. The web server implements the protein engineering protocol, which targets evolutionarily variable amino acid positions located in the active site or lining the access tunnels. The 'hot spots' for mutagenesis are selected through the integration of structural, functional and evolutionary information obtained from: (i) the databases RCSB PDB, UniProt, PDBSWS, Catalytic Site Atlas and nr NCBI and (ii) the tools CASTp, CAVER, BLAST, CD-HIT, MUSCLE and Rate4Site. The protein structure and e-mail address are the only obligatory inputs for the calculation. In the output, HotSpot Wizard lists annotated residues ordered by estimated mutability. The results of the analysis are mapped on the enzyme structure and visualized in the web browser using Jmol. The HotSpot Wizard server should be useful for protein engineers interested in exploring the structure of their favourite protein and for the design of mutations in site-directed mutagenesis and focused directed evolution experiments. HotSpot Wizard is available at http://loschmidt.chemi.muni.cz/hotspotwizard/.
- MeSH
- beta-Lactamases chemistry MeSH
- Glycoside Hydrolases chemistry MeSH
- Phosphoric Triester Hydrolases chemistry MeSH
- Hydrolases chemistry MeSH
- Internet MeSH
- Protein Engineering * MeSH
- Reproducibility of Results MeSH
- Software * MeSH
- User-Computer Interface MeSH
- Publication type
- Journal Article MeSH
- Research Support, Non-U.S. Gov't MeSH
- Names of Substances
- beta-Lactamases MeSH
- Glycoside Hydrolases MeSH
- haloalkane dehalogenase MeSH Browser
- Phosphoric Triester Hydrolases MeSH
- Hydrolases MeSH
- licheninase MeSH Browser
BACKGROUND: Next generation sequencing (NGS) technology allows laboratories to investigate virome composition in clinical and environmental samples in a culture-independent way. There is a need for bioinformatic tools capable of parallel processing of virome sequencing data by exactly identical methods: this is especially important in studies of multifactorial diseases, or in parallel comparison of laboratory protocols. RESULTS: We have developed a web-based application allowing direct upload of sequences from multiple virome samples using custom parameters. The samples are then processed in parallel using an identical protocol, and can be easily reanalyzed. The pipeline performs de novo assembly, taxonomic classification of viruses as well as sample analyses based on user-defined grouping categories. Tables of virus abundance are produced from cross-validation by remapping the sequencing reads to a union of all observed reference viruses. In addition, read sets and reports are created after processing unmapped reads against known human genome and bacterial ribosomal gene references. Secured interactive results are dynamically plotted with population and diversity charts, clustered heatmaps and a sortable and searchable abundance table. CONCLUSIONS: The Vipie web application is a unique tool for multi-sample metagenomic analysis of viral data, producing searchable hits tables, interactive population maps, alpha diversity measures and clustered heatmaps that are grouped in applicable custom sample categories. Known references such as the human genome and bacterial ribosomal genes are optionally removed from unmapped ('dark matter') reads. Secured results are accessible and shareable on modern browsers. Vipie is a freely available web-based tool whose code is open source.
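The abstract above notes that virus abundance tables are produced by remapping each sample's reads against the union of all observed reference viruses. As a rough, tool-agnostic sketch of that bookkeeping step (not Vipie's actual code, and with made-up counts), the snippet below assembles per-sample mapped-read counts into a samples-by-viruses abundance matrix with pandas and normalizes it per sample:

```python
import pandas as pd

# Hypothetical per-sample mapping results: (sample, reference virus, mapped reads).
# In the real pipeline these counts would come from remapping reads against the
# union of all reference viruses observed across samples.
records = [
    {"sample": "S1", "virus": "Norovirus GII", "reads": 1250},
    {"sample": "S1", "virus": "Adenovirus F",  "reads": 310},
    {"sample": "S2", "virus": "Norovirus GII", "reads": 90},
    {"sample": "S2", "virus": "Astrovirus 1",  "reads": 770},
]

counts = pd.DataFrame(records)

# Samples-by-viruses abundance table; viruses absent from a sample get 0.
abundance = counts.pivot_table(
    index="sample", columns="virus", values="reads", fill_value=0
)

# Relative abundance per sample (each row sums to 1), a common input for
# diversity measures and clustered heatmaps.
relative = abundance.div(abundance.sum(axis=1), axis=0)
print(relative.round(3))
```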
- Keywords
- Assembly, Metagenomics, NGS analysis, Parallel processing, Viral dark matter, Viromes, Virus, Visualization,
- MeSH
- Genetic Variation MeSH
- Genomics methods MeSH
- Internet * MeSH
- Humans MeSH
- Microbiota genetics MeSH
- Software * MeSH
- Viruses genetics MeSH
- High-Throughput Nucleotide Sequencing * MeSH
- Check Tag
- Humans MeSH
- Publication type
- Journal Article MeSH
- Research Support, Non-U.S. Gov't MeSH
BACKGROUND: The beginning of the coronavirus disease (COVID-19) epidemic dates back to December 31, 2019, when the first cases were reported in the People's Republic of China. In the Czech Republic, the first three cases of infection with the novel coronavirus were confirmed on March 1, 2020. The joint effort of state authorities and researchers gave rise to a unique team, which combines methodical knowledge of real-world processes with the know-how needed for effective processing, analysis, and online visualization of data. OBJECTIVE: Due to an urgent need for a tool that presents important reports based on valid data sources, a team of government experts and researchers focused on the design and development of a web app intended to provide a regularly updated overview of COVID-19 epidemiology in the Czech Republic to the general population. METHODS: The cross-industry standard process for data mining (CRISP-DM) model was chosen for the complex solution of analytical processing and visualization of data that provides validated information on the COVID-19 epidemic across the Czech Republic. Great emphasis was put on the understanding and correct implementation of all six steps (business understanding, data understanding, data preparation, modelling, evaluation, and deployment) needed in the process, including the infrastructure of a nationwide information system; the methodological setting of communication channels between all involved stakeholders; and data collection, processing, analysis, validation, and visualization. RESULTS: The web-based overview of the current spread of COVID-19 in the Czech Republic has been developed as an online platform providing a set of outputs in the form of tables, graphs, and maps intended for the general public. On March 12, 2020, the first version of the web portal, containing fourteen overviews divided into five topical sections, was released. The web portal's primary objective is to publish a well-arranged visualization and clear explanation of basic information, including the overall numbers of performed tests, confirmed COVID-19 cases, and COVID-19-related deaths; daily and cumulative overviews of confirmed cases and performed tests; the location and country of infection of confirmed cases; hospitalizations of patients with COVID-19; and the distribution of personal protective equipment. CONCLUSIONS: The online interactive overview of the current spread of COVID-19 in the Czech Republic was launched on March 11, 2020, and immediately became the primary communication channel employed by the health care sector to present the current situation regarding the COVID-19 epidemic. This complex reporting of the COVID-19 epidemic in the Czech Republic also shows an effective way to interconnect knowledge held by various specialists, such as regional and national methodology experts (who report positive cases of the disease on a daily basis), with knowledge held by developers of central registries, analysts, developers of web apps, and leaders in the health care sector.
- Keywords
- COVID-19, CRISP-DM, Czech Republic, app, coronavirus disease, data mining, epidemiological overview, epidemiology, health data, interactive reporting, modeling, public health, virus, web app,
- MeSH
- Betacoronavirus * MeSH
- COVID-19 MeSH
- Data Mining MeSH
- Internet MeSH
- Coronavirus Infections epidemiology MeSH
- Humans MeSH
- Pandemics MeSH
- SARS-CoV-2 MeSH
- Software MeSH
- Pneumonia, Viral epidemiology MeSH
- Check Tag
- Humans MeSH
- Publication type
- Journal Article MeSH
- Research Support, Non-U.S. Gov't MeSH
- Geographicals
- Czech Republic epidemiology MeSH
This paper reviews the current status of knowledge on body surface potential mapping (BSPM) and ECG imaging (ECGI) methods for patient selection, left ventricular (LV) lead positioning, and optimisation of cardiac resynchronisation therapy (CRT) programming, and indicates the major trends and future perspectives for the application of these methods in CRT patients. A systematic literature review using PubMed, Scopus, and Web of Science was conducted to evaluate the available clinical evidence regarding the use of BSPM and ECGI methods in CRT patients. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was used as the basis for this review. BSPM and ECGI methods applied in CRT patients were assessed, and quantitative parameters of ventricular depolarisation derived from BSPM and ECGI were extracted and summarised. BSPM and ECGI methods can be used in CRT in several ways, namely in predicting CRT outcome, in individualised optimisation of CRT device programming, and in guiding LV electrode placement; however, further prospective or randomised trials are necessary to verify the utility of BSPM for routine clinical practice.
- Keywords
- Body surface potential mapping, CRT, ECG imaging, heart failure,
- Publication type
- Journal Article MeSH
- Review MeSH
- MeSH
- Chromosomes genetics MeSH
- Physical Chromosome Mapping MeSH
- Internet MeSH
- Chromosome Mapping MeSH
- Mice MeSH
- Chromosomes, Artificial, Yeast MeSH
- Animals MeSH
- Check Tag
- Mice MeSH
- Animals MeSH
- Publication type
- Journal Article MeSH
- Research Support, Non-U.S. Gov't MeSH
Transperineal template prostate biopsies (TTPB) are performed for assessment after unexpectedly negative transrectal ultrasound biopsies (TRUSB), for correlation with imaging findings, and during active surveillance. The impact of TTPBs on pathology has not been analysed. The European Network of Uropathology (ENUP) distributed a survey on TTPB, including how specimens were received, processed and analysed. Two hundred and forty-four replies were received from 22 countries, with TTPBs seen by 68.4% of the responders (n = 167). Biopsies were received in more than 12 pots in 35.2%. The number of cores embedded per cassette varied between 1 (39.5%) and 3 or more (39.5%). Three levels were cut in 48.3%, between 2 and 3 serial sections were taken in 57.2%, and unstained spare sections were kept in 45.1%. No statistical difference was observed compared with TRUSB management. The number of positive cores was always reported, and the majority gave extent per core (82.3%), per region (67.1%) and greatest involvement per core (69.4%). Total involvement in the whole series and continuous/discontinuous infiltrates were reported in 42.2% and 45.4%, respectively. The majority (79.4%) reported a Gleason score in each site or core, and 59.6% gave an overall score. A minority (28.5%) provided a map or a diagram. For 19%, TTPB had adversely affected laboratory workload, with only 27% managing to negotiate extra costs. Most laboratories process samples thoroughly and report TTPB similarly to TRUSB. Although TTPBs have caused considerable extra work, this work remains uncosted in most centres. Guidance on workload impact and minimum processing standards is needed if TTPB work continues to increase.
- Keywords
- Survey, Transperineal template biopsy, Transrectal ultrasound biopsies, Workload impact,
- MeSH
- Biopsy methods MeSH
- Internet MeSH
- Humans MeSH
- Prostatic Neoplasms diagnosis MeSH
- Specimen Handling methods MeSH
- Surveys and Questionnaires MeSH
- Research Design MeSH
- Check Tag
- Humans MeSH
- Male MeSH
- Publication type
- Journal Article MeSH
- Geographicals
- Europe MeSH
The rising and continuous pollution of soil from anthropogenic activities is of great concern. In response, digital soil mapping (DSM) has become a tool that soil scientists use to predict the potentially toxic element (PTE) content of soils. The purpose of this paper was to review the literature on the spatial prediction of potentially toxic elements, summarize and analyse the approaches used, and determine and compare model usage and performance over time. Through Scopus, Web of Science, and Google Scholar, we collected papers published between 2001 and the first quarter of 2019 that addressed spatial PTE prediction using DSM approaches. The results indicated that soil pollution emanates from diverse sources. The review also summarizes the authors' reasons for investigating particular sites or areas, and highlights uncertainties in mapping, the number of publications per journal, and continental research and publication efforts on trending issues regarding DSM. This paper reveals the complementary roles that machine learning algorithms and geostatistical models play in DSM. Nevertheless, geostatistical approaches remain more widely preferred than machine learning algorithms.
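The review above contrasts geostatistical models with machine learning algorithms for spatial PTE prediction. As a minimal, generic illustration of the machine learning side only (synthetic data, not any reviewed study's workflow), the following scikit-learn sketch fits a random forest to point samples using coordinates as predictors and predicts a concentration at an unsampled location:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in for soil point samples: easting, northing and a made-up
# "measured" PTE concentration (mg/kg). Real DSM studies would typically also
# include covariates such as terrain attributes, land use or spectral indices.
rng = np.random.default_rng(42)
coords = rng.uniform(0, 1000, size=(500, 2))
concentration = 0.05 * coords[:, 0] + 0.02 * coords[:, 1] + rng.normal(0, 5, 500)

X_train, X_test, y_train, y_test = train_test_split(
    coords, concentration, test_size=0.25, random_state=0
)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)

print(f"R² on held-out samples: {r2_score(y_test, model.predict(X_test)):.2f}")

# Predict at an unsampled location (easting = 500 m, northing = 250 m).
print(model.predict(np.array([[500.0, 250.0]])))
```

A geostatistical alternative such as ordinary kriging would instead model the spatial autocorrelation of the residuals explicitly, which is one reason the two families of methods are often used in combination.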
- Keywords
- Algorithms, Digital soil mapping, Geostatistics, Machine learning, Potentially toxic elements, Soil pollution, Spatial prediction,
- MeSH
- Algorithms MeSH
- Bibliometrics MeSH
- Geologic Sediments analysis MeSH
- Soil Pollutants analysis MeSH
- Environmental Monitoring methods MeSH
- Soil * MeSH
- Machine Learning MeSH
- Environmental Pollution analysis MeSH
- Publication type
- Journal Article MeSH
- Review MeSH
- Names of Substances
- Soil Pollutants MeSH
- Soil * MeSH