BACKGROUND: High-throughput bioinformatics analyses of next-generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing the appropriate tools and selecting the best parameters for optimal precision and recall. RESULTS: Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process, with the possibility of plugging in almost any tool or custom code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are presented as interactive graphs and tables, allowing an optimal pipeline to be selected based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. CONCLUSIONS: ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
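The cross-validated benchmarking idea can be illustrated with a minimal sketch (this is not ToTem's actual code; the variant positions and the fold-splitting scheme are illustrative assumptions): a pipeline setting is scored by precision, recall and F-measure averaged over folds, so a setting that only works on one subset of the truth set is penalized.

```python
# Minimal sketch of cross-validated variant-calling benchmarking
# (illustrative only, not ToTem's implementation).

def precision_recall_f1(called, truth):
    """Compare a set of called variant positions against a truth set."""
    tp = len(called & truth)
    fp = len(called - truth)
    fn = len(truth - called)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

def cross_validated_scores(called, truth, k=5):
    """Split the evaluated positions into k folds and average the scores,
    penalizing settings that over-fit a single subset of variants."""
    positions = sorted(called | truth)
    folds = [set(positions[i::k]) for i in range(k)]
    scores = [precision_recall_f1(called & fold, truth & fold)
              for fold in folds]
    n = len(scores)
    return tuple(sum(s[i] for s in scores) / n for i in range(3))

# Toy example with hypothetical variant positions
called = {101, 205, 333, 404, 512, 678}
truth = {101, 205, 333, 404, 777}
p, r, f = cross_validated_scores(called, truth, k=2)
```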
Production of amylases by fungi under solid-state fermentation is considered the best methodology for commercial scaling to meet the ever-escalating needs of the worldwide enzyme market. Here, response surface methodology (RSM) was used to optimize the process variables for α-amylase production from Trichoderma virens grown on watermelon rinds (WMR) under solid-state fermentation (SSF). The statistical model included four variables, each tested at two levels, followed by model development and the partial purification and characterization of α-amylase. The partially purified α-amylase was characterized with regard to optimum pH, temperature, kinetic constants, and substrate specificity. The results indicated that both pH and moisture content had a significant effect (P < 0.05) on α-amylase production (880 U/g) under the optimized process conditions: a 3-day incubation time, 50% moisture content, 30 °C, and pH 6.98. Statistical optimization using RSM gave an R2 value of 0.9934, demonstrating the validity of the model. Five α-amylases were separated on DEAE-Sepharose and characterized by a wide range of pH optima (pH 4.5-9.0), temperature optima (40-60 °C), low Km values (2.27-3.3 mg/mL), and high substrate specificity toward large substrates. In conclusion, this study presents an efficient and green approach to the utilization of agro-waste for production of the valuable α-amylase enzyme using RSM under SSF. RSM was particularly beneficial for the optimization and analysis of the effective process parameters.
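The core of an RSM analysis of this kind can be sketched as follows (a hedged illustration on synthetic data, not the study's actual design matrix; the factor names and numbers are assumptions): a second-order polynomial surface is fitted to coded factor levels by least squares, and the R² statistic, the same quantity the study reports (0.9934), measures how well the model explains the response.

```python
import numpy as np

# Hedged RSM sketch: fit the second-order model
#   y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to coded factor levels and compute R^2 (illustrative data only).

def fit_rsm(x1, x2, y):
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ coef
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return coef, 1.0 - ss_res / ss_tot

# Synthetic two-factor data with a response maximum near the centre point
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)   # coded pH (assumed factor)
x2 = rng.uniform(-1, 1, 30)   # coded moisture content (assumed factor)
y = 880 - 100 * x1**2 - 80 * x2**2 + rng.normal(0, 5, 30)
coef, r2 = fit_rsm(x1, x2, y)
```

With a stationary point inside the design region, the fitted quadratic surface can then be differentiated to locate the predicted optimum, which is the usual final step of an RSM study.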
We have developed a new microextraction technique for equilibrium, non-exhaustive analyte preconcentration from aqueous solutions into organic solvents lighter than water. The key point of the method is the application of a specially designed and optimized bell-shaped extraction device (BSED). The technique was tested and applied to the preconcentration of selected volatile and semivolatile compounds, which were determined by gas chromatography/mass spectrometry in spiked water samples. The significant parameters of the extraction were identified using chemometric procedures and optimized using a central composite design (CCD) for two solvents. The analyte preconcentration factors ranged from 8.3 to 161.8 (repeatability 7-14%) for heptane and from 50.0 to 105.0 (repeatability 0-5%) for tert-butyl acetate. The reproducibility of the technique was within 1-8%. The limits of detection and determination were 0.1-3.3 ng mL(-1) for heptane and 0.3-10.7 ng mL(-1) for tert-butyl acetate. The new microextraction technique proved to be a cheap, simple and flexible alternative to common procedures such as SPME or LLME. The BSED-LLME technique can also be combined with other separation methods, e.g., HPLC or CE.
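The two figures of merit reported above reduce to simple arithmetic (the concentration values below are illustrative assumptions, not the study's measurements): the preconcentration factor is the ratio of the analyte concentration in the acceptor solvent to that in the aqueous sample, and repeatability is the relative standard deviation of replicate determinations.

```python
# Arithmetic behind the reported figures of merit (illustrative values).

def preconcentration_factor(c_solvent, c_sample):
    """Enrichment factor: analyte concentration in the organic acceptor
    phase divided by its concentration in the aqueous sample."""
    return c_solvent / c_sample

def repeatability_rsd(replicates):
    """Relative standard deviation (%) of replicate determinations."""
    n = len(replicates)
    mean = sum(replicates) / n
    var = sum((x - mean) ** 2 for x in replicates) / (n - 1)
    return 100.0 * var ** 0.5 / mean

# e.g. 10 ng/mL in water enriched to 1618 ng/mL in heptane
pf = preconcentration_factor(1618.0, 10.0)      # matches the upper reported value, 161.8
rsd = repeatability_rsd([158.0, 162.0, 165.6])  # RSD of three hypothetical replicates
```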
- MeSH
- Water Pollutants, Chemical / analysis / isolation & purification MeSH
- Equipment Design MeSH
- Liquid Phase Microextraction / instrumentation / methods MeSH
- Mineral Waters / analysis MeSH
- Drinking Water / analysis MeSH
- Gas Chromatography-Mass Spectrometry / methods MeSH
- Reproducibility of Results MeSH
- Water / analysis MeSH
- Publication type
- Journal Article MeSH
- Evaluation Study MeSH
- Research Support, Non-U.S. Gov't MeSH
Cardiac resynchronization therapy (CRT) improves the quality of life and/or haemodynamic parameters in only two thirds of heart failure patients implanted with a biventricular pacemaker. In the remaining patients (non-responders), further fine-tuning of the pacing parameters is performed. This optimization of the atrioventricular delay (AVD) and ventriculoventricular delay (VVD) may improve cardiac performance in some of them. Echocardiography is widely used to assess the effect of AVD and VVD programming: the diastolic filling pattern, the length of diastole, stroke volume/cardiac output, ejection fraction, LV dP/dT, and LV contraction synchrony by tissue Doppler or speckle tracking are the most frequently used optimization criteria. While all these variables have been shown to reflect an immediate effect of AVD/VVD optimization in selected groups of CRT patients, neither randomized studies nor a meta-analysis have demonstrated any long-term benefit in the CRT population. This article describes the theoretical concept of optimization, certain methodological problems and unresolved issues in CRT optimization, and the published evidence. Optimization options are summarized in the current guidelines, but an individual approach is recommended in non-responders.
- MeSH
- Diastole / physiology MeSH
- Echocardiography / methods MeSH
- Hemodynamics MeSH
- Cardiac Pacing, Artificial / methods MeSH
- Clinical Trials as Topic MeSH
- Humans MeSH
- Evidence-Based Medicine MeSH
- Arrhythmias, Cardiac / physiopathology MeSH
- Heart Ventricles / physiopathology MeSH
- Cardiac Resynchronization Therapy * / methods MeSH
- Heart Failure / therapy MeSH
- Check Tag
- Humans MeSH
Background: This paper is based on one of the known algorithms of collective testing, the method of group testing of pooled samples, which we experimentally verify during mandatory population-wide screening using POC antigen tests in connection with the spread of COVID-19. Aim: To verify the method and make recommendations for group testing for the qualitative detection of the SARS-CoV-2 antigen by POC antigen tests in the context of population-wide screening. Methods: An experimental pilot study using POC test kits for the qualitative detection of SARS-CoV-2 viral nucleoprotein antigens from direct nasal, nasopharyngeal, or oropharyngeal swabs in a selected population, following both the recommended testing methodology and the collective (group) testing methodology. Results and discussion: We compared the optimized size of the collectively tested group with the sensitivity of the POC test kits. This sensitivity is not sufficient for the optimized group sizes (around 10 samples) at an estimated prevalence between 1% and 2%, but even the group sizes reduced to match the test kits' sensitivity (up to 5 samples) offer non-negligible savings. Conclusions: We propose a protocol for collective testing of five-member groups, which, assuming a 1% to 2% prevalence, provides a material saving factor of 0.25-0.3.
In silico methods such as molecular docking and pharmacophore modeling are established strategies in lead identification, and their successful application in finding new active molecules for a target is reported in a plethora of studies. However, once a potential lead is identified, lead optimization, which focuses on improving the potency, selectivity, or pharmacokinetic parameters of a parent compound, is a much more complex task. Even though in silico molecular modeling methods could save considerable time and cost by rationally filtering synthetic optimization options, they are employed less widely at this stage of research. In this review, we highlight studies that have successfully used computer-aided SAR analysis in lead optimization, showcasing sound methodology and easily accessible in silico tools for this purpose.
- Publication type
- Journal Article MeSH
- Review MeSH
Four methods for global numerical black box optimization with origins in the mathematical programming community are described and experimentally compared with the state-of-the-art evolutionary method BIPOP-CMA-ES. The methods chosen for the comparison exhibit various features that are potentially interesting for the evolutionary computation community: systematic sampling of the search space (DIRECT, MCS), possibly combined with a local search method (MCS), or a multi-start approach (NEWUOA, GLOBAL), possibly equipped with a careful selection of the points from which to run a local optimizer (GLOBAL). The recently proposed "comparing continuous optimizers" (COCO) methodology was adopted as the basis for the comparison. Based on the results, we draw suggestions about which algorithm should be used depending on the available budget of function evaluations, and we propose several possibilities for hybridizing evolutionary algorithms (EAs) with features of the other compared algorithms.
The improving performance of laser-induced breakdown spectroscopy (LIBS) has triggered its utilization in the challenging field of soft tissue analysis. Alterations of elemental content within soft tissues are commonly assessed and provide further insights in biological research. However, the laser ablation of soft tissues is a complex issue and demands a priori optimization, which is not straightforward compared with a typical LIBS experiment. Here, we focus on implementing an internal standard into the LIBS elemental analysis of soft tissue samples. We achieve this by extending the routine methodology for optimizing soft tissue analysis with a standard spiking method. This step enables a robust optimization procedure for the LIBS experimental settings. Considering the implementation of LIBS analysis in the histological routine, we avoid further alterations of the tissue structure. We therefore propose a unique methodology of sample preparation, analysis, and subsequent data treatment that enables the comparison of signal response from a heterogeneous matrix under different LIBS parameters. Additionally, a brief step-by-step process of optimization to achieve the highest signal-to-noise ratio (SNR) is described. The quality of the laser-tissue interaction is investigated on the basis of the zinc signal response, while selected experimental parameters (e.g., defocus, gate delay, laser energy, and ambient atmosphere) are systematically varied.
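The SNR figure of merit being maximized can be sketched as follows (a minimal illustration on a synthetic spectrum, not the study's data or definition; the abstract does not specify how the authors compute noise, so the background standard deviation used here is an assumption): the background-subtracted height of the analyte line is divided by the standard deviation of a line-free background region.

```python
# Minimal SNR sketch for an emission line (synthetic spectrum,
# illustrative definition: peak minus mean background, over background
# standard deviation).

def snr(spectrum, peak_idx, bg_slice):
    bg = spectrum[bg_slice]
    n = len(bg)
    mean_bg = sum(bg) / n
    var_bg = sum((x - mean_bg) ** 2 for x in bg) / (n - 1)
    return (spectrum[peak_idx] - mean_bg) / var_bg ** 0.5

# Synthetic Zn-like line on a flat, slightly noisy background
spectrum = [10.0, 10.5, 9.5, 10.2, 9.8, 10.0, 60.0, 10.1, 9.9, 10.0]
value = snr(spectrum, peak_idx=6, bg_slice=slice(0, 6))
```

In an optimization loop, this quantity would be evaluated for each combination of defocus, gate delay, laser energy, and ambient atmosphere, and the setting with the highest SNR retained.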
- MeSH
- Cells MeSH
- Laser Therapy * MeSH
- Lasers * MeSH
- Reference Standards MeSH
- Spectrum Analysis MeSH
- Light MeSH
- Publication type
- Journal Article MeSH
Six population-based methods for real-valued black box optimization are thoroughly compared in this article. One of them, Nelder-Mead simplex search, is rather old, but still a popular technique of direct search. The remaining five (POEMS, G3PCX, Cauchy EDA, BIPOP-CMA-ES, and CMA-ES) are more recent and came from the evolutionary computation community. The recently proposed comparing continuous optimizers (COCO) methodology was adopted as the basis for the comparison. The results show that BIPOP-CMA-ES reaches the highest success rates and is often also quite fast. The results of the remaining algorithms are mixed, but Cauchy EDA and POEMS are usually slow.
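The central statistic of the COCO methodology used in such comparisons is the expected running time (ERT) to reach a target function value; a minimal sketch (the run data below are hypothetical) divides the total number of function evaluations spent across all runs, successful or not, by the number of successful runs.

```python
# COCO-style expected running time (ERT): total evaluations across all
# runs divided by the number of runs that reached the target.

def expected_running_time(evaluations, successes):
    """evaluations: function evaluations used per run;
    successes: whether each run reached the target value."""
    n_success = sum(successes)
    if n_success == 0:
        return float("inf")  # target never reached within the budget
    return sum(evaluations) / n_success

# 4 hypothetical runs: two reached the target, two exhausted their budget
evals = [1200, 5000, 800, 5000]
hit = [True, False, True, False]
ert = expected_running_time(evals, hit)  # (1200+5000+800+5000)/2 = 6000
```

Lower ERT at a given target and dimension is what "quite fast" means in this benchmarking framework, and success rate is simply the fraction of runs with `hit` true.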
The acid dissociation constant is an important molecular property, and it can be successfully predicted by Quantitative Structure-Property Relationship (QSPR) models, even for in silico designed molecules. We analyzed how the methodology of in silico 3D structure preparation influences the quality of QSPR models. Specifically, we evaluated and compared QSPR models based on six different 3D structure sources (DTP NCI, PubChem, Balloon, Frog2, OpenBabel, and RDKit) combined with four different types of optimization. These analyses were performed for three classes of molecules (phenols, carboxylic acids, anilines), and the QSPR model descriptors were quantum mechanical (QM) and empirical partial atomic charges. Specifically, we developed 516 QSPR models and afterwards systematically analyzed the influence of the 3D structure source and other factors on their quality. Our results confirmed that QSPR models based on partial atomic charges are able to predict pKa with high accuracy. We also confirmed that ab initio and semiempirical QM charges provide very accurate QSPR models, and that using empirical charges based on electronegativity equalization is also acceptable, as well as advantageous, because their calculation is very fast. On the other hand, Gasteiger-Marsili empirical charges are not applicable for pKa prediction. We further found that QSPR models for some classes of molecules (carboxylic acids) are less accurate. In this context, we compared the influence of the different 3D structure sources. We found that an appropriate selection of the 3D structure source and optimization method is essential for successful QSPR modeling of pKa. Specifically, the 3D structures from the DTP NCI and PubChem databases performed best, as they provided very accurate QSPR models for all the tested molecular classes and charge calculation approaches, and they do not require optimization. Frog2 also performed very well. Other 3D structure sources can also be used but are not as robust, and an unfortunate combination of molecular class and charge calculation approach can produce weak QSPR models. Additionally, these 3D structures generally need optimization in order to produce good quality QSPR models.
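The general form of such a charge-based QSPR model can be sketched as a linear regression (toy numbers, not the study's dataset or descriptor set; which atoms' charges serve as descriptors is an assumption here): pKa is regressed on a few partial atomic charges, such as the charge on the dissociating hydrogen and on the attached oxygen.

```python
import numpy as np

# Hedged QSPR sketch: linear regression of pKa on partial atomic
# charge descriptors (synthetic, idealized training data).

def fit_qspr(descriptors, pka):
    X = np.column_stack([np.ones(len(pka)), descriptors])
    coef, *_ = np.linalg.lstsq(X, pka, rcond=None)
    residuals = pka - X @ coef
    rmse = float(np.sqrt(np.mean(residuals ** 2)))
    return coef, rmse

def predict(coef, descriptors):
    X = np.column_stack([np.ones(len(descriptors)), descriptors])
    return X @ coef

# Toy training set: [q(H), q(O)] charge pairs with measured pKa values
q = np.array([[0.45, -0.60], [0.47, -0.59], [0.50, -0.55],
              [0.43, -0.61], [0.48, -0.57]])
pka = np.array([10.0, 9.4, 8.5, 10.6, 9.1])
coef, rmse = fit_qspr(q, pka)
pred = predict(coef, np.array([[0.46, -0.59]]))
```

The model's sensitivity to the 3D structure source enters through the descriptors: the same molecule prepared from different sources yields different conformations and hence different partial charges, which is exactly the effect the study quantifies.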
- MeSH
- Chemical Phenomena * MeSH
- Quantitative Structure-Activity Relationship * MeSH
- Quantum Theory MeSH
- Molecular Conformation * MeSH
- Models, Molecular * MeSH
- Computer Simulation MeSH
- Drug Design MeSH
- Publication type
- Journal Article MeSH
- Research Support, Non-U.S. Gov't MeSH
- Research Support, N.I.H., Extramural MeSH