Electroencephalography (EEG) experiments typically generate vast amounts of data due to high sampling rates and the use of many electrodes to capture brain activity. Storing and transmitting these large datasets is therefore challenging and calls for compression techniques tailored to this data type. This study proposes such a method: a convolutional autoencoder learns latent representations of modelled EEG signals to perform lossy compression, which is then refined with lossless corrections driven by a user-defined threshold on the maximum tolerable amplitude loss, yielding a flexible near-lossless compression scheme. To test the viability of the approach, a case study was performed on a 256-channel binocular rivalry dataset; the study also describes the largely data-specific statistical analyses and preprocessing steps. Compression results, evaluation metrics, and comparisons with baseline general-purpose compression methods suggest that the proposed method achieves substantial compression ratios at high speed, making it a promising topic for follow-up studies.
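To make the near-lossless step concrete, here is a minimal sketch of the threshold-driven correction the abstract describes, assuming the lossy reconstruction comes from the autoencoder; the function and variable names are hypothetical, and the paper's actual correction encoding may differ.

```python
import numpy as np

def near_lossless_correct(original, lossy_recon, max_err):
    """Clamp reconstruction error to a user-defined amplitude bound.

    Samples whose residual exceeds max_err get an explicit correction
    (index + quantized residual) that would be stored losslessly
    alongside the autoencoder's latent code.
    """
    residual = original - lossy_recon
    idx = np.flatnonzero(np.abs(residual) > max_err)
    # Quantize the kept residuals with step 2*max_err, so every corrected
    # sample ends up within max_err of the original.
    q = np.round(residual[idx] / (2 * max_err)).astype(np.int32)
    corrected = lossy_recon.copy()
    corrected[idx] += q * (2 * max_err)
    return corrected, idx, q
```

Under this scheme no sample of `corrected` deviates from the original by more than `max_err`, so the threshold directly trades compression ratio against worst-case amplitude loss.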
- Keywords
- Artificial neural networks, Data compression, Electroencephalography, Machine learning, Neuroinformatics
- MeSH
- autoencoder MeSH
- electroencephalography * methods MeSH
- data compression * methods MeSH
- humans MeSH
- neural networks, computer * MeSH
- signal processing, computer-assisted * MeSH
- Check Tag
- humans MeSH
- Publication type
- journal article MeSH
- dataset MeSH
The use of computer vision in smart farming is becoming a trend in building agricultural automation schemes. Deep learning (DL) is known for its accuracy on computer-vision tasks such as object detection and image classification. The strength of one such deep learning model for smart farming, the Progressive Contextual Excitation Network (PCENet), was demonstrated in our recent study on classifying cocoa bean images. However, measuring its computational cost shows that the original model runs at only 0.101 s per image, i.e., 9.9 FPS, on the Jetson Nano edge platform. This research therefore applies a compression technique, filter pruning, to accelerate the PCENet model. In our experiments, the pruned model reaches 16.7 FPS on the Jetson Nano. Moreover, the accuracy of the compressed model is maintained at 86.1%, compared with 86.8% for the original model. Our approach is also more accurate than ResNet18, the state-of-the-art baseline, which reaches only 82.7%. Assessment on the corn leaf disease dataset shows that the compressed model achieves an accuracy of 97.5%, versus 97.7% for the original PCENet.
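As a hedged illustration of what filter pruning involves, the sketch below removes the weakest convolution filters by L1 norm in PyTorch; the criterion and keep ratio are assumptions for illustration, not necessarily those used to prune PCENet, and downstream layers would still need their input channels adjusted.

```python
import torch
import torch.nn as nn

def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float) -> nn.Conv2d:
    """Keep only the filters with the largest L1 norms (a common criterion)."""
    w = conv.weight.data                      # shape: (out_ch, in_ch, kH, kW)
    scores = w.abs().sum(dim=(1, 2, 3))       # L1 norm of each output filter
    n_keep = max(1, int(keep_ratio * w.size(0)))
    keep = torch.argsort(scores, descending=True)[:n_keep]
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = w[keep].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep].clone()
    return pruned  # the next layer's in_channels must be reduced to match
```

Pruning whole filters, unlike zeroing individual weights, shrinks the dense tensors themselves, which is what yields real speedups on edge hardware such as the Jetson Nano.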
- Keywords
- deep learning, model compression, progressive contextual excitation, pruning filters
- MeSH
- automation MeSH
- farms MeSH
- physical phenomena MeSH
- data compression * MeSH
- agriculture * MeSH
- Publication type
- journal article MeSH
Distinguishing cause from effect is a scientific challenge that has resisted solutions from mathematics, statistics, information theory, and computer science. Compression-Complexity Causality (CCC) is a recently proposed interventional measure of causality inspired by the Wiener-Granger idea. It estimates causality from the change in the dynamical compression-complexity (compressibility) of the effect variable, given the cause variable. CCC works with minimal assumptions on the data and is robust to irregular sampling, missing data, and finite-length effects; however, it works only for one-dimensional time series. We propose an ordinal pattern symbolization scheme that encodes multidimensional patterns into one-dimensional symbolic sequences, and thus introduce Permutation CCC (PCCC). We demonstrate that PCCC retains all the advantages of the original CCC and can be applied to data from multidimensional systems with potentially unobserved variables that can be reconstructed using the embedding theorem. PCCC is tested on numerical simulations and applied to paleoclimate data characterized by irregular, uncertain sampling and limited numbers of samples.
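A minimal sketch of ordinal pattern symbolization, assuming the standard construction in which each window is replaced by the index of the permutation that sorts it; the embedding dimension, delay, and tie handling here are illustrative and may differ from the PCCC implementation. For multidimensional data, the same mapping applies with the window replaced by the components of the (reconstructed) state vector at each time step.

```python
import numpy as np
from itertools import permutations

def ordinal_symbols(x, dim=3, delay=1):
    """Encode a 1D series as a 1D sequence of ordinal-pattern symbols."""
    pattern_index = {p: i for i, p in enumerate(permutations(range(dim)))}
    n = len(x) - (dim - 1) * delay
    symbols = np.empty(n, dtype=int)
    for t in range(n):
        window = [x[t + k * delay] for k in range(dim)]
        # np.argsort breaks ties by position, one common convention
        symbols[t] = pattern_index[tuple(np.argsort(window))]
    return symbols
```

The resulting symbol stream is one-dimensional by construction, which is exactly what lets the original CCC machinery be applied unchanged.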
New methods of securing the distribution of audio content have been widely deployed over the last twenty years. Their impact on perceived quality, however, has seldom been the subject of extensive research. We review the state of the art in digital speech watermarking and provide subjective testing of watermarked speech samples. The latest speech watermarking techniques are surveyed, with their specifics and potential for further development, and their current and possible applications are evaluated. Open-source software designed to embed watermarking patterns in audio files is used to produce a set of samples that satisfies the requirements of modern subjective speech-quality assessments. The analysis focuses mainly on the patchwork algorithm implemented in that application. Different watermark robustness levels are used, which allows determining the detection threshold for human listeners. The subjective listening tests follow ITU-T Recommendation P.800, which precisely defines the conditions and requirements for subjective testing. Further analysis examines the effects of noise and various disturbances on the perceived quality of watermarked speech. A threshold of intelligibility is estimated, opening the way to combining speech compression techniques with watermarking. The impact of language and social background is evaluated through an additional experiment involving two groups of listeners. Results show significant robustness of the watermarking implementation, which retains both reasonable subjective audio quality and its security attributes despite mild levels of distortion and noise. Extended experiments with Chinese listeners open the door to a hypothesis on how perception varies with geographical and social background.
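For orientation, here is a toy time-domain version of the patchwork idea, assuming the classic formulation (two pseudorandomly keyed sample sets are nudged apart by a small strength value); the reviewed application very likely embeds in transform coefficients with psychoacoustic shaping, so treat this only as a sketch of the principle.

```python
import numpy as np

def _keyed_sets(n, key):
    """Split sample indices into two secret halves derived from the key."""
    idx = np.random.default_rng(key).permutation(n)
    half = n // 2
    return idx[:half], idx[half:2 * half]

def patchwork_embed(samples, key, strength=1e-3):
    """Embed one bit by shifting the two keyed sets in opposite directions."""
    a, b = _keyed_sets(len(samples), key)
    marked = samples.astype(float).copy()
    marked[a] += strength
    marked[b] -= strength
    return marked

def patchwork_detect(samples, key):
    """Mean difference of the keyed sets: near 2*strength if watermarked,
    near zero otherwise."""
    a, b = _keyed_sets(len(samples), key)
    return samples[a].mean() - samples[b].mean()
```

The statistical nature of the detector is what makes the mark hard to hear at low strength yet detectable after mild distortion and noise.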
- MeSH
- algorithms MeSH
- language MeSH
- data compression methods MeSH
- humans MeSH
- speech perception MeSH
- pattern recognition, automated methods MeSH
- telecommunications MeSH
- Check Tag
- humans MeSH
- Publication type
- journal article MeSH
- Geographic names
- China MeSH
The performance of ECG signal compression is influenced by many factors, yet no study has focused primarily on the possible effects of ECG pathologies on the performance of compression algorithms. This study evaluates whether pathologies present in ECG signals affect the efficiency and quality of compression. A single-cycle fractal-based compression algorithm and an algorithm combining the wavelet transform with set partitioning in hierarchical trees are used to compress 125 15-lead ECG signals from the CSE database. The rhythm and morphology of these signals are newly annotated as physiological or pathological. The compression performance results are statistically evaluated. With the two algorithms, physiological signals are compressed with better quality than pathological signals according to 8 and 9 of 12 quality metrics, respectively. Moreover, pathological signals were shown, with statistical significance, to compress with lower efficiency than physiological signals. Signals with physiological rhythm and physiological morphology were compressed with the best quality; the worst results were obtained for signals with pathological rhythm and pathological morphology. This is the first study to examine the effects of ECG pathologies on the performance of compression algorithms. Signal-by-signal rhythm and morphology annotations (physiological/pathological) for the CSE database are newly published.
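For readers unfamiliar with the wavelet branch of the comparison, the sketch below shows a drastically simplified transform-domain compressor in Python that keeps only the largest coefficients; actual SPIHT coding is considerably more involved, so this is a stand-in for the idea, not the algorithm used in the study.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_compress(ecg, wavelet="bior4.4", level=5, keep=0.05):
    """Zero all but the largest `keep` fraction of wavelet coefficients,
    then reconstruct; the surviving coefficients are what would be coded."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    threshold = np.quantile(np.abs(flat), 1.0 - keep)
    flat = np.where(np.abs(flat) >= threshold, flat, 0.0)
    coeffs = pywt.array_to_coeffs(flat, slices, output_format="wavedec")
    return pywt.waverec(coeffs, wavelet)[: len(ecg)]
```

Pathological beats tend to be less regular and therefore spread energy over more coefficients, which is one intuition for why they compress with lower efficiency in such schemes.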
- MeSH
- algorithms MeSH
- databases, factual MeSH
- electrocardiography methods MeSH
- fractals MeSH
- data compression methods MeSH
- humans MeSH
- wavelet analysis MeSH
- Check Tag
- humans MeSH
- Publication type
- journal article MeSH
- Research Support, Non-U.S. Gov't MeSH
3D macromolecular structural data is growing ever more complex and plentiful in the wake of substantive advances in experimental and computational structure determination methods, including macromolecular crystallography, cryo-electron microscopy, and integrative methods. Efficient means of working with 3D macromolecular structural data for archiving, analysis, and visualization are central to facilitating interoperability and reusability in compliance with the FAIR Principles. We address two challenges posed by growth in data size and complexity. First, data size is reduced by bespoke compression techniques. Second, complexity is managed through improved software tooling that fully leverages available data dictionary schemas. To this end, we introduce BinaryCIF, a serialization of Crystallographic Information File (CIF) format files that maintains full compatibility with related data schemas, such as PDBx/mmCIF, while reducing file sizes by more than a factor of two versus gzip-compressed CIF files. Moreover, for the largest structures, BinaryCIF provides even better compression: factors of ten and four versus uncompressed and gzipped CIF files, respectively. Herein, we describe CIFTools, a set of libraries in Java and TypeScript for generic and typed handling of CIF and BinaryCIF files. Together, BinaryCIF and CIFTools enable lightweight, efficient, and extensible handling of 3D macromolecular structural data.
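To give a feel for the column-oriented encoding strategy BinaryCIF builds on, here is a toy sketch of two composable steps, delta and run-length encoding, applied to an integer column; this illustrates the general approach only and is not the BinaryCIF codec itself (see the format specification and CIFTools for the real implementations).

```python
import numpy as np

def delta_rle_encode(column):
    """Delta-encode an integer column, then run-length encode the deltas.

    Monotone columns (e.g. atom serial numbers) collapse to a handful of
    (value, run_length) pairs, which is why chained encodings like this
    shrink structured columns so well before final byte packing.
    """
    deltas = np.diff(column, prepend=0)  # first delta is the first value
    values, lengths = [], []
    for d in deltas:
        if values and values[-1] == d:
            lengths[-1] += 1
        else:
            values.append(int(d))
            lengths.append(1)
    return values, lengths

# Example: the column 1, 2, ..., 1000 reduces to values=[1], lengths=[1000].
```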
- MeSH
- databases, chemical MeSH
- data compression methods MeSH
- crystallography methods MeSH
- macromolecular substances chemistry ultrastructure MeSH
- models, molecular * MeSH
- software * MeSH
- Publication type
- journal article MeSH
- Research Support, Non-U.S. Gov't MeSH
- Research Support, N.I.H., Extramural MeSH
- Research Support, U.S. Gov't, Non-P.H.S. MeSH
- Substance names
- macromolecular substances MeSH
Compression of the ECG signal is essential, especially for signal transmission in telemedicine. Many compression algorithms exist, but they are described at varying levels of detail, tested on different datasets, and their performance is reported in different ways; the field lacks standardization. This study points out these drawbacks and presents a new compression algorithm that is properly described, tested, and objectively compared with other authors' methods, serving as an example of what such standardization should look like. The single-cycle fractal-based (SCyF) compression algorithm is introduced and tested on four databases: the CSE database, the MIT-BIH arrhythmia database, the High-frequency signal database, and the Brno University of Technology ECG quality database (BUT QDB). SCyF is always compared with the well-known algorithm based on the wavelet transform and set partitioning in hierarchical trees, in terms of efficiency (2 metrics) and quality/distortion of the signal after compression (12 metrics). A detailed analysis of the results is provided. The SCyF compression algorithm reaches up to avL = 0.4460 bps and PRDN = 2.8236%.
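The two reported figures are standard metrics; as a reference (only the variable names are ours), they can be computed as follows: avL is the average code length in bits per sample, and PRDN is the percentage RMS difference normalized by the zero-mean signal energy.

```python
import numpy as np

def prdn(original, reconstructed):
    """PRDN: percentage RMS difference, normalized (signal mean removed)."""
    x = np.asarray(original, dtype=float)
    y = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(((x - y) ** 2).sum() / ((x - x.mean()) ** 2).sum())

def avl(compressed_bits, n_samples):
    """avL: average code length in bits per sample (lower is better)."""
    return compressed_bits / n_samples
```

Removing the mean in the denominator keeps PRDN comparable across signals with different baseline offsets, which matters when pooling results over several databases.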
Stimulus-frequency otoacoustic emissions (SFOAEs) are generated by coherent reflection of forward traveling waves by perturbations along the basilar membrane. The strongest wavelets are backscattered near the place where the traveling wave reaches its maximal amplitude (the tonotopic place). The SFOAE group delay might therefore be expected to be twice the group delay estimated for the cochlear filters. However, experimental data have yielded steady-state SFOAE components with near-zero latency. A cochlear model is used to show that short-latency SFOAE components can be generated by nonlinear reflection of the compressor or suppressor tones used in SFOAE measurements. The simulations indicate that suppressors produce more pronounced short-latency components than compressors. The existence of nonlinear reflection components due to suppressors also explains why SFOAEs can still be detected when suppressors are presented more than half an octave above the probe-tone frequency. Simulations of SFOAE suppression tuning curves show that phase changes in the SFOAE residual as the suppressor frequency increases are mostly determined by phase changes of the nonlinear reflection component.
3D imaging approaches based on X-ray microcomputed tomography (microCT) have become increasingly accessible with advances in methods, instruments and expertise. The synergy of the material and life sciences has impacted biomedical research by providing new investigative tools. However, data sharing remains challenging, as microCT files typically occupy gigabytes and require specific, expensive software for rendering and interpretation. Here, we provide an advanced method for visualising and interpreting microCT data as small files, readable on all operating systems with freely available Portable Document Format (PDF) software. Our method is based on converting volumetric data into interactive 3D PDF, allowing rotation, movement, magnification and setting modifications of objects, thus providing an intuitive approach to analysing structures in a 3D context. We describe the complete pipeline, from data acquisition through data processing and compression to 3D PDF formatting, on an example of craniofacial anatomical morphology in the mouse embryo. Our procedure is widely applicable in biological research and can be used as a framework for analysing volumetric data from any research field relying on 3D rendering and CT biomedical imaging.
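As one concrete piece of such a pipeline, the sketch below extracts a triangle mesh from a microCT volume with scikit-image's marching cubes; the iso level is a per-dataset assumption, and the subsequent mesh-to-3D-PDF conversion steps (done in the paper with dedicated tools) are not shown.

```python
from skimage import measure

def volume_to_mesh(volume, iso_level):
    """Turn a voxel volume into the surface mesh that would later be
    embedded as an interactive object in a 3D PDF."""
    verts, faces, normals, _ = measure.marching_cubes(volume, level=iso_level)
    return verts, faces, normals
```

Converting voxels to a surface mesh is itself a large compression step: the mesh keeps the anatomical boundary while discarding interior voxel data that the interactive view never displays.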
- MeSH
- models, anatomic MeSH
- electronic data processing MeSH
- data compression statistics & numerical data MeSH
- skull anatomy & histology embryology MeSH
- mice MeSH
- facial bones anatomy & histology embryology MeSH
- X-ray microtomography statistics & numerical data MeSH
- radiographic image interpretation, computer-assisted MeSH
- information dissemination methods MeSH
- software * MeSH
- imaging, three-dimensional statistics & numerical data MeSH
- animals MeSH
- Check Tag
- mice MeSH
- animals MeSH
- Publication type
- journal article MeSH
- Research Support, Non-U.S. Gov't MeSH
Despite the recent success of deep learning models in numerous applications, their widespread use on mobile devices is seriously impeded by storage and computational requirements. In this paper, we propose a novel network compression method called Adaptive Dimension Adjustment Tucker decomposition (ADA-Tucker). With learnable core tensors and transformation matrices, ADA-Tucker performs Tucker decomposition of arbitrary-order tensors. Furthermore, we observe that weight tensors with proper order and balanced dimensions are easier to compress. This high flexibility in the choice of decomposition distinguishes ADA-Tucker from all previous low-rank models. For further compression, we extend the model to Shared Core ADA-Tucker (SCADA-Tucker) by defining a single core tensor shared across all layers. Our methods require no overhead for recording the indices of non-zero elements. Without loss of accuracy, they reduce the storage of LeNet-5 and LeNet-300 by factors of 691× and 233×, respectively, significantly outperforming the state of the art. The effectiveness of our methods is also evaluated on three other benchmarks (CIFAR-10, SVHN, ILSVRC12) and on modern deep networks (ResNet, Wide-ResNet).
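To ground the storage arithmetic, here is a plain Tucker compression of a toy convolutional weight tensor using TensorLy; the ranks are illustrative, and ADA-Tucker itself goes further by learning the cores and adjusting tensor dimensions rather than decomposing fixed weights post hoc.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

tl.set_backend("numpy")

# Toy 4-D conv weight tensor: (out_channels, in_channels, kH, kW).
weight = tl.tensor(np.random.randn(64, 32, 3, 3))

# Decompose into a small core plus one factor matrix per mode.
core, factors = tucker(weight, rank=[16, 8, 3, 3])

stored = core.size + sum(f.size for f in factors)
print(f"parameters: {weight.size} -> {stored} "
      f"({weight.size / stored:.1f}x smaller)")
```

Because the compressed form is just the dense core and factor matrices, no indices of non-zero elements need to be stored, in contrast to sparsity-based pruning schemes.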
- Keywords
- Compression, Convolutional neural network, Dimension adjustment, Tucker decomposition
- MeSH
- benchmarking MeSH
- deep learning * trends MeSH
- data compression methods trends MeSH
- humans MeSH
- neural networks, computer * MeSH
- machine learning MeSH
- Check Tag
- humans MeSH
- Publication type
- journal article MeSH