distributed processing
This work focuses on improving a camera system for sensing a workspace in which dynamic obstacles need to be detected. The currently available state-of-the-art solution (MoveIt!) processes data in a centralized manner from cameras that have to be registered before the system starts. Our solution enables distributed data processing and a dynamic change in the number of sensors at runtime. The distributed camera data processing is implemented on a dedicated control unit, where filtering is performed by comparing the real and expected depth images. As part of a performance benchmark, the speed of processing all sensor data into a global voxel map was compared between the centralized system (MoveIt!) and the new distributed system. The distributed system is more flexible: it is less sensitive to the number of cameras, has better framerate stability, and allows the number of cameras to be changed on the go. The effects of voxel grid size and camera resolution were also compared during the benchmark, where the distributed system showed better results. Finally, the overhead of data transmission in the network is discussed; here the distributed system is considerably more efficient. The decentralized system proves to be faster by 38.7% with one camera and 71.5% with four cameras.
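The depth-based filtering step described above can be sketched as follows; this is a minimal illustration assuming per-pixel depth maps as NumPy arrays, with all names and the tolerance illustrative rather than taken from the paper:

```python
import numpy as np

def filter_dynamic_obstacles(real_depth, expected_depth, tol=0.05):
    """Return a mask of pixels where the measured depth deviates from the
    depth expected from the static scene model.

    real_depth, expected_depth: 2D float arrays of per-pixel depth in metres.
    tol: allowed deviation in metres before a pixel counts as an obstacle.
    """
    valid = (real_depth > 0) & (expected_depth > 0)  # drop invalid returns
    closer = expected_depth - real_depth > tol       # something sits in front
    return valid & closer

# Toy example: a flat wall at 2 m with a 2x2 obstacle patch at 1 m.
expected = np.full((4, 4), 2.0)
real = expected.copy()
real[1:3, 1:3] = 1.0
mask = filter_dynamic_obstacles(real, expected)
print(int(mask.sum()))  # 4 obstacle pixels
```

In the distributed design, a mask like this would be computed on each camera's control unit, so only the filtered obstacle data needs to enter the shared voxel map.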
- Keywords
- collaboration, distributed processing, human–robot interaction, obstacle detection, sensor network, workspace monitoring
- MeSH
- computer communication networks * MeSH
- Publication type
- journal articles MeSH
Processing of memory is supported by coordinated activity in a network of sensory, association, and motor brain regions. It remains a major challenge to determine where memory is encoded for later retrieval. Here, we used direct intracranial brain recordings from epilepsy patients performing free recall tasks to determine the temporal pattern and anatomical distribution of verbal memory encoding across the entire human cortex. High-γ frequency activity (65-115 Hz) showed consistent power responses during encoding of subsequently recalled and forgotten words on a subset of electrodes localized in 16 distinct cortical areas activated in the tasks. Greater high-γ power during word encoding, and lower power before and after word presentation, were characteristic of successful recall and were observed across multiple brain regions. Latencies of the induced power changes and of this subsequent memory effect (SME) between recalled and forgotten words followed an anatomical sequence from visual to prefrontal cortical areas. Finally, the magnitude of the memory effect was unexpectedly found to be largest in selected brain regions at both the top and the bottom of the processing stream. These included the language-processing areas of the prefrontal cortex and the early visual areas at the junction of the occipital and temporal lobes. Our results provide evidence for distributed encoding of verbal memory organized along a hierarchical posterior-to-anterior processing stream.
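The high-γ (65-115 Hz) power estimate underlying the reported subsequent memory effect can be illustrated with a simple spectral computation; this is a generic band-power sketch on synthetic data, not the authors' exact pipeline:

```python
import numpy as np

def band_power(x, fs, band=(65.0, 115.0)):
    """Fraction of total spectral power in a frequency band, from the FFT.
    A simple stand-in for a high-γ (65-115 Hz) power estimate."""
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() / psd.sum()

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic "encoding" trace with a 90 Hz component vs a noise-only baseline.
encoding = np.sin(2 * np.pi * 90 * t) + 0.2 * rng.standard_normal(t.size)
baseline = 0.2 * rng.standard_normal(t.size)
print(band_power(encoding, fs) > band_power(baseline, fs))  # True
```

Comparing such band-power values between recalled and forgotten trials, electrode by electrode, is the shape of the SME analysis the abstract describes.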
- Keywords
- cognition, cortical mapping, electrocorticography, high-frequency oscillations, network oscillations
- MeSH
- time factors MeSH
- electrocorticography MeSH
- gamma rhythm (EEG) physiology MeSH
- humans MeSH
- brain mapping MeSH
- cerebral cortex physiology pathophysiology MeSH
- speech perception physiology MeSH
- drug-resistant epilepsy pathophysiology psychology MeSH
- mental recall physiology MeSH
- vocabulary MeSH
- visual perception physiology MeSH
- Check Tag
- humans MeSH
- Publication type
- journal articles MeSH
- multicenter study MeSH
- Research Support, Non-U.S. Gov't MeSH
- Research Support, U.S. Gov't, Non-P.H.S. MeSH
Reduction of fossil fuel usage, a clean energy supply, and dependability are all major benefits of integrating distributed energy resources (DER) with the electrical utility grid (UG). Nevertheless, this integration brings difficulties, most notably accidental islanding, which puts worker and equipment safety at risk. Islanding detection methods (IDMs) play a critical role in resolving this problem. This work thoroughly evaluates all IDMs, dividing them into two categories: local approaches that rely on distributed generation (DG) side monitoring, and remote approaches that make use of communication infrastructure. The study offers a comparative evaluation to help choose the most efficient and applicable IDM, supporting well-informed decision-making for the safe and dependable operation of distributed energy systems within electrical distribution networks. IDMs are evaluated based on non-detection zone (NDZ) outcomes, detection duration, power quality impact, multi-DG operation, suitability, X/R ratio reliance, and efficient functioning.
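A minimal example of a local (passive) IDM of the kind surveyed here is an over/under-voltage and over/under-frequency check at the DG terminals; the thresholds below follow typical IEEE 1547-style limits and are illustrative, and events that stay inside both windows form the method's non-detection zone (NDZ):

```python
def islanding_flags(v_pu, f_hz, v_lim=(0.88, 1.10), f_lim=(59.3, 60.5)):
    """Passive over/under-voltage and over/under-frequency islanding check.

    v_pu: terminal voltage in per-unit; f_hz: frequency in Hz.
    Returns True when a trip (possible island) should be signalled.
    Balanced islands whose voltage and frequency stay inside both
    windows go undetected -- that region is the NDZ.
    """
    v_ok = v_lim[0] <= v_pu <= v_lim[1]
    f_ok = f_lim[0] <= f_hz <= f_lim[1]
    return not (v_ok and f_ok)

print(islanding_flags(1.00, 60.0))  # balanced island inside NDZ: no trip
print(islanding_flags(0.80, 60.0))  # voltage sag: trip
```

The first call illustrates exactly why the survey weighs NDZ size so heavily: a well-matched local load can hold voltage and frequency inside the windows after disconnection.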
- Keywords
- Artificial neural network, Distributed generation, Islanding detection, Microgrid, Non-detection zone, Renewable energy, Signal processing
- Publication type
- journal articles MeSH
The distributed nature of modern research emphasizes the importance of collecting and sharing the history of digital and physical material, to improve the reproducibility of experiments and the quality and reusability of results. Yet, the application of current methodologies for recording provenance information is largely scattered, leading to silos of provenance information at different granularities. To tackle this fragmentation, we developed the Common Provenance Model, a set of guidelines for generating interoperable provenance information and for enabling the reconstruction and navigation of a continuous provenance chain. This work presents the first version of the model, available online, based on the W3C PROV Data Model and the Provenance Composition pattern.
- Keywords
- Common Provenance Model, Provenance Composition, Provenance information, W3C PROV, distributed processes
- MeSH
- biological sciences * MeSH
- reproducibility of results MeSH
- Publication type
- journal articles MeSH
Cassava is a staple food in many countries, and this food source differs from other crops in that its processing generates a highly polluting and toxic residue (manipueira) that requires further treatment. The present study analyzed the economic feasibility of anaerobic digestion of manipueira for producing clean electricity through distributed generation (DG) while simultaneously eliminating toxic compounds. To this end, an approach for sizing DG plants fueled by manipueira biogas was presented, a non-trivial task that is not widely addressed in the literature. For two plants with different capacities, a deterministic economic analysis was carried out based on the criteria of Net Present Value (NPV), Internal Rate of Return, and Discounted Payback. Finally, the project risk was assessed through sensitivity and stochastic analyses using Monte Carlo Simulation. The empirical verification was performed on Brazilian data. Under the NPV criterion, the results indicate a feasibility probability of 9.25% and 81.21% for scenarios 01 and 02, respectively. The results show that scale gains were important in reducing the impact of the investment, while at the same time the larger scale of the project makes the cost of capital more relevant to the result. These findings show the need for investment subsidies, in addition to the promotion of specific credit lines that enable small-scale generation or improve results at larger capacities.
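The NPV criterion and the Monte Carlo feasibility probability used in the risk assessment can be sketched as follows; all cash-flow figures, rates, and distributions are illustrative, not the paper's Brazilian data:

```python
import random

def npv(rate, cashflows):
    """Net Present Value: cashflows[0] at t=0, then one flow per year."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def feasibility_probability(n=10_000, seed=42):
    """Monte Carlo sketch: probability that NPV > 0 when the yearly net
    revenue of a small biogas DG plant is uncertain. Investment size,
    revenue distribution, horizon, and discount rate are all made up."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        revenue = rng.gauss(60_000, 15_000)   # uncertain yearly net revenue
        flows = [-400_000] + [revenue] * 10   # investment + 10 years
        if npv(0.08, flows) > 0:
            hits += 1
    return hits / n

print(round(npv(0.08, [-100, 60, 60]), 2))  # small deterministic check: 7.0
```

Repeating the NPV calculation over sampled inputs and counting the positive outcomes is exactly how a feasibility probability such as the reported 9.25% or 81.21% is obtained.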
- Keywords
- Manipueira, discounted cash flow, distributed generation, financial viability, investment
- MeSH
- biofuels * MeSH
- Publication type
- journal articles MeSH
- Geographic names
- Brazil MeSH
- Substance names
- biofuels * MeSH
In stochastic neuronal models, an interspike interval corresponds to the time in which the process imitating the membrane potential reaches a threshold from an initial depolarization. For neurons with an extensive dendritic structure, a stochastic process combining diffusion with discontinuous jumps in its trajectory is considered a good description of the membrane potential. Because no analytical solution is available for the threshold-passage distribution of such a process, a method for computer simulation is introduced here. A program is given for the Ornstein-Uhlenbeck diffusion process with constant-size jumps occurring at exponentially distributed times. The relation between the simulation step, the accuracy of the simulation, and the amount of computing time required is discussed.
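The simulation method described above can be sketched with a simple Euler scheme; all parameter values, the jump size, and the threshold below are illustrative, not the paper's program:

```python
import random

def first_passage_time(threshold=10.0, x0=0.0, tau=5.0, mu=1.0, sigma=1.0,
                       jump=2.0, rate=0.5, dt=1e-3, t_max=200.0, rng=None):
    """Euler simulation of an Ornstein-Uhlenbeck process with constant-size
    jumps at exponentially distributed times, run until the trajectory first
    crosses `threshold` (one interspike interval). Returns the passage time,
    or None if t_max is reached without a crossing."""
    rng = rng or random.Random()
    x, t = x0, 0.0
    next_jump = rng.expovariate(rate)
    while t < t_max:
        # OU diffusion step: decay toward mu*tau plus Gaussian noise.
        x += (mu - x / tau) * dt + sigma * dt ** 0.5 * rng.gauss(0, 1)
        t += dt
        if t >= next_jump:            # exponentially spaced constant jump
            x += jump
            next_jump += rng.expovariate(rate)
        if x >= threshold:
            return t
    return None

rng = random.Random(1)
isis = [first_passage_time(rng=rng) for _ in range(20)]
print(all(t is None or t > 0 for t in isis))  # True
```

The trade-off the abstract discusses is visible here directly: shrinking `dt` reduces the chance of overshooting the threshold within a step, but multiplies the computing time per simulated interval.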
The article presents a synthesis method to design electrical circuit elements with fractional-order impedance, referred to as a Fractional-Order Element (FOE) or Fractor, that can be implemented with Metal-Oxide-Semiconductor (MOS) transistors. This provides an approach to realizing this class of device using current integrated circuit manufacturing technologies. For this synthesis, MOS transistors are treated as uniform distributed resistive-capacitive layer structures. The synthesis approach adopts a genetic algorithm to generate the interconnections and dimensions of the MOS structures that realize an FOE with a user-defined constant input admittance phase, allowed ripple deviations, and target frequency range. A graphical user interface for the synthesis process is presented to support its wider adoption. We synthesized and present FOEs with admittance phases from 5 to 85 degrees. The design approach is validated using Cadence post-layout simulations of an FOE design with an admittance phase of 74 ± 1 degrees realized using native n-channel MOS devices in TSMC 65 nm technology. Overall, the post-layout simulations demonstrate magnitude and phase errors of less than 0.5% and 0.1 degrees, respectively, compared to the expected synthesis values in the frequency band from 1 kHz to 10 MHz. This supports the suitability of the design approach for the future fabrication and validation of FOEs using this process technology.
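The target behaviour of an FOE, a constant admittance phase across the design band, can be checked against the ideal model Y(jω) = K(jω)^α; the scale K and exponent α below are illustrative, with α chosen to match the 74-degree example in the text:

```python
import cmath
import math

def foe_admittance(omega, k=1e-6, alpha=74 / 90):
    """Ideal fractional-order element: Y(jw) = K * (jw)**alpha.
    Its phase is a constant alpha*90 degrees at every frequency;
    k sets only the magnitude and is an illustrative value."""
    return k * complex(0, omega) ** alpha

for f in (1e3, 1e5, 1e7):  # the 1 kHz .. 10 MHz validation band
    phase = math.degrees(cmath.phase(foe_admittance(2 * math.pi * f)))
    print(round(phase, 3))  # 74.0 at every frequency
```

A physical MOS-based realization only approximates this ideal: its phase ripples around the constant value, which is why the synthesis accepts a user-defined ripple bound (± 1 degree in the validated design).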
- Keywords
- Distributed element, Fractional-order element, Fractor, Genetic algorithm, MOS transistor
- Publication type
- journal articles MeSH
PURPOSE: The authors investigated whether mathematical processing of retinal images with a computer can be used to evaluate ocular fundus images of patients with physiologic retinal findings. The method is based on identification of vascular endings in an examined retinal area. When the authors mention vascular endings, they do not refer to factual endings, but to recognizable vascular endings; there are no true endings in a vascular system. METHODS: The adaptive contrast control (ACC) method was used to determine the number of vascular endings. The method is based on mathematical processing of a digitized retinal picture with a computer. In the digitized retinal picture, the vascular system is identified using the conditional erosion methodology, and the number of vascular endings is then determined. The ACC method was used to process a file of retinal pictures of 38 patients (76 eyes) with physiologic retinal findings. RESULTS: Based on the statistical analysis, the authors found that the number of vascular endings followed a normal (Gaussian) distribution (p=0.05). A tight correlation between the numbers of vascular endings in the right and left eyes was also detected (p=0.05). CONCLUSIONS: The authors highlight that the distribution of the number of vascular endings in patients with physiologic retinal findings is Gaussian.
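The ending-counting step can be illustrated on a binarized vessel skeleton, using the usual definition of an endpoint as a skeleton pixel with exactly one 8-connected skeleton neighbour; this is a generic sketch, not necessarily the paper's exact ACC/conditional-erosion procedure:

```python
import numpy as np

def count_vessel_endings(skeleton):
    """Count endpoints of a one-pixel-wide boolean vessel skeleton:
    skeleton pixels with exactly one 8-connected skeleton neighbour."""
    sk = skeleton.astype(bool)
    padded = np.pad(sk, 1)
    count = 0
    for r, c in zip(*np.nonzero(sk)):
        # 3x3 window around (r, c) in the padded image, minus the pixel itself.
        neighbours = padded[r:r + 3, c:c + 3].sum() - 1
        if neighbours == 1:
            count += 1
    return count

# A T-shaped skeleton has three recognizable endings.
sk = np.zeros((7, 7), dtype=bool)
sk[3, 1:4] = True   # horizontal branch
sk[:, 4] = False
sk[0:7, 4] = True   # vertical branch crossing it
print(count_vessel_endings(sk))  # 3
```

Running such a count over each patient's segmented retina yields the per-eye ending numbers whose distribution and left-right correlation the study analyzes.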
- MeSH
- adult MeSH
- functional laterality MeSH
- middle aged MeSH
- humans MeSH
- computer-assisted mathematical computing MeSH
- adolescent MeSH
- normal distribution MeSH
- computer-assisted image processing methods MeSH
- retinal vessels anatomy and histology MeSH
- aged, 80 and over MeSH
- aged MeSH
- age distribution MeSH
- Check Tag
- adult MeSH
- middle aged MeSH
- humans MeSH
- adolescent MeSH
- male MeSH
- aged, 80 and over MeSH
- aged MeSH
- female MeSH
- Publication type
- journal articles MeSH
- Research Support, Non-U.S. Gov't MeSH
For decades, biologists have relied on software to visualize and interpret imaging data. As techniques for acquiring images increase in complexity, resulting in larger multidimensional datasets, imaging software must adapt. ImageJ is an open-source image analysis software platform that has aided researchers with a variety of image analysis applications, driven mainly by engaged and collaborative user and developer communities. The close collaboration between programmers and users has resulted in adaptations that accommodate new challenges in image analysis and address the needs of ImageJ's diverse user base. ImageJ consists of many components, some relevant primarily for developers, and a vast collection of user-centric plugins. It is available in many forms, including the widely used Fiji distribution. We refer to this entire ImageJ codebase and community as the ImageJ ecosystem. Here we review the core features of this ecosystem and highlight how ImageJ has responded to imaging technology advancements with new plugins and tools in recent years. These plugins and tools address user needs in several areas, such as visualization, segmentation, and tracking of biological entities in large, complex datasets. Moreover, new capabilities for deep learning are being added to ImageJ, reflecting a shift in the bioimage analysis community towards exploiting artificial intelligence. These new tools have been facilitated by profound architectural changes to the ImageJ core brought about by the ImageJ2 project. Therefore, we also discuss the contributions of ImageJ2 to enhancing multidimensional image processing and interoperability in the ImageJ ecosystem.
- Keywords
- Fiji, ImageJ, image analysis, imaging, microscopy, open source software
- MeSH
- computer-assisted image processing * MeSH
- software * MeSH
- artificial intelligence * MeSH
- Publication type
- journal articles MeSH
- Research Support, Non-U.S. Gov't MeSH
- Research Support, N.I.H., Extramural MeSH
The digital polymerase chain reaction (dPCR) is an irreplaceable variant of PCR techniques due to its capacity for absolute quantification and detection of rare deoxyribonucleic acid (DNA) sequences in clinical samples. Image processing methods, including micro-chamber positioning and fluorescence analysis, determine the reliability of dPCR results. However, typical methods place high demands on the chip structure, chip filling, and light-intensity uniformity. This research developed an image-to-answer algorithm that requires only a single fluorescence image capture and removes known image-related errors. We applied the Hough transform to identify partitions in images of dPCR chips, the 2D Fourier transform to rotate the image, and a 3D projection transformation to locate and correct the positions of all partitions. We then calculated each partition's average fluorescence amplitude and generated a 3D fluorescence intensity distribution map of the image. We subsequently corrected the fluorescence non-uniformity between partitions based on this map and obtained statistics of the partition fluorescence intensities. We validated the proposed algorithms using different amounts of target DNA. The proposed algorithm is robust to dPCR chip structural damage and light-intensity non-uniformity. It thus provides a reliable alternative for analyzing the results of chip-based dPCR systems.
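Once partitions are located and their fluorescence corrected, calling positives and converting the positive fraction into a concentration follows the standard dPCR Poisson correction; the threshold and partition volume below are illustrative, not values from this study:

```python
import math

def dpcr_concentration(intensities, threshold, volume_nl=0.85):
    """Classify partitions as positive/negative by mean fluorescence and
    convert the positive fraction p to mean copies per partition via the
    standard Poisson correction lambda = -ln(1 - p)."""
    positives = sum(1 for i in intensities if i > threshold)
    p = positives / len(intensities)
    lam = -math.log(1.0 - p)          # mean copies per partition
    copies_per_nl = lam / volume_nl   # absolute concentration
    return positives, lam, copies_per_nl

# 10,000 partitions, 20% above threshold.
intensities = [1500.0] * 2000 + [300.0] * 8000
pos, lam, conc = dpcr_concentration(intensities, threshold=1000.0)
print(pos, round(lam, 4))  # 2000 0.2231
```

The Poisson correction matters because a positive partition may hold more than one template copy; counting positives alone would undercount at high loads, which is why per-partition statistics feed this final step rather than a raw tally.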
- MeSH
- algorithms MeSH
- DNA * genetics MeSH
- computer-assisted image processing * MeSH
- polymerase chain reaction MeSH
- reproducibility of results MeSH
- Publication type
- journal articles MeSH
- Research Support, Non-U.S. Gov't MeSH
- Substance names
- DNA * MeSH