Over the last two decades, rapid advances in digitization have put us on the cusp of the fourth industrial era. It is an era of connectivity and interactivity between various industrial processes that needs a new, trusted environment for exchanging and sharing information and data without relying on third parties. Blockchain technologies can provide such a trusted environment. This paper focuses on utilizing the blockchain and its characteristics to build machine-to-machine (M2M) communication and digital twin solutions. We propose a conceptual design for a system that uses smart contracts to construct digital twins for machines and products and executes manufacturing processes inside the blockchain. Our solution also employs the Decentralized Identifiers (DIDs) standard to provide self-sovereign digital identities for machines and products. To validate the approach and demonstrate its applicability, the paper presents an implementation of the proposed design for a simulated case study carried out with a Fischertechnik factory model.
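As an illustration of how such a self-sovereign machine identity could look, the following is a minimal sketch of a W3C-style DID document for a machine twin; the DID method name "example", the key material, and the "DigitalTwinEndpoint" service type are hypothetical placeholders, not the identifiers or structure used in the paper.

    # Minimal sketch of a DID document describing a machine (hypothetical identifiers).
    machine_did_document = {
        "@context": "https://www.w3.org/ns/did/v1",
        "id": "did:example:machine-12345",            # hypothetical DID of the machine
        "verificationMethod": [{
            "id": "did:example:machine-12345#key-1",
            "type": "Ed25519VerificationKey2020",
            "controller": "did:example:machine-12345",
            "publicKeyMultibase": "z6Mk..."            # truncated placeholder key
        }],
        "authentication": ["did:example:machine-12345#key-1"],
        "service": [{
            "id": "did:example:machine-12345#twin",
            "type": "DigitalTwinEndpoint",             # hypothetical service type
            "serviceEndpoint": "https://factory.example/twin/12345"
        }]
    }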
Blockchain technology has become an innovative, mature tool for digital transformation, disrupting more and more application areas in their business processes, values, or even economic models. This paper draws on more than 30 academic publications on prototypes and Blockchain-based use cases for transacting certificates in the context of public education. The conceptual design and guiding ideas are reflected in the practical application development for the Federal Ministry of Education and Research ECHT! project within the showcase region WIR! in Mittweida and are used for the research design. Following this approach, we applied agile methods and the current certificate process to propose a comprehensive disclosure of a new software prototype, including a three-layered architecture with multi-stakeholder components. The artefact instantiation contributes to the practical knowledge base within Information Systems Research, specifically in digital certificate processes from creation, searching, and proofing up to revocation, taking into account an existing IT landscape as well as the organizational hierarchy.
With the advancement of cryptography and emerging internet technology, electronic voting is gaining popularity since it ensures ballot secrecy, voter security, and integrity. Many commercial startups and e-voting systems have been proposed, but due to lack of trust, privacy, transparency, and hacking issues, many solutions have been suspended. Blockchain, along with cryptographic primitives, has emerged as a promising solution due to its transparent, immutable, and decentralized nature. In this paper, we summarize the properties that such solutions should satisfy and explain cryptographic primitives such as zero-knowledge proofs (ZKP) and ring signatures, along with their security limitations. We give a comprehensive review of several blockchain-based e-voting systems and discuss their strengths and weaknesses against the given properties, with a comparison table.
Cryptocurrencies are characterized by high volatility, both in the short and long term. Experienced traders exploit this to profit from price fluctuations by swing trading. However, this requires closely observing and analyzing prices and taking trading positions at the right time. Only a few specialists who spend time focusing on this, or optimized trading bots, are able to make profits continuously. The autradix protocol is a self-optimizing and self-learning parametric trading algorithm that analyzes price action in real time and adaptively optimizes the algorithm's parameters to realize the user's investment objective. Embedded in an adaptive genetic algorithm, possible parameterizations are simulated and the optimal ones for the investigated trading pairs are determined. The generic trading protocol API enables coupling with various crypto exchanges and decentralized protocols. A smart-contract-based decentralized, trustless, and tokenized fund, controlled by a DAO, enables users to invest, operate trading agents, and participate in the profits generated according to their share.
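To illustrate the kind of adaptive parameter search described above, here is a minimal, hedged sketch of a genetic algorithm optimizing two hypothetical strategy parameters against a placeholder fitness function; it is not the autradix protocol itself, and backtest_profit stands in for whatever real price-action simulation the protocol uses.

    import random

    def backtest_profit(params):
        # Placeholder fitness: in a real setting this would simulate the
        # parameterized strategy on historical prices and return its profit.
        fast, slow = params
        return -(fast - 12) ** 2 - (slow - 48) ** 2   # toy optimum at (12, 48)

    def evolve(pop_size=20, generations=50):
        pop = [(random.uniform(1, 30), random.uniform(30, 120)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=backtest_profit, reverse=True)        # rank by fitness
            parents = pop[: pop_size // 2]                     # selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)  # crossover
                child = (child[0] + random.gauss(0, 1),         # mutation
                         child[1] + random.gauss(0, 2))
                children.append(child)
            pop = parents + children
        return max(pop, key=backtest_profit)

    print(evolve())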
While blockchain technology is still at an early stage of its development, it is already of surging economic importance. In the literature, blockchain is referred to as either a disruptive, institutional, foundational, or general purpose technology. There is still no consensus about the economic theory that should be applied to analyze its economic effects. This article draws on use cases from the coffee supply chain to explore which theories could potentially apply to an emerging blockchain economy.
The wind energy sector is undergoing digitalization processes that span multi-tier supply chains of turbine components and wind farm maintenance, amongst others. In an industrial use case that includes Siemens Gamesa Renewable Energy, Vestas and APQP4Wind, the processes of producing, fastening, and servicing bolts in turbines are mapped to a digital model. The model follows the lifetime of turbine bolts from the manufacturing phase, through fastening in turbines and maintenance, until their replacement and recycling. The development of the digital model is addressed iteratively in a design science research approach, as the authors actively contribute to the project. Distributed ledgers (DLs) support the notary documentation of the bolts and turbines, from the registration phase through assembly, technical service verification and recycling. The immutable and decentralized nature of DLs secures the data against tampering and prevents unilateral changes by engaging the service stakeholders and component providers in a blockchain consortium.
Global challenges like climate change, food security, and infectious diseases such as the COVID-19 pandemic are nearly impossible to tackle when established experts and upstart innovators work in silos. If research organizations, governments, universities, NGOs, and the private sector could collaborate on these challenges more easily, lasting solutions would certainly come more quickly. Aligned with the United Nations' Sustainable Development Goals, SAIRA connects key players in different arenas: scientists and engineers at research and technology organizations (RTOs) looking to collaborate on sustainable development projects, companies seeking R&D support to tackle their most challenging problems, and startups with innovative ideas and a desire to scale. The platform is a blockchain-secured open innovation platform, anchored on the Max Planck Digital Library's blockchain network bloxberg, that assures the authenticity and integrity of all user-generated content and collaboration processes.
Mapping identities, digital assets, and people's profiles on the internet has lately been gaining much traction in the blockchain cosmos. The new technology is currently forming architectures that will pave new ways toward fundamental mechanisms for interacting in a decentralized, user-centered manner. These schemes are often declared the next generation of the web. The article shows how the internet has evolved in managing identities, what problems arose, and how new data architectures help build applications on top of privacy rights. Both technological and ethical perspectives are considered to answer which guidelines should be followed to serve the upcoming branch of decentralized services and what we can learn from historical schemes regarding their privacy, accounting, and user data.
Blockchain and other distributed ledger technologies are evolving into enabling infrastructures for innovative ICT solutions. Numerous features, such as decentralization, programmability, and immutability of data, have led to a multitude of use cases that range from cryptocurrencies and tracking and tracing to automated business protocols or decentralized autonomous systems. For organizations that seek blockchain adoption, the overwhelming spectrum of potential application areas requires guidance that reduces complexity and supports the development of blockchain-based concepts. This paper introduces a classification approach to provide design and implementation guidance that goes beyond current textbook classifications. As an outcome, a typology for management and business architects is developed, before the paper concludes with an instantiation of existing use cases and a discussion of their classes.
Bitcoin's energy consumption and social costs in relation to its capacity as a settlement layer
(2021)
Bitcoin runs on energy. The decentralized network's energy consumption has resulted in multifaceted discussions about its efficiency and environmental impact. To put Bitcoin's energy consumption into perspective, we propose to relate (a) the energy consumption in TWh and (b) the resulting social costs in the form of carbon emissions to the Dollar value settled on the Bitcoin network. Both metrics make it possible to relate and quantify the capacity of Bitcoin as a settlement layer to the network's energy consumption and resulting carbon emissions, or social costs. We find that in early 2021 Bitcoin (a) settles between $2,333 and $7,555 for each Dollar spent on energy and (b) that, on average, a Dollar settled on the Bitcoin blockchain causes social costs of between 0.007% and 0.01%, depending on the estimated energy consumption converted into the costs of carbon emissions. These results help to assess the efficiency, cost and sustainability of Bitcoin and may allow a comparison of Bitcoin with existing settlement base layers such as Fedwire or gold.
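The two metrics reduce to simple ratios; the sketch below reproduces the arithmetic with purely illustrative inputs (annual settled value, annual energy spend, and annual social cost of carbon), not the study's actual estimates.

    # Back-of-the-envelope version of the two metrics (illustrative inputs only).
    settled_value_usd = 3.0e12    # hypothetical: USD value settled on-chain per year
    energy_cost_usd   = 6.0e8     # hypothetical: annual USD spent on mining energy
    carbon_cost_usd   = 2.4e8     # hypothetical: annual social cost of carbon emissions

    dollars_settled_per_energy_dollar = settled_value_usd / energy_cost_usd
    social_cost_share = carbon_cost_usd / settled_value_usd   # social cost per settled Dollar

    print(f"settled per energy Dollar: ${dollars_settled_per_energy_dollar:,.0f}")
    print(f"social cost per settled Dollar: {social_cost_share:.4%}")

With these placeholder numbers the sketch yields about $5,000 settled per energy Dollar and a social cost share of 0.008%, i.e. values inside the ranges reported in the abstract.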
Smart ultrafast laser processing with rotating beam – Laser micro drilling, cutting and turning
(2021)
Current micro drilling, cutting and turning processes are mainly based on EDM, milling, stamping, honing or grinding. All these technologies use a tool with a predefined geometry that is transferred to the workpiece. In contrast, the laser is a highly flexible tool which can adapt its size very fast by changing only a software setting. Thanks to the efforts in laser development during recent years, stable ultrafast lasers with sufficient average power and high repetition rates have become industrially available. To use as many pulses as possible, cost-efficient production demands innovative processes and machining setups with fast axis movement and special optics for beam manipulation. GFH has developed a helical drilling optics which rotates the beam at up to 30,000 rpm in a very precise circle and furthermore allows the diameter and the incidence angle to be adjusted. This enables the laser to be used for high-precision drilling and cutting as well as for micro turning processes.
Pulsed laser processing of vacuum component surfaces is a promising method for electron cloud mitigation in particle accelerators. By generating a hierarchically structured surface, the escape probability of secondary electrons is reduced. The choice of laser treatment parameters – such as laser power, scanning speed and line distance – has an influence on the resulting surface morphology as well as on its performance. The impact of processing parameters on the surface properties of copper is investigated by Secondary Electron Yield (SEY) measurements, Scanning Electron Microscopy (SEM), ablation depth measurements with an optical microscope and particle release analysis. Independent of the laser wavelength (532 nm and 1064 nm), it was found that the surface morphology changes when the processing parameters are varied. The ablation depth increases and the SEY decreases with increasing laser fluence. The final application requires the capability to treat tens of meters of vacuum pipes. The factors limiting the applicability of this type of surface treatment in particle accelerators are discussed.
Increasing speed in laser processing is driven by the development of high-power lasers into ranges of more than 1 kW. Additionally, a proper distribution of this laser power is required to achieve high-quality processing results. In the case of high pulse repetition rates, a proper distribution of the pulses can be obtained from ultrafast beam deflection in the range of several hundred m/s. A two-dimensional polygon mirror scanner has been used to distribute a nanosecond pulsed laser with up to 1 kW average power at a wavelength of 1064 nm for multi-pass laser engraving. The pulse duration of this laser can be varied between 30 ns and 240 ns, and the pulse repetition rate is set between 1 and 4 MHz. The depth information is encoded in greyscale bitmaps, which are used to modulate the laser during scanning according to the lateral position and the depth. The process allows high processing rates and thus high throughput.
Beam shaping and splitting with diffractive optics for high performance laser scanning systems
(2021)
Diffractive optical elements (DOEs) enable novel high performance and process-tailored scanning strategies for galvanometer-based scan heads. Here we present several such concepts integrating DOEs with laser scanners and the respective application use cases. Beam shaping DOEs providing a homogeneous fluence over a custom defined profile, such as a rectangular Top-Hat, enable increased process quality in Laser-Induced Forward Transfer (LIFT) compared to the Gaussian beam of the laser source. We show that aberrations which occur over the necessary large wafer-sized image field can be eliminated through the use of a synchronous XY-stage motion. Another application that benefits from the use of DOEs is laser drilling. Drilling in display and electronics manufacturing demands high throughput that can only be achieved through the use of beam splitting DOEs for parallel processing. To this end, the joint MULTISCAN project is developing a variable multi-beam tool capable of scanning and switching each individual beamlet for increased control.
The shape-memory alloy Nitinol, a nickel-titanium alloy, is widely used in actuator and medical applications. However, the connection of a flange to a rod is a critical point. Laser rod end melting therefore enables material accumulations to generate a preform at the end of a rod, followed by die forming, so that the flange can be generated. This process has been successfully applied to 1.4301 steel. This study aims to investigate laser rod end melting of shape-memory Nitinol with regard to the resulting surface quality of the preforms. The results showed that spherical preforms could be generated without visible surface discoloration due to oxidation. By using different scan rates, different solidification conditions occurred, which led to significantly different surface structures. These findings show that laser rod end melting can in principle be applied to Nitinol to generate preforms for flanges, whereby the surface quality depends on the solidification conditions.
We demonstrate a thulium-based fiber amplifier delivering pulses tunable between <120 fs and 2 ps duration at up to 228 μJ of pulse energy at a center wavelength of 1940 nm and a 500 kHz repetition rate. Due to its excellent long-term stability, this system demonstrates the ability of this technology to be integrated into ultrafast material processing machines.
We propose a method for edge detection in images with multiplicative noise based on the Ant Colony System (ACS). To adapt the Ant Colony System algorithm to multiplicative noise, the global pheromone matrix is computed using the coefficient of variation. We carried out a performance comparison of the edge-detection Ant Colony System algorithm against several techniques; the best results were obtained with the gradient and the coefficient of variation.
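A minimal sketch of the key heuristic, assuming a grayscale image given as a NumPy array: the local coefficient of variation (standard deviation over mean) is high near edges under multiplicative (speckle-like) noise and could serve as the information that seeds the global pheromone matrix guiding the ants. The window size and normalization below are illustrative choices, not the paper's exact settings.

    import numpy as np

    def coefficient_of_variation_map(image, win=3):
        """Local coefficient of variation (std/mean) in a win x win window,
        used here as the heuristic that seeds the global pheromone matrix."""
        img = image.astype(float)
        pad = win // 2
        padded = np.pad(img, pad, mode="reflect")
        cv = np.zeros_like(img)
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                patch = padded[i:i + win, j:j + win]
                mean = patch.mean()
                cv[i, j] = patch.std() / mean if mean > 0 else 0.0
        return cv / cv.max() if cv.max() > 0 else cv   # normalized pheromone seed

    # Example usage: pheromone = coefficient_of_variation_map(noisy_image)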
At a global level, different studies show that transport systems are responsible for 25% of CO2 emissions. In the context of sustainable mobility, one of the short-term challenges is associated with research into and improvement of alternative fuels, which should allow a fast decrease in the generation of greenhouse gases through sustainable means of transport. In this sense, green hydrogen can play a fundamental role. Green hydrogen is the basis for producing synthetic fuels, which can replace oil and its derivatives. Synthetic fuels, or e-fuels, are hydrocarbons produced from carbon dioxide (CO2) and green hydrogen (H2) as the only raw materials. H2 or e-fuels could be used in many sectors (manufacturing, residential, transportation, mining and other industries). In this study, different applications of hydrogen are evaluated by techno-economic analysis. The main variable that affects the production of hydrogen and its derivatives is the cost of electricity. Considering the renewable energy potential of Chile, it is feasible to develop green hydrogen production in Chile as an energy vector, which would be technically and economically viable and would also bring environmental benefits.
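Since the electricity price is identified as the dominant cost driver, a first-order estimate of the electricity share of green hydrogen cost can be sketched as below; the specific energy demand and electricity price are illustrative assumptions, not the study's figures.

    # Hedged first-order estimate of the electricity cost share of green hydrogen.
    electricity_price_usd_per_kwh = 0.03    # hypothetical renewable PPA price
    electrolyzer_demand_kwh_per_kg = 52.0   # assumed specific energy demand incl. losses

    electricity_cost_per_kg_h2 = electricity_price_usd_per_kwh * electrolyzer_demand_kwh_per_kg
    print(f"electricity cost: {electricity_cost_per_kg_h2:.2f} USD per kg H2")   # ~1.56 USD/kg under these assumptions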
In this paper, we designed, implemented, and tested a special surveillance camera system based on a combination of classical image processing algorithms. The system's sub-objective consists of tracking experimental vehicles driving on defined trajectories (rails) in real time. Furthermore, it analyzes the scene to collect additional vehicle- and rail-related information. The system then uses the gathered data to reach its main objective, which consists of independently predicting vehicle collisions. Consequently, we propose a hybrid method for detecting and tracking ATLAS vehicles efficiently. To detect the vehicle at the beginning of the video, periodically every n-th frame, and in cases where the tracked vehicle has been lost, we use histogram back-projection. A kernelized correlation filter (KCF) is used to track the detected vehicles. Combining these two methods provides one of the best trade-offs between accuracy and speed, even on a single processing core. The proposed method achieves the best performance compared with three different approaches on a custom dataset.
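A condensed sketch of the two building blocks, assuming OpenCV with the contrib tracking module is installed; the video file name, the manual ROI initialization, and the re-detection logic are simplified stand-ins for what the actual system does.

    import cv2

    def detect_by_backprojection(frame_hsv, target_hist):
        """Histogram back-projection: highlight pixels whose hue/saturation match
        the vehicle's reference histogram and return a bounding box around them."""
        backproj = cv2.calcBackProject([frame_hsv], [0, 1], target_hist, [0, 180, 0, 256], 1)
        _, mask = cv2.threshold(backproj, 50, 255, cv2.THRESH_BINARY)
        return cv2.boundingRect(cv2.findNonZero(mask))

    cap = cv2.VideoCapture("atlas_track.mp4")        # hypothetical video file
    ok, frame = cap.read()
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    x, y, w, h = cv2.selectROI(frame)                # manual initialization for the sketch
    hist = cv2.calcHist([hsv[y:y+h, x:x+w]], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

    tracker = cv2.TrackerKCF_create()                # kernelized correlation filter
    tracker.init(frame, (x, y, w, h))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, box = tracker.update(frame)              # fast frame-to-frame tracking
        if not ok:                                   # lost: fall back to re-detection
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            box = detect_by_backprojection(hsv, hist)
            tracker = cv2.TrackerKCF_create()
            tracker.init(frame, box)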
Standard assembly time is an important piece of data in product development that is used to compare different product variants or manufacturing variants. In the presented approach, standard time is estimated with the use of a decision tree covering standard manual and machine-manual operations, taking into consideration product characteristics and typical tools, equipment and layout. The analysed features include, among others, information determined during product development, such as product structure, parts characteristics (e.g. weight, size) and connection type, as well as information determined during assembly planning: tools (e.g. hand screwdriver, power screwdriver, pliers), equipment (e.g. press, heater), and workstation layout (e.g. distance, way of feeding). The object-attribute-value (OAV) framework was applied for the assembly characteristics. An example of the decision tree application to predict standard assembly time is presented for a mechanical subassembly. The case study is dedicated to standard time prediction for a bearing assembly. The presented approach is particularly important for enterprises that offer customized products.
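A minimal sketch of the idea using scikit-learn, with made-up OAV-style features (connection type, part weight, tool) and made-up standard times; the real model, feature set, and time values come from the paper's assembly planning data.

    import pandas as pd
    from sklearn.tree import DecisionTreeRegressor

    # Hypothetical object-attribute-value style training data (illustrative only).
    data = pd.DataFrame({
        "connection_type": ["screw", "screw", "press_fit", "snap_fit", "press_fit"],
        "part_weight_kg":  [0.2, 1.5, 0.8, 0.1, 2.0],
        "tool":            ["power_screwdriver", "hand_screwdriver", "press", "none", "press"],
        "std_time_s":      [12.0, 25.0, 18.0, 6.0, 30.0],
    })
    X = pd.get_dummies(data[["connection_type", "part_weight_kg", "tool"]])
    y = data["std_time_s"]

    tree = DecisionTreeRegressor(max_depth=3).fit(X, y)

    # Predict the standard time for a new, hypothetical bearing assembly operation.
    query = pd.get_dummies(pd.DataFrame({
        "connection_type": ["press_fit"], "part_weight_kg": [1.0], "tool": ["press"],
    })).reindex(columns=X.columns, fill_value=0)
    print(tree.predict(query))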
Sensor fusion is an important and crucial topic in many industrial applications. One of the challenging problems is to find an appropriate sensor combination for the dedicated application or to weight their information adequately. In our contribution, we focus on the application of the sensor fusion concept together with distance-based learning for object classification purposes. The developed machine learning model has a bi-functional architecture, which learns on the one hand the discrimination of the data regarding their classes and, on the other hand, the importance of the individual signals, i.e., the contribution of each sensor to the decision. We show that the resulting bi-functional model is interpretable, sparse, and simple to integrate into many standard artificial neural networks.
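A hedged toy version of the underlying idea, relevance-weighted distance-based classification: class means act as prototypes and per-sensor relevances are derived from between- versus within-class variance. The actual model in the contribution learns discrimination and relevances jointly, so this is only an illustration of how per-sensor importances enter the distance.

    import numpy as np

    def fit_relevance_prototypes(X, y):
        """Toy relevance learning: class means as prototypes, per-feature
        (per-sensor) relevances from between- vs. within-class variance."""
        classes = np.unique(y)
        protos = np.array([X[y == c].mean(axis=0) for c in classes])
        within = np.mean([X[y == c].var(axis=0) for c in classes], axis=0)
        rel = protos.var(axis=0) / (within + 1e-9)
        rel /= rel.sum()                  # normalized sensor importances
        return classes, protos, rel

    def predict(X, classes, protos, rel):
        # Relevance-weighted squared distance to each prototype; pick the nearest.
        d = ((X[:, None, :] - protos[None, :, :]) ** 2 * rel).sum(axis=2)
        return classes[d.argmin(axis=1)]

The vector rel then directly exposes how much each sensor channel contributes to the decision, which is the interpretability property the abstract emphasizes.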
Learning Vector Quantization (LVQ) methods have been popular choices of classification models ever since their introduction by T. Kohonen in the 1990s. These days, LVQ is combined with deep learning methods to provide powerful yet interpretable machine-learning solutions to some of the most challenging computational problems. However, techniques to model recurrent relationships in the data using prototype methods still remain quite unsophisticated. In particular, we are not aware of any modification of LVQ that allows the input data to have different lengths. Needless to say, such data are abundant in today's digital world and demand new processing techniques to extract useful information. In this paper, we propose the use of the Siamese architecture not only to model recurrent relationships within the prototypes but also to handle prototypes of various dimensions simultaneously.
Prototype-based vector quantization is one of the key methods in data processing, for tasks like data compression or interpretable classification learning. Prototype vectors serve as references for data and data classes. The data are given as vectors representing objects by numerical features. Famous approaches are the Neural Gas Vector Quantizer (NGVQ) for data compression and Learning Vector Quantizers (LVQ) for classification tasks. Frequently, training of those models is time-consuming. In this contribution, we discuss modifications of these algorithms adopting ideas from quantum computing. The aim of this is at least twofold: First, quantum computing provides ideas for enormous speedup, making use of quantum mechanical systems and inherent parallelization. Second, by considering data and prototype vectors in terms of quantum systems, implicit data processing is performed, which frequently results in better data separation. We highlight respective ideas and difficulties when equipping vector quantizers with quantum computing features.
Marker-based systems can digitally record human movements in detail. Using the digital biomechanical human model Dynamicus, which was developed by the Institut für Mechatronik, it is possible to model joint angles and their velocities so accurately that the model can be used to improve motion analysis in competitive sports or for the ergonomic evaluation of motion sequences. In this paper, we use interpretable machine learning techniques to analyze gait. Here, the focus is on the classification between foot touchdown and drop-off during normal walking. The motion data for training the model are labeled using force plates. We analyze how our machine learning models could be applied directly to new motion data recorded in a different scenario than the initial training, more precisely on a treadmill. We use the properties of the interpretable model to detect drift and to transfer our model if necessary.
This article aims to explain mathematically why the so-called double descent observed by Belkin et al., Reconciling modern machine-learning practice and the classical bias-variance trade-off, PNAS 116(32) (2019), pp. 15849-15854, occurs on the way from the classical approximation regime of machine learning to the modern interpolation regime. We argue that this phenomenon may be explained by a decomposition of the mean squared error plus complexity into bias, variance and an unavoidable irreducible error inherent to the problem. Further, in the case of normally distributed output errors, we apply this decomposition to explain why LASSO provides reliable predictors that avoid overfitting.
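For reference, the classical pointwise decomposition underlying this argument (the article's version additionally carries a complexity term) reads, for a predictor \(\hat f\) trained on random data and an output \(y = f(x) + \varepsilon\) with \(\operatorname{Var}(\varepsilon) = \sigma^2\):

\[
\mathbb{E}\big[(y - \hat f(x))^2\big]
  = \underbrace{\big(\mathbb{E}[\hat f(x)] - f(x)\big)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\big[(\hat f(x) - \mathbb{E}[\hat f(x)])^2\big]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{irreducible error}}
\]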
We use machine learning for the selection and classification of single-molecule trajectories to replace commonly used user-dependent sorting algorithms. Measured fluorescence time series of labelled single molecules need to be sorted into 'good' and 'bad' molecules before further kinetic and thermodynamic analysis. Currently, processing, sorting and analysis of the data are mainly done with the help of laboratory-specific programs. Although there are freely available programs for processing smFRET data, they either do not offer 'molecular sorting' or it is purely empirical. Only recently have new approaches emerged to solve this problem by means of machine learning. Here, we describe a sound terminology for molecular sorting of smFRET data and present an efficient workflow for manual annotation followed by the training of the ML algorithm. Descriptive statistics of our generated dataset are provided and will serve as the basis for supervised ML-based molecular sorting algorithms yet to be developed.
Reducing costs is an important part of today's business. Manufacturers therefore try to reduce unnecessary work processes and storage costs. Machine maintenance is a big, complex, regular process. In addition, the spare parts required for it must be kept in stock in case a machine fails. In order to avoid a production breakdown in the event of an unexpected failure, more and more manufacturers rely on predictive maintenance for their machines. This enables more precise planning of necessary maintenance and repair work, as well as precise ordering of the spare parts required. A large amount of past as well as current information is required to create such a predictive forecast about machines. Based on the classification of motors from vibration data, this paper deals with the implementation of predictive maintenance for thermal systems. It gives an overview of suitable sensors and data processing methods, as well as various classification algorithms. Finally, the best sensor-algorithm combinations are presented.
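A minimal sketch of such a sensor-to-classifier pipeline, assuming vibration recordings are already available as fixed-length NumPy arrays; the FFT-band features and the random forest are illustrative stand-ins, not the specific sensor/algorithm combinations evaluated in the paper.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def spectral_features(signal, n_bands=8):
        """Condense a vibration signal into per-band spectral energies."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        bands = np.array_split(spectrum, n_bands)
        return np.array([band.mean() for band in bands])

    def train_motor_classifier(signals, labels):
        # signals: list of 1-D vibration recordings; labels e.g. "healthy"/"faulty" (hypothetical data)
        X = np.vstack([spectral_features(s) for s in signals])
        return RandomForestClassifier(n_estimators=100).fit(X, labels)

    # clf.predict([spectral_features(new_recording)]) would then flag motors needing maintenance.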
Development of a genetic biomonitoring test for the investigation of pollinator-plant-interactions
(2021)
A worldwide decline in biodiversity has been recorded. Insects, and with them pollinators, are especially threatened. When the foraging behaviour of pollinators is understood in detail, future crop and floral pollination services can be sustained and it is possible to establish projects for the conservation of pollinators and plant biodiversity. With the use of nanopore sequencing methods it is possible to identify, from their genetic information, the pollen species collected by pollinators. In this study, a protocol for portable nanopore sequencing of DNA from pollen collected by honey bees, bumble bees and wild bees is being designed. DNA metabarcoding is used to identify species within the mixed DNA sample. The ITS2 region will be used as a barcode. We will investigate the pollen preferences of three pollinator species by placing their hives or nests at the same location. Based on the results, landscape management schemes will be developed that target the pollen preferences and nutritional requirements of managed and wild social bee species as well as solitary wild bees.
This work deals with the construction of a microscope for combined total internal reflection fluorescence (TIRF) and confocal microscopy. It is especially designed for single-molecule fluorescence spectroscopy. The design of the microscope body is based on the miCube (Hohlbein lab, Wageningen University, NL). The excitation and detection pathways were adapted to allow both TIRF and confocal illumination as well as camera and point detection for two color channels, enabling single-molecule Förster resonance energy transfer measurements.
Cyanobacteria, prokaryotic microorganisms with essentially the same oxygenic photosynthesis as higher plants, are becoming excellent green cell factories for the sustainable generation of renewable chemicals and fuels from solar energy and carbon dioxide. In the presentation, I will illustrate the concept of green cell factories by introducing and discussing two examples: (i) engineering cyanobacteria to produce the important bulk chemical and potential blend-in biofuel butanol from sunlight and carbon dioxide, so-called photosynthetic butanol, and (ii) the generation of a functional semisynthetic [FeFe]-hydrogenase linked to the native metabolism in living cells of the unicellular cyanobacterium Synechocystis PCC 6803.
Long-range tertiary interactions between RNA tetraloops and their receptors stabilize the folding of ribosomal RNA and support the maturation of the ribosome. Here, we use FRET-assisted structure prediction to develop a structural model of the GAAA tetraloop receptor (TLR) interaction and its dynamics. We build the docked TLR de novo, label the RNA in silico and compute FRET histograms based on MD simulations. The predicted mean FRET efficiency is remarkably consistent with single-molecule experiments of the docked tetraloop. This hybrid approach of experiment and simulation will promote the elucidation of dynamic RNA tertiary contacts and accelerate the discovery of novel RNA and RNA-protein interactions as potential future drug targets.
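For reference, the standard Förster relation that such simulations use to convert inter-dye distances into FRET efficiencies is, with r the dye separation and R_0 the Förster radius of the dye pair:

\[
E(r) = \frac{1}{1 + \left( r / R_0 \right)^{6}}
\]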