Refine
Document Type
- Master's Thesis (119)
- Bachelor Thesis (90)
- Conference Proceeding (66)
- Diploma Thesis (14)
- Final Report (6)
Year of publication
Language
- English (295)
Keywords
- Blockchain (40)
- Maschinelles Lernen (27)
- Vektorquantisierung (9)
- Algorithmus (7)
- Bioinformatik (6)
- Bitcoin (6)
- Graphentheorie (6)
- Internet der Dinge (6)
- Neuronales Netz (6)
- Unternehmen (6)
- Deep learning (5)
- Ethereum (5)
- Supply Chain Management (5)
- Kryptologie (4)
- Proteine (4)
- RNS (4)
- Sequenzanalyse <Chemie> (4)
- Smart contract (4)
- Softwareentwicklung (4)
- Videospiel (4)
- Virtuelle Währung (4)
- Bildgebendes Verfahren (3)
- Biomarker (3)
- Biotechnologie (3)
- China (3)
- DNA Barcoding (3)
- Fluoreszenz-Resonanz-Energie-Transfer (3)
- Kundenmanagement (3)
- Künstliche Intelligenz (3)
- Lernendes System (3)
- Social Media (3)
- Strategisches Management (3)
- Support-Vektor-Maschine (3)
- Vektor (3)
- Zeitreihe (3)
- Bildverarbeitung (2)
- Biomedizin (2)
- CRISPR/Cas-Methode (2)
- Cluster-Analyse (2)
- Cryptocurrency (2)
- DNS (2)
- Deutschland (2)
- E-Learning (2)
- Education (2)
- Film (2)
- Geschichte (2)
- Indien (2)
- Industrie 4.0 (2)
- Kraftfahrzeugbau (2)
- Kryptorchismus (2)
- Logistik (2)
- Membranproteine (2)
- Nanopartikel (2)
- Objekterkennung (2)
- Sphäroproteine (2)
- Thulium (2)
- Trust (2)
- Ultrafast (2)
- Unternehmensentwicklung (2)
- Verifikation (2)
- Windkraftwerk (2)
- cis-trans-Isomerie (2)
- zero knowledge proof (2)
- 3D-Druck (1)
- Accounting (1)
- Ackerbohne (1)
- Agri-food (1)
- Aminosäurensequenz (1)
- Amyloid (1)
- Anomalieerkennung (1)
- Ant Colony System (1)
- Anthropocene Disease (1)
- Anthropologie (1)
- Arbeitgeber (1)
- Arbeitsplatz (1)
- Arbeitszufriedenheit (1)
- Assembly (1)
- Assessment (1)
- Atomic Swaps (1)
- Ausländer (1)
- Auswirkung (1)
- Axialbelastung (1)
- Beam shaping (1)
- Bekleidungsindustrie , Marketingstrategie (1)
- Beratung (1)
- Berufszufriedenheit (1)
- Beurteilung (1)
- Beziehungsmanagement (1)
- Bibliometric analysis (1)
- Biene <Gattung> (1)
- Big Data (1)
- Bildung (1)
- Biochemie (1)
- Biomarker , Krebs <Medizin> (1)
- Biometrie (1)
- Blender <Programm> (1)
- Bodenorganismus (1)
- Bridge (1)
- Bruchmechanik (1)
- Bruchzähigkeit (1)
- Business Perspective (1)
- Business Reputation System (1)
- COVID-19 (1)
- Chemotherapie (1)
- Cloud Computing (1)
- Cluster , Cluster-Analyse (1)
- Cluster <Datenanalyse> (1)
- Codierungstheorie (1)
- Collective Action (1)
- Common Pool Resources (1)
- Computerforensik (1)
- Computersicherheit (1)
- Computerspiel , Musik (1)
- Corporate Social Responsibility (1)
- Cross-Chain (1)
- Crypto currencies (1)
- Cryptocurrencies (1)
- Cyber-physisches System (1)
- DAB <Rundfunktechnik> (1)
- DAO (1)
- DNA-metabarcoding (1)
- DNS , Geschlechtsbestimmung (1)
- Datenanalyse (1)
- Datenbank (1)
- Datenbanksystem (1)
- Datenerfassung (1)
- Datenübertragung (1)
- DeFi (1)
- DeSci (1)
- Decentralized Crypto Economics (1)
- Degeneration (1)
- Depression , Stressor (1)
- Deutschland , Nordamerika , Alkoholismus , Suchttherapie (1)
- Dezentralisation (1)
- Dienstleistung (1)
- Diffusion , Mathematisches Modell , Zellularer Automat (1)
- Digital Certificates (1)
- Digital Identity (1)
- Digital Signatures (1)
- Digitalisierung (1)
- Direct Laser Interference Patterning (1)
- Direktvertrieb (1)
- Diskreter Logarithmus (1)
- Distributed Ledger Technologies (1)
- Distributed Ledger Technology (1)
- Distributed-Ledger-Technology (1)
- Dokumentverarbeitung (1)
- Druckfallkrankheit (1)
- Dürrestress (1)
- Echtzeitsystem (1)
- Edge Detection (1)
- Effizienz (1)
- Eigenwertproblem (1)
- Einkauf , Strategische Planung (1)
- Electronic Commerce (1)
- Elektrizitätserzeugung (1)
- Elektrizitätswirtschaft , USA (1)
- Elektrostimulation , Stammzelle , Knochenbildung (1)
- Embryonalentwicklung (1)
- Energiewirtschaft (1)
- Engineering cyanobacteria (1)
- Entscheidungsbaum (1)
- Erfolgsfaktor (1)
- Erneuerbare Energien (1)
- Extraktion (1)
- Femtosekundenlaser (1)
- Fernsehsendung (1)
- Fernunterricht (1)
- Feuchtgebiet (1)
- Fiber-laser (1)
- Filmwirtschaft (1)
- Finanzdienstleistungsinstitut (1)
- Fledermäuse (1)
- Flexibilitätsmarkt (1)
- Flexplattform (1)
- Fluoreszenzmarkierung (1)
- Formula Student Germany (1)
- Forschung (1)
- Fusion (1)
- Führung (1)
- GAAA tetraloop (1)
- GDPR (1)
- Game-Based Learning (1)
- Ganganalyse (1)
- Gedruckte Schaltung (1)
- Gen (1)
- General Purpose Technology (1)
- Generative Adversarial Network (1)
- Genexpression (1)
- Gerste (1)
- Geschäftsmodell (1)
- Geschäftsplan (1)
- Gesichtserkennung (1)
- Gesundheitsfürsorge (1)
- Globalisierung (1)
- Glucosinolate , Kreuzblütler , Proteine , Hydrolysat (1)
- Graph (1)
- Green hydrogen (1)
- Haus , Schalldämmung , Trittschallschutz , Luftschall , Mathematisches Modell (1)
- Hirntumor (1)
- Hitzeschock-Proteine (1)
- Hydraulik (1)
- Hydroakustik (1)
- Hydroventil (1)
- ID Union (1)
- IP (1)
- Identitätsverwaltung (1)
- In silico-Methode (1)
- Industrial Internet (1)
- Influencer (1)
- Influenza-A-Virus (1)
- Informationstechnik (1)
- Informationsverarbeitung , Mehragentensystem (1)
- Inhibitor , Rezeptor-Tyrosinkinasen , Epidermaler Wachstumsfaktor-Rezeptor , Lungenkrebs , Zelllinie (1)
- Innovation (1)
- Integriertes Lernen (1)
- Intelligent methods (1)
- Intelligentes Stromnetz (1)
- Interkulturelle Kompetenz (1)
- Internet , Medienkonsum , Konzentrationsfähigkeit (1)
- Internet-TV (1)
- Interpretable Models (1)
- Journalismus (1)
- Kanal (1)
- Kasachstan (1)
- Kaufverhalten (1)
- Klein- und Mittelbetrieb (1)
- Klimaänderung (1)
- Kollektive Handlung (1)
- Kommunikationsstrategie (1)
- Komplexität (1)
- Konfokale Mikroskopie (1)
- Konnossement (1)
- Kontrolltheorie , Stabilität , Steuerungstheorie (1)
- Korrosion (1)
- Kryptoanalyse (1)
- Kryptosystem (1)
- Kugelspalt (1)
- Kulturpflanzen (1)
- Kunde (1)
- Kunststoff (1)
- Landwirtschaft (1)
- Laser beam welding (1)
- Laser end rod melting (1)
- Laserablation (1)
- Lebensraum (1)
- Lernerfolg (1)
- Ligand <Biochemie> (1)
- Linearer Code (1)
- Literature Review (1)
- Local Flexibility Market (1)
- Logistiksystem (1)
- Los Angeles- Hollywood (1)
- Luftschall (1)
- Lungenentzündung (1)
- MD simulation (1)
- MIMO (1)
- Makroökonomie (1)
- Malawi (1)
- Marke (1)
- Markenpolitik (1)
- Marketingstrategie (1)
- Marktanalyse , Sales-promotion (1)
- Markteintrittsstrategie (1)
- Marktforschung (1)
- Maschinelles Sehen (1)
- Materialfluss (1)
- Materialität (1)
- Mathematische Modellierung , Computersimulation , Simulationsspiel , Schiffsnavigation (1)
- Maximal Extractable Value (1)
- Medizin (1)
- Meinungsbildung (1)
- Mergers and Acquisitions (1)
- MerkleProof (1)
- Messenger-RNS (1)
- Metrik <Mathematik> (1)
- MicroLED (1)
- Microstructure (1)
- Migration (1)
- Mikrofinanzierung (1)
- Mikroorganismus (1)
- Mikroskopie (1)
- Mikrospore (1)
- Mikrostruktur (1)
- Molekülstruktur (1)
- Motion Capturing (1)
- Multifunktionalität (1)
- Multiplicative Noise (1)
- Mutante (1)
- München (1)
- Nachhaltigkeit (1)
- Nanostruktur (1)
- Nepal , Biogasgewinnung , Sozioökonomischer Wandel (1)
- Netzwerkanalyse (1)
- Netzwerkverwaltung (1)
- Neuromarketing (1)
- Neuseeland (1)
- Nichteuklidische Geometrie (1)
- Nitinol (1)
- Non-Fungible Token (1)
- Non-coding RNA (1)
- Numerische Mathematik (1)
- Oberflächenbehandlung (1)
- Object Detection and Tracking (1)
- Objektorientierte Programmierung (1)
- Offshoring (1)
- Optische Spektroskopie (1)
- Pandemie (1)
- Paper-based Coffee Cups (1)
- Parvalbumine (1)
- Passwort (1)
- Pathogene Bakterien (1)
- Patient (1)
- Peer-to-Peer-Netz (1)
- Personalmarketing (1)
- Pestizid (1)
- Pflanzen (1)
- Pflanzenkläranlage (1)
- Philanthropie (1)
- Photorezeptor , Netzhautdegeneration (1)
- Photosynthese (1)
- Photosynthetic butanol (1)
- Planar Homography (1)
- Planung (1)
- Pollen (1)
- Polygon scanner processing (1)
- Polymethylmethacrylate (1)
- Polynom (1)
- Polynom , Graphentheorie (1)
- Polysaccharide (1)
- Predictive maintenance (1)
- Primaten (1)
- Procurement (1)
- Produkt (1)
- Produkteinführung (1)
- Programmierung (1)
- Projektmanagement (1)
- Projektplanung (1)
- Prostatakrebs (1)
- Proteinbiosynthese (1)
- Proteine , Bioinformatik (1)
- Proteinfaltung (1)
- Proteinfamilie , Alignment <Biochemie> , Bioinformatik (1)
- Proteinmuster , Bioinformatik (1)
- Prototype-based models (1)
- Prozessüberwachung (1)
- Präsidentenwahl (1)
- Prüfmittel (1)
- Qualitätsmanagement (1)
- Quantencomputer (1)
- RFID (1)
- RNS-Interferenz (1)
- Ranking , Software , Wettbewerb (1)
- Raucher (1)
- Real time quantitative PCR , Genotypisierung (1)
- Realistische Computergrafik (1)
- Recurrent Neural Networks (1)
- Regularisierung (1)
- Requirements engineering (1)
- Risiko (1)
- Risikomanagement (1)
- Role-Object Pattern (1)
- Role-based Programming (1)
- Rollenspiel (1)
- SARS-CoV-2 (1)
- SSI (1)
- Sandwich Attacks (1)
- Satellitenfunk (1)
- Satellitentechnik (1)
- Schallausbreitung (1)
- Schifffahrt (1)
- Schrägkugellager (1)
- Security (1)
- Sekundärstruktur (1)
- Selbstorganisierende Karte (1)
- Selenoproteide (1)
- Self-Sovereign identities (1)
- Semisynthetic [FeFe]-hydrogenase (1)
- Sharing Economy (1)
- Siliziumbearbeitung (1)
- Smart City (1)
- Smart Contract Programming (1)
- Smart Contracts (1)
- Smart Market (1)
- Software (1)
- Soulbound Token (1)
- Soziale Software , Business-to-Business-Marketing (1)
- Spaltströmung (1)
- Sportartikelmarkt (1)
- Sportberichterstattung (1)
- Sportsponsoring (1)
- Stakeholder (1)
- Stochastisches Modell (1)
- Stoffwechsel (1)
- Strukturmodell (1)
- Stöchiometrie (1)
- Surface texturing (1)
- Surface topography (1)
- Systemmedizin (1)
- Südafrika , Zeitung , Fernsehen , Hörfunk (1)
- TIRFM (1)
- Techno-economic analysis (1)
- Teilchenbeschleuniger (1)
- Telekommunikation (1)
- Tiefschweißen (1)
- Tokenization (1)
- Traceability (1)
- Transactions (1)
- Transkriptionsfaktor (1)
- Transparenz (1)
- Tutte-Polynom (1)
- Typology (1)
- Ultrakurzpulslaser (1)
- Ultraviolett (1)
- Umweltbelastung (1)
- Umweltbezogenes Management , Marketingstrategie (1)
- Unternehmen , Internationalisierung (1)
- Unternehmensgründung (1)
- Unternehmensgründung , Sozioökonomischer Wandel (1)
- Unternehmenskultur (1)
- User Generated Content (1)
- Vakuumtechnik (1)
- Vector Association (1)
- Vector Quantization (1)
- Verbraucherverhalten (1)
- Versicherung (1)
- Viability analysis (1)
- Videokonferenz (1)
- Visualisierung (1)
- Vollmacht (1)
- Vorstellungsgespräch (1)
- Wahrscheinlichkeitsrechnung (1)
- Wahrscheinlichkeitsverteilung (1)
- Wasserschall (1)
- Web of Things (1)
- Web3 (1)
- Weblog , Kommunikation , Electronic Commerce (1)
- Werbesendung (1)
- Wert (1)
- Wettbewerbsvorteil (1)
- Wildtiere (1)
- Wirtschaftsentwicklung (1)
- Work-Life-Balance (1)
- YouTube (1)
- Zebrabärbling (1)
- Zeitreihe , Vektor , Hankel-Matrix (1)
- Zeitreihenanalyse (1)
- Zeitreise (1)
- Zellkultur , Säugetiere , Hydrogel (1)
- Zigarette (1)
- Zufallsgraph (1)
- Zulassung (1)
- anomaly detection (1)
- atomic swaps (1)
- automated trading (1)
- bccm (1)
- beam splitting (1)
- bee foraging (1)
- bias-variance (1)
- bill of lading (1)
- biodiversity monitoring (1)
- bloxberg (1)
- carbon emissions (1)
- catalog of criteria (1)
- climate change (1)
- collective trauma (1)
- cross cultural work environment (1)
- cross-cultural dynamics (1)
- dApp (1)
- data annotation (1)
- decentralized computation (1)
- decentralized computation architecture (1)
- decentralized science (1)
- diffractive optics (1)
- digital data management technologies (1)
- digital identity (1)
- digital signatures (1)
- digital twin (1)
- double descent (1)
- e-Voting (1)
- employee engagement (1)
- exploit detection (1)
- fasteners (1)
- green finance (1)
- high repetition rate (1)
- high throughput (1)
- high throughput (1)
- human skeletal remains (1)
- hybrid modeling (1)
- lightning network (1)
- incubed (1)
- industrial lasers (1)
- intercultural competence (1)
- interpretable models (1)
- job satisfaction (1)
- laser applications (1)
- laser drilling (1)
- laser micro cutting (1)
- laser micro drilling (1)
- laser micro turning (1)
- laser processing (1)
- laser scanning (1)
- launch strategies (1)
- learning motivation (1)
- local energy market (1)
- machine to machine communication (1)
- materiality (1)
- metal surface structuring (1)
- mev-inspect (1)
- micro drilling (1)
- micromachining (1)
- mindful leadership (1)
- mint mechanism (1)
- molecular sorting (1)
- molecule classification (1)
- multicultural workplace (1)
- nanosecond pulsed laser (1)
- negatively-valenced emotions (1)
- network analysis (1)
- optical coherence tomography (1)
- optimization (1)
- pandemic (1)
- parametric (1)
- polygon mirror scanner (1)
- polygon scanner (1)
- pricing strategy (1)
- redactable blockchain (1)
- saira (1)
- scholar publishing (1)
- scientific paper token (1)
- sdg (1)
- sensor fusion (1)
- sensor technology (1)
- sensors evaluation (1)
- silicon processing, high power (1)
- slock.it (1)
- smart contracts (1)
- supply chain (1)
- technology studies (1)
- temporal energy deposition (1)
- temporal network analysis (1)
- time standard (1)
- train delay insurance (1)
- trauma studies (1)
- ultra-fast (1)
- ultrafast laser (1)
- unwind (1)
- value (1)
- waitro (1)
- wind turbine (1)
- workplace mental health (1)
- zk-SNARKS (1)
- Ökosystem (1)
Institute
- Angewandte Computer‐ und Biowissenschaften (114)
- 06 Medien (35)
- 03 Mathematik / Naturwissenschaften / Informatik (28)
- Wirtschaftsingenieurwesen (21)
- 01 Elektro- und Informationstechnik (7)
- 04 Wirtschaftswissenschaften (7)
- Ingenieurwissenschaften (4)
- Sonstige (4)
- 02 Maschinenbau (2)
- 05 Soziale Arbeit (1)
In bioinformatics, one important task is to distinguish between native and mirror protein models based on structural information. This information can be obtained from the atomic coordinates of the protein backbone. This thesis tackles the problem of distinguishing these conformations by looking at the statistics of the distribution of the backbone dihedral angles, which is visualized in Ramachandran plots. By means of an interpretable machine learning classification method, Generalized Matrix Learning Vector Quantization, we are able to distinguish between native and mirror protein models with high accuracy. Further, the classifier model provides additional information on the distributional regions that are important for the distinction, such as α-helices and β-strands.
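As a toy illustration of the prototype-based classification idea (not the thesis's actual GMLVQ model or data), the following sketch trains an LVQ1-style classifier on synthetic stand-ins for (φ, ψ) dihedral-angle features; the cluster locations, learning rate, and epoch count are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for (phi, psi) backbone dihedral-angle features:
# class 0 ("native") clustered near (-60, -45), class 1 ("mirror") near (60, 45).
X = np.vstack([rng.normal([-60, -45], 15, (100, 2)),
               rng.normal([60, 45], 15, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# One prototype per class, initialised at the class means.
protos = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
labels = np.array([0, 1])

def classify(x):
    # Assign the label of the nearest prototype (squared Euclidean distance).
    d = ((protos - x) ** 2).sum(axis=1)
    return labels[np.argmin(d)]

# LVQ1 updates: attract the winning prototype if its label matches,
# repel it otherwise.
lr = 0.05
for _ in range(50):
    for x, t in zip(X, y):
        j = np.argmin(((protos - x) ** 2).sum(axis=1))
        sign = 1.0 if labels[j] == t else -1.0
        protos[j] += sign * lr * (x - protos[j])

acc = np.mean([classify(x) == t for x, t in zip(X, y)])
```

GMLVQ additionally learns a relevance matrix that reweights the feature dimensions, which is what makes the important Ramachandran regions readable from the trained model.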
The games industry has grown significantly over the last 30 years. Projects are getting bigger and more expensive, making it essential to plan, structure, and track them more efficiently.
This growth has increased the administrative workload for producers, project managers, and leads, who have to monitor and control progress in order to maintain a permanent overview of the project. This is often accompanied by a lack of insight into the project and of basic communication within the team. The goal of this thesis is therefore to enhance conventional project management methods with process structures that occur frequently in game development.
This thesis first elaborates on what project management in the games industry actually is: methods are considered, especially agile methods and progress-tracking practices, which were created for software development and have become a standard in game development. Subsequently, an example is used to demonstrate how process management can function within the development of video games. Based on this, an ideal is depicted, which is implemented and used in a tool at the German games studio KING Art GmbH. This ideal is compared with expert interviews in order to verify its general validity in the industry.
By integrating process structures, the administrative effort can be reduced and communication within game development can be simplified, while the current project status remains permanently visible. This benefits project management and leads as well as the entire team. Further application tests of this approach would have to be organized to check scalability and to draw comparisons with other applications.
Brassica oleracea, like all cruciferous plants, has a defense mechanism against natural enemies based on chemical compounds formed from the enzymatic degradation of glucosinolates. In the presence of epithiospecifier proteins (ESP), the hydrolysis of glucosinolates forms epithionitriles or nitriles, depending on the glucosinolate structure. This research demonstrates that three predicted ESP sequences taken from the NCBI database play a role in the enzymatic hydrolysis of glucosinolates in Brassica oleracea.
Laser welding of hidden T-joints, connecting the web sheet through the face sheet of the joint, can provide advantages such as increased lightweight potential in manufacturing sandwich structures with thin-walled cores. However, maintaining the correct positioning of the beam relative to the joint is challenging. One method to reduce the positioning effort is optical coherence tomography (OCT), which interferometrically measures the reflection distance inside the keyhole during laser deep penetration welding. In this study, new approaches for targeted processing of the OCT signal to automatically detect misalignments are presented. It is shown that considering multiple components of the interference pattern and the respective signal intensities improves the detection accuracy of misalignments.
The emerging Internet of Things (IoT) interconnects billions of embedded devices with each other. These devices are internet-enabled and collect, share, and analyze data without any human intervention. The integration of IoT technology into human environments such as industry, agriculture, and the health sector is expected to improve the way of life and business. At the same time, the emerging technology poses numerous challenges and security threats. On these grounds, it is essential to strengthen the security of IoT technology to avoid any compromise that affects human life. In contrast to implementing traditional cryptosystems on IoT devices, an elliptic curve cryptosystem (ECC) is used to meet the limited resources of the devices. ECC is an elliptic-curve-based public-key cryptosystem which provides equivalent security with a shorter key size compared to other cryptosystems such as Rivest-Shamir-Adleman (RSA). The security of an ECC hinges on the hardness of solving the elliptic curve discrete logarithm problem (ECDLP). ECC is faster and easier to implement and also consumes less power and bandwidth. Due to these benefits, ECC is incorporated in internationally recognized standards for lightweight applications.
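The elliptic-curve Diffie-Hellman idea behind ECC key agreement can be sketched over a tiny textbook curve; the curve y² = x³ + 2x + 2 over F₁₇ with generator (5, 1) of order 19 is a standard teaching example, far below cryptographic scale, and the private scalars below are arbitrary:

```python
# Toy elliptic-curve Diffie-Hellman over y^2 = x^3 + 2x + 2 (mod 17).
# Real deployments use standardized curves (e.g. Curve25519); this tiny
# field only illustrates why short EC keys suffice: security rests on the
# hardness of the elliptic curve discrete logarithm problem (ECDLP).
P, A = 17, 2          # field modulus and curve coefficient a (b = 2)
G = (5, 1)            # generator point of order 19

def add(p, q):
    """Group law: add two points on the curve (None = point at infinity)."""
    if p is None: return q
    if q is None: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                      # inverse points sum to infinity
    if p == q:                           # tangent (doubling) slope
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:                                # chord slope
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, p):
    """Double-and-add scalar multiplication k * p."""
    r = None
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r

# ECDH: each party keeps a scalar secret and shares only a curve point.
a_priv, b_priv = 3, 7
a_pub, b_pub = mul(a_priv, G), mul(b_priv, G)
shared_a = mul(a_priv, b_pub)            # a * (b * G)
shared_b = mul(b_priv, a_pub)            # b * (a * G)
```

Recovering `a_priv` from `a_pub` is exactly the ECDLP; on this 17-element field it is trivial, which is why real curves use fields of roughly 2²⁵⁶ elements.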
In the following study, we evaluated how a simple autoencoder can be used to train a Generalized Learning Vector Quantization (GLVQ) classifier. Specifically, we showed that the bottleneck of an autoencoder serves as an "information filter" which tries to best represent the desired output in that particular layer, in the statistical sense of mutual information.
The autoencoder was trained on a purely unsupervised task and leveraged this by learning feature representations. As a result, the model achieved a significant accuracy. Implementation and tuning of the model were carried out using TensorFlow [1].
An additional study was dedicated to improving the traditional GLVQ algorithm, taken from sklearn-lvq [2], using the bottleneck of an autoencoder.
The study revealed the potential of autoencoder bottlenecks as a pre-processing tool for improving the accuracy of GLVQ. Specifically, the model reached an accuracy of about 75% for GLVQ on the bottleneck features, compared to about 62% for the original one. Consequently, the research exposed the need for further improvement of the model for the present problem case.
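The pipeline can be caricatured in plain NumPy: train a linear autoencoder unsupervised, then run a nearest-prototype (GLVQ-style) classifier on the bottleneck codes. This is a stand-in for the TensorFlow/sklearn-lvq setup above; the synthetic data, network size, and learning rate are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two classes separated along one latent direction, embedded in 10-D noise.
n, d, code = 200, 10, 2
z = np.vstack([rng.normal(-2, 0.5, (n // 2, 1)), rng.normal(2, 0.5, (n // 2, 1))])
W_true = rng.normal(size=(1, d))
X = z @ W_true + rng.normal(0, 0.3, (n, d))
X /= X.std()                      # normalise for stable gradient steps
y = np.array([0] * (n // 2) + [1] * (n // 2))

# Linear autoencoder trained unsupervised by gradient descent: X -> H -> X_hat.
We = rng.normal(0, 0.1, (d, code))   # encoder weights (the "bottleneck")
Wd = rng.normal(0, 0.1, (code, d))   # decoder weights
lr = 0.01
for _ in range(300):
    H = X @ We                       # bottleneck codes
    err = H @ Wd - X                 # reconstruction error
    Wd -= lr * H.T @ err / n
    We -= lr * X.T @ (err @ Wd.T) / n

# Nearest-prototype classification on the bottleneck codes,
# with one prototype per class placed at the class mean.
H = X @ We
protos = np.array([H[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((H[:, None, :] - protos) ** 2).sum(-1), axis=1)
acc = (pred == y).mean()
```

The bottleneck acts as the "information filter": the 2-D codes discard the isotropic noise directions while keeping the class-separating latent direction.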
A protein is a large molecule consisting of a vast number of atoms; one can only imagine the complexity of such a molecule. A protein is a series of amino acids that bind to each other to form a specific sequence known as a peptide chain. Proteins fold into three-dimensional conformations (the so-called native structure) to perform their functions. However, not every protein folds into a correct structure, since mutations may occur in the amino acid sequence, and such mutations cause many protein misfolding diseases. Protein folding is a central problem in biology. Predicting the change in protein stability free energy upon an amino acid mutation (ΔΔG) helps to better comprehend the driving forces underlying how proteins fold into their native structures. Measuring the difference in Gibbs free energy therefore provides more insight into how protein folding occurs, and this knowledge might prove beneficial in designing new drugs to treat protein misfolding related diseases. The protein energy profile aids in understanding the sequence-structure-function relationship by assigning an energy profile to a protein structure. Additionally, measuring the change in the protein energy profile upon mutation (ΔΔE), using an approach derived from statistical physics, will lead to a more thorough understanding of the protein structure. In this work, we attempt to show that ΔΔE values approximate ΔΔG values, which would suggest to future studies that the energy profile is as good a predictor of protein binding affinity as Gibbs free energy for tackling the protein folding problem.
This bachelor's thesis discusses current trends and potential applications of digitalization in the service industry. With today's surging demand for digitalization across all industries, there are branches of the service industry where digitalization has yet to be exploited to its full potential. However, it is difficult to decide which branches of the industry should be fully digitized and which only partially. The result of this work should therefore facilitate the process of applying digitalization in consulting services, where face-to-face human interaction has been key to the industry for years. For this purpose, essential factors to be taken into account were identified, which are sought through the analysis, in the specification of the system requirements as well as in the performance of a utility value analysis.
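A utility value analysis (Nutzwertanalyse) of the kind mentioned above reduces to a weighted scoring of options against criteria. The criteria, weights, options, and scores below are purely illustrative, not the thesis's actual data:

```python
# Hypothetical criteria with weights summing to 1.0, for deciding how far
# to digitize a consulting-service branch.
criteria = {
    "process standardisation": 0.30,
    "need for personal interaction": 0.25,
    "data availability": 0.25,
    "cost pressure": 0.20,
}

# Scores from 1 (poor fit for digitalization) to 5 (strong fit), per option.
options = {
    "fully digitized advisory": {
        "process standardisation": 5, "need for personal interaction": 1,
        "data availability": 4, "cost pressure": 5,
    },
    "hybrid (digital + in person)": {
        "process standardisation": 4, "need for personal interaction": 4,
        "data availability": 4, "cost pressure": 3,
    },
    "traditional face-to-face": {
        "process standardisation": 2, "need for personal interaction": 5,
        "data availability": 2, "cost pressure": 1,
    },
}

def utility_value(scores):
    # Weighted sum of the criterion scores: the core of the method.
    return sum(criteria[c] * scores[c] for c in criteria)

ranking = sorted(options, key=lambda o: utility_value(options[o]), reverse=True)
```

The method's value lies less in the arithmetic than in forcing the weights and scores to be made explicit and debatable.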
Probabilistic Micropayments
(2022)
Probabilistic micropayments are an important cryptographic research topic in electronic commerce, with the potential to yield efficient algorithms with low transaction costs and high computational speed. To delve into the topic, it is vital to scrutinize cryptographic preliminaries such as hash functions and digital signatures. This thesis investigates the important probabilistic methods based on centralized or decentralized networks. First, centralized schemes such as lottery-based tickets, PayWord, coin-flipping, and MR2 are described, and an approach based on blind signatures is also discussed. Then, decentralized schemes such as MICROPAY3 and a transferable scheme on the blockchain network, along with an efficient model for cryptocurrencies, are explained. We then compare the different probabilistic micropayment methods, improving on their drawbacks with a new technique. To set the results from the theoretical analysis of the different methods into context, we analyze the attacks that reduce the security and therefore the efficiency of the system. In particular, we discuss various methods for detecting double-spending and eclipse attacks.
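The lottery-ticket idea behind many of these schemes can be sketched with nothing more than a hash function: instead of settling every tiny payment, issue tickets that win a larger amount with small probability, so the expected value matches the micropayment while only winning tickets touch the costly settlement layer. The probabilities, values, and the way the two parties contribute randomness below are simplified assumptions, not any specific protocol from the thesis:

```python
import hashlib

# Each ticket is worth 1 cent in expectation: it pays out $1.00
# with probability 1/100, so only ~1 in 100 tickets is ever settled.
WIN_PROB_DENOM = 100
FACE_VALUE = 1.00  # dollars paid out when a ticket wins

def ticket_wins(payer_secret: bytes, payee_nonce: bytes) -> bool:
    # Both parties contribute randomness so neither can bias the coin flip;
    # the hash acts as the jointly determined lottery draw.
    digest = hashlib.sha256(payer_secret + payee_nonce).digest()
    draw = int.from_bytes(digest[:8], "big")
    return draw % WIN_PROB_DENOM == 0

# Over many tickets the payout concentrates around the expected value.
wins = sum(ticket_wins(b"payer", i.to_bytes(4, "big")) for i in range(100_000))
expected_payout = wins * FACE_VALUE  # near 100_000 tickets * 1 cent = $1000
```

A real scheme additionally needs signatures on the tickets and a dispute mechanism, since a cheating payer could refuse to settle winning tickets.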
Digital data is growing day by day, and so is the need for intelligent, automated data processing in daily life. In machine learning, a secure and accurate way to classify data is therefore important, above all in fields such as medical data analysis. Moreover, in order to avoid severe consequences, the reliability of a classification is just as important as its accuracy: if a classification is not reliable, it is better to reject the data point than to accept a wrong class label. This can be done with strategies applied on top of a trained model or included directly in the objective function of the desired training model. In this thesis, we discuss such reject strategies and analyze the results on several data sets.
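One classic rejection strategy applied on top of a trained model is Chow's rule: abstain whenever the maximum class posterior falls below a threshold. A minimal synthetic sketch (the data, the known generative model, and the threshold are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D two-class toy data with overlapping Gaussians: points near the
# decision boundary are inherently ambiguous.
x = np.concatenate([rng.normal(-1, 1, 500), rng.normal(1, 1, 500)])
y = np.array([0] * 500 + [1] * 500)

def posterior(x):
    # Posterior for class 1 under the known generative model (equal priors).
    p0 = np.exp(-0.5 * (x + 1) ** 2)
    p1 = np.exp(-0.5 * (x - 1) ** 2)
    return p1 / (p0 + p1)

p = posterior(x)
pred = (p > 0.5).astype(int)

# Chow-style reject rule: abstain when the maximum posterior < threshold.
threshold = 0.8
accept = np.maximum(p, 1 - p) >= threshold

acc_all = (pred == y).mean()                        # accuracy without rejection
acc_kept = (pred[accept] == y[accept]).mean()       # accuracy on accepted points
```

Rejecting the ambiguous boundary region trades coverage for reliability: fewer points are classified, but those that are classified are classified more accurately.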
Bitcoin's energy consumption and social costs in relation to its capacity as a settlement layer
(2021)
Bitcoin runs on energy. The decentralized network's energy consumption has led to multifaceted discussions about its efficiency and environmental impact. To put Bitcoin's energy consumption into perspective, we propose to relate (a) the energy consumption in TWh and (b) the resulting social costs in the form of carbon emissions to the dollar value settled on the Bitcoin network. Both metrics allow the capacity of Bitcoin as a settlement layer to be related and quantified against the network's energy consumption and resulting carbon emissions, or social costs. We find that in early 2021 Bitcoin (a) settles between $2,333 and $7,555 for each dollar spent on energy and (b) that, on average, a dollar settled on the Bitcoin blockchain causes social costs of between 0.007% and 0.01%, depending on the estimated energy consumption converted into the costs of carbon emissions. These results help to assess the efficiency, cost, and sustainability of Bitcoin, and may allow a comparison of Bitcoin with existing settlement base layers such as Fedwire or gold.
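The two proposed metrics are simple ratios, which the following back-of-the-envelope sketch makes explicit. Every input figure here is an illustrative placeholder, not the thesis's data:

```python
# Illustrative placeholder inputs (NOT the thesis's estimates).
annual_energy_twh = 100        # assumed network energy consumption, TWh/year
price_per_mwh = 50             # assumed average electricity price, $/MWh
dollar_value_settled = 1.5e12  # assumed annual on-chain settlement, $
tons_co2_per_twh = 450_000     # assumed grid carbon intensity, tCO2/TWh
social_cost_per_ton = 50       # assumed social cost of carbon, $/tCO2

# Metric (a): dollars settled per dollar spent on energy.
energy_bill = annual_energy_twh * 1_000_000 * price_per_mwh  # TWh -> MWh
dollars_settled_per_energy_dollar = dollar_value_settled / energy_bill

# Metric (b): social cost of carbon as a share of the value settled.
social_cost = annual_energy_twh * tons_co2_per_twh * social_cost_per_ton
social_cost_share = social_cost / dollar_value_settled
```

With these placeholders, metric (a) comes to $300 settled per energy dollar and metric (b) to a 0.15% social-cost share; both move linearly with the assumed energy consumption and settlement volume.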
The application described in this thesis was created, built, and designed to help nurses and other medical personnel around the world access a real-time database for storing patient records such as patient name, patient ID, patient age and date of birth, and the symptoms the patient is experiencing. A real-time database is a live database in which all changes are reflected across all devices accessing it. This application will be beneficial especially in countries where access to a computer or medical equipment is not always possible. A phone is always ready to use and within reach of the hand, so users of this application will be able to access the data at any time and place, add a new patient, or search for existing patients. In addition, the application allows us to take RAW medical images that can be used to identify anomalies in a blood sample. RAW images are important for this application because they are uncompressed and therefore lose no quality or detail. The users of this application are the medical personnel taking care of the patients. They have to create a profile on the database in order to use the application, since their data, such as the user ID, is used to control how data is retrieved and stored. We also discuss the current and future features of this application and its benefits for medical personnel as well as patients. Finally, we go over the implementation of the application from both a hardware and a software perspective.
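An in-memory sketch of the record structure such an application stores (the real app uses a live, synchronized database backend; the class name, method names, and sample data here are illustrative only):

```python
from dataclasses import dataclass, field

@dataclass
class PatientStore:
    """Minimal stand-in for the real-time patient database."""
    records: dict = field(default_factory=dict)

    def add_patient(self, patient_id, name, age, dob, symptoms):
        # Store one record keyed by the patient ID, mirroring the fields
        # named in the abstract (name, age, date of birth, symptoms).
        self.records[patient_id] = {
            "name": name, "age": age, "dob": dob, "symptoms": symptoms,
        }

    def find(self, patient_id):
        # Look up an existing patient; None if the ID is unknown.
        return self.records.get(patient_id)

store = PatientStore()
store.add_patient("P001", "Jane Doe", 34, "1990-05-01", ["fever", "cough"])
```

In the real application, each write would additionally be pushed to every connected device, which is precisely what distinguishes a real-time database from this local dictionary.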
In recent years, the term "Cloud" has become popular in the world of technology. It is used to describe many different information technology offerings, but people adopt the word without truly understanding it. "Demystifying the Cloud – Drawing the Lines between Technologies and Concepts" by Kevin Arnot looks at many levels of the Cloud and gives a comprehensive overview of the technologies and ideas that make it a paradigm shift. The author analyzes the term methodically by leveraging appropriate information from the Internet as well as from experts. An important milestone in understanding the Cloud accurately is differentiating between its components: the underlying technologies, the three Cloud service models (SaaS, IaaS, and PaaS), and how it is deployed, publicly or privately. The result is an understanding that a Cloud can be composed in different ways and can therefore serve exactly the needs of its users. Furthermore, the author describes challenges that individuals and businesses alike have to deal with and reviews possible solutions. Cloud technology will continue to evolve; the future business value of the term "Cloud", however, will depend on how companies continue using or misusing it.
Several algorithms have been proposed for testing series-parallel graphs in linear time. We give alternative algorithms for testing series-parallel graphs and computing their tree decompositions and independence number when the input is an undirected biconnected series-parallel graph; they run in (approximately) linear time.
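The classical recognition idea can be sketched as repeated series and parallel reductions: a two-terminal multigraph is series-parallel iff these reductions shrink it to a single edge between the terminals. The naive fixpoint loop below is quadratic, not the linear-time version the thesis targets, and is meant only to show the reduction rules:

```python
from collections import Counter, defaultdict

def is_series_parallel(edges, s, t):
    """Reduce a two-terminal multigraph by series/parallel operations.

    The graph is series-parallel iff it reduces to the single edge (s, t).
    """
    # Multigraph as an edge multiset; frozenset({u, v}) ignores direction.
    multi = Counter(frozenset(e) for e in edges)
    changed = True
    while changed:
        changed = False
        # Parallel reduction: merge duplicate edges between the same pair.
        for e in list(multi):
            if multi[e] > 1:
                multi[e] = 1
                changed = True
        # Series reduction: splice out an internal degree-2 vertex.
        incident = defaultdict(list)
        for e in multi:
            for v in e:
                incident[v].append(e)
        for v, inc in incident.items():
            if v not in (s, t) and len(inc) == 2:
                (a,) = inc[0] - {v}
                (b,) = inc[1] - {v}
                del multi[inc[0]], multi[inc[1]]
                multi[frozenset({a, b})] += 1
                changed = True
                break  # recompute degrees after each splice
    return multi == Counter({frozenset({s, t}): 1})
```

A triangle with two of its vertices as terminals reduces to a single edge, while K4 admits no reduction at all (every internal vertex has degree 3), matching the fact that K4 is the forbidden topological minor for series-parallel graphs.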
Purpose: The study aims to determine the incentives for German SMEs to offshore their business activities to India and China.
Design: The study is based on a quantitative approach using both primary and secondary data. The data was collected from individuals working in different SMEs in Germany with relevant offshoring experience. Theories from articles and peer-reviewed journals, along with relevant books, were consulted throughout the study.
Findings: The findings suggest that the benefits and advantages of an offshoring strategy in India and China are cost efficiency and technology. The challenges firms face while executing an offshoring strategy are the cultural mix, especially language and cultural barriers, security issues, and loss of market performance.
Originality and Value: The study of the incentives of German SMEs to offshore business activities to India and China makes it possible to understand why companies are interested in offshoring to low-cost countries to expand their business, while evaluating the challenges, merits, and demerits of offshoring.
VQ-VAE is a successful generative model that performs lossy compression. It combines deep learning with vector quantization to achieve a discrete compressed representation of the data. We explore using different vector quantization techniques within VQ-VAE, mainly neural gas and fuzzy c-means. Moreover, VQ-VAE contains a non-differentiable discrete mapping, which we examine, and we propose changes to the original VQ-VAE loss to fit the alternative vector quantization techniques.
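The codebook update that distinguishes neural gas from plain k-means can be sketched directly in NumPy: every codeword moves toward each sample, with a step size that decays with its distance rank (k-means would move only the winner). The synthetic latent vectors and hyperparameters are invented; a real VQ-VAE would feed encoder outputs through this quantizer and back-propagate with a straight-through estimator:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 2-D stand-ins for encoder outputs, forming three latent clusters.
Z = np.vstack([rng.normal(c, 0.1, (100, 2)) for c in ([0, 0], [1, 0], [0, 1])])

K, lr = 3, 0.05
codebook = rng.normal(0.5, 0.3, (K, 2))
for epoch in range(30):
    # Neural gas anneals the neighborhood range: early epochs move every
    # codeword (avoiding dead units), late epochs act like online k-means.
    lam = 2.0 * (0.05 / 2.0) ** (epoch / 29)
    for z in Z:
        d = ((codebook - z) ** 2).sum(axis=1)
        rank = np.argsort(np.argsort(d))      # 0 = closest codeword
        codebook += lr * np.exp(-rank / lam)[:, None] * (z - codebook)

def quantize(z):
    # The discrete (non-differentiable) mapping in the VQ-VAE bottleneck:
    # replace each latent vector by its nearest codeword.
    return codebook[np.argmin(((codebook - z) ** 2).sum(axis=1))]

mean_quant_error = np.mean([((quantize(z) - z) ** 2).sum() for z in Z])
```

The rank-based soft update is what makes neural gas less sensitive to codebook initialisation than the winner-take-all rule, which is one motivation for trying it inside VQ-VAE.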
With the growing market of cryptocurrencies, blockchain is becoming central to various research areas of mathematical and cryptographic relevance. It is capable of transforming traditional, centralized network operations into decentralized peer-to-peer functionality. At the same time, it provides an alternative to digital payments in a robust and tamper-proof manner by adding the element of cryptography, making the ledger traversable for every participant in the blockchain network. Furthermore, for a blockchain to be optimal and efficient, it must handle the blockchain trilemma of security, decentralization, and scalability constraints effectively. This thesis studies and discusses Algorand, a permissionless (public) blockchain cryptocurrency protocol intended to solve the blockchain trilemma, which uses pure proof of stake as its consensus mechanism.
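The stake-weighted self-selection at the heart of pure proof of stake can be caricatured as a hash lottery: each account privately evaluates a pseudorandom draw and joins the round's committee with probability proportional to its stake. Real Algorand uses verifiable random functions and binomial sampling over micro-units of stake, so everything below (account names, probabilities, committee size) is a deliberately simplified assumption:

```python
import hashlib

EXPECTED_COMMITTEE = 20  # desired expected number of committee members

def selected(round_seed: bytes, account: str, stake: int, total_stake: int) -> bool:
    # Pseudorandom draw derived from the round seed and the account;
    # a VRF would make this draw publicly verifiable without being forgeable.
    digest = hashlib.sha256(round_seed + account.encode()).digest()
    draw = int.from_bytes(digest[:8], "big") / 2 ** 64  # uniform in [0, 1)
    # Selection probability grows with the account's share of total stake.
    return draw < EXPECTED_COMMITTEE * stake / total_stake

# 100 equal-stake accounts: each is selected with probability 0.2,
# so the committee holds about 20 members in expectation.
accounts = [(f"acct{i}", 10) for i in range(100)]
committee = [a for a, st in accounts if selected(b"round-1", a, st, 1000)]
```

Because each account computes its own draw locally, an adversary cannot know the committee in advance, which is one ingredient of Algorand's answer to the trilemma.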
We propose a method for edge detection in images with multiplicative noise based on the Ant Colony System (ACS). To adapt the Ant Colony System algorithm to multiplicative noise, the global pheromone matrix is computed from the coefficient of variation. We carried out a performance comparison of the edge detection Ant Colony System algorithm among several techniques; the best results were found with the gradient and the coefficient of variation.
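The coefficient-of-variation heuristic can be demonstrated independently of the ant colony machinery: on a synthetic step edge corrupted by multiplicative noise, the windowed CV (standard deviation divided by mean) peaks exactly at the edge, because it responds to relative rather than absolute intensity changes. Window size, image, and noise level below are illustrative only:

```python
import numpy as np

def coefficient_of_variation_map(img, radius=1):
    # Windowed CV = local std / local mean; invariant to multiplicative
    # scaling, unlike a plain gradient, which fires on bright regions.
    img = img.astype(float)
    h, w = img.shape
    cv = np.zeros_like(img)
    for i in range(radius, h - radius):
        for j in range(radius, w - radius):
            win = img[i - radius:i + radius + 1, j - radius:j + radius + 1]
            m = win.mean()
            cv[i, j] = win.std() / m if m > 0 else 0.0
    return cv

# Synthetic vertical step edge with multiplicative (speckle-like) noise.
rng = np.random.default_rng(4)
img = np.ones((20, 20)) * 50
img[:, 10:] = 150
img *= rng.normal(1.0, 0.05, img.shape)

cv = coefficient_of_variation_map(img)
# In every interior row, the CV map peaks at the columns straddling the step.
edge_col = cv[1:-1].argmax(axis=1)
```

In the ACS formulation, such a CV map would seed the global pheromone matrix, biasing the ants toward high-relative-contrast pixels.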
Due to the intractability of the Discrete Logarithm Problem (DLP), it has been widely used in the field of cryptography, and the security of several cryptosystems is based on the hardness of computing the DLP. In this paper, we start with topics from Number Theory and Abstract Algebra, as they enable one to study the nature of discrete logarithms in a comprehensive way; we then concentrate on the application and computation of discrete logarithms. Applications of discrete logarithms such as the Diffie-Hellman key exchange and the ElGamal signature scheme, as well as several attacks on the DLP such as the Baby-step Giant-step method and the Silver-Pohlig-Hellman algorithm, are analyzed. We also focus on elliptic curves along with the discrete logarithm over an elliptic curve. Attacks on the elliptic curve discrete logarithm problem (ECDLP) are discussed, as well as the extension of several discrete-logarithm-based protocols to elliptic curves, such as the elliptic curve digital signature algorithm (ECDSA).
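The Baby-step Giant-step attack mentioned above can be sketched in a few lines of Python (an illustrative implementation, not the thesis code); it solves g^x = h (mod p) in O(sqrt(p)) time and memory instead of O(p) brute force:

```python
import math

def bsgs(g, h, p):
    """Baby-step Giant-step: return the smallest x with g^x = h (mod p),
    or None if no solution exists. Write x = i*m + j with m ~ sqrt(p):
    precompute all g^j (baby steps), then walk h * g^(-m*i) (giant steps)
    until a collision with the table is found."""
    m = math.isqrt(p) + 1
    # Baby steps: table of g^j mod p for j in [0, m)
    table = {pow(g, j, p): j for j in range(m)}
    # Giant-step factor g^(-m) mod p (pow with negative exponent
    # computes the modular inverse; requires Python 3.8+)
    factor = pow(g, -m, p)
    gamma = h % p
    for i in range(m):
        if gamma in table:
            return i * m + table[gamma]
        gamma = (gamma * factor) % p
    return None
```

For example, `bsgs(2, 27, 101)` returns 7, since 2^7 = 128 ≡ 27 (mod 101).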
In this work, novel proteases from psychrotolerant bacterial strains were isolated and characterized with respect to their biochemical properties. Furthermore, genes of S8-family proteases were amplified, and differences in the amino acid sequences could be linked to the biochemical properties of the proteases.
Since the expression of the titin-Hsp27 construct and the subsequent purification yielded no satisfactory results, atomic force microscopy could not be carried out. Instead, a structure model for the protein sequence was developed using different bioinformatic methods: template searches via several BLAST runs and freely available software such as SwissModel, Pcons, ModWeb and other tools. Nevertheless, the generated model does not represent the native conformation and has to be analyzed with further software until a stable conformation of the structure can be predicted. Given the time available, the generated model is a good approach towards the aim of this master thesis.
Many people take part in more than one competition, and the competitions are of different kinds, ranging from local events with a small number of participants to international tournaments watched by many viewers. Naturally, a system becomes necessary to assess and compare success across various competitions.
The existing ranking systems are usually specialized to fit their application area. More general ranking methods also exist and can be applied to a wide spectrum of competition fields. However, these ranking methods are still not universal and do not cover some important features of competitions.
A totally new ranking system has been developed within the present master thesis. Its primary purpose is to evaluate and measure prestige gained by participants in competitions. The main contribution of the thesis consists of an original mathematical model that makes the ranking system unique.
The developed ranking system claims to be universal and interdisciplinary. It is based on the fundamental element that distinguishes competitive from non-competitive areas, namely standings that rank the participants according to their performance. The universality and interdisciplinarity of the ranking system make cross-disciplinary comparisons possible, which are usually very subjective and difficult to implement.
The contribution of the master thesis extends beyond the theoretical area. A ranking software that fully implements this novel ranking system has been designed and developed. The software makes the practical benefits of the ranking system immediately available to potential application areas such as sports clubs and universities.
Finally, the developed ranking system offers a new viewpoint on competitions: as a way of gaining prestige, rather than the traditional viewpoint of demonstrating mastery.
Blockchain is a technology with the potential to change the way the world operates. As promising as this may be, there are still many challenges that either do not exist in conventional software solutions or are far simpler to solve there. Services offered over the blockchain suffer from so-called block confirmation times, where the customer simply has to wait until the transaction is confirmed. In this paper, possible solutions to that problem are examined, and challenges arising from the specific characteristics of the Ethereum blockchain are analyzed.
This scientific work deals with current opportunities in business development. The purpose of the work is the study and analysis of an organization's development strategy and its evolution. The subject of the study is the mechanism of forming an organization's development strategy, together with an understanding of business development and its core methodologies and branches. The thesis is based on the operations of a real engineering company, and the main part of the research could be applied in practice. The main goal of the thesis is to derive recommendations for implementing strategic changes in an organization's development strategy.
Pulsed laser processing of vacuum component surfaces is a promising method for electron cloud mitigation in particle accelerators. By generating a hierarchically structured surface, the escape probability of secondary electrons is reduced. The choice of laser treatment parameters, such as laser power, scanning speed and line distance, has an influence on the resulting surface morphology as well as on its performance. The impact of processing parameters on the surface properties of copper is investigated by Secondary Electron Yield (SEY) measurements, Scanning Electron Microscopy (SEM), ablation depth measurements in an optical microscope and particle release analysis. Independent of the laser wavelength (532 nm and 1064 nm), it was found that the surface morphology changes when the processing parameters are varied. The ablation depth increases and the SEY decreases with increasing laser fluence. The final application requires the capability to treat tens of meters of vacuum pipes. The limiting factors of this type of surface treatment for its applicability in particle accelerators are discussed.
This thesis investigates the efficacy of four machine learning algorithms, namely linear regression, decision tree, random forest and neural network, in the task of lead scoring. Specifically, the study evaluates the performance of these algorithms using datasets without sampling, with random under-sampling, and with over-sampling using SMOTE. The performance of each algorithm is measured using various performance metrics, including accuracy, AUC-ROC, specificity, sensitivity, precision, recall, F1 score, and G-mean. The results indicate that models trained on the dataset without sampling achieved higher accuracy than those trained on the dataset with either random under-sampling or random over-sampling using SMOTE. However, the neural network demonstrated remarkable results on each dataset compared to the other algorithms. These findings provide valuable insights into the effectiveness of machine learning algorithms for lead scoring tasks, particularly when using different sampling techniques, and can aid lead management practitioners in selecting the most suitable algorithm and sampling technique for their needs. Furthermore, the study contributes to the literature by providing a comprehensive evaluation of the performance of machine learning algorithms for lead scoring tasks. This thesis has practical implications for businesses looking to improve their lead management practices, and future research could extend the analysis to other machine learning algorithms or larger datasets.
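The evaluation metrics named above follow directly from a binary confusion matrix; the following sketch (illustrative, not the thesis evaluation code) shows the less common ones, specificity and G-mean, alongside precision, sensitivity and F1:

```python
def classification_metrics(tp, fp, tn, fn):
    """Binary classification metrics from confusion-matrix counts,
    as used to compare lead-scoring models on imbalanced data.
    G-mean (geometric mean of sensitivity and specificity) stays low
    whenever either class is classified poorly, which makes it more
    informative than accuracy under class imbalance."""
    sensitivity = tp / (tp + fn)          # recall on the positive class
    specificity = tn / (tn + fp)          # recall on the negative class
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    g_mean = (sensitivity * specificity) ** 0.5
    return {"sensitivity": sensitivity, "specificity": specificity,
            "precision": precision, "f1": f1, "g_mean": g_mean}
```

For example, with 40 true positives, 10 false positives, 45 true negatives and 5 false negatives, precision is 0.8 and F1 is 16/19 ≈ 0.842.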
No abstract available
Proteins are involved in almost every aspect of life, mediating a wide range of cellular tasks. The protein sequence dictates the spatial arrangement of the residues and thus ultimately the function of a protein. Huge effort is put into cumbersome structure elucidation experiments which yield models describing the observed spatial conformation of a protein, enabling users to predict their function, to understand their mode of action or to design tailored drugs to cure diseases caused by misfolded or misregulated proteins.
However, the results of structure determination experiments are merely models of reality, made under simplifying assumptions and sometimes containing major undetected errors. Moreover, such experiments are resource-demanding and cannot satisfy the actual demand.
Thus, scientists are predicting the structure of proteins in silico, resulting in models that are even more prone to error.
In consequence, structural biologists are searching for a practicable definition of structure quality, and over the last two decades several model quality assessment programs have emerged, measuring the local and global quality of particular structures. Seven representatives were studied regarding the paradigms they follow and the features they use to describe the quality of residues. Their predictions were compared, showing that there is almost no common ground among the tools.
Is there a way to combine their statements anyway?
Finally, the accumulated knowledge was used to design a novel evaluation tool addressing the problems previously spotted. High quality of its predictions as well as superior usability were key. The strategy was compared to existing approaches and evaluated on suitable datasets.
This study explores the opportunities and risks associated with user-generated content (UGC) in the communication strategies of marketing departments from a business perspective. With the rise of social media and online platforms, UGC has become a powerful tool for brands to engage with their audience, build trust, and enhance brand awareness. However, implementing UGC also comes with inherent risks, including the loss of control over brand messaging, potential negative user-generated content, and legal implications.
To investigate these dynamics, an empirical mixed-methods approach was employed, including expert interviews and a comprehensive literature review. The findings indicate that UGC offers significant opportunities for marketing departments, such as increased customer loyalty, enhanced authenticity and brand awareness, as well as a diverse range of possible content. However, the study also reveals the potential risks associated with UGC, highlighting the importance of managing these risks effectively.
Adversarial robustness of a nearest prototype classifier assures safe deployment in sensitive fields of use. Much research has been conducted on the robustness of artificial neural networks against adversarial attacks, whereas nearest prototype classifiers have not seen similar attention. This thesis presents the learning dynamics and numerical stability of the Crammer normalization and the Hein normalization for the adversarial robustness of nearest prototype classifiers. The results of the conducted experiments are documented and analyzed to verify the bounds given by Saralajew et al. and Hein et al. for the adversarial robustness of nearest prototype classifiers.
The main purpose of this thesis is to investigate the factors influencing the buying decision of cigarette smokers. To achieve this, different theories concerning consumer buying behaviour and its influencing factors are discussed to gain a deeper understanding of consumer behaviour. To comprehend the factors that influence the buying decision of a smoker as a consumer, a questionnaire-based survey was performed. The results of the survey indicate that brand awareness, tobacco quality, price, packaging, advertising, influence by others and availability are the major factors influencing the buying decision of a smoker, with availability, quality, price and brand awareness having the strongest influence.
In this work, the task is to cluster microarray gene expression data of the cyanobacterium Nostoc PCC 7120 to detect messenger RNA (mRNA) degradation patterns. We search for characteristic degradation patterns caused by specific enzymes (ribonucleases), allowing further biological investigation of the underlying biochemical mechanisms. mRNA degradation is part of the regulation of gene expression because it controls the amount and longevity of mRNA available for translation into proteins. A particular class of RNA-degrading enzymes are exoribonucleases, which degrade the molecule from its ends; degradation from the 5’ end, the 3’ end or from both ends is theoretically possible.
In this investigation, the information about exoribonucleolytic degradation is given in a microarray data set containing gene expression values of 1,251 genes. The data set provides gene expression vectors containing the expression values of up to ten short distinct sections of a gene, ordered from the gene’s 5’ end to its 3’ end. For each gene, expression vectors are available for both nitrogen-fixing and non-nitrogen-fixing conditions, which have to be considered separately for biological reasons. Accordingly, after filtering and preprocessing, two datasets for clustering are obtained, each consisting of 133 ten-dimensional expression vectors. The similarity of the expression vectors is judged by a new correlation-based similarity measure and compared with the results obtained using the Euclidean distance. A non-linear transformation of the correlations was applied to obtain a dissimilarity measure. By choosing the parameters of this transformation, a user-specific differentiation between negatively and positively correlated gene expression vectors and an adequate adjustment to the noise level of the gene expression values are possible.
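The general idea of turning correlations into dissimilarities can be sketched as follows. Note that the thesis does not specify its transformation here, so both the transform and its steepness parameter `alpha` are hypothetical stand-ins for the user-tunable parameters described above:

```python
import math

def correlation(x, y):
    """Pearson correlation of two expression vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

def corr_dissimilarity(x, y, alpha=2.0):
    """Map correlation r in [-1, 1] to a dissimilarity in [0, 1] via a
    monotone non-linear transform. `alpha` (hypothetical) controls how
    sharply negatively correlated vectors are separated from positively
    correlated ones: r = 1 gives 0, r = -1 gives 1."""
    r = correlation(x, y)
    return ((1 - r) / 2) ** (1 / alpha)
```

Perfectly correlated vectors thus get dissimilarity 0 and perfectly anti-correlated vectors get dissimilarity 1, which is the behaviour a clustering algorithm such as Affinity Propagation expects of its input.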
Clustering was performed using Affinity Propagation (AP). The number of clusters obtained by AP depends on the so-called self-similarity of the data vectors. This dependence was used to identify stable cluster solutions by self-similarity control. To evaluate the clustering results, Median Fuzzy c-Means (M-FCM) was used. Further, several cluster validity measures are applied, and visual inspections by t-distributed Stochastic Neighbor Embedding (t-SNE) as well as cluster visualizations are provided for mathematical interpretation of the clusters.
To validate the clustering results biologically, the found data structure is checked for biological adequacy. A deeper investigation into the mechanisms behind mRNA degradation was achieved by use of an RNA-Seq data set. The contained 40-base-pair (bp) reads for non-nitrogen-fixing and nitrogen-fixing conditions were assembled using the bacteria-specific ab initio assembly of Rockhopper. Thus, mRNA (transcript) sequences of the clustered genes are obtained. A further investigation of the untranslated regions (UTRs) is performed here, based on the assumption that exoribonucleases recognize specific transcript sequences outside of the annotated gene regions as their binding sites. These UTRs need to be analyzed regarding sequence similarity using motif-finding algorithms.
The automatic comparison of RNA/DNA or rather nucleotide sequences is a complex task requiring careful design due to the computational complexity. While alignment-based models suffer from high computational costs in time, alignment-free models have to deal with appropriate data preprocessing and consistently designed mathematical data comparison. This work deals with the latter strategy. In particular, a systematic categorization is proposed, which emphasizes two key concepts that have to be combined for a successful comparison analysis: 1) the data transformation, comprising adequate mathematical sequence coding and feature extraction, and 2) the subsequent (dis-)similarity evaluation of the transformed data by means of problem-specific but mathematically consistent proximity measures. Respective approaches from different categories of the introduced scheme are examined with regard to their suitability to distinguish natural RNA virus sequences from artificially generated ones encompassing varying degrees of biological feature preservation. The challenge in this application is the limited additional biological information available, such that the decision has to be made solely on the basis of the sequences and their inherent structural characteristics. To address this, the present work focuses on interpretable, dissimilarity-based classification models of machine learning, namely variants of Learning Vector Quantizers. These methods are known to be robust and highly interpretable, and therefore allow evaluating the applied data transformations together with the chosen proximity measure with respect to the given discrimination task. First analysis results are provided and discussed, serving as a starting point for more in-depth analysis of this problem in the future.
The following thesis contains a detailed business plan for a Formula Student combustion racecar. This includes evaluating existing knowledge about the car, combined with required information about the market and seed capital. Subsequently, the presented plan is discussed with an interpretation for future business plans. In this connection, the acceptance of electromobility is evaluated and first ideas for the presentation of an electric car are created.
This thesis looks at Customer Relationship Management in a different way. In order to identify factors that influence the acceptance of one of its components, the analytical CRM, it focuses on the opinion of the company’s employees. The objective of this thesis is to identify factors that positively influence the acceptance of analytical Customer Relationship Management within organizations.
The endogenous steroid hormone 17β-estradiol (E2) is a central player in a wide range of physiological and behavioral processes and diseases in vertebrates. As a consequence, it is a main target for molecular design and drug discovery efforts in medicine and environmental sciences, which require in-depth knowledge of protein-ligand binding processes. This work develops a bioinformatic framework based on local and global structure similarity for the characterization of E2-protein interactions in all 35 publicly available three-dimensional structures of estradiol-protein complexes. Subsequently, it uses the gained data to identify four geometrically conserved estradiol-binding residue motifs, against which the Protein Data Bank is queried. As a result of this database query, 15 hits present in seven protein structures are found. Five of these structures do not contain E2 as a ligand and had thus not been included in this work’s initial data set. One of these newly detected structures is structurally and functionally dissimilar, as well as evolutionarily distant, from all other proteins analyzed in this work. Nevertheless, the ability of this protein to actually bind estradiol must be further analyzed. Finally, geometrically conserved E2-protein interactions are identified and a new research direction using these conserved interaction ensembles for the detection of novel estradiol targets is proposed.
Fermat proposed Fermat's little theorem in 1640, but a proof was not officially published until 1736. In this thesis, we mainly focus on different proofs of Fermat's little theorem, such as a combinatorial proof by counting necklaces, multinomial proofs, a proof by modular arithmetic, a dynamical systems proof, and a group theory proof. We also concentrate on the generalizations of Fermat's little theorem given by Euler and Laplace. Euler was the first to prove Fermat's little theorem, and we go through three different proofs he gave. The theorem has many applications in mathematics and cryptography. We focus on its applications in cryptography, such as primality testing and public-key cryptography. A primality test is used to determine whether a given number n is prime or composite. We also concentrate on the Fermat primality test and the Miller-Rabin primality test, which is an extension of the Fermat primality test. We further discuss the most widely used public-key cryptosystem, the RSA algorithm, named after its developers R. Rivest, A. Shamir, and L. Adleman. The algorithm was invented in 1978 and depends heavily on Fermat's little theorem.
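The Fermat primality test rests directly on the theorem: if n is prime, then a^(n-1) ≡ 1 (mod n) for every base a not divisible by n, so any base violating this proves n composite. A minimal sketch (illustrative; the number of rounds k is an assumed default):

```python
import random

def fermat_test(n, k=20):
    """Fermat primality test: declare n 'probably prime' if
    a^(n-1) = 1 (mod n) for k random bases a. A composite n may
    still pass (most notoriously Carmichael numbers, which fool
    every base coprime to n); the Miller-Rabin test closes this gap."""
    if n < 4:
        return n in (2, 3)
    for _ in range(k):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:
            return False  # a is a Fermat witness: n is composite
    return True  # probably prime
```

For example, 97 passes for every base, while 100 fails for every base in [2, 98], so the test classifies them correctly.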
A classical topic in the theory of random graphs is the probability of at least one isolated vertex in a given random graph. An isolated node has a huge impact on social networks, which can be modeled by random graphs. We present a distribution of the number of isolated vertices using the probability generating function. We discuss the relationship between isolated edges and extended cut polynomials as well as extended matching polynomials using the principle of inclusion-exclusion. We introduce an algorithm based on colored graphs for general graphs and apply it to the components of a graph as well. Finally, we implement the idea on special classes of graphs such as cycles, bipartite graphs, paths, and others, and discuss a recursive procedure based on analogous coloring rules for ladder and fan graphs.
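For context, the standard first-moment computation for isolated vertices in an Erdős-Rényi random graph G(n, p) can be sketched as follows; this is textbook material illustrating the quantity studied, not the thesis' generating-function approach:

```python
import math

def expected_isolated_vertices(n, p):
    """Expected number of isolated vertices in G(n, p): each vertex
    is isolated iff all of its n-1 potential edges are absent, which
    happens with probability (1-p)^(n-1); by linearity of expectation
    the expected count is n times that."""
    return n * (1 - p) ** (n - 1)

def prob_no_isolated_approx(n, p):
    """Poisson approximation: when the count of isolated vertices is
    approximately Poisson, P(no isolated vertex) ~ exp(-E[count])."""
    return math.exp(-expected_isolated_vertices(n, p))
```

Raising p shrinks the expected count, so the approximate probability of having no isolated vertex increases monotonically in p.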
Implementation of a customised business model for innovative engineering consultancy services
(2019)
Business development is vital for every organisation that intends to grow. It proceeds through expansion by organic and inorganic means. There are also many innovative business styles that help organisations expand. This thesis shows how an engineering services organisation chose its form of business expansion.
The following thesis explains how an engineering service sector company uses its expertise to expand its business into the consultancy market, demonstrated with a real-life executed business model.
The thesis provides a solution for the following issues:
1) What is the best in-house strategy to be developed for business expansion in the service industry?
2) How did niche-market experience help with business expansion?
Machine learning models for time series have always been a special topic of interest due to their unique data structure. Recently, the introduction of attention improved the capabilities of recurrent neural networks and transformers with respect to learning tasks such as machine translation. However, these models are usually subsymbolic architectures, making their inner workings hard to interpret without comprehensive tools. In contrast, interpretable models such as learning vector quantization are more transparent in their decision process. This thesis tries to merge attention as a machine learning function with learning vector quantization to better handle time series data. A design for such a model is proposed and tested with a dataset used in connection with attention-based transformers. Although the proposed model did not yield the expected results, this work outlines improvements for further research on this approach.
This work concentrates on the frequently used marketing instrument of brand personality. Its effect on the consumer, and how it drives consumer behaviour through TV advertising, are the focus. Scientific material drawing on research results of the last 20 years has been analysed to investigate this subject. Furthermore, the example of Southern Comfort provides an insight into brand personality as applied in the real world of marketing business.
Analysis of Continuous Learning Strategies at the Example of Replay-Based Text Classification
(2023)
Continuous learning is a research field that has grown significantly in recent years due to highly complex machine and deep learning models. Whereas static models need to be retrained entirely from scratch when new data become available, continuous models progressively adapt to new data, saving computational resources. In this context, this work analyzes parameters impacting replay-based continuous learning approaches at the example of a data-incremental text classification task using an MLP and an LSTM. Generally, it was found that replay improves the results compared to naive approaches but does not reach the performance of a static model. Mainly, performance increased with more replayed examples, and the number of training iterations has a significant influence, as it can partly control the stability-plasticity trade-off. In contrast, balancing the buffer and the strategy for selecting examples to store in the replay buffer were found to have a minor impact on the results in the present case.
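A replay buffer of the kind analyzed above can be sketched with reservoir sampling, one common selection strategy; the capacity, seed and (text, label) example format here are illustrative assumptions, not the thesis setup:

```python
import random

class ReplayBuffer:
    """Bounded replay memory for continual learning: stores a uniform
    random subset of all examples seen so far (reservoir sampling),
    so old increments can be mixed into each new training increment."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Replace a stored item with probability capacity/seen,
            # keeping every example seen so far equally likely to be kept.
            idx = self.rng.randrange(self.seen)
            if idx < self.capacity:
                self.buffer[idx] = example

    def sample(self, k):
        """Draw up to k stored examples to replay alongside new data."""
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))
```

During training, each batch of new examples would be concatenated with `buffer.sample(k)` before the gradient step, which is the mechanism whose parameters (buffer size, number of replayed examples, iterations) the thesis varies.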
With globalization and the increasing diversity of the workforce, organizations are faced with the challenge of effectively managing multicultural teams. Understanding how employee engagement and job satisfaction are influenced by multicultural factors is crucial for organizations to create inclusive work environments that foster productivity and wellbeing. This literature review aims to explore the relationship between employee engagement, job satisfaction, and multicultural workplaces. It examines relevant studies and provides insights into the key factors, challenges, and strategies for enhancing employee engagement and job satisfaction in multicultural workplaces. The findings will shed light upon the author's research area on the factors influencing employee engagement and job satisfaction in multicultural work environments and contribute to a deeper understanding of cross-cultural dynamics in the workplace.
Aspects of Mindful Leadership Upon the Psychological Health of Employees in an Intercultural Context
(2023)
Across the globe, organizations are in the midst of rapid transformation: immigration, digitalization and the push for sustainability, to name just a few drivers. Organizational structures are being pushed toward more agility, co-opetition, integration, and tenable, resilient workplaces. The social structures of companies are being reformed, and the weight of cooperation and integration lies upon leaders and employees. But what psychological effects does this weight of integration have on the engagement of migrant and domestic employees at work? What role does leadership style play in mental health and engagement in the cross-cultural workplace? Previous work has shown the importance of workplace integration; however, the impact on the mental health of domestic employees needs more attention from scholars in this new context. The object of the research is to define the connection between mindful leadership and the psychological health of employees within a cross-cultural workplace, and to develop strategies to improve workplace engagement.
Stability of control systems is one of the central subjects in control theory. The classical asymptotic stability theorem states that the norm of the residual between the state trajectory and the equilibrium tends to zero in the limit. Unfortunately, it does not in general allow computing a concrete rate of convergence, particularly due to algorithmic uncertainty related to the numerical imperfections of floating-point arithmetic. This work proposes to revisit asymptotic stability theory with the aim of computing convergence rates using constructive analysis, a mathematical tool that realizes an equivalence between certain theorems and computation algorithms. Consequently, it also offers a framework which allows controlling numerical imperfections in a coherent and formal way. The overall goal of the current study also matches the trend of introducing formal verification tools into control theory. Besides existing approaches, constructive analysis, as suggested within this work, can also be considered for the formal verification of control systems. A computational example is provided that demonstrates the extraction of a convergence certificate for example dynamical systems.
The present work analyzes the critical success factors for the approval of European industrial products in India, based on a product developed and produced in Europe for the Indian rolling stock industry. The key topics considered in detail cover: Which standards are currently officially used for the approval process in India and in Europe? Capturing the current situation. Comparison of the technical approval standards between IR (Indian Railway) standards and
In machine learning, Learning Vector Quantization (LVQ) is well known as a supervised learning method. LVQ has been studied to generate optimal reference vectors because of its simple and fast learning algorithm [12]. In many classification tasks, different variants of LVQ are considered when training a model. In this thesis, two variants of LVQ, Generalized Matrix Learning Vector Quantization (GMLVQ) and Generalized Tangent Learning Vector Quantization (GTLVQ), are discussed. Subsequently, a transfer learning technique for the different LVQ variants is implemented and visualized, and the results are compared using different datasets.
Path decomposition of a graph has received a considerable amount of interest over the past decades because of its applications in algorithmic graph theory and in real-life problems. For the computation of a path decomposition of small width, different heuristic approaches are used; one of the most useful methods is by Bodlaender and Kloks. In this thesis, we focus on the computation, applications, transformation and approximation of path decompositions of small width.
It is easy to convert a path decomposition into a nice path decomposition of the same width, which is more convenient for finding graph parameters such as independent sets, chromatic polynomials, etc. Inspired by [28], we derive an algorithm to compute the chromatic polynomial of a graph via a nice path decomposition of small width.
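For reference, the chromatic polynomial P(G, k) being computed counts the proper k-colourings of G. The brute-force sketch below merely defines this quantity; it is exponential in the number of vertices, whereas the nice-path-decomposition algorithm described above computes the same values efficiently on graphs of small pathwidth:

```python
from itertools import product

def chromatic_polynomial(n, edges, k):
    """Evaluate the chromatic polynomial P(G, k) of a graph with
    vertices 0..n-1 by counting proper k-colourings, i.e. assignments
    of one of k colours per vertex with distinct colours across every
    edge. Exponential-time reference implementation for small graphs."""
    return sum(
        1
        for colouring in product(range(k), repeat=n)
        if all(colouring[u] != colouring[v] for u, v in edges)
    )
```

For example, the triangle K3 has P(K3, k) = k(k-1)(k-2), giving 6 proper 3-colourings, and the path on three vertices has P = k(k-1)^2, giving 12.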
Prototype-based classification methods like Generalized Matrix Learning Vector Quantization (GMLVQ) are simple and easy to implement. An appropriate choice of the activation function plays an important role in the performance of (deep) multilayer perceptrons (MLP), which rely on a non-linearity for classification and regression learning. In this thesis, successful candidates for non-linear activation functions known from MLPs are investigated for application in GMLVQ to realize a non-linear mapping. The influence of the non-linear activation functions on the performance of the model with respect to accuracy and convergence rate is analyzed, and the experimental results are documented.
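The place where such an activation function enters GLVQ-type cost functions can be sketched as follows; tanh is used here as one illustrative MLP-style candidate, and the thesis' actual activation choices may differ:

```python
import math

def glvq_cost(d_plus, d_minus, activation=math.tanh):
    """Per-sample classifier cost in GLVQ-type learning: d_plus is the
    (squared) distance to the closest prototype of the correct class,
    d_minus to the closest prototype of any other class. The relative
    difference mu lies in [-1, 1]; mu < 0 means correct classification.
    A monotone non-linear activation squashes mu and thereby shapes
    the gradients during training."""
    mu = (d_plus - d_minus) / (d_plus + d_minus)
    return activation(mu)
```

A correctly classified sample (d_plus < d_minus) yields a negative cost, a misclassified one a positive cost, and the steepness of the activation around mu = 0 controls how strongly borderline samples drive the prototype updates.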
This study presents an analysis of the coverage by the newspapers El País (Spain), Folha de S. Paulo (Brazil) and Süddeutsche Zeitung (Germany) of the protests in Brazil against the 2013 Confederations Cup and the 2014 FIFA World Cup, in order to compare them and see which topics were emphasized by the newspapers and which tone they used in their reporting. Based on the research questions, four categories were developed for the analysis: article structure, topic of the article, actors/groups of persons, and tone of the reporting, each comprising several subcategories. It was concluded that the themes highlighted by the European newspapers were different from those stressed in the Brazilian daily. Nonetheless, all the reviewed newspapers covered the protests in a neutral manner.
Going green, environmental protection, eco-friendliness, sustainability and sustainable development have become frequent terms in everyone's life. The negative impact of human activities, causing increased environmental pollution and decline, is a matter of dire concern nowadays, and in the last few decades greater attention has been paid to these issues. Understanding society's new concerns, more and more companies have begun to modify their behaviour toward a more eco-friendly and responsible one. Green marketing is an emerging area of interest and a tool of modern marketing used by companies in various industries. It is a full-service marketing strategy that includes green marketing plan development, sustainable auditing and planning, branding, design, and communication. An effective, authentic and transparent green presentation provides a company with a chance to successfully establish itself on the market, communicate core company values and build long-term customer relations. The young and innovative company SWOX Surf Protection, which entered the market with a long-lasting waterproof sunscreen particularly designed for surfers and snowboarders, wants to foster growth by expanding its existing target group to a broader segment comprising all outdoor activists. Moreover, the brand strives to become the leading sunscreen manufacturer for outdoor sports and wants to position itself as a lifestyle brand. In 2016 the company started to produce “greener” sunscreen tubes, with an imminent launch at hand. Since surfers, snowboarders and outdoor activists in particular are in close contact with nature and spend a lot of time in the sun, it is assumed that they have a particular health-related interest in using sunscreen, while at the same time showing increased commitment towards environmental protection. In this context, it is assumed that a holistic green and organic sunscreen could provide added value.
This paper intends to examine whether green marketing could be a relevant strategy for SWOX Surf Protection to differentiate itself from its competitors, attract potential customers, build long-term customer relations and, as a result, position itself as a successful sunscreen lifestyle brand in the market. This will be verified through a comprehensive literature review and detailed market research.
This bachelor thesis deals with the cis-trans isomerization of Xaa-Pro bonds (Xaa = any amino acid), its quantitative acquisition and the selection of 3D structural information for prediction with a support vector machine (SVM). The quantitative occurrence of cis, trans and cis/trans conformations in membrane proteins is examined and evaluated. The 3D structural information comprises 12 features describing the amino acids around each proline, including the proline itself: the inside/outside classification, the real secondary structure, energy considerations, as well as five further amino acid occurrence properties within a defined radius of the proline. From this information, a data set was created for the SVM, which is used to predict unknown and known Xaa-Pro isomerisms. The methods for the analysis were implemented in the platform-independent programming language Java. Two programs emerged from this work: Xaa-PIPT for the quantitative detection and extraction of structural information, and mXaa-PIPT for the pure prediction of Xaa-Pro isomerism in protein structures. 389 membrane proteins from the PDB (Protein Data Bank) served as the data basis. The data were also statistically analysed and evaluated.
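The prediction workflow described above can be sketched with a generic SVM setup. Everything below is a hypothetical placeholder — random feature vectors standing in for the 12 structural features, not the thesis's actual Xaa-PIPT data or parameters:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: one row per Xaa-Pro site, 12 structural
# features (e.g. inside/outside flag, secondary structure, energy terms).
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 12))
# Toy labels: 1 = cis, 0 = trans, correlated with the first feature.
y = (X[:, 0] + 0.3 * rng.normal(size=400) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

With a real data set, the feature columns would be the extracted structural descriptors and the labels the observed cis/trans conformations.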
RNA tertiary contact interactions between RNA tetraloops and their receptors stabilize the folding of ribosomal RNA and support the maturation of the ribosome. Here we use FRET-assisted structure prediction to develop structural models of two ribosomal tertiary contacts, one consisting of a kissing loop (KL) and a GAAA tetraloop and one consisting of the tetraloop receptor (TLR) and a GAAA tetraloop. We build bound and unbound states of the ribosomal contacts de novo, label the RNA in silico and compute FRET histograms based on molecular dynamics (MD) simulations and accessible contact volume (ACV) calculations. The predicted mean FRET efficiencies from the MD simulations and ACV determination agree for the KL-TLGAAA construct. The KL construct showed too high a FRET efficiency and artificial dye behavior, which requires further investigation of the model. In the case of the TLR, the importance of correct dye and construct parameters in the modeling was shown, which likewise calls for renewed modeling. This hybrid approach of experiment and simulation will promote the elucidation of dynamic RNA tertiary contacts and accelerate the discovery of novel RNA interactions as potential future drug targets.
Long-range tertiary interactions between RNA tetraloops and their receptors stabilize the folding of ribosomal RNA and support the maturation of the ribosome. Here, we use FRET-assisted structure prediction to develop a structural model of the GAAA tetraloop receptor (TLR) interaction and its dynamics. We build the docked TLR de novo, label the RNA in silico and compute FRET histograms based on MD simulations. The predicted mean FRET efficiency is remarkably consistent with single-molecule experiments of the docked tetraloop. This hybrid approach of experiment and simulation will promote the elucidation of dynamic RNA tertiary contacts and accelerate the discovery of novel RNA and RNA-protein interactions as potential future drug targets.
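The mean FRET efficiency computed from such simulations follows the Förster relation E = 1 / (1 + (R/R0)^6), averaged over the sampled dye-pair distances. The sketch below uses hypothetical Gaussian distance distributions and an assumed Förster radius, not the actual MD/ACV data:

```python
import numpy as np

def mean_fret_efficiency(distances_nm, r0_nm=5.4):
    """Average Förster transfer efficiency over sampled dye-pair distances."""
    e = 1.0 / (1.0 + (np.asarray(distances_nm) / r0_nm) ** 6)
    return e.mean()

# Hypothetical dye-pair distances (nm), e.g. sampled from MD frames / ACV clouds.
rng = np.random.default_rng(1)
docked = rng.normal(4.0, 0.3, size=5000)    # compact, docked state
undocked = rng.normal(7.0, 0.5, size=5000)  # extended, undocked state
print(f"E(docked)   = {mean_fret_efficiency(docked):.2f}")
print(f"E(undocked) = {mean_fret_efficiency(undocked):.2f}")
```

The docked (compact) state yields a markedly higher mean efficiency than the undocked (extended) state, which is the signature compared against single-molecule histograms.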
The voluntary international blog VaultingNews has existed for two years now. In the meantime, the team has grown and costs have increased. This thesis is a collection of tools which can help to improve communication among team members who are spread all over the world, and it introduces monetization ideas with a focus on establishing an online fan shop based in Germany. It concludes with a checklist of the laws that have to be observed.
This paper set out to determine the effect of daily internet usage on attention span and whether this had an effect on academic performance. As described briefly in the introduction, this paper consisted of laying the groundwork by defining the relevant terminology, applying the methodology to the hypotheses and making conclusive statements.
Two hypotheses were presented to give the paper its aim. While Hypothesis 1 can be proven true through the two-step terminology applied, Hypothesis 2 does not stand up to scrutiny. For lack of sufficient and specific evidence, the only conclusive statement that can be made regarding it is that it is untrue.
Approximately 80% of the population sample analysed were between the ages of 19 and 30, which automatically restricts the analysis, extrapolations and scientific statements to a more specific age group. Almost all other ages represented were above this range, meaning that the findings cannot accurately be applied to older age groups.
Nonetheless, the data collected were accurate and could be applied to prove Hypothesis 1, meaning that daily internet usage breeds and invites a short attention span. The physical, social and mental factors that, along with motivation, make up an individual's academic performance could not be taken into consideration for lack of a fitting data collection method.
In conclusion, the author predicts that with ever-present internet connections, coupled with the growing popularity of digital technology, attention spans will continue to stay as short as they are. Individuals will find ways to direct their short attention span where it is needed and apply it as necessary.
Crowd-Powered Medical Diagnosis : The Potential of Crowdsourcing for Patients with Rare Diseases
(2023)
With the recent rise in medical crowdsourcing platforms, patients with chronic illnesses increasingly broadcast their medical records to obtain an explanation for their complex health conditions. By providing access to a vast pool of diverse medical knowledge, crowdsourcing platforms have the potential to change the way patients receive a medical diagnosis. We developed a conceptual model that details a set of variables. To further the understanding of crowdsourcing as an emerging phenomenon in health care, we provide a contextualization of the various factors that drive participants to exert effort. For this purpose, we used CrowdMed.com as a platform from which we gathered and examined a unique dataset that involves tasks of diagnosing rare medical conditions. By promoting crowdsourcing as a robust and non-discriminatory alternative to seeking help from traditional physicians, we contribute to the acceptance and adoption of crowdsourcing services in health economics.
The H.323 umbrella standard describes audiovisual communication over packet-switched networks. This thesis illustrates the standard in detail with regard to architecture and implementation. The second part of the thesis is dedicated to examining the Gmail Voice and Video plug-in, an Internet-based audiovisual communication platform. In the course of this thesis, a secured kiosk environment for the Gmail Voice and Video plug-in is developed.
The intention of this thesis is to examine the beneficial impact of renewable energies in general, and biogas technologies in particular, on the socioeconomic status of people, by considering all applicable factors affecting their development, including political, cultural, environmental and institutional ones. As energy and development are closely interrelated, biogas technologies figure prominently as part of a decentralized, sustainable, renewable energy network, especially in rural areas of Nepal.
We report on our recent progress in creating a new type of compact laser that uses thulium-based fiber CPA technology to emit a central wavelength of 2 μm. This laser can produce pulse energies of >100 μJ and an average power of >15 W. It is designed to be long-lasting and is built for industrial use, making it a great fit for integration into laser machines used for materials processing. These laser parameters are ideal for working with semiconductors like silicon, allowing for tasks such as micro-welding, cutting of filaments, dicing, bonding and more.
We demonstrate a thulium-based fiber amplifier delivering pulses tunable between <120 fs and 2 ps in duration at up to 228 μJ of pulse energy, at a center wavelength of 1940 nm and a 500 kHz repetition rate. Due to its excellent long-term stability, this system proves the ability of this technology to be integrated into ultrafast material-processing machines.
Cancer is one of the main causes of death in developed countries, and cancer treatment heavily depends on successful early detection and diagnosis. Tumor biomarkers are helpful for early diagnosis. The goal of this discovery method is to identify genetic variations, as well as changes in gene expression or activity, that can be linked to a typical cancer state.
First, several cancer gene signaling pathways were introduced and then combined, and 27 candidate genes were selected. Through the analysis of several data sets in the GEO database, a number of expression difference matrices were established. The candidate genes were tested against these matrices, and five genes (PLA1A, MMP14, CCND1, BIRC5 and MYC) were found to have the potential to be tumor biomarkers. Two of these genes are discussed further: PLA1A is a potential biomarker for prostate cancer, and MMP14 can be considered a biomarker for non-small-cell lung cancer.
Finally, the significance of this study and the potential value of the two genes are discussed, and an outlook on future research in this direction is given.
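A common first screening step of the kind described, selecting candidates by expression difference, can be sketched as a log2 fold-change filter. The expression values below are invented for illustration and are not taken from the GEO matrices used in the study:

```python
import numpy as np

genes = ["PLA1A", "MMP14", "CCND1", "BIRC5", "MYC", "GENE_X"]
# Hypothetical mean expression per gene in tumor vs. normal tissue,
# standing in for rows of an expression difference matrix.
tumor  = np.array([8.1, 9.4, 8.5, 8.8, 9.0, 5.1])
normal = np.array([5.0, 5.2, 5.5, 5.1, 5.6, 5.0])

log2fc = np.log2(tumor / normal)
# Keep genes changed by at least ~1.5-fold (|log2 FC| >= 0.58).
candidates = [g for g, fc in zip(genes, log2fc) if abs(fc) >= 0.58]
print(candidates)  # → ['PLA1A', 'MMP14', 'CCND1', 'BIRC5', 'MYC']
```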
Not available.
Over the last two decades, rapid advances in digitization methods have put us on the cusp of the fourth industrial era. It is an era of connectivity and interactivity between various industrial processes that need a new, trusted environment to exchange and share information and data without relying on third parties. Blockchain technologies can provide such a trusted environment. This paper focuses on utilizing the blockchain and its characteristics to build machine-to-machine (M2M) communication and digital twin solutions. We propose a conceptual design for a system that uses smart contracts to construct digital twins for machines and products and executes manufacturing processes inside the blockchain. Our solution also employs the decentralized identifiers (DIDs) standard to provide self-sovereign digital identities for machines and products. To validate the approach and demonstrate its applicability, the paper presents an actual implementation of the proposed design for a simulated case study carried out with the help of a Fischertechnik factory model.
This paper explores the origins of Maori images in New Zealand film history. Discussing the history of the Maori and their society brings us closer to a once almost extinct people and its struggle for self-representation and self-governance. An in-depth look at New Zealand's film history shows how the Maori were the subject of the earliest films and when they started making their own films. Combining these elements gives us the opportunity to understand how early images of the Maori were created by Pakeha directors. Looking at different films throughout film history shows how Maori images evolved over time, especially once the Maori started depicting themselves. This paper not only answers questions about Maori images in film but also tries to make people realise what odds the Maori had to overcome in their daily struggle for self-determination.
Not available.
As part of the research project Trusted Blockchains for the Open, Smart Energy Grid of the Future (tbiEnergy), one of the objectives is to investigate how a holistic blockchain approach for the realization of a local energy market could be accomplished and how corresponding hardware security mechanisms can be integrated. This paper provides an overview of the implemented prototype and describes the system and its processes.
Vicia faba leaves and calli were transformed using CRISPR/Cas ribonucleoproteins (RNPs). Two kinds of CPP-fused SpyCas9 were used with sgRNA7, sgRNA5 or sgRNA13, targeting PDS exon 1, PDS exon 2 or MgCh exon 3, respectively. RNPs were applied using high-pressure spraying, biolistic delivery, incubation in RNP solution and infiltration of leaf tissue. A PCR- and restriction-enzyme-based approach was used for the detection of mutations. Screening of 679 E. coli colonies containing the cloned fragments resulted in the detection of 14 mutations, most of them deletions of 150, 500 or 730 bp. Five of the 14 mutations were point mutations located two to three bp upstream of the PAM.
Functions which can be summarized under the keyword Internet Protocol Television (IPTV) describe the transmission of video services to users via the Internet Protocol (IP). Accompanying this new television transmission path, Home Theatre PCs (HTPCs) running so-called Media Center platforms are increasingly entering living rooms as companions for the popular LCD and plasma displays. Ease of use and visual integration on the screen as well as into the living room are raising their acceptance. These HTPCs are a central node for multimedia services such as TV, radio and email within the networked household. Thus, there are good preconditions for using an HTPC as an end device for Telco-operator-driven IPTV and telecommunication services. In the context of this diploma thesis, possibilities for the provisioning of IPTV and Next Generation Network (NGN) services on a converged multimedia home entertainment platform for the living room are investigated, especially Vista Media Center platforms. For this reason, standardization activities which deal with the integration of IPTV and telecommunication services into NGNs are examined. The results are validated by the design and implementation of a Vista Media Center Add-In, which can be integrated as an IP Multimedia Subsystem (IMS) based User Agent (UA) into ETSI TISPAN Release 2 IPTV infrastructures. Additionally, a cross-domain messaging service for IMS-based UAs is created, which enables cross-network communication between users.
Procurement processes are deemed to lack supporting digital technologies that raise efficiency and automation.
Blockchain solutions are piloted in procurement in order to offer a decentralized IT infrastructure covering these needs. This paper aims at identifying current blockchain approaches in the field of procurement and presenting affected business processes. In order to get an overview of the current state of the art, a systematic literature mapping is conducted.
Moreover, the outcomes are gathered and categorized in a classification scheme. Based on the analysis, systematic maps are presented to showcase relevant findings. Within the findings, several blockchain use cases in the field of procurement are identified, and information about addressed challenges, utilized blockchain frameworks and affected business processes is extracted.
Where does the cocoa which we consume on a regular basis come from? Supply chains are not always transparent, much less easily comprehensible. The cocoa industry faces ongoing challenges, whether it be the chocolate manufacturers' promise to maintain a sustainable and ethical supply chain, minimizing the impact on the environment, or maximizing adherence to human rights in the production process. This paper reviews important steps which lead to compliance with UN standards and questions the role of consumers in the construct of ethical chocolate products.
The cultivation of mammalian cells in the third dimension has great potential for wide application in regenerative medicine, the pharmaceutical industry and cancer research. This thesis gives an overview of current 3-D cultivation techniques such as hydrogels and porous scaffolds, as well as their various materials and modifications. Different products and their implementation for a new application of 3-D cell culture in a laboratory are also described.
Analysis of the Forensic Preparation of Biometric Facial Features for Digital User Authentication
(2023)
Biometrics has become a popular method of securing access to data, as it eliminates the need for users to remember a password. Although attempts to exploit the vulnerabilities of biometric systems have increased with their usage, these vulnerabilities can also be helpful during criminal casework.
This thesis aims to evaluate approaches to bypass electronic devices with forged faces to access data for law enforcement. Here, obtaining the necessary data in a timely manner is critical. However, unlocking the devices with a password can take several years with a brute force attack. Consequently, biometrics could be a quicker alternative for unlocking.
Various approaches were examined to bypass current face recognition technologies. The first approaches included printing the user's face on regular paper and aimed to unlock devices performing face recognition in the visible spectrum. Further approaches consisted of printing the user's infrared image and creating three-dimensional masks to bypass devices performing face recognition in the near-infrared. Additionally, the underlying software responsible for face recognition was reverse-engineered to get information about its operation mode.
The experiments demonstrate that forged faces can partly bypass face recognition and obtain secured data. Devices performing face recognition in the visible spectrum can be unlocked with a printed image of the user's face. Regarding devices with advanced near-infrared face recognition, only one could be bypassed with a three-dimensional face mask. In addition, its underlying software provided evidence about the demands of face recognition. Other devices under attack remained locked, and their software provided no clues.
Introducing natural adversarial observations to a Deep Reinforcement Learning agent for Atari Games
(2021)
Deep Learning methods are known to be vulnerable to adversarial attacks. Since Deep Reinforcement Learning agents are based on these methods, they are prone to tiny input data changes. Three methods for adversarial example generation will be introduced and applied to agents trained to play Atari games. The attacks target either single inputs or can be applied universally to all possible inputs of the agents. They were able to successfully shift the predictions towards a single action or to lower the agent’s confidence in certain actions, respectively. All proposed methods had a severe impact on the agent’s performance while producing invisible adversarial perturbations. Since natural-looking adversarial observations should be completely hidden from a human evaluator, the negative impact on the performance of the agents should additionally be undetectable. Several variants of the proposed methods were tested to fulfil all posed criteria. Overall, seven generated observations for two of three Atari games are classified as natural-looking adversarial observations.
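The single-input attacks mentioned can be illustrated with an FGSM-style perturbation (a standard gradient-sign attack) against a toy linear policy; the policy, observation size and epsilon below are hypothetical stand-ins, not the paper's Atari agents:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy linear "policy": 16-dimensional observation -> 4 action logits.
W = rng.normal(size=(4, 16))

def logits(obs):
    return W @ obs

obs = rng.normal(size=16)
target = 2  # the action the attacker wants to promote

# For a linear policy the gradient of the target logit w.r.t. the
# observation is simply W[target]; FGSM steps along its sign.
eps = 0.05
adv_obs = obs + eps * np.sign(W[target])

print(f"target logit: {logits(obs)[target]:.2f} -> {logits(adv_obs)[target]:.2f}")
```

Because the perturbation is bounded by epsilon per feature, it stays visually negligible while still shifting the targeted action logit; iterating such bounded steps is the usual way to obtain stronger attacks.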
Social media platforms play an increasing role in marketing, politics and police affairs, because they can strongly influence opinions. So-called "opinion leaders" exert their influence in a given network and shape the opinions of other users. Identifying central nodes in a social graph has been of interest for decades. However, not all centrality measures were developed for social media platforms: they were built for social graphs that did not include additional metrics (e.g. "likes" or "shares"). Nevertheless, these metrics play a crucial role on modern platforms. Hence, outdated measures need to be adjusted and additional metrics need to be integrated to ensure the best possible results.
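One simple way to fold such platform metrics into a classical measure is to blend in-degree centrality with a normalized engagement score; the graph, metric and weighting below are illustrative assumptions, not a method proposed in the thesis:

```python
from collections import defaultdict

# Hypothetical follower graph: edge (a, b) means a follows b.
edges = [("ana", "bob"), ("cat", "bob"), ("dan", "bob"),
         ("bob", "eve"), ("ana", "eve")]
likes = {"ana": 10, "bob": 250, "cat": 5, "dan": 8, "eve": 40}  # engagement metric

in_degree = defaultdict(int)
for _, followed in edges:
    in_degree[followed] += 1

# Blend structural centrality with engagement; the 2.0 weight is a design choice.
max_likes = max(likes.values())
score = {u: in_degree[u] + 2.0 * likes[u] / max_likes for u in likes}
ranked = sorted(score, key=score.get, reverse=True)
print(ranked[0])  # → bob
```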
Blockchain and other distributed ledger technologies are evolving into enabling infrastructures for innovative ICT solutions. Numerous features, such as decentralization, programmability, and immutability of data, have led to a multitude of use cases that range from cryptocurrencies and tracking and tracing to automated business protocols and decentralized autonomous systems. For organizations that seek blockchain adoption, the overwhelming spectrum of potential application areas requires guidance that reduces complexity and supports the development of blockchain-based concepts. This paper introduces a classification approach to provide design and implementation guidance that goes beyond current textbook classifications. As an outcome, a typology for management and business architects is developed, before the paper concludes with an instantiation of existing use cases and a discussion of their classes.
Not available.
Proteins are macromolecules that consist of linearly bonded amino acids. They are essential elements in various metabolic processes. The three-dimensional structure of a protein is determined by the order of amino acids, also referred to as the protein sequence. This conformation corresponds to the structural state in which the protein is functionally active. However, the relationships between protein sequence, structure and function have not been fully understood yet. Additionally, information about structural properties, or even the entire protein structure, is crucial for understanding the dynamics that define protein functionality and mechanisms. From this, the role of a protein in its molecular context can be described more closely. For instance, interactions can be investigated and comprehended as a biological dynamic network that is sensitive to alterations, i.e. changes which are caused by diseases. Such knowledge can aid in drug design, where compounds need to be specifically tailored and adjusted to their molecular targets. Protein energy profile-based methods can be applied to investigate protein structures with respect to dynamics and alterations. The publications enclosed in this work discuss the general scientific potential of energy profile-based techniques and algorithms. On the one hand, changes in stability caused by protein mutations and protein-ligand interactions are discussed in the context of energy profiles. On the other hand, energetic relations to protein sequence, structure and function are elucidated in detail. Finally, the presented discussions focus on recent enhancements of the eProS (energy profile suite) database and toolbox. eProS freely provides all elucidated methodologies to the scientific community, so that biological questions can be addressed with the presented methods at hand. Additionally, eProS provides annotations linked to external databases, ensuring a broad view of biological data and information.
In particular, energetic characteristics can be identified which contribute to a protein’s structure and function.
Reputation is indispensable for online business, since it supports customers in their buying decisions and allows sellers to justify premium prices. While IS research has investigated reputation systems mainly as review systems on online platforms for business-to-consumer (B2C) transactions, no proper solutions have yet been developed for business-to-business (B2B) transactions. We use blockchain technology to propose a new class of reputation systems that apply ratings as voluntary bonus payments: before a transaction is performed, customers commit to pay a bonus that is granted if the service provider has performed the service properly. As opposed to rival reputation systems that build on cumulated ratings or reviews, our system enables monetized reputation mechanisms that are inextricably linked with online transactions. We expect this system class to provide more trustworthy ratings, which might reduce agency costs and help quality providers establish a reputation towards new customers.
The target of this diploma thesis is the development of a thermal simulation card to analyze the thermal behavior of an LTE PCIe Mini data card for GSM/UMTS-based wireless networks in different environments. The power consumption of modern wireless communication systems has increased dramatically during the last years. Especially for the next generation of wireless modem cards, the thermal dissipation will be slightly at or even beyond the official guidelines for the components and the whole card. To gain knowledge about the behavior of the data card, it shall be simulated in software as well as with real hardware. As the ASIC components are not available yet, a hardware emulation shall be developed. The thesis covers the whole development process from the idea, the conception and the layout to the assembly and the measurements. It starts with finding a way of emulating the mounted components, measuring and powering them. Afterwards a card incorporating the principles found before will be developed. An additional software simulation gives comparative values against the measurements. After assembling the emulation cards and running reference measurements, trials for temperature improvements will be run and compared with the simulations.
This study shows the potential of the make-or-buy theory in several scenarios: production, assembly and development. The evaluation of these possibilities is conducted based on Bosch's core competencies. A decision model is developed to support the decision-making process. Based on these results, the serial production at RBAC in China is planned and suggestions for setting up the assembly line are given.
This feasibility study shows possibilities for how logistical concepts can be improved or reorganized. For this purpose, the assembly line for hydraulic blocks at Bosch Rexroth Changzhou is examined and new ideas are presented. To ensure comparability, three different cases are considered. Based on this evaluation, recommendations for further development are given.
This work emphasises the synergy between anthropological research on human skeletal remains and suitable documentation strategies. Highlighting the significance of data recording and the use of digital databases in various aspects of anthropological work on bones, including scientific standards, skeletal collections, analysis of research results, ethical considerations, and curation, it provides a comprehensive examination of these topics to demonstrate the value of investing time and resources in this field, countering the existing lack of funding that has led to significant deficiencies. Additionally, the paper outlines the requirements and challenges associated with standard data protocoling and suggests that digital data management frameworks and technologies such as ontologies and semantic web technologies for anthropological information should be a central focus in developing solutions.
This Bachelor thesis investigates the learning rules of the Hebbian, Oja and BCM neuron models for their convergence to, and the stability of, the fixed points. Existing research is presented in a structured manner using consistent notation. Hebbian learning is neither convergent nor stable. Oja learning converges to a stable fixed point, which is the eigenvector corresponding to the largest eigenvalue of the covariance matrix of the input data. BCM learning converges to a fixed point which is stable, when assuming a discrete distribution of orthogonal inputs that occur with equal probability. Hebbian learning can therefore not be used in further applications, where convergence to a stable fixed point is required. Furthermore, this Bachelor thesis came to the conclusion that determining the fixed points of the BCM learning rule explicitly involves extensive calculation and other methods for verifying the stability of possible fixed points should be considered.
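Oja's convergence result is easy to reproduce numerically. The sketch below runs the learning rule on synthetic two-dimensional inputs and checks that the weight vector aligns with the principal eigenvector of the covariance matrix (the covariance matrix and learning rate are illustrative choices, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
# Zero-mean inputs with a dominant variance direction along (1, 1)/sqrt(2).
C = np.array([[3.0, 2.0], [2.0, 3.0]])  # covariance; top eigenvector (1, 1)/√2
X = rng.multivariate_normal([0.0, 0.0], C, size=20000)

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)  # Oja's rule: Hebbian term plus -y²w decay

top = np.linalg.eigh(C)[1][:, -1]  # analytic principal eigenvector
alignment = abs(w @ top) / np.linalg.norm(w)
print(f"|w| = {np.linalg.norm(w):.2f}, alignment with top eigenvector = {alignment:.3f}")
```

The weight norm settles near 1 and the alignment approaches 1, whereas the plain Hebbian rule (the same update without the -y²w decay term) lets |w| grow without bound.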
Tokenization projects are currently very prominent when it comes to new blockchain technologies. After explaining the fundamentals of cross-chain interaction, this bachelor thesis focuses on technology for tokenizing Bitcoin on Ethereum. To provide a more practical context, the implementation of the currently most successful decentralized tokenization project is described.
Mapping identities, digital assets, and people's profiles on the internet is gaining much traction in the blockchain cosmos lately. The new technology is currently forming architectures that will further pave new ways to reach fundamental mechanisms for interacting in a decentralized, user-centered manner. These schemes are often declared the next generation of the web. This article shows how the internet has evolved in managing identities, what problems have arisen, and how new data architectures help build applications on top of privacy rights. Both technological and ethical perspectives are considered to answer which guidelines should be followed to fulfill the upcoming branch of decentralized services and what we can learn from historical schemes regarding their privacy, accounting, and user data.
The financial world of blockchains is mostly covered by Bitcoin, taking up about 210 billion dollars in market cap. Despite the huge security and independence which the technology offers to its users, it is not easy to adapt it to upcoming applications due to the regulated infrastructure behind it. For small-scale transactions, everyday-use applications or access to a variety of crypto technologies and projects, Bitcoin is relatively limited in future development. Compatibility for most of those applications covers currencies from more development-driven blockchains like Ethereum. These want to reach the user base that already holds Bitcoin and offer it a seamless transition to new applications without the risk of losing funds. In this article, atomic swaps and tokenization are covered and current approaches are compared. Both mechanisms are used to fulfill this symbiosis between Bitcoin and Ethereum.
To provide a more practical view, an example of how to implement such a tokenization within an app is shown. This gives deeper insights and offers inspiration for digital-identity-based app development.
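The building block behind atomic swaps is the hashed timelock contract (HTLC): funds on each chain are locked under the same hash, and revealing the preimage on one chain lets the counterparty claim on the other. The toy Python model below shows only the hashlock/claim logic; the timelock refund path and the two actual chains are omitted:

```python
import hashlib
import secrets

class HTLC:
    """Toy hashed-timelock leg: claimable only with the hash preimage."""
    def __init__(self, hashlock: bytes):
        self.hashlock = hashlock
        self.claimed = False

    def claim(self, preimage: bytes) -> bool:
        if not self.claimed and hashlib.sha256(preimage).digest() == self.hashlock:
            self.claimed = True
            return True
        return False

secret = secrets.token_bytes(32)              # initiator's secret
lock = HTLC(hashlib.sha256(secret).digest())  # same hashlock used on both chains

assert not lock.claim(b"wrong-preimage")      # wrong preimage is rejected
assert lock.claim(secret)                     # revealing the secret claims the leg
print("swap leg claimed:", lock.claimed)
```

On a real pair of chains, the revealed preimage becomes public on claim, which is exactly what makes the second leg claimable and the swap atomic.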
The topic of soulbound, non-transferable tokens has attracted much interest within the blockchain space lately, as decentralized societies become more tangible with Web3 social media applications and DAOs. In this article, I outline how such tokens function, their problems for adoption and standardization, and how they differ from verifiable credentials in the SSI field. As such soulbound assets will likely rely on extended recovery and asset management schemes to become viable identities that safely gain reputation and trust, features like social recovery and contract-based accounting are incorporated. By combining these new technologies with the theoretical crypto-native identity construct, the paper gives an impression of the future user-centric data economy.
Traditional user management on the Internet has historically required individuals to give up control over their identities. In contrast, decentralized solutions promise to empower users and foster decentralized interactions. Over the last few years, the development of decentralized accounts and tokens has significantly increased, aiming at broader user adoption and shared social economies.
This thesis delves into smart contract standards and social infrastructure for Ethereum-based blockchains to enable identity-based data exchange between abstracted blockchain accounts. In this regard, the standardization landscapes of account and social token developments were analyzed in-depth to form guidelines that allow users to retain complete control over their data and grant access selectively.
Based on the evaluations, a pioneering Solidity standard is presented, natively integrating consensual restrictive on-chain assets for abstracted blockchain accounts. Further, the architecture of a decentralized messaging service has been defined to outline how new token and account concepts can be intertwined with efficient and minimal data-sharing principles to ensure security and privacy, while merging traditional server environments with global ledgers.
The primary objective of this work, carried out at the “Helmholtz-Zentrum für Umweltforschung”, was to gain a deeper understanding of the basic transformation processes in constructed wetlands, especially for nitrogen species. To this end, two different types of laboratory-scale model systems, run with two different artificial wastewaters, were observed for about four months. Data on three nitrogen species (ammonium, nitrate, nitrite), the physical condition of the pore water, and the carbon sources contained in the water were collected and compared. The present work provides a summary of the current knowledge of microbial processes in constructed wetlands and the general character of such constructions. It explains the different methods used to obtain the data, which are then discussed with the aid of the resulting graphs in the final argumentation.
Not available
In this master's thesis, we define a new bivariate polynomial, which we call the defensive alliance polynomial and denote by da(G; x; y). It is a generalization of the alliance polynomial and the strong alliance polynomial. We show the relation between da(G; x; y) and the alliance, strong alliance, and induced connected subgraph polynomials, as well as the cut vertex sets polynomial, and we investigate what information about G is encoded in da(G; x; y). We discuss the defensive alliance polynomial of path graphs, cycle graphs, star graphs, double star graphs, complete graphs, complete bipartite graphs, regular graphs, wheel graphs, open wheel graphs, friendship graphs, triangular book graphs, and quadrilateral book graphs, and we prove that these classes of graphs are characterized by their defensive alliance polynomials. We present the defensive alliance polynomial of the graph formed by attaching a vertex to a complete graph, and we exhibit two pairs of graphs that are not characterized by the alliance polynomial but are characterized by the defensive alliance polynomial.
Also, we present three notes on results in the literature: the first improves a bound, and the other two provide counterexamples.
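The objects counted by such alliance polynomials can be made concrete with a small brute-force sketch. Assuming the standard definition of a defensive alliance (a non-empty vertex set S in which every member has at least as many neighbours inside S, counting itself, as outside S), the following enumerates all defensive alliances of a small graph; the thesis's da(G; x; y) aggregates such sets, though its exact variable exponents are not reproduced here.

```python
from itertools import combinations

def defensive_alliances(vertices, edges):
    """Enumerate all defensive alliances of a small graph by brute force.

    A non-empty set S is a defensive alliance if every v in S has at
    least as many neighbours inside S (counting v itself) as outside S.
    Exponential in |V|; intended only to illustrate the definition.
    """
    nbrs = {v: set() for v in vertices}
    for u, w in edges:
        nbrs[u].add(w)
        nbrs[w].add(u)

    def is_alliance(S):
        return all(len(nbrs[v] & S) + 1 >= len(nbrs[v] - S) for v in S)

    found = []
    for k in range(1, len(vertices) + 1):
        for combo in combinations(vertices, k):
            S = set(combo)
            if is_alliance(S):
                found.append(S)
    return found
```

For the path on three vertices 1–2–3, for instance, the centre {2} alone is not a defensive alliance (one defender against two attackers), while each leaf alone is.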
The wind energy sector is undergoing digitalization processes that span, among others, the multi-tier supply chains of turbine components and wind farm maintenance. In an industrial use case involving Siemens Gamesa Renewable Energy, Vestas, and APQP4Wind, the processes of producing, fastening, and servicing bolts in turbines are mapped to a digital model. The model follows the lifetime of turbine bolts from the manufacturing phase, through fastening in turbines and maintenance, to their replacement and recycling. The development of the digital model is addressed iteratively in a design science research approach, as the authors actively contribute to the project. Distributed ledgers (DLs) support the notary documentation of the bolts and turbines, from their registration phase through the assembly, technical service verification, and recycling phases. The immutable and decentralized nature of DLs secures the data against tampering and prevents unilateral changes by engaging the service stakeholders and component providers in a blockchain consortium.
Many companies use machine learning techniques to support decision-making and to automate business processes by learning from the data they have. In this thesis, we investigate the theory behind the machine learning algorithms most widely used in practice for solving classification and regression problems.
In particular, the following algorithms were chosen for the classification problem: Logistic Regression, Decision Trees, Random Forest, Support Vector Machine (SVM), and Learning Vector Quantization (LVQ). For the regression problem, Decision Trees, Random Forest, and Gradient Boosted Trees were used. We then apply these algorithms to real company data and compare their performance and results.
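Of the classifiers listed, Learning Vector Quantization is the least commonly found in mainstream libraries, so a minimal sketch may help. This is a plain LVQ1 variant on hypothetical toy data, not the thesis's implementation or its company data: each training sample pulls its nearest prototype closer when the classes match and pushes it away otherwise.

```python
import math

def lvq1_train(samples, labels, prototypes, proto_labels,
               lr=0.1, epochs=20):
    """Minimal LVQ1 sketch: move the nearest prototype toward
    same-class samples and away from other-class samples."""
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # index of the nearest prototype by Euclidean distance
            i = min(range(len(prototypes)),
                    key=lambda j: math.dist(x, prototypes[j]))
            sign = 1.0 if proto_labels[i] == y else -1.0
            prototypes[i] = [p + sign * lr * (xk - p)
                             for p, xk in zip(prototypes[i], x)]
    return prototypes

def lvq_predict(x, prototypes, proto_labels):
    """Classify x by the label of its nearest prototype."""
    i = min(range(len(prototypes)),
            key=lambda j: math.dist(x, prototypes[j]))
    return proto_labels[i]
```

After training on two well-separated clusters, the prototypes settle near the cluster centres, and prediction reduces to a nearest-neighbour lookup against those few prototypes rather than the whole training set.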
In the following study, the properties of the superabsorbent polymer Broadleaf P4 were investigated with the aim of applying the polymer within constructed wetlands. Its application in constructed wetlands is intended to improve the removal of pesticides. To this end, the polymer was added to lab-scale wetland beds together with pumice, and these were compared to a control wetland filled with gravel. The wetlands were run for several weeks, during which the nutrient removal was recorded. The polymer was also tested for its capacity to adsorb the pesticides before they were added to the wetland beds.