Machine Learning Talks on Campus - Past events
Machine Learning Talks on Campus is an information service about talks, workshops and other events in the local community.
If you want to receive our news, please subscribe to our mailing list.
If you want to announce a talk for the following week, please send an email to machine-learning@uni-heidelberg.de by Wednesday night.
To view the table on a mobile device, please scroll to the right.
Past events
23.09.2024 - 27.09.2024 | IWR Summer School 2024 This year we are organizing an interdisciplinary summer school to equip early-career modeling scientists with the main concepts and methods of researching and implementing models for climate-sensitive infectious diseases, as well as their application to the converging global challenges: climate change and the spread of infectious diseases. Target Audience Prerequisites:
| IWR - Summer School | Mathematikon, Im Neuenheimer Feld 205, 69120 Heidelberg | ||||
09.09.2024 1:15 PM - 2:15 PM | Meyerhof Symposium 2024: "Biophysics in the age of AI" by Frank Noé (FU Berlin) | MPI for Medical Research | Max Planck Institute for Medical Research, Seminar room ABC, Jahnstr. 29, 69120 Heidelberg | ||||
Workshop on Avant-Garde AI Research Obviously, the development of artificially intelligent systems is not driven by one discipline alone, but rather by an interplay of several different sciences and disciplines. This is why we are delighted to present the Avenues of Intelligence 2024 workshop – a workshop showcasing some of the most intriguing academic branches currently working on artificial intelligence. Turning the Aesopian fable of the fox and the stork, who served each other food in dishes they couldn't access, on its head, we’ve designed this workshop to ensure that all talks are accessible to a broad academic audience. We will feature presentations on the latest developments and theories of intelligent systems from multiple perspectives: neuroscience, philosophy, computer science, psychology, engineering, and cognitive science. Interested? Register for the workshop at https://www.kip.uni-heidelberg.de/aoi2024/register and download the workshop program at https://antonio.bikic.io/documents/workshop_schedule_aoi_2024.pdf. | |||||||
Journalism and AI - how do the two get along with each other? The recent development of AI-based software to create texts, images or even videos has an increasing impact on the work of journalists. This talk will outline benefits as well as concerns and risks of this emerging technology for journalism. | |||||||
29.07.2024 4:00 PM | Efficient Hardware for Neural Networks The last decade has witnessed significant breakthroughs of deep neural networks (DNNs) in many fields. These breakthroughs have been achieved at extremely high computation and memory cost. Accordingly, the increasing complexity of DNNs has led to a quest for efficient hardware platforms. In this talk, class-aware pruning is first presented to reduce the number of multiply-and-accumulate (MAC) operations in DNNs. Class-exclusion early-exit is then examined to reveal the target class before the last layer is reached. To accelerate DNNs, digital accelerators such as the systolic array from Google can be used. Such an accelerator is composed of an array of processing elements to efficiently execute MAC operations in parallel. However, such accelerators suffer from high energy consumption. To reduce the energy consumption of MAC operations, we select quantized weight values with good power and timing characteristics. To reduce the energy consumption incurred by data movement, a logic design of neural networks is presented. In the end, ongoing research topics and future research plans will be summarized. Bio: Grace Li Zhang received the Dr.-Ing. degree from the Technical University of Munich (TUM) in 2018. She joined TU Darmstadt in 2022 as a Tenure Track Assistant Professor. She leads the Hardware for Artificial Intelligence Group. Her research focuses on efficient hardware acceleration for AI algorithms and systems, AI computing with emerging devices, e.g., RRAM and optical components, circuit and system design methodologies for AI, explainability of AI and neuromorphic computing. Three of her papers were nominated for the Best Paper Award at Design, Automation and Test in Europe (DATE) and one of her papers was nominated for the Best Paper Award at the Design Automation Conference (DAC). She received the Dr.-Wilhelmy-VDI-Preis in Germany and the Walter Gademann Preis at TU Munich for her outstanding dissertation in 2019.
| Holger Fröning | NF350, U014. | ||||
18.07.2024 7:30 PM | DEMYSTIFYING CHAT GPT Simon Ostermann How large language models work and how we can make them explainable. | Deutsches Forschungszentrum für künstliche Intelligenz Rotaract | INF 205 Hörsaal | ||||
18.07.2024 11 AM - 12:30 PM | Rudolf Mößbauer Colloquium "Understanding protein dynamics by combining simulation, experimental data, and machine learning" Cecilia Clementi (FU Berlin) | Max Planck Institute for Medical Research | Max Planck Institute for Medical Research, The talk will also be streamed live via Zoom. Registration is required at: | ||||
IWR Colloquium Summer Semester 2024 “Robust Optimization Approaches for PDE-constrained Optimization under Uncertainty” The consideration of uncertainty in optimization problems is an important aspect in many practical applications. In this talk we consider optimization problems with partial differential equation (PDE) constraints, where uncertainty occurs in parameters (for example material parameters, usage scenarios) as well as in the optimization variables (e.g. manufacturing tolerances). There exist several approaches to handle uncertainty in optimization, in particular stochastic optimization, probabilistic constraints and (distributionally) robust optimization. We will focus on robust and distributionally robust optimization approaches for problems with expensive PDE constraints. In robust optimization, relevant realizations of the uncertain variables are described by an uncertainty set and it is required that the optimization variables satisfy critical constraints for all realizations of the uncertain variables in the uncertainty set. In distributionally robust optimization one considers an ambiguity set of relevant distributions for uncertain parameters and requires that the constraints are satisfied in expectation for all distributions in the ambiguity set. The resulting robust formulation is a semi-infinite optimization problem and can be written in a min-max form. We present various techniques for approximating and reformulating the robust problem by using optimality conditions and duality theory in order to obtain a tractable problem. Depending on the approach, the reformulation contains conic constraints, complementarity conditions and conjugate functions. We will sketch suitable solution methods. Moreover, we will illustrate how adjoint techniques as well as model order reduction and error estimation can be used to solve these problems efficiently in the PDE-constrained case. We present several application examples, in particular the design optimization of electric machines under uncertainty. | |||||||
10.07.2024 4:00 PM | AI in Health Mini Seminar Series “Challenges and Opportunities of Big Datasets in AI for Healthcare” | Stadt Heidelberg, Do Tank Network for Wearable Technologies https://www.heidelberg.de/HD/Arbeiten+in+Heidelberg/living+lab+fuer+wearable+technologies.html | Pop-Up Living Lab, Kurfürsten-Anlage 3, 69115 Heidelberg
| ||||
Machine learning galore! Lab presentations Rocket science Manuel Brenner (Koppe lab) Peter Park (Kreshuk lab) To help plan the catering, please register for free by June 24th! | |||||||
27.06.2024 3:00 PM | Improving Atomistic Simulations With Machine Learning
Interatomic potentials are an integral part of atomistic simulations, describing the interactions between atoms and molecules. In recent years, machine learning has been used to improve the accuracy of interatomic potentials. I will describe the work I have carried out in this area, including how ML models can be improved through their architecture, training data and fitting methods. | HITS Frauke Gräter | Room: Alan Turing Zoom: Meeting ID: 914 3032 9214
| ||||
"Meet Your Scientific Hero Seminar Series`" AIH-ELLIS Charlie Beirnaert “Development of a Machine Learning Model for Late-onset Sepsis | |||||||
Cohomology classes in the RNA transcriptome Single-cell sequencing data consists of a point cloud where the points
| |||||||
Probabilistic Machine Learning and Latent Variable Models for Molecular Data Nowadays researchers can obtain quantitative measurements of the abundances and activities of different types of molecules across experimental conditions and spatio-temporal resolutions. In this talk, I will discuss how latent variable models can help us to extract major underlying patterns from the noisy, heterogeneous and high-dimensional data and how to accommodate the statistical properties of molecular data in such models. Case studies will demonstrate how this can help us to highlight core biological processes along development, in health and disease and pinpoint underlying molecular drivers. | |||||||
Transforming Customer and Service Management through Artificial Intelligence The use of generative AI is a "game changer" for many areas of work, including customer and service management. But where does its use already make sense today, and where are the limits? performio GmbH from Brühl supports over 500 business customers in Germany in this area, from small SMBs to enterprise customers. Managing director Nico Hemker illustrates the use and added value of AI with customer examples and shows you opportunities for getting started in this field.
| The event will take place online. Registration also by email to: simone.lasser@uni-heidelberg.de | ||||||
Societal Perspectives on Artificial Intelligence Prof. Dr. Vincent Heuveline explains how an algorithm learns and which decisions can be entrusted to it. Barbara Klauß (RNZ Wirtschaftsredaktion) moderates the evening. | |||||||
SCIENCE SPARKS STARTUPS What to Expect: • Inspiring talks on trends and developments • Exclusive networking opportunities with other innovative startups and established companies | |||||||
19.06.2024 11:15 AM - 12:45 PM | State of What Art?
I will review a recently published paper (https://arxiv.org/abs/2401.00595) that addresses the challenges of evaluating and comparing large language models (LLMs). The paper highlights that the current practice of assessing LLMs by assigning multiple tasks is unreliable. This is because the ranking of the models changes when we modify the instructions of the tasks. I'll delve into the importance of this issue, and propose alternative ways to evaluate LLMs based on their practical applications. | Stefan Riezler | SR10, Inf 205 (Mathematikon) Zoom: | ||||
06.06.2024 3 PM - 4:30 PM | Model scale versus domain knowledge in long-term forecasting of chaos
Chaos and unpredictability are traditionally synonymous, yet large-scale statistical-learning methods have recently demonstrated surprising ability to forecast chaotic systems well beyond typical predictability horizons. I will describe my recent work evaluating forecasting methods on a large-scale dynamical systems dataset. At long forecast horizons, diverse methods—ranging from reservoir computers to neural ordinary differential equations—exhibit a universal tradeoff between inductive bias and forecast ability. In this regime, we find that classical properties like Lyapunov exponents fail to determine empirical predictability. Rather, forecast methods are limited by their ability to learn long-timescale slow manifolds associated with a system’s invariant measure. Our results inform a general view of complex dynamics as a generative process, with implications for how we construct and constrain time series models. | ZI (Zentralinstitut für Seelische Gesundheit) Manuel Brenner | Konferenzraum Fünfter Stock am Mathematikon | ||||
Community Forum Data-Intensive Computing (DIC) Agenda | |||||||
05.06.2024 11:15 AM - 12:45 PM | Dynamics and control of human cognition and behavior for mental health applications Understanding and predicting maladaptive human behavior and cognition, which are pertinent to many psychiatric disorders, requires understanding its underlying dynamics and dynamical mechanisms. Modifying these behaviors requires control over these dynamics. My research group focuses on inferring the dynamics underlying human behavioral and neural time series, utilizing both process-driven and data-driven modeling approaches. Based on these methods, we develop mental health applications where dynamics are used to predict or control future system states or to study their underlying generative mechanisms. Our applications include: 1) Smartphone apps that assess psychological ratings as proxies for mental health states, forecast changes over time, and use these forecasts to tailor mental health exercises presented on the smartphone. 2) Social exchange games where we create human-like agents that can engage in social interactions and foster positive social experiences. 3) Web-based cognitive experimental platforms in which we specifically tailor experimental paradigms to reliably and validly measure complex decision making behavior. And 4) we develop and apply models for the robust and reliable detection of dynamical systems features predictive of psychiatric dysfunction. | Stefan Riezler | SR11, INF 205 (Mathematikon) Online: | ||||
Chemical Compound Space Conference (CCSC2024) CCSC2024 will bring together the vibrant community of chemists, physicists, and data scientists who employ quantum machine learning to gain physics-informed understanding of the vast chemical compound space. We have assembled a star-studded lineup of keynote and invited speakers, and hope to see many of you there. | |||||||
17.05.24 2:30 - 4:30 PM | KI in Anwendung - The History of AI The session covers an exciting chapter of German technology history and the practice of researching it, as well as the associated questions of science communication, knowledge transfer and educational offerings
| HCDH | HSE (Bergheimer Straße 104) Registration by 16.5 to: | ||||
15.05.24 - 17.05.24 | 2nd SIMPLAIX Workshop on Machine Learning for Multiscale Molecular Modeling Invited speakers: | SIMPLAIX RTG 2450 | Studio Villa Bosch Heidelberg, Schloss-Wolfsbrunnenweg 33, 69118 Heidelberg Registration is now open. Registration deadline: 15 April 2024. | ||||
ELLIS Life / NCT Data Science seminar / Heidelberg.ai Precision oncology requires complex biomarkers which are often based on molecular and genetic tests of tumor tissue. For many of these tests, universal implementation in clinical practice is limited. However, for virtually every cancer patient, pathology tissue slides stained with hematoxylin and eosin (H&E) are available. Artificial intelligence (AI) can extract biomarkers for better treatment decisions from these images. This talk will summarize the state of the art of AI in oncology for precision oncology biomarkers. It will cover the technical foundations, emerging use cases and established applications which are already available for clinical use.
| |||||||
Helmholtz Imaging Conference Keynote Speaker: Joost Batenburg is a professor at the Computer Science Institute of Leiden University (LIACS), where he holds the chair in Imaging and Visualization. He is also affiliated with the national research institute for mathematics and computer science in the Netherlands (CWI) and is program director of the interdisciplinary research program Society, Artificial Intelligence and Life Sciences (SAILS). The call for abstracts is open. Registration is still open. | |||||||
Live Q&A with OpenAI: AI and the Future of Humanity Public Viewing Are you interested in frontier AI systems, their astonishing capabilities and risks for humanity? Then join us for a thought-provoking deep dive and OpenAI Live Q&A on AI safety. Agenda: Confirm your spot by filling external form and prepare your most burning questions. | |||||||
LARGE LANGUAGE MODELS (LLMS) IN PRACTICE: APPLICATION EXAMPLES AND SUCCESS STRATEGIES In this talk you will learn how generative artificial intelligence (AI) is already revolutionizing various industrial sectors today. Concrete application examples of Large Language Models (LLMs) illustrate the wide range of uses of this technology in German industry. In addition to an explanation of fundamental concepts, you will receive valuable advice for a successful implementation. Together we will discuss the opportunities and challenges arising from the use of LLMs. | |||||||
26.04.24 6:15 PM | Physikalisches Kolloquium The road to AI-based discovery in particle physics Prof. Dr. Gregor Kasieczka, Institut für Experimentalphysik, Universität Hamburg | Physikalisches Institut | KIP, INF 227, Hörsaal 1 | ||||
26.04.24 3 PM | Responsible AI
In the first part, to set the stage, we cover irresponsible AI: (1) discrimination (e.g., facial recognition, justice); (2) pseudoscience (e.g., biometric-based predictions); (3) limitations (e.g., human incompetence, minimal adversarial AI); (4) indiscriminate use of computing resources (e.g., large language models); and (5) the impact of generative AI (disinformation, mental health and copyright issues). These examples do have a personal bias but set the context for the second part where we address three challenges: (1) principles & governance, (2) regulation and (3) our cognitive biases. We finish by discussing our responsible AI initiatives and the near future.
| Michael Gertz | large lecture hall (Hörsaal), Mathematikon | ||||
## Cancelled ## STRUCTURES JOUR FIXE In this talk, I will discuss the idea of tensor decompositions and their applications for efficient function approximation. Other popular approaches are neural networks and neural operators. I will discuss interesting problems and algorithms that arise while using them. Pretalk by Jakob Zech starts at 1 PM | |||||||
Machine learning galore! Lab presentations Rocket science Hendrik Borras (Fröning lab) Miguel A. Ibarra Arellano (Schapiro lab) To help plan the catering, please register for free by April 23rd! | |||||||
25.04.24 4:15 PM | Two cases of using machine learning in mathematical modelling
In this talk I will describe two topics. The first part is devoted to how generative models can be used for the 2D to 3D reconstruction problem (with application to digital rock analysis). In the second part I will give an overview of our recent paper that uses neural operators to learn nonlinear preconditioners (coupled with a flexible iterative solver). | Robert Scheichl | Mathematikon, SRA | ||||
18.04.24 4:15 PM | Phinli: a physics-informed surrogate model for elliptic PDEs and its Bayesian inverse problem analysis The talk addresses Bayesian inference in inverse problems with uncertainty quantification involving a computationally expensive forward map associated with solving a partial differential equation. To mitigate the computational cost, the paper proposes a new surrogate model informed by the physics of the problem, specifically when the forward map involves solving a linear elliptic partial differential equation. The study establishes the consistency of the posterior distribution for this surrogate model and demonstrates its effectiveness through numerical examples with synthetic data. The results indicate a substantial improvement in computational speed, reducing the processing time from several months with the exact forward map to a few minutes, while maintaining negligible loss of accuracy in the posterior distribution. | Rob Scheichl | INF 205 Mathematikon, Seminarraum A | ||||
12.04.24 1 PM - 2 PM | Simulation-based inference and the places it takes us Many fields of science make extensive use of mechanistic forward models which are implemented through numerical simulators. Simulation-based inference aims to make it possible to perform Bayesian inference on such models by only using model-simulations, but not requiring access to likelihood evaluations. I will speak about recent work on developing simulation-based inference methods using flexible density estimators parameterised with neural networks, on improving their robustness and efficiency, and applications to modelling problems in neuroscience, computational imaging and astrophysics. Finally, I will talk about the prospect of building large-scale models of neural circuits in Drosophila melanogaster by combining connectomics and simulation-based machine learning. | CZS Heidelberg Initiative for Model-Based AI | Seminar Room A + B (0.202 + 0.203), Mathematikon, INF 205, Heidelberg | ||||
12.04.24 9:30 AM - 11 AM | Practical Equivariances via Relational Conditional Neural Processes
Conditional Neural Processes (CNPs) are a class of meta-learning models popular for combining the runtime efficiency of amortized inference with reliable uncertainty quantification. Many relevant machine learning tasks, such as in spatiotemporal modeling, Bayesian Optimization, and continuous control, inherently contain equivariances – for example to translation – which the model can exploit for maximal performance. However, prior attempts to include equivariances in CNPs do not scale effectively beyond two input dimensions. In this talk, I will introduce the theory behind CNPs and discuss our recent proposal on how to incorporate equivariances into any neural process model and how we can ensure scalability to higher dimensions. | Fred Hamprecht | INF 205 Mathematikon Seminarraum 10 | ||||
19.03.24 11:15 AM - 12:15 PM | Topologically penalized regression on manifolds
We study a regression problem on a compact manifold. In order to take advantage of the underlying geometry and topology of the data, we propose to perform the regression task on the basis of eigenfunctions of the Laplace-Beltrami operator of the manifold that are regularized with topological penalties. We will discuss the approach and the penalties, provide some supporting theory and demonstrate the performance of the methodology on some data sets, illustrating the relevance of our approach in the case where the target function is "topologically smooth". This is joint work with O. Hacquard, K. Balasubramanian, G. Blanchard and C. Levrard. | Enno Mammen | SR 8 (4th floor) | ||||
HITS Colloquium “AI at HITS” – Symposium in honor of Wilfried Juling Speakers at the symposium are: REGISTRATION: | |||||||
From subcellular mapping to modeling the human cell Biological systems are functionally defined by the nature, amount and spatial location of the totality of their proteins. We have generated an image-based map of the subcellular distribution of the human proteome and showed that there is great complexity to the subcellular organization of the cell, giving rise to potential pleiotropic effects. As much as half of all proteins localize to multiple compartments and around 20% of the human proteome shows temporal variability. Our temporal mapping results show that cell cycle progression explains less than half of all temporal protein variability, and that most cycling proteins are regulated post-translationally, rather than by transcriptomic cycling. | DKFZ Communication Center Registration | ||||||
“Meet Your Scientific Hero" Seminar Series: AIH-ELLIS The AIH-ELLIS “Meet Your Scientific Hero” Research Seminar Series called upon regional researchers to nominate their favorite AI visionaries from around the world to give a keynote presentation. Through this program, five AI experts will give a talk in the Heidelberg – Mannheim AI community. | |||||||
NCT Data Science seminar / ELLIS Life / Heidelberg.ai In this talk, I will present several cutting-edge machine learning methods developed in our lab which enable us to discover and analyze the governing equations of medicine, moving beyond traditional causal discovery methods. The focus will be on how this new way of modeling medical and biological processes as dynamical systems offers unprecedented insights into disease progression and the efficacy of treatment strategies over time. Through real-world examples, I will illustrate the transformative impact which I believe this new strand of machine learning can play in deciphering complex medical phenomena and improving patient care. | |||||||
12.03.24 10 AM - 11 AM | Semi-supervised learning: The provable benefits of unlabeled data for sparse Gaussian classification
The premise of semi-supervised learning (SSL) is that combining labeled and unlabeled data enables learning significantly more accurate models. Despite empirical successes, the theoretical understanding of SSL is still far from complete. In this talk, we consider SSL for high dimensional sparse Gaussian classification. A key challenge here is feature selection, detecting the few variables informative for the classification problem. For this SSL setting, we derive information theoretic lower bounds as well as computational lower bounds, based on the low-degree likelihood ratio framework. Our key contribution is the identification of a regime in the problem parameters (dimension, sparsity, number of labeled and unlabeled samples) where a polynomial time SSL algorithm that we propose succeeds, but any computationally efficient supervised or unsupervised schemes, that separately use only the labeled or unlabeled data would fail. This result highlights the provable benefits of combining labeled and unlabeled data for feature selection in high dimensions. | Fred Hamprecht | SR 8 (4th floor) | ||||
NCT Data Science Seminar In this talk we discuss expression rates for neural network-based operator surrogates, which are employed to approximate smooth maps between infinite-dimensional spaces. Such surrogates have a wide range of applications and can be used in uncertainty quantification and parameter estimation problems in fields such as classical mechanics, fluid mechanics, electrodynamics, earth sciences etc. In this case, the operator input represents the problem configuration and models initial conditions, material properties, forcing terms and/or the domain of a partial differential equation (PDE) describing the underlying physics. The output of the operator is the corresponding PDE solution. Our analysis is based on representing the operator in- and outputs in stable bases and exploiting the resulting sparsity created by a separation in high- and low-frequency features. We show that algebraic convergence rates that are free from the curse of dimension can be achieved. | |||||||
Multiscale exploration of single cell data with geometric harmonic analysis High-throughput data collection technologies are becoming increasingly common in many fields, especially in biomedical applications involving single cell data (e.g., scRNA-seq and CyTOF). These introduce a rising need for exploratory analysis to reveal and understand hidden structure in the collected (high-dimensional) Big Data. A crucial aspect in such analysis is the separation of intrinsic data geometry from data distribution, as (a) the latter is typically biased by collection artifacts and data availability, and (b) rare subpopulations and sparse transitions between meta-stable states are often of great interest in biomedical data analysis. In this talk, I will show several tools that leverage manifold learning, graph signal processing, and harmonic analysis for biomedical (in particular, genomic/proteomic) data exploration, with emphasis on visualization, data generation/augmentation, and nonlinear feature extraction. A common thread in the presented tools is the construction of a data-driven diffusion geometry that both captures intrinsic structure in data and provides a generalization of Fourier harmonics on it. These, in turn, are used to process data features along the data geometry for interpretability, denoising, and generative purposes. Finally, I will demonstrate the application of the resulting tools in biomedical applications, such as early embryoid body development and COVID 19 mortality. | |||||||
Workshop in the series "KI in Anwendung" Generative AI is revolutionizing education systems worldwide. The potential of large language models to support individualized knowledge transfer is enormous, and so are their risks. And yet this development is only just beginning, with the next generation of GPT assistants already in the starting blocks! Instead of conducting the debate abstractly and speculatively, we want to get practical. For this reason, the Heidelberg School of Education (HSE) and the Heidelberg Center for Digital Humanities (HCDH) have joined forces to develop and test usage scenarios for AI in university teaching (with a focus on teacher education, among other areas) together with researchers and lecturers of Heidelberg University and the Pädagogische Hochschule Heidelberg. To this end, there will be events on basic questions of understanding at an introductory level as well as practical hands-on workshops, each announced separately and with an explicit workshop character. Experts will be invited to selected events in order to discuss specific topics in a focused way. To allow productive work, the number of participants is expected to be limited to a maximum of 25 for practical workshops and up to 50 for topic-centered events. | |||||||
19.02.24 12:45 PM - 5 PM | HI4AI – Human Intelligence for Artificial Intelligence in Medicine
Program
13:00 – 13:45 Introduction to the TRN
13:45 – 14:30 Spotlight talks
14:30 – 14:50 Coffee break
14:50 – 15:40 Breakout discussions
15:40 – 16:00 Coffee break
16:45 – 17:00 Résumé: expectations, gaps & collaborative next steps
Registration
| Organized by PD Dr. Holger A. Lindner, Prof. Stefan Riezler, Prof. Jan Rummel, Prof. Vera Araujo-Soares, PD Dr. Dr. Verena Schneider-Lindner, Christopher Jones, M.Sc.
| CUBEX ONE, Mannheim Medical Technology Campus Franz Volhard Straße 5, 68167 Mannheim | ||||
Compact course: Python Packaging In this course we will learn how to package a Python library, how to publish it on PyPI and on conda-forge, as well as look at more advanced topics like building pre-compiled wheels including C++ extensions using pybind11, and automatically publishing new releases using continuous integration and cibuildwheel. | |||||||
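To give a flavor of the kind of build configuration such a course typically covers, below is a minimal setup.py sketch for a package with a pybind11 C++ extension. The package name mypkg and the source file src/_core.cpp are made-up placeholders, and the actual course material may well differ.

```python
# Hypothetical minimal setup.py for a Python package with a pybind11 C++ extension.
# Assumes a made-up layout: a pure-Python package "mypkg/" plus C++ code in "src/_core.cpp".
from setuptools import setup
from pybind11.setup_helpers import Pybind11Extension, build_ext

ext_modules = [
    # Compile src/_core.cpp into the extension module mypkg._core
    Pybind11Extension("mypkg._core", ["src/_core.cpp"]),
]

setup(
    name="mypkg",
    version="0.1.0",
    packages=["mypkg"],
    ext_modules=ext_modules,
    cmdclass={"build_ext": build_ext},  # pybind11's helper picks suitable C++ flags
)
```

Pre-compiled wheels for such a project could then be built across platforms with cibuildwheel in a CI pipeline, which is the kind of automation the course description mentions.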
Geometric-harmonic data exploration High-throughput data collection technologies are becoming increasingly common in many fields, especially in biomedical applications involving single cell data (e.g., scRNA-seq and CyTOF). These introduce a rising need for exploratory analysis to reveal and understand hidden structure in the collected (high-dimensional) Big Data. A crucial aspect in such analysis is the separation of intrinsic data geometry from data distribution, as (a) the latter is typically biased by collection artifacts and data availability, and (b) rare subpopulations and sparse transitions between meta-stable states are often of great interest in biomedical data analysis. In this talk, I will show several tools that leverage manifold learning, graph signal processing, and harmonic analysis for biomedical (in particular, genomic/proteomic) data exploration, with emphasis on visualization, data generation/augmentation, and nonlinear feature extraction. A common thread in the presented tools is the construction of a data-driven diffusion geometry that both captures intrinsic structure in data and provides a generalization of Fourier harmonics on it. These, in turn, are used to process data features along the data geometry for interpretability, denoising, and generative purposes. Finally, time permitting, I will relate this approach to the geometric scattering transform that generalizes Mallat's scattering to non-Euclidean domains and provides a mathematical framework for theoretical understanding of geometric deep learning. | |||||||
01.02.24 2:15 PM | Coarse-grained molecular dynamics for proteins with neural networks: Challenges and breakthroughs Neural network force-fields have enabled molecular dynamics (MD) simulations at unprecedented accuracy by efficiently emulating expensive ab initio calculations. However, these advances have not yet accelerated the long-timescale modelling of biomolecular complexes, where the computational cost of classical force-fields is difficult to reduce. One leading approach for adapting neural network force fields to this context focuses on creating force-fields at a reduced (i.e. coarse-grained) resolution. We here discuss how this task differs from that at the atomistic resolution and discuss recent advances by myself and colleagues which have brought the idea of an accurate and extrapolative neural network protein coarse-grained force-field within reach, with a focus on the collection and processing of training data. | Tristan Bereau | Institute for Theoretical Physics Philosophenweg 19, Seminar Room | ||||
Integrated surveillance and novel data streams for infectious disease outbreak prediction Pandemic preparedness and prevention focus on increasing the capacity to detect, manage and prevent infectious disease outbreaks spread between the environment, animals, and humans. To improve preparedness systems, novel surveillance data streams and data integration across sectors are required, as well as the development of decision-guiding predictive models coupled with effective response mechanisms. This talk focuses on describing the state of the art at the intersection of pandemic preparedness and climate-sensitive infectious disease, and on showcasing a few interesting early developments and applications of machine learning for sensors and surveillance, for example in relation to mosquito smart traps, tick citizen science and IoT sensors for bioacoustics of animals. It will further discuss predictive modelling of emerging infectious diseases and provide examples of which model requirements and features are important to consider within this area. Registration: | |||||||
Machine Learning and AI for the Sciences: toward understanding In recent years, machine learning (ML) and artificial intelligence (AI) methods have begun to play a more and more enabling role in the sciences and in industry. In particular, the advent of large and/or complex data corpora has given rise to new technological challenges and possibilities. In his talk, Müller will touch upon the topic of ML applications in the sciences, in particular in chemistry and physics. He will also discuss possibilities for extracting information from machine learning models to further our understanding by explaining nonlinear ML models. Finally, Müller will briefly discuss perspectives and limitations. | |||||||
Geometric Learning via PDE-G-CNNs: Training of Association Fields We consider PDE-based Group Convolutional Neural Networks (PDE-G-CNNs) that generalize Group equivariant Convolutional Neural Networks (G-CNNs). In PDE-G-CNNs a network layer is a set of PDE-solvers. The underlying (non)linear PDEs are defined on the homogeneous space M(d) of positions and orientations within the roto-translation group SE(d) and provide a geometric design of the roto-translation equivariant neural network. | |||||||
Science Notes "KI und Sprache" (AI and Language) Artificial intelligence has turned the world upside down. Large language models such as ChatGPT seem to have answers to every question and produce a wide variety of texts in just a few moments. Where is this development heading? And where are the limits of artificial intelligence? At the Science Notes we look at how artificial intelligence generates language and ask what that means for us. Four researchers report on their work; also on stage is the writer Kathrin Passig. The duo Ströme accompanies the evening musically on their modular synthesizers. | |||||||
Machine learning galore! Lab presentations Rocket science Roman Remme (Hamprecht Lab) Lara Alegre (Heneka lab) To help plan the catering, please register for free by Jan 15th! | |||||||
Discretization of diffusions via gradient flows We consider the non-linear Fokker-Planck equation. We show uniform-in-time bounds for the error of the particle approximation in terms of the second Wasserstein distance. We use the variational formulation of the non-linear Fokker-Planck equation and show that the particle approximation can be seen as an inexact gradient flow of the free energy functional. To deal with the non-local interaction kernel we use results from propagation-of-chaos theory. The talk is based on joint work with Matej Benko, Iwona Chlebicka and Jorgen Endal. | |||||||
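For readers unfamiliar with the objects named in this abstract, one standard form of the non-linear Fokker-Planck equation and its free energy functional (conventions vary, and the speaker's setup may differ) is

\[
\partial_t \rho = \Delta \rho + \nabla \cdot \big( \rho \, \nabla V + \rho \, (\nabla W * \rho) \big),
\qquad
\mathcal{F}(\rho) = \int \rho \log \rho \, \mathrm{d}x + \int V \rho \, \mathrm{d}x + \tfrac{1}{2} \iint W(x-y) \, \rho(x) \rho(y) \, \mathrm{d}x \, \mathrm{d}y,
\]

where \(V\) is a confinement potential and \(W\) the interaction kernel; the equation can then be viewed as the gradient flow of \(\mathcal{F}\) with respect to the 2-Wasserstein distance, which is the kind of variational formulation the abstract refers to.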
Bayesian Inference Models for Healthcare Resilience: Navigating the challenges of the global COVID-19 pandemic required strategic decisions tailored to each country’s unique circumstances. In less developed nations, the threat of overwhelming hospital capacity was especially severe. Rather than building scenarios based on mathematical models, our team used a dynamic forecasting approach. We developed a series of models that provided 4-week probabilistic forecasts, complete with uncertainty quantification. These forecasts, crucially informed by real-time data, predicted the demand for hospital beds and ventilators, which served as the backbone for decisions and public policies adopted by federal health authorities in Mexico from April 2020 to January 2022. The journey was a challenging one. An incompletely characterized virus and the unpredictable dynamics of societal behavior made crafting a useful model difficult. | |||||||
Analog in-memory computing for deep learning Deep neural networks (DNNs) are revolutionizing the field of artificial intelligence and are key drivers of innovation in device technology and computer architecture. While there has been significant progress in the development of specialized hardware for DNN inference, many of the existing architectures physically split the memory and processing units. This means that DNN models are typically stored in a separate memory location, and that computational tasks require constant shuffling of data between the memory and processing units – a process that slows down computation and limits the maximum achievable energy efficiency. Analog in-memory computing (AIMC) is a promising approach that addresses this challenge by borrowing two key features of how biological neural networks are realized. Synaptic weights are physically localized in nanoscale memory elements and the associated computational operations are performed in the analog/mixed-signal domain. | |||||||
11.01.24 2:15 PM | Machine Learning in condensed matter: from molecules and materials to quantum systems
Machine learning (ML) encompasses a wide range of algorithms and models, which have been prominently applied to condensed matter. Applications include atomistic simulations, generative models of quantum and classical distributions, predictors of physicochemical properties and ansätze for differential equations, among many others. In this talk, we will present some examples of how ML models have advanced our understanding of molecular systems and their complex interactions. In particular, we will focus on how combining machine-learned force fields and quantum interatomic dilation not only reveals the intricate nature of molecular systems, but also shows the limitations of many electronic structure methods. Additionally, we will briefly show some of the current applications of ML to quantum systems in our group, with particular emphasis on describing excited states in second-quantized representation and their paramount importance in describing experimental results. | Tristan Bereau | Institute for Theoretical Physics, Philosophenweg 19, Seminar room | ||||
An Interdisciplinary Journey of Computational Mathematics in Theoretical Chemistry Numerical simulations are widely used as a third pillar besides experimental and theoretical investigations in many sciences such as physics and chemistry as well as engineering science. This requires the development of robust and efficient numerical methods for the resolution of the underlying physical laws, which often arise in the form of partial differential equations (PDEs). In this talk, I will describe two interdisciplinary stories. While illustrating the key ideas of the theory and methods, I will also highlight the occasions where not only the mathematical tools have been successfully developed and transferred to the application, but also where the interdisciplinary interactions raised new mathematical questions and triggered new answers and theories in mathematics. From the application viewpoint, this talk will touch upon implicit solvation models and Born-Oppenheimer molecular dynamics, but the methods rely on various and diverse mathematical concepts, from Grassmann manifolds to descriptors from machine learning and perturbation theory. | |||||||
11.12.23 9:30 AM - 11 AM | Enhancing Accuracy in Deep Learning Using Random Matrix Theory
We discuss applications of random matrix theory (RMT) to the training of deep neural networks (DNNs). Our focus is on pruning of DNN parameters, guided by the Marchenko-Pastur spectral approach. Our numerical results show that this pruning leads to a drastic reduction of parameters while not reducing the accuracy of DNNs and CNNs. Moreover, pruning the fully connected DNNs actually increases the accuracy and decreases the variance for random initializations. We next show how these RMT techniques can be used to remove 20% of parameters from state-of-the-art DNNs such as ResNet and ViT while reducing accuracy by at most 2% and, in some instances, even increasing accuracy. Finally, we provide a theoretical understanding of these results by proving the Pruning Theorem that establishes a rigorous relation between the accuracy of the pruned and non-pruned DNNs. Joint work with E. Sandier (U. Paris 12), Y. Shmalo (PSU student) and L. Zhang (Jiao Tong U.) | IWR Fred Hamprecht | Mathematikon Im Neuenheimer Feld 205 Konferenzraum / 5. Stock, Raum 5/104 69120 Heidelberg | ||||
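As a generic, hedged illustration of what Marchenko-Pastur-guided spectral pruning can look like (this is not necessarily the speaker's exact algorithm, and the noise scale sigma is assumed to be known here), one could truncate a single weight matrix at the Marchenko-Pastur bulk edge as follows:

```python
# Hedged sketch: truncate the singular values of a weight matrix at the
# Marchenko-Pastur bulk edge. Illustrative only, not the speaker's exact method.
import numpy as np

def mp_truncate(W, sigma):
    """Keep only singular values above the noise bulk edge sigma*(sqrt(M)+sqrt(N))."""
    M, N = W.shape
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    # Largest singular value expected from pure iid noise of std sigma,
    # with a small safety margin for finite-size fluctuations.
    edge = 1.05 * sigma * (np.sqrt(M) + np.sqrt(N))
    keep = s > edge
    return (U[:, keep] * s[keep]) @ Vt[keep, :]

# Toy check: a rank-2 signal plus Gaussian noise of known standard deviation.
rng = np.random.default_rng(0)
signal = rng.normal(size=(256, 2)) @ rng.normal(size=(2, 128))
W_noisy = signal + 0.1 * rng.normal(size=(256, 128))
W_pruned = mp_truncate(W_noisy, sigma=0.1)
print(np.linalg.matrix_rank(W_pruned))  # expected: 2 (the noise bulk is cut away)
```

In an actual network such a step would presumably be applied layer by layer; whether and how the parameters are subsequently zeroed out or fine-tuned is specific to the speaker's method.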
NCT Data Science Seminar Combining different views from complementary data layers is key for robust predictions in biomedical research. However, it is still challenging to incorporate dependencies and relationships into coherent modeling and prediction frameworks, especially when applying machine learning (ML). I focus on two approaches that I combine with ML: Network-based methods allow for representing and harnessing the interaction of entities and data layers, and dynamical models can represent temporal properties and intricate dependencies of the investigated processes. In this talk, I will present recent examples of the methods we have developed for integrative data analysis. These range from differential integrated multi-omics networks, via neural networks with (multi-)graph input, to infusing prior knowledge from dynamical models into ML. We tackle problems such as drug response prediction or epidemic time series forecasting in scenarios with sparse data. | |||||||
About Vision and Language models: What grounded linguistic phenomena do they understand? How much do they use the image and text modality? In this talk, we will introduce Vision and Language (VL) models which can very well say if an image and text are related and answer questions about images. While performance on these tasks is important, task-centered evaluation does not tell us why they are so good at these tasks, for example which fine-grained linguistic capabilities VL models use when solving them. Therefore, we present our work on the VALSE💃 benchmark to test six specific linguistic phenomena grounded in images. Our zero-shot experiments with five widely-used pretrained VL models suggest that current VL models have considerable difficulty addressing most phenomena. In the second part, we ask how much a VL model uses the image and text modality in each sample or dataset. To measure the contribution of each modality in a VL model, we developed MM-SHAP, which we applied in two ways: (1) to compare VL models for their average degree of multimodality, and (2) to measure for individual models the contribution of individual modalities for different tasks and datasets. Experiments with six VL models on four VL tasks highlight that unimodal collapse can occur to different degrees and in different directions, contradicting the widespread assumption that unimodal collapse is one-sided. | |||||||
Advanced AI Workflows in Higher Education - A Hands-On Workshop with Dr. Sarah Schwettmann (MIT) Instead of debating abstractly and speculatively, we want to get practical. The workshop will introduce the brand-new extensions of available language models and explain in detail how their capabilities can be put to concrete use in the context of university teaching. With Dr. Sarah Schwettmann (Massachusetts Institute of Technology/Berkman Klein Center, Harvard), a recognized expert in language- and vision-based AI will give us a deep dive into the subject, which raises numerous questions not least for the university education of future teachers (and their later field of work, the school). | |||||||
Predict to control Mutual predictability is the key to interaction. Or in simpler terms: "experience makes teamwork". | |||||||
Colloquium Laura Maria Sangalli: Recent years have seen an explosive growth in the recording of increasingly complex and high-dimensional data, whose analysis calls for the definition of new methods, merging ideas and approaches from statistics and applied mathematics. My talk will focus on spatial and functional data observed over non-Euclidean domains, such as linear networks, two-dimensional Riemannian manifolds and non-convex volumes. I will present an innovative class of methods, based on regularizing terms involving Partial Differential Equations (PDEs), defined over the complex domains being considered. These Physics-Informed statistical learning methods enable the inclusion of the available problem specific information, suitably encoded in the regularizing PDE. Illustrative applications from environmental and life sciences will be presented. | Studio Villa Bosch Schloss-Wolfsbrunnenweg 33 REGISTRATION: You can find both links on: | ||||||
Machine learning galore! Lab presentations Rocket science Sven Köhler (Engelhardt lab) Florin Walter (Britta Velten lab) To help plan the catering, please register for free by clicking here. | |||||||
KoMSO Academy TorchPhysics: Deep Learning for partial differential equations TorchPhysics has been jointly developed by Bosch and the University of Bremen. It aims at providing an ‘as simple as possible’ platform for testing AI concepts for solving PDEs. This workshop aims at providing an overview of state-of-the-art AI concepts for solving PDEs and related parametric studies/parameter identification problems. We include an introduction to using TorchPhysics as well as hands-on exercises using this toolbox. Furthermore, we highlight recent advances in modeling injection molding processes in industrial applications via deep learning-based surrogate models. Keynote speakers: | Akademie der Wissenschaften in Heidelberg Online participation is possible for a limited number of participants. Registration: Registration fee for industry: 600 € (online 350 €) Registration fee for academia: 250 € (online 150 €) (Registration fee on site incl. conference dinner, 07.11.) For registration please send an e-mail to the organization committee: komso-academy@math.uni-bremen.de | ||||||
STRUCTURES Jour Fixe Deep artificial neural networks are a prominent approach for decision-making in scenarios involving uncertainty. These networks have significantly enhanced performance in various prediction tasks, such as image recognition, speech processing, and signal analysis. However, their utilization demands substantial computational resources and memory. On the other hand, there is a growing need to implement machine learning techniques on resource-constrained devices, including Internet of Things (IoT) devices, edge devices, and mobile platforms. In this talk, we will start by examining prior research focused on accelerating Deep Neural Networks (DNNs) through compression techniques, particularly quantization, pruning, and architecture optimization. While DNNs excel at operating under uncertainty, they are incapable of reasoning about uncertainty itself. Detecting situations where a neural architecture cannot provide a well-founded prediction is crucial. Consequently, probabilistic models have recently garnered significant interest. We will provide a brief overview of these models and discuss potential avenues to address their substantially increased computational demands. | |||||||
24.10.2023 | Minisymposium:
09:00 AM Konstantin Rusch (Massachusetts Institute of Technology, ETH Zurich)
10:00 AM Johannes Hertrich (TU Berlin)
02:00 PM Johannes Wiesel (Carnegie Mellon University)
03:00 PM Caroline Geiersbach (WIAS Berlin) | Jan Johannes | Uni Heidelberg, Mathematikon Raum 5.104, Im Neuenheimer Feld 205, 69120 Heidelberg | ||||
23.10.2023 | Minisymposium:
09:00 AM Johannes Maly (University of Munich)
10:00 AM Lisa Kreusser (University of Bath)
02:00 PM Jakob Zech (Heidelberg University)
03:00 PM Diyora Salimova (University of Freiburg) | Jan Johannes | Uni Heidelberg, Mathematikon Raum 5.104, Im Neuenheimer Feld 205, 69120 Heidelberg | ||||
AI InScide Out Unconference This event offers the opportunity for leading scientists in AI applied to health, life, and natural sciences to meet and to have interdisciplinary exchanges. Join us for a vibrant community meeting that will bring together AI scientists in the areas of bioinformatics, imaging, structural biology, physics (and more), to stimulate new collaborations and spark boundary-pushing discussions. AI InScide Out will feature keynote talks from distinguished scientists, presentations from researchers associated with AI Health Innovation Cluster and ELLIS-Life Heidelberg, and flash talks from submitted abstracts. The unconference session will be distinct from a regular conference, providing ample opportunities for scientific exchange and informal discussions among all participants. We invite researchers from all career stages to register. You will have the opportunity to present your own work, either focused on a scientific advance, failure and challenge, or novel opportunities for scientific questions that are ready to be tackled using AI. The event is free of charge but registration is mandatory. Please register online by September 15: https://indico.dkfz.de/e/ai. Admission will be on a first-come, first-served basis with priority given to submissions that include a (short) scientific abstract. | |||||||
ELLIS Life, Heidelberg.ai, NCT Data Science Seminar Image Denoising and the Generative Accumulation of Photons Shot noise is a fundamental property of many imaging applications, especially in fluorescence microscopy. Removing this noise is an ill-defined problem since many clean solutions exist for a single noisy image. Traditional approaches aiming to map a noisy image to its clean counterpart usually find the minimum mean square error (MMSE) solution, i.e., they predict the expected value of the posterior distribution of possible clean images. We present a fresh perspective on shot noise corrupted images and noise removal. By viewing image formation as the sequential accumulation of photons on a detector grid, we show that a network trained to predict where the next photon could arrive is in fact solving the traditional MMSE denoising task. This new perspective allows us to make three contributions: | |||||||
Workshop CoE STRUCTURES and MRA Cognitive Science (FoF4) Human Intelligence (HI) meets Artificial Intelligence (AI) Join us for a day with many opportunities to get in touch with Cognitive Science/Field of Focus 4 and STRUCTURES, meet new colleagues and collaborators, discuss science with keynote speakers, and to network. We are very happy to welcome researchers at all career stages (doctoral students, post docs, professors) and especially those faculty members who joined Heidelberg University only recently and who are interested in building new transdisciplinary collaborations in the context of HI meets AI. Topics could be e.g. cognitive and computational neuroscience, the computational mind, neural networks, superstatistics, dynamical systems... All these topics are of interest for cognitive research, AI and neuroscience, and at the same time potentially also for other fields like data and computer sciences, mathematics, microbiology, technology development and medical sciences. Other fields and field suggestions are very welcome! This in-person 2-day workshop will cover many aspects of this emerging field of research, and will combine insights into exciting current research activities and key discoveries with the development of a future perspective for Cognitive Science, HI and AI. A program and more details will follow during the next weeks. We are very much looking forward to a day with stimulating presentations and discussion in an enjoyable atmosphere. | |||||||
NCT Data Science Seminar: QI DOU Image-Based Robotic Surgery Intelligence Department of Computer Science & Engineering at The Chinese University of Hong Kong With rapid advancements in medicine and engineering technologies, the operating room has evolved into a highly complex environment, where surgeons take advantage of computers, endoscopes and robots to perform procedures with more precision and less incision. Intelligence, together with its authorized cognitive assistance, smart data analytics, and automation, is envisaged to be a core fuel to transform next-generation robotic surgery in numerous known or unknown exciting ways. In this talk, I will present ideas, methodologies and applications of image-based robotic surgery intelligence from three perspectives, i.e., AI-enabled surgical situation awareness to improve surgical procedures, AI-powered large-scale data analysis to enhance surgical education, and AI-driven multi-sensory perception to achieve surgical subtask automation. | |||||||
Big PyData BBQ #5: Large Language Models If you like cool talks about 🧑🔬 Data Science, 🤖Artificial Intelligence, 🐍 coding or 🤗 community, the Big PyData BBQ is the place to be! 🔥 The event will be live streamed and published on https://www.youtube.com/@PyDataTV. | |||||||
DAGM German Conference on Pattern Recognition The German Conference on Pattern Recognition (GCPR) is the annual symposium of the German Association for Pattern Recognition (DAGM). It is the national venue for recent advances in image processing, pattern recognition, and computer vision and it follows the long tradition of the DAGM conference series, which was renamed GCPR in 2013 to reflect its increasing internationalization. In 2023 in Heidelberg, the conference series will celebrate its 45th anniversary. | |||||||
5th Summer School in Medical Physics 2023: This summer school is a hybrid event and is subdivided into an online phase and a hybrid phase (attendance phase or live online phase). Participants can decide to follow the course online and on site or 100% virtually. The online phase with pre-recorded lectures introduces the basics in machine learning, focusing on 3D voxelized geometries, its most frequent applications, and methodological as well as ethical aspects. The live online and attendance phases will then show in detail how machine learning may be integrated into radiotherapy workflows. The knowledge of deep learning methodologies for image synthesis, organ and target segmentation, and image registration in the context of radiotherapy will be expanded. Furthermore, applications in computational dosimetry and plan optimization will be discussed, including dose prediction, guided plan optimization, and treatment outcome prediction. The summer school is designed for PhD/MD, MSc or BSc students. More information is available on our website: http://www.dkfz.de/summer_school2023_de. Registration necessary: Yes | |||||||
Carl-Zeiss-Stiftung-Summer-School 2023 A conference for young, motivated and interdisciplinary researchers in the natural sciences with a focus on astrophysics and machine learning Invited Speakers: | |||||||
PiCR lecture (online): Interactive machine learning @ DKFZ This lecture is part of the Progress in Cancer Research (PiCR) lecture series April - July 2023 organized by the DKFZ International PhD Program. | |||||||
IWR Colloquium Summer 2023 Convolutional Neural Networks (CNNs) are the current backbone of deep learning architectures in a wide range of applications that process array data, such as 2D/3D images. Despite this overwhelming success of CNNs in terms of qualitative (visual) results and classification test-set accuracies, this talk will point out some severe inherent problems of CNNs regarding their insufficient signal processing capabilities. While these “signal processing flaws” have been largely ignored as test accuracies on many problems have been increasing over many years, recent research showed that current models are highly vulnerable to even the slightest changes in input distributions. In our latest works [1-5], we showed that the missing “robustness” of CNNs is not only due to insufficient training data, but to a large extent is also caused by faulty operators and architectures which fail to adhere to basic signal processing demands. The aim of this talk is to give an overview of the dominant problems in the context of image processing and analysis and to discuss possible countermeasures. | |||||||
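One countermeasure commonly discussed in this line of research is anti-aliased downsampling: low-pass filtering a feature map before subsampling so that the sampling theorem is not violated. The sketch below is only a generic illustration of that idea with an assumed binomial blur kernel, not the speaker's specific operators:

# Illustrative sketch (assumption, not the talk's exact method): anti-aliased
# downsampling that blurs feature maps with a fixed low-pass kernel before
# subsampling, instead of plain strided pooling.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BlurDownsample(nn.Module):
    def __init__(self, channels: int, stride: int = 2):
        super().__init__()
        self.stride = stride
        # Separable binomial (1, 2, 1) kernel as a cheap low-pass filter.
        k = torch.tensor([1.0, 2.0, 1.0])
        kernel = torch.outer(k, k)
        kernel = kernel / kernel.sum()
        # One identical filter per channel (depthwise convolution).
        self.register_buffer("kernel", kernel.expand(channels, 1, 3, 3).clone())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = F.conv2d(x, self.kernel, padding=1, groups=x.shape[1])  # low-pass first
        return x[:, :, ::self.stride, ::self.stride]                # then subsample

x = torch.randn(1, 16, 32, 32)
y = BlurDownsample(channels=16)(x)   # -> (1, 16, 16, 16)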
AGI Summer Talks
Aleph Alpha, Europe’s leading AGI research company, is inviting you to their AGI Summer Talks, bringing together the region's most brilliant AI researchers and institutions in the picturesque city of Heidelberg to talk about the latest developments and the future of AGI. The AGI Summer Talks will feature three talks and Q&A sessions with AI professors, including distinguished names such as Matthias Bethge and Kristian Kersting. In addition, we are honored to present an inspiring keynote titled “Are Large Language Models the last invention we need to make?” by Joscha Bach, an influential figure in the field of AGI. The event will conclude with a panel discussion that will bring all the speakers back on stage to explore the implications and future of AGI. | |||||||
ELLIS live/NCT Data Science Seminar: Tristan Bereau Advanced statistical methods are rapidly permeating many scientific fields, offering new perspectives on long-standing problems. In materials science, data-driven methods are already bearing fruit in various disciplines, such as hard condensed matter or inorganic chemistry, while comparatively little has happened in soft matter. I will describe how we use multiscale simulations to leverage data-driven methods in soft matter. We aim to establish structure-property relationships for complex thermodynamic processes across the chemical space of small molecules. Akin to screening experiments, we devise a high-throughput coarse-grained simulation framework. Coarse-graining is an appealing screening strategy for two main reasons: it significantly reduces the size of chemical space, and it can suggest a low-dimensional representation of the structure-property relationship. I will briefly mention a biological application of our methodology that led to the discovery of in vivo active compounds. Finally, I will mention a number of ways machine learning can help fulfill the promise of connecting models at different scales. | |||||||
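The coarse-graining step mentioned in the abstract maps groups of atoms onto fewer beads, which both shrinks chemical space and lowers simulation cost. Purely as an illustration of that mapping step (not the speaker's framework), with made-up masses and atom groupings:

# Illustrative sketch (not the speaker's framework): map an all-atom structure
# onto coarse-grained beads by taking the center of mass of each atom group.
import numpy as np

def coarse_grain(positions, masses, mapping):
    """positions: (n_atoms, 3), masses: (n_atoms,), mapping: one list of atom
    indices per bead. Returns (n_beads, 3) bead positions."""
    beads = []
    for atom_indices in mapping:
        m = masses[atom_indices]
        r = positions[atom_indices]
        beads.append((m[:, None] * r).sum(axis=0) / m.sum())  # center of mass
    return np.array(beads)

# Example: 6 atoms grouped into 2 beads of 3 atoms each (grouping is hypothetical).
pos = np.random.rand(6, 3)
mass = np.array([12.0, 1.0, 1.0, 16.0, 1.0, 1.0])
print(coarse_grain(pos, mass, mapping=[[0, 1, 2], [3, 4, 5]]))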
Colloquium Rafal Weron: Electricity price forecasting in the 2020s Registration: | |||||||
GIS Colloquium – Talks (Summer Term 2023) GeoAI Research and Technology Transfer for National Mapping This presentation highlights the application of Artificial Intelligence (AI) and Geographic Information Systems (GIS) for national mapping. It emphasizes the use of GeoAI technology for efficient and accurate data acquisition, processing, and analysis in the GIS, Cartography and Mapping fields. Dr. Arundel will discuss the potential benefits of AI in mapping, such as reduced costs, increased accuracy, and faster mapping processes. The presentation also covers various applications of GeoAI technology, such as image recognition, object detection, and optical character recognition (OCR). Of particular emphasis is the importance of partnerships between research institutions and government agencies to promote the adoption of AI technology in mapping. Finally, the presentation showcases examples of successful implementation of GeoAI technology in national mapping, including the use of AI for feature extraction from various types of imagery, noise reduction in point clouds, and OCR for knowledge extraction from historical maps. | |||||||
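One of the applications listed above, OCR for knowledge extraction from historical maps, can be prototyped with off-the-shelf tooling. A minimal, purely illustrative sketch using the pytesseract wrapper around Tesseract; the file name and confidence threshold are placeholders, and this is not the presenter's pipeline:

# Minimal sketch (not the speaker's pipeline): extract text labels from a scanned
# historical map with Tesseract OCR. Requires the tesseract binary plus the
# pytesseract and Pillow packages; "historical_map.png" is a placeholder path.
from PIL import Image
import pytesseract

image = Image.open("historical_map.png").convert("L")   # grayscale often helps OCR
text = pytesseract.image_to_string(image)                # plain-text dump
boxes = pytesseract.image_to_data(image, output_type=pytesseract.Output.DICT)

# Keep only reasonably confident detections as candidate place names.
labels = [w for w, c in zip(boxes["text"], boxes["conf"]) if w.strip() and float(c) > 60]
print(labels[:20])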
Interpretable Representations and Neuro-symbolic Methods in Deep Learning Current state-of-the-art machine learning methods impress with their capabilities for prediction and classification, and, in the case of large language models, even for solving complex analytical tasks. However, these methods often appear as a “black box” from the outside, which makes it hard to understand how a result was achieved. In this talk, I will discuss several approaches to interpretability in machine learning. First, I will describe a method for representation learning that leads to an interpretable latent representation. Second, I will present our work at the interface between symbolic and sub-symbolic representations, so-called neuro-symbolic methods, which enable a direct interpretation of a model’s intermediate output. The talk concludes with a discussion of the relationship between interpretability and causality. | |||||||
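As a toy illustration of what an interpretable intermediate output can look like (not the methods presented in the talk), here is a concept-bottleneck-style sketch in which the network first predicts named, human-readable concepts and a simple linear rule then produces the final decision; all concept names are invented:

# Toy sketch (not the speaker's method): a concept-bottleneck-style model whose
# intermediate output is a vector of named, human-readable concepts, so the final
# decision can be traced back to interpretable quantities.
import torch
import torch.nn as nn

CONCEPTS = ["has_wings", "has_beak", "can_swim"]   # invented example concepts

class ConceptBottleneck(nn.Module):
    def __init__(self, n_features: int, n_classes: int):
        super().__init__()
        self.to_concepts = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                         nn.Linear(32, len(CONCEPTS)), nn.Sigmoid())
        # The final decision is a simple linear rule over concepts -> inspectable weights.
        self.to_label = nn.Linear(len(CONCEPTS), n_classes)

    def forward(self, x):
        concepts = self.to_concepts(x)        # interpretable intermediate output
        return self.to_label(concepts), concepts

model = ConceptBottleneck(n_features=10, n_classes=2)
logits, concepts = model(torch.randn(1, 10))
print(dict(zip(CONCEPTS, concepts.squeeze(0).tolist())))   # per-concept activations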
Workshop: “Data Science and Dependence” Registration to attend talks: by email to Prof. Dahlhaus dahlhaus@statlab.uni-heidelberg.de by July 1st, | |||||||
Structures Jour Fixe The role of multiscale modeling in molecular discovery Advanced statistical methods are rapidly permeating many scientific fields, offering new perspectives on long-standing problems. In materials science, data-driven methods are already bearing fruit in various disciplines, such as hard condensed matter or inorganic chemistry, while comparatively little has happened in soft matter. I will describe how we use multiscale simulations to leverage data-driven methods in soft matter. We aim to establish structure-property relationships for complex thermodynamic processes across the chemical space of small molecules. | |||||||
06.07.23 4pm | How to enhance chemical databases for atomistic machine learning? Luis Itza Vazquez-Salazar, University of Basel Machine learning (ML) has revolutionized the field of atomistic simulations. It is now possible to obtain high-quality predictions of chemical properties at a low computational cost. Given that the computational effort to evaluate such a statistical model is independent of the quality of the input data, the most significant bottleneck for devising yet better ML models is the considerable amount of data required to train them. Although the community consensus is that more data naturally leads to better performance, it has been found that this working hypothesis is not necessarily correct for predicting chemical properties. Consequently, there is a need to identify how to obtain suitable data for training ML models while retaining the best performance of the model. In this contribution, we will discuss the use of uncertainty quantification (UQ) methods for atomistic neural networks, such as Deep Evidential Regression and Regression Prior Networks, for identifying outliers in chemical space. Furthermore, results from using different data augmentation (DA) methods, such as sampling from conformational space and Atom-in-Molecule (AMONS) fragments, to improve the prediction of specific chemical moieties will be discussed. Additionally, the application of UQ techniques to potential energy surfaces will be illustrated. Combining UQ and DA methods sets the stage for a workflow to obtain more robust and data-efficient chemical databases while retaining prediction accuracy. | Tristan Bereau | Philosophenweg 19, 69120 Heidelberg, seminar room | ||||
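Deep Evidential Regression, one of the UQ methods named above, has the network predict the parameters of a Normal-Inverse-Gamma distribution so that a single forward pass yields both a prediction and its uncertainty. The following is a generic sketch of such an output head following the published recipe, not the speaker's implementation; layer sizes are assumptions:

# Generic sketch of a Deep Evidential Regression head (not the speaker's code):
# the network outputs the four Normal-Inverse-Gamma parameters (gamma, nu, alpha, beta),
# from which a prediction and aleatoric/epistemic uncertainties follow in closed form.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvidentialHead(nn.Module):
    def __init__(self, n_features: int):
        super().__init__()
        self.linear = nn.Linear(n_features, 4)

    def forward(self, h):
        gamma, lognu, logalpha, logbeta = self.linear(h).chunk(4, dim=-1)
        nu = F.softplus(lognu)                 # nu > 0
        alpha = F.softplus(logalpha) + 1.0     # alpha > 1
        beta = F.softplus(logbeta)             # beta > 0
        return gamma, nu, alpha, beta

head = EvidentialHead(n_features=64)
gamma, nu, alpha, beta = head(torch.randn(8, 64))
prediction = gamma                                   # point estimate
aleatoric = beta / (alpha - 1.0)                     # expected data noise
epistemic = beta / (nu * (alpha - 1.0))              # model uncertainty (flags outliers)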
GIS Colloquium – Talks (Summer Term 2023) Spatial Optimization, Significance and Evolving GIScience Spatial optimization is introduced and reviewed in historical terms. The significance of spatial optimization is demonstrated through current analysis, management, planning and policy contexts focused on emergency response, food production, wildfire risk mitigation and public health monitoring. | |||||||
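A textbook example of the kind of spatial optimization used in emergency-response siting is the p-median problem: choose p facility locations minimizing total distance to demand points. The following PuLP sketch is only a toy formulation with made-up coordinates and p, not a model from the talk:

# Illustrative sketch (not the speaker's model): a tiny p-median problem with PuLP.
# Pick p=2 facility sites from the candidates so that total demand-to-facility
# distance is minimized. All coordinates are made up.
import itertools
import math
import pulp

demand_pts = [(0, 0), (1, 2), (4, 1), (5, 5)]
candidates = [(0, 1), (4, 0), (5, 4)]
p = 2
dist = {(i, j): math.dist(demand_pts[i], candidates[j])
        for i, j in itertools.product(range(len(demand_pts)), range(len(candidates)))}

prob = pulp.LpProblem("p_median", pulp.LpMinimize)
open_site = pulp.LpVariable.dicts("open", range(len(candidates)), cat="Binary")
assign = pulp.LpVariable.dicts("assign", dist.keys(), cat="Binary")

prob += pulp.lpSum(dist[i, j] * assign[i, j] for i, j in dist)              # objective
for i in range(len(demand_pts)):
    prob += pulp.lpSum(assign[i, j] for j in range(len(candidates))) == 1   # serve each point
for i, j in dist:
    prob += assign[i, j] <= open_site[j]                                    # only open sites serve
prob += pulp.lpSum(open_site[j] for j in range(len(candidates))) == p       # exactly p sites

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([j for j in range(len(candidates)) if open_site[j].value() == 1])     # chosen sites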
Machine learning galore! Lab presentations Rocket Science Florian Hess (Durstewitz lab) Felix Draxler (Rother lab) To help plan the catering, please register for free. | |||||||