Today, embedded systems are ubiquitous in everyday life, in cell phones and washing machines, but also in cars and even medical equipment. Having escaped their former habitat within PCs, these systems increasingly interact with their environment: sensors measure physical phenomena such as temperature, acceleration, or magnetic fields, while actuators manipulate the outside world, as in robots or electronically controlled combustion engines. The combination of an electronic system with a physical process is called a cyber-physical system (CPS).
With this new paradigm, many new challenges arise during modeling, design, implementation, verification, and test. For the design of the hardware and software of CPS, new approaches need to be developed that take into account non-functional requirements such as energy efficiency and reliability, even in harsh environments. Real-time aspects often play an important role. Furthermore, when a system interacts with its physical environment, proving its functional correctness becomes difficult: the combination of discrete and continuous behavior and the treatment of noisy sensor data are challenging problems.
The lecture notes will be published in Springer's Lecture Notes in Computer Science (LNCS) series. All participants are invited to take part in a poster session during the event.
The summer school will take place in the House of Science (Haus der Wissenschaft), which is located in the city center of Bremen. Bremen is a city of around 550,000 inhabitants and an important science center in north-west Germany. Bremen has an airport with good connections to many European cities. From the airport, it is only a 15-minute tram ride to the city center.
Note that the registration fee does not include accommodation. Here are some links to hostels and hotels that you might find suitable for your needs:
Sabine Glesner: Verification of Embedded Real-time Systems
Real-time systems are systems whose correctness depends not only on correct functioning but also on meeting real-time constraints. Such systems are often deployed in safety-critical applications, for example in airplanes, trains, or automotive systems, where a failure may result in enormous costs or even in human injuries or loss of lives. As a consequence, systematic verification and validation of real-time systems is a crucial issue. The main application area for real-time systems is embedded applications, where the system controls technical processes that also evolve in real time. Such systems are usually composed of deeply integrated hardware and software components, and they are developed under severe resource limitations and high quality requirements. In connection with the steadily increasing demands on multi-functionality and flexibility, analog control components are more and more replaced by complex digital HW/SW systems. A major challenge is to develop automated quality assurance techniques for the verification and validation of complex embedded real-time systems consisting of both hardware and software. In this lecture, we give an overview of our research contributions to this topic. In particular, we present our framework for the verification of safety and timing properties of digital embedded real-time systems, which are modeled in SystemC, using timed automata and the Uppaal model checker.
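The flavor of such timing properties can be conveyed with a much-simplified monitor, shown below as a sketch only (it is not the SystemC/Uppaal framework of the lecture): over a timed event trace, every request must be answered within a deadline. The event names, trace format, and deadline value are invented for illustration.

```python
# Toy timing-property check over a timed event trace (illustration only,
# not Uppaal): every "req" must be followed by a "resp" within DEADLINE
# time units. The trace is a list of (timestamp, event) pairs sorted by
# timestamp; names and values are invented for this example.

DEADLINE = 5.0

def check_response_deadline(trace, deadline=DEADLINE):
    """Return True iff every 'req' in the finite trace is answered by a
    'resp' within `deadline`. In this simplified finite-trace semantics,
    a request still open at the end of the trace counts as a violation."""
    pending = None  # timestamp of the outstanding request, if any
    for t, ev in trace:
        if pending is not None and t - pending > deadline:
            return False          # deadline passed without a response
        if ev == "req":
            pending = t
        elif ev == "resp":
            pending = None
    return pending is None
```

For instance, the trace `[(0.0, "req"), (7.0, "resp")]` violates the property, since the response arrives two time units too late.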
Frédéric Mallet: MARTE/CCSL for Modeling Cyber-Physical Systems
Cyber-Physical Systems (CPS) combine digital computational systems with surrounding physical processes. Computations are meant to control and monitor the physical environment, which in turn affects the computations. The intrinsic heterogeneity of CPS demands the integration of diverse models to cover the different aspects of such systems. The UML proposes a great variety of models and is very commonly used in industry, even though it does not prescribe a particular way of using those models together. The MARTE profile proposes a set of extensions to UML in a bid to allow for the modeling of real-time and embedded systems (RTES). Yet CPS are a wider class of systems than mere RTES, so a legitimate question arises as to whether MARTE can be used for CPS as well. This lecture illustrates some possible uses of MARTE to model CPS and uses logical clocks as a way to bring together the different models.
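To give a small taste of what logical clocks express, the sketch below checks one well-known CCSL clock relation, alternation, over a schedule encoded as a sequence of tick sets. The encoding and the strict "a starts, no simultaneous ticks" reading are simplifying assumptions made for this illustration.

```python
# Minimal sketch of one CCSL-style clock relation: "a alternates with b"
# (each tick of a is followed by exactly one tick of b before a may tick
# again, starting with a). A schedule is a list of sets, each containing
# the clocks that tick at that instant. This encoding is a simplification
# chosen for illustration, not the CCSL tool semantics.

def alternates(schedule, a="a", b="b"):
    """Return True iff clocks a and b strictly alternate in `schedule`."""
    expect = a  # the only clock allowed to tick next
    for instant in schedule:
        if a in instant and b in instant:
            return False          # strict alternation forbids simultaneity
        if a in instant:
            if expect != a:
                return False
            expect = b
        elif b in instant:
            if expect != b:
                return False
            expect = a
        # instants where neither clock ticks are always allowed
    return True
```

For example, the schedule `[{"a"}, {}, {"b"}, {"a"}]` satisfies the relation, while `[{"a"}, {"a"}]` does not.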
Session II: Applications: Space Systems
Klaus Havelund: Specification of Parametric Monitors – Quantified Event Automata versus Rule Systems
Specification-based runtime verification is a technique for monitoring program executions against specifications formalized in formal logic. Such logics are usually temporal in nature, capturing the relation between events occurring at different time points. A particular challenge in runtime verification is the elegant specification and efficient monitoring of streams of events that carry data, also referred to as parametric monitoring. This lecture presents two parametric runtime verification systems representing two quite different approaches to the problem. QEA (Quantified Event Automata) is a state machine approach based on trace-slicing, while LogFire is a rule-based approach based on the RETE algorithm, known from AI as the basis for many rule systems. The presentation focuses on how easily properties can be specified in the two approaches, using a collection of properties gathered during the 1st International Competition of Software for Runtime Verification (CSRV 2014), affiliated with RV 2014 in Toronto, Canada.
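The trace-slicing idea can be sketched in a few lines, as a toy illustration rather than the actual QEA or LogFire tools: the trace is sliced per parameter value (here, a file name), and each slice is checked against a small state machine for the property "a file that is opened must be closed exactly once before the trace ends". All event, state, and property names below are invented for the example.

```python
# Toy trace-slicing monitor in the spirit of QEA (illustration only).
# One small open/close state machine runs per parameter value (file name);
# the property is "open(f) must be matched by close(f), with no double
# open and no close without open".

from collections import defaultdict

def monitor_open_close(trace):
    """trace: list of (event, filename) pairs, event in {'open', 'close'}.
    Return the set of filenames whose slice violates the property."""
    state = defaultdict(lambda: "closed")   # one slice state per filename
    violating = set()
    for ev, f in trace:
        s = state[f]
        if ev == "open":
            state[f] = "open" if s == "closed" else "error"
        elif ev == "close":
            state[f] = "closed" if s == "open" else "error"
        if state[f] == "error":
            violating.add(f)
    # files still open when the trace ends also violate the property
    violating |= {f for f, s in state.items() if s == "open"}
    return violating
```

On the trace `[("open", "a"), ("close", "a"), ("open", "b")]` only file `b` violates the property, because its slice ends in the open state.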
Jens Dalsgaard Nielsen: CUBESATs – when Quality matters.
The era of cubesats started in 2003 with the first launch on June 30th. The cubesat concept was initiated by Bob Twiggs of Morehead State University, USA. He eased access to launch services by defining a physical form factor for cubesats, so that the launch container – the POD – could be standardized. By using piggyback launches, costs could be lowered, which enabled access to low Earth orbit (< 900 km above the surface) for satellites developed and built by students. A cubesat could be constructed for less than €20,000, not including the launch service. Aalborg University participated from the very beginning and had its AAU CUBESAT on the very first cubesat launch in 2003. Quality as a measure has changed very much since 2003. In the early cubesat years, quality meant the mechanical quality necessary for the spacecraft to be allowed onboard the launch vehicle, including size conformance and passing vibration and vacuum tests. There were no real formal requirements for functionality, although the satellite makers obviously tried to do their best. Failure rates of more than 50% were normal. This lecture describes the development of the quality assurance process for cubesat launches up to today and shows which lessons can be learned from this experience for the design and implementation of complex hardware/software systems operating under extreme environmental conditions.
Session III: Applications: Robotics
Elsa Kirchner: Intuitive Interaction with Robots - Technical Approaches and Challenges
A challenging goal in human-robot interaction research is to build robots that are accepted by humans as interaction partners. In the ideal case, a robot may look and behave exactly like a human and can therefore no longer be distinguished from one. However, for acceptance a robot need not necessarily look or behave like a human: even simple natural language processing systems or simple toylike robots can be accepted as adequate interaction partners. On the other hand, for complex interaction, intelligent support, or cooperative behavior, more advanced solutions have to be developed. This lecture will discuss relevant research in the field of human-robot interaction. The focus is to convey the complexity of the research that is required and to point out the different research areas that are relevant to achieving the goal of developing robots that are adequate and accepted interaction partners for humans.
Sami Haddadin: Physical Safety in Robotics
Over the last decade, safe physical Human-Robot Interaction (pHRI) has been made possible by significant advances in mechatronics, control, and planning. One result of these developments was fully integrated, safer lightweight robots equipped with sophisticated interaction control capabilities. These new robots have even opened up novel and unforeseen application domains in which humans and robots are meant to work and interact with each other. For this, safe physical interaction is of prime importance. This lecture gives a brief overview of two of its central aspects: human safety from an injury and standards standpoint, and control for physical interaction with a focus on interaction control and collision handling.
Session IV: Hybrid and Cyber-Physical Systems
Goran Frehse: An Introduction to Hybrid Automata, Numerical Simulation and Reachability Analysis
Hybrid automata combine finite state models with continuous variables that are governed by differential equations. Hybrid automata are used to model systems in a wide range of domains such as automotive control, robotics, electronic circuits, systems biology, and health care. Numerical simulation approximates the evolution of the variables with a sequence of points in discretized time. This highly scalable technique is widely used in engineering and design, but it is difficult to simulate all representative behaviors of a system. To ensure that no critical behaviors are missed, reachability analysis aims at accurately and quickly computing a cover of the states of the system that are reachable from a given set of initial states. Reachability can be used to formally show safety and bounded liveness properties. This lecture outlines the major concepts and discusses advantages and shortcomings of the different techniques.
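The interplay of continuous flow and discrete jumps can be made concrete with the classic bouncing-ball automaton, simulated below with forward-Euler time discretization. The parameters, step size, and restitution coefficient are illustrative choices, not taken from the lecture.

```python
# Numerical simulation of a classic hybrid automaton, the bouncing ball:
# one continuous mode (free fall: dx/dt = v, dv/dt = -g) plus a discrete
# transition (the bounce) when the guard x <= 0 with v < 0 fires, with the
# reset v := -c * v. Forward-Euler discretization; all parameters are
# illustrative.

def simulate_bouncing_ball(x0=10.0, v0=0.0, g=9.81, c=0.8,
                           dt=1e-3, t_end=5.0):
    """Return the trajectory as a list of (t, x, v) samples."""
    t, x, v = 0.0, x0, v0
    traj = [(t, x, v)]
    while t < t_end:
        # continuous evolution: one Euler step of the flow
        x += v * dt
        v += -g * dt
        # discrete transition: guard x <= 0 and v < 0, reset v := -c * v
        if x <= 0.0 and v < 0.0:
            x = 0.0
            v = -c * v
        t += dt
        traj.append((t, x, v))
    return traj
```

The simulation produces one concrete trajectory; reachability analysis, by contrast, would cover the behaviors of *all* initial heights in an interval at once, which is exactly what a single simulation run cannot do.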
Krish Chakrabarty: Advances in Design Automation Techniques for Digital-Microfluidic Biochips
Due to their emergence as an efficient platform for point-of-care clinical diagnostics, digital-microfluidic biochips (DMFBs) have received considerable attention in recent years. They combine electronics with biology, and they integrate various bioassay operations, such as sample preparation, analysis, separation, and detection. In this lecture, we first present an overview of digital-microfluidic biochips. We next describe emerging computer-aided design (CAD) tools for the automated synthesis and optimization of biochips from bioassay protocols. The lecture includes solutions for fluidic-operation scheduling, module placement, droplet routing, and pin-constrained chip design. We also show how recent advances in the integration of sensors into a DMFB can be exploited to provide cyber-physical system adaptation based on feedback-driven control.
Session V: Verification & Test
A. Haxthausen: Model Checking and Model-Based Testing in the Railway Domain
This lecture describes some approaches and emerging trends for verification and model-based testing of railway control systems. We describe state-of-the-art methods and associated tools for verifying interlocking systems and their configuration data, using bounded model checking and k-induction. Using real-world models of novel Danish interlocking systems, we exemplify how this method scales up and is suitable for industrial application. For verification of the integrated HW/SW system performing the interlocking control tasks, a model-based hardware-in-the-loop testing approach is presented. We discuss the trade-off between complete test strategies, which are capable of uncovering every error in implementations within a given fault domain, and the unmanageably large test suites these strategies typically create. Pragmatic approaches resulting in manageable test suites with good test strength are explained. Interlocking systems represent just one class among many where concrete system instances are created from generic representations, using configuration data to determine the behaviour of the instances. We explain how the systematic transition from generic to concrete instances in the development path is complemented by associated transitions in the verification and testing paths.
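The core idea of bounded model checking — searching for an invariant violation within a fixed number of transition steps — can be sketched with explicit-state unrolling. Industrial tools encode the unrolling as a SAT/SMT formula instead; the toy transition system in the usage note below is invented for illustration.

```python
# Explicit-state sketch of bounded model checking (illustration only;
# real BMC, as used for interlocking verification, unrolls the transition
# relation into a SAT/SMT formula rather than enumerating states).

def bmc(init, step, invariant, bound):
    """Explore all states reachable in at most `bound` steps from `init`.
    init:      set of initial states
    step:      function mapping a state to the set of successor states
    invariant: predicate that must hold in every reachable state
    Return a counterexample trace violating `invariant`, or None."""
    frontier = [(s, [s]) for s in init]
    seen = set(init)
    for _ in range(bound):
        next_frontier = []
        for s, trace in frontier:
            if not invariant(s):
                return trace              # violation within the bound
            for s2 in step(s):
                if s2 not in seen:        # simple on-the-fly state caching
                    seen.add(s2)
                    next_frontier.append((s2, trace + [s2]))
        frontier = next_frontier
    for s, trace in frontier:             # check the deepest frontier too
        if not invariant(s):
            return trace
    return None
```

For a toy counter stepping by 3 modulo 7 from state 0, with the invariant "state is never 5", a bound of 4 steps suffices to find the counterexample trace 0, 3, 6, 2, 5, while a bound of 3 reports no violation; k-induction, covered in the lecture, is what turns such bounded results into unbounded proofs.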
B. Becker: Modeling Unknown Values in Test and Verification
With increasing complexity and a component-based design style, the handling of unknown values (e.g. at the interfaces of components) becomes more and more important in electronic design automation (EDA) and production processes. Tools are required that allow an accurate modeling of unknowns, in combination with algorithms balancing exactness of representation against efficiency of calculation. In this lecture, state-of-the-art approaches are described that enable an efficient and successful handling of unknown values using formal techniques in the areas of test and verification.
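A classic way to model unknowns is three-valued (0/1/X) gate evaluation, sketched below. Note its built-in pessimism: `or3(a, not3(a))` yields X when a is X, although the exact value is always 1 — precisely the loss of exactness that the formal techniques mentioned above aim to recover.

```python
# Three-valued (0/1/X) evaluation of basic gates, the standard simulation
# model for unknown values in test and verification. "X" denotes unknown;
# a controlling value (0 for AND, 1 for OR) decides the output even when
# the other input is X.

def and3(a, b):
    if a == 0 or b == 0:
        return 0            # controlling 0 dominates any X
    if a == 1 and b == 1:
        return 1
    return "X"

def or3(a, b):
    if a == 1 or b == 1:
        return 1            # controlling 1 dominates any X
    if a == 0 and b == 0:
        return 0
    return "X"

def not3(a):
    return "X" if a == "X" else 1 - a
```

The pessimism shows up as soon as signals reconverge: three-valued simulation reports X for `or3(a, not3(a))` with a = X, whereas an exact (e.g. SAT-based) analysis proves the output is 1 for both values of a.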