Bernhard Rinner

Professor and Chair of Pervasive Computing

Institute of Networked and Embedded Systems (NES), Alpen-Adria-Universität Klagenfurt

Research Overview

 

Bernhard Rinner and his team work on the analysis, design and evaluation of pervasive networked embedded systems. Their research focuses on smart camera and sensor networks, autonomous aerial (UAV) networks and resource coordination in networks. The research methods include modeling and simulation, sensor data analysis and machine learning, and prototyping of embedded platforms as well as system-level software. Research prototypes have been applied in traffic monitoring, first responder support, privacy protection and surveillance.

Research Areas

Research is conducted in close cooperation with national and international partners in both academia and industry. Our partners include TU Graz, TU Wien, Univ. Genova, UPC Barcelona, Queen Mary University of London, ETH Zurich, Imperial College London, Univ. Paderborn, Univ. Birmingham, Univ. Oslo, National Univ. Singapore, TTTech, AIT and EADS. Research is funded by the Carinthian Research Promotion Fund (KWF), Lakeside Labs, the Austrian Research Promotion Agency (FFG) and the European Commission. Over the last six years, acquired third-party funding has exceeded EUR 3 million.


Smart Camera Networks

 

Smart camera networks are real-time distributed embedded systems that perform computer vision using multiple cameras. They have emerged thanks to the simultaneous advances in four key disciplines: computer vision, image sensors, embedded computing, and sensor networks.

"We aim at advancing this field of research by applying novel networking concepts as well as by developing various prototypes," Bernhard Rinner explains. Analyzing the captured data within the network in real time is important to avoid transferring large volumes of video data over the network.

The strong resource limitations are challenging and require efficient algorithms and network management. The research team has developed various camera platforms and has deployed several camera networks in both indoor and outdoor environments. Prototype installations include traffic monitoring, environmental monitoring and surveillance.

Another thread of research is security and privacy preservation in smart cameras. "The central aspect is to protect all sensitive data before it leaves the camera," Thomas Winkler points out. "We can use the computing power in modern camera systems not only for video analysis but also for onboard privacy protection."
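The protect-before-transmit idea can be sketched as follows. All names and the keying scheme below are illustrative, not the published design; in the actual systems the camera key would be guarded by trusted hardware on the camera rather than held in software.

```python
import hashlib
import hmac
import json

# Hypothetical per-camera secret; a real camera would keep this inside
# protected hardware so it never appears in ordinary memory.
CAMERA_KEY = b"example-camera-key"

def protect_frame(frame, sensitive_regions):
    """Sketch of onboard protection: frame maps region names to raw bytes.
    Sensitive regions are replaced by a one-way digest so identities cannot
    be recovered downstream; the whole record is then authenticated so
    tampering after it leaves the camera is detectable."""
    protected = {}
    for name, data in frame.items():
        if name in sensitive_regions:
            protected[name] = hashlib.sha256(data).hexdigest()
        else:
            protected[name] = data.hex()
    payload = json.dumps(protected, sort_keys=True).encode()
    tag = hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()
    return payload, tag

payload, tag = protect_frame(
    {"face": b"\x01\x02", "background": b"\x03\x04"},
    sensitive_regions={"face"},
)
```

Only the redacted payload and its authentication tag ever leave the camera; the raw pixels of the sensitive region stay onboard.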

Bernhard Rinner has been involved in smart camera research for more than a decade. The following links document some highlights of that period.

Key Publications
Research Team
  • Bernhard Rinner (PI)
  • Adam Erdelyi (PhD student)
  • Ihtesham Haider (PhD student)
  • Philipp Hübner (PhD student)
  • Felix Pletzer (2009-2012)
  • Wolfgang Schriebl (2007-2011)
  • Melanie Schranz (PhD student)
  • Thomas Winkler (PostDoc)
  • Subhan Ullah (PhD student)
Projects

Resource Coordination in Sensor Networks

 

Wireless sensor networks are typically used to monitor buildings, agricultural areas, and other remote environments. In most applications, the monitoring tasks have to be performed within quality constraints and with limited computing and energy resources. The large number of sensors compensates for the lack of processing power of individual sensors and allows the network to fulfill complex jobs. However, in many cases, task coordination within the network is required to provide monitoring quality and a reasonable network lifetime at the same time.

In current sensor networks, tasks are coordinated manually or semi-automatically. This is infeasible in large or remote networks. More sophisticated, automatic approaches are required, in which tasks adapt to the environmental situation to provide context-sensitive monitoring. Bernhard Rinner explains: "Some sensors focus on movement detection first and then change to person tracking. When changing to a new task, we must consider the resources needed to perform it." Image resolution, frame rate and the complexity of the processing tasks influence the resource consumption. Bernhard Dieber, a researcher in the Pervasive Computing group, is developing a distributed approach for dynamic reconfiguration that achieves the intended network operation with minimal resources.
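The core trade-off can be illustrated with a toy reconfiguration step: among candidate task configurations, a node picks the cheapest one that still satisfies the required monitoring quality. Configuration names and numbers here are invented for illustration, not taken from the project.

```python
# Each candidate configuration pairs a monitoring-quality estimate with a
# resource cost (an abstraction of image resolution, frame rate and task
# complexity).
configs = [
    {"name": "motion_low",  "quality": 0.4, "cost": 1.0},
    {"name": "motion_high", "quality": 0.6, "cost": 2.5},
    {"name": "track_low",   "quality": 0.7, "cost": 4.0},
    {"name": "track_high",  "quality": 0.9, "cost": 7.0},
]

def reconfigure(configs, required_quality):
    """Pick the cheapest configuration that still meets the quality bound;
    fall back to the highest-quality option if none qualifies."""
    feasible = [c for c in configs if c["quality"] >= required_quality]
    if feasible:
        return min(feasible, key=lambda c: c["cost"])
    return max(configs, key=lambda c: c["quality"])

# Switching from motion detection to person tracking raises the quality
# requirement, which in turn forces a costlier configuration.
chosen = reconfigure(configs, 0.65)
```

Raising `required_quality` from 0.4 to 0.65 moves the selection from the cheap motion-detection configuration to a tracking configuration, mirroring the movement-detection-to-tracking example in the quote above.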

Similar research topics are also investigated in the multi-disciplinary project EPiCS funded by the European Commission in the Future and Emerging Technologies program. Lukas Esterle and Bernhard Rinner collaborate with the University of Birmingham to develop methods for resource-aware tracking handover in camera networks. Their socio-economic approach applies auction mechanisms to control the tracking resources in highly dynamic environments. Recently, Esterle and Dieber merged their approaches into a holistic framework for resource-aware reconfiguration of visual sensor networks.
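The auction idea can be sketched as follows. The bid function and all values are illustrative stand-ins, not the published mechanism; a second-price (Vickrey) auction is used here because it makes truthful bidding the dominant strategy.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    """Hypothetical camera node with a visibility score and a load level."""
    name: str
    visibility: float   # how well the camera currently sees the object (0..1)
    load: float         # fraction of processing resources already in use (0..1)

    def bid(self) -> float:
        # A camera bids its expected tracking utility minus the marginal
        # cost of taking on another tracking task.
        return max(0.0, self.visibility - 0.5 * self.load)

def vickrey_handover(cameras):
    """Sell the tracking task to the highest bidder; the winner pays the
    second-highest bid (Vickrey auction)."""
    ranked = sorted(cameras, key=lambda c: c.bid(), reverse=True)
    winner = ranked[0]
    price = ranked[1].bid() if len(ranked) > 1 else 0.0
    return winner, price

cams = [Camera("cam_a", 0.9, 0.2), Camera("cam_b", 0.7, 0.1), Camera("cam_c", 0.3, 0.8)]
winner, price = vickrey_handover(cams)
```

When the tracked object leaves a camera's field of view, the camera auctions the task off; neighbours with good visibility and spare resources outbid overloaded ones, so the handover allocates tracking effort without any central controller.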

Key Publications
Research Team
  • Bernhard Rinner (PI)
  • Bernhard Dieber (2008-2013)
  • Lukas Esterle (Post-Doc)
  • Herwig Guggi (PhD student)
  • Umair Khan (2010-2013)
  • Jennifer Simonjan (PhD student)
Projects

Multi-UAV Systems

 

Small-scale unmanned aerial vehicles (UAVs) are increasingly used in civil and commercial applications for monitoring, surveillance, and disaster response. Equipped with cameras and other sensors, these autonomously flying robots can quickly sense the environment from a bird's eye view or transport goods from one place to another. Such systems help first responders in case of disasters - flooding, mudslide, forest fire, and earthquake - to quickly assess the situation and coordinate action forces.

Check out uav.aau.at for more information on multi-UAV research at Klagenfurt University.

For some applications, it is beneficial if a team of coordinated UAVs rather than a single UAV is employed. Multiple UAVs can cover a given area faster or take photos from different perspectives at the same time. This emerging technology is still at an early stage and, consequently, profound research and development efforts are needed.

As part of the research cluster Lakeside Labs, a team of seven researchers and four professors in Klagenfurt has been working on multi-UAV systems since 2008. The team has developed a system that provides functionality similar to Google Earth or Microsoft Virtual Earth but captures small areas at much higher resolution: a user first outlines the area of interest on a map. The UAVs then fly over the specified area, take images, and provide an accurate and up-to-date overview picture of the environment.

"Our solution requires mission planning and coordination for multiple aerial vehicles," explains senior researcher Markus Quaritsch. The system computes the flight routes for the individual UAVs, taking into account the maximum flight time imposed by battery constraints. A flight route consists of a sequence of waypoints specified in GPS coordinates, the flight altitude, and a set of actions for every waypoint, such as taking a photo or setting the camera orientation. The UAVs autonomously fly according to the computed plan without any need for human intervention.
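A flight route of this kind maps directly onto a small data structure. The sketch below (field names are hypothetical) also shows a deliberately naive way to split a waypoint plan among several UAVs; the real mission planner optimizes routes against flight-time constraints rather than cutting the list into equal runs.

```python
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    lat: float          # GPS latitude in degrees
    lon: float          # GPS longitude in degrees
    altitude_m: float   # flight altitude in metres
    actions: list = field(default_factory=list)  # e.g. ["take_photo"]

@dataclass
class FlightRoute:
    uav_id: str
    waypoints: list

def split_route(waypoints, n_uavs):
    """Naive partitioning: hand each UAV a contiguous run of waypoints so
    that the plan is divided evenly across the team."""
    per_uav = -(-len(waypoints) // n_uavs)  # ceiling division
    return [FlightRoute(f"uav_{i}", waypoints[i * per_uav:(i + 1) * per_uav])
            for i in range(n_uavs)]

# A small photo grid over an area of interest, split between two UAVs.
grid = [Waypoint(46.61 + 0.001 * i, 14.26, 100.0, ["take_photo"]) for i in range(6)]
routes = split_route(grid, 2)
```

Each UAV then simply flies its `waypoints` list in order and executes the attached actions, which is why no human interaction is needed during the mission.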

The UAVs are equipped with different sensors (e.g., thermal camera, conventional photo camera) to build a multi-layered overview image. The pictures taken are pre-processed on board the UAV and sent to the ground station during flight. At the ground station the individual pictures are mosaicked into a large overview image. The pictures show significant perspective distortions due to the low flight altitude of less than 150 meters. Under the guidance of Bernhard Rinner, the team has developed and patented an incremental approach for computing the overview picture that provides quick feedback to the user and is geometrically correct and visually appealing at the same time.

The real-world applicability of the system is a major objective. It has been demonstrated several times by coordinating firefighters performing service drills. "We recently participated in a large-scale forest fire exercise in the Austrian-Slovenian border region," Quaritsch says. "Other demonstrations included the monitoring of a large construction site near Vienna and the observation of an industrial accident." The lessons learned from such applied research will further the work of a recently founded spin-off company.

Key Publications
Research Team
  • Bernhard Rinner (PI)
  • Asif Khan (PhD student)
  • Markus Quaritsch (2008-2012)
  • Omair Sarwar (PhD student)
  • Jürgen Scherer (PhD student)
  • Daniel Wischounig-Strucl (2009-2013)
  • Saeed Yahyanejad (2010-2015)
Projects

Funded Projects

 
Self-Organizing Intelligent Network of UAVs (SINUS)
[project site]
 

The design of a self-organizing system of multiple, autonomous UAVs is founded on key building blocks for sensing, communication & networking, and coordination. The SINUS project focuses on the integration of these blocks and their interaction to effectively close the sensing-networking-acting loop within the multi-UAV system. Such a tight integration is necessary for deploying self-organizing UAVs in dynamic and partly unknown environments.

This project is joint work with the Mobile Systems group and the Multimedia Communications group (both Klagenfurt University). It is supported by Lakeside Labs GmbH and funded by the European Regional Development Fund and the Carinthian Economic Promotion Fund.

Trustworthy Sensing and Collaboration in VSN (TrustEYE)
[project site]
 

In most visual sensor networks (VSNs), sensitive personal data is captured and analyzed. A few partial approaches towards security and privacy protection in VSNs exist, but systematically establishing a secure and privacy-preserving VSN is still an open research question. The strong resource limitations, the dynamic data analysis and the spontaneous collaboration patterns among camera nodes are the main reasons for the lack of holistic security and privacy protection.

The fundamental hypothesis of this research is that trust in resource-limited VSNs can be established by making security and privacy protection inherent properties of the image sensing unit. The key idea is to "protect" access to the sensor and encapsulate dedicated security and privacy functionality in the TrustEYE, a secure sensing unit embedded on the smart camera.

The TrustEYE research project has been reviewed and scientifically approved by the Austrian Science Fund (FWF) and is funded by the Carinthian Economic Promotion Fund (KWF). The project is conducted together with Prof. Mohan Kankanhalli at National University of Singapore (NUS).

ICE Doctoral School
[project site]
 

Five European universities have recently established an international doctoral school on Interactive and Cognitive Environments (ICE) within the Erasmus Mundus initiative of the European Commission. The ICE-consortium is comprised of the University of Genova (coordinator), UPC Barcelona, TU Eindhoven, Queen Mary University of London and Klagenfurt University. The Erasmus Mundus proposal has been accepted for funding in a highly competitive selection procedure. Klagenfurt is the only Austrian University participating in an Erasmus Mundus Joint Doctorate among the currently funded 34 doctoral schools.

The ICE doctoral school started in fall 2010 and is expected to run for at least seven years. The five universities award a joint PhD title. The European Commission will fund 10 to 15 PhD positions each year; a fifth of these positions will be assigned to Klagenfurt. Each researcher obtains a three-year employment contract. In the first year, four international PhD researchers joined the ICE school with Klagenfurt University as their home university under this funding.

Engineering Proprioception in Computing Systems (EPiCS)
[project site]
 

The EPiCS project aims at laying the foundation for engineering the novel class of proprioceptive computing systems. Proprioceptive computing systems collect and maintain information about their state and progress, which enables self-awareness by reasoning about their behaviour, and self-expression by effectively and autonomously adapting their behaviour to changing conditions. The Pervasive Computing group contributes research on the self-organization of visual sensor networks.

This project is joint work of eight partners - Univ. Paderborn (coordinator), Imperial College London, Univ. Oslo, Univ. Birmingham, EADS Munich, ETH Zurich, AIT Vienna and Klagenfurt University - and is funded by the FP7-ICT Future and Emerging Technologies (FET) program.

Mobile Traffic Checker (MobiTrick)
 

The focus of the MobiTrick project is outdoor mobile computer vision with all of its challenges. Mobile systems need to be compact and energy efficient and frequently change locations; they must therefore operate autonomously and perform processing locally. A number of challenges arise from these requirements, for which the project aims to provide solutions. Being compact, there is not much space for a large number of sensors such as laser scanners, radar antennas and the like; the work in this project will therefore focus on stereo vision, but with two different types of cameras. In addition, the system must be designed to be very energy efficient, so new approaches for dynamic power management will be explored in the project. To put the work into context, several applications from the area of traffic surveillance and toll enforcement will be implemented and tested in an application-oriented setting.

This project is joint work with Graz University of Technology and EFKON AG Graz and is supported by the Austrian Research Promotion Agency (FFG) via the FIT-IT [visual computing] program. The MobiTrick project received the third-place award for the best research proposal in the FIT-IT [visual computing] program line.

Smart Resource-Aware Multi-Sensor Network (SRSnet)
 

The SRSnet project focuses on the design of a smart resource-aware multi-sensor network capable of autonomously detecting and localizing various events such as screams, animal noises, tracks of persons and more complex human behaviors. The project's research areas include (i) collaborative audio and video analysis, (ii) complex event detection and (iii) network reconfiguration. SRSnet will be demonstrated in an environmental case study at the Hohe Tauern National Park.

This project is joint work with the Institute of Smart Systems Technologies at Klagenfurt University, the University of Udine and Lakeside Labs. It is funded by the European Interreg 4 Fund and the Carinthian Economic Promotion Fund.

Collaborative Microdrones (cDrones)
[project site]
 

The "Collaborative Microdrones (cDrones)" project develops a system for aerial sensing based on cooperating, wirelessly networked unmanned aerial vehicles (microdrones). Several microdrones fly in formation over the area of interest in a self-organizing manner and deliver high-quality sensor data such as images or videos. These images are fused on the ground, analyzed in real time, and delivered to the user. The project performs original research in the areas of (1) flight formation, (2) mission planning and control, and (3) sensor data interpretation, and it will demonstrate a collaborative microdrone system for fire response operations.

This project is joint work with the Mobile Systems group, the Multimedia Communications group and the Intelligent Systems and Business Informatics group (all Klagenfurt University), the Institute of Automation and Control at Graz University of Technology and the Computer Vision group at University of Central Florida. It is supported by Lakeside Labs GmbH and funded by the European Regional Development Fund and the Carinthian Economic Promotion Fund (grant 20214/17095/24772).

Self-organizing Multimedia Architecture (SOMA)
[project site]
 

The project "Self-organizing Multimedia Architecture (SOMA)" aims to capture the whole life-cycle of multimedia content in a single architecture for large distributed multimedia information systems. A network of smart sensors reports events, captured in multimedia data units, to a distribution network. In the distribution network, events are analyzed, processed, stored, and prepared for delivery. Events and related continuous data are either pushed to users on a subscription basis or consumed by users via pull mechanisms.

This project is joint work with the Institute of Information Technology at Klagenfurt University and ASFiNAG Mautservice, Austria. It is supported by Lakeside Labs GmbH and funded by the European Regional Development Fund and the Carinthian Economic Promotion Fund (grant 20214/17095/24774).

Closed-Loop Integration of Cognition, Communication and Control (CLIC)
 

The objective of the CLIC (Closed-Loop Integration of Cognition, Communication and Control) project is to integrate real-time image analysis, adaptive motion control, and synchronous communication between the imaging and control subsystems. This integration is demonstrated by optimizing the trajectory control of a crane additionally equipped with distributed smart cameras observing the crane's environment.

This project is joint work with the Automation and Control Institute (Vienna University of Technology), the Institute of Computer Engineering (Vienna University of Technology), and TTTech AG Vienna and is supported by the Austrian Research Promotion Agency (FFG) via the FIT-IT [embedded systems] program (grant no. 819482).

Autonomous Traffic Monitoring by Embedded Vision (EVis)
[project site]
 

This project deals with the evaluation and prototype development of an embedded vision platform for autonomous traffic monitoring, in which image analysis is performed directly on the embedded camera system.

This project is joint work with the Institute of Computer Vision and Graphics at Graz University of Technology and EFKON AG Graz and is supported by the Austrian Research Promotion Agency (FFG) via the FIT-IT [visual computing] program (grant no. 813399).

Multi-Camera Data Aggregation & Visualization (McDAV)
[project site]
 

Smart cameras perform image analysis onboard and deliver the abstracted data. By combining data delivered from multiple cameras observing the same scene we can further increase the usefulness of smart camera networks. An important goal for such multi-camera systems is to resolve object occlusions by aggregating views from different angles. The aim of this research is to develop a data aggregation and visualization system which is able to combine the high-level output of smart cameras to form a three-dimensional model of a scene.

This research is conducted in cooperation with the Austrian Institute of Technology, Vienna.

Embedded Multi-Sensor Fusion Framework (I-SENSE)
 

This project deals with the evaluation and prototype development of a distributed embedded platform for online data fusion. This multi-sensor fusion architecture is targeted at various applications such as smart embedded systems, remote sensing, pervasive computing and monitoring.

This project is joint work with EVK Graz and is supported by the Austrian Research Promotion Agency (grant no. 812204).

A HW/SW Architecture for Embedded Data Fusion (Di-Fuse)
 

This project deals with the evaluation and development of middleware services for distributed embedded systems. These services focus on dynamic reconfiguration, which enables the modification of tasks during operation and the migration of tasks onto different embedded computing nodes. The dynamic reconfiguration methods are demonstrated in a novel traffic surveillance system.

This project is joint work with the Austrian Research Centers Seibersdorf (ARCS) and is supported by the Austrian Research Promotion Agency (grant no. 81072).

Smart Cameras (SmartCam)
 

The goal of this project is to evaluate and develop an embedded smart camera targeted for various surveillance applications such as traffic control. This camera is realized as an embedded system with tight power restrictions and combines video sensing, video processing and communication within a single device. It captures a video stream, computes high-level traffic information such as stationary vehicle detection and motion analysis, and transfers the compressed video stream and traffic information to a network node. The smart camera is realized using a CMOS sensor, high-performance signal processors and dedicated hardware.

This project is joint work with the Austrian Research Centers Seibersdorf (ARCS) and the Institute for Computer Graphics and Vision at TU Graz. This project is supported by Texas Instruments.

Intelligent Multi-Sensor System
 

This project deals with the design, evaluation and development of an infrastructure for combining/connecting various intelligent (video) sensors. This infrastructure provides mechanisms for transferring multimedia data from the sensors as well as for configuring/migrating computational and communication tasks depending on the context of the sensors. The intelligent sensors are based on SmartCams; the infrastructure is deployed in traffic surveillance applications.

Performance Modeling and Prediction in Automation Systems
 

This project deals with modeling and predicting the performance of the PUMA Open automation system. This complex automation system is targeted at the design and test of engines, transmissions and power trains and combines real-time and non-real-time processing on the same platform. An important goal of this project is to introduce performance modeling into the design process in order to reduce the development time of new configurations of the automation system.

This project is joint work with AVL Graz GmbH.

Monitoring and Diagnosis of Technical Systems
 

The goal of this project is to investigate and develop a model-based monitoring and diagnosis system (MDS) for online operation. The MDS has been implemented using process control computers and PCs and has been demonstrated on monitoring and diagnosing a complex heating system.

This project is supported by the Austrian Science Fund under grant number P14233-INF.

Prototyping Environment for multi-DSP Systems (PEPSY)
 

This project aims at automating the design and implementation of multi-processor applications subject to various design constraints. The prototyping tool PEPSY maps a dataflow-oriented application onto a multi-processor system, generates a static schedule for all processors and synthesizes the complete source code. PEPSY has been fully implemented and tested on various applications in the areas of signal processing and power-aware computing.
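Static scheduling of a dataflow application can be illustrated with a simple greedy list scheduler. This is a generic sketch of the technique, not the PEPSY algorithm itself; communication costs and the code-synthesis step are omitted.

```python
import heapq

def list_schedule(tasks, deps, durations, n_procs):
    """Greedy list scheduling: repeatedly start a ready task on the
    processor that becomes free earliest, respecting dataflow precedence.
    Returns {task: (processor, start_time, finish_time)}."""
    indeg = {t: 0 for t in tasks}
    for a, b in deps:
        indeg[b] += 1
    finish = {}                                   # task -> finish time
    procs = [(0.0, p) for p in range(n_procs)]    # (free_at, processor id)
    heapq.heapify(procs)
    ready = [t for t in tasks if indeg[t] == 0]
    schedule = {}
    while ready:
        t = ready.pop(0)
        free_at, p = heapq.heappop(procs)
        # A task cannot start before all of its predecessors have finished.
        start = max([free_at] + [finish[a] for a, b in deps if b == t])
        end = start + durations[t]
        finish[t] = end
        schedule[t] = (p, start, end)
        heapq.heappush(procs, (end, p))
        for a, b in deps:                         # release newly ready tasks
            if a == t:
                indeg[b] -= 1
                if indeg[b] == 0:
                    ready.append(b)
    return schedule

# Tiny dataflow graph: A and B feed C; two processors available.
schedule = list_schedule(
    tasks=["A", "B", "C"],
    deps=[("A", "C"), ("B", "C")],
    durations={"A": 2, "B": 3, "C": 1},
    n_procs=2,
)
```

Because the schedule is computed entirely offline, a code generator can bake the task order and processor assignment directly into the synthesized source, which is the essence of the prototyping flow described above.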

Self-calibrating Monitoring
 

The goal of this project was to develop a model-based monitoring system that exploits measurements from the supervised system in order to refine its initial system model. This self-calibrating monitoring system was implemented using qualitative reasoning techniques.

This project was performed at the Department of Computer Sciences at the University of Texas and was supported by the Austrian Science Fund under grant number J1429-MAT.

Best Signal Selection
 

The goal of this project was to evaluate various techniques for determining the quality of different transmission channels. A prototype was developed which selects, online, the channel with the highest speech quality as output. The prototype was implemented on a standard PC equipped with signal processor and audio boards.
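The selection principle can be sketched as follows. The crude frame-energy metric below is a placeholder for the real speech-quality measures the project evaluated; channel names and sample values are invented.

```python
import math

def quality_db(samples, noise_floor=1e-3):
    """Crude per-frame quality estimate: ratio of mean frame energy to an
    assumed noise floor, in decibels. A real system would use a proper
    speech-quality measure instead."""
    energy = sum(s * s for s in samples) / len(samples)
    return 10.0 * math.log10(max(energy, 1e-12) / noise_floor)

def best_channel(frames):
    """frames maps a channel name to its sample buffer for the current
    frame; return the channel with the highest quality estimate."""
    return max(frames, key=lambda ch: quality_db(frames[ch]))

# One processing frame from two hypothetical radio channels.
frames = {
    "radio_1": [0.02, -0.01, 0.03],   # weak signal
    "radio_2": [0.4, -0.3, 0.5],      # strong signal
}
selected = best_channel(frames)
```

Running this selection once per frame yields the online behaviour described above: the output always follows whichever channel currently scores best.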

This project was joint work with Frequentis Nachrichtentechnik GmbH Wien.

Distributed Computer Architecture for Qualitative Simulation
 

The goal of this project was to develop a specialized computer architecture for the AI-application "qualitative simulation". The performance of this application has been improved by two orders of magnitude by parallelization and software/hardware migration. The prototype architecture has been implemented using digital signal processors and FPGAs.

This project was supported by the Austrian Science Fund under grant number P10441-MAT.