
Ubiquitous Computing


The Ubiquitous Computing paradigm describes a world in which electronic devices are not only unobtrusively embedded in the human environment but also interact with each other to support people in their daily lives. Ubiquitous Computing therefore spans a wide range of research topics related to Pervasive Computing, Wearable Computing, Ambient Intelligence, Intelligent Environments, Automotive Computing, and Smart Homes. The context- and location-awareness of Ubiquitous Computing systems is the key enabler for many application scenarios such as carsharing, navigation systems, location-based marketing, home automation, and context-aware services in general. The field depends heavily on underlying wireless communication technologies such as 5G networks, WLAN, Bluetooth, and NFC, and many research questions concern the optimization of protocols and algorithms that run over these wireless networks.

Our research related to Ubiquitous Computing focuses on developing context-aware middleware for smart mobile devices such as smartphones, wearables, and tablets. Our work encompasses the design and development of proactive context- and location-based service infrastructures for indoor and outdoor environments, as well as the investigation of new methods for the datafication of mobile data. By enhancing context-aware systems with the latest spatiotemporal data analytics techniques, we envision intelligent environments that autonomously learn typical behavior patterns and adapt mobile applications to situational needs. These self-adaptive systems raise several interesting research challenges, including automatic data processing and analysis, energy-efficient context-awareness, and profile-based adaptation of mobile devices, all of which are addressed within our research group.
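To make the idea of rule-based, proactive adaptation concrete, the following is a minimal sketch of a context-aware middleware core. All names (`Context`, `Rule`, `ContextMiddleware`) and the example rule are hypothetical illustrations, not the group's actual API: the middleware holds a set of rules, and on every context update it evaluates each rule's condition against the sensed values and collects the adaptations that should fire.

```python
# Minimal sketch of rule-based context-aware adaptation.
# All class and field names here are hypothetical, for illustration only.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Context:
    """Snapshot of sensed environmental properties (location, time, etc.)."""
    values: Dict[str, object] = field(default_factory=dict)


@dataclass
class Rule:
    """Fires an adaptation action when its condition holds for the current context."""
    condition: Callable[[Context], bool]
    action: Callable[[Context], str]


class ContextMiddleware:
    """Evaluates registered rules against each incoming context update."""

    def __init__(self) -> None:
        self.rules: List[Rule] = []

    def add_rule(self, rule: Rule) -> None:
        self.rules.append(rule)

    def on_context_update(self, ctx: Context) -> List[str]:
        """Return the adaptations triggered by the new context snapshot."""
        return [r.action(ctx) for r in self.rules if r.condition(ctx)]


mw = ContextMiddleware()
# Example rule: dim the lights when the user is at home late in the evening.
mw.add_rule(Rule(
    condition=lambda c: c.values.get("location") == "home"
                        and c.values.get("hour", 0) >= 22,
    action=lambda c: "dim_lights",
))
print(mw.on_context_update(Context({"location": "home", "hour": 23})))  # ['dim_lights']
```

The sketch deliberately triggers adaptations only from the present context snapshot; capturing how a situation evolves over time is exactly the gap the context flow graph formalism below addresses.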


Context Flow Graphs: Situation Modeling for Rule-based Proactive Context-aware Systems
Authors: Rodriguez Garzon, S. and Louis, B.
Journal: IEEE Access, 2020, pp. 1-22
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3040060
Abstract: A proactive context-aware system automatically adapts its user interface to the user's situational needs. This is achieved by continuously capturing environmental properties, reasoning upon the context, and detecting situations where unsolicited adjustments are helpful or notifications informative. If the characteristics of those situations are well known in advance, their occurrence can be detected at runtime by the rule-based processing of raw sensor data. However, rule-based context reasoning methods determine the user's situation mostly from present sensor signals instead of considering the situation to be likewise the product of past context. This article introduces a graph-based situation modeling formalism for the specification of system-relevant environmental circumstances as context flow graphs. A directed cyclic graph thereby represents the distinct contextual characteristics a user's situation is composed of, together with the temporal order in which these appear and disappear during the evolution of the situation. Complex situations for rule-based proactive context-aware systems can then be expressed at a high level of abstraction and without the need to understand the underlying sensor-related signal processing mechanisms. The technical feasibility is demonstrated by a prototypical distributed proactive context-aware middleware that additionally provides a web-based user interface for the interactive graphical and logical modeling of situations as context flow graphs.
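The abstract's core idea, a directed cyclic graph whose nodes are contextual characteristics and whose edges encode the temporal order in which they appear, can be sketched as a small state machine. This is a simplified illustration of the concept only, not the paper's formalism or implementation; the node names and the "arriving home" example are invented for this sketch.

```python
# Sketch of situation detection over a context flow graph (illustration only;
# not the paper's actual formalism). Nodes are named contextual characteristics;
# edges give the temporal order in which they may appear. Cycles are allowed,
# so a characteristic can persist or recur while the situation evolves.
from typing import Dict, Set


class ContextFlowGraph:
    """Tracks progress through a situation as context observations arrive."""

    def __init__(self, edges: Dict[str, Set[str]], start: str, accepting: str) -> None:
        self.edges = edges          # state -> set of valid successor states
        self.accepting = accepting  # reaching this state means "situation detected"
        self.current = start

    def observe(self, state: str) -> bool:
        """Advance along an edge if the observed state is a valid successor;
        return True once the accepting state is reached."""
        if state in self.edges.get(self.current, set()):
            self.current = state
        return self.current == self.accepting


# Hypothetical example: "arriving home in the evening" unfolds as
# leave_work -> commute (possibly repeated) -> at_home.
graph = ContextFlowGraph(
    edges={"leave_work": {"commute"}, "commute": {"at_home", "commute"}},
    start="leave_work",
    accepting="at_home",
)
detected = False
for event in ["commute", "commute", "at_home"]:
    detected = graph.observe(event)
print(detected)  # True
```

Unlike a purely present-state rule, the graph only reports the situation after the observations have appeared in the modeled temporal order, which is the property the abstract highlights over conventional rule-based reasoning.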


TU Berlin - Service-centric Networking - TEL 19
Ernst-Reuter-Platz 7
10587 Berlin, Germany
Phone: +49 30 8353 58811
Fax: +49 30 8353 58409