This page provides tools, data sets, and results dealing with the automatic recognition of human activities (e.g., walking) as well as activities of daily living (e.g., watering plants).
Please refer to our publications if you use our datasets, tools, results, etc.
Please do not hesitate to contact me if something is missing or not working!
Applications
Our aim is to develop robust activity recognition methods based on mobile device sensors that produce high-quality results in real-world settings. This section provides our apps, including source code and documentation. We provide a comprehensive data collector app and are currently developing a HAR app for everyday life.
Data Sets
The data set covers the acceleration, GPS, gyroscope, light, magnetic field, and sound level data of fifteen subjects performing the activities climbing stairs down and up, jumping, lying, standing, sitting, running/jogging, and walking. For each activity, data was recorded simultaneously at the on-body positions chest, forearm, head, shin, thigh, upper arm, and waist.
Frameworks
For processing the recorded sensor data, we developed several useful frameworks. Our frameworks are freely available and can be modified or enhanced without further requests. Currently, we focus on deriving useful information from accelerometer data, i.e., segmenting the data stream and computing meaningful features.
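The two steps mentioned above — segmenting the stream and computing features per segment — can be sketched as follows. This is an illustrative sketch, not the framework's actual code; the window size, 50% overlap, and the mean feature are example choices.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: segment a 1-D accelerometer stream into fixed-size,
// 50%-overlapping windows and compute the mean of each window as a
// simple time-domain feature. Window size and step are example values.
public class WindowSegmenter {

    // Split the signal into windows of `size` samples, advancing by `step`.
    static List<double[]> segment(double[] signal, int size, int step) {
        List<double[]> windows = new ArrayList<>();
        for (int start = 0; start + size <= signal.length; start += step) {
            double[] w = new double[size];
            System.arraycopy(signal, start, w, 0, size);
            windows.add(w);
        }
        return windows;
    }

    // Mean value of a window -- a classic time-domain feature.
    static double mean(double[] window) {
        double sum = 0.0;
        for (double v : window) sum += v;
        return sum / window.length;
    }

    public static void main(String[] args) {
        double[] signal = {0, 1, 2, 3, 4, 5, 6, 7};
        // windows of 4 samples with step 2 (50% overlap): 3 windows
        for (double[] w : segment(signal, 4, 2)) {
            System.out.println(mean(w)); // 1.5, 3.5, 5.5
        }
    }
}
```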
This page provides an overview of the features supported by our Sensor Feature Factory framework. In detail, it covers 10 different features that can be computed on time- and frequency-domain values. Our framework uses the JTransforms library to transform the recorded sensor data from the time domain to the frequency domain, which enables computing, e.g., the expended energy.
$$IG(S,F) = E(S) - \sum\limits_{v \in Values(F)} \frac{|S_v|}{|S|} \cdot E(S_v)$$ $$\text{where}\ S_v = \{s \in S \mid F(s)=v\}$$ $$E(S) = -\sum\limits_{i=1}^{|C|} P(i) \cdot \log_2(P(i))$$ $$\text{where}\ P(i)\ \text{is the fraction of examples in $S$ which are assigned the label $c_i$}$$
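The entropy and information-gain formulas above translate directly into code. The following is a minimal sketch (names and the small example are illustrative): `labels` holds the class label of each example in S, and `featureValues` holds F(s) for each example.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class InfoGain {

    // E(S) = -sum over classes of P(i) * log2(P(i))
    static double entropy(int[] labels) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (int l : labels) counts.merge(l, 1, Integer::sum);
        double e = 0.0;
        for (int c : counts.values()) {
            double p = (double) c / labels.length;
            e -= p * (Math.log(p) / Math.log(2));
        }
        return e;
    }

    // IG(S, F) = E(S) - sum_v (|S_v| / |S|) * E(S_v),
    // where S_v groups the examples by their value of feature F.
    static double informationGain(int[] labels, int[] featureValues) {
        Map<Integer, List<Integer>> partitions = new HashMap<>();
        for (int i = 0; i < labels.length; i++) {
            partitions.computeIfAbsent(featureValues[i], k -> new ArrayList<>())
                      .add(labels[i]);
        }
        double remainder = 0.0;
        for (List<Integer> part : partitions.values()) {
            int[] sub = part.stream().mapToInt(Integer::intValue).toArray();
            remainder += ((double) sub.length / labels.length) * entropy(sub);
        }
        return entropy(labels) - remainder;
    }

    public static void main(String[] args) {
        int[] labels  = {1, 1, 0, 0};
        int[] feature = {0, 0, 1, 1};  // perfectly separates the classes
        System.out.println(entropy(labels));                  // 1.0
        System.out.println(informationGain(labels, feature)); // 1.0
    }
}
```

A feature that perfectly separates the classes yields an information gain equal to the full entropy; a feature that tells us nothing yields 0.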
Energy (Fourier, Parseval)
$$Energy(Y) = \frac{1}{n} \cdot \sum\limits_{i=1}^{n} |F_i|^2$$ $$\text{where $F_i$ is the $i$-th component of the Fourier transform of $Y$}$$
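The energy feature above can be sketched as follows. The framework uses JTransforms for the FFT; this dependency-free sketch uses a naive O(n²) DFT instead, which computes the same quantity. By Parseval's theorem the result equals the sum of the squared time-domain samples, which gives a handy sanity check.

```java
public class FourierEnergy {

    // Energy(Y) = (1/n) * sum_i |F_i|^2, where F is the DFT of Y.
    // A naive DFT stands in for JTransforms' FFT to keep this self-contained.
    static double energy(double[] y) {
        int n = y.length;
        double sum = 0.0;
        for (int k = 0; k < n; k++) {
            double re = 0.0, im = 0.0;
            for (int t = 0; t < n; t++) {
                double angle = -2.0 * Math.PI * k * t / n;
                re += y[t] * Math.cos(angle);
                im += y[t] * Math.sin(angle);
            }
            sum += re * re + im * im;  // |F_k|^2
        }
        return sum / n;
    }

    public static void main(String[] args) {
        double[] y = {1, 2, 3, 4};
        // By Parseval's theorem this equals 1 + 4 + 9 + 16 = 30.
        System.out.println(energy(y));
    }
}
```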
Online Random Forest Classifier
This is an implementation of an online random forest classifier in Java. The implementation details and architecture rely on the implementation of Amir Saffari. However, we enhanced the original implementation in several ways.
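The core idea behind such online ensembles is online bagging (Oza and Russell, also used by Saffari et al.): each incoming sample updates each ensemble member k ~ Poisson(1) times, which approximates bootstrap sampling in a streaming setting. The sketch below shows only this ensemble mechanism; the base learner is a trivial per-class running mean on a single feature, a stand-in for the real online decision trees, and all names are illustrative.

```java
import java.util.Random;

// Sketch of Poisson online bagging. NearestMean is a placeholder base
// learner (per-class running mean, nearest-mean prediction), not the
// online trees used in the actual classifier.
public class OnlineBagging {

    static class NearestMean {
        double[] sum = new double[2];
        int[] count = new int[2];

        void update(double x, int label) {
            sum[label] += x;
            count[label]++;
        }

        int predict(double x) {
            double m0 = count[0] > 0 ? sum[0] / count[0] : 0.0;
            double m1 = count[1] > 0 ? sum[1] / count[1] : 0.0;
            return Math.abs(x - m0) <= Math.abs(x - m1) ? 0 : 1;
        }
    }

    final NearestMean[] ensemble;
    final Random rng;

    OnlineBagging(int size, long seed) {
        ensemble = new NearestMean[size];
        for (int i = 0; i < size; i++) ensemble[i] = new NearestMean();
        rng = new Random(seed);
    }

    // Draw k ~ Poisson(1) using Knuth's method.
    int poisson1() {
        double l = Math.exp(-1.0), p = 1.0;
        int k = 0;
        do { k++; p *= rng.nextDouble(); } while (p > l);
        return k - 1;
    }

    // Each ensemble member sees the sample k ~ Poisson(1) times.
    void update(double x, int label) {
        for (NearestMean m : ensemble) {
            int k = poisson1();
            for (int i = 0; i < k; i++) m.update(x, label);
        }
    }

    // Majority vote over the ensemble.
    int predict(double x) {
        int votes = 0;
        for (NearestMean m : ensemble) votes += m.predict(x);
        return votes * 2 >= ensemble.length ? 1 : 0;
    }

    public static void main(String[] args) {
        OnlineBagging ob = new OnlineBagging(10, 42);
        for (int i = 0; i < 200; i++) {  // class 0 near 0, class 1 near 10
            ob.update(0.0, 0);
            ob.update(10.0, 1);
        }
        System.out.println(ob.predict(1.0));  // expect 0
        System.out.println(ob.predict(9.0));  // expect 1
    }
}
```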
This page provides the results of our experiments. Each result file contains the test results (F-Measure, Confusion Matrix, ...) for each individual subject. However, the results are not aggregated, i.e., they do not include average values across all subjects. We applied 10-fold cross-validation and performed 10 runs, where each time the data set was randomized and the 10 folds were recreated. These results belong to the publication On-body Localization of Wearable Devices: An Investigation of Position-Aware Activity Recognition.
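The evaluation protocol described above — 10 runs, each reshuffling the instances before rebuilding the 10 folds — can be sketched as follows. This is an illustrative reconstruction of the protocol, not the actual evaluation code; the seed and instance count are example values.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class RepeatedKFold {

    // Shuffle the instance indices, then deal them round-robin into k folds.
    static List<List<Integer>> folds(int n, int k, Random rng) {
        List<Integer> idx = new ArrayList<>();
        for (int i = 0; i < n; i++) idx.add(i);
        Collections.shuffle(idx, rng);
        List<List<Integer>> result = new ArrayList<>();
        for (int f = 0; f < k; f++) result.add(new ArrayList<>());
        for (int i = 0; i < n; i++) result.get(i % k).add(idx.get(i));
        return result;
    }

    public static void main(String[] args) {
        Random rng = new Random(7);
        for (int run = 0; run < 10; run++) {  // 10 randomized runs
            List<List<Integer>> f = folds(100, 10, rng);
            // within a run, each fold serves once as the test set,
            // with the remaining 9 folds as the training set
            System.out.println("run " + run + ": " + f.size() + " folds of "
                    + f.get(0).size() + " instances");
        }
    }
}
```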
This page provides the results of our experiments. Each result file contains the test results for each individual patient/day, i.e., the individual and aggregated results represented by precision, recall, and F-measure. Further, we provide the sensor and instance-based results as well as the precomputed instance candidates. These results belong to the publication Unsupervised Recognition of Interleaved Activities of Daily Living through Ontological and Probabilistic Reasoning.
This page provides the results of our experiments. Each result file contains the test results (F-Measure, Confusion Matrix, ...) for each individual subject. These results belong to the publication Position-Aware Activity Recognition with Wearable Devices.
As mentioned in the paper, the XES files provided on this page were derived from data sets created by other researchers. The original data set "Activity Log UCI" was created by Ordóñez et al., while hh102, hh104, and hh110 originate from Cook et al.
Personal Processes (Fuzzy Model)
Number
Short Description
Related To
Result
#1
Main personal activity for all users during the working days (frequency)
This page provides the results of our experiments. Each result file contains the test results (Precision, Recall, F-measure) for each individual subject. However, the results are not aggregated, i.e., they do not include average values across all subjects, positions, or combinations thereof. These results belong to the publication Online Personalization of Cross-Subjects based Activity Recognition Models on Wearable Devices.
Hint
All experiments were performed using a random forest classifier. We considered two different versions of this classifier: online and offline learning. The former was re-implemented by us (source code) based on the work of A. Saffari et al., while the latter was developed by M. Hall et al.
Preprocessed Acceleration Data (Windows)
Data Set
Short Description
Download
Size [MB]
#1
Single sensor setup, separated by position and subject
This page provides the results of our experiments. Each result file contains the test results for each individual patient/day, i.e., the individual and aggregated results represented by precision, recall, and F-measure. These results belong to the publication POLARIS: Probabilistic and Ontological Activity Recognition in Smart-homes.
This page provides the results of our experiments. Each result file contains the individual, i.e., non-aggregated, results. The results were analyzed with pivot tables in Excel. These results belong to the publication Hips Do Lie! A Position-Aware Mobile Fall Detection System.
Datasets
In the following, the datasets we considered in our experiments are listed. Source links to the original dataset, Publication links to the related publication, and Download provides our preprocessed version of the original dataset. Further information can be found here (DataSets) and here (Traces).
This application records all common built-in sensors of wearable devices. In addition, labeling and visualization functions are included.
The application recognizes the performed activity, e.g., walking, running, and sitting. It also considers and recognizes the on-body position of the wearable device.
Sensor Data Collector
This application allows recording all common built-in sensors of wearable devices. It is possible to specify the sampling rate and to record several sensors simultaneously. In addition, it is possible to label the recorded data by specifying the current activity, posture, and location. The recorded data can be visualized as well as exported in several formats. Combining the smart-phone with at least one smart-watch also enables recording the same sensor simultaneously at different on-body positions. Further, the smart-watch provides an additional interface which allows controlling the application without taking the smart-phone in hand. Hence, the user can easily update the current location or activity, which helps to create an accurate labeling of the data.
Features
Recording of all built-in sensors (simultaneously)
Labeling recorded data (Activity, Location, Posture)
Visualization of recorded data as well as live plotting
Provides several export formats (SQLite, CSV)
Also supports smart-watches (sensor recording, additional interface)
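As an illustration of the CSV export listed above, a labeled sample might be serialized as one row per reading. The column layout and the helper below are hypothetical, not the app's actual schema.

```java
// Hypothetical sketch of a CSV export: one row per sample with
// timestamp, accelerometer axes, and the user-supplied labels.
public class CsvExport {

    static String toCsvRow(long timestampMs, double x, double y, double z,
                           String activity, String position) {
        return String.format(java.util.Locale.US, "%d,%.4f,%.4f,%.4f,%s,%s",
                timestampMs, x, y, z, activity, position);
    }

    public static void main(String[] args) {
        System.out.println("timestamp_ms,acc_x,acc_y,acc_z,activity,position");
        System.out.println(toCsvRow(1000, 0.12, 9.81, 0.05, "walking", "waist"));
    }
}
```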
Please select a subject for more details or individual downloads. (Download, ~3.5GB)
Description
The data set covers acceleration, GPS, gyroscope, light, magnetic field, and sound level data of the activities climbing stairs down and up, jumping, lying, standing, sitting, running/jogging, and walking of fifteen subjects (age 31.9±12.4 years, height 173.1±6.9 cm, weight 74.1±13.8 kg, eight males and seven females). For each activity, we simultaneously recorded the acceleration at the body positions chest, forearm, head, shin, thigh, upper arm, and waist. Each subject performed each activity for roughly 10 minutes, except for jumping (~1.7 minutes) due to the physical exertion. The amount of data is equally distributed between male and female subjects. Each movement was recorded by a video camera to facilitate usage of the data set.
DataSet - Daily Log (ADL) (2016)
Please select a subject for more details or individual downloads. (Download, ~4.6GB)
Description
Seven individuals (age 23.1±1.81 years, height 179.0±9.09 cm, weight 80.6±9.41 kg, all male) collected accelerometer, device orientation, and GPS sensor data and labeled this data simultaneously. The data was collected using a smart-phone (sensor recording) and a smart-watch (manual labeling). The subjects were not supervised but received an introduction and guidelines.
DataSet - First-Person View (ADL) (2018)
A description of the subjects will be added soon! (Download, ~1.0GB)