Human Activity Recognition

This page provides tools, data sets, and results for the automatic recognition of human physical activities (e.g., walking) as well as activities of daily living (e.g., watering plants).


Please refer to our publications if you use our datasets, tools, results, etc.

Please do not hesitate to contact me if something is missing or not working!

Applications

Our aim is to develop robust activity recognition methods based on mobile device sensors that produce high-quality results in real-world settings. This section provides our apps, including source code and documentation. We provide a comprehensive data collector app and are currently developing a HAR app for everyday life.

Get it on Google Play

Data Sets

The data set covers the acceleration, GPS, gyroscope, light, magnetic field, and sound level data of the activities climbing stairs down and up, jumping, lying, standing, sitting, running/jogging, and walking of fifteen subjects. For each activity, the on-body positions chest, forearm, head, shin, thigh, upper arm, and waist were simultaneously recorded.

Our Data Set

Frameworks

For processing the recorded sensor data, we developed useful frameworks. Our frameworks are freely available and can be modified or enhanced without further requests. Currently, we focus on deriving useful information from accelerometer data, i.e., segmenting the data stream and computing meaningful features.

Get it on Github
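
To illustrate the segmentation step mentioned above, the following minimal sketch splits an acceleration stream into fixed-size windows with 50% overlap. The window length, overlap, and class name are illustrative values only and are not part of our framework's actual API.

import java.util.ArrayList;
import java.util.List;

public class WindowSegmenter {

    /** Splits a signal into windows of the given size with the given step (overlap = size - step). */
    public static List<double[]> segment(double[] signal, int windowSize, int stepSize) {
        List<double[]> windows = new ArrayList<>();
        for (int start = 0; start + windowSize <= signal.length; start += stepSize) {
            double[] window = new double[windowSize];
            System.arraycopy(signal, start, window, 0, windowSize);
            windows.add(window);
        }
        return windows;
    }

    public static void main(String[] args) {
        double[] accelerationX = new double[500];                 // e.g. 10 s of data at 50 Hz (illustrative)
        List<double[]> windows = segment(accelerationX, 50, 25);  // 1 s windows, 50% overlap
        System.out.println("number of windows: " + windows.size());
    }
}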

[Interactive subject browser: for each subject, gender, age, height, weight, and occupation are listed together with the recorded on-body positions head, forearm, chest, shin, thigh, upper arm, and waist.]

[DailyLog data set: for each subject, the Activity Log (csv, xes), Environment Log (csv), Posture Log (csv), Acceleration Sensor (csv), Orientation Sensor (csv), and GPS Sensor (csv) can be downloaded, together with the number of recorded days and a daily routine visualization (fuzzy model).]
Please cite the corresponding work if you use our dataset, results, tools, source code, etc.

Journal Paper / Book Section

Position-Aware Activity Recognition with Wearable Devices (2017)
Published in: Pervasive and Mobile Computing
(Timo Sztyler, Heiner Stuckenschmidt and Wolfgang Petrich)
Self-Tracking Reloaded: Applying Process Mining to Personalized Health Care from Labeled Sensor Data (2016)
Published in: Transactions on Petri Nets and Other Models of Concurrency
(Timo Sztyler, Josep Carmona, Johanna Völker and Heiner Stuckenschmidt)

Conference Paper

Hips Do Lie! A Position-Aware Mobile Fall Detection System (2018)
Published in: Pervasive Computing and Communications
(Christian Krupitzer, Timo Sztyler, Janick Edinger, Martin Breitbach, Heiner Stuckenschmidt and Christian Becker)
NECTAR: Knowledge-based Collaborative Active Learning for Activity Recognition (2018)
Published in: Pervasive Computing and Communications
(Gabriele Civitarese, Claudio Bettini, Timo Sztyler, Daniele Riboni and Heiner Stuckenschmidt)
Recognizing Grabbing Actions from Inertial and Video Sensor Data in a Warehouse Scenario (2017)
Published in: Procedia Computer Science
(Alexander Diete, Timo Sztyler, Lydia Weiland and Heiner Stuckenschmidt)
Online Personalization of Cross-Subjects based Activity Recognition Models on Wearable Devices (2017)
Published in: Pervasive Computing and Communications
(Timo Sztyler and Heiner Stuckenschmidt)
Unsupervised Recognition of Interleaved Activities of Daily Living through Ontological and Probabilistic Reasoning (2016)
Published in: Pervasive and Ubiquitous Computing
(Daniele Riboni, Timo Sztyler, Gabriele Civitarese and Heiner Stuckenschmidt)
On-body Localization of Wearable Devices: An Investigation of Position-Aware Activity Recognition (2016)
Published in: Pervasive Computing and Communications
(Timo Sztyler and Heiner Stuckenschmidt)

Workshop / Demo Paper

Modeling and Reasoning with ProbLog: An Application in Recognizing Complex Activities (2018)
Published in: Pervasive Computing and Communication Workshops
(Timo Sztyler, Gabriele Civitarese and Heiner Stuckenschmidt)
Towards Systematic Benchmarking of Activity Recognition Algorithms (2018)
Published in: Pervasive Computing and Communication Workshops
(Timo Sztyler, Christian Meilicke and Heiner Stuckenschmidt)
Improving Motion-based Activity Recognition with Ego-centric Vision (2018)
Published in: Pervasive Computing and Communication Workshops
(Alexander Diete, Timo Sztyler, Lydia Weiland and Heiner Stuckenschmidt)
Towards Real World Activity Recognition from Wearable Devices (2017)
Published in: Pervasive Computing and Communication Workshops
(Timo Sztyler)
A Smart Data Annotation Tool for Multi-Sensor Activity Recognition (2017)
Published in: Pervasive Computing and Communication Workshops
(Alexander Diete, Timo Sztyler and Heiner Stuckenschmidt)
Exploring a Multi-Sensor Picking Process in the Future Warehouse (2016)
Published in: Pervasive and Ubiquitous Computing: Adjunct
(Alexander Diete, Timo Sztyler, Lydia Weiland and Heiner Stuckenschmidt)
Discovery of Personal Processes from Labeled Sensor Data (2015)
Published in: Algorithms & Theories for the Analysis of Event Data
(Timo Sztyler, Johanna Völker, Josep Carmona, Oliver Meier and Heiner Stuckenschmidt)

People

Timo Sztyler
University of Mannheim

Heiner Stuckenschmidt
University of Mannheim

This page provides an overview of the features supported by our Sensor Feature Factory framework. In detail, it covers 10 different features which can be used in the context of time- and frequency-domain values. Our framework uses the JTransforms library to transform the recorded sensor data (time domain) to the frequency domain, which makes it possible to compute, e.g., the expended energy.
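
As a rough illustration of this pipeline, the following sketch computes the mean (time domain) and the Parseval energy (frequency domain) of an acceleration window using JTransforms. The class and method names of the sketch are illustrative and do not mirror our framework's API; the import assumes the JTransforms 3.x package layout.

import org.jtransforms.fft.DoubleFFT_1D;   // JTransforms 3.x; older releases use a different package

public class FeatureSketch {

    static double mean(double[] x) {
        double sum = 0;
        for (double v : x) sum += v;
        return sum / x.length;
    }

    /** Energy per Parseval: (1/n) * sum of squared Fourier magnitudes. */
    static double energy(double[] x) {
        int n = x.length;
        double[] fft = new double[2 * n];              // full complex output: re, im pairs
        System.arraycopy(x, 0, fft, 0, n);
        new DoubleFFT_1D(n).realForwardFull(fft);      // time domain -> frequency domain
        double sum = 0;
        for (int i = 0; i < n; i++) {
            double re = fft[2 * i], im = fft[2 * i + 1];
            sum += re * re + im * im;                  // |F_i|^2
        }
        return sum / n;
    }

    public static void main(String[] args) {
        double[] window = {0.1, 0.4, 9.8, 9.6, 0.2, 0.0, 9.7, 9.9}; // toy acceleration window
        System.out.println("mean = " + mean(window) + ", energy = " + energy(window));
    }
}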

Framework

Supported Platforms
Source Code:
Documentation: coming soon

Implemented Features

Mean

$$\bar x = \frac{1}{n}*\sum\limits_{i=1}^n x_i$$

Variance

$$var(x) = \frac{1}{n}*\sum\limits_{i=1}^n (x_i - \bar x)^2$$

Standard Deviation

$$\sigma_x = \sqrt{var(x)}$$

Median

$$\widetilde{x}=\begin{cases}x_{\frac{n+1}{2}}&\text{$n$ odd}\\\frac{1}{2}*(x_{\frac{n}{2}}+x_{\frac{n}{2}+1})&\text{$n$ even}\end{cases}$$ $$\text{where}\ \forall x_i,x_j \in X:\ i\le j \Rightarrow x_i\le x_j$$

Interquartile Range (type R-5)

$$iqr = Q_{0.75}-Q_{0.25}$$ $$Q_p = x_{\lfloor h \rfloor} + (h-\lfloor h \rfloor)(x_{\lfloor h \rfloor+1}-x_{\lfloor h \rfloor})$$ $$h = Np+\frac{1}{2}$$ $$\text{where}\ \forall x_i,x_j \in X:\ i\le j \Rightarrow x_i\le x_j$$
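
To make the R-5 definition concrete, the following sketch computes a quantile of type R-5 and the resulting interquartile range. It is an illustration of the formula above, not our framework's implementation.

import java.util.Arrays;

public class QuantileR5 {

    /** Quantile of type R-5: h = n*p + 1/2, linear interpolation between adjacent order statistics. */
    static double quantileR5(double[] values, double p) {
        double[] x = values.clone();
        Arrays.sort(x);                          // x_1 <= x_2 <= ... <= x_n
        int n = x.length;
        double h = n * p + 0.5;                  // 1-based position between order statistics
        if (h <= 1) return x[0];                 // below the smallest order statistic
        if (h >= n) return x[n - 1];             // above the largest order statistic
        int lower = (int) Math.floor(h);
        return x[lower - 1] + (h - lower) * (x[lower] - x[lower - 1]);
    }

    public static void main(String[] args) {
        double[] window = {3, 1, 4, 1, 5, 9, 2, 6};
        double iqr = quantileR5(window, 0.75) - quantileR5(window, 0.25);
        System.out.println("IQR = " + iqr);      // 5.5 - 1.5 = 4.0 for this toy window
    }
}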

Mean absolute deviation

$$mad = \frac{1}{n}*\sum\limits_{i=1}^n |x_i - \bar x|$$

Kurtosis

$$w = \frac{1}{n}\sum\limits_{i=1}^n (\frac{x_i - \bar x}{\sigma_x})^4$$

Correlation Coefficient (Pearson)

$$r_{xy} = \frac{\sum\limits_{i=1}^n (x_i - \bar x)(y_i - \bar y)}{\sqrt{\sum\limits_{i=1}^n (x_i - \bar x)^2}*\sqrt{\sum\limits_{i=1}^n (y_i - \bar y)^2}}$$

Entropy (Shannon)

$$IG(S,F) = E(S) - \sum\limits_{v \in Values(F)} \frac{|S_v|}{|S|}*E(S_v)$$ $$\text{where}\ S_v = \{s \in S\ |\ F(s)=v\}$$ $$E(S) = -\sum\limits_{i=1}^{|C|} P(i)*\log_2(P(i))$$ $$\text{where}\ P(i)\ \text{is the fraction of examples in $S$ which are assigned the label $c_i$}$$

Energy (Fourier, Parseval)

$$Energy(Y) = \frac{1}{n}*\sum\limits_{i=1}^{n} F_i^2$$ $$\text{where $F_i$ is the i-th component of the Fourier Transform of Y}$$

This is an implementation of an online random forest classifier in Java. The implementation details and architecture rely on the implementation of Amir Saffari. However, we enhanced the original implementation in several ways.

ORF Classifier

Supported Platforms
Source Code:
Documentation: coming soon

Supports

Quality Measurements: Information Gain, Gini Index
Feature Threshold: random, incremental
Threading: yes
Export/Import: coming soon
Java: 1.7 and newer (compatible with Android)

Usage

DataSet arffTrain = new ARFF();                        // training data
arffTrain.load("data.arff");
DataSet arffTest = new ARFF();                         // test data ("test.arff" is a placeholder name)
arffTest.load("test.arff");
RandomForest rf = new RandomForest(...);               // forest configuration (number of trees, etc.)

Map<Result, String> results = new HashMap<>();         // label type assumed to be String

for (int nEpoch = 0; nEpoch < config.numEpochs; nEpoch++) { // train
  arffTrain.randomize();
  for (Sample sample : arffTrain.getSamples()) {
    rf.update(sample);                                 // online update with a single sample
  }
}

for (Sample sample : arffTest.getSamples()) { // test
  Result result = new Result(arffTest.getNumOfClasses());
  rf.eval(sample, result);
  results.put(result, sample.getLabel());
}
More: Runnable example

POLARIS: Probabilistic and Ontological Activity Recognition in Smart-homes (Under Review)

Recognition of activities of daily living (ADLs) is an enabling technology for several ubiquitous computing applications. In this field, most activity recognition systems rely on supervised learning to extract activity models from labeled datasets. A problem with that approach is the acquisition of comprehensive activity datasets, which ...

Modeling and Reasoning with ProbLog: An Application in Recognizing Complex Activities (2018)

Smart-home activity recognition is an enabling tool for a wide range of ambient assisted living applications. The recognition of ADLs usually relies on supervised learning or knowledge-based reasoning techniques. In order to overcome the well-known limitations of those two approaches and, at the same time, to combine their strengths to ...

Hips Do Lie! A Position-Aware Mobile Fall Detection System (2018)

Ambient Assisted Living using mobile device sensors is an active area of research in pervasive computing. In our work, we aim to implement a self-adaptive pervasive fall detection approach that is suitable for real life situations. The paper focuses on the problem that the device’s on-body position but also the falling pattern of a person ...

Online Personalization of Cross-Subjects based Activity Recognition Models on Wearable Devices (2017)

Human activity recognition using wearable devices is an active area of research in pervasive computing. In our work, we target patients and elders which are unable to collect and label the required data for a subject-specific approach. For that purpose, we focus on the problem of cross-subjects based recognition models and introduce an ...

Position-Aware Activity Recognition with Wearable Devices (2017)

Reliable human activity recognition with wearable devices enables the development of human-centric pervasive applications. We aim to develop a robust wearable-based activity recognition system for real life situations. Consequently, in this work we focus on the problem of recognizing the on-body position of the wearable device ensued by ...

Self-Tracking Reloaded: Applying Process Mining to Personalized Health Care from Labeled Sensor Data (2016)

Currently, there is a trend to promote personalized health care in order to prevent diseases or to have a healthier life. Using current devices such as smart-phones and smart-watches, an individual can easily record detailed data from her daily life. Yet, this data has been mainly used for self-tracking in order to enable personalized ...

Unsupervised Recognition of Interleaved Activities of Daily Living through Ontological and Probabilistic Reasoning (2016)

Recognition of activities of daily living (ADLs) is an enabling technology for several ubiquitous computing applications. Most activity recognition systems rely on supervised learning methods to extract activity models from labeled datasets. An inherent problem of that approach consists in the acquisition of comprehensive activity datasets, ...

On-body Localization of Wearable Devices: An Investigation of Position-Aware Activity Recognition (2016)

Human activity recognition using mobile device sensors is an active area of research in pervasive computing. In our work, we aim at implementing activity recognition approaches that are suitable for real life situations. This paper focuses on the problem of recognizing the on-body position of the mobile device which in a real world setting ...

Info

This page provides the results of our experiments. Each result file contains the test results (F-measure, confusion matrix, ...) for each individual subject. The results are not aggregated, i.e., they do not include values averaged over all subjects. We applied 10-fold cross-validation and performed 10 runs, where each time the data set was randomized and the 10 folds were recreated. These results belong to the publication On-body Localization of Wearable Devices: An Investigation of Position-Aware Activity Recognition.
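
For reference, the following sketch shows how such a protocol (10 runs of 10-fold cross-validation with re-randomized folds) can be expressed, e.g., with the Weka API. It is illustrative and not necessarily the exact code used for these experiments; the ARFF file name is a placeholder.

import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class CrossValidationProtocol {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("subject1_features.arff");   // placeholder file name
        data.setClassIndex(data.numAttributes() - 1);                 // last attribute = activity label

        for (int run = 0; run < 10; run++) {                          // 10 runs
            Evaluation eval = new Evaluation(data);
            // crossValidateModel randomizes the data with the given seed and recreates the 10 folds
            eval.crossValidateModel(new RandomForest(), data, 10, new Random(run));
            System.out.printf("run %d: weighted F-measure = %.3f%n", run, eval.weightedFMeasure());
            System.out.println(eval.toMatrixString("Confusion matrix:"));
        }
    }
}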

Random Forest

Test Short Description Related To Result
#1 Position Detection (all activities) Table II show
#2 Position Detection (only static activities) Table III show
#3 Position Detection (only static activities) incl. gravity feature Table IV show
#4 Position Detection (only dynamic activities) Table III show
#5 Position Detection (activity-level dependent) Table VI see #3,#4
#6 Activity Recognition (single classifier, all activities) Table VII,VIII show
#7 Activity Recognition (assumption: position is known for sure, all activities) - show
#8 Activity Recognition (based on the position detection result of RF, incl. all mistakes) Table IX,X show
#9 Distinction between static and dynamic activity (all activities) Table V show

Naive Bayes

Test Short Description Related To Result
#1 Position Detection (all activities) Table II show
#2 Position Detection (only static activities) Table III show
#3 Position Detection (only static activities) incl. gravity feature Table IV show
#4 Position Detection (only dynamic activities) Table III show
#5 Position Detection (activity-level dependent) Table VI see #3,#4
#6 Activity Recognition (single classifier, all activities) Table VII,VIII show
#7 Activity Recognition (assumption: position is known for sure, all activities) - show
#8 Activity Recognition (based on the position detection result of RF, incl. all mistakes) Table IX,X show
#9 Distinction between static and dynamic activity (all activities) Table V show

Artificial Neural Network

Test Short Description Related To Result
#1 Position Detection (all activities) Table II show
#2 Position Detection (only static activities) Table III show
#3 Position Detection (only static activities) incl. gravity feature Table IV show
#4 Position Detection (only dynamic activities) Table III show
#5 Position Detection (activity-level dependent) Table VI see #3,#4
#6 Activity Recognition (single classifier, all activities) Table VII,VIII show
#7 Activity Recognition (assumption: position is known for sure, all activities) - show
#8 Activity Recognition (based on the position detection result of RF, incl. all mistakes) Table IX,X show
#9 Distinction between static and dynamic activity (all activities) Table V show

Decision Tree

Test Short Description Related To Result
#1 Position Detection (all activities) Table II show
#2 Position Detection (only static activities) Table III show
#3 Position Detection (only static activities) incl. gravity feature Table IV show
#4 Position Detection (only dynamic activities) Table III show
#5 Position Detection (activity-level dependent) Table VI see #3,#4
#6 Activity Recognition (single classifier, all activities) Table VII,VIII show
#7 Activity Recognition (assumption: position is known for sure, all activities) - show
#8 Activity Recognition (based on the position detection result of RF, incl. all mistakes) Table IX,X show
#9 Distinction between static and dynamic activity (all activities) Table V show

k-Nearest-Neighbor

Test Short Description Related To Result
#1 Position Detection (all activities) Table II show
#2 Position Detection (only static activities) Table III show
#3 Position Detection (only static activities) incl. gravity feature Table IV show
#4 Position Detection (only dynamic activities) Table III show
#5 Position Detection (activity-level dependent) Table VI see #3,#4
#6 Activity Recognition (single classifier, all activities) Table VII,VIII show
#7 Activity Recognition (assumption: position is known for sure, all activities) - show
#8 Activity Recognition (based on the position detection result of RF, incl. all mistakes) Table IX,X show
#9 Distinction between static and dynamic activity (all activities) Table V show

Support Vector Machine

Test Short Description Related To Result
#1 Position Detection (all activities) Table II show
#2 Position Detection (only static activities) Table III show
#3 Position Detection (only static activities) incl. gravity feature Table IV show
#4 Position Detection (only dynamic activities) Table III show
#5 Position Detection (activity-level dependent) Table VI see #3,#4
#6 Activity Recognition (single classifier, all activities) Table VII,VIII show
#7 Activity Recognition (assumption: position is known for sure, all activities) - show
#8 Activity Recognition (based on the position detection result of RF, incl. all mistakes) Table IX,X show
#9 Distinction between static and dynamic activity (all activities) Table V show

Info

This page provides the results of our experiments. Each result file contains the test results for each individual patient/day, i.e., both the individual and the aggregated results, represented by precision, recall, and F-measure. Further, we provide the sensor- and instance-based results as well as the precomputed instance candidates. These results belong to the publication Unsupervised Recognition of Interleaved Activities of Daily Living through Ontological and Probabilistic Reasoning.

MLNNC solver: show
Ontology: show

WSU CASAS Dataset

MLNNC Model: show
Probabilities (Ontology-based): show
Dataset (External Link): show
Test Short Description Related To Result
#1 Probabilities derived from our ontology Table II,III show
#2 Probabilities derived from the data set Table II,III show

SmartFaber Dataset

MLNNC Model: show
Probabilities (Ontology-based): Coming soon
Dataset: Not publicly available due to data privacy
Test Short Description Related To Result
#1 Probabilities derived from our ontology Table IV,V show
#2 Probabilities derived from the data set Table IV,V show


This work was a cooperation between the University of Milano, the University of Mannheim, and the University of Cagliari.

Info

This page provides the results of our experiments. Each result file contains the test results (F-Measure, Confusion Matrix, ...) for each individual subject. These results belong to the publication Position-Aware Activity Recognition with Wearable Devices.

Hint

This work is an extension. Please see the following page for detailed results that correspond to Sections 5.1-5.3: On-body Localization of Wearable Devices: An Investigation of Position-Aware Activity Recognition (note that the table numbers have changed).

Subject-Specific Activity Recognition (Section 5.2)

Test Short Description Related To Result Size [MB]
#1 Single Sensor Figure V show ~20
#2 Two-part setup (all combinations) - show ~60
#3 Three-part setup (all combinations) - show ~100

Cross-Subjects Activity Recognition (Section 5.4)

Test Short Description Related To Result Size [MB]
#1 Dynamic Activity Recognition (Randomly, One accelerometer) Table 10, 11 show <1
#2 Dynamic Activity Recognition (L1O, One accelerometer) Table 10, 11 show <1
#3 Dynamic Activity Recognition (Top-Pairs, One accelerometer) Table 10, 11 show <1
#4 Dynamic Activity Recognition (Physical, One accelerometer) Table 10, 11 show <1
#5 Activity Recognition (Physical, One accelerometer) Table 12 show <1
#6 Activity Recognition (Physical, One accelerometer, including gravity feature for static activities) Table 12 show <1
#7 Activity Recognition (Physical, Two accelerometer, Only waist combinations) Table 12, 13 show <1
#8 Activity Recognition (Physical, Two accelerometer, Only waist combinations, including gravity feature for static activities) Table 12, 13 show <1
#9 Position Recognition (Randomly, One accelerometer) Table 12, 13 show <1
#10 Position Recognition (L1O, One accelerometer) Table 12, 13 show <1
#11 Position Recognition (Top-Pairs, One accelerometer) Table 12, 13 show <1
#12 Position Recognition (Physical, One accelerometer) Table 12, 13 show <1

Info

This page provides additional material for the publication Self-Tracking Reloaded: Applying Process Mining to Personalized Health Care from Labeled Sensor Data. In the following, we provide personal process maps and trace alignment clustering results of our experiments.

Hint

As mentioned in the paper, the XES files provided on this page were created from data sets collected by other researchers. The original data set "Activity Log UCI" was created by Ordóñez et al., whereas hh102, hh104, and hh110 originate from Cook et al.

Personal Processes (Fuzzy Model)

Number Short Description Related To Result
#1 Main personal activity for all users during the working days (frequency) 6.2 show
#2 Main personal activity for all users during the weekend days (frequency) 6.2 show
#3 Main personal activity for all users during the working days (duration) 6.2 show
#4 Main personal activity for all users during the weekend days (duration) 6.2 show

Personal Process Models (XES)

Number Short Description Related To Result
#1 Activity Log UCI Detailed (during the week) 6.2 show
#2 Activity Log UCI Detailed (Weekend) 6.2 show
#3 hh102 (during the week) 6.2 show
#4 hh102 (Weekend) 6.2 show
#5 hh104 (during the week) 6.2 show
#6 hh104 (Weekend) 6.2 show
#7 hh110 (during the week) 6.2 show
#8 hh110 (Weekend) 6.2 show

Trace Alignment

Number Short Description Related To Result
#1 Clustered Traces, Subject 1 (based on our data set) 7 show

Info

This page provides the results of our experiments. Each result file contains the test results (precision, recall, F-measure) for each individual subject. The results are not aggregated, i.e., they do not include values averaged over all subjects, positions, or combinations. These results belong to the publication Online Personalization of Cross-Subjects based Activity Recognition Models on Wearable Devices.

Hint

All experiments were performed using random forest as the classifier. We considered two different versions of this classifier: online and offline learning. The former was re-implemented by us (source code) based on the work of A. Saffari et al., whereas the latter was developed by M. Hall et al.

Preprocessed Acceleration Data (Windows)

Data Set Short Description Download Size [MB]
#1 Single sensor setup, separated by position and subject Download ~120
#2 Two-part setup, separated by combination and subject Download ~690
#3 Three-part setup, separated by combination and subject Download ~1600
#4 Four-part setup, separated by combination and subject Download ~2100
#5 Five-part setup, separated by combination and subject Download ~1600
#6 Six-part setup, separated by combination and subject Download ~640

Cross-Subjects Activity Recognition (Section VI, Part A)

Test Short Description Related To Result Size [MB]
#1 Randomly, One Accelerometer, Offline learning Table II show ~1
#2 Randomly, Two Accelerometer, Offline learning Table II/III/IV show ~2
#3 Randomly, Three Accelerometer, Offline learning Table II show ~3
#4 Randomly, Four Accelerometer, Offline learning Table II show ~3
#5 Randomly, Five Accelerometer, Offline learning Table II show ~2
#6 Randomly, Six Accelerometer, Offline learning Table II show ~1
#7 L1O, One Accelerometer, Offline learning Table II show <1
#8 L1O, Two Accelerometer, Offline learning Table II/III/IV show <1
#9 L1O, Three Accelerometer, Offline learning Table II show <1
#10 L1O, Four Accelerometer, Offline learning Table II show <1
#11 L1O, Five Accelerometer, Offline learning Table II show <1
#12 L1O, Six Accelerometer, Offline learning Table II show <1
#13 Our approach, One Accelerometer, Offline learning Table II show <1
#14 Our approach, Two Accelerometer, Offline learning Table II/III/IV show <1
#15 Our approach, Three Accelerometer, Offline learning Table II show <1
#16 Our approach, Four Accelerometer, Offline learning Table II show <1
#17 Our approach, Five Accelerometer, Offline learning Table II show <1
#18 Our approach, Six Accelerometer, Offline learning Table II show <1
#19 Subject-Specific, One Accelerometer, Offline learning - show <1
#20 Subject-Specific, Two Accelerometer, Offline learning - show <1

Personalization: Online and Active Learning (Section VI, Part B)

Test Short Description Related To Result Size [MB]
#1 Our approach, One Accelerometer, Online Learning - show <1
#2 Our approach + User-Feedback, One Accelerometer, Online Learning - show ~1
#3 Our approach + User-Feedback + Smoothing, One Accelerometer, Online Learning - show ~1
#4 Our approach, Two Accelerometer, Online Learning Table V/VI show <1
#5 Our approach + Smoothing, Two Accelerometer, Online Learning Table V/VI show ~2
#6 Our approach + User-Feedback, Two Accelerometer, Online Learning Table V/VI show ~2
#7 Our approach + User-Feedback + Smoothing, Two Accelerometer, Online Learning Table V/VI/VII/VIII, Figure 4 show ~2
#8 Our approach + User-Feedback + Smoothing (varying confidence threshold), Two Accelerometer, Online Learning Figure 5 show ~23
#9 Our approach + User-Feedback + Smoothing (varying number of trees), Two Accelerometer, Online Learning Figure 6 show ~27
#10 Subject-Specific, One Accelerometer, Online Learning - show ~1
#11 Subject-Specific, Two Accelerometer, Online Learning - show ~2

MLNNC solver: show
Ontology: show (unchanged compared to D. Riboni et al. 2016)

WSU CASAS Dataset (Publication)

MLNNC Model: show (also unchanged)
Test Short Description Related To Result
#1 - - coming soon

SmartFaber Dataset (Publication)

MLNNC Model: show (also unchanged)
Test Short Description Related To Result
#1 - - coming soon


This work was a cooperation between the University of Milano, the University of Mannheim, and the University of Cagliari.


Info

This page provides additional material belonging to the publication Modeling and Reasoning with ProbLog: An Application in Recognizing Complex Activities.

ProbLog Online Editor: show

ProbLog Models

Model Short Description Download Size [MB]
#1 Minimal ProbLog Model (Figure 1) Download <1
#2 Final ProbLog Model (Running Example) Download <1

This work was a cooperation between the University of Milano and the University of Mannheim.

Info

This page provides the results of our experiments. Each result file contains the individual (i.e., non-aggregated) results. The results were analyzed with pivot tables in Excel. These results belong to the publication Hips Do Lie! A Position-Aware Mobile Fall Detection System.

Datasets

In the following, the datasets we considered in our experiments are listed. Source provides a link to the original dataset, Publication provides a link to the related publication, and Download provides our preprocessed version of the original dataset. Further information can be found here (DataSets) and here (Traces).

Results

Test Short Description Detailed Description Related To Result
#1 Fall Detection ReadMe Table III-VI show
#2 Positioning ReadMe Table VII show
#3 Fullstack (FallDetection) ReadMe Table VIII show
#4 Fullstack (Positioning) ReadMe Table IX show
#5 Clustering ReadMe - show

Additional Results (F-measure)

Test Short Description Related To Result
#1 F0.5, F1, and F2 Table IV show
#2 F0.5, F1, and F2 Table V show
#3 F0.5, F1, and F2 Table VII show
#4 F0.5, F1, and F2 Table VIII show
#5 F0.5, F1, and F2 Table IX show

This work was a cooperation within the University of Mannheim.

Sensor Data Collector

Sensor Data Collector

PlayStore GitHub More

This application allows recording all common built-in sensors of wearable devices. In addition, a labeling and visualization function is included.

Physical Activity Recognition

HAR: Everyday Life

PlayStore GitHub More

The application recognizes the performed activity, e.g., walking, running, and sitting. It considers and recognizes the on-body position of the wearable device.

Android Sensor Data Collector
This application allows recording all common built-in sensors of wearable devices. It is possible to specify the sampling rate and to record several sensors simultaneously. In addition, it is possible to label the recorded data by specifying the current activity, posture, and location. The recorded data can be visualized as well as exported in several formats.
Combining the smart-phone with at least one smart-watch also makes it possible to record the same sensor simultaneously at different on-body positions. Further, the smart-watch provides an additional interface which allows controlling this application without taking the smart-phone in hand. Hence, the user can easily update the current location or activity, which helps to create an accurate labeling of the data.
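
For illustration, the following minimal sketch shows how a built-in sensor can be recorded at a chosen sampling rate with the standard Android SensorManager API. The class is illustrative only and is not taken from the app's source code.

import android.app.Service;
import android.content.Intent;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.IBinder;

public class AccelerationRecorder extends Service implements SensorEventListener {

    private SensorManager sensorManager;

    @Override
    public void onCreate() {
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        int samplingPeriodUs = 20_000;                       // 50 Hz, i.e. one sample every 20 ms
        sensorManager.registerListener(this, accelerometer, samplingPeriodUs);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.timestamp (ns) and event.values[0..2] (x, y, z in m/s^2) would be written
        // to storage here, together with the currently selected activity label.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* not needed for recording */ }

    @Override
    public void onDestroy() {
        sensorManager.unregisterListener(this);              // stop recording
    }

    @Override
    public IBinder onBind(Intent intent) { return null; }
}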

Features

  • Recording of all built-in sensors (simultaneously)
  • Labeling of recorded data (activity, location, posture)
  • Visualization of recorded data as well as live plotting
  • Several export formats (SQLite, CSV)
  • Smart-watch support (sensor recording, additional interface)
Please select a subject for more details or individual downloads. (Download, ~3.5GB)

Description

The data set covers acceleration, GPS, gyroscope, light, magnetic field, and sound level data of the activities climbing stairs down and up, jumping, lying, standing, sitting, running/jogging, and walking of fifteen subjects (age 31.9±12.4, height 173.1±6.9, weight 74.1±13.8, eight males and seven females). For each activity, we simultaneously recorded the acceleration at the body positions chest, forearm, head, shin, thigh, upper arm, and waist. Each subject performed each activity for roughly 10 minutes, except for jumping (~1.7 minutes) due to the physical exertion. The amount of data is distributed equally between male and female subjects. Each movement was recorded by a video camera to facilitate the use of the data set.
Please select a subject for more details or individual downloads. (Download, ~4.6GB)

Description

Seven individuals (age 23.1±1.81, height 179.0±9.09, weight 80.6±9.41, seven males) collected accelerometer, device orientation, and GPS sensor data and labeled this data simultaneously. The data was collected using a smart-phone (sensor recording) and a smart-watch (manual labeling). The subjects were not supervised but received an introduction and guidelines.