Sensors (Basel). 2020 Jul 22;20(15):4083. doi: 10.3390/s20154083.

LARa: Creating a Dataset for Human Activity Recognition in Logistics Using Semantic Attributes


Friedrich Niemann et al. Sensors (Basel). 2020.

Abstract

Optimizations in logistics require the recognition and analysis of human activities. The potential of sensor-based human activity recognition (HAR) in logistics is not yet well explored. Despite a significant increase in HAR datasets in the past twenty years, no available dataset depicts activities in logistics. This contribution presents the first freely accessible logistics dataset. In the 'Innovationlab Hybrid Services in Logistics' at TU Dortmund University, two picking scenarios and one packing scenario were recreated. Fourteen subjects were recorded individually while performing warehousing activities, using Optical marker-based Motion Capture (OMoCap), inertial measurement units (IMUs), and an RGB camera. A total of 758 min of recordings were labeled by 12 annotators in 474 person-hours. All the given data have been labeled and categorized into 8 activity classes and 19 binary coarse-semantic descriptions, also called attributes. The dataset is deployed for solving HAR using deep networks.

Keywords: attribute-based representation; dataset; human activity recognition; inertial measurement unit; logistics; motion capturing.
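The abstract's attribute-based representation (8 activity classes, 19 binary semantic attributes) can be sketched as follows. This is an illustration only, not the paper's actual annotation scheme: the class names are loosely modeled on common warehousing activities, and the class-to-attribute table is randomly generated here, whereas in the dataset it was annotated by hand.

```python
import numpy as np

# Illustrative placeholders: the real LARa classes and attributes differ.
CLASSES = ["standing", "walking", "cart", "handling_up", "handling_centred",
           "handling_down", "synchronization", "none"]  # 8 classes
N_ATTRS = 19                                            # 19 binary attributes

rng = np.random.default_rng(1)
# Hypothetical lookup table: one 19-dim binary attribute vector per class.
ATTR_TABLE = rng.integers(0, 2, size=(len(CLASSES), N_ATTRS))

def decode(attr_probs):
    """Map predicted per-attribute probabilities back to the nearest class
    by Hamming-style distance to each class's attribute vector."""
    pred = (np.asarray(attr_probs) >= 0.5).astype(int)
    dists = np.abs(ATTR_TABLE - pred).sum(axis=1)
    return CLASSES[int(np.argmin(dists))]

# Usage: noisy attribute predictions still decode to a class label.
probs = np.clip(ATTR_TABLE[2] + rng.normal(0, 0.2, N_ATTRS), 0, 1)
label = decode(probs)
```

The appeal of this representation is that a network predicts 19 independent sigmoid outputs instead of one hard class, and semantically related activities share attributes, which the paper exploits when training deep networks on the dataset.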


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure A1. Method of the dataset overview.

Figure A2. Datasets by year of publication.

Figure A3. Screenshot of the annotation and revision tool during annotation.

Figure 1. Business process model of logistics Scenario 1—simplified order picking.

Figure 2. Physical laboratory set-up of logistics Scenario 1—simplified order picking.

Figure 3. Business process model of logistics Scenario 2 (Part 1)—real warehouse order picking.

Figure 4. Business process model of logistics Scenario 2 (Part 2)—real warehouse order picking.

Figure 5. Physical laboratory set-up of logistics Scenario 2—real warehouse order picking.

Figure 6. Business process model of logistics Scenario 3—real warehouse packaging work station.

Figure 7. Physical laboratory set-up of logistics Scenario 3—real warehouse packaging work station.

Figure 8. Marker positions on an Optical marker-based Motion Capture (OMoCap) suit.

Figure 9. Positions of on-body devices (inertial measurement units (IMUs)) from set 1 (Texas Instruments Incorporated), set 2 (MbientLab), and set 3 (MotionMiners GmbH).

Figure 10. Subjects before the recordings.

Figure 11. Semantic attributes.

Figure 12. The Temporal Convolutional Neural Network (tCNN) architecture contains four convolutional layers of size [5×1×64]. Depending on the classification task, the last fully connected layer is either a softmax or a sigmoid.
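As a rough illustration of the forward pass that the Figure 12 caption describes, here is a minimal NumPy sketch, not the authors' implementation: four temporal convolutions (5×1 kernels, 64 filters) followed by either a softmax head over the 8 classes or a sigmoid head over the 19 attributes. The window length (100 frames), channel count (30), and random weights are assumptions for shape-checking only.

```python
import numpy as np

rng = np.random.default_rng(0)

def temporal_conv(x, w):
    """Convolve along the time axis only (the figure's 5x1 kernels).
    x: (T, C_in) sensor window, w: (5, C_in, C_out). Returns (T-4, C_out)."""
    k = w.shape[0]
    return np.stack([np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1]))
                     for t in range(x.shape[0] - k + 1)])

relu = lambda z: np.maximum(z, 0.0)

T, C = 100, 30                      # assumed window length and channel count
x = rng.normal(size=(T, C))

# Four temporal convolutions with 64 filters each, as in the figure.
for c_in in (C, 64, 64, 64):
    x = relu(temporal_conv(x, rng.normal(scale=0.1, size=(5, c_in, 64))))

feat = x.reshape(-1)                # flatten before the fully connected heads

# Variant 1: softmax over the 8 activity classes.
logits_cls = rng.normal(scale=0.01, size=(feat.size, 8)).T @ feat
softmax = np.exp(logits_cls - logits_cls.max())
softmax /= softmax.sum()

# Variant 2: 19 independent sigmoids, one per binary attribute.
logits_att = rng.normal(scale=0.01, size=(feat.size, 19)).T @ feat
sigmoid = 1.0 / (1.0 + np.exp(-logits_att))
```

Each 5-frame convolution shortens the window by 4 frames, so the 100-frame input leaves 84 temporal steps of 64 features before flattening; the two heads differ only in their output size and non-linearity.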

