Human Activity Recognition Using Smartphones
The Human Activity Recognition dataset was collected from 30 subjects performing six activities of daily living (Walking, Walking Upstairs, Walking Downstairs, Sitting, Standing, Laying). It consists of inertial sensor (accelerometer and gyroscope) data recorded by a smartphone carried by the subjects.
Source: http://archive.ics.uci.edu/ml/datasets/Human+Activity+Recognition+Using+Smartphones
Image Source: https://www.youtube.com/watch?v=XOEN9W05_4A
Variants: HAR
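The UCI archive distributes the dataset as plain-text files: a 561-dimensional feature vector per window plus an integer activity label. A minimal loading sketch is below; `ACTIVITY_LABELS` mirrors the archive's `activity_labels.txt`, while `load_split` and its file paths are illustrative assumptions about the unpacked archive layout, not an official API.

```python
import numpy as np

# Integer-to-name mapping as listed in the dataset's activity_labels.txt.
ACTIVITY_LABELS = {
    1: "WALKING",
    2: "WALKING_UPSTAIRS",
    3: "WALKING_DOWNSTAIRS",
    4: "SITTING",
    5: "STANDING",
    6: "LAYING",
}

def decode_labels(label_ids):
    """Map integer activity ids (1-6) to their activity names."""
    return [ACTIVITY_LABELS[i] for i in label_ids]

def load_split(root, split="train"):
    """Load feature vectors and labels for one split.

    Assumes (hypothetically) the layout of the unpacked UCI archive,
    e.g. <root>/train/X_train.txt and <root>/train/y_train.txt.
    """
    X = np.loadtxt(f"{root}/{split}/X_{split}.txt")
    y = np.loadtxt(f"{root}/{split}/y_{split}.txt", dtype=int)
    return X, y
```

With the archive unpacked, `load_split("UCI HAR Dataset", "train")` would yield the training features and labels, and `decode_labels(y)` turns the numeric labels back into activity names.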
This dataset is used in 2 benchmarks:
| Task | Model | Paper | Date |
|---|---|---|---|
| Human Activity Recognition | AFVF | Virtual Fusion with Contrastive Learning … | 2023-12-01 |
| Image Clustering | FCMI | Deep Fair Clustering via Maximizing … | 2022-09-26 |
| Image Clustering | Selective HAR Clustering | Efficient Deep Clustering of Human … | 2022-09-17 |
| Image Clustering | N2D (UMAP) | N2D: (Not Too) Deep Clustering … | 2019-08-16 |
Recent papers with results on this dataset: