HASCA2018

6th International Workshop on Human Activity Sensing Corpus and Applications: Towards Open-Ended Context Awareness

Welcome to HASCA2018

Welcome to the HASCA2018 website!

HASCA2018 is the sixth International Workshop on Human Activity Sensing Corpus and Applications: Towards Open-Ended Context Awareness. The workshop will be held in conjunction with UbiComp 2018.

Abstract

The recognition of complex and subtle human behaviors from wearable sensors will enable next-generation human-oriented computing in scenarios of high societal value (e.g., dementia care). This will require large-scale human activity corpora and much improved methods to recognize activities and the context in which they occur. This workshop deals with the challenges of designing reproducible experimental setups, running large-scale dataset collection campaigns, designing activity and context recognition methods that are robust and adaptive, and evaluating systems in the real world. We wish to reflect on future methods, such as lifelong learning approaches that allow open-ended activity recognition.

New this year, HASCA will welcome papers from participants in the Sussex-Huawei Locomotion and Transportation Recognition Competition in a special session.

"Sussex-Huawei Locomotion and Transportation Recognition Competition": http://www.shl-dataset.org/activity-recognition-challenge/

The objective of this workshop is to share experiences among researchers working on the challenges of real-world activity recognition, the role of datasets and tools, and breakthrough approaches towards open-ended contextual intelligence. We expect contributions in the following domains, but not limited to them:

Data collection / Corpus construction

Experiences or reports from data collection and/or corpus construction projects, such as papers describing the formats, styles, or methodologies used for data collection. Crowd-sourced data collection and participatory sensing also fall within this topic.

Effectiveness of Data / Data Centric Research

There is a field of research based on collected corpora, known as "Data-Centric Research". We solicit reports on experiences of using large-scale human activity sensing corpora. Combining large-scale corpora with machine learning leaves considerable room for improving recognition performance.

Tools and Algorithms for Activity Recognition

With appropriate tools for managing sensor data, activity recognition researchers could focus more on their research themes. However, the development of tools and algorithms to be shared within the research community receives little recognition. In this workshop, we solicit reports on tools and algorithms that advance the community.

Real World Application and Experiences

Activity recognition "in the lab" usually works well; in the real world, this is often not the case. In this workshop, we also solicit experience reports from real-world applications. There is a huge gap between the lab environment and the real-world environment, and large-scale human activity sensing corpora will help bridge it.

Sensing Devices and Systems

Data collection is not performed only with off-the-shelf sensors; special devices must sometimes be developed to obtain certain kinds of information. The development and evaluation of systems and technologies for data collection is also a relevant research area.

Mobile Experience Sampling and Experience Sampling Strategies

Advances in experience sampling approaches, for instance intelligently querying the user or using novel devices (e.g., smartwatches), are likely to play an important role in obtaining user-contributed annotations of their own activities.

Unsupervised pattern discovery

Discovering meaningful repeating patterns in sensor data can be fundamental to informing other elements of a system generating an activity corpus, such as querying the user or triggering annotation crowd-sourcing.

Dataset acquisition and annotation through crowd-sourcing, web-mining

An abundance of sensor data is potentially within reach, with users instrumented through their mobile phones and other wearables. Capitalizing on crowd-sourcing to create larger datasets in a cost-effective manner may be critical to open-ended activity recognition. Online datasets could also be used to bootstrap recognition models.

Transfer learning, semi-supervised learning, lifelong learning

The ability to transfer recognition models across modalities, or to use minimal supervision, would allow datasets to be reused across domains and reduce the cost of acquiring annotations.


Program (Tentative)

October 12th

HASCA papers: 15 min talk + 5 min Q&A

8:30--8:40

Opening remarks and SHL introduction
(Kazuya Murao)

8:40--10:00

Session 1:
(Chair: TBD)

MEASURed: A Multi-Sensor Setting Activity Recognition Simulation Tool

Shingo Takeda:Kyushu Institute of Technology; Paula Lago:Kyushu Institute of Technology; Tsuyoshi Okita:Kyushu Institute of Technology; Sozo Inoue:Kyushu Institute of Technology;

Understanding how Non-experts Collect and Annotate Activity Data

Michael D Jones:Brigham Young University; Naomi Johnson:Brigham Young University; Kevin Seppi:Brigham Young University; Lawrence W Thatcher:Brigham Young University;

Fundamental Concept of a Living Laboratory in a University toward an Appropriate Feedback with Data Acquisition, Accumulation, and Analysis

Masaki Shuzo:Tokyo Denki University; Motoki Sakai:Tokyo Denki University; Eisaku Maeda:Tokyo Denki University;

Exploring the Number and Suitable Positions of Wearable Sensors in Automatic Rehabilitation Recording

Kohei Komukai:Toyohashi University of Technology; Ren Ohmura:Toyohashi University of Technology;

10:00--10:30

Coffee break

10:30--12:10

Session 2:
(Chair: TBD)

Study of LoRaWAN Technology for Activity Recognition

Tahera Hossain:Kyushu Institute of Technology; Tahia Tazin:North South University; Yusuke Doi:Kyushu Institute of Technology; Md Atiqur Rahman Ahad:Osaka University; Sozo Inoue:Kyushu Institute of Technology;

OpenHAR: Toolbox for easy access to publicly open human activity data sets

Pekka Siirtola:University of Oulu; Heli Koskimäki:University of Oulu; Juha Röning:University of Oulu;

Investigating the effect of sensor position for training type recognition in a body weight training support system

Masashi Takata:Nara Institute of Science and Technology; Yugo Nakamura:Nara Institute of Science and Technology/JSPS; Manato Fujimoto:Nara Institute of Science and Technology; Yutaka Arakawa:Nara Institute of Science and Technology/JST PRESTO; Keiichi Yasumoto:Nara Institute of Science and Technology;

On Robustness of Cloud Speech APIs: An Early Characterization

Akhil Mathur:Nokia Bell Labs/University College London; Anton Isopoussu:Nokia Bell Labs; Fahim Kawsar:Nokia Bell Labs; Robert Smith:University College London; Nicholas D. Lane:University of Oxford; Nadia Berthouze:University College London;

Improve Wi-Fi positioning accuracy by considering human body

Shohei Harada:Ritsumeikan University; Kazuya Murao:Ritsumeikan University; Masahiro Mochizuki:Ritsumeikan University; Nobuhiko Nishio:Ritsumeikan University;

12:10--13:30

Lunch

13:30--15:00

Session 3:
(Chair: TBD)

Activity Recognition: Translation Across Sensor Modalities Using Deep Learning

Tsuyoshi Okita:Kyushu Institute of Technology; Sozo Inoue:Kyushu Institute of Technology;

A case study for human gestures recognition from poorly annotated data

Mathias Ciliberto:University of Sussex; Lin Wang:University of Sussex; Daniel Roggen:University of Sussex; Ruediger Zillmer:Unilever R&D;

SHL Challenge: Introduction

Hristijan Gjoreski: Ss. Cyril and Methodius University, Macedonia, & University of Sussex, UK

SHL Challenge: Summary & Baseline evaluation

Lin Wang: University of Sussex, UK

SHL Team 1

SHL Challenge: Posters

Hamidi, Massinissa, et al.: Hybrid and Convolutional Neural Networks for Locomotion Recognition

Widhalm, Peter, et al.: Top In The Lab, Flop In The Field? Evaluation Of A Sensor-based Travel Activity Classifier With The SHL Dataset

Li, Shuai, et al.: Smartphone-sensors based Activity Recognition using IndRNN

Lee, Jong-Seok, et al.: Confidence-based Deep Multimodal Fusion for Activity Recognition

Gjoreski, Martin, et al.: Applying Multiple Knowledge to Sussex-Huawei Locomotion Challenge

Akamine, Kensaku, et al.: SHL recognition challenge: Team TK-2 - Combining Results of Multisize Instances -

Das Antar, Anindya, et al.: A Comparative Approach to Classification of Locomotion and Transportation Modes Using Smartphone Sensor Data

Saha, Swapnil Sayan, et al.: Supervised and Neural Classifiers for Locomotion Analysis

Wu, Jian, et al.: A Decision Level Fusion and Signal Analysis Technique for Activity Segmentation and Recognition on Smart Phones

Nakamura, Yugo, et al.: Multi-stage activity inference for locomotion and transportation analytics of mobile users

Zahid, Tarek, et al.: A Fast Resource Efficient Method for Human Action Recognition

Shuzo, Masaki, et al.: Application of CNN for Human Activity Recognition with FFT Spectrogram of Acceleration and Gyro Sensors

Janko, Vito, et al.: A New Frontier for Activity Recognition – The Sussex-Huawei Locomotion Challenge

Akbari, Ali, et al.: Hierarchical Signal Segmentation and Classification for Accurate Activity Recognition

Xia, Zhengxu, et al.: Deep Convolutional Bidirectional LSTM based Transportation Mode Recognition

Sloma, Michael, et al.: Activity Recognition by Classification with Time Stabilization for the Sussex-Huawei Locomotion Challenge

Wang, Lin, et al.: Benchmarking the SHL Recognition Challenge with Classical and Deep-Learning Pipelines

Wang, Lin, et al.: Summary of the Sussex-Huawei Locomotion-Transportation Recognition Challenge

Urano, Kenta, et al.: Short Segment Random Forest with Post Processing using Label Constraint for SHL Recognition Challenge

Nozaki, Junto, et al.: Activity Recognition using Dual-ConvLSTM Extracting Local and Global Features for SHL Recognition Challenge

15:00--15:30

Coffee break and SHL Challenge: Posters

15:30--17:00

Session 4:
(Chair: TBD)

SHL Team 2

SHL Team 3

SHL Team 4

SHL Team 5

SHL Challenge: Panel discussion

SHL Challenge: Ranking

17:00--17:05

SHL Winners Announcement

17:05--17:15

Discussion and Closing