Research Paper | Researchia:202602.02019 [Biotechnology]

LLM-Guided Exemplar Selection for Few-Shot Wearable-Sensor Human Activity Recognition

Elsen Ronando

Abstract

In this paper, we propose an LLM-Guided Exemplar Selection framework to address a key limitation in state-of-the-art Human Activity Recognition (HAR) methods: their reliance on large labeled datasets and purely geometric exemplar selection, which often fail to distinguish similar wearable sensor activities such as walking, walking upstairs, and walking downstairs. Our method incorporates semantic reasoning via an LLM-generated knowledge prior that captures feature importance, inter-class confusability, and exemplar budget multipliers, and uses it to guide exemplar scoring and selection. These priors are combined with margin-based validation cues, PageRank centrality, hubness penalization, and facility-location optimization to obtain a compact and informative set of exemplars. Evaluated on the UCI-HAR dataset under strict few-shot conditions, the framework achieves a macro F1-score of 88.78%, outperforming classical approaches such as random sampling, herding, and k-center. The results show that LLM-derived semantic priors, when integrated with structural and geometric cues, provide a stronger foundation for selecting representative sensor exemplars in few-shot wearable-sensor HAR.
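The selection pipeline described in the abstract (LLM-derived class priors, PageRank centrality on a neighbour graph, hubness penalization, and greedy facility-location optimization) can be sketched on toy data. Everything below is an illustrative assumption, not the paper's actual configuration: the synthetic 2-D "sensor features", the class prior multipliers, the kernel, and the budget are all made up for demonstration.

```python
import math
import random

random.seed(0)

# Hypothetical stand-in for sensor feature vectors: two synthetic clusters,
# one per "activity" class (values are illustrative, not from the paper).
points = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(15)] + \
         [(random.gauss(2, 0.3), random.gauss(2, 0.3)) for _ in range(15)]
labels = [0] * 15 + [1] * 15
# Assumed LLM-derived class prior multipliers (e.g. a confusable class
# might receive a larger exemplar-budget weight).
class_prior = {0: 1.0, 1: 1.2}

n = len(points)

def sim(a, b):
    """Gaussian similarity kernel between two feature vectors."""
    d2 = (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return math.exp(-d2)

S = [[sim(points[i], points[j]) for j in range(n)] for i in range(n)]

# k-nearest-neighbour lists (excluding self), used for both PageRank and hubness.
k = 5
knn = [sorted((j for j in range(n) if j != i), key=lambda j: -S[i][j])[:k]
       for i in range(n)]

# PageRank centrality on the kNN graph via power iteration.
d, pr = 0.85, [1.0 / n] * n
for _ in range(50):
    new = [(1 - d) / n] * n
    for i in range(n):
        share = d * pr[i] / k
        for j in knn[i]:
            new[j] += share
    pr = new

# Hubness: how often a point appears in others' kNN lists; frequent "hubs"
# are penalized so the exemplar set does not collapse onto a few points.
hub = [0] * n
for i in range(n):
    for j in knn[i]:
        hub[j] += 1

# Combined per-point score: semantic prior x centrality / hubness penalty.
score = [class_prior[labels[i]] * pr[i] / (1 + hub[i]) for i in range(n)]

def coverage(chosen):
    """Facility-location objective: each point is covered by its most
    similar chosen exemplar, weighted by its combined score."""
    return sum(score[j] * max(S[j][c] for c in chosen) for j in range(n))

# Greedy submodular maximization under a fixed exemplar budget.
budget, chosen = 4, []
for _ in range(budget):
    best = max((i for i in range(n) if i not in chosen),
               key=lambda i: coverage(chosen + [i]))
    chosen.append(best)

print("selected exemplars:", chosen)
```

Because the facility-location objective is monotone submodular, the greedy loop carries the usual (1 - 1/e) approximation guarantee, which is one reason this family of objectives is a common choice for compact exemplar selection.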

Topic Context: Wearable or implantable systems that sense biological signals in real time.


Source: arXiv PDF: https://arxiv.org/pdf/2512.22385v2

Submission: 2/2/2026
Subjects: Biotechnology

