Research Paper · Researchia: 202603.20017 [Neuroscience]

Hierarchical Latent Structure Learning through Online Inference

Ines Aitsahalia

Abstract

Learning systems must balance generalization across experiences with discrimination of task-relevant details. Effective learning therefore requires representations that support both. Online latent-cause models support incremental inference but assume flat partitions, whereas hierarchical Bayesian models capture multilevel structure but typically require offline inference. We introduce the Hierarchical Online Learning of Multiscale Experience Structure (HOLMES) model, a computational framework for hierarchical latent structure learning through online inference. HOLMES combines a variation on the nested Chinese Restaurant Process prior with sequential Monte Carlo inference to perform tractable trial-by-trial inference over hierarchical latent representations without explicit supervision over the latent structure. In simulations, HOLMES matched the predictive performance of flat models while learning more compact representations that supported one-shot transfer to higher-level latent categories. In a context-dependent task with nested temporal structure, HOLMES also improved outcome prediction relative to flat models. These results provide a tractable computational framework for discovering hierarchical structure in sequential data.
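The abstract's combination of a nested Chinese Restaurant Process prior with sequential Monte Carlo can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the authors' implementation: a two-level nCRP, Gaussian observations with fixed noise, and a basic bootstrap particle filter; all names (`Particle`, `smc`, `ALPHA_TOP`, `OBS_SD`) and parameter values are hypothetical.

```python
# Hypothetical sketch: trial-by-trial inference over a two-level latent
# hierarchy using a nested CRP prior and a bootstrap particle filter.
# Not the paper's code; Gaussian likelihood and all constants are assumptions.
import copy
import math
import random

ALPHA_TOP, ALPHA_SUB = 1.0, 1.0   # CRP concentrations at each level (assumed)
OBS_SD = 1.0                      # assumed observation noise

def crp_probs(counts, alpha):
    """CRP prior over existing clusters plus one new cluster."""
    total = sum(counts) + alpha
    return [c / total for c in counts] + [alpha / total]

def likelihood(x, members):
    """Gaussian likelihood of x under a cluster's running mean."""
    mu = sum(members) / len(members) if members else 0.0
    return math.exp(-0.5 * ((x - mu) / OBS_SD) ** 2)

class Particle:
    def __init__(self):
        # tree[top] = list of sub-clusters; each sub-cluster = list of obs
        self.tree = []

    def step(self, x):
        """Sample a path (top, sub) for x from the prior; return its weight."""
        top_counts = [sum(len(s) for s in subs) for subs in self.tree]
        ti = random.choices(range(len(self.tree) + 1),
                            weights=crp_probs(top_counts, ALPHA_TOP))[0]
        if ti == len(self.tree):
            self.tree.append([])
        subs = self.tree[ti]
        si = random.choices(range(len(subs) + 1),
                            weights=crp_probs([len(s) for s in subs], ALPHA_SUB))[0]
        if si == len(subs):
            subs.append([])
        w = likelihood(x, subs[si])   # weight by fit before assignment
        subs[si].append(x)
        return w

def smc(data, n_particles=100, seed=0):
    """Bootstrap filter: propagate, weight, and resample on every trial."""
    random.seed(seed)
    particles = [Particle() for _ in range(n_particles)]
    for x in data:
        weights = [p.step(x) for p in particles]
        particles = [copy.deepcopy(random.choices(particles, weights=weights)[0])
                     for _ in range(n_particles)]
    return particles
```

The key design point this sketch tries to show is tractability: each trial touches every particle once, so inference cost stays constant per observation rather than growing with the history, which is what makes the hierarchical prior usable online.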


Source: arXiv:2603.19139v1 (http://arxiv.org/abs/2603.19139v1)
PDF: https://arxiv.org/pdf/2603.19139v1

Submission: 3/20/2026
Subjects: Neuroscience
