Research Paper · Researchia: 202605.06004

A Closed-Form Adaptive-Landmark Kernel for Certified Point-Cloud and Graph Classification

Sushovan Majhi


Submitted: May 6, 2026 · Subjects: Machine Learning; Data Science

Description / Details

We introduce PALACE (Persistence Adaptive-Landmark Analytic Classification Engine), the data-adaptive companion to PLACE, paying only a small cross-validation tier over three knobs (budget, radii, bandwidth; $\leq 5$ choices each). A cover-theoretic core (a Lebesgue-number criterion on the landmark cover) yields four closed-form guarantees. (i) A structural lower distortion bound $\lambda(\tau;\nu)$ on $\mathcal{D}_n$ under cross-diagram non-interference, with a $(D/L)^2$ budget reduction over the uniform grid when diagrams concentrate. (ii) Equal weights $w_k = K^{-1/2}$ maximizing $\lambda$, and farthest-point-sampling positions $2$-approximating the optimal $k$-center covering radius; both are derived from training labels alone, with no gradient training. (iii) A kernel-RKHS classification rate $O((k-1)\sqrt{K}/(\gamma\sqrt{m_{\min}}))$ with a binary necessity threshold $m = \Omega(\sqrt{K}/\gamma)$ from a matching Le Cam lower bound, and a closed-form filtration-selection rule. The kernel-Mahalanobis margin $\hat\rho_{\mathrm{Mah}}$ is the strongest closed-form ranker across the chemical-graph pool (mean Spearman $\rho \approx +0.60$); the isotropic surrogate $\hat\gamma/\sqrt{K}$ admits a selection-consistency rate, and $\widehat{\lambda}$ from (i) provides an independent data-level signal (positive on COX2 and PTC). (iv) A per-prediction certificate, in non-asymptotic Pinelis and asymptotic Gaussian forms, with no calibration split. Empirically, PALACE is the strongest closed-form diagram-based method on Orbit5k ($91.3 \pm 1.0\%$, matching Persformer), leads every diagram-based competitor on COX2 and MUTAG, and is competitive on DHFR (within 1 pp of ECP). At $8\times$ domain inflation, adaptive placement maintains $94\%$ accuracy while the uniform grid collapses to chance ($25\%$ on 4-class data).
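Two of the closed-form ingredients above are standard enough to sketch: farthest-point sampling, whose greedy rule 2-approximates the optimal $k$-center covering radius (the guarantee cited in (ii)), and an equal-weight landmark feature map with weights $w_k = K^{-1/2}$. The sketch below is not PALACE itself; the function names, the Gaussian response form, and the assumption that landmarks live in the birth-death plane are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def farthest_point_sampling(points, k, seed=0):
    """Greedy farthest-point sampling (hypothetical helper, not PALACE code).

    Returns (indices of k landmarks, covering radius of the chosen set).
    The greedy rule 2-approximates the optimal k-center covering radius,
    which is the guarantee the abstract cites for landmark placement.
    """
    rng = np.random.default_rng(seed)
    n = len(points)
    landmarks = [int(rng.integers(n))]
    # Distance from every point to its nearest chosen landmark so far.
    d = np.linalg.norm(points - points[landmarks[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(d))  # farthest remaining point
        landmarks.append(nxt)
        d = np.minimum(d, np.linalg.norm(points - points[nxt], axis=1))
    return landmarks, float(d.max())

def landmark_features(diagram, landmarks, sigma):
    """Hypothetical equal-weight landmark feature map.

    `diagram` is an (m, 2) array of (birth, death) pairs, `landmarks` is
    (K, 2). Each coordinate is a Gaussian response at one landmark with
    bandwidth sigma, scaled by the equal weights w_k = K^(-1/2) from the
    abstract; the Gaussian form itself is an assumption for illustration.
    """
    K = len(landmarks)
    diffs = diagram[:, None, :] - landmarks[None, :, :]   # (m, K, 2)
    sq = (diffs ** 2).sum(axis=-1)                        # squared distances
    resp = np.exp(-sq / (2.0 * sigma ** 2)).sum(axis=0)   # (K,) sum over points
    return resp / np.sqrt(K)                              # equal weights K^{-1/2}
```

Because greedy selections are nested, running the sampler with a larger budget under the same seed can only shrink the covering radius, which is the monotonicity the budget knob relies on.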


Source: arXiv:2605.04046v1 (http://arxiv.org/abs/2605.04046v1)
PDF: https://arxiv.org/pdf/2605.04046v1

