# High-dimensional $p$-norms

CLASSIC - Computational Learning, Aggregation, Supervised Statistical Inference, and Classification
DMA - Département de Mathématiques et Applications, ENS Paris - École normale supérieure - Paris, Inria Paris-Rocquencourt
Abstract: Let $\mathbf X=(X_1, \ldots, X_d)$ be an $\mathbb R^d$-valued random vector with i.i.d. components, and let $\Vert\mathbf X\Vert_p= ( \sum_{j=1}^d|X_j|^p)^{1/p}$ be its $p$-norm, for $p>0$. The impact of letting $d$ go to infinity on $\Vert\mathbf X\Vert_p$ has surprising consequences, which may dramatically affect high-dimensional data processing. This effect is usually referred to as the *distance concentration phenomenon* in the computational learning literature. Despite a growing interest in this important question, previous work has essentially characterized the problem in terms of numerical experiments and incomplete mathematical statements. In the present paper, we solidify some of the arguments which previously appeared in the literature and offer new insights into the phenomenon.
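The phenomenon described in the abstract can be illustrated with a short simulation: for i.i.d. components, the relative spread of $\Vert\mathbf X\Vert_p$ across independent draws shrinks as $d$ grows, so the norms concentrate around a deterministic value. A minimal sketch, assuming Uniform(0,1) components and $p=2$ purely for illustration (the paper treats the general setting):

```python
import math
import random

def p_norm(x, p):
    """The p-norm (sum_j |x_j|^p)^(1/p) from the abstract."""
    return sum(abs(v) ** p for v in x) ** (1.0 / p)

def relative_spread(d, p, n_samples=200, seed=0):
    """Std/mean of ||X||_p over n_samples draws of a d-dimensional
    vector with i.i.d. Uniform(0,1) components (an illustrative
    choice of distribution, not one fixed by the paper)."""
    rng = random.Random(seed)
    norms = [p_norm([rng.random() for _ in range(d)], p)
             for _ in range(n_samples)]
    mean = sum(norms) / n_samples
    var = sum((v - mean) ** 2 for v in norms) / n_samples
    return math.sqrt(var) / mean

# As d grows, the norms of independent draws concentrate:
# the ratio std/mean of ||X||_2 decreases toward 0.
for d in (10, 100, 1000):
    print(d, relative_spread(d, p=2))
```

In this example the relative spread decays roughly like $1/\sqrt{d}$, which is the basic mechanism behind distance concentration: pairwise distances between high-dimensional points become nearly indistinguishable in relative terms.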
Document type: Preprint, working paper
19 pages. 2013

https://hal.archives-ouvertes.fr/hal-00879436
Contributor: Gérard Biau
Submitted on: Sunday, November 3, 2013 - 20:16:23
Last modified: Thursday, April 27, 2017 - 09:45:44
Archived on: Tuesday, February 4, 2014 - 04:28:14

### Files

biau-mason-springer4.pdf
Files produced by the author(s)

### Identifiants

• HAL Id: hal-00879436, version 1
• arXiv: 1311.0587

### Citation

Gérard Biau, David Mason. High-dimensional $p$-norms. 19 pages. 2013. <hal-00879436>
