Abstract: Tensor decompositions have seen increasing use in Signal Processing over the last decade. In particular, the Tucker and HOSVD decompositions are attractive for compression purposes, while the Canonical Polyadic decomposition (CP), sometimes referred to as Parafac, makes it possible to restore identifiability in some Blind Identification problems, as pointed out twenty years ago with the birth of Independent Component Analysis. Sufficient conditions exist under which the CP is essentially unique. These conditions were first stated by Kruskal in the 1970s; they bound the tensor rank in terms of the so-called k-rank of the matrix factors, a notion closely related to the matrix spark used in Sparse Component Analysis. This uniqueness condition can be related to the coherence condition obtained in the framework of compressed sensing, which yields a computationally feasible variant of Kruskal's condition in which coherence serves as a proxy for k-rank. Problems of sparsest recovery with infinite continuous dictionaries, lowest-rank tensor representation, and Blind Source Separation are addressed in a uniform fashion. These results are illustrated by an application in antenna array processing, namely the blind localization and extraction of radiating sources with a structured antenna array. With appropriate bounds on a certain distance between the radiating sources, the existence and uniqueness of a best rank-r approximation of a tensor, and hence of the source estimates, can always be guaranteed.
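The coherence-as-proxy idea mentioned above can be sketched as follows. The mutual coherence of a matrix is the largest absolute inner product between distinct normalized columns, and the standard spark bound gives k-rank(A) >= ceil(1/mu(A)); substituting this into Kruskal's condition k_A + k_B + k_C >= 2r + 2 yields a checkable sufficient condition. The function names below are illustrative, not from the paper; this is a minimal sketch, not the paper's exact statement.

```python
from math import ceil, sqrt

def mutual_coherence(cols):
    """Largest absolute inner product between distinct columns,
    after normalizing each column to unit Euclidean norm."""
    norms = [sqrt(sum(x * x for x in c)) for c in cols]
    unit = [[x / n for x in c] for c, n in zip(cols, norms)]
    return max(abs(sum(a * b for a, b in zip(unit[i], unit[j])))
               for i in range(len(unit)) for j in range(i + 1, len(unit)))

def cp_unique_by_coherence(mu_a, mu_b, mu_c, r):
    """Sufficient condition for essential uniqueness of a rank-r CP:
    since k-rank >= ceil(1/mu), Kruskal's bound k_A + k_B + k_C >= 2r + 2
    holds whenever the coherence surrogate below is satisfied."""
    return ceil(1 / mu_a) + ceil(1 / mu_b) + ceil(1 / mu_c) >= 2 * r + 2

# Example: three columns in the plane, two orthogonal and one diagonal.
mu = mutual_coherence([(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)])  # = 1/sqrt(2)
```

Note that coherence only lower-bounds the k-rank, so this test can be conservative: it may fail for tensors whose CP is nevertheless unique under Kruskal's original condition.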