TY - JOUR
TI - Near-neighbor preserving dimension reduction via coverings for doubling subsets of ℓ1
AU - Emiris, I.Z.
AU - Margonis, V.
AU - Psarros, I.
JO - Theoretical Computer Science
PY - 2023
VL - 942
SP - 169
EP - 179
PB - Elsevier B.V.
SN - 0304-3975
DO - 10.1016/j.tcs.2022.11.031
KW - Clustering algorithms
KW - Embeddings
KW - Geometry
KW - Cauchy distribution
KW - Dimension reduction
KW - Dimensionality reduction
KW - Doubling dimensions
KW - Manhattan metric
KW - Nearest-neighbour
KW - Point set
KW - Randomized embedding
KW - Nearest neighbor search
AB - Randomized dimensionality reduction has been recognized as one of the cornerstones in handling high-dimensional data, originating in foundational works such as the celebrated Johnson-Lindenstrauss Lemma. More specifically, nearest neighbor-preserving embeddings exist for the ℓ2 (Euclidean) and ℓ1 (Manhattan) metrics, as well as for doubling subsets of ℓ2; doubling dimension is today the most effective way of capturing intrinsic dimensionality and input structure in various applications. These randomized embeddings bound the distortion only for distances between the query point and a point set. Motivated by the foundational character of fast Approximate Nearest Neighbor search in ℓ1, this paper settles an important missing case, namely that of doubling subsets of ℓ1. In particular, we introduce a randomized dimensionality reduction by means of a near neighbor-preserving embedding, which is related to the decision-with-witness problem. The input set is represented by a carefully chosen covering point set; in a second step, the algorithm randomly projects the latter. To obtain the covering point sets, we leverage either approximate r-nets or randomly shifted grids, with different tradeoffs between preprocessing time and target dimension. We exploit Cauchy random variables and derive a concentration bound of independent interest. Our algorithms are rather simple and should therefore be useful in practice. © 2022 Elsevier B.V.
ER -
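Note: the abstract's pipeline (snap the input to a covering point set via a randomly shifted grid, then apply a 1-stable Cauchy random projection for ℓ1) can be illustrated with the minimal Python sketch below. The function names, the cell-center snapping rule, the parameter choices, and the median-based distance estimator are illustrative assumptions, not the paper's exact construction or guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)

def snap_to_shifted_grid(points, cell):
    """Snap each point to the center of its cell in a randomly shifted
    grid of side length `cell`. The snapped set is a covering point set:
    every point moves by at most d * cell / 2 in l1 (assumed sketch,
    not the paper's construction)."""
    shift = rng.uniform(0.0, cell, size=points.shape[1])
    return np.floor((points + shift) / cell) * cell - shift + cell / 2.0

def cauchy_projection(points, target_dim):
    """Project with i.i.d. standard Cauchy entries. By 1-stability, each
    projected coordinate of x - y is distributed as ||x - y||_1 times a
    standard Cauchy variable, so l1 distances are recoverable from
    medians of absolute coordinates."""
    A = rng.standard_cauchy(size=(points.shape[1], target_dim))
    return points @ A

def l1_estimate(px, py):
    """Median-of-absolute-coordinates estimator; no rescaling is needed
    because the median of |Cauchy(0, 1)| equals 1."""
    return np.median(np.abs(px - py))

# Toy usage: build a covering point set, project it, and compare an
# estimated l1 distance against the true one.
X = rng.uniform(size=(100, 50))
Xc = snap_to_shifted_grid(X, cell=0.05)        # covering point set
P = cauchy_projection(Xc, target_dim=200)      # randomized projection
print(l1_estimate(P[0], P[1]), np.sum(np.abs(X[0] - X[1])))
```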