Adaptive granularity in tensors: A quest for interpretable structure

Data collected at very frequent intervals is usually extremely sparse and has no structure that is exploitable by modern tensor decomposition algorithms. Thus, the utility of such tensors is low, in terms of the amount of interpretable and exploitable structure that one can extract from them. In this paper, we introduce the problem of finding a tensor of adaptive aggregated granularity that can be decomposed to reveal meaningful latent concepts (structures) from datasets that, in their original form, are not amenable to tensor analysis. Such datasets fall under the broad category of sparse point processes that evolve over space and/or time. To the best of our knowledge, this is the first work that explores adaptive granularity aggregation in tensors. Furthermore, we formally define the problem, discuss different definitions of “good structure” that are used in practice, and show that the optimal solution is of prohibitive combinatorial complexity. Subsequently, we propose an efficient and effective greedy algorithm called ICEBREAKER, which follows a number of intuitive decision criteria that locally maximize the “goodness of structure,” resulting in high-quality tensors. We evaluate our method on synthetic, semi-synthetic, and real datasets. In all cases, our proposed method constructs tensors that have very high structure quality.
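
The record stops at the abstract and contains no code from the paper. Purely as an illustration of the kind of adaptive temporal aggregation the abstract describes, and not the authors' ICEBREAKER algorithm, the NumPy sketch below greedily merges adjacent time slices of a sparse count tensor until each aggregated slice reaches a target density. All function names, parameters, and thresholds here are hypothetical.

```python
import numpy as np

def greedy_aggregate_time(events, n_i, n_j, n_t, min_nnz_per_slice=50):
    """Greedily merge adjacent time slices of a sparse (i, j, t) count tensor
    until each merged slice holds at least `min_nnz_per_slice` nonzero entries.
    Illustrative sketch only -- not the paper's ICEBREAKER algorithm."""
    # Build the tensor at the original (finest) temporal granularity.
    X = np.zeros((n_i, n_j, n_t))
    for i, j, t in events:
        X[i, j, t] += 1

    # Walk over time slices, absorbing each new slice into the current
    # aggregate until it is dense enough, then start a new aggregate.
    merged = [X[:, :, 0].copy()]
    for t in range(1, n_t):
        if np.count_nonzero(merged[-1]) < min_nnz_per_slice:
            merged[-1] += X[:, :, t]
        else:
            merged.append(X[:, :, t].copy())
    return np.stack(merged, axis=2)  # fewer, denser time slices

# Usage: 5,000 random events spread over 1,000 fine-grained time ticks.
rng = np.random.default_rng(0)
events = np.column_stack([rng.integers(0, 20, 5000),
                          rng.integers(0, 30, 5000),
                          rng.integers(0, 1000, 5000)])
X_agg = greedy_aggregate_time(events, 20, 30, 1000)
print(X_agg.shape)  # e.g., (20, 30, ~100): far coarser than the raw 1,000 slices
```

In the paper, the merging decision is driven by a “goodness of structure” criterion that is locally maximized during aggregation; the fixed density threshold above only stands in for such a criterion.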

Bibliographic Details
Main Authors: Ravdeep S. Pasricha, Ekta Gujral, Evangelos E. Papalexakis
Format: Article
Language: English
Published: Frontiers Media S.A., 2022-11-01
Series: Frontiers in Big Data
ISSN: 2624-909X
DOI: 10.3389/fdata.2022.929511
Subjects: tensor; unsupervised learning; temporal granularity; tensor decomposition; multi-aspect data
Online Access: https://www.frontiersin.org/articles/10.3389/fdata.2022.929511/full