1
The tensor algebra compiler
Published 2021. "…Tensor algebra is a powerful tool with applications in machine learning, data analytics, engineering and the physical sciences. …"
Get full text
Article
2
3
Tensor factorization toward precision medicine
Published 2019. "…In this opinion article, we analyze the modest literature on applying tensor factorization to various biomedical fields including genotyping and phenotyping. …"
Get full text
Article
4
Smoothed analysis of discrete tensor decomposition and assemblies of neurons
Published 2022. "…We analyze linear independence of rank one tensors produced by tensor powers of randomly perturbed vectors. …"
Get full text
Article
5
Autoscheduling for Sparse Tensor Algebra with an Asymptotic Cost Model
Published 2022
Get full text
Article
6
Model Reduction and Simulation of Nonlinear Circuits via Tensor Decomposition
Published 2016. "…In this paper, we utilize tensors (namely, a higher order generalization of matrices) to develop a tensor-based nonlinear model order reduction algorithm we named TNMOR for the efficient simulation of nonlinear circuits. …"
Get full text
Article
7
STAVES: Speedy tensor-aided Volterra-based electronic simulator
Published 2017. "…Significant computational savings can often be achieved when the appropriate low-rank tensor decomposition is available. In this paper we exploit a strong link between tensors and frequency-domain Volterra kernels in modeling nonlinear systems. …"
Get full text
Article
8
Tensor Computation: A New Framework for High-Dimensional Problems in EDA
Published 2017. "…This paper presents “tensor computation” as an alternative general framework for the development of efficient EDA algorithms and tools. …"
Get full text
Article
9
Enabling High-Dimensional Hierarchical Uncertainty Quantification by ANOVA and Tensor-Train Decomposition
Published 2015. "…In order to avoid the curse of dimensionality, we employ tensor-train decomposition at the high level to construct the basis functions and Gauss quadrature points. …"
Get full text
Article
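Entry 9 builds on tensor-train decomposition. As background, here is a minimal NumPy sketch of the standard TT-SVD algorithm that the tensor-train format is usually computed with — an illustration only, not the paper's code; the function names and the `max_rank` parameter are assumptions for this sketch.

```python
import numpy as np

def tt_decompose(tensor, max_rank):
    """TT-SVD sketch: split a d-way tensor into a chain of 3-way cores
    by sweeping left to right with truncated SVDs."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1
    mat = tensor.reshape(r_prev * shape[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))                      # truncate the TT rank
        cores.append(U[:, :r].reshape(r_prev, shape[k], r))
        # carry the remainder to the next unfolding
        mat = (s[:r, None] * Vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))    # last core absorbs the rest
    return cores

def tt_reconstruct(cores):
    """Contract the cores back into a full tensor (for checking)."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))
```

When `max_rank` is at least the true TT ranks, the reconstruction is exact up to floating point; smaller values give the low-rank compression that the paper exploits to sidestep the curse of dimensionality.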
10
A big-data approach to handle process variations: Uncertainty quantification by tensor recovery
Published 2017. "…Specifically, we simulate integrated circuits and MEMS at only a small number of quadrature samples; then, a huge number of (e.g., 1.5×10^27) solution samples are estimated from the available small-size (e.g., 500) solution samples via a low-rank and tensor-recovery method. Numerical results show that our algorithm can easily extend the applicability of tensor-product stochastic collocation to IC and MEMS problems with over 50 random parameters, whereas the traditional algorithm can only handle several random parameters. …"
Get full text
Article
11
Computing Low-Rank Approximations of Large-Scale Matrices with the Tensor Network Randomized SVD
Published 2019. "…We propose a new algorithm for the computation of a singular value decomposition (SVD) low-rank approximation of a matrix in the matrix product operator (MPO) format, also called the tensor train matrix format. Our tensor network randomized SVD (TNrSVD) algorithm is an MPO implementation of the randomized SVD algorithm that is able to compute dominant singular values and their corresponding singular vectors. …"
Get full text
Article
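Entry 11's TNrSVD is an MPO implementation of the randomized SVD. The dense baseline it adapts can be sketched in NumPy as follows — a hedged illustration of the classical randomized algorithm, not the paper's MPO variant; the function name and the `oversample`/`n_iter` parameters are assumptions for this sketch.

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, n_iter=2):
    """Randomized SVD sketch: project A onto a random low-dimensional
    subspace, then take a small exact SVD of the projected matrix."""
    m, n = A.shape
    k = min(rank + oversample, min(m, n))
    Omega = np.random.randn(n, k)          # random test matrix
    Y = A @ Omega                          # sample the range of A
    for _ in range(n_iter):                # power iterations sharpen the subspace
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)                 # orthonormal basis for range(A)
    B = Q.T @ A                            # small (k x n) matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub                             # lift back to the original space
    return U[:, :rank], s[:rank], Vt[:rank]
```

The dominant cost is the matrix-vector products with A; TNrSVD's contribution is carrying out exactly these products in the MPO/tensor-train format so the full matrix never has to be formed.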
12
13
Limits on All Known (and Some Unknown) Approaches to Matrix Multiplication
Published 2021. "…Our main result is that there is a universal constant ℓ > 2 such that a large class of tensors generalizing the Coppersmith-Winograd tensor CW_q cannot be used within the Galactic method to show a bound on ω better than ℓ, for any q. …"
Get full text
Article
14
Superneurons: dynamic GPU memory management for training deep neural networks
Published 2022. "…Evaluations against Caffe, Torch, MXNet and TensorFlow have demonstrated that SuperNeurons trains at least 3.2432× deeper networks than current ones with leading performance. …"
Get full text
Article
15
Learning Mixed Multinomial Logit Model from Ordinal Data
Published 2016. "…In the process of proving these results, we obtain a generalization of existing analysis for tensor decomposition to a more realistic regime where only partial information about each sample is available. …"
Get full text
Article
16
PockEngine: Sparse and Efficient Fine-tuning in a Pocket
Published 2024. "…PockEngine achieves up to 15× speedup over off-the-shelf TensorFlow (Raspberry Pi) and 5.6× memory saving in back-propagation (Jetson AGX Orin). …"
Get full text
Article
17
On the local stability of semidefinite relaxations
Published 2022. "…Our framework captures a wide array of statistical estimation problems including tensor principal component analysis, rotation synchronization, orthogonal Procrustes, camera triangulation and resectioning, essential matrix estimation, system identification, and approximate GCD. …"
Get full text
Article
18
Hypercontractivity of Spherical Averages in Hamming Space
Published 2021. "…The estimate for S_δ is harder to obtain since the latter is neither a part of a semigroup nor a tensor power. The result is shown by a detailed study of the eigenvalues of S_δ and of the L_p → L_2 norms of the Fourier multiplier operators Π_a with symbol equal to a characteristic function of the Hamming sphere of radius a (in the notation common in Boolean analysis, Π_a f = f_{=a}, where f_{=a} is the degree-a component of the function f). …"
Get full text
Article
19
The convex algebraic geometry of linear inverse problems
Published 2012. "…For example, some problems to which our framework is applicable include (1) recovering an orthogonal matrix from limited linear measurements, (2) recovering a measure given random linear combinations of its moments, and (3) recovering a low-rank tensor from limited linear observations. …"
Get full text
Article
20
Predicting traffic speed in urban transportation subnetworks for multiple horizons
Published 2015. "…To this end, we develop various matrix- and tensor-based models by applying partial least squares (PLS), higher order partial least squares (HO-PLS) and N-way partial least squares (N-PLS). …"
Get full text
Article