601
Multi-modal recommendation algorithm fusing visual and textual features.
Published 2023-01-01. “…These algorithms use image features and text features to extend the available information, which alleviate the data sparsity problem effectively, but they also have some limitations. …”
Get full text
Article -
602
E-commerce Recommender System Using PCA and K-Means Clustering
Published 2022-02-01. “…K-Means is used to overcome sparsity problems and to form user clusters to reduce the amount of data that needs to be processed. …”
Get full text
Article -
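The PCA-plus-K-Means pipeline in the entry above can be sketched generically. This is a minimal pure-Python illustration with made-up data; the function name and parameters are assumptions, and a real system would first apply PCA to the rating vectors before clustering:

```python
# Minimal k-means sketch for grouping user rating vectors (illustrative
# assumptions, not the paper's implementation). PCA would normally reduce
# each vector's dimensionality before this step.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # pick k distinct starting centers
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        # assign each point to its nearest center (squared Euclidean distance)
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # recompute each center as the mean of its assigned points
        for j, g in enumerate(groups):
            if g:
                centers[j] = tuple(sum(col) / len(g) for col in zip(*g))
    return centers, groups
```

Running it on a handful of 2-D points separates the two obvious user groups; recommendations can then be computed within a cluster instead of over the whole user base.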
603
Robust Adaptive Filtering Algorithm for Self-Interference Cancellation with Impulsive Noise
Published 2021-01-01. “…To solve this problem, the sparsity of the SI channel is estimated with the estimation of the weight vector at each iteration, and it is used to adjust the weight vector. …”
Get full text
Article -
604
A multi-intent based multi-policy relay contrastive learning for sequential recommendation
Published 2022-08-01. “…Recent contrastive learning (CL) has shown potential in mitigating the issue of data sparsity. Many item representations are destined to be poorly learned due to data sparsity. …”
Get full text
Article -
605
A Biased Proportional-Integral-Derivative-Incorporated Latent Factor Analysis Model
Published 2021-06-01. “…This leads to a data sparsity problem. The latent factor analysis (LFA) model has been proposed as the solution to the data sparsity problem. …”
Get full text
Article -
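A latent factor model addresses sparsity by predicting unobserved ratings from low-dimensional user and item factors. The following is a minimal generic SGD update (assumed names and hyperparameters, without the paper's PID-incorporated bias):

```python
# One SGD step of plain matrix factorization (illustrative sketch):
# predict a rating as the dot product of user factors p_u and item
# factors q_i, then move both toward the observed rating r.
def sgd_step(p_u, q_i, r, lr=0.01, reg=0.02):
    pred = sum(a * b for a, b in zip(p_u, q_i))   # predicted rating
    err = r - pred                                # prediction error
    # gradient step on squared error with L2 regularization
    new_p = [a + lr * (err * b - reg * a) for a, b in zip(p_u, q_i)]
    new_q = [b + lr * (err * a - reg * b) for a, b in zip(p_u, q_i)]
    return new_p, new_q
```

Iterating this step over the observed ratings drives each factor dot product toward its known rating, while the `reg` term guards against overfitting the sparse data.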
606
Salt and Pepper Noise Removal with Multi-Class Dictionary Learning and L0 Norm Regularizations
Published 2018-12-01. “…In this work, image sparsity is enhanced with a fast multiclass dictionary learning, and then both the sparsity regularization and robust data fidelity are formulated as minimizations of L0-L0 norms for salt and pepper impulse noise removal. …”
Get full text
Article -
607
Graph Neural Network-Guided Contrastive Learning for Sequential Recommendation
Published 2023-06-01. “…The model can enhance recommendation performance and mitigate the data sparsity problem.…”
Get full text
Article -
608
Sparse Regularization-Based Approach for Point Cloud Denoising and Sharp Features Enhancement
Published 2020-06-01. “…The L1 norm is a way to measure the sparsity of a solution, and applying an L1 optimization to the point cloud can measure the sparsity of sharp features, producing clean point set surfaces with sharp features. …”
Get full text
Article -
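The ℓ1 norm promotes sparsity because its proximal operator, soft-thresholding, sets small coefficients exactly to zero. A minimal sketch of that operator (illustrative only; the paper's point-cloud optimization is more involved):

```python
# Soft-thresholding: the proximal operator of lam * ||v||_1.
# Each entry is shrunk toward zero by lam; entries within [-lam, lam]
# become exactly zero, which is why the L1 penalty yields sparse solutions.
def soft_threshold(v, lam):
    out = []
    for x in v:
        if x > lam:
            out.append(x - lam)
        elif x < -lam:
            out.append(x + lam)
        else:
            out.append(0.0)
    return out
```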
609
Investigating Multi-Array Antenna Signal Convergence using Wavelet Transform and Krylov Sequence
Published 2018-01-01. “…For rapid convergence, two ambiguities should be addressed: eigenvalue spread and the sparsity of the signal. Eigenvalue spread is defined as the ratio of minimum to maximum eigenvalue, whereas sparsity is defined as a loosely bounded system. …”
Get full text
Article -
610
On the Performance of Efficient Channel Estimation Strategies for Hybrid Millimeter Wave MIMO System
Published 2020-10-01. “…Therefore, in this paper, we proposed a novel channel estimation strategy based on the symmetrical version of alternating direction methods of multipliers (S-ADMM), which exploits the sparsity and low rank property of channel altogether in a symmetrical manner. …”
Get full text
Article -
611
Multi-modal recommendation algorithm fusing visual and textual features
Published 2023-01-01. “…These algorithms use image features and text features to extend the available information, which alleviate the data sparsity problem effectively, but they also have some limitations. …”
Get full text
Article -
612
Proximal linearized method for sparse equity portfolio optimization with minimum transaction cost
Published 2023-11-01. “…The former is achieved by including the ℓ0-norm regularization of the asset weights to promote sparsity. Subjected to a minimum expected return, the proposed model turns out to be an objective function consisting of discontinuous and nonconvex terms. …”
Get full text
Article -
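An ℓ0-style sparsity constraint on portfolio weights can be illustrated by hard thresholding: keep only the K largest-magnitude asset weights. The names and the renormalization step below are assumptions for illustration, not the paper's proximal linearized method:

```python
# Hard thresholding of portfolio weights (illustrative sketch):
# zero out all but the k largest-magnitude weights, then renormalize
# so the surviving weights sum to one.
def hard_threshold_weights(w, k):
    keep = sorted(range(len(w)), key=lambda i: abs(w[i]), reverse=True)[:k]
    sparse = [w[i] if i in keep else 0.0 for i in range(len(w))]
    total = sum(sparse)
    return [x / total for x in sparse] if total else sparse
```

Unlike the ℓ1 penalty, this enforces an exact cardinality limit, which is what makes the ℓ0 objective discontinuous and nonconvex.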
613
Recommendation System Using Autoencoders
Published 2020-08-01. “…Collaborative filtering is widely used in this type of system, but high dimensionality and data sparsity remain major problems. With the idea of deep learning gaining more importance, several works have emerged to improve this type of filtering. …”
Get full text
Article -
614
Research on Mixed Matrix Estimation Algorithm Based on Improved Sparse Representation Model in Underdetermined Blind Source Separation System
Published 2023-01-01. “…The simulation results show that the sparsity of the mixed signal and the estimation accuracy of the mixed matrix are improved. …”
Get full text
Article -
615
Deep Learning-Based Context-Aware Recommender System Considering Contextual Features
Published 2021-12-01. “…Also, for the dataset with a data sparsity problem, it was confirmed that the performance of the proposed method is higher than that of existing methods. …”
Get full text
Article -
616
Explicit Construction of RIP Matrices Is Ramsey‐Hard
Published 2021. “…While it is known that random matrices satisfy the RIP with high probability even for n = log^{O(1)} p, the explicit deterministic construction of such matrices has defied repeated efforts, and most of the known approaches hit the so-called square-root sparsity bottleneck. The notable exception is the work by Bourgain et al. constructing an n × p RIP matrix with sparsity s = Θ(n^{1/2+ε}), but in the regime n = Ω(p^{1−δ}). …”
Get full text
Article -
617
Design collaborative filtering recommender systems to solve cold-start problem
Published 2022. “…As a result, the rating quality suffers. The sparsity of the rating matrix is also a significant issue, as it makes it difficult to identify items that are related to one another and are similar. …”
Get full text
Final Year Project (FYP) -
618
Quantitative recovery conditions for tree-based compressed sensing
Published 2016. “…As shown by Blumensath and Davies (2009) and Baraniuk et al. (2010), signals whose wavelet coefficients exhibit a rooted tree structure can be recovered using specially adapted compressed sensing algorithms from just n = O(k) measurements, where k is the sparsity of the signal. Motivated by these results, we introduce a simplified proportional-dimensional asymptotic framework, which enables the quantitative evaluation of recovery guarantees for tree-based compressed sensing. …”
Journal article -
619
Progressive skeletonization: trimming more fat from a network at initialization
Published 2020. “…Recent studies have shown that skeletonization (pruning parameters) of networks at initialization provides all the practical benefits of sparsity both at inference and training time, while only marginally degrading their performance. …”
Conference item -
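Pruning at initialization reduces to computing a binary mask over the weights from some saliency score. The sketch below uses plain weight magnitude as the score, an assumed simplification; the paper's progressive scheme is more elaborate:

```python
# Magnitude-saliency pruning mask (illustrative sketch): score each weight
# by |w|, keep the top-k scores, and zero the rest via a 0/1 mask.
def prune_mask(weights, keep_ratio):
    k = max(1, int(len(weights) * keep_ratio))
    # saliency cutoff: the k-th largest absolute weight
    cutoff = sorted((abs(w) for w in weights), reverse=True)[k - 1]
    kept = 0
    mask = []
    for w in weights:
        if abs(w) >= cutoff and kept < k:   # 'kept < k' breaks ties at the cutoff
            mask.append(1)
            kept += 1
        else:
            mask.append(0)
    return mask
```

Multiplying the weights elementwise by this mask prunes the network before any training step.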
620
Multigrid solvers for the de Rham complex with optimal complexity in polynomial degree
Published 2024. “…We overcome this with the finer Hiptmair space decomposition and the use of incomplete Cholesky factorizations imposing the sparsity pattern arising from static condensation, which applies whether static condensation is used for the solver or not. …”
Journal article