Multimodal Colearning Meets Remote Sensing: Taxonomy, State of the Art, and Future Works
In remote sensing (RS), multiple modalities of data are usually available, e.g., RGB, multispectral, hyperspectral, light detection and ranging (LiDAR), and synthetic aperture radar (SAR). Multimodal machine learning systems, which fuse these rich data modalities, have shown better perfor...
| Main Authors: | Nhi Kieu, Kien Nguyen, Abdullah Nazib, Tharindu Fernando, Clinton Fookes, Sridha Sridharan |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2024-01-01 |
| Series: | IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing |
| Online Access: | https://ieeexplore.ieee.org/document/10474099/ |
Similar Items
- Editorial: Advances in multimodal learning: pedagogies, technologies, and analytics
  by: Heng Luo
  Published: (2023-10-01)
- Multimodal Fusion: A Review, Taxonomy, Open Challenges, Research Roadmap and Future Directions
  by: Mohd Anas Wajid, et al.
  Published: (2021-08-01)
- Multimodal Emotion Recognition and Sentiment Analysis Using Masked Attention and Multimodal Interaction
  by: Tatiana Voloshina, et al.
  Published: (2023-05-01)
- Highly Nonlinear Multimode Tellurite Fibers: From Glass Synthesis to Practical Applications in Multiphoton Imaging
  by: Marianne Evrard, et al.
  Published: (2023-01-01)
- Aesthetic Engagement with Mouse Bird Snake Wolf – Literary Multimodal Literacy in English Language Education
  by: Mari Skjerdal Lysne
  Published: (2023-11-01)