Generative adversarial network enables rapid and robust fluorescence lifetime image analysis in live cells
In this study, Chen et al. introduce flimGANE, a deep-learning-based method that rapidly generates accurate, high-quality fluorescence lifetime imaging microscopy (FLIM) images even under photon-starved conditions. flimGANE is particularly useful for fundamental biological research and clinical applications.
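To illustrate why photon starvation matters for FLIM, the sketch below simulates lifetime estimation from photon arrival times. It is a minimal illustration, not the authors' flimGANE method: it assumes an idealized mono-exponential decay with no instrument response function or measurement window, in which case the maximum-likelihood estimate of the lifetime is simply the mean photon arrival time. The lifetime value and photon counts are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
tau_true = 2.5  # ns; assumed ground-truth fluorescence lifetime

def estimate_lifetime(n_photons):
    # Simulate arrival times from an ideal mono-exponential decay.
    # With no IRF or truncation, the MLE of tau is the sample mean.
    arrivals = rng.exponential(tau_true, size=n_photons)
    return arrivals.mean()

# Compare estimator spread in photon-starved vs. photon-rich pixels.
for n in (50, 5000):
    estimates = [estimate_lifetime(n) for _ in range(200)]
    print(f"{n:5d} photons: tau = {np.mean(estimates):.2f} "
          f"+/- {np.std(estimates):.2f} ns")
```

The standard error of the estimate scales roughly as tau divided by the square root of the photon count, so a 100-fold drop in photons widens the lifetime uncertainty about 10-fold; this is the regime where conventional fitting degrades and learning-based approaches such as the one described here aim to help.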
Main Authors: Yuan-I Chen, Yin-Jui Chang, Shih-Chu Liao, Trung Duc Nguyen, Jianchen Yang, Yu-An Kuo, Soonwoo Hong, Yen-Liang Liu, H. Grady Rylander, Samantha R. Santacruz, Thomas E. Yankeelov, Hsin-Chih Yeh
Format: Article
Language: English
Published: Nature Portfolio, 2022-01-01
Series: Communications Biology
Online Access: https://doi.org/10.1038/s42003-021-02938-w
Similar Items
- Neurobiologically realistic neural network enables cross-scale modeling of neural dynamics
  by: Yin-Jui Chang, et al. Published: 2024-03-01
- Chemical Regulation of Fluorescence Lifetime
  by: Jianan Dai, et al. Published: 2023-10-01
- Photoacoustic and fluorescence lifetime imaging of diatoms
  by: Cvjetinovic, J., et al. Published: 2020
- Simple and Robust Deep Learning Approach for Fast Fluorescence Lifetime Imaging
  by: Quan Wang, et al. Published: 2022-09-01
- Bayesian analysis of fluorescence lifetime imaging data
  by: Rowley, M., et al. Published: 2011