Adiabatic propagation of beams in nonlocal nonlinear media with gradual linear loss/gain

Bibliographic Details
Main Authors: Yuxin Zheng, Xiangwei Chen, Guo Liang, Qi Guo
Format: Article
Language: English
Published: Elsevier 2023-09-01
Series: Results in Physics
Online Access: http://www.sciencedirect.com/science/article/pii/S2211379723007027
Description
Summary: Linear loss is a crucial factor in optical communications and is unavoidable in any actual physical system. To overcome its effect, linear amplification of the optical field is often applied. However, the optical beam sheds part of its energy and therefore cannot return to its original state after one period of propagation. We investigate the evolution of optical beams in nonlocal nonlinear media with linear loss and gain using the variational approach and numerical simulation. When the linear loss changes gradually into linear gain, the optical beams can return to their initial states, a phenomenon we call “adiabatic propagation”. We have demonstrated that, as long as the rate of change of the linear loss and gain is small enough, the linear gain exactly compensates the linear loss and adiabatic propagation can occur for beams of arbitrary profile. The numerical simulations agree well with the variational results. In experiment, this kind of linear gain can be realized by distributed Raman amplification.
ISSN: 2211-3797
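
The propagation scenario described in the summary can be illustrated with a short numerical sketch. The snippet below is a minimal split-step Fourier simulation of a (1+1)-D nonlocal nonlinear Schrödinger equation with a z-dependent linear loss coefficient that ramps slowly into gain, so that the net loss over the propagation distance is zero. The equation form, the Gaussian nonlocal response, and all parameter values are illustrative assumptions and are not taken from the article.

import numpy as np

# --- transverse grid (illustrative values) ---
nx, lx = 1024, 40.0
x = np.linspace(-lx / 2, lx / 2, nx, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(nx, d=dx)

# --- assumed model parameters ---
sigma = 4.0            # width of the assumed Gaussian nonlocal response
alpha0 = 0.05          # initial linear loss; equal gain is reached at z_end
z_end, dz = 40.0, 0.01
nz = int(z_end / dz)

# Gaussian nonlocal response, normalized to unit area; convolution is done via FFT
response = np.exp(-x**2 / (2 * sigma**2))
response /= response.sum() * dx
response_k = np.fft.fft(np.fft.ifftshift(response))

def alpha(z):
    """Linear loss that ramps slowly into gain; its integral over z_end vanishes."""
    return alpha0 * (1 - 2 * z / z_end)

# initial Gaussian beam (arbitrary illustrative profile)
psi0 = np.exp(-x**2)
psi = psi0.copy()

half_linear = np.exp(-0.5j * k**2 * (dz / 2))   # diffraction, half step in k-space

for n in range(nz):
    z = n * dz
    psi = np.fft.ifft(half_linear * np.fft.fft(psi))                 # half diffraction step
    intensity = np.abs(psi)**2
    # nonlocal nonlinear index: convolution of the response with the intensity
    nl_index = dx * np.real(np.fft.ifft(response_k * np.fft.fft(intensity)))
    psi *= np.exp(1j * nl_index * dz - alpha(z + dz / 2) * dz)        # nonlinearity + loss/gain
    psi = np.fft.ifft(half_linear * np.fft.fft(psi))                  # half diffraction step

# For a sufficiently slow ramp, the output should be close to the input beam.
power = np.sum(np.abs(psi)**2) * dx
overlap = np.abs(np.sum(np.conj(psi0) * psi)) * dx
overlap /= np.sqrt(np.sum(np.abs(psi0)**2) * dx * power)
print(f"output power: {power:.4f}, normalized overlap with input: {overlap:.4f}")

Shrinking dz and z_end (i.e., making the loss-to-gain ramp faster) in this sketch lets one probe how slow the variation must be before the output beam stops recovering its initial shape, which is the adiabaticity criterion discussed in the summary.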