Abstract
The phase-slip process, in which the phase of the order parameter of a one-dimensional system changes instantaneously by a multiple of 2π, is studied in detail. The general analytical results are supplemented by a computer simulation of the phase-slip process for the Ginzburg-Landau equation.
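A minimal illustration of such a simulation (not the authors' code; all parameter choices here are assumptions) evolves the dimensionless one-dimensional time-dependent Ginzburg-Landau equation, ∂ψ/∂t = ψ − |ψ|²ψ + ∂²ψ/∂x², on a periodic ring starting from an Eckhaus-unstable current-carrying plane wave. A phase slip then appears as a discrete drop in the winding number of ψ:

```python
import numpy as np

def winding_number(psi):
    """Total phase winding of psi around the periodic domain, in units of 2*pi."""
    dphase = np.angle(np.roll(psi, -1) / psi)  # phase jumps wrapped to (-pi, pi]
    return int(round(dphase.sum() / (2 * np.pi)))

def simulate(m=8, n=256, t_end=300.0, dt=0.01, seed=0):
    """Explicit-Euler integration of dpsi/dt = psi - |psi|^2 psi + psi_xx.

    The initial state is a plane wave with winding number m and wavenumber k
    chosen in the Eckhaus-unstable band 1/3 < k^2 < 1, so phase slips occur.
    """
    rng = np.random.default_rng(seed)
    k = 0.9                        # 1/3 < k^2 < 1: plane wave exists but is unstable
    L = 2 * np.pi * m / k          # ring length fixing the initial winding number m
    x = np.linspace(0, L, n, endpoint=False)
    dx = L / n                     # dt < dx^2/2 keeps the diffusion step stable
    psi = np.sqrt(1 - k**2) * np.exp(1j * k * x)
    psi += 1e-3 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))  # seed noise
    for _ in range(int(t_end / dt)):
        lap = (np.roll(psi, 1) - 2 * psi + np.roll(psi, -1)) / dx**2
        psi = psi + dt * (psi - np.abs(psi)**2 * psi + lap)
    return psi

m0 = 8
psi = simulate(m=m0)
print(winding_number(psi))  # smaller than 8: phase slips have unwound the phase
```

Each phase slip passes the order-parameter amplitude through zero at one point, letting the accumulated phase relax by 2π; the winding number keeps dropping until the remaining wavenumber lies in the Eckhaus-stable band k² < 1/3.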
- Received 22 July 1991
DOI: https://doi.org/10.1103/PhysRevA.45.4175
©1992 American Physical Society