Abstract
The correlation energy (the exact energy minus the Hartree-Fock energy) of an electron gas with a high and slowly varying density is examined. The term proportional to the square of the density gradient is evaluated by applying perturbation theory to the external field and the random-phase (or high-density) approximation to the Coulomb interaction. This term has the form ∫ B(n) |∇n|² d³r, where n(r) is the electron density. B(n) is found, by summing the leading divergent diagrams, to be a numerical constant times n⁻⁴ᐟ³ Ry, with length measured in units of the Bohr radius. The role of the density gradient in the correlation-energy problem of atoms is discussed.
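As a minimal numerical sketch (not taken from the paper), the gradient term ∫ B(n) |∇n|² d³r can be evaluated by radial quadrature for a spherically symmetric model density. The exponential density profile and the choice B(n) = C/n⁴ᐟ³ with C = 1 below are illustrative assumptions used only to exercise the quadrature, not values derived in the text.

```python
import numpy as np

def gradient_correction(alpha=1.0, C=1.0, rmax=40.0, npts=200_000):
    """Radial quadrature of C * |dn/dr|^2 / n^(4/3) * 4*pi*r^2 dr
    for the model density n(r) = (alpha^3/pi) * exp(-2*alpha*r)."""
    r = np.linspace(1e-8, rmax, npts)
    n = (alpha**3 / np.pi) * np.exp(-2.0 * alpha * r)  # hydrogen-like model density
    dndr = -2.0 * alpha * n                            # analytic radial derivative
    integrand = C * dndr**2 / n**(4.0 / 3.0) * 4.0 * np.pi * r**2
    dr = r[1] - r[0]
    # Trapezoid rule on the uniform radial grid
    return np.sum(0.5 * (integrand[1:] + integrand[:-1])) * dr

# For this particular density the integral is analytic:
# 13.5 * pi**(1/3) * alpha, which provides a check on the quadrature.
print(gradient_correction())        # ≈ 19.77 for alpha = 1
print(13.5 * np.pi ** (1.0 / 3.0))  # exact value for alpha = 1
```

For the exponential density, |∇n|²/n⁴ᐟ³ = 4α² n²ᐟ³, so the radial integral closes in elementary functions; this makes the model a convenient consistency test before applying the same quadrature to densities without a closed form.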
Received 15 June 1967
DOI: https://doi.org/10.1103/PhysRev.165.18
©1968 American Physical Society