Published online by Cambridge University Press: 10 March 2003
Using linear kinetic plasma theory, the relation between electron density and magnetic field fluctuations is derived for low-frequency plasma waves in a uniform magnetic field, assuming Maxwellian background distribution functions of arbitrary temperature. In the non-relativistic temperature limit, this relation is evaluated for the diffuse intercloud medium of our Galaxy. The diffuse intercloud medium is the dominant phase of the interstellar medium for radio wave propagation and hence for dispersion and rotation measure studies. The differences between the density-magnetic field fluctuation relation obtained from linear kinetic theory and that of classical magnetohydrodynamic theory are established and discussed.
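As a point of reference for the comparison mentioned above, the classical ideal-MHD baseline (a standard textbook relation, not a formula quoted from this paper) ties density and magnetic field fluctuations together for compressive waves, while shear Alfvén waves carry no density perturbation. For instance, for a fast magnetosonic wave propagating perpendicular to the mean field $\mathbf{B}_0$, and assuming a quasi-neutral electron-proton plasma so that $\delta n_e/n_{e,0} = \delta\rho/\rho_0$, the linearized continuity and induction equations give

\[
\frac{\delta n_e}{n_{e,0}} = \frac{\delta B_\parallel}{B_0},
\qquad\text{whereas}\qquad
\left.\frac{\delta n_e}{n_{e,0}}\right|_{\mathrm{Alfv\acute{e}n}} = 0 .
\]

The comparison described in the abstract concerns how the relation obtained from linear kinetic theory departs from MHD proportionalities of this type.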