I was basically repeating/interpreting what the fine article said:
FRBs show a frequency-dependent dispersion, a delay in the radio
signal whose size depends on how much material the signal has
passed through.
I believe interstellar dispersion of electromagnetic waves is
caused by interactions with the ionized interstellar medium. A
quick Google search for "interstellar dispersion" brought up this
page, which gives a formula relating the integral of electron
density along the signal's path (the "dispersion measure") to the
frequency-dependent delay. On that page they assume the density of
the interstellar medium inside our galaxy is known and use the
dispersion of signals from pulsars to estimate their distances.
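For concreteness, here is the standard form of that relation as a
minimal sketch (the 4.149 ms constant is the usual radio-astronomy
value for DM in pc/cm^3 and frequency in GHz; the example numbers
are illustrative, not from the article):

    # Extra arrival delay, relative to infinite frequency, for a
    # signal with a given dispersion measure
    # DM = integral of electron density along the path (pc/cm^3).
    def dispersion_delay_ms(dm_pc_cm3, freq_ghz):
        return 4.149 * dm_pc_cm3 / freq_ghz**2

    # A pulsar with DM = 100 pc/cm^3 observed at 1.4 GHz arrives
    # about 212 ms later than it would at very high frequency:
    print(dispersion_delay_ms(100.0, 1.4))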
In the FRB experiment they did it the other way around: they used
the known distance (from the red-shift of the source galaxy) and
the measured dispersion to estimate the integral of the density.
Integral measurements such as this usually give much more accurate
results than point measurements.
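Turned into code, the inversion is just a ratio; the numbers below
are illustrative FRB-scale values, not the paper's:

    # Line-of-sight average electron density from a measured
    # dispersion measure and a redshift-derived distance.
    def mean_electron_density(dm_pc_cm3, distance_pc):
        return dm_pc_cm3 / distance_pc   # electrons per cm^3

    # e.g. DM = 300 pc/cm^3 over 1 Gpc gives ~3e-7 electrons/cm^3:
    print(mean_electron_density(300.0, 1.0e9))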
If the interstellar medium were entirely made up of ionized
hydrogen, then knowing the electron density would give you the
total density, since there would be one proton for every electron.
You need corrections because only about 70% of the interstellar
medium (by mass) is hydrogen, so you have to estimate the number
of neutrons, and another small correction because the medium is
not 100% ionized.
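A rough sketch of those two corrections (the hydrogen fraction and
ionized fraction below are illustrative placeholders, not the
paper's values):

    # Convert an electron density into a total baryon (proton +
    # neutron) density. Fully ionized hydrogen gives one electron
    # per baryon; helium gives 2 electrons per 4 baryons, diluting
    # that ratio, and un-ionized gas contributes no free electrons.
    def baryon_density(n_e, hydrogen_mass_fraction=0.75,
                       ionized_fraction=1.0):
        X = hydrogen_mass_fraction
        Y = 1.0 - X                         # treat the rest as helium
        electrons_per_baryon = X + Y / 2.0  # ~0.875 for X = 0.75
        return n_e / (electrons_per_baryon * ionized_fraction)

    print(baryon_density(3.0e-7))   # ~3.4e-7 baryons per cm^3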
The reason why the dispersion is related to the electron density
is given, for example, in J. D. Jackson's Classical
Electrodynamics, where the dielectric "constant" (and hence the
speed of light) is shown to be a function of the frequency. The
electromagnetic wave causes the electrons to wiggle (just a
little); the higher the frequency, the less wiggling, so higher
frequencies are slowed less than lower ones. You can think of it
like an eccentric front tire on your car that makes the steering
wheel vibrate: once you go fast enough that the frequency of the
oscillations is much greater than the resonant frequencies of the
components, the vibration you feel dies down, because the
direction reverses before anything can move very far. In the
interstellar medium each electron slows down the wave just a
little, and the total amount of slowing is proportional to the
number of electrons encountered.
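In numbers: for a cold plasma the group velocity is
c*sqrt(1 - (nu_p/nu)^2), where the plasma frequency nu_p is set by
the electron density. A quick sketch of just how tiny the slowing
is (the density value is a typical Milky Way figure, not from the
article):

    import math

    C = 2.998e8  # speed of light, m/s

    def group_velocity(freq_hz, n_e_cm3):
        # Plasma frequency: ~8.98 kHz * sqrt(n_e in cm^-3).
        nu_p = 8980.0 * math.sqrt(n_e_cm3)
        return C * math.sqrt(1.0 - (nu_p / freq_hz)**2)

    # At 1.4 GHz in a ~0.03 cm^-3 medium the wave is slowed by only
    # ~6 parts in 10^13 -- tiny, but it accumulates over
    # kiloparsecs, and lower frequencies are slowed more:
    print(1.0 - group_velocity(1.4e9, 0.03) / C)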
Usually, the "closeness" of the electrons is not considered;
instead, the integrated electron density is used. But it is
possible to calculate how close the electrons have to be to the
line between the source and the receiver using what is called the
3-point function. The cross section of the volume that contributes
a fixed percentage of the signal will be roughly elliptical, with
the source and the receiver at the two foci of the ellipse.
Given the vast intergalactic distances involved, it will probably
be extremely wide near the middle by any earthly standards, but
this width is independent of the calculation of the dispersion.
I'd expect the width to scale as sqrt(d * c * t), where d is the
distance between the source and the receiver, c is the speed of
light, and t is roughly the duration of the signal (more
accurately, the inverse of the bandwidth).
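Plugging in illustrative FRB-scale numbers (1 Gpc, 1 ms; my
guesses, not the paper's) gives a mid-path width of roughly a
tenth of a parsec:

    import math

    C = 2.998e8    # speed of light, m/s
    PC = 3.086e16  # metres per parsec

    d = 1.0e9 * PC  # source-receiver distance: 1 Gpc, in metres
    t = 1.0e-3      # signal duration: ~1 ms
    width_m = math.sqrt(d * C * t)
    print(width_m / PC)  # ~0.1 pc -- huge by earthly standards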
The reason the closeness (the width of the 3-point function)
doesn't matter is that the more spread out the 3-point function
is, the weaker it is. All that matters is what you get when you
sum it all up.