Abstract
It has been suggested that radiative transfer effects may explain the
unusually high equivalent widths (EWs) of the Lya line, observed occasionally
from starburst galaxies, especially at high redshifts. If the dust is locked up
inside high-density clouds dispersed in an empty intercloud medium, the Lya
photons could scatter off the surfaces of the clouds, effectively having
their journey confined to the dustless medium. The continuum radiation, on the
other hand, does not scatter, and would thus be subject to absorption inside
the clouds. This scenario is routinely invoked when Lya EWs higher than
theoretically expected are observed, although the idealized conditions under
which the result is derived are usually not considered. Here we systematically
examine the relevant physical parameters in this idealized framework, testing
whether any astrophysically realistic scenarios may lead to such an effect. It
is found that although clumpiness indeed facilitates the escape of Lya, it is
highly unlikely that any realistic interstellar medium would result in a
preferential escape of Lya over continuum radiation. Other possible causes are
discussed, and it is concluded that the observed high EWs are more likely to be
caused by cooling radiation from cold accretion and/or anisotropic escape of
the Lya radiation.