Abstract
High surface density, rapidly star-forming galaxies are observed to have
$50$--$100\,\mathrm{km\,s^{-1}}$ line-of-sight velocity dispersions, which
are much higher than expected from supernova driving alone, but may arise from
large-scale gravitational instabilities. Using three-dimensional simulations of
local regions of the interstellar medium, we explore the impact of high
velocity dispersions that arise from these disk instabilities. Parametrizing
disks by their surface densities and epicyclic frequencies, we conduct a series
of simulations that probe a broad range of conditions. Turbulence is driven
purely horizontally and on large scales, neglecting any energy input from
supernovae. We find that such motions lead to strong global outflows in the
highly compact disks that were common at high redshifts, but weak or negligible
mass loss in the more diffuse disks that are prevalent today. Substantial
outflows are generated if the one-dimensional horizontal velocity dispersion
exceeds $35\,\mathrm{km\,s^{-1}}$, as occurs in the dense disks that have
star formation rate densities above $0.1\,M_\odot\,\mathrm{yr^{-1}\,kpc^{-2}}$. These outflows are triggered by a thermal runaway,
arising from the inefficient cooling of hot material coupled with successive
heating from turbulent driving. Thus, even in the absence of stellar feedback,
a critical value of the star-formation rate density for outflow generation can
arise due to a turbulent heating instability. This suggests that in strongly
self-gravitating disks, outflows may be enhanced by, but need not be caused by,
energy input from supernovae.