Abstract
In this work we use gamma-ray burst (GRB) afterglow spectra observed with the
VLT/X-shooter spectrograph to measure rest-frame extinction in GRB
lines-of-sight by modeling the broadband near-infrared (NIR) to X-ray afterglow
spectral energy distributions (SEDs). Our sample consists of nine Swift GRBs,
eight of them belonging to the long-duration and one to the short-duration
class. Dust is modeled using the average extinction curves of the Milky Way and
the two Magellanic Clouds. We derive the rest-frame extinction of the entire
sample; the values fall in the range $0 \lesssim A_V \lesssim 1.2$.
Moreover, the SMC extinction curve is the preferred extinction curve template
for the majority of our sample, in agreement with what is commonly observed
along GRB lines of sight. In one analysed case (GRB 120119A),
the common extinction curve templates fail to reproduce the observed
extinction. To illustrate the advantage of using the high-quality X-shooter
afterglow SEDs over the photometric SEDs, we repeat the modeling using the
broadband SEDs with the NIR-to-UV photometric measurements instead of the
spectra. The main result is that the spectroscopic data, thanks to a
combination of excellent resolution and coverage of the blue part of the SED,
are more successful than photometric measurements in constraining the
extinction curves and therefore the dust properties in GRB hosts. In all
cases but one, one extinction curve template is preferred over the
others. We show that the modeled values of the extinction and the spectral
slope, obtained through spectroscopic and photometric SED analysis, can differ
significantly for individual events. Finally, we stress that, regardless of
the resolution of the optical-to-NIR data, the SED modeling gives reliable
results only when the fit is performed on an SED covering a spectral range
broader than the optical-to-NIR window alone.
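The broadband modeling described above can be sketched in code. The following is a minimal illustration, not the actual analysis pipeline: the afterglow continuum is assumed to be a single power law $F_\nu \propto \nu^{-\beta}$, and the extinction law is a hypothetical single power-law proxy for an SMC-like curve rather than the full MW/LMC/SMC parameterizations used in such studies. All names and parameter values here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def extinction_mag(wave_um, A_V):
    """Hypothetical SMC-like proxy: A(lambda)/A_V ~ (lambda/0.55 um)^-1.2.
    A stand-in for a real extinction-curve template, for illustration only."""
    return A_V * (wave_um / 0.55) ** -1.2

def afterglow_sed(wave_um, log_F0, beta, A_V):
    """Power-law continuum F_nu ~ nu^-beta, attenuated by dust in magnitudes."""
    nu = 3e14 / wave_um                      # frequency in Hz (c / lambda)
    continuum = 10.0 ** log_F0 * (nu / 1e15) ** -beta
    return continuum * 10.0 ** (-0.4 * extinction_mag(wave_um, A_V))

# Synthetic "observed" NIR-to-UV SED with 2% noise, then a least-squares fit
# recovering the spectral slope beta and the rest-frame extinction A_V.
rng = np.random.default_rng(0)
wave = np.linspace(0.3, 2.4, 40)             # wavelength grid in microns
truth = afterglow_sed(wave, 0.0, 0.8, 0.4)   # true beta = 0.8, A_V = 0.4
observed = truth * (1.0 + 0.02 * rng.standard_normal(wave.size))
popt, _ = curve_fit(afterglow_sed, wave, observed, p0=[0.0, 1.0, 0.1])
log_F0, beta, A_V = popt
```

Because the dust term curves the SED in log space while the continuum is a pure power law, the two contributions are separable when the wavelength coverage is wide enough; with a narrow or noisy photometric SED, $\beta$ and $A_V$ become degenerate, which is the degeneracy the X-shooter spectra help break.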