Currently, `tau` is used as the strength of the blackbody (stellar continuum) and modified blackbody (dust continuum) components: the blackbody profile is scaled by this amplitude, and the fitted amplitude is reported as `tau` in the `Features` table.
In general, the fact that we often have to deal with the tails of the blackbody profile makes these components hard to use. In my experience, the typical order of magnitude of the resulting `tau` values varies widely, scaling strongly with the temperature of the blackbody. I suspect this sometimes causes numerical issues; very often, the minimizer uses only two components, while the `tau` values of all the rest are set to 0.
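To illustrate the scale of the problem, here is a minimal sketch of how strongly the Planck function varies with temperature at a mid-infrared wavelength (assuming the component is simply $\tau$ times $B_\nu(T)$; the temperatures below are illustrative, not the exact set the fitter uses):

```python
# Sketch: B_nu at 15 micron for a range of continuum-like temperatures.
# If the model flux is tau * B_nu(T), the fitted tau values must compensate
# for these huge differences, which is why they span many orders of magnitude.
import math

H = 6.62607015e-34   # Planck constant [J s]
C = 2.99792458e8     # speed of light [m / s]
K_B = 1.380649e-23   # Boltzmann constant [J / K]

def planck_nu(wav_um, T):
    """Planck function B_nu(T) in SI units [W m^-2 Hz^-1 sr^-1]."""
    nu = C / (wav_um * 1e-6)
    return 2 * H * nu**3 / C**2 / math.expm1(H * nu / (K_B * T))

# 15 micron sits in the middle of a typical MIR fitting range.
for T in (5000, 300, 135, 40):
    print(f"T = {T:5d} K  ->  B_nu(15 um) = {planck_nu(15.0, T):.3e}")
```

The 5000 K (stellar) and 40 K (cold dust) values differ by many orders of magnitude at the same wavelength, so the corresponding `tau` values must differ by the inverse factor to contribute comparably.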
We could think of an alternate parameterization / normalization for these components. A few ideas:
Amplitude at the peak wavelength. This becomes a problem, however, when the peak is far "off screen" and only the tail of the blackbody is seen.
Amplitude at a fixed wavelength, say 20 or 30 microns. This would be OK for most dust components, since at most dust temperatures the blackbody profile has a reasonable contribution there. It will not work for the stellar component, however.
Total power: we could introduce something like "PowerBlackbody". Again, this might be problematic for components that peak far outside the wavelength range. But at least it would allow us to set reasonable and perhaps physically motivated upper limits.
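The second option could be sketched like this (a hypothetical parameterization, not current PAHFIT API; it assumes the component evaluates to $\tau \, B_\nu(T)$, and the function names are illustrative):

```python
# Hedged sketch of a "fixed reference wavelength" parameterization: the free
# parameter amp_ref is the model surface brightness at wav_ref_um, so its
# scale is comparable across components regardless of temperature.
import math

H, C, K_B = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck_nu(wav_um, T):
    """Planck function B_nu(T) in SI units [W m^-2 Hz^-1 sr^-1]."""
    nu = C / (wav_um * 1e-6)
    return 2 * H * nu**3 / C**2 / math.expm1(H * nu / (K_B * T))

def blackbody_fixed_ref(wav_um, T, amp_ref, wav_ref_um=20.0):
    """Blackbody whose amplitude is defined at wav_ref_um instead of raw tau."""
    return amp_ref * planck_nu(wav_um, T) / planck_nu(wav_ref_um, T)

def tau_from_amp_ref(amp_ref, T, wav_ref_um=20.0):
    """Recover the old-style tau: the model is amp_ref / B_nu(wav_ref, T) * B_nu(wav, T)."""
    return amp_ref / planck_nu(wav_ref_um, T)
```

By construction, `blackbody_fixed_ref(wav_ref_um, T, amp_ref)` equals `amp_ref`, and the conversion back to `tau` is a per-temperature constant, so fits in either parameterization are mathematically equivalent and differ only in conditioning.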
I'm not actually sure the zero amplitudes result from numerical instability or "tail fitting"; rather, they may simply be unimportant to the final continuum shape. $\tau$ is roughly the "amplitude at the peak", which, as you say, may be far higher and off to the red of the fitted wavelength regime. I don't think the total power of a blackbody makes a lot of sense, since it is not constrained anywhere, and it might encourage people to interpret it as total IR luminosity. A fixed-wavelength amplitude might be OK, but given the (typically) fixed temperatures, this would just be a constant numerical scaling of the amplitude by some set of factors of order 1-10 (guessing).
$\tau$ does intrinsically have surface brightness unit scaling. $\tau_\star$ can thus be interpreted as the fraction of the aperture occupied by the surface of stars (which we consider equivalent). We could normalize that to some fixed fractional value and use those for our internal $\tau$ units, but I'd like to be convinced there's a numerical issue in the first place.
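To spell that interpretation out (just restating the argument above, under the assumption that the stellar component evaluates to $\tau_\star B_\nu(T_\star)$): if stellar surfaces of brightness $B_\nu(T_\star)$ fill a fraction $f$ of the aperture, the observed surface brightness is

$$ I_\nu = f\,B_\nu(T_\star) = \tau_\star\,B_\nu(T_\star) \;\Rightarrow\; \tau_\star = f, $$

i.e. $\tau_\star$ is the stellar surface filling fraction of the aperture.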