For our purposes, chroma is the distance between a color and the achromatic color (“white”) of the same luminance. The maximum chroma a color space can handle depends on the hue, the luminance and the color space primaries. Bringing in the opponent representation is a bit off-topic for something I wrote mostly to reassure users, not to be a class on color management for devs.
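To make the definition concrete, here is a minimal sketch of that distance in an opponent (Lab-like) plane of constant luminance, where the achromatic axis sits at the origin. The function name and coordinates are illustrative, not part of any particular library:

```python
import math

def chroma(a, b):
    """Chroma as the Euclidean distance from the achromatic axis
    (a = b = 0) in an opponent plane of constant luminance."""
    return math.hypot(a, b)

# A neutral grey has zero chroma; a vivid color a large one.
print(chroma(0.0, 0.0))    # 0.0
print(chroma(30.0, 40.0))  # 50.0
```

The hue is then just the angle atan2(b, a) in the same plane, which is why the max chroma boundary varies with hue.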
Don’t forget that sensor RGB is decoded to XYZ through an input profile: a 3×3 matrix obtained by a least-squares fit over low-saturation samples. As you can see on the graph above for the Nikon D810, the fitted coefficients send values into the UV zone whether or not the sensor actually recorded them there. Also, sensor metamerism doesn’t match human metamerism: the spectrum → tristimulus projection is not injective in either case, so two spectra can produce the same tristimulus on one sensor and not on another. So, in any case, shit’s gonna happen in the blue-indigo region, and given the circumstances the highest priority is to get smooth gradients that don’t cross several hues, rather than to aim at accurate color matching.
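For the curious, the fitting step can be sketched as an ordinary least-squares problem. The patch values below are made up for illustration (they are not real D810 measurements), but the mechanics are the same: fit the matrix on well-behaved samples, then apply it to every pixel, including saturated ones the fit never saw, which is how values end up pushed where the sensor never recorded them:

```python
import numpy as np

# Hypothetical calibration data: sensor RGB and reference XYZ for a few
# low-saturation patches (made-up numbers, purely for illustration).
rgb = np.array([[0.20, 0.21, 0.19],
                [0.50, 0.48, 0.45],
                [0.30, 0.25, 0.20],
                [0.15, 0.20, 0.30],
                [0.40, 0.35, 0.30]])
xyz = np.array([[0.19, 0.20, 0.21],
                [0.46, 0.48, 0.50],
                [0.26, 0.25, 0.23],
                [0.18, 0.19, 0.33],
                [0.36, 0.36, 0.33]])

# Least-squares fit of the 3x3 matrix M such that xyz ≈ rgb @ M.T,
# i.e. per pixel: XYZ ≈ M @ RGB.
M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
M = M.T

# The same matrix is then applied blindly to saturated pixels,
# extrapolating far outside the region where it was fitted.
print(M)
```

The key point is that the matrix is only a least-squares compromise over the training patches; outside that region it extrapolates linearly, with no guarantee the result stays inside anything the sensor (or the eye) can actually see.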