Here’s the thought process I’ve recently been through in case it helps.
How my setup was (a misconfigured mess)
I have a Dell U3011 monitor which I bought years ago mainly for its size, resolution and aspect ratio (16:10 instead of the less-useful-to-me 16:9). I ran it in its standard (full gamut, though that kind of thing wasn’t really on my radar back then) mode, but everything in my sRGB/web life was very bright and saturated, and I compensated by turning the contrast right down.
This made for quite a nice experience for me individually, in isolation, as colours were still saturated and I don’t like my displays too bright or contrasty.
What I then tried (attempting Adobe RGB)
When I recently bought myself a new DSLR and started to get back into photography a bit more, I looked into display colour issues and read a few articles. At some point I looked up my monitor’s specs and was surprised to learn that it was wide(r) gamut (I’d been assuming it was sRGB), and that it can “do” Adobe RGB. I’d read that working in Adobe RGB could be problematic, but I wanted to give it a go, so I started to experiment.
I did more reading and used some of the tools @houz mentions above. I think I had some success: my monitor’s Adobe RGB mode, Ubuntu and darktable seemed to make this viable, though I wasn’t very sure of the results.
Where I am now (aiming low)
However, I ended up deciding that I didn’t need colours outside sRGB for now, especially since if you’re going to go to the bother of Adobe RGB it seems you should really do a proper monitor calibration rather than rely on the manufacturer’s supplied profile, which I’m not keen to do at this stage. I also knew I could always re-process my raws into Adobe RGB rather than sRGB later if I wanted those richer colours. (Raw files aren’t in either colour space; they have to be rendered into one at processing time.) I don’t print often, and my images will mostly be viewed on the web, where it’s unfortunately best to assume that viewers have no colour management. I’m a web developer and hobbyist photographer, so I think sRGB is currently right for me.
Wishing to stick with the simple sRGB route, I switched my monitor to its preconfigured sRGB mode. However, this made all colours very dull and unsaturated. I finally realised that my low contrast setting from when I initially adjusted the monitor was to blame; turning it up again restored reasonably saturated colours, though they’re still less saturated than in the standard/full-gamut mode I’d got used to. As I understand it, though, these slightly less saturated colours are closer to what most viewers will see, since typical monitors can’t reproduce the saturation that wider-gamut displays can.
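To put a rough number on that saturation difference: if sRGB-encoded pixel values are sent straight to a display in its wide-gamut mode with no colour management, a fully saturated sRGB red gets stretched onto the panel’s more saturated red primary. A minimal pure-Python sketch of the conversion (using the published D65 sRGB and Adobe RGB (1998) matrices; this is just the standard maths for illustration, not what any particular monitor does internally):

```python
# Sketch: what value an Adobe RGB display *should* be sent
# to reproduce a given sRGB colour.

def srgb_to_linear(c):
    # sRGB transfer function (linear segment + 2.4 power curve)
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

SRGB_TO_XYZ = [            # standard D65 sRGB -> XYZ matrix
    (0.4124, 0.3576, 0.1805),
    (0.2126, 0.7152, 0.0722),
    (0.0193, 0.1192, 0.9505),
]
XYZ_TO_ADOBE = [           # standard XYZ -> Adobe RGB (1998) matrix
    (2.04159, -0.56501, -0.34473),
    (-0.96924, 1.87597, 0.04156),
    (0.01344, -0.11836, 1.01517),
]

def srgb_to_adobe(rgb):
    lin = [srgb_to_linear(c) for c in rgb]
    xyz = [sum(m * v for m, v in zip(row, lin)) for row in SRGB_TO_XYZ]
    # clamp tiny negatives from rounding before applying gamma
    adobe_lin = [max(0.0, sum(m * v for m, v in zip(row, xyz)))
                 for row in XYZ_TO_ADOBE]
    # Adobe RGB uses a plain ~2.2 gamma (563/256 exactly)
    return [c ** (256 / 563) for c in adobe_lin]

print([round(c, 3) for c in srgb_to_adobe((1.0, 0.0, 0.0))])
```

Fully saturated sRGB red comes out at roughly 86% of the Adobe RGB red primary, which is exactly why unmanaged sRGB content looks punchier than it should in a wide-gamut mode: the display is being driven to 100% when it should be at ~86%.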
I’m not sure if this helps you in your situation, but maybe it gives you some things to think about and try.
My OS and monitor are both set to sRGB, and your images look the same to me in both browsers; I would say the saturation level is good, as the red hair is quite bright while still showing detail. I opened one of the thumbnails in GIMP and saw that it has an sRGB colour profile embedded, and that being the case, Firefox shouldn’t erroneously desaturate it as I understand it. Nor should it if there were no embedded colour profile, as it should then assume sRGB.
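For anyone who wants to check for an embedded profile without opening GIMP: in a JPEG the ICC profile is stored in one or more APP2 segments tagged "ICC_PROFILE". Here’s a rough stdlib-only Python sketch that pulls the raw profile bytes out (it assumes the profile segments appear before the scan data, which they normally do; if you have Pillow installed, `Image.open(path).info.get("icc_profile")` gives you the same bytes with less fuss):

```python
def extract_icc_from_jpeg(data: bytes):
    """Return the concatenated ICC profile bytes from a JPEG, or None."""
    if data[:2] != b"\xff\xd8":          # no SOI marker: not a JPEG
        return None
    chunks = {}
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break
        marker = data[i + 1]
        if marker in (0xD9, 0xDA):       # EOI / start-of-scan: headers done
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")
        payload = data[i + 4:i + 2 + length]
        if marker == 0xE2 and payload.startswith(b"ICC_PROFILE\x00"):
            seq = payload[12]            # chunk sequence number (1-based)
            chunks[seq] = payload[14:]   # skip identifier + seq/count bytes
        i += 2 + length
    if not chunks:
        return None
    return b"".join(chunks[k] for k in sorted(chunks))

# Usage: icc = extract_icc_from_jpeg(open("thumb.jpg", "rb").read())
# (hypothetical filename). A None result means no embedded profile,
# in which case browsers should assume sRGB.
```

The returned blob starts with the 128-byte ICC header, and an sRGB profile usually names itself in its description tag, so even a crude `b"sRGB" in icc` is a reasonable first check.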