Standards. We love 'em, so let's have lots of them.
System B/G has a rather narrow vestigial sideband (0.75 MHz) and cuts off a fair bit of the upper sideband of the colour subcarrier. It also uses phase-response pre-distortion at the transmitter to allow for the typical receiver IF response. This may have been a good idea at the time, but SAW filters made it a slight nuisance.
System I (UK) is a lot cleaner: 1.25 MHz VSB and near enough the full upper chroma sideband.
Coming back to the original subject, PAL is a lot more resistant to all these problems than NTSC. In a properly designed delay-line PAL decoder, any phase (hue) errors are converted to saturation errors, which are not very visible.
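To see why, here's a quick numerical sketch (my own toy model, not anyone's production decoder): treat the chroma on two successive lines as phasors with the V axis alternating in sign, rotate both by a common phase error, then average the way the delay line effectively does. The hue comes out untouched; only the magnitude shrinks by cos(error).

    import numpy as np

    def pal_delay_line_decode(U, V, phase_error_deg):
        phi = np.deg2rad(phase_error_deg)
        err = np.exp(1j * phi)              # common phase (hue) error
        line_n  = (U + 1j * V) * err        # this line: V positive
        line_n1 = (U - 1j * V) * err        # next line: V-switch negates V
        u = (line_n + line_n1).real / 2     # sum of the two lines recovers U
        v = (line_n - line_n1).imag / 2     # difference (switch undone) recovers V
        return u, v

    U, V = 0.3, 0.4                         # saturation 0.5 at some hue
    u, v = pal_delay_line_decode(U, V, 20.0)
    print(np.hypot(u, v))                   # 0.47: saturation down by cos(20 deg)
    print(np.degrees(np.arctan2(v, u)))     # 53.13: hue exactly as transmitted

Even a gross 20-degree differential phase error only costs about 6% of saturation, which is why nobody in PAL-land ever needed a tint knob.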
Looking back, we often wonder why certain decisions were made when setting standards. You guys in the US did a wonderful job when you changed the field rate to 59.94 Hz rather than tweak the sound slightly. Who likes drop-frame timecode?
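For anyone who hasn't had the pleasure, here's roughly what the bookkeeping looks like (a sketch of the usual SMPTE 29.97 drop-frame rule as I understand it; the function name is mine): the labels ;00 and ;01 are skipped at the start of every minute except every tenth minute, so timecode stays within about a frame of wall-clock time. No actual frames are dropped, only the numbering jumps.

    def frames_to_dropframe(frame_number):
        # 29.97 fps drop-frame: 2 labels dropped per minute, except
        # minutes 0, 10, 20, 30, 40, 50.
        DROP = 2
        per_min = 30 * 60 - DROP            # 1798 frames in a "dropped" minute
        per_10min = per_min * 10 + DROP     # 17982 frames per 10-minute block
        d, m = divmod(frame_number, per_10min)
        if m > DROP:
            frame_number += DROP * 9 * d + DROP * ((m - DROP) // per_min)
        else:
            frame_number += DROP * 9 * d
        ff = frame_number % 30
        ss = (frame_number // 30) % 60
        mm = (frame_number // 1800) % 60
        hh = frame_number // 108000
        return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

    print(frames_to_dropframe(1800))    # 00:01:00;02 -- jumps straight past ;00/;01
    print(frames_to_dropframe(17982))   # 00:10:00;00 -- tenth minute keeps them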

In Europe we're hardly any better, with multiple standards including SECAM, which was a nightmare in vision mixers. At least we don't waste transmitter power on 7.5 IRE setup.
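To put a rough number on that (my own back-of-envelope, assuming the usual System M RF levels of sync tip = 100% carrier, blanking = 75%, reference white = 12.5%): with setup, 7.5 of the 100 IRE of luma range carries no picture at all, so for the same sync-tip (peak) power you give away a fraction of a dB of picture modulation.

    import math

    carrier_per_ire = (75.0 - 12.5) / 100.0           # % carrier per IRE of video
    black_with_setup = 75.0 - 7.5 * carrier_per_ire
    print(black_with_setup)                           # black sits at ~70.3% carrier,
                                                      # not at blanking (75%)
    print(20 * math.log10(100.0 / 92.5))              # ~0.68 dB of luma range lost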
And finally... PAL was a good idea at the time, since it was much less critical than NTSC in several ways. In retrospect that advantage went away pretty quickly as circuit techniques etc. developed, and it's far harder to decode a PAL signal cleanly with comb filters.
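A quick numerical illustration of the comb filter point (again my own toy model; real decoders are messier, and I'm ignoring PAL's extra 25 Hz offset and the V-switch entirely). What matters is the subcarrier phase step from one line to the next: NTSC puts 227.5 cycles on a line (a 180-degree step), PAL about 283.75 (a 270-degree step), so a simple one-line-delay comb cancels NTSC chroma but not PAL's.

    import numpy as np

    t = np.arange(1000) / 1000.0              # one line of normalised time

    def luma_comb_leakage(cycles_per_line, lines_apart):
        step = 2 * np.pi * cycles_per_line * lines_apart
        line_a = np.cos(2 * np.pi * 5 * t)           # stand-in subcarrier; the
        line_b = np.cos(2 * np.pi * 5 * t + step)    # absolute frequency is
        return np.max(np.abs(line_a + line_b)) / 2   # irrelevant, only the step

    print(luma_comb_leakage(227.5, 1))    # NTSC, 1H comb: ~0.00 -- chroma cancels
    print(luma_comb_leakage(283.75, 1))   # PAL,  1H comb: ~0.71 -- it doesn't
    print(luma_comb_leakage(283.75, 2))   # PAL,  2H comb: ~0.00 -- needs two lines

Hence the extra line store (and the rest of the complications) in a decent PAL comb decoder, where an NTSC set gets away with a single 1H delay.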