Having owned an audio manufacturing company for a few years and having built many tube amps, I wonder how they are getting 20 Hz-30 kHz +/- 1 dB!!
I couldn't realistically get any better than about +/- 3 dB, and that was 30 Hz-18 kHz. Keep in mind that +/- 1 dB is only about a 12% swing in output voltage, which is a very tight window to hold with iron.
UTC's LS and HA series boast high-fidelity response of +/- 1 dB or so, anywhere from 30-15,000~ to 7-40,000~ depending on the transformer. It's been my experience that the frequency response of the vast majority of the LS transformers tends to get ragged above about 11K, with some serious peaks and dips. I have some of the older pre-WW2 transformers with the cruder cast-iron cases, and several of them drop off in response above about 5,000~.
I ran a test with my LS-49 class B driver transformer, connecting up a scope and feeding one end of the secondary winding into the horizontal plates and the other end into the vertical plates, with the center tap grounded. Sweeping through the audio range, I should have seen a straight diagonal line at any setting of the signal generator. But in reality, above 11 or 12K the line began to open into a narrow ellipse, indicating a slight phase shift between the two ends of the winding at the higher audio frequencies. That was a brand new-old-stock transformer from a sealed box, and the frequency response was rated flat to at least 20K.
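For anyone who wants to put a number on that ellipse: with equal drive on both plates, the trace crosses the vertical axis at a height of sin(phi) times the full vertical swing, so the phase shift can be read straight off the pattern. Here's a quick Python sketch of that geometry (the 5-degree skew and the 12 kHz test frequency are made-up illustration values, not measurements off my LS-49):

```python
import numpy as np

def lissajous_phase_deg(x, y):
    """Read the phase shift off a Lissajous pattern: with
    x = A*sin(wt) and y = A*sin(wt + phi), the trace is a straight
    diagonal line when phi = 0, and an ellipse whose y-intercept
    satisfies sin(phi) = y0 / Ymax when phi != 0."""
    s = np.where(np.diff(np.sign(x)) != 0)[0]        # samples where x crosses zero
    frac = -x[s] / (x[s + 1] - x[s])                 # interpolate to the exact crossing
    y0 = np.abs(y[s] + frac * (y[s + 1] - y[s])).max()
    return np.degrees(np.arcsin(y0 / np.abs(y).max()))

f = 12_000.0                                         # Hz, about where the split showed up
t = np.linspace(0.0, 10.0 / f, 10_000, endpoint=False)   # ten cycles, 1000 points each
x = np.sin(2 * np.pi * f * t)                        # one end of the secondary
y = np.sin(2 * np.pi * f * t + np.radians(5.0))      # other end, skewed 5 degrees

print(f"phase shift read from the pattern: {lissajous_phase_deg(x, y):.1f} degrees")
# -> phase shift read from the pattern: 5.0 degrees
```

On a real scope you do the same thing by eye: compare the height where the ellipse crosses the vertical axis against the full height of the pattern.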
I didn't see it as a problem, and I have been using the transformer for over a decade now with good performance. But it demonstrates that UTC apparently over-rated the specs on their transformers.
And I doubt that the transformers deteriorated over time; what could happen to a transformer to cause its frequency response to degrade with age? I suspect UTC's quality control banked on most users taking their word for the specs without giving them a rigorous test, and perhaps cheerfully replaced the transformer with a carefully tested one if a customer did happen to complain.