Question about amplifier voltage and dB: test tones vs music

I have a question about amplifiers and how the output voltage relates to the dB level produced in headphones. If I understand correctly, the sensitivity of a headphone is expressed as dB per volt (or dB per mW), and headphone manufacturers usually state the sensitivity in reference to a 1 kHz sine wave signal. So for example, if the sensitivity rating of my headphones is 83 dB/V and I play a 1 kHz sine wave test tone from REW (at 0 dB rel) and my voltmeter says the amplifier is outputting 1 V, I know I'm getting roughly 83 dB, give or take unit variation, optimistic manufacturer specs, seal, and so on.

Now if I change the signal to something with more bandwidth than a single frequency, like music, will the dB/V relationship remain constant? Or does the amplifier need to output more voltage to achieve the same dB level?
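To make sure I'm doing the arithmetic right, this is how I understand the dB/V spec to scale with voltage (a quick Python sketch; the 83 dB/V figure is just my example headphone):

```python
import math

def spl_from_voltage(sensitivity_db_per_v: float, v_rms: float) -> float:
    """Estimated SPL from a dB/V sensitivity rating and the RMS drive voltage.

    Assumes the rating scales as 20 * log10 of the RMS voltage, which is how
    I understand the spec for a 1 kHz sine.
    """
    return sensitivity_db_per_v + 20 * math.log10(v_rms)

print(spl_from_voltage(83, 1.0))  # 83.0 dB at 1 Vrms
print(spl_from_voltage(83, 2.0))  # ~89.0 dB: +6 dB per doubling of voltage
```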

If the dB/V relationship remains constant, then I must be misinformed about manufacturers specifying sensitivity with a 1 kHz sine wave, since the test signal would seem not to matter. If music requires more voltage, then aren't headphone amplifier power requirement calculators a bit optimistic?
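For reference, here is the arithmetic I assume those calculators are doing, based on the sine-wave rating (the 300 ohm impedance and 100 dB target are just placeholder numbers):

```python
def required_vrms(sensitivity_db_per_v: float, target_spl_db: float) -> float:
    """RMS voltage needed to hit a target SPL, if the dB/V rating scales log-linearly."""
    return 10 ** ((target_spl_db - sensitivity_db_per_v) / 20)

def required_power_mw(v_rms: float, impedance_ohms: float) -> float:
    """Corresponding power into the nominal impedance: P = V^2 / R, in mW."""
    return (v_rms ** 2) / impedance_ohms * 1000

v = required_vrms(83, 100)           # ~7.08 Vrms for 100 dB with an 83 dB/V headphone
print(v, required_power_mw(v, 300))  # ~167 mW into 300 ohms
```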