Some here have looked at Fourier analysis and have the basic concept firmly in their heads, and a few of those understand a lot of the details. I’m guessing, though, that there’s more than one person in this thread who doesn’t, and who doesn’t see the relationship between things like an impulse response or a square wave and the frequency response.
What I do for a living often centers on MR imaging. We make pictures of parts of your body (like ear canals) by making different parts of the image resonate at slightly different frequencies and phases, so that we can take a little squiggle that comes out of an antenna and turn it into a picture. This whole thing relies on Fourier. There are many nuances that I’m surely unaware of, but I have presented the gist of it more than once in my time, and getting someone to play with it themselves can often be eye-opening.
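If you want to see that principle stripped down to a few lines of code, here is a toy 1-D sketch (nothing like a real scanner, and the "object" and its size are just made up): the squiggle from the antenna is, in this idealization, the Fourier transform of the object, and an inverse FFT turns it back into the picture.

# Toy 1-D sketch of the Fourier idea behind MR imaging (heavily simplified).
import numpy as np

# A crude 1-D "object": two bright regions on a dark background.
obj = np.zeros(256)
obj[60:90] = 1.0
obj[150:200] = 0.5

# Each position is encoded at a slightly different frequency/phase, so the
# recorded "antenna signal" is (in this idealization) the Fourier transform
# of the object -- what MR people call k-space.
antenna_signal = np.fft.fft(obj)

# Reconstruction: an inverse FFT turns the squiggle back into the picture.
picture = np.fft.ifft(antenna_signal).real

print("worst reconstruction error:", np.max(np.abs(picture - obj)))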
Fortunately, with REW and Audacity, you can do this yourself. Here’s a brief how-to:
You’ll generate some sound files and use the spectral-analysis tools in REW or Audacity to see how things like random noise or impulses are made up of weightings of all the possible frequencies (based on your sampling rate), how applying EQ “slows” the impulse response, and how a “slowed” impulse response in turn affects the frequency response.
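For anyone who would rather script the experiment than click through REW or Audacity, here is a Python/NumPy/SciPy sketch of the same idea. The filter type and cutoff are just values I picked to make the effect obvious, not anything REW itself does.

# A NumPy/SciPy sketch of the experiment described above.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, lfilter

fs = 48000                 # sample rate in Hz
n = fs                     # one second of audio

# An impulse: a single full-scale sample, then silence.
impulse = np.zeros(n)
impulse[0] = 1.0

# White noise: random samples between -1 and +1.
noise = np.random.default_rng(0).uniform(-1.0, 1.0, n)

# Magnitude spectrum -- the same thing a spectrum plot in REW or Audacity shows.
# The raw impulse's spectrum is perfectly flat: equal weighting of every
# frequency up to fs/2.
impulse_spectrum = np.abs(np.fft.rfft(impulse))

# "EQ" the impulse with a 2nd-order Butterworth low-pass at 1 kHz.
b, a = butter(2, 1000, btype="low", fs=fs)
filtered_impulse = lfilter(b, a, impulse)
filtered_spectrum = np.abs(np.fft.rfft(filtered_impulse))

# The flat spectrum stays flat (ratio of 1); the EQ'd one does not.
print("raw impulse spectrum max/min:  ", impulse_spectrum.max() / impulse_spectrum.min())
print("EQ'd impulse spectrum max/min: ", filtered_spectrum.max() / filtered_spectrum.min())

# And in the time domain, the EQ'd impulse rings on instead of stopping dead.
print("samples above -60 dBFS, raw:  ", int(np.sum(np.abs(impulse) > 1e-3)))
print("samples above -60 dBFS, EQ'd: ", int(np.sum(np.abs(filtered_impulse) > 1e-3)))

# Write everything out as WAV files you can open in Audacity or load into REW.
wavfile.write("impulse.wav", fs, impulse.astype(np.float32))
wavfile.write("noise.wav", fs, noise.astype(np.float32))
wavfile.write("filtered_impulse.wav", fs, filtered_impulse.astype(np.float32))

Open the three WAV files it writes in Audacity (or import them into REW) and plot each one’s spectrum: the raw impulse and the noise come out essentially flat, the EQ’d impulse does not, and zooming in on the waveform shows the EQ’d impulse ringing on where the raw one stops dead.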
I’m not trying to put this out as some perfect method (we’re working with my limited mastery of both bits of software, for starters), but it gives you a starting point and a way to explore for yourself how these two domains really can be interchanged.