In this day and age of dynamic web content, how relevant can a caching proxy server be? I believe that the answer could be: quite!
I have installed a Squid-based caching proxy server, which is now used within my company. It also performs content scanning using SquidClamav and ClamAV. I wrote an article about how to set up such a content-scanning proxy.
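For readers curious what wiring Squid to a virus scanner roughly looks like: Squid can hand traffic to SquidClamav through an ICAP service. The snippet below is a minimal sketch, assuming SquidClamav runs under c-icap on the proxy host at the default port 1344; service names like `clamav_req` are just illustrative labels.

```
# squid.conf sketch: pass requests and responses to squidclamav via ICAP
icap_enable on
icap_service clamav_req reqmod_precache icap://127.0.0.1:1344/squidclamav
adaptation_access clamav_req allow all
icap_service clamav_resp respmod_precache icap://127.0.0.1:1344/squidclamav
adaptation_access clamav_resp allow all
```

The `respmod_precache` service is the one that actually scans downloaded content before it reaches the client (and before it lands in the cache).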
The thing is, I didn't care much about Squid's actual caching functionality; I found the content-scanning part more interesting. But I'm quite pleased with the actual caching hit ratio.
It seems that we have a hit ratio between 20% and 25%, which is more than I expected. Since most content is dynamic in nature, I expected little of it to be cacheable, but apparently a fair amount of data can still be served from cache. This should also improve the end-user browsing experience, as latency for cached content is reduced.
Of course, this is just a sample for the last hour. However, multiple measurements at different moments yield similar results.
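If you want to reproduce this kind of measurement yourself, one simple approach is to count HIT result codes in Squid's access.log. Below is a small sketch, assuming the default "squid" log format, where the fourth field holds the result code (e.g. `TCP_HIT/200`, `TCP_MISS/200`); the `hit_ratio` helper and the sample lines are my own illustration, not part of Squid.

```python
def hit_ratio(lines):
    """Estimate the cache hit ratio from Squid access.log lines
    in the default log format (fourth field = result/status)."""
    total = hits = 0
    for line in lines:
        fields = line.split()
        if len(fields) < 4:
            continue  # skip malformed lines
        code = fields[3].split("/")[0]  # e.g. "TCP_MEM_HIT"
        total += 1
        if "HIT" in code:
            hits += 1
    return hits / total if total else 0.0

# Two fabricated sample lines: one miss, one memory hit.
sample = [
    "1700000000.123 45 10.0.0.5 TCP_MISS/200 1024 GET http://example.com/ - HIER_DIRECT/93.184.216.34 text/html",
    "1700000001.456 2 10.0.0.5 TCP_MEM_HIT/200 512 GET http://example.com/logo.png - HIER_NONE/- image/png",
]
print(hit_ratio(sample))  # 0.5
```

In practice you would feed it the real log, e.g. `hit_ratio(open("/var/log/squid/access.log"))`, or restrict it to the last hour by filtering on the timestamp field first.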
I think this result shows that a caching proxy server is still relevant, especially if you don't have a fast internet connection. Even if you do, caching can still improve the overall browsing experience.
There is a caveat: the proxy server itself also introduces latency, and I haven't done a side-by-side comparison measuring actual browsing responsiveness with and without the proxy.