I imagine most people who get asked that question would answer “Of course!”, but I'd push back a little.
I’ve seen countless videos and articles talking about adding caching to your ASP.NET website or API in order to “speed it up”.
Speeding up your responses is a good thing, but if you're using caching as a band-aid for poor performance, you'll soon run into another issue: scalability.
At some point, you need to expire or invalidate the cache because new data has arrived. If your underlying code is slow, then everyone who makes a request that DOESN'T hit the cache is going to have a poor experience.
A typical response is “Well, I'm not fussed if 1 in 100 of my customers has a poor experience”. Well, I can guarantee you that the user doesn't feel the same way!
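To make that “1 in 100” point concrete, here's a rough back-of-envelope sketch (all numbers are hypothetical, not from any real system): even with a 99% cache hit rate, the *average* latency looks healthy, while the unlucky 1% of requests still pay the full cost of the slow code path.

```python
def average_latency_ms(hit_rate: float, hit_ms: float, miss_ms: float) -> float:
    """Expected latency across all requests for a given cache hit rate."""
    return hit_rate * hit_ms + (1 - hit_rate) * miss_ms

# Assumed numbers for illustration: 5 ms for a cached response,
# 800 ms for the slow, uncached code path.
hit_ms, miss_ms = 5.0, 800.0
avg = average_latency_ms(0.99, hit_ms, miss_ms)

print(f"average latency: {avg:.2f} ms")        # the average looks fine...
print(f"cache-miss latency: {miss_ms:.0f} ms") # ...but 1 in 100 users waits 800 ms
```

The average hides the pain: your dashboards can show a low mean latency while every hundredth user sits through the full slow request.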
Another problem with slow underlying code is that you often need more resources to run the site, which leads to additional costs. And caching will only take you so far.
I'm not against caching. In fact, I use it all the time across several different systems, but I “try” to use it for scalability, NOT for performance.
You should always try to optimise your code to be as fast as possible on every request. So even with no caching, if your site got 10 users, they would all have a nice speedy experience. Once you have this, THEN you add in the caching.
The benefit of building your system this way is that if you suddenly go from 10 to 1000 users, the site will scale well, and all of your users will get a great experience. You get the added benefit of your servers not blowing up (or scaling out to an excessive number of instances) trying to handle those slow, non-cached requests.
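The server-cost side of this can be sketched with some back-of-envelope capacity maths. This is a deliberately simplified model with made-up numbers (16 workers per instance, one request per user per second, and the two request times are assumptions), but it shows how per-request latency drives how many instances you need when traffic jumps:

```python
import math

def instances_needed(users: int, req_per_user_per_sec: float,
                     request_ms: float, workers_per_instance: int) -> int:
    """Instances required if each worker serves one request at a time."""
    load_rps = users * req_per_user_per_sec
    capacity_rps = workers_per_instance * (1000.0 / request_ms)
    return math.ceil(load_rps / capacity_rps)

# 1000 users, each making one request per second (assumed):
slow = instances_needed(1000, 1.0, 500.0, 16)  # slow code path: 500 ms/request
fast = instances_needed(1000, 1.0, 50.0, 16)   # optimised path: 50 ms/request

print(f"slow code: {slow} instances, fast code: {fast} instances")
```

Under these assumptions the slow code needs 32 instances where the optimised code needs 4. That 10x gap is exactly what you pay for every request that misses the cache, which is why "optimise first, then cache" saves real money, not just milliseconds.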