I can't say I was too excited when I was asked to dive back into some web service code I had written for a recent project to implement caching in a few methods.
Caching. That just sounds like one of those things that's going to be a pain in the ass to implement, doesn't it? There was a major risk that I was actually going to have to put some
thought into this lumped-together collection of dumb/stateless service methods and make it do something.
What's more, my colleague suggested that I take a look at the Caching application block in Enterprise Library.
Wonderful, I thought,
now it's expected to be "enterprise-grade" and make use of this big honking library too.
I must have been feeling lazy or distracted by some other task at the time, because usually this kind of opportunity excites me - new tools I've never played with before! Fortunately I got over the starter's hump and dove into it, and I'm glad I did. Enterprise Library is something I hadn't played with before, and I still haven't touched the majority of it, but I wanted to share my experience working with the Caching block to show you how easy it is to use. I'm going to implement in-memory caching in a simple little WCF service method:
public int SlowAdd(int value1, int value2)
{
    Thread.Sleep(5000);
    return value1 + value2;
}
Yeah, it's contrived, but it simulates the service's reliance on some slow resource, like another service. Is it a good example of a method that should be cached? Sort of. It's definitely slow, and there are only two parameters, but because we don't have any contextual information about the kind of usage this method sees, we don't know if those parameters are constrained at all. Are callers of this service method really only ever interested in values from 1 to 50? Or does this service get nailed with int values from all over the board? Caching will work fine in either case, but remember that if the number of potential argument combinations you care about is very high, the cache may grow very large, and callers may see little benefit if repeated calls with the same parameters are rare.
OK, back to EntLib. First things first... what
is it?
Wikipedia defines it pretty well: "a set of tools and programming libraries for the Microsoft .NET Framework. It provides an API to facilitate best practices in core areas of programming including data access, security, logging, exception handling and others." Really, that's it. I like to think of it as simple, configurable helper code that wasn't low-level enough to include in the framework itself. It's broken up into "application blocks," which is really just an enterprisey way of saying it provides multiple pieces of functionality. A few examples of blocks are cryptography, data access, exception handling, logging, and of course caching - all stuff that most developers could probably write themselves, but with EntLib it's standardized and already done for you. If one person on your team knows how to use a few of the blocks, he can easily share his knowledge with others, and since those blocks are simple and configurable enough to use everywhere, everyone he teaches becomes a more effective developer with a bigger toolkit of standardized, recognized tools.
To actually use it, you need to grab a copy from
here. I had no problems downloading and installing it with all the default options (make sure you check "build application blocks" at the end of the first installer - it's a multi-stage extract/build-from-source/install process).
Now let's dig in! You need to add a reference from your service project to Microsoft.Practices.EnterpriseLibrary.Caching.dll. If you completed the installation properly and built all the application blocks, this should be in the Bin folder of the Enterprise Library installation directory (Program Files\Microsoft Enterprise Library 4.1 - October 2008 or similar, depending on your version).
Before adding any code, you need to configure your service application to use the Caching application block. You can do this by manually editing your web.config, but it's much easier to start with the EntLib configuration tool: right-click your Web.config in Solution Explorer and choose Edit Enterprise Library Configuration:
What you'll get is a nice graphical view of your web.config, with an interface designed to make it easy to drop in EntLib-specific configuration:
Right-click the Web.config entry (the one right under the root) and click New > Caching Application Block. At this point, you've configured everything you need to configure to use simple, in-memory caching. You may want to look at some of the options that pop up in the Properties pane when you click your new Cache Manager entry. If you want to use a different cache provider, like a database, this would be the place to specify that, but I'm not going to go into that here. Now is a good chance to look at your web.config in an XML editor if you want to see the changes that were made: EntLib dropped in a couple new configSection definitions and a new section to describe the caching provider.
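For reference, the additions the tool makes look roughly like this. This is a hand-trimmed sketch from memory of EntLib 4.x, not a copy of a generated file - the exact type strings, version attributes, and default values will match whatever your installed version emits, so treat your own generated web.config as the authority:

```xml
<configuration>
  <configSections>
    <!-- Registered by the EntLib configuration tool -->
    <section name="cachingConfiguration"
             type="Microsoft.Practices.EnterpriseLibrary.Caching.Configuration.CacheManagerSettings, Microsoft.Practices.EnterpriseLibrary.Caching" />
  </configSections>

  <!-- The in-memory cache manager and its (null) backing store -->
  <cachingConfiguration defaultCacheManager="Cache Manager">
    <cacheManagers>
      <add name="Cache Manager"
           expirationPollFrequencyInSeconds="60"
           maximumElementsInCacheBeforeScavenging="1000"
           numberToRemoveWhenScavenging="10"
           backingStoreName="Null Storage" />
    </cacheManagers>
    <backingStores>
      <add name="Null Storage"
           type="Microsoft.Practices.EnterpriseLibrary.Caching.BackingStoreImplementations.NullBackingStore, Microsoft.Practices.EnterpriseLibrary.Caching" />
    </backingStores>
  </cachingConfiguration>
</configuration>
```

The "Null Storage" backing store is what makes this a pure in-memory cache; swapping in a database-backed store is a matter of changing this section.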
Now, onto the code. Here's the new method that takes advantage of caching:
public int SlowAdd(int value1, int value2)
{
    string cacheKey = "SlowAdd-" + value1 + "|" + value2;
    ICacheManager cacheManager = CacheFactory.GetCacheManager();
    if (cacheManager.Contains(cacheKey))
    {
        return (int)cacheManager.GetData(cacheKey);
    }

    Thread.Sleep(5000);
    int result = value1 + value2;
    cacheManager.Add(cacheKey, result);
    return result;
}
That's it! Before I step through this line by line to examine exactly how it works, here are a few things to know:
- This is by no means an example of architectural excellence, just an example of how easy it is to use the EntLib caching API. You may want to abstract the caching away into some other method, or perhaps use inheritance to provide versions of the method that work with caching disabled.
- Our "cache key" is a dumb string based on our parameters. Basing your cache on a simple combination of all the parameters to your method isn't always the right thing to do.
- Our code does not take into account that a call to SlowAdd(x, y) returns the same value as SlowAdd(y, x). This is easy to fix, but I'm lazy.
- The cache object that holds the values lives independently of the service instance that calls it. This means that the same cache object is used by all instances of the service. This is exactly what you want - otherwise, the cache wouldn't be all that useful.
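For what it's worth, the symmetry issue from the second-to-last point really is a one-liner to fix: normalize the argument order before building the key. A quick sketch (plain C#, nothing EntLib-specific):

```csharp
// Order the arguments so SlowAdd(3, 7) and SlowAdd(7, 3)
// produce the same key and share a cache entry.
string cacheKey = value1 <= value2
    ? "SlowAdd-" + value1 + "|" + value2
    : "SlowAdd-" + value2 + "|" + value1;
```

This only works because addition is commutative; for a method where argument order matters, you'd want distinct keys.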
Alright, let's take a slightly closer look at what's going on. If you couldn't tell, the cache is essentially a Dictionary hidden behind an API that isn't much more complicated than the Dictionary API itself. So why not just whip up your own dictionary and use that? Like I said before, you could, but with EntLib you don't have to worry about concurrent modifications and accesses, configurability, limiting the cache's size, implementing a sane entry-deletion policy, or reuse (you did think about all of those things when you considered rolling your own, right?).
Let's start from the top: cacheKey is our key into the cache for this particular set of parameters. CacheFactory.GetCacheManager() returns an ICacheManager, which is the object we use to manipulate our cache. Our application only has one cache object - the default one, which is what a call to GetCacheManager() without parameters returns. You can pass it a string if your code uses multiple cache objects. The rest is pretty self-explanatory: if we find our precalculated result already in the cache, we return it. If not, we take the long road, calculating the result ourselves, and then we save it into the cache before returning it. The next call to this method with the same parameter set will return in a fraction of the time.
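If you do go the multiple-cache route, each cache manager you define in configuration is fetched by its name. A short sketch - the "LongLived" manager here is a made-up name that would have to exist in your web.config alongside the default one:

```csharp
// Parameterless call: the default cache manager from configuration.
ICacheManager defaultCache = CacheFactory.GetCacheManager();

// String overload: look up a second, separately configured manager by name.
ICacheManager longLivedCache = CacheFactory.GetCacheManager("LongLived");
```

Separate managers are handy when different kinds of data need different scavenging limits or backing stores.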
One common point of configuration I didn't show here is how to control how long something lives in the cache. This is done with an overload of the Add method:
cacheManager.Add(cacheKey, result,
    CacheItemPriority.Normal, null, new SlidingTime(TimeSpan.FromHours(5)));
The first couple parameters haven't changed. The third is a priority that determines how "important" the cached result is, which in this context means how likely the cache "scavenger" process is to consider it for deletion when the cache is full and needs to make room for something new. The fourth is an instance of an ICacheItemRefreshAction, which I won't discuss in detail here, but is essentially a way to control what happens when your item gets deleted - you can implement refresh logic if you like, so your cache keeps itself up to date outside of the control of the service code. The last parameter is actually a params array of ICacheItemExpirations, five of which are available out of the box with EntLib: AbsoluteTime, ExtendedFormatTime, FileDependency, NeverExpired, and SlidingTime. Here, I've used SlidingTime, which considers an item expired once a preset amount of time has passed since the item's last access or modification.
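To make the contrast concrete, here's how the two time-based expirations compare. The keys and payload variables below are made up for illustration; the EntLib types and the Add overload are the same ones used above:

```csharp
// AbsoluteTime: the item expires at a fixed moment,
// no matter how often it's read in the meantime.
cacheManager.Add("dailyReport", reportData, CacheItemPriority.Normal, null,
    new AbsoluteTime(DateTime.Now.AddHours(1)));

// SlidingTime: the countdown resets on every access or update,
// so a frequently-read item can live indefinitely.
cacheManager.Add("sessionInfo", sessionData, CacheItemPriority.High, null,
    new SlidingTime(TimeSpan.FromMinutes(10)));
```

Since the last parameter is a params array, you can also pass several expirations at once, and the item expires as soon as any one of them says it should.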
This is by no means an exhaustive manual for the Caching application block, but hopefully it will help you get started if you are considering using it, or any part of EntLib, in your project. It's actually pretty easy to set up and get going on, and I encourage you to give it a try if you haven't already.