Cache Service

Overview

Dandago.CacheService is a socket-based wrapper around MemoryCache that allows a cache to be shared by different processes. Custom cache implementations may also be plugged in, and a client library is provided. This project is still in its infancy and should not be considered stable, much less suitable for production environments.

It is somewhat inspired by Redis, but is very different in principle. While Redis is single-threaded, Dandago.CacheService uses asynchronous sockets to handle requests concurrently. The underlying protocol allows either side to send messages spontaneously (full duplex), as opposed to a simple request/response exchange. Requests may therefore be interleaved, and pipelining is a natural consequence of the design.

Dandago.CacheService is also not designed to handle the complex data structures (such as sets, lists, etc) that Redis supports.

On the other hand, Windows is not an officially supported platform for Redis. Once it reaches maturity, Dandago.CacheService may be an alternative for simple shared caches in such environments.

Supported Features

  • String-only storage
  • Key expiry
  • Thread-safety
  • Basic operations:
    • Set (stores a string value for a key, optionally with an expiry)
    • Get (retrieves the value stored for a key)
    • Delete (removes a key from the cache)
    • DeleteGet (removes a key from the cache, returning its value in the process)

Not Yet There

Fault tolerance in connections has yet to be implemented, but is planned. Right now, a client disconnect will cause an exception in the server.

Where to Use

Dandago.CacheService is a simple cache implementation that may be used by different processes. Here are a few scenarios to consider:

  • You need a simple in-memory cache used by a single process: use MemoryCache (a minimal example follows below).
  • You need a simple in-memory cache shared by multiple processes: use Dandago.CacheService.
  • You need a complex in-memory cache shared by multiple processes, and don’t need to run it on Windows: use Redis.
  • You need a cache capable of notifying subscribers when changes are made to keys: use SignalR.
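For the single-process case above, the MemoryCache that ships with the .NET Framework (in System.Runtime.Caching) is enough on its own; a minimal example, for comparison:

            // Single-process, in-memory caching with the built-in MemoryCache
            // (System.Runtime.Caching); no server or client involved.
            var cache = MemoryCache.Default;
            cache.Set("Chuck", "28", DateTimeOffset.Now.AddMinutes(5)); // expires in 5 minutes
            var value = (string)cache.Get("Chuck"); // "28"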

Usage: Server

First, install the Dandago.CacheService NuGet package:

Install-Package Dandago.CacheService -IncludePrerelease

Then, just create your cache and server instances, and start the server:

            var memCache = new DefaultMemoryCache();
            var cacheServer = new CacheServer(memCache, 2674);
            cacheServer.Start();

            Console.ReadLine();
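As mentioned in the overview, a custom cache implementation may be used in place of DefaultMemoryCache. The exact abstraction isn't shown here, so the ICache interface and its members below are hypothetical; the sketch only illustrates the idea of wrapping the default cache with extra behaviour (logging, in this case):

            // Hypothetical sketch: the real interface name and member signatures may differ.
            public class LoggingCache : ICache
            {
                private readonly DefaultMemoryCache _inner = new DefaultMemoryCache();

                public void Set(string key, string value, DateTimeOffset? expiry)
                {
                    Console.WriteLine($"SET {key}");
                    _inner.Set(key, value, expiry);
                }

                public string Get(string key)
                {
                    Console.WriteLine($"GET {key}");
                    return _inner.Get(key);
                }

                public void Delete(string key)
                {
                    Console.WriteLine($"DEL {key}");
                    _inner.Delete(key);
                }
            }

            // The custom cache is then passed to the server in place of DefaultMemoryCache:
            var cacheServer = new CacheServer(new LoggingCache(), 2674);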

Usage: Client

First, install the Dandago.CacheService.Client NuGet package:

Install-Package Dandago.CacheService.Client -IncludePrerelease

Then, create your client and connect to the server:

            Task.Run(async () =>
            {
                var client = new CacheClient("localhost", 2674);
                await client.ConnectAsync();

                // TODO code goes here
            });

            Console.ReadLine();

Once that is done, you can use the methods provided to interact with the cache asynchronously.

For example, simple set and get:

                var setResponse = await client.SetAsync("Chuck", "28");
                var getResponse = await client.GetAsync("Chuck"); // 28

Deletion:

                var delResponse = await client.DeleteAsync("Chuck");
                getResponse = await client.GetAsync("Chuck"); // Not Found

Or instead, deletion with retrieval:

                var setLarryResponse = await client.SetAsync("Larry", "34");
                var delGetResponse = await client.DeleteGetAsync("Larry"); // 34
                getResponse = await client.GetAsync("Larry"); // Not Found

You can also set a key to expire:

                var expiry = DateTimeOffset.Now.AddSeconds(2.0);
                var setTomResponse = await client.SetAsync("Tom", "26", expiry);
                getResponse = await client.GetAsync("Tom"); // 26
                await Task.Delay(3000); // wait for the key to expire
                getResponse = await client.GetAsync("Tom"); // Not Found
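Since the protocol allows requests to be interleaved (see the overview), you don't necessarily have to await each call before issuing the next one. Below is a rough sketch of pipelining a few operations; whether a single CacheClient instance is safe to use from multiple concurrent callers isn't documented here, so treat this as an illustration of the idea rather than a guarantee:

                // Start several requests without awaiting each one individually;
                // the asynchronous socket design lets them be in flight together.
                var pending = new[]
                {
                    client.SetAsync("Chuck", "28"),
                    client.SetAsync("Larry", "34"),
                    client.SetAsync("Tom", "26")
                };
                await Task.WhenAll(pending);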

Source Code

The source code is available on BitBucket.

"You don't learn to walk by following rules. You learn by doing, and by falling over." — Richard Branson