
Using Task.run to have non-blocking locks in c#

user3875 · Published on April 26, 2018, 1:45 pm

Imagine I have some object which I know will be used by at least two threads. Say, a shared cache of some kind.

One thread will be a worker thread, and will continuously run a loop which makes use of the information in the cache.

Other "client" threads can submit new information to the cache. The worker thread should complete its current loop and pick up new information on the next iteration.

It doesn't really matter if the program ends before all client information is in the cache, since it's just an application-side cache. We will have persisted the information somewhere else, like a database.

Under these circumstances, it seems to me that a "simple" solution is to share state and use locks. But I don't want the "client" threads to have to wait for the worker thread to release its lock at the end of each iteration before they can continue; that seems unnecessary. I might describe the client's add as a "lock-bound" (as opposed to CPU- or IO-bound) operation, which could be completed asynchronously.
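(For context, one lock-free way to get that behaviour would be to have clients hand items to a ConcurrentQueue and let the worker drain it at the start of each iteration, so a client never waits on the worker's lock at all. This is just a sketch of the idea with made-up names, not the code I'm asking about:)

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Sketch: clients never take a lock; the worker drains pending items
// into its private cache at the start of each iteration.
public class QueueBackedCache {
    private readonly ConcurrentQueue<char> pending = new ConcurrentQueue<char>();
    private readonly List<char> cache = new List<char>();

    // Not synchronized; intended to be read only from the worker thread.
    public int Count => cache.Count;

    // Called by client threads; ConcurrentQueue.Enqueue never blocks
    // waiting for the worker.
    public void Add(char c) => pending.Enqueue(c);

    // Called only from the worker thread.
    public void Use() {
        while (pending.TryDequeue(out char c)) cache.Add(c);
        // ... do something with cache ...
    }
}
```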

Is there anything fundamentally wrong with creating a SharedCache class which provides an Add() method, where inside that method we issue a Task.Run, where the code being run takes a lock in order to add to the underlying cache object? From what I understand, Task.Run is really a way to run operations in parallel as opposed to asynchronously, but I'm not sure what a "pure async" solution would look like here.

On the one hand this approach seems simple and easy, but on the other hand I have read Stephen Cleary's articles on async, including the article titled "Don't use Task.Run in the implementation", which is what this solution proposes to do.

Here's an example console application using this pattern. I have provided two different Add methods - one which blocks, and another which does not.

When running this code I noticed that the locks aren't always granted in the order that they were requested, but that's not an issue here.

using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace asynclockwait {
    public class SharedCache {
        private readonly List<char> cache;
        private readonly object locker;

        public SharedCache() {
            locker = new object();
            cache = new List<char>();
        }

        public void AddWithoutBlocking(char c) {
            Task.Run(() => { lock (locker) { cache.Add(c); } });
        }

        public void AddWithBlocking(char c) {
            lock (locker) { cache.Add(c); }
        }

        public void Use() {
            lock (locker) { int i = cache.Count; Thread.Sleep(200); } // do something which takes a little while
        }

        public void Dump() {
            foreach (char c in cache) { Console.Write(c); }
        }
    }

    class Program {
        private static volatile bool stopping; // volatile so the worker thread sees the update
        private static SharedCache cache = new SharedCache();

        public static void loop() {
            while (!stopping) {
                cache.Use();
                Thread.Sleep(50); // create a window for people waiting on the lock to make use of it before we ask for it back
            }
        }

        static void Main(string[] args) {
            Thread t1 = new Thread(loop);
            t1.Start();

            while (!stopping) {
                ConsoleKeyInfo k = Console.ReadKey();
                if (k.Key == ConsoleKey.X) {
                    stopping = true;
                } else {
                    cache.AddWithoutBlocking(k.KeyChar); // any other key is "client information"
                }
            }

            t1.Join();
            cache.Dump();
        }
    }
}
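For reference, here is what I imagine a more "pure async" version of this might look like, using SemaphoreSlim.WaitAsync as an async-compatible mutex instead of lock plus Task.Run, so that an await (rather than a borrowed thread-pool thread) absorbs the wait. This is only a sketch of the idea, not something I've used in the real application:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public class AsyncSharedCache {
    private readonly List<char> cache = new List<char>();
    // SemaphoreSlim(1, 1) acts as an async-compatible mutex:
    // WaitAsync queues the caller without blocking a thread.
    private readonly SemaphoreSlim gate = new SemaphoreSlim(1, 1);

    // Not synchronized; only meaningful once pending adds have completed.
    public int Count => cache.Count;

    // Clients can await this (or fire-and-forget the returned Task)
    // without blocking while the worker holds the gate.
    public async Task AddAsync(char c) {
        await gate.WaitAsync();
        try { cache.Add(c); }
        finally { gate.Release(); }
    }

    // Worker side uses the same gate for its long iteration.
    public async Task UseAsync() {
        await gate.WaitAsync();
        try { int i = cache.Count; await Task.Delay(200); }
        finally { gate.Release(); }
    }
}
```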