4

I need a linked list that can add items at both ends. The list will hold data to be shown in a trend viewer. As there is a huge amount of data, I need to show it before it is completely read, so what I want to do is read a block of data that I know has already been written and, while I read that block, have two threads filling both ends of the collection:

[diagram]

I thought of using LinkedList, but the documentation says it does not support this scenario. Any ideas on something in the Framework that can help me, or will I have to develop my own list from scratch?

Thanks in advance.

EDIT: The main idea of the solution is to do it without locking anything, because I'm reading a piece of the list that is not going to be changed while the writing happens elsewhere. I mean, the reader thread will only read one chunk (from A to B), a section that has already been written. When it finishes and other chunks have been completely written, the reader will read those chunks while the writers write new data.

See the updated diagram:

[updated diagram]

Ignacio Soler Garcia
  • How can you know that you are only writing to the list where you are not reading? Suppose your reading thread reads faster than your writer threads write entries to the list. It's just a matter of time before your threads access the same list item... Somebody has to lock the concurrent access - it's either you or the developer who developed the thread-safe data structure you use – Jan Sep 21 '11 at 09:45
  • 1
    LinkedList.Count is O(1) so it must be a stored value and concurrent writes may corrupt this value. – Guillaume Sep 21 '11 at 09:51
  • 1
    "without locking anything" - there begins the disaster... – H H Sep 21 '11 at 09:53
  • @Henk: can you elaborate on that, please? I cannot see why this cannot work. Well, the only locks I can see are on the structures that will be shared between threads to allow the writers to notify the reader that a chunk has been completely written. – Ignacio Soler Garcia Sep 21 '11 at 10:00
  • "lock-free threading" always spells disaster. Maybe you meant something else. – H H Sep 21 '11 at 10:05
  • Well, what I mean is that one thread can read from the collection while another thread writes to it, because they work on different parts of the collection. – Ignacio Soler Garcia Sep 21 '11 at 10:06
  • Your choice of data structure and "time sorted collection" don't rhyme very well. I'm not convinced a LinkedList is needed here; it certainly complicates things. – H H Sep 21 '11 at 10:07
  • How about a ConcurrentDictionary with a Timestamp as the key, and hold back on optimization until you really have a problem in that department? – H H Sep 21 '11 at 10:08
  • Well, as the data is linear in time, that's why I thought of a list that grows at both ends. For example, if I work with days, the reader will read one day while the writers append new days at both ends. That way the trend control will be able to show data as it becomes available. As the data inside a time chunk will not be regular, I'm not sure how to query a Dictionary to get the data to be painted ... – Ignacio Soler Garcia Sep 21 '11 at 10:16
  • I'm not sure how you're going to query (a segment of) a LinkedList... – H H Sep 21 '11 at 10:19
  • Using code like the one posted by @Stephen Martin – Ignacio Soler Garcia Sep 21 '11 at 10:24

5 Answers

2

If you are on .NET 4 you can use two ConcurrentQueue<T> instances: one for the left side and one for the right side.
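A minimal sketch of that idea, assuming .NET 4's System.Collections.Concurrent is available (the class name and the double payload are just illustrative):

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    // Illustrative sketch, not a framework type.
    class TwoQueueSketch
    {
        static void Main()
        {
            // One lock-free queue per growth direction.
            var left = new ConcurrentQueue<double>();   // data older than the starting block
            var right = new ConcurrentQueue<double>();  // data newer than the starting block

            // Each writer thread appends only to its own queue.
            var writeOlder = Task.Factory.StartNew(() => { for (int i = 1; i <= 100; i++) left.Enqueue(-i); });
            var writeNewer = Task.Factory.StartNew(() => { for (int i = 1; i <= 100; i++) right.Enqueue(i); });
            Task.WaitAll(writeOlder, writeNewer);

            // The reader drains a queue with TryDequeue; this also works while
            // the writers are still running, which is the point of the type.
            double value;
            while (right.TryDequeue(out value))
            {
                Console.WriteLine(value);               // hand the sample to the trend viewer here
            }
        }
    }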

Albin Sunnanbo
  • Mmmm, there is no real left side or right side; it's just a time-sorted collection of data. I plan to start somewhere in the middle and fill both sides while displaying what has already been read ... – Ignacio Soler Garcia Sep 21 '11 at 09:28
1

You could use the linked list and just use normal .NET threading constructs like the lock keyword in order to protect access to the list.

Any custom list you developed would probably do something like that anyway.
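For example, a thin locking wrapper around LinkedList<T> could look roughly like this (a sketch; the type and member names are made up):

    using System.Collections.Generic;

    // Illustrative wrapper, not a framework type.
    class LockedTrendList
    {
        private readonly LinkedList<double> _list = new LinkedList<double>();
        private readonly object _sync = new object();

        // Writer threads add at either end; each call holds the lock only briefly.
        public void AddFirst(double value) { lock (_sync) { _list.AddFirst(value); } }
        public void AddLast(double value)  { lock (_sync) { _list.AddLast(value); } }

        // The reader copies the current contents under the lock and then works
        // on the snapshot without blocking the writers any further.
        public double[] Snapshot()
        {
            lock (_sync)
            {
                var copy = new double[_list.Count];
                _list.CopyTo(copy, 0);
                return copy;
            }
        }
    }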

Joon
  • Well, the thing is that I don't want to lock anything: I want to read and write at the same time, because I'm reading and writing different items of the collection. – Ignacio Soler Garcia Sep 21 '11 at 09:31
1

I would recommend considering another approach with a single data structure to persist incoming data; this way you can keep the order of the incoming data messages. For instance, you can use a blocking queue; in this SO post you can find a nice example: Creating a blocking Queue in .NET?.
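For reference, .NET 4 also ships a ready-made blocking queue as BlockingCollection<T>; a rough sketch of one producer and one consumer (names are illustrative):

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    // Illustrative sketch, not a framework type.
    class BlockingQueueSketch
    {
        static void Main()
        {
            // Backed by a ConcurrentQueue by default, so arrival order is preserved.
            var incoming = new BlockingCollection<double>(boundedCapacity: 1000);

            var producer = Task.Factory.StartNew(() =>
            {
                for (int i = 0; i < 5000; i++)
                    incoming.Add(i);           // blocks when the bound is reached
                incoming.CompleteAdding();     // tells the consumer no more data is coming
            });

            // Blocks while the collection is empty and ends once adding is complete.
            foreach (double value in incoming.GetConsumingEnumerable())
            {
                Console.WriteLine(value);      // feed the trend viewer here
            }

            producer.Wait();
        }
    }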

sll
1

Why not use the LinkedList class? The documentation says it's not thread-safe, so you have to synchronize access to the list yourself, but you have to do this with any data structure accessed by multiple threads.

Performance should be quite good; here is what MSDN says about inserting nodes at any position:

LinkedList<T> provides separate nodes of type LinkedListNode<T>, so insertion and removal are O(1) operations.

You just have to lock read and insert operations with the lock construct.

EDIT

OK, I think I understand what you want. You want a list-like data structure which is split into chunks of items. You want to independently write and read chunks of items without locking the whole list.

I suggest using a LinkedList holding your chunks of data items. The chunks themselves can be represented as simple List instances or as LinkedList instances as well.

You have to lock the access to the global LinkedList.

Now your writer threads each fill a private List with n items at a time. When finished, the writer locks the LinkedList and adds its private list of data items to the LinkedList.

The reader thread locks the LinkedList, reads one chunk and releases the lock. Now it can process n data items without locking them.
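A rough sketch of that layout (type and member names are illustrative): only the outer list is ever locked, and a chunk is never modified again once published, so the reader can process its items lock-free.

    using System.Collections.Generic;

    // Illustrative sketch, not a framework type.
    class ChunkedTrendBuffer
    {
        // Outer list of completed chunks; every access to it is locked.
        private readonly LinkedList<List<double>> _chunks = new LinkedList<List<double>>();
        private readonly object _sync = new object();

        // A writer fills a private List<double> and publishes it in one short lock.
        public void PublishFront(List<double> chunk) { lock (_sync) { _chunks.AddFirst(chunk); } }
        public void PublishBack(List<double> chunk)  { lock (_sync) { _chunks.AddLast(chunk); } }

        // The reader holds the lock only while moving its cursor to the next chunk;
        // processing that chunk's items happens outside the lock.
        public LinkedListNode<List<double>> NextChunk(LinkedListNode<List<double>> cursor)
        {
            lock (_sync)
            {
                return cursor == null ? _chunks.First : cursor.Next;
            }
        }
    }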

Jan
  • Please, read my edit. I really don't want to lock anything, as I'm accessing different zones of the collection with the read and write threads. I think that in this scenario locking is not needed. – Ignacio Soler Garcia Sep 21 '11 at 09:42
  • See my comment on your question. Someone has to synchronize the concurrent access to your datastructure. – Jan Sep 21 '11 at 09:47
  • Again, see my new edit. As the data in the collection will be used in chunks, I think I don't really need locking. I will read from F to G while writing from G+1 to H and from F-1 to E. – Ignacio Soler Garcia Sep 21 '11 at 09:54
  • I have edited my answer. I'm still sure it's not possible without *any* locking, but I agree that it's possible to reduce locking – Jan Sep 21 '11 at 10:15
  • Looks like a good idea ... I will consider your approach for sure. – Ignacio Soler Garcia Sep 21 '11 at 10:18
1

If I understand correctly you have a linked list to which you are adding data at the beginning and the end. You are never adding or removing from anywhere else. If this is the case you do not have to worry about threading since the other threads will never interfere.

Simply do something like this:

        //Everything between first and last is thread safe since 
        //the other threads only add before and after.
        LinkedListNode<object> first = myList.First;
        LinkedListNode<object> current = first;
        LinkedListNode<object> last = myList.Last;
        bool done = false;

        if (current == null) return; //empty list
        do
        {
            //do stuff
            if (current == last) done = true;
            current = current.Next;
        } while (!done && current != null);

After you are done with this section, you can do the same with two more sections: from the new myList.First to first, and from last to the new myList.Last.
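For instance, a second pass over the newly grown ends could look roughly like this (a sketch reusing the first and last nodes captured above; keep in mind the thread-safety caveat raised in the comments below):

        //Items prepended since the first pass: from the new First up to,
        //but not including, the node the previous pass started at.
        LinkedListNode<object> node = myList.First;
        while (node != null && node != first)
        {
            //do stuff with the prepended item
            node = node.Next;
        }

        //Items appended since the first pass: from the node after 'last'
        //up to the current Last.
        node = last.Next;
        LinkedListNode<object> newLast = myList.Last;
        while (node != null)
        {
            //do stuff with the appended item
            if (node == newLast) break;
            node = node.Next;
        }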

Stephen Martin
  • That's just what I wanted to do ... but the docs say that this won't work with a LinkedList ... are you sure that this code will run? – Ignacio Soler Garcia Sep 21 '11 at 10:20
  • In general use it would not be safe since you could be inserting or removing at some point within your sequence. But since you are using the list as a double-ended queue, and are inserting only, it is safe. – Stephen Martin Sep 21 '11 at 10:33
  • Stephen, did you see @Guillaume's comment about Count being O(1)? It means this is _not_ thread-safe. – H H Sep 21 '11 at 18:34
  • @Henk. If you're going from one item to another, taking all the ones in between, you do not need to use Count for anything, as in the code above. Am I missing something? – Ignacio Soler Garcia Sep 23 '11 at 09:36
  • @somos, walking the list is OK, but the Count property means you can't safely change the list from another thread. – H H Sep 23 '11 at 10:31