Queue vs List

I have a queue (implemented with an actual Queue) of requests for a pathfinder thread. The problem is that you can't grab something out of the middle of a queue, so if one of the requests is no longer needed before it even starts calculating, I can't tell the thread to forget it. (And I don't want to hold another collection of stuff not to calculate and cross-check it before starting each new calculation; I just want to yank the request out of the queue.)

I thought about just using a List as a queue. I know it's not as good as a Queue in terms of GC and such, but I only have one list per pathing thread, so the difference should be hardly noticeable.

Anyway, my question: is there a way to pull something out of a queue without disturbing the rest of the items in the collection (at least not what's in front of the item in the queue), without using LINQ or a buffer Queue and a selector?

Sticking to the KISS principle, I'm just going to go ahead and use a List unless I see a problem with it later on. I'm asking mostly out of curiosity; most of the answers I've seen either transfer the items to another queue, use LINQ, or both.

Well there are several things to consider. Namely:

  • How many elements does the queue hold on average and in the worst case?
  • How frequently are elements added to / removed from the queue?
  • How frequently do you need to remove elements from the middle?
  • What's your optimisation goal: high performance or zero GC?

Each point has sub-questions depending on the exact use case of your queue. To clear up some general questions, I'll show a few different approaches and the benefits of each.

First of all, a List and a Queue are essentially the same thing under the hood. Both use a plain internal array to store the elements. Once the internal array becomes too small to hold new elements, the collection creates a new array with a larger size (usually twice the previous size) and copies the elements over from the old array. At that point the old array is up for garbage collection. However, the internal array is never automatically reduced (replaced with a smaller array). So once the collection has reached its peak count, you don't get any further memory allocations from the Queue or List. Of course, you can set the capacity of the internal array from the very start so no reallocation is necessary. If the capacity is large enough to hold the peak element count, you are GC-free as far as the List or Queue is concerned.
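For example, both collections accept a capacity in their constructor (a minimal sketch; the capacity value is arbitrary and would be your expected peak count):

```csharp
using System.Collections.Generic;

public static class CapacityDemo
{
    // Pre-sizing the internal array avoids reallocations (and the garbage
    // they produce) as long as the element count never exceeds the capacity.
    public static Queue<int> MakeQueue(int expectedPeak)
    {
        return new Queue<int>(expectedPeak); // internal array allocated once
    }

    public static List<int> MakeList(int expectedPeak)
    {
        return new List<int>(expectedPeak); // List<T>.Capacity set up front
    }
}
```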

The main difference between a List and a Queue is that while the List has a single integer to remember how many elements are actually stored in the array (the internal count), a Queue has a count as well as a start index. The Queue uses the internal array as a ring buffer, so the start index is the position in the array of the "oldest" element. When you dequeue an element, you get the element from this position; the Queue simply sets that slot to null / default, increases the start index by one and reduces the internal count by one. When you enqueue an element, it is simply added at the "end". While the end of a List is simply the internal count, a Queue wraps its index around once it reaches the capacity limit of the array. That's essentially how a Queue works.
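The wrap-around can be sketched like this. This is a simplified model of the index arithmetic, not the actual Queue&lt;T&gt; source (the real one grows the array when full instead of throwing):

```csharp
public class MiniRingQueue<T>
{
    private readonly T[] items;
    private int start; // index of the oldest element
    private int count;

    public MiniRingQueue(int capacity) { items = new T[capacity]; }

    public int Count => count;

    public void Enqueue(T item)
    {
        if (count == items.Length)
            throw new System.InvalidOperationException("full (the real Queue<T> would grow here)");
        int end = (start + count) % items.Length; // wrap the write index
        items[end] = item;
        count++;
    }

    public T Dequeue()
    {
        if (count == 0)
            throw new System.InvalidOperationException("empty");
        T item = items[start];
        items[start] = default; // release the reference for the GC
        start = (start + 1) % items.Length; // advance, wrapping around
        count--;
        return item;
    }
}
```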

The most expensive operation on a List is removing the first element. That's because once the element has been removed, all following elements have to be copied one slot down: index 1 is copied to index 0, index 2 to index 1, and so on. Adding or removing an element at the end is the cheapest operation on a List.

If the average element count in your queue is relatively low, say around 20-40, using a List would be fine unless you perform several dequeue operations per frame.

Depending on what your queued elements look like, it may be possible to give your class an "obsolete" bool flag that you can set to essentially "remove" an element from a normal queue. The dequeuing code just has to check whether the job it pulled from the queue is still required. If it's obsolete, throw it away and pull the next one. That way there's no need to remove anything from the middle of the queue.
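A minimal sketch of that pattern (the PathRequest type and its fields are made up for illustration; only the Obsolete flag matters):

```csharp
using System.Collections.Generic;

// Hypothetical job type; the "Obsolete" flag is the only addition needed.
public class PathRequest
{
    public bool Obsolete; // set to true to cancel before it's processed
    public string Name;
    public PathRequest(string name) { Name = name; }
}

public static class JobPump
{
    // Pull the next job that is still wanted; obsolete ones are skipped.
    public static PathRequest DequeueNext(Queue<PathRequest> queue)
    {
        while (queue.Count > 0)
        {
            var job = queue.Dequeue();
            if (!job.Obsolete)
                return job;
            // an obsolete job is simply dropped here
        }
        return null; // nothing left to do
    }
}
```

Cancelling a request is then just `request.Obsolete = true;` from wherever you still hold a reference to it.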

If you really need to remove the elements from the queue for some reason and you have a really high element count (say 5k+), it might be worth using a manually implemented doubly linked list built into your job class itself. This only makes sense if you have direct access to the job instance you want to remove; if you have to search for it, it would be completely pointless.
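Such an intrusive list could look roughly like this (the class names are made up; the point is that the Prev / Next links live inside the job itself, so removal is O(1) when you already hold the instance):

```csharp
// Hypothetical job type with embedded links.
public class PathJob
{
    public PathJob Prev, Next;
    public int Id;
    public PathJob(int id) { Id = id; }
}

public class JobList
{
    public PathJob Head, Tail;

    // Append at the tail, like enqueuing.
    public void Add(PathJob job)
    {
        job.Prev = Tail;
        job.Next = null;
        if (Tail != null) Tail.Next = job; else Head = job;
        Tail = job;
    }

    // O(1) removal from anywhere in the list, no searching required.
    public void Remove(PathJob job)
    {
        if (job.Prev != null) job.Prev.Next = job.Next; else Head = job.Next;
        if (job.Next != null) job.Next.Prev = job.Prev; else Tail = job.Prev;
        job.Prev = job.Next = null;
    }
}
```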

Quite some time ago I wrote this RingBuffer class, which works almost identically to a Queue. However, it has a fixed size and does not automatically expand its internal array. Instead, when you add an element while the buffer is full, the oldest element is overwritten and lost. This buffer does implement random access like a List and also allows removal in between. The performance overhead when removing elements in between is similar to a List, though, so it's probably not worth it.

Since you talked about a thread doing the actual dequeuing, there are other things to consider, since you have to be careful with synchronisation. If you have a single producer and a single consumer, there are solutions that don't require a lock. However, if you use a thread pool or simply multiple consumer threads, you can't avoid proper locking. There are also other patterns in which the main thread manages the job distribution and you use wait handles to signal between the threads: a worker waits for a signal before starting on the provided jobs and signals back when it's done.
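A small sketch of the lock-plus-wait-handle variant (assumed names; a real setup would also need a shutdown path, and for many cases the built-in ConcurrentQueue or BlockingCollection would do the job):

```csharp
using System.Collections.Generic;
using System.Threading;

// Producer enqueues and signals; the worker thread waits on the signal
// before pulling jobs, so it doesn't spin while the queue is empty.
public class LockedJobQueue<T>
{
    private readonly Queue<T> queue = new Queue<T>();
    private readonly AutoResetEvent signal = new AutoResetEvent(false);

    public void Enqueue(T job)
    {
        lock (queue) queue.Enqueue(job);
        signal.Set(); // wake the worker
    }

    // Returns false if no job arrived within the timeout.
    public bool TryDequeue(out T job, int timeoutMs)
    {
        while (true)
        {
            lock (queue)
            {
                if (queue.Count > 0)
                {
                    job = queue.Dequeue();
                    return true;
                }
            }
            if (!signal.WaitOne(timeoutMs))
            {
                job = default;
                return false;
            }
        }
    }
}
```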


Thank you for this RingBuffer class! I’m using it to add delay/trailing objects in my VR experience.