
I have seen some implementations of FIFO queues using slices in Go. As items exit the queue, can their memory be freed without reallocating the underlying array? If not, it would seem to me that the queue would leak a ton of memory. This is what I mean:

type queue struct {
   items []int
   head  int
}

func (q *queue) enqueue(val int) {
   q.items = append(q.items, val)
}

func (q *queue) dequeue() int {
   val := q.items[q.head]
   q.head++
   return val
}

After calling enqueue/dequeue a bunch of times, the low indexes of the array underlying the slice are no longer usable, but I am not sure how they can be freed either. Can someone point me to a proper queue implementation that does not use pointers, does not leak memory like this, and does not have performance issues? Alternatively, a description of how this might work would also be appreciated.

Thank you, Plamen

Big Endian
  • First off, if you want concurrency you'll need mutex locks in those methods, which already begins to smell. I handle queues with channels instead, as channels are basically just stack copies. The GC will clean them up after they are used. – eduncan911 Dec 03 '16 at 17:39
  • I'm not interested in thread safety for this question, just memory management. Using channels to implement a queue is a bad idea from what I've read, as they are closely tied to goroutine scheduling and can hang up the whole program; in the best case they're inefficient. This is the book I read this in: https://www.amazon.com/Programming-Language-Addison-Wesley-Professional-Computing/dp/0134190440 – Big Endian Dec 03 '16 at 18:32
  • The code above is not an implementation; it's simply meant to illustrate my question. – Big Endian Dec 03 '16 at 18:33
  • The implementation's trade-offs are something you have to decide on. This example seems to purposely leak memory, so I'm not sure what it's trying to illustrate when you could just as easily slice off the dequeued value (see the sketch after these comments). The standard library's heap is also often used as a priority queue if you want to use that as an example. – JimB Dec 03 '16 at 20:41
  • If you're interested in memory management, you might take a look at [this answer](http://stackoverflow.com/a/14582628/539810) and its comments. Not sure how much it will actually help you, but that's usually a benefit since you usually don't want to incur an execution time penalty just to add a single value... –  Dec 03 '16 at 23:32
  • You might also see [this answer](http://stackoverflow.com/a/33393554/539810) and its comments by @JimB. To answer your question, you must reallocate the underlying array because it's simply a chunk of memory that Go asked the OS to reserve for your program. You can have Go ask the OS to release that memory, of course, but beware of issues such as multiple slices using the same underlying array, because it will all be released at once since it's a single block of memory, even if the OS stores it as multiple blocks or it's a discontinuous section of RAM at the hardware level. –  Dec 03 '16 at 23:56
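
A minimal sketch of the "slice off the dequeued value" idea from the comments, building on the queue struct in the question (the compaction threshold below is arbitrary, purely for illustration): dequeue copies the live elements into a fresh slice once most of the backing array has been consumed, so the old array can be garbage-collected.

// dequeue with occasional compaction: once most of the backing array
// has been consumed, copy the live elements into a fresh slice so the
// old array becomes unreachable and can be collected.
func (q *queue) dequeue() int {
    val := q.items[q.head]
    q.head++
    if q.head >= 64 && q.head > len(q.items)/2 { // arbitrary threshold
        q.items = append([]int(nil), q.items[q.head:]...)
        q.head = 0
    }
    return val
}

This keeps dequeue cheap on average while still letting the consumed prefix be reclaimed.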

1 Answer


You can use a circular buffer. From Wikipedia:

A circular buffer, circular queue, cyclic buffer or ring buffer is a data structure that uses a single, fixed-size buffer as if it were connected end-to-end. This structure lends itself easily to buffering data streams.

...

Circular buffering makes a good implementation strategy for a queue that has fixed maximum size. Should a maximum size be adopted for a queue, then a circular buffer is a completely ideal implementation; all queue operations are constant time. However, expanding a circular buffer requires shifting memory, which is comparatively costly. For arbitrarily expanding queues, a linked list approach may be preferred instead.

Here's a package that implements this: https://github.com/eapache/queue.
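
For illustration only (this is not that package's API; the type and method names below are made up), here is a minimal sketch of a slice-backed ring buffer that grows when full:

// ringQueue is a FIFO backed by a single slice used circularly; it
// grows by copying into a larger buffer only when it is full.
type ringQueue struct {
    buf   []int
    head  int // index of the oldest element
    count int // number of elements currently stored
}

func newRingQueue(capacity int) *ringQueue {
    if capacity < 1 {
        capacity = 1
    }
    return &ringQueue{buf: make([]int, capacity)}
}

func (q *ringQueue) enqueue(val int) {
    if q.count == len(q.buf) {
        q.grow()
    }
    q.buf[(q.head+q.count)%len(q.buf)] = val
    q.count++
}

func (q *ringQueue) dequeue() (int, bool) {
    if q.count == 0 {
        return 0, false // queue is empty
    }
    val := q.buf[q.head]
    q.head = (q.head + 1) % len(q.buf)
    q.count--
    return val, true
}

// grow doubles the buffer and copies the live elements back in order;
// this is the "shifting memory" cost the quote above mentions.
func (q *ringQueue) grow() {
    newBuf := make([]int, 2*len(q.buf))
    for i := 0; i < q.count; i++ {
        newBuf[i] = q.buf[(q.head+i)%len(q.buf)]
    }
    q.buf = newBuf
    q.head = 0
}

Dequeue just advances head, so consumed slots are reused on the next wraparound instead of leaking.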

Depending on the use case, a channel is also a good way to implement a queue. Receiving from an empty channel blocks, but using a select with a default case you can avoid that behavior:

select {
case msg := <-queue:
    fmt.Println(msg) // got a value without blocking
default:
    // queue was empty; fall through instead of blocking
}
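
For example, a buffered channel can act as a fixed-capacity queue, with select/default giving non-blocking enqueue and dequeue; the capacity and values below are arbitrary:

package main

import "fmt"

func main() {
    queue := make(chan int, 16) // buffered channel used as a bounded queue

    // non-blocking enqueue: falls through to default if the queue is full
    select {
    case queue <- 42:
    default:
        fmt.Println("queue full, value dropped")
    }

    // non-blocking dequeue: falls through to default if the queue is empty
    select {
    case msg := <-queue:
        fmt.Println("dequeued", msg)
    default:
        fmt.Println("queue empty")
    }
}

Note that a channel's capacity is fixed at creation, so this suits bounded queues rather than arbitrarily growing ones.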
Caleb