I know the question is 4 years old, but the accepted answer makes no sense (as Justin Raymond pointed out).
Nick Babcock's approach is imprecise, because the number of elements is too low; there is always some overhead on the heap, which you end up measuring as well.
To show this, I used a bigger data type and more elements (4096). On g++ 6.2.1 and Linux x64, `sizeof(void*)` is 8 and `sizeof(bigData_t)` is 800 (`bigData_t` is `long[100]`).
So what do we expect? Each type of list has to store the actual data on the heap; `std::list` stores two pointers per link (backward and forward), `std::forward_list` just one (forward).
Expected memory for `std::list`: 4096 × 800 + 2 × 8 × 4096 = 3,342,336 bytes
Actual memory for `std::list`: 3,415,040 bytes

Expected memory for `std::forward_list`: 4096 × 800 + 1 × 8 × 4096 = 3,309,568 bytes
Actual memory for `std::forward_list`: 3,382,272 bytes
I used Massif to get the heap usage of the programs.
As we can see, the numbers fit quite well. When using big data types, the memory for the extra pointer doesn't make much difference!
When using `char` as the data type (as the OP does), the expected and actual memory footprints don't match as well, most likely because of per-allocation heap overhead. However, there is no factor of 3 in memory consumption:

`std::list`: expected 69,632 bytes, actual 171,008 bytes
`std::forward_list`: expected 36,864 bytes, actual 138,240 bytes
My code:
#include <list>
#include <forward_list>

struct bigData_t {
    long foo[100];
};

typedef bigData_t myType_t;
// typedef char myType_t;

int main()
{
#ifdef USE_FORWARD_LIST
    std::forward_list<myType_t> linkedList;
#else
    std::list<myType_t> linkedList;
#endif
    for (int i = 0; i < 4096; i++) {
        myType_t bigData {}; // value-initialize to avoid copying indeterminate data
        linkedList.push_front(bigData);
    }
    return 0;
}