I am working on an erase method for a data structure with a hard-coded maximum number of elements, N, that relies on std::array to avoid heap memory. Although the std::array contains N elements, only some number, M, of them are "relevant", where M is less than or equal to N. As an example, if N is 10 and the array looks like this:
std::array<int, N> elements = { 0, 1, 2, -1, 4, -1, 6, -1, -1, 9 };
...and if M is 7, only the first 7 elements are "relevant" while the others are considered junk (the trailing { -1, -1, 9 } are junk). I am using int here for a SO example, but the real program stores objects that implement operator==. Below is a working example that removes all -1 values and updates M:
#include <algorithm>
#include <array>
#include <iostream>
constexpr unsigned N = 10;
unsigned M = 7;
std::array<int, N> elements = { 0, 1, 2, -1, 4, -1, 6, -1, -1, 9 };
int main() {
    // Print the "relevant" elements before erasing.
    for (unsigned i = 0; i < M; ++i)
        std::cout << elements[i] << ' ';
    std::cout << '\n';

    // Move the kept elements to the front of the relevant range;
    // newEnd points one past the last kept element.
    auto newEnd = std::remove_if(
        std::begin(elements), std::begin(elements) + M,
        [](const auto& element) {
            return -1 == element;
        }
    );

    // Shrink M by the number of elements that were removed.
    unsigned numDeleted = M - std::distance(std::begin(elements), newEnd);
    M -= numDeleted;
    std::cout << "Num deleted: " << numDeleted << '\n';

    // Print the "relevant" elements after erasing.
    for (unsigned i = 0; i < M; ++i)
        std::cout << elements[i] << ' ';
    std::cout << '\n';
    return 0;
}
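For reference, running this prints:

0 1 2 -1 4 -1 6
Num deleted: 2
0 1 2 4 6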
The question I have is: what is the asymptotic complexity of the std::remove_if call? I would imagine that, between the std::remove_if and the std::distance, it is overall O(2M), i.e. O(M), with the std::remove_if being the more expensive operation. However, I am not sure whether the std::remove_if is O(N * M) due to shifting elements per deletion.
Edit: For clarity, I understand that this should apply the predicate M times, but I am wondering whether N shifts are being applied each time the predicate is true.
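To illustrate what I mean, here is a rough sketch of the hypothetical worst-case behavior I have in mind (naiveRemove is just a name made up for illustration, using the same N as above; it is not something I expect any standard library to actually do), where every match triggers a full shift of the remaining elements:

// Hypothetical illustration only: remove by shifting on every match,
// which costs O(M) work per removed element, i.e. up to roughly O(M^2)
// (bounded by O(N * M)) when many elements match.
unsigned naiveRemove(std::array<int, N>& arr, unsigned m) {
    for (unsigned i = 0; i < m; ) {
        if (arr[i] == -1) {
            // Shift every later relevant element down by one slot.
            for (unsigned j = i; j + 1 < m; ++j)
                arr[j] = arr[j + 1];
            --m;      // one fewer relevant element
        } else {
            ++i;      // only advance when nothing was removed
        }
    }
    return m;         // the new M
}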