I'm using a BitSet to represent a time series of data. For example, the first bit represents day 1, the second bit represents day 2, etc.
I'm confused by the following code, because it always reports the length as 0:
BitSet a = new BitSet();
for (int i = 0; i < 100; i++) {
    a.set(i, false);
}
System.out.println(a.length());
After some playing around I see it's because the values are all false. I explicitly set each bit to false, so I assumed those bits would still count toward the length, and I need them to count. Is there a way to get the count including both false and true bits?
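For reference, here is a small self-contained sketch of the behavior I'm seeing, with `size()` printed alongside for comparison (the comments about what each method reports are my reading of the `java.util.BitSet` docs):

```java
import java.util.BitSet;

public class BitSetLengthDemo {
    public static void main(String[] args) {
        BitSet a = new BitSet();
        for (int i = 0; i < 100; i++) {
            // set(i, false) is equivalent to clear(i), which never
            // extends the logical length of the BitSet
            a.set(i, false);
        }
        System.out.println(a.length()); // 0: length() is one past the highest TRUE bit
        System.out.println(a.size());   // backing capacity in bits, not a logical length

        a.set(99, true);
        System.out.println(a.length()); // 100: now a true bit exists at index 99
    }
}
```

So `length()` only reflects true bits, and `size()` is just the allocated capacity; neither tracks how many days I've recorded.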