I have a pandas DataFrame of N columns of integer values. The values in the columns are associated with the outcome of a particular random experiment. For example, if I were to call df.head():
    0   1   2   3
0  13   4   0   5
1   8   2  16   6
2   6  20  14   0
3  17   4   8   4
4  17   2  12   0
What I am interested in doing is identifying the number of times each unique value occurs in a particular column. Concerning ourselves with column 0 only, I may wish to know the number of times I have observed the value 17 in this experiment, and in the output above we can see this occurred twice over the first 5 entries in column 0.
What would be the optimal method of doing this, via Pandas itself or otherwise?
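To make the goal concrete, this is the sort of result I am after for column 0 over the five rows shown above. Series.value_counts() is one pandas built-in I am aware of, though I do not know whether it is the optimal choice here:

# Occurrence counts for column 0 (order among ties may vary)
print(df[0].value_counts())
# 17    2
# 13    1
# 8     1
# 6     1
# Name: 0, dtype: int64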
The first approach I considered was to collapse that column down into a dictionary, where each key is an observed data value and the associated value is the count of that key. I used the Counter data structure from Python's collections module.
import pandas as pd
from collections import Counter

# Convert the dataset into a pandas DataFrame
df = pd.read_csv("newdataset.txt",
                 header=None,
                 #skiprows=0,
                 delim_whitespace=True)
print(df.head())

# Tally how many times each value appears in column 0
user0Counter = Counter()
for dataEntry in df[0]:
    user0Counter.update(dataEntry)
This leads to a TypeError:
TypeError Traceback (most recent call last)
<ipython-input-15-d2a83c38d0d0> in <module>
----> 1 import codecs, os;__pyfile = codecs.open('''~/dir/foo/bar.py''', encoding='''utf-8''');__code = __pyfile.read().encode('''utf-8''');__pyfile.close();exec(compile(__code, '''~/dir/foo/bar.py''', 'exec'));
~/dir/foo/bar.py in <module>
28
29 for dataEntry in df[0]:
---> 30 user0Counter.update(dataEntry)
31
32 print(len(user0Counter))
~/anaconda3/lib/python3.7/collections/__init__.py in update(*args, **kwds)
651 super(Counter, self).update(iterable) # fast path when counter is empty
652 else:
--> 653 _count_elements(self, iterable)
654 if kwds:
655 self.update(kwds)
TypeError: 'int' object is not iterable
Evidently Counter.update() expects an iterable, whereas each dataEntry here is a plain int. If I replace the user0Counter.update() call with a print(dataEntry) statement, there is no issue iterating over df[0]:
    0   1   2   3
0  13   4   0   5
1   8   2  16   6
2   6  20  14   0
3  17   4   8   4
4  17   2  12   0
13
8
6
17
17
1
1
4
6
19
3
11
3
4
12
7
1
9
4
2
1
2
5
1
2
13
And so forth.
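Given that the traceback suggests update() wants an iterable rather than a scalar, here is a minimal sketch of two workarounds I have considered, both assuming the column holds plain ints. I am unsure whether either is the optimal route:

# Workaround 1: increment the count for each scalar value directly
user0Counter = Counter()
for dataEntry in df[0]:
    user0Counter[dataEntry] += 1

# Workaround 2: pass the whole column to Counter, since the Series
# itself is iterable even though its individual elements are not
user0Counter = Counter(df[0])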