It seems that dtype only works for pandas.Series, right? Is there a function to display the data types of all columns at once?


6 Answers
The singular form dtype checks the data type of a single column, while the plural form dtypes works on a DataFrame and returns the data types of all columns. Essentially:
For a single column:
dataframe.column.dtype
For all columns:
dataframe.dtypes
Example:
import pandas as pd
df = pd.DataFrame({'A': [1,2,3], 'B': [True, False, False], 'C': ['a', 'b', 'c']})
df.A.dtype
# dtype('int64')
df.B.dtype
# dtype('bool')
df.C.dtype
# dtype('O')
df.dtypes
#A int64
#B bool
#C object
#dtype: object

- If you want all the non-numeric/categorical columns you can get them with `df.dtypes[df.dtypes != 'int64'][df.dtypes != 'float64']` – nishant Aug 31 '18 at 10:11
- Can you please explain why column C has an object type instead of str? – Star Rider Jun 05 '19 at 03:18
- @StarRider see [this answer](https://stackoverflow.com/questions/21018654/strings-in-a-dataframe-but-dtype-is-object) as well as the [`pandas` documentation](https://pandas.pydata.org/pandas-docs/stable/getting_started/basics.html#dtypes) where it mentions: "_Pandas uses the object dtype for storing strings_" – call-in-co Sep 04 '19 at 21:21
- Any idea why I have object as dtype? – WJA Apr 23 '20 at 12:42
- I hadn't realized that equality is a bit fuzzy with dtypes. `df.C.dtype` returns `dtype('O')`, but `df.C.dtype == 'object'` is true. – Teepeemm Nov 10 '20 at 15:16
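As a quick illustration of the filter suggested in the first comment above, here is a minimal sketch on the same example df, with the output shown in comments:
import pandas as pd

df = pd.DataFrame({'A': [1, 2, 3], 'B': [True, False, False], 'C': ['a', 'b', 'c']})

# Keep only the columns whose dtype is neither int64 nor float64
non_numeric = df.dtypes[df.dtypes != 'int64'][df.dtypes != 'float64']
print(non_numeric)
# B      bool
# C    object
# dtype: object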
Suppose df is a pandas DataFrame. To get the number of non-null values and the data types of all columns at once, use:
df.info()
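For instance, on the example df from the accepted answer, df.info() prints something like the following (the exact layout varies between pandas versions; this is a sketch, not verbatim output):
import pandas as pd

df = pd.DataFrame({'A': [1, 2, 3], 'B': [True, False, False], 'C': ['a', 'b', 'c']})
df.info()
# <class 'pandas.core.frame.DataFrame'>
# RangeIndex: 3 entries, 0 to 2
# Data columns (total 3 columns):
#  #   Column  Non-Null Count  Dtype
# ---  ------  --------------  -----
#  0   A       3 non-null      int64
#  1   B       3 non-null      bool
#  2   C       3 non-null      object
# dtypes: bool(1), int64(1), object(1)
# (plus a memory usage line)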

- 421
- 4
- 7
To go one step further, I assume you actually want to do something with these dtypes. df.dtypes.to_dict() comes in handy.
my_type = 'float64'
dtypes = dataframe.dtypes.to_dict()
for col_name, typ in dtypes.items():
    if typ != my_type:  # <--- compare each column's dtype against the expected one
        raise ValueError(f"Yikes - `dataframe['{col_name}'].dtype == {typ}` not {my_type}")
You'll find that pandas does a really good job of comparing NumPy classes and user-provided strings. For example, even 'double' == dataframe['col_name'].dtype will succeed when the dtype is np.float64.
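A quick check of those comparisons, using a made-up one-column frame (hypothetical names, output in comments):
import numpy as np
import pandas as pd

dataframe = pd.DataFrame({'col_name': [1.0, 2.0, 3.0]})

print(dataframe['col_name'].dtype == np.float64)   # True
print(dataframe['col_name'].dtype == 'float64')    # True
print('double' == dataframe['col_name'].dtype)     # True - 'double' is an alias for float64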

(This answer does not directly answer the OP's question but might be useful.)
The responses so far rely on printed reports or on string values, so they might not be future-proof. pandas offers programmatic ways to check dtypes:
import pandas as pd
from pandas.api.types import is_object_dtype, is_numeric_dtype, is_bool_dtype
df = pd.DataFrame({'A': [1,2,3], 'B': [True, False, False], 'C': ['a', 'b', 'c']})
is_numeric_dtype(df['A'])
# True
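Continuing the snippet above, the other imported checkers work the same way on the remaining columns (note that is_numeric_dtype also counts bool columns as numeric):
is_bool_dtype(df['B'])
# True
is_object_dtype(df['C'])
# True
is_numeric_dtype(df['C'])
# False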

If you have a great many columns, df.info() or df.dtypes may only give you an overall summary of the dtypes, or show just some columns from the top and bottom, like:
<class 'pandas.core.frame.DataFrame'>
Int64Index: 4387 entries, 1 to 4387
Columns: 119 entries, ColumnA to ColumnZ
dtypes: datetime64[ns](24), float64(54), object(41)
memory usage: 4.0+ MB
That only tells you that 24 columns are datetime, 54 are float64, and 41 are object.
So, if you want the data type of each column in one command, do:
dict(df.dtypes)
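For instance, on the small example frame used in the accepted answer (a minimal sketch; the real payoff is with hundreds of columns):
import pandas as pd

df = pd.DataFrame({'A': [1, 2, 3], 'B': [True, False, False], 'C': ['a', 'b', 'c']})
dict(df.dtypes)
# {'A': dtype('int64'), 'B': dtype('bool'), 'C': dtype('O')}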

You can also see it indirectly by using dataframe_name.column_name, which shows the column values along with their dtype.
Example:
import pandas as pd
data = {"strings" : ["a","b","c"], "ints" : [1,2,3]}
df = pd.DataFrame(data)
print(df.strings)
print("------------")
print(df.ints)
which will output:
0 a
1 b
2 c
Name: strings, dtype: object
------------
0 1
1 2
2 3
Name: ints, dtype: int64
