
I want to change the datatype of a column from bigint to double in Spark for a Delta table. I ran:

ALTER TABLE tablename ALTER COLUMN column_name TYPE decimal(10, 2)

and got this error:

AnalysisException: Cannot update spark_catalog.default.tablename field column_name: bigint cannot be cast to decimal(10,2); line 1 pos 0; AlterColumn resolvedfieldname(StructField(column_name,LongType,true)), DecimalType(10,2)

younus

1 Answer


I reproduced the same error in my environment.

Either of the approaches below will change the datatype of a column in a Delta table with Spark.

Approach 1:

PySpark:

Read the Delta table into a DataFrame and use the cast function to change the data type. Note: my_table1 is my Delta table.

%python

from pyspark.sql.functions import col

# Read the Delta table into a DataFrame
df = spark.sql("SELECT * FROM my_table1")

# Cast the age column from bigint to double
df1 = df.withColumn("age", col("age").cast("double"))

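The cast above only changes the type inside the DataFrame; to persist it back to the Delta table, the table has to be rewritten. A minimal sketch of that write-back step (not shown in the original answer), assuming the my_table1 table from above:

%python

# Overwrite the Delta table with the casted DataFrame; overwriteSchema lets the
# column type change as part of the rewrite (this rewrites the whole table).
df1.write.format("delta") \
    .mode("overwrite") \
    .option("overwriteSchema", "true") \
    .saveAsTable("my_table1")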

Approach 2:

Spark SQL:

INSERT OVERWRITE TABLE my_table1
SELECT id, name, CAST(age AS DOUBLE) AS age FROM my_table1;

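If the new type also needs to appear in the table definition itself, one option (my assumption, not part of the original answer; my_table1_casted is a hypothetical staging name) is to build a casted copy of the table and swap it in:

-- Stage a copy of the table with age stored as DOUBLE
CREATE TABLE my_table1_casted AS
SELECT id, name, CAST(age AS DOUBLE) AS age FROM my_table1;

-- Swap the staged table in for the original (assumes managed Delta tables)
DROP TABLE my_table1;
ALTER TABLE my_table1_casted RENAME TO my_table1;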

B. B. Naga Sai Vamsi