
I am learning Apache Spark and I ran the code below on Google Colab.

# Installed based on https://colab.research.google.com/github/JohnSnowLabs/spark-nlp-workshop/blob/master/jupyter/quick_start_google_colab.ipynb#scrollTo=lNu3meQKEXdu

import os

# Install java
!apt-get install -y openjdk-8-jdk-headless -qq > /dev/null
!wget -q "https://downloads.apache.org/spark/spark-3.1.1/spark-3.1.1-bin-hadoop2.7.tgz" > /dev/null
!tar -xvf spark-3.1.1-bin-hadoop2.7.tgz > /dev/null
!pip install -q findspark

os.environ["SPARK_HOME"] = "/content/spark-3.1.1-bin-hadoop2.7"
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
os.environ["PATH"] = os.environ["JAVA_HOME"] + "/bin:" + os.environ["PATH"]
! java -version

# Install spark-nlp and pyspark
! pip install spark-nlp==3.0.0 pyspark==3.1.1


import sparknlp
spark = sparknlp.start()

from sparknlp.base import DocumentAssembler
documentAssembler = DocumentAssembler().setInputCol(text_col).setOutputCol('document')

I get the error below. How can I resolve it?

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-48-535b177b526b> in <module>()
      4 
      5 from sparknlp.base import DocumentAssembler
----> 6 documentAssembler = DocumentAssembler().setInputCol(text_col).setOutputCol('document')

4 frames
/usr/local/lib/python3.7/dist-packages/pyspark/ml/wrapper.py in _new_java_obj(java_class, *args)
     64             java_obj = getattr(java_obj, name)
     65         java_args = [_py2java(sc, arg) for arg in args]
---> 66         return java_obj(*java_args)
     67 
     68     @staticmethod

TypeError: 'JavaPackage' object is not callable
user2543622
  • try this: `documentAssembler = DocumentAssembler.setInputCol(text_col).setOutputCol('document')` – itIsNaz Mar 25 '21 at 22:12
  • tried it but got a different error: `----> 6 documentAssembler = DocumentAssembler.setInputCol(text_col).setOutputCol('document') TypeError: setInputCol() missing 1 required positional argument: 'value'` – user2543622 Mar 25 '21 at 22:36
  • refer to my answer, it is explained in more detail – itIsNaz Mar 26 '21 at 00:00

1 Answer


As mentioned in my last comment:

Replace text_col with the name of the text column in your Spark DataFrame, and document with whatever output column name you want. You can also add .setCleanupMode("clean_mode"). For more details you can refer to this link: https://spark.apache.org/docs/latest/ml-features

documentAssembler = DocumentAssembler() \
    .setInputCol("text_col") \
    .setOutputCol("document")
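
To sanity-check the fix, here is a minimal sketch that builds the assembler and runs it over one row. The sample DataFrame and the "text_col" column name are assumptions for illustration; "shrink" is one of the documented cleanup modes:

import sparknlp
from sparknlp.base import DocumentAssembler

spark = sparknlp.start()

# Hypothetical one-row DataFrame with a column named "text_col"
data = spark.createDataFrame([("Hello from Spark NLP",)], ["text_col"])

documentAssembler = DocumentAssembler() \
    .setInputCol("text_col") \
    .setOutputCol("document") \
    .setCleanupMode("shrink")  # optional; "shrink" trims extra whitespace

documentAssembler.transform(data).select("document").show(truncate=False)

Note the parentheses in DocumentAssembler(): without them, setInputCol is called on the class rather than an instance, which is what produced the "missing 1 required positional argument" error in the comments.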

itIsNaz