createDF() is not a SparkSession method; it comes from the spark-daria library. You need to add the spark-daria dependency and import the library, then you should be able to use it.
See the article below for reference:
https://medium.com/@mrpowers/manually-creating-spark-dataframes-b14dae906393
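
For example, once spark-daria is on the classpath, a minimal sketch of createDF, based on the article above, looks like this (the data and column names are illustrative; `spark` is the session that spark-shell provides):

import org.apache.spark.sql.types._
import com.github.mrpowers.spark.daria.sql.SparkSessionExt._

// createDF takes the row data plus the schema as (name, type, nullable) tuples
val someDF = spark.createDF(
  List(
    (8, "bat"),
    (64, "mouse")
  ),
  List(
    ("number", IntegerType, true),
    ("word", StringType, true)
  )
)
someDF.show()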
How do you import a GitHub library in a spark-shell session?
You can use this alias, substituting the appropriate values for the _etc_ placeholders in the properties-file path:
alias sshell_daria='export SPARK_MAJOR_VERSION=2; spark-shell --packages mrpowers:spark-daria:0.35.0-s_2.11 --properties-file /opt/_etc_/_etc2_/conf/sparkShell.conf'
However, it does not always work: spark-shell hangs after printing these messages:
SPARK_MAJOR_VERSION is set to 2, using Spark2
Ivy Default Cache set to: /home/_etc_/.ivy2/cache
The jars for the packages stored in: /home/_etc_/.ivy2/jars
:: loading settings :: url = jar:file:/usr/hdp/2.6.4.0-91/spark2/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
mrpowers#spark-daria added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
As a workaround, you can download the current version as a jar file from dl.bintray.com and pass it with the --jars option instead of --packages. So the correct alias in this case is:
alias sshell_daria='export SPARK_MAJOR_VERSION=2; spark-shell --jars _your_path_/spark-daria-0.35.0-s_2.12.jar --properties-file /opt/_etc_/_etc2_/conf/sparkShell.conf'
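
After the shell starts, a quick smoke test is to import spark-daria (the package path below assumes the standard spark-daria namespace); if the import succeeds, the jar was picked up:

// inside spark-shell: verify the jar is on the classpath
import com.github.mrpowers.spark.daria.sql.SparkSessionExt._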