
We are setting up PyBuilder on a new big data project. We have to test that some classes build the correct distributed tables, so we wrote a few unit tests that pass when run from Eclipse/PyDev. The independent unit tests run successfully on their own, but when I add the one using PySpark I get a long list of Java exceptions starting with:

ERROR Utils:91 - Aborting task
ExitCodeException exitCode=-1073741515:
 at org.apache.hadoop.util.Shell.runCommand(Shell.java:582)

This is my build.py file:

from pybuilder.core import use_plugin
from pybuilder.core import init
import sys
import os


# Pass each path component to os.path.join separately instead of embedding
# backslashes in the string, so the path is built correctly on any platform.
sys.path.append(os.path.join(os.environ['SPARK_HOME'], 'python', 'lib', 'py4j-0.10.7-src.zip'))
sys.path.append(os.path.join(os.environ['SPARK_HOME'], 'python'))

use_plugin("python.core")
use_plugin("python.unittest")
use_plugin("python.install_dependencies")

default_task = "publish"
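For reference, a minimal sketch of building those two sys.path entries portably (the Spark layout and py4j version are the ones from the question; `py4j_paths` is just an illustrative helper, not part of PyBuilder or PySpark):

```python
import os

def py4j_paths(spark_home):
    """Return the two entries PySpark needs on sys.path, built portably.

    Joining each component separately avoids backslash-escape surprises
    that can arise from embedding '\\l' or '\\p' in a string literal.
    """
    return [
        os.path.join(spark_home, "python", "lib", "py4j-0.10.7-src.zip"),
        os.path.join(spark_home, "python"),
    ]
```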

We are using PySpark 2.3.1 and Python 3.7. What am I doing wrong?

user6106573
1 Answer


The solution for me was to run `winutils chmod 777 -R` in my workspace after installing the Microsoft Visual C++ 2010 Redistributable Package. The exit code -1073741515 is 0xC0000135 (STATUS_DLL_NOT_FOUND), which indicates winutils.exe could not load a required DLL; installing the VC++ 2010 redistributable supplies that runtime.
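As a sanity check before running the tests, you can verify that winutils.exe is actually present where Spark will look for it. This is a minimal sketch; `find_winutils` is a hypothetical helper of my own, not part of Hadoop or PySpark:

```python
import os

def find_winutils(hadoop_home):
    """Return the path to winutils.exe under hadoop_home/bin, or None if missing."""
    if not hadoop_home:
        return None
    candidate = os.path.join(hadoop_home, "bin", "winutils.exe")
    return candidate if os.path.isfile(candidate) else None

# On Windows, Spark resolves winutils.exe via the HADOOP_HOME environment variable.
winutils = find_winutils(os.environ.get("HADOOP_HOME"))
if winutils is None:
    print("winutils.exe not found - check that HADOOP_HOME points at your Hadoop dir")
```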
