How to pass configuration variables to Spark shells

Steven Lacerda
1 min read · Nov 19, 2019


You can pass configuration variables to dse spark, dse pyspark, and dse spark-sql using the --conf flag:

dse spark-sql --conf "spark.executor.cores=1"
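
The same flag works with the other shells, and it can be repeated to set more than one property. A quick sketch (the property values below are only placeholders, not recommendations):

dse spark --conf "spark.executor.memory=2g"
dse pyspark --conf "spark.executor.cores=1" --conf "spark.executor.memory=2g"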

Alternatively, you can set the parameter in the spark-defaults.conf file so it applies to every session, as sketched below.
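
As a rough sketch, each entry in spark-defaults.conf is a property name and value separated by whitespace. The exact file location depends on your installation type (for DSE package installs it is typically under /etc/dse/spark/), and the value shown is just an example:

# spark-defaults.conf (path varies by install type)
spark.executor.cores    1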

