How to pass variables to spark shells
Nov 19, 2019
You can pass configuration properties to dse spark, dse pyspark, and dse spark-sql with the --conf flag:
dse spark-sql --conf "spark.executor.cores=1"
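The same flag works with the other two shells, and you can repeat --conf to pass several settings at once. A minimal sketch, assuming the dse launcher is on your PATH (the property names here are standard Spark settings chosen for illustration):

```shell
# Pass one setting to the Scala shell
dse spark --conf "spark.executor.memory=2g"

# Repeat --conf to pass multiple settings to the Python shell
dse pyspark --conf "spark.executor.cores=1" --conf "spark.executor.memory=2g"
```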
Alternatively, you can add the configuration parameter to the spark-defaults.conf file so it applies to every shell session. Note that values passed on the command line with --conf override the values in spark-defaults.conf.
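For example, the equivalent entries in spark-defaults.conf look like this (one property per line, with the name and value separated by whitespace):

```
spark.executor.cores   1
spark.executor.memory  2g
```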