Can spark-defaults.conf resolve environment variables?


If I have the line below in my spark-env.sh file:

export my_jars=$(jars=(/my/lib/dir/*.jar); IFS=,; echo "${jars[*]}")

which gives me a comma-delimited list of the jars in /my/lib/dir, is there a way to specify

spark.jars $my_jars

in spark-defaults.conf?

tl;dr: No, you cannot, but there is a workaround.

Spark reads spark-defaults.conf as a plain Java properties file, without any environment-variable substitution, so a literal $my_jars in it is never expanded.

What you can do instead is write the computed value of my_jars from spark-env.sh straight into spark-defaults.conf using >> (append). The last entry for a given key wins, so there is no harm if many similar entries accumulate over time.
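The append step could be sketched as a small helper in spark-env.sh. This is a minimal illustration, not Spark's own API: append_spark_jars is a hypothetical function name, and the /my/lib/dir and conf-file paths are stand-ins for your real locations.

```shell
#!/usr/bin/env bash
# Hypothetical helper for spark-env.sh: build a comma-delimited jar list
# and append a resolved spark.jars line to spark-defaults.conf.
append_spark_jars() {
  local lib_dir=$1 conf_file=$2
  local my_jars
  # Glob the jars into an array, then comma-join them via IFS
  # (note the single '=' and uppercase IFS).
  my_jars=$(jars=("$lib_dir"/*.jar); IFS=,; echo "${jars[*]}")
  # Append the already-expanded value; on reload, the last entry
  # for a duplicated key wins, so repeated appends are harmless.
  echo "spark.jars $my_jars" >> "$conf_file"
}

# Example call with placeholder paths:
# append_spark_jars /my/lib/dir "$SPARK_HOME/conf/spark-defaults.conf"
```

Because the expansion happens in the shell before the line is written, spark-defaults.conf ends up containing concrete paths rather than an unresolved variable reference.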

