Mapping between Spark configurations and parameters
When launching Spark from the command line, I found that the --num-executors parameter acts similarly to spark.executor.instances in the configuration file. Are they the same? If so, where can I find a full mapping between such pairs of equivalent settings?
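For example (a minimal sketch; the YARN master, jar name, and executor count are placeholders), these two invocations seem to behave the same:

    ./bin/spark-submit --master yarn --num-executors 4 my-app.jar
    ./bin/spark-submit --master yarn --conf spark.executor.instances=4 my-app.jar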
From the documentation:

The spark-shell and spark-submit tool support two ways to load configurations dynamically. The first is command line options, such as --master, as shown above. spark-submit can accept any Spark property using the --conf flag, but uses special flags for properties that play a part in launching the Spark application. Running ./bin/spark-submit --help will show the entire list of these options.
So there are far fewer command-line options (such as --executor-cores) than there are Spark properties (such as spark.executor.cores), and, as the documentation says, the -- options are listed by running ./bin/spark-submit --help. Running it will also tell you that not every command-line option is usable in every situation (something that confused me a lot); for example, some flags only apply to a particular cluster manager. Any property that does not have a special command-line option can still be set like so: --conf spark.executor.cores=16.
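For instance (a sketch; the jar name is a placeholder), these two commands are equivalent ways of requesting 16 cores per executor, and the second form works even for properties that have no dedicated flag:

    ./bin/spark-submit --executor-cores 16 my-app.jar
    ./bin/spark-submit --conf spark.executor.cores=16 my-app.jar

The same property can also be set once for all jobs in conf/spark-defaults.conf:

    spark.executor.cores  16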
Here is an incomplete table; you have to read through the comments to find the appropriate parameters.
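As a starting point, here is a partial mapping compiled from the Spark documentation; it is not exhaustive, some flags only apply to certain cluster managers, and you should verify it against ./bin/spark-submit --help for your Spark version:

    --master                spark.master
    --deploy-mode           spark.submit.deployMode
    --name                  spark.app.name
    --jars                  spark.jars
    --packages              spark.jars.packages
    --files                 spark.files
    --driver-memory         spark.driver.memory
    --driver-cores          spark.driver.cores
    --executor-memory       spark.executor.memory
    --executor-cores        spark.executor.cores
    --num-executors         spark.executor.instances   (YARN)
    --total-executor-cores  spark.cores.max            (standalone, Mesos)
    --queue                 spark.yarn.queue           (YARN)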