https://github.com/apache/spark
Revision 6f1a8d8bfdd8dccc9af2d144ea5ad644ddc63a81 authored by Jungtaek Lim (HeartSaVioR) on 22 March 2019, 22:07:49 UTC, committed by Marcelo Vanzin on 22 March 2019, 22:12:35 UTC
This patch fixes an issue where `ClientEndpoint` in a standalone cluster does not recognize driver options that are passed via SparkConf rather than as system properties. When `Client` is executed via the CLI, these options must be provided as system properties, but with `spark-submit` they can be provided through SparkConf. (`SparkSubmit` calls `ClientApp.start` with a SparkConf that contains these options.)
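The fixed lookup order can be sketched as follows. This is a hypothetical illustration, not Spark's actual code: a plain `Map` stands in for SparkConf, and the `resolve` helper is an invented name. The point is that a driver option is now taken from the submitted conf first, falling back to JVM system properties, which were previously the only source the legacy Client consulted.

```java
import java.util.Map;
import java.util.Optional;

public class DriverOptionResolution {
    // Hypothetical helper: prefer the submitted conf, fall back to the
    // JVM system property of the same name.
    static Optional<String> resolve(String key, Map<String, String> conf) {
        String fromConf = conf.get(key);
        if (fromConf != null) {
            return Optional.of(fromConf);
        }
        return Optional.ofNullable(System.getProperty(key));
    }

    public static void main(String[] args) {
        // Simulates spark-submit passing the option through SparkConf
        // without ever setting a system property.
        Map<String, String> conf =
            Map.of("spark.driver.extraJavaOptions", "-Dfoo=BAR");
        System.out.println(
            resolve("spark.driver.extraJavaOptions", conf).orElse("(unset)"));
    }
}
```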

Manually tested via following steps:

1) setup standalone cluster (launch master and worker via `./sbin/start-all.sh`)

2) submit one of the example apps in standalone cluster mode

```
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master "spark://localhost:7077" --conf "spark.driver.extraJavaOptions=-Dfoo=BAR" --deploy-mode "cluster" --num-executors 1 --driver-memory 512m --executor-memory 512m --executor-cores 1 examples/jars/spark-examples*.jar 10
```

3) check that `foo=BAR` appears among the system properties in the Spark UI
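What the check in step 3 verifies can be sketched with a minimal, hypothetical snippet: once the driver JVM is launched with `-Dfoo=BAR` (carried by `spark.driver.extraJavaOptions`), the value is an ordinary system property, which is what the UI's system-properties table lists. Here `System.setProperty` stands in for the `-Dfoo=BAR` JVM flag so the snippet is self-contained.

```java
public class CheckDriverProperty {
    public static void main(String[] args) {
        // Stands in for launching the JVM with -Dfoo=BAR.
        System.setProperty("foo", "BAR");
        // The driver (and the UI) read it back as a plain system property.
        System.out.println("foo=" + System.getProperty("foo"));
    }
}
```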

<img width="877" alt="Screen Shot 2019-03-21 at 8 18 04 AM" src="https://user-images.githubusercontent.com/1317309/54728501-97db1700-4bc1-11e9-89da-078445c71e9b.png">

Closes #24163 from HeartSaVioR/SPARK-26606.

Authored-by: Jungtaek Lim (HeartSaVioR) <kabhwan@gmail.com>
Signed-off-by: Marcelo Vanzin <vanzin@cloudera.com>
(cherry picked from commit 8a9eb05137cd4c665f39a54c30d46c0c4eb7d20b)
Signed-off-by: Marcelo Vanzin <vanzin@cloudera.com>
1 parent 95e73b3
[SPARK-26606][CORE] Handle driver options properly when submitting to standalone cluster mode via legacy Client