Revision cf10fa88c41119c1bdd61bbb281de5e16055227f authored by sharkdtu on 19 June 2017, 21:54:54 UTC, committed by Marcelo Vanzin on 19 June 2017, 21:55:08 UTC
## What changes were proposed in this pull request?

When "spark.hadoop.fs.defaultFS" and "spark.yarn.stagingDir" are set to different clusters, as follows:
```
spark.hadoop.fs.defaultFS  hdfs://tl-nn-tdw.tencent-distribute.com:54310
spark.yarn.stagingDir hdfs://ss-teg-2-v2/tmp/spark
```
the staging dir cannot be deleted, and the following error is raised:
```
java.lang.IllegalArgumentException: Wrong FS: hdfs://ss-teg-2-v2/tmp/spark/.sparkStaging/application_1496819138021_77618, expected: hdfs://tl-nn-tdw.tencent-distribute.com:54310
```
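
The error comes from Hadoop's "Wrong FS" validation: a `FileSystem` instance rejects any path whose scheme or authority differs from its own URI, so a staging path on another cluster cannot be deleted through the default FileSystem. The sketch below is a hypothetical, simplified reproduction of that check using plain `java.net.URI` (the names `checkPath` and `WrongFsDemo` are illustrative, not Hadoop's actual API):

```java
import java.net.URI;

public class WrongFsDemo {
    // Hypothetical sketch of Hadoop's "Wrong FS" check: a FileSystem
    // bound to fsUri rejects paths whose scheme/authority differ.
    static void checkPath(URI fsUri, URI path) {
        boolean sameScheme = path.getScheme() == null
                || path.getScheme().equals(fsUri.getScheme());
        boolean sameAuthority = path.getAuthority() == null
                || path.getAuthority().equals(fsUri.getAuthority());
        if (!(sameScheme && sameAuthority)) {
            throw new IllegalArgumentException(
                    "Wrong FS: " + path + ", expected: " + fsUri);
        }
    }

    public static void main(String[] args) {
        // The defaultFS and stagingDir from the report above.
        URI defaultFs = URI.create("hdfs://tl-nn-tdw.tencent-distribute.com:54310");
        URI staging = URI.create("hdfs://ss-teg-2-v2/tmp/spark");
        try {
            checkPath(defaultFs, staging);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The fix, accordingly, is to resolve the FileSystem from the staging path itself (in Hadoop terms, `path.getFileSystem(conf)`) rather than from the default FileSystem, so the delete is issued against the cluster that actually holds the staging dir.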

## How was this patch tested?

Existing tests

Author: sharkdtu <sharkdtu@tencent.com>

Closes #18352 from sharkdtu/master.

(cherry picked from commit 3d4d11a80fe8953d48d8bfac2ce112e37d38dc90)
Signed-off-by: Marcelo Vanzin <vanzin@cloudera.com>
Parent: e329bea