https://github.com/apache/spark
Revision fefd54d7fa11c22517603d498a6273b237b867ef authored by Kent Yao on 20 April 2022, 06:38:26 UTC, committed by Kent Yao on 20 April 2022, 06:39:37 UTC
### What changes were proposed in this pull request?

`TaskLocation.apply` without a null check may throw a `NullPointerException` (NPE) and fail job scheduling:

```
Caused by: java.lang.NullPointerException
    at scala.collection.immutable.StringLike$class.stripPrefix(StringLike.scala:155)
    at scala.collection.immutable.StringOps.stripPrefix(StringOps.scala:29)
    at org.apache.spark.scheduler.TaskLocation$.apply(TaskLocation.scala:71)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal
```
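
For illustration only (a minimal, non-Spark sketch of the same failure mode): `stripPrefix` is supplied by the implicit `StringOps` wrapper and operates on the wrapped string, so calling it on a `null` reference throws the NPE seen in the trace above.

```scala
// Standalone sketch of the failure mode; assumes nothing from Spark.
object NullStripPrefixDemo {
  def main(args: Array[String]): Unit = {
    val host: String = null          // e.g. a task location that was never set
    host.stripPrefix("executor_")    // throws java.lang.NullPointerException
  }
}
```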

For instance, `org.apache.spark.rdd.HadoopRDD#convertSplitLocationInfo` might generate unexpected `Some(null)` elements, where `Some` should be replaced by `Option.apply` so that a null location becomes `None` instead.
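
A brief sketch of that difference (generic Scala, not the actual patch): `Some(x)` keeps a null value as-is, while `Option(x)` converts it to `None`, so downstream string operations never see a null.

```scala
// Illustrative only: why wrapping possibly-null values with Option
// instead of Some prevents Some(null) from leaking downstream.
object SomeVsOptionDemo {
  def main(args: Array[String]): Unit = {
    val location: String = null                    // a split location reported as null

    val unsafe: Option[String] = Some(location)    // Some(null): defined, but contents are null
    val safe: Option[String]   = Option(location)  // None: the null is filtered out

    println(safe.map(_.stripPrefix("executor_")))  // prints None, no NPE
    // unsafe.map(_.stripPrefix("executor_"))      // would throw NullPointerException
  }
}
```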

### Why are the changes needed?

Fix the NPE that can fail job scheduling.

### Does this PR introduce _any_ user-facing change?

no

### How was this patch tested?

new tests

Closes #36222 from yaooqinn/SPARK-38922.

Authored-by: Kent Yao <yao@apache.org>
Signed-off-by: Kent Yao <yao@apache.org>
(cherry picked from commit 33e07f3cd926105c6d28986eb6218f237505549e)
Signed-off-by: Kent Yao <yao@apache.org>
[SPARK-38922][CORE] TaskLocation.apply throw NullPointerException