Revision 399c397e7035665c928b1d439a860f9e7b1ce3b3 authored by Dongjoon Hyun on 01 September 2022, 16:34:55 UTC, committed by Dongjoon Hyun on 01 September 2022, 16:35:06 UTC
### What changes were proposed in this pull request?

This PR aims to add a new test tag, `decomTestTag`, to K8s Integration Test.

### Why are the changes needed?

Decommission-related tests took over 6 minutes (`363s`) in total. It would be helpful if we could run them selectively.
```
[info] - Test basic decommissioning (44 seconds, 51 milliseconds)
[info] - Test basic decommissioning with shuffle cleanup (44 seconds, 450 milliseconds)
[info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 43 seconds)
[info] - Test decommissioning timeouts (44 seconds, 389 milliseconds)
[info] - SPARK-37576: Rolling decommissioning (1 minute, 8 seconds)
```
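For context, the tagging mechanism here is standard ScalaTest: a `Tag` object wraps a string name, tests are annotated with it, and the build can then include or exclude tests by that name (as the `-Dtest.exclude.tags=...,decom` command below does). The sketch below is illustrative only; the suite and object names are hypothetical, though the tag string `decom` matches the one used in this PR.

```scala
import org.scalatest.Tag
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical tag object; ScalaTest filters tests by the string name,
// so "decom" is what -Dtest.exclude.tags / -Dtest.include.tags match on.
object DecomTestTag extends Tag("decom")

class DecomExampleSuite extends AnyFunSuite {
  // Passing the tag object as an extra argument attaches it to this test.
  // The test is skipped when the runner excludes the "decom" tag and runs
  // normally otherwise.
  test("Test basic decommissioning", DecomTestTag) {
    assert(1 + 1 == 2)
  }
}
```

With this in place, slow decommissioning tests can be skipped in quick CI runs without touching the test code itself.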

### Does this PR introduce _any_ user-facing change?

No, this is a test-only change.

### How was this patch tested?

Pass the CIs and test manually.
```
$ build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests \
-Dspark.kubernetes.test.deployMode=docker-desktop "kubernetes-integration-tests/test" \
-Dtest.exclude.tags=minikube,local,decom
...
[info] KubernetesSuite:
[info] - Run SparkPi with no resources (12 seconds, 441 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (11 seconds, 949 milliseconds)
[info] - Run SparkPi with a very long application name. (11 seconds, 999 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (11 seconds, 846 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (11 seconds, 176 milliseconds)
[info] - Run SparkPi with an argument. (11 seconds, 868 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (11 seconds, 858 milliseconds)
[info] - All pods have the same service account by default (11 seconds, 5 milliseconds)
[info] - Run extraJVMOptions check on driver (5 seconds, 757 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (12 seconds, 467 milliseconds)
[info] - Run SparkPi with env and mount secrets. (21 seconds, 119 milliseconds)
[info] - Run PySpark on simple pi.py example (13 seconds, 129 milliseconds)
[info] - Run PySpark to test a pyfiles example (14 seconds, 937 milliseconds)
[info] - Run PySpark with memory customization (12 seconds, 195 milliseconds)
[info] - Run in client mode. (11 seconds, 343 milliseconds)
[info] - Start pod creation from template (11 seconds, 975 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (11 seconds, 901 milliseconds)
[info] - Run SparkR on simple dataframe.R example (14 seconds, 305 milliseconds)
...
```

Closes #37755 from dongjoon-hyun/SPARK-40304.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
(cherry picked from commit fd0498f81df72c196f19a5b26053660f6f3f4d70)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
CONTRIBUTING.md
## Contributing to Spark

*Before opening a pull request*, review the 
[Contributing to Spark guide](https://spark.apache.org/contributing.html). 
It lists steps that are required before creating a PR. In particular, consider:

- Is the change important and ready enough to ask the community to spend time reviewing?
- Have you searched for existing, related JIRAs and pull requests?
- Is this a new feature that can stand alone as a [third party project](https://spark.apache.org/third-party-projects.html)?
- Is the change being proposed clearly explained and motivated?

When you contribute code, you affirm that the contribution is your original work and that you 
license the work to the project under the project's open source license. Whether or not you 
state this explicitly, by submitting any copyrighted material via pull request, email, or 
other means, you agree to license the material under the project's open source license and 
warrant that you have the legal authority to do so.