Revision 02cf178bb2a7dc8b4c06eb040c44b6453e41ed15 authored by Mark Grover on 08 June 2017, 16:55:43 UTC, committed by Marcelo Vanzin on 08 June 2017, 16:55:54 UTC
## What changes were proposed in this pull request?

Add a new property `spark.streaming.kafka.consumer.cache.enabled` that allows users to enable or disable the cache for Kafka consumers. This property can be especially handy in cases where issues like SPARK-19185 are hit, for which there isn't a solution committed yet. The cache remains enabled by default, so this change doesn't alter any out-of-the-box behavior.
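
For illustration, a minimal sketch of how the new property could be set from application code; the app name, batch interval, and value chosen here are assumptions for the example, not part of this change:

```scala
// Minimal sketch assuming Spark Streaming with the Kafka 0.10 integration.
// Application name and batch interval below are illustrative only.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("KafkaCacheToggleExample")
  // Turn the Kafka consumer cache off; it defaults to "true", so omitting
  // this line keeps the existing out-of-the-box behavior.
  .set("spark.streaming.kafka.consumer.cache.enabled", "false")

val ssc = new StreamingContext(conf, Seconds(10))
```

The same toggle can also be supplied at launch time, e.g. `--conf spark.streaming.kafka.consumer.cache.enabled=false` on the `spark-submit` command line.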

## How was this patch tested?
Running unit tests

Author: Mark Grover <mark@apache.org>
Author: Mark Grover <grover.markgrover@gmail.com>

Closes #18234 from markgrover/spark-19185.

(cherry picked from commit 55b8cfe6e6a6759d65bf219ff570fd6154197ec4)
Signed-off-by: Marcelo Vanzin <vanzin@cloudera.com>
Parent: 2f5eaa9
| File | Mode | Size |
| --- | --- | --- |
| beeline | -rwxr-xr-x | 1.1 KB |
| beeline.cmd | -rw-r--r-- | 899 bytes |
| find-spark-home | -rwxr-xr-x | 1.9 KB |
| load-spark-env.cmd | -rw-r--r-- | 1.9 KB |
| load-spark-env.sh | -rwxr-xr-x | 2.1 KB |
| pyspark | -rwxr-xr-x | 2.9 KB |
| pyspark.cmd | -rw-r--r-- | 1002 bytes |
| pyspark2.cmd | -rw-r--r-- | 1.5 KB |
| run-example | -rwxr-xr-x | 1.0 KB |
| run-example.cmd | -rw-r--r-- | 988 bytes |
| spark-class | -rwxr-xr-x | 3.1 KB |
| spark-class.cmd | -rw-r--r-- | 1012 bytes |
| spark-class2.cmd | -rw-r--r-- | 2.4 KB |
| spark-shell | -rwxr-xr-x | 2.9 KB |
| spark-shell.cmd | -rw-r--r-- | 1010 bytes |
| spark-shell2.cmd | -rw-r--r-- | 1.5 KB |
| spark-sql | -rwxr-xr-x | 1.0 KB |
| spark-submit | -rwxr-xr-x | 1.0 KB |
| spark-submit.cmd | -rw-r--r-- | 1012 bytes |
| spark-submit2.cmd | -rw-r--r-- | 1.1 KB |
| sparkR | -rwxr-xr-x | 1.0 KB |
| sparkR.cmd | -rw-r--r-- | 1000 bytes |
| sparkR2.cmd | -rw-r--r-- | 1014 bytes |
