
Commit 3c44ff4

aarondav authored and pwendell committed
Super minor: Add require for mergeCombiners in combineByKey
We changed the behavior in 0.9.0 from requiring that mergeCombiners be null when mapSideCombine was false to requiring that mergeCombiners *never* be null, for external sorting. This patch adds a require() so that the behavior change is reported with an explicit message rather than resulting in an NPE.

Author: Aaron Davidson <[email protected]>

Closes apache#623 from aarondav/master and squashes the following commits:

520b80c [Aaron Davidson] Super minor: Add require for mergeCombiners in combineByKey

(cherry picked from commit 3fede48)
Signed-off-by: Patrick Wendell <[email protected]>
1 parent 289d761 commit 3c44ff4
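
To make the new contract concrete, here is a minimal sketch of a combineByKey call that satisfies it. This is illustrative, not part of the patch: it assumes a spark-shell session (so an existing SparkContext named sc) against a 0.9.x-era API, and the data and variable names are invented.

// All three functions are supplied; as of 0.9.0, mergeCombiners must be
// non-null even when mapSideCombine is false.
// (Outside the shell you would also need: import org.apache.spark.SparkContext._)
val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))

val sums = pairs.combineByKey[Int](
  (v: Int) => v,                 // createCombiner: start a combiner from one value
  (c: Int, v: Int) => c + v,     // mergeValue: fold a value into a partition-local combiner
  (c1: Int, c2: Int) => c1 + c2  // mergeCombiners: merge combiners across partitions
)

sums.collect()  // Array((a,4), (b,2)), ordering not guaranteed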

File tree

1 file changed: 1 addition, 0 deletions


core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala

Lines changed: 1 addition & 0 deletions
@@ -77,6 +77,7 @@ class PairRDDFunctions[K: ClassTag, V: ClassTag](self: RDD[(K, V)])
       partitioner: Partitioner,
       mapSideCombine: Boolean = true,
       serializerClass: String = null): RDD[(K, C)] = {
+    require(mergeCombiners != null, "mergeCombiners must be defined") // required as of Spark 0.9.0
     if (getKeyClass().isArray) {
       if (mapSideCombine) {
         throw new SparkException("Cannot use map-side combining with array keys.")
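
For contrast, a hedged sketch of the failure mode this require() addresses (same assumed spark-shell session as above; the null argument is deliberately illegal):

// Passing null for mergeCombiners previously slipped through and
// eventually surfaced as a NullPointerException; with this patch the
// call fails immediately with:
//   java.lang.IllegalArgumentException: requirement failed: mergeCombiners must be defined
val pairs = sc.parallelize(Seq(("a", 1), ("a", 2)))
pairs.combineByKey[Int](
  (v: Int) => v,             // createCombiner
  (c: Int, v: Int) => c + v, // mergeValue
  null                       // mergeCombiners: now rejected up front
)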
