be sure to set class loader of kryo instances #393
Merged
Maybe I'm doing something wrong, but without this change I get exceptions if I try to deserialize custom classes that I haven't registered. E.g., if I try to shuffle a custom case class:
case class MyCaseClass(val v: Int, val x: String)
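For instance, shuffling pairs keyed by that class (a hypothetical reproduction; `sc` is an already-created SparkContext with `spark.serializer` set to `spark.KryoSerializer`, and the data is made up):

```scala
// Shuffle pairs keyed by the unregistered custom case class.
val pairs = sc.parallelize(1 to 100).map(i => (MyCaseClass(i, i.toString), i))
// Deserializing the shuffled keys on the reduce side triggers the exception below.
pairs.groupByKey().collect()
```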
I would get an exception:
13/01/21 16:06:54 INFO cluster.TaskSetManager: Loss was due to com.esotericsoftware.kryo.SerializationException: Unable to deserialize object of type: scala.Tuple2
at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:571)
at com.esotericsoftware.kryo.ObjectBuffer.readClassAndObject(ObjectBuffer.java:92)
at spark.KryoDeserializationStream.readObject(KryoSerializer.scala:95)
at spark.serializer.DeserializationStream$$anon$1.getNext(Serializer.scala:82)
at spark.serializer.DeserializationStream$$anon$1.hasNext(Serializer.scala:92)
at scala.collection.Iterator$class.foreach(Iterator.scala:660)
at spark.serializer.DeserializationStream$$anon$1.foreach(Serializer.scala:75)
at spark.BlockStoreShuffleFetcher$$anonfun$fetch$6.apply(BlockStoreShuffleFetcher.scala:38)
at spark.BlockStoreShuffleFetcher$$anonfun$fetch$6.apply(BlockStoreShuffleFetcher.scala:34)
at scala.collection.Iterator$class.foreach(Iterator.scala:660)
at scala.collection.Iterator$$anon$22.foreach(Iterator.scala:382)
at spark.BlockStoreShuffleFetcher.fetch(BlockStoreShuffleFetcher.scala:34)
...
This change makes those errors go away.
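For reference, the essence of the fix is to point each Kryo instance at a class loader that can see the application's classes. A minimal sketch of that idea (the exact place this goes in `spark.KryoSerializer` and the choice of loader are assumptions, not necessarily the code in this PR):

```scala
import com.esotericsoftware.kryo.Kryo

// Sketch: create a Kryo instance that resolves classes through the loader
// that actually loaded the user's classes (e.g. MyCaseClass), rather than
// Kryo's default, so unregistered application classes can be deserialized.
def newKryo(): Kryo = {
  val kryo = new Kryo()
  // Assumption: the current thread's context class loader is the one that
  // knows about the application classes on the executor.
  kryo.setClassLoader(Thread.currentThread.getContextClassLoader)
  kryo
}
```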