7 changes: 6 additions & 1 deletion core/src/main/scala/org/apache/spark/SparkConf.scala
@@ -478,7 +478,12 @@ private[spark] object SparkConf extends Logging {
DeprecatedConfig("spark.kryoserializer.buffer.mb", "1.4",
"Please use spark.kryoserializer.buffer instead. The default value for " +
"spark.kryoserializer.buffer.mb was previously specified as '0.064'. Fractional values " +
"are no longer accepted. To specify the equivalent now, one may use '64k'.")
"are no longer accepted. To specify the equivalent now, one may use '64k'."),
DeprecatedConfig("spark.cleaner.ttl", "1.4",
Contributor
oops looks like this needs to be 1.5 now

Contributor Author
yeah, this patch is out of date; I was waiting until I had time to do the periodic GC timer feature; feel free to work-steal if you want to pick this up :)

Contributor
nope, it's all yours...

"TTL-based metadata cleaning is no longer necessary in recent Spark versions " +
"and can lead to confusing errors if metadata is deleted for entities that are still in " +
"use. Except in extremely special circumstances, you should remove this setting and rely " +
"on Spark's reference-tracking-based cleanup instead. See SPARK-7689 for more details.")
)

Map(configs.map { cfg => (cfg.key -> cfg) }:_*)
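The registration above only adds `spark.cleaner.ttl` to SparkConf's table of deprecated keys. The snippet below is a standalone sketch, not Spark's actual internals: the `DeprecatedConfig` case class and the `warnIfDeprecated` helper are assumed here purely for illustration of how a key-to-entry map built with the same `Map(... :_*)` pattern can be used to warn users who still set a deprecated configuration.

```scala
// Standalone sketch (not Spark's real SparkConf internals): warn when a
// deprecated key such as spark.cleaner.ttl is still present in user settings.
case class DeprecatedConfig(key: String, version: String, deprecationMessage: String)

object DeprecationCheck {
  private val deprecatedConfigs: Map[String, DeprecatedConfig] = {
    val configs = Seq(
      DeprecatedConfig("spark.cleaner.ttl", "1.4",
        "TTL-based metadata cleaning is no longer necessary; rely on reference-tracking cleanup.")
    )
    // Same lookup-table construction pattern as in the diff above.
    Map(configs.map { cfg => (cfg.key -> cfg) }: _*)
  }

  // Hypothetical helper: emit a warning for every user-supplied key that is deprecated.
  def warnIfDeprecated(userSettings: Map[String, String]): Unit = {
    for ((key, _) <- userSettings; cfg <- deprecatedConfigs.get(key)) {
      println(s"WARNING: $key is deprecated as of Spark ${cfg.version}. ${cfg.deprecationMessage}")
    }
  }
}

// Example use:
//   DeprecationCheck.warnIfDeprecated(Map("spark.cleaner.ttl" -> "3600"))
```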
11 changes: 0 additions & 11 deletions docs/configuration.md
@@ -722,17 +722,6 @@ Apart from these, the following properties are also available, and may be useful
Which broadcast implementation to use.
</td>
</tr>
-<tr>
-  <td><code>spark.cleaner.ttl</code></td>
-  <td>(infinite)</td>
-  <td>
-    Duration (seconds) of how long Spark will remember any metadata (stages generated, tasks
-    generated, etc.). Periodic cleanups will ensure that metadata older than this duration will be
-    forgotten. This is useful for running Spark for many hours / days (for example, running 24/7 in
-    case of Spark Streaming applications). Note that any RDD that persists in memory for more than
-    this duration will be cleared as well.
-  </td>
-</tr>
<tr>
<td><code>spark.executor.cores</code></td>
<td>1 in YARN mode, all the available cores on the worker in standalone mode.</td>
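The documentation entry is removed because the deprecation message points users at Spark's reference-tracking-based cleanup instead of a TTL. As a rough sketch, under an assumed local master and placeholder input path, the snippet below shows the recommended pattern: leave `spark.cleaner.ttl` unset and unpersist long-lived cached RDDs explicitly when they are no longer needed.

```scala
// Sketch of the alternative to spark.cleaner.ttl: leave the TTL unset so Spark's
// reference-tracking cleanup reclaims metadata on its own, and unpersist long-lived
// cached RDDs explicitly. The master and input path are placeholders for illustration.
import org.apache.spark.{SparkConf, SparkContext}

object CleanupExample {
  def main(args: Array[String]): Unit = {
    // No spark.cleaner.ttl here: metadata is cleaned up as references are released.
    val conf = new SparkConf().setAppName("cleanup-example").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val cached = sc.textFile("/tmp/input.txt").map(_.length).cache()
    println(cached.count())

    // Release the cached blocks explicitly instead of waiting for a TTL to expire.
    cached.unpersist()
    sc.stop()
  }
}
```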