@@ -162,7 +162,7 @@ private[spark] object History {
val APPLY_CUSTOM_EXECUTOR_LOG_URL_TO_INCOMPLETE_APP =
ConfigBuilder("spark.history.custom.executor.log.url.applyIncompleteApplication")
.doc("Whether to apply custom executor log url, as specified by " +
"`spark.history.custom.executor.log.url`, to incomplete application as well. " +
s"${CUSTOM_EXECUTOR_LOG_URL.key}, to incomplete application as well. " +
"Even if this is true, this still only affects the behavior of the history server, " +
"not running spark applications.")
.booleanConf
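For context on the Scala hunk above: the change swaps a hard-coded key literal for an interpolation of the existing config entry's .key field, so the description cannot drift if the key is ever renamed. Below is a minimal sketch of that pattern, assuming Spark's internal ConfigBuilder API; the enclosing object name, the doc text for CUSTOM_EXECUTOR_LOG_URL, and the createOptional/createWithDefault calls are illustrative assumptions, not copied from the actual History.scala.

// Sketch only: ConfigBuilder is private[spark], so this pattern lives inside
// Spark's own package tree and mirrors (does not replace) the real definitions.
package org.apache.spark.internal.config

private[spark] object HistoryDocSketch {

  // The entry whose key the second entry's doc string refers to.
  val CUSTOM_EXECUTOR_LOG_URL = ConfigBuilder("spark.history.custom.executor.log.url")
    .doc("Custom executor log URL pattern used by the history server.")
    .stringConf
    .createOptional

  // Interpolating CUSTOM_EXECUTOR_LOG_URL.key keeps the description in sync with
  // the real key instead of repeating the literal string.
  val APPLY_CUSTOM_EXECUTOR_LOG_URL_TO_INCOMPLETE_APP =
    ConfigBuilder("spark.history.custom.executor.log.url.applyIncompleteApplication")
      .doc("Whether to apply custom executor log url, as specified by " +
        s"${CUSTOM_EXECUTOR_LOG_URL.key}, to incomplete application as well.")
      .booleanConf
      .createWithDefault(true)
}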
docs/monitoring.md (18 changes: 8 additions & 10 deletions)
@@ -159,23 +159,21 @@ Security options for the Spark History Server are covered more detail in the
<td>false</td>
<td>
Indicates whether the history server should use kerberos to login. This is required
- if the history server is accessing HDFS files on a secure Hadoop cluster. If this is
- true, it uses the configs <code>spark.history.kerberos.principal</code> and
- <code>spark.history.kerberos.keytab</code>.
+ if the history server is accessing HDFS files on a secure Hadoop cluster.
</td>
</tr>
<tr>
<td>spark.history.kerberos.principal</td>
<td>(none)</td>
<td>
- Kerberos principal name for the History Server.
+ When <code>spark.history.kerberos.enabled=true</code>, specifies kerberos principal name for the History Server.
</td>
</tr>
<tr>
<td>spark.history.kerberos.keytab</td>
<td>(none)</td>
<td>
- Location of the kerberos keytab file for the History Server.
+ When <code>spark.history.kerberos.enabled=true</code>, specifies location of the kerberos keytab file for the History Server.
</td>
</tr>
<tr>
@@ -189,7 +187,7 @@ Security options for the Spark History Server are covered more detail in the
<td>spark.history.fs.cleaner.interval</td>
<td>1d</td>
<td>
- How often the filesystem job history cleaner checks for files to delete.
+ When <code>spark.history.fs.cleaner.enabled=true</code>, specifies how often the filesystem job history cleaner checks for files to delete.
Files are deleted if at least one of two conditions holds.
First, they're deleted if they're older than <code>spark.history.fs.cleaner.maxAge</code>.
They are also deleted if the number of files is more than
@@ -201,14 +199,14 @@ Security options for the Spark History Server are covered more detail in the
<td>spark.history.fs.cleaner.maxAge</td>
<td>7d</td>
<td>
- Job history files older than this will be deleted when the filesystem history cleaner runs.
+ When <code>spark.history.fs.cleaner.enabled=true</code>, job history files older than this will be deleted when the filesystem history cleaner runs.
</td>
</tr>
<tr>
<td>spark.history.fs.cleaner.maxNum</td>
<td>Int.MaxValue</td>
<td>
- The maximum number of files in the event log directory.
+ When <code>spark.history.fs.cleaner.enabled=true</code>, specifies the maximum number of files in the event log directory.
Spark tries to clean up the completed attempt logs to maintain the log directory under this limit.
This should be smaller than the underlying file system limit like
`dfs.namenode.fs-limits.max-directory-items` in HDFS.
@@ -242,15 +240,15 @@ Security options for the Spark History Server are covered more detail in the
<td>spark.history.fs.driverlog.cleaner.interval</td>
<td><code>spark.history.fs.cleaner.interval</code></td>
<td>
- How often the filesystem driver log cleaner checks for files to delete.
+ When <code>spark.history.fs.driverlog.cleaner.enabled=true</code>, specifies how often the filesystem driver log cleaner checks for files to delete.
Files are only deleted if they are older than <code>spark.history.fs.driverlog.cleaner.maxAge</code>
</td>
</tr>
<tr>
<td>spark.history.fs.driverlog.cleaner.maxAge</td>
<td><code>spark.history.fs.cleaner.maxAge</code></td>
<td>
- Driver log files older than this will be deleted when the driver log cleaner runs.
+ When <code>spark.history.fs.driverlog.cleaner.enabled=true</code>, driver log files older than this will be deleted when the driver log cleaner runs.
</td>
</tr>
<tr>
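Taken together, the reworded rows make explicit that each of these settings only takes effect while its corresponding enabled flag is on. As a purely illustrative sketch (principal, keytab path, and retention values are made up, not recommendations), a history server's spark-defaults.conf might combine them like this:

# Kerberos login, required when the history server reads event logs from secure HDFS
spark.history.kerberos.enabled     true
spark.history.kerberos.principal   spark/history-host.example.com@EXAMPLE.COM
spark.history.kerberos.keytab      /etc/security/keytabs/spark.history.keytab

# Event-log cleaner: interval, maxAge, and maxNum are consulted only while enabled is true
spark.history.fs.cleaner.enabled   true
spark.history.fs.cleaner.interval  1d
spark.history.fs.cleaner.maxAge    7d

# Driver-log cleaner: interval and maxAge default to the event-log cleaner's values
spark.history.fs.driverlog.cleaner.enabled  true
spark.history.fs.driverlog.cleaner.maxAge   7d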