48 changes: 48 additions & 0 deletions docs/configuration.md
@@ -456,6 +456,40 @@ Apart from these, the following properties are also available, and may be useful
from JVM to Python worker for every task.
</td>
</tr>
<tr>
<td><code>spark.python.task.killTimeout</code></td>
<td>2s</td>
<td>
Timeout to wait before killing the Python worker when a task cannot be interrupted.
</td>
</tr>
<tr>
<td><code>spark.sql.repl.eagerEval.enabled</code></td>
<td>false</td>
<td>
Enables eager evaluation. If true and the REPL you are using supports eager evaluation,
the Dataset will be evaluated automatically. In notebooks such as Jupyter, the HTML table
generated by <code>_repr_html_</code> displays the results of the queries the user has defined.
For a plain Python REPL, the output is shown as it would be by <code>dataframe.show()</code>
(see <a href="https://issues.apache.org/jira/browse/SPARK-24215">SPARK-24215</a> for more
details; a usage sketch appears after this table).
</td>
</tr>
<tr>
<td><code>spark.sql.repl.eagerEval.maxNumRows</code></td>
<td>20</td>
<td>
The default number of rows shown in the eager evaluation output (the HTML table generated by
<code>_repr_html_</code> or plain text). This only takes effect when
<code>spark.sql.repl.eagerEval.enabled</code> is set to true.
</td>
</tr>
<tr>
<td><code>spark.sql.repl.eagerEval.truncate</code></td>
Member: spark.sql.* configs should be in sql-programming-guide.md
<td>20</td>
<td>
The maximum number of characters shown per cell in the eager evaluation output (the HTML table
generated by <code>_repr_html_</code> or plain text); longer cell values are truncated. This
only takes effect when <code>spark.sql.repl.eagerEval.enabled</code> is set to true.
</td>
</tr>
<tr>
<td><code>spark.files</code></td>
<td></td>
@@ -541,6 +575,13 @@ Apart from these, the following properties are also available, and may be useful
Python binary executable to use for PySpark in both driver and executors.
</td>
</tr>
<tr>
<td><code>spark.worker.driverTerminateTimeout</code></td>
Member: Likewise spark.worker.* configs are in spark-standalone.md, it appears
<td>10s</td>
<td>
Timeout to wait when trying to terminate a driver.
</td>
</tr>
</table>
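As a rough illustration of the eager-evaluation and Python-worker properties documented in the table above, here is a minimal PySpark sketch; the app name and the chosen values are arbitrary for the example, and the final bare `df` only renders output inside a REPL or notebook.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("eager-eval-demo")  # hypothetical app name
    # Render DataFrames automatically in REPLs/notebooks that support it.
    .config("spark.sql.repl.eagerEval.enabled", "true")
    # Show at most 5 rows in the eager-evaluation output.
    .config("spark.sql.repl.eagerEval.maxNumRows", "5")
    # Truncate each cell to 25 characters.
    .config("spark.sql.repl.eagerEval.truncate", "25")
    # Give an uninterruptible Python task 5 seconds before its worker is killed.
    .config("spark.python.task.killTimeout", "5s")
    .getOrCreate()
)

df = spark.range(10).toDF("id")
df  # in Jupyter this renders via _repr_html_; in a plain REPL, like df.show()
```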

### Shuffle Behavior
@@ -771,6 +812,13 @@ Apart from these, the following properties are also available, and may be useful
Buffer size to use when writing to output streams, in KiB unless otherwise specified.
</td>
</tr>
<tr>
<td><code>spark.ui.consoleProgress.update.interval</code></td>
<td>200</td>
<td>
Update interval of the console progress bar, in milliseconds.
</td>
</tr>
<tr>
<td><code>spark.ui.enabled</code></td>
<td>true</td>
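Under the same assumptions, a sketch of slowing down the console progress bar via `spark.ui.consoleProgress.update.interval`; the value is in milliseconds (200 is the default per the table above), and the job at the end is only there to keep the bar on screen long enough to observe.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("progress-bar-demo")  # hypothetical app name
    # Update the console progress bar once per second instead of every 200 ms.
    .config("spark.ui.consoleProgress.update.interval", "1000")
    .getOrCreate()
)

# A job long enough for the console progress bar to appear and update.
spark.range(100_000_000).selectExpr("sum(id)").show()
```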