From 1d716397c31fa09c44f9abcdf1e46488d47ebf9b Mon Sep 17 00:00:00 2001
From: scwf
Date: Sun, 27 Apr 2014 02:41:03 +0800
Subject: [PATCH] Indicate that spark.storage.memoryFraction is the max fraction
 of memory for Spark's cache

---
 docs/configuration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/configuration.md b/docs/configuration.md
index 8d3442625b475..0c7cc9e90d571 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -115,7 +115,7 @@ Apart from these, the following properties are also available, and may be useful
   spark.storage.memoryFraction
   0.6
-  Fraction of Java heap to use for Spark's memory cache. This should not be larger than the "old"
+  Max fraction of Java heap to use for Spark's memory cache. This should not be larger than the "old"
   generation of objects in the JVM, which by default is given 0.6 of the heap, but you can increase it
   if you configure your own old generation size.
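
For context, the property this patch documents is typically set in `spark-defaults.conf` (or via `--conf` on `spark-submit`). A minimal sketch, with an illustrative value of 0.5 rather than the 0.6 default:

```properties
# spark-defaults.conf — spark.storage.memoryFraction caps the fraction of the
# JVM heap reserved for Spark's in-memory block storage (RDD cache).
# 0.6 is the default; 0.5 here is illustrative. It is a ceiling, not a
# reservation — the cache only grows into this fraction as blocks are stored.
spark.storage.memoryFraction  0.5
```

Per the doc text being patched, this fraction should not exceed the JVM old-generation size, since long-lived cached blocks are expected to live in the old generation.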