
Commit 7db5624

yongtang authored and sarutak committed
[SPARK-14368][PYSPARK] Support spark.python.worker.memory with upper-case unit.
## What changes were proposed in this pull request?

This fix addresses the issue in PySpark where `spark.python.worker.memory` could only be configured with a lower-case unit (`k`, `m`, `g`, `t`). This fix allows the upper-case units (`K`, `M`, `G`, `T`) to be used as well, conforming to the JVM memory string format as specified in the documentation.

## How was this patch tested?

This fix adds an additional test to cover the changes.

Author: Yong Tang <[email protected]>

Closes #12163 from yongtang/SPARK-14368.
1 parent 8f50574 commit 7db5624

File tree

python/pyspark/rdd.py
python/pyspark/tests.py

2 files changed (+13, −1 lines)


python/pyspark/rdd.py

Lines changed: 1 addition & 1 deletion
@@ -115,7 +115,7 @@ def _parse_memory(s):
     2048
     """
     units = {'g': 1024, 'm': 1, 't': 1 << 20, 'k': 1.0 / 1024}
-    if s[-1] not in units:
+    if s[-1].lower() not in units:
         raise ValueError("invalid format: " + s)
     return int(float(s[:-1]) * units[s[-1].lower()])
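For illustration, here is a standalone sketch of the parsing behavior after this change. It mirrors the patched `_parse_memory` helper shown above; the assertions are illustrative examples, not part of the patch.

```python
def _parse_memory(s):
    """Return the memory string s as a whole number of MB (mirrors pyspark.rdd._parse_memory)."""
    units = {'g': 1024, 'm': 1, 't': 1 << 20, 'k': 1.0 / 1024}
    if s[-1].lower() not in units:  # case-insensitive unit check after the fix
        raise ValueError("invalid format: " + s)
    return int(float(s[:-1]) * units[s[-1].lower()])

assert _parse_memory("2g") == 2048    # lower-case unit, accepted before and after the fix
assert _parse_memory("2G") == 2048    # upper-case unit, now accepted as well
assert _parse_memory("1024K") == 1    # 1024 KiB == 1 MiB
```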

python/pyspark/tests.py

Lines changed: 12 additions & 0 deletions
@@ -1966,6 +1966,18 @@ def test_startTime(self):
             self.assertGreater(sc.startTime, 0)
 
 
+class ConfTests(unittest.TestCase):
+    def test_memory_conf(self):
+        memoryList = ["1T", "1G", "1M", "1024K"]
+        for memory in memoryList:
+            sc = SparkContext(conf=SparkConf().set("spark.python.worker.memory", memory))
+            l = list(range(1024))
+            random.shuffle(l)
+            rdd = sc.parallelize(l, 4)
+            self.assertEqual(sorted(l), rdd.sortBy(lambda x: x).collect())
+            sc.stop()
+
+
 @unittest.skipIf(not _have_scipy, "SciPy not installed")
 class SciPyTests(PySparkTestCase):
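For reference, a minimal usage sketch of the now-accepted upper-case form, modeled on what the new test exercises. The `"512M"` value here is an illustrative choice, not taken from the patch.

```python
from pyspark import SparkConf, SparkContext

# Upper-case units such as "512M" are accepted after this change;
# previously only lower-case forms such as "512m" would parse.
conf = SparkConf().set("spark.python.worker.memory", "512M")
sc = SparkContext(conf=conf)
# ... run jobs that go through the Python worker, e.g. sortBy/groupBy ...
sc.stop()
```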
