@@ -75,7 +75,8 @@ private[spark] object SizeEstimator extends Logging {
   // Sets object size, pointer size based on architecture and CompressedOops settings
   // from the JVM.
   private def initialize() {
-    is64bit = System.getProperty("os.arch").contains("64")
+    val arch = System.getProperty("os.arch")
+    is64bit = arch.contains("64") || arch.contains("s390x")
     isCompressedOops = getIsCompressedOops

     objectSize = if (!is64bit) 8 else {
@@ -97,6 +98,11 @@ private[spark] object SizeEstimator extends Logging {
       return System.getProperty("spark.test.useCompressedOops").toBoolean
     }

+    // java.vm.info provides compressed ref info for IBM JDKs
+    if (System.getProperty("java.vendor").contains("IBM")) {
Member: If I merge your other change that introduces an enum for JVM vendor, can it be used here?
Member Author: It would be useful to have the utility, as there are a few more places coming where I'll need to know the JVM vendor.

Architecturally, are you OK with creating a public utility in launcher and having a dependency on it from core?

Feel free to commit these and I'll restructure on the fly, or I can take both back and submit multiple changes in a single pull request.
Member: Darn, right, we don't want to add that dependency. I think for the moment these two changes are OK as they are, since adding a new dependency across these two modules isn't good.
Member Author: How do you feel about me creating a getJavaVendor() utility in org.apache.spark.util.Utils rather than in-lining the checks?

It would, however, be a duplicate of the version proposed for the launcher, which is not great.
Member: I think the duplication of the snippet in launcher may be unavoidable for now. I think this PR can be left as is, and a refactoring can be added when it becomes necessary in another PR?
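The enum-based vendor utility discussed above might look something like the following plain-Java sketch (the launcher module is Java). The class name `JvmVendorSketch`, the enum constants, and the matched substrings are assumptions for illustration, not the actual proposed code.

```java
// Hypothetical sketch of a JVM-vendor utility like the one discussed above.
// The class name, enum constants, and matched substrings are assumptions.
public class JvmVendorSketch {
    public enum JvmVendor { IBM, OPEN_JDK, ORACLE, UNKNOWN }

    // Classify a java.vendor property value; returns UNKNOWN when no match.
    public static JvmVendor fromVendorString(String vendor) {
        if (vendor == null) {
            return JvmVendor.UNKNOWN;
        }
        if (vendor.contains("IBM")) {
            return JvmVendor.IBM;
        }
        if (vendor.contains("OpenJDK")) {
            return JvmVendor.OPEN_JDK;
        }
        if (vendor.contains("Oracle")) {
            return JvmVendor.ORACLE;
        }
        return JvmVendor.UNKNOWN;
    }

    public static void main(String[] args) {
        // Classify the vendor of the JVM we are running on.
        System.out.println(fromVendorString(System.getProperty("java.vendor")));
    }
}
```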

+      return System.getProperty("java.vm.info").contains("Compressed Ref")
+    }
+
     try {
       val hotSpotMBeanName = "com.sun.management:type=HotSpotDiagnostic"
       val server = ManagementFactory.getPlatformMBeanServer()
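Pulled out of Spark, the detection logic in the patch above reduces to two substring checks on standard JVM system properties. This minimal Java sketch shows them in isolation; the method names are illustrative, and only the string checks mirror the patch.

```java
// Minimal standalone sketch of the two checks added in the patch above.
// Method names are illustrative; only the string checks mirror the patch.
public class DetectionSketch {
    // 64-bit if os.arch contains "64", or is the s390x mainframe arch
    // (whose os.arch value has no "64" in it).
    public static boolean is64bit(String osArch) {
        return osArch.contains("64") || osArch.contains("s390x");
    }

    // On IBM JDKs, java.vm.info reports whether compressed refs are in use.
    public static boolean hasCompressedRefs(String vmInfo) {
        return vmInfo.contains("Compressed Ref");
    }

    public static void main(String[] args) {
        System.out.println(is64bit(System.getProperty("os.arch")));
    }
}
```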
core/src/test/scala/org/apache/spark/util/SizeEstimatorSuite.scala (12 additions, 0 deletions)
@@ -45,6 +45,10 @@ class DummyClass6 extends DummyClass5 {
   val y: Boolean = true
 }

+class DummyClass7 {
+  val x: DummyClass1 = new DummyClass1
+}
+
 object DummyString {
   def apply(str: String) : DummyString = new DummyString(str.toArray)
 }
@@ -197,4 +201,12 @@ class SizeEstimatorSuite
     assertResult(24)(SizeEstimator.estimate(new DummyClass5))
     assertResult(32)(SizeEstimator.estimate(new DummyClass6))
   }
+
+  test("check 64-bit detection for s390x arch") {
+    System.setProperty("os.arch", "s390x")
+    val initialize = PrivateMethod[Unit]('initialize)
+    SizeEstimator invokePrivate initialize()
+    // Class should be 32 bytes on s390x if recognised as 64 bit platform
+    assertResult(32)(SizeEstimator.estimate(new DummyClass7))
+  }
 }
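The new test drives the private initialize() method through ScalaTest's PrivateMethodTester. For readers unfamiliar with that helper, a rough plain-Java analogue uses core reflection; the class and method below are made up purely for illustration.

```java
import java.lang.reflect.Method;

// Rough Java analogue of ScalaTest's PrivateMethodTester: invoke a private
// method reflectively. The example class and method are made up.
public class PrivateInvokeSketch {
    private int wordSizeBits() {
        return 64;
    }

    public static int invokePrivate() throws Exception {
        Method m = PrivateInvokeSketch.class.getDeclaredMethod("wordSizeBits");
        m.setAccessible(true);  // bypass the private access check
        return (Integer) m.invoke(new PrivateInvokeSketch());
    }

    public static void main(String[] args) throws Exception {
        System.out.println(invokePrivate()); // prints 64
    }
}
```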