
Conversation

@WangTaoTheTonic (Contributor)

We should let the Thrift Server take these two parameters since it is a daemon. And it is better to read driver-related configs when the app is submitted by spark-submit.

https://issues.apache.org/jira/browse/SPARK-7031

@SparkQA

SparkQA commented Apr 21, 2015

Test build #30664 has finished for PR 5609 at commit 4fb25ed.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@SparkQA

SparkQA commented Apr 21, 2015

Test build #30674 has finished for PR 5609 at commit 0565831.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@SparkQA

SparkQA commented Apr 23, 2015

Test build #30823 has finished for PR 5609 at commit 624e652.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

Member

I'm probably missing something basic here, but it jumps out at me that you have to special-case the Thrift Server. Is this really related to spark-submit?

Contributor Author

Yeah, I think so. If we take a look at start-thriftserver.sh, we will find that spark-daemon.sh uses spark-submit to submit org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.

Then it goes into spark-class, and spark-class uses org.apache.spark.launcher.Main to resolve the arguments, which contains:

    if (className.equals("org.apache.spark.deploy.SparkSubmit")) {
        builder = new SparkSubmitCommandBuilder(args);
    } else {
        builder = new SparkClassCommandBuilder(className, args);
    }

So the Thrift Server is started via spark-submit, but at the same time it is a kind of daemon process, and it should take daemon-related options too.

Contributor

The alternative would be to make HiveThriftServer2 go through spark-class instead of spark-submit, but that would be a bigger change. While I'm not in love with the idea of peppering special cases inside SparkSubmitCommandBuilder, one special case doesn't sound terrible. If more show up in the future, we can look for a better solution.

Contributor Author

Totally agree with you.

@WangTaoTheTonic (Contributor Author)

BTW, I have tested this on my cluster with the following settings in spark-env.sh:

export SPARK_DAEMON_MEMORY=1111m
export SPARK_DAEMON_JAVA_OPTS=" -Dx=y "

Before this patch, jinfo showed:

VM Flags:
-Xms512m -Xmx512m -XX:MaxPermSize=128m

After:

VM Flags:
-Dx=y -Xms1111m -Xmx1111m -XX:MaxPermSize=128m

Contributor

Are these test changes needed? The tests have been passing without them.

Contributor Author

No, they are not necessary; they were just left over while I was fixing the broken tests. I will revert them.

@WangTaoTheTonic (Contributor Author)
Contributor Author

Jenkins, retest this please.

@WangTaoTheTonic (Contributor Author)

Jenkins, test this please.


@SparkQA

SparkQA commented Apr 28, 2015

Test build #728 has finished for PR 5609 at commit d3ddfb6.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@WangTaoTheTonic (Contributor Author)

@vanzin So although it is not the best idea, is it OK to go?

Contributor

To avoid the copy & paste, I'd do:

    String tsMemory = isThriftServer(mainClass) ? getenv("SPARK_DAEMON_MEMORY") : null;
    memory = firstNonEmpty(tsMemory, ...);
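The suggested precedence logic can be sketched as a small, self-contained example. The helper names `isThriftServer` and `firstNonEmpty` come from the snippet above; the class name, the map-backed environment stub (standing in for `System.getenv` so the sketch is runnable), and the `512m` fallback (matching the pre-patch jinfo output) are illustrative assumptions, not the actual Spark launcher sources:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class DaemonMemorySketch {
    static final String THRIFT_SERVER_CLASS =
        "org.apache.spark.sql.hive.thriftserver.HiveThriftServer2";

    // Stand-in for the process environment; the real launcher reads System.getenv.
    static Map<String, String> env = new HashMap<>();
    static {
        env.put("SPARK_DAEMON_MEMORY", "1111m");
    }

    static boolean isThriftServer(String mainClass) {
        return THRIFT_SERVER_CLASS.equals(mainClass);
    }

    // Returns the first argument that is neither null nor empty, or null.
    static String firstNonEmpty(String... values) {
        return Arrays.stream(values)
            .filter(v -> v != null && !v.isEmpty())
            .findFirst()
            .orElse(null);
    }

    // For the Thrift Server, SPARK_DAEMON_MEMORY takes precedence over the
    // driver memory setting and the built-in default; other main classes
    // never see the daemon variable.
    static String resolveMemory(String mainClass) {
        String tsMemory =
            isThriftServer(mainClass) ? env.get("SPARK_DAEMON_MEMORY") : null;
        return firstNonEmpty(tsMemory, env.get("SPARK_DRIVER_MEMORY"), "512m");
    }

    public static void main(String[] args) {
        System.out.println(resolveMemory(THRIFT_SERVER_CLASS)); // 1111m
        System.out.println(resolveMemory("org.example.MyApp")); // 512m
    }
}
```

This keeps the special case to a single ternary instead of duplicating the whole memory-resolution chain for the daemon path.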

@vanzin (Contributor)

vanzin commented Apr 29, 2015

Just a small nit, but otherwise looks good.

@WangTaoTheTonic (Contributor Author)

Jenkins, retest this please.

@SparkQA

SparkQA commented Apr 30, 2015

Test build #31373 has finished for PR 5609 at commit 8d3fc16.

  • This patch fails SparkR unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@WangTaoTheTonic (Contributor Author)

Jenkins, retest this please.

@SparkQA

SparkQA commented Apr 30, 2015

Test build #31378 has finished for PR 5609 at commit 8d3fc16.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@WangTaoTheTonic (Contributor Author)

Looks like the failed test case is flaky. Jenkins, retest this please.

@SparkQA

SparkQA commented Apr 30, 2015

Test build #31384 has finished for PR 5609 at commit 8d3fc16.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@WangTaoTheTonic (Contributor Author)

Another, different test case failed. Jenkins, retest this please.

@SparkQA

SparkQA commented Apr 30, 2015

Test build #31389 has finished for PR 5609 at commit 8d3fc16.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@SparkQA

SparkQA commented Apr 30, 2015

Test build #31398 has finished for PR 5609 at commit 8d3fc16.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch adds the following new dependencies:
    • jaxb-api-2.2.7.jar
    • jaxb-core-2.2.7.jar
    • jaxb-impl-2.2.7.jar
    • pmml-agent-1.1.15.jar
    • pmml-model-1.1.15.jar
    • pmml-schema-1.1.15.jar
  • This patch removes the following dependencies:
    • activation-1.1.jar
    • jaxb-api-2.2.2.jar
    • jaxb-impl-2.2.3-1.jar

@asfgit asfgit closed this in 49549d5 May 2, 2015
jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request May 28, 2015
…Y and SPARK_DAEMON_JAVA_OPTS

We should let the Thrift Server take these two parameters since it is a daemon. And it is better to read driver-related configs when the app is submitted by spark-submit.

https://issues.apache.org/jira/browse/SPARK-7031

Author: WangTaoTheTonic <[email protected]>

Closes apache#5609 from WangTaoTheTonic/SPARK-7031 and squashes the following commits:

8d3fc16 [WangTaoTheTonic] indent
035069b [WangTaoTheTonic] better code style
d3ddfb6 [WangTaoTheTonic] revert the unnecessary changes in suite
624e652 [WangTaoTheTonic] fix break tests
0565831 [WangTaoTheTonic] fix failed tests
4fb25ed [WangTaoTheTonic] let thrift server take SPARK_DAEMON_MEMORY and SPARK_DAEMON_JAVA_OPTS
jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request Jun 12, 2015
nemccarthy pushed a commit to nemccarthy/spark that referenced this pull request Jun 19, 2015


4 participants