
Conversation

@xueyumusic
Contributor

What changes were proposed in this pull request?

This PR replaces some `getTimeAsMs` calls with `getTimeAsSeconds`, because `getTimeAsMs` returns a wrong value when the user specifies a value without a time unit.

How was this patch tested?

manual test

Please review http://spark.apache.org/contributing.html before opening a pull request.
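To make the behavioral difference at issue concrete, here is a simplified sketch of the unit-suffix handling (not Spark's actual `JavaUtils` implementation): `getTimeAsMs` falls back to milliseconds for a bare number, while `getTimeAsSeconds` falls back to seconds, so rewriting one in terms of the other changes how unit-less values are read.

```scala
// Simplified model (NOT Spark's real code) of SparkConf time parsing:
// an explicit suffix (ms, s, m, h) is honored; a bare number falls back
// to the calling method's default unit.
object TimeConfSketch {
  private val suffixToMs =
    Map("ms" -> 1L, "s" -> 1000L, "m" -> 60000L, "h" -> 3600000L)

  private def parse(value: String, defaultUnitMs: Long): Long = {
    val pattern = "([0-9]+)([a-z]*)".r
    value.trim match {
      case pattern(num, "")     => num.toLong * defaultUnitMs  // no unit given
      case pattern(num, suffix) => num.toLong * suffixToMs(suffix)
    }
  }

  // getTimeAsMs: bare numbers are milliseconds.
  def getTimeAsMs(value: String): Long = parse(value, defaultUnitMs = 1L)

  // getTimeAsSeconds: bare numbers are seconds.
  def getTimeAsSeconds(value: String): Long =
    parse(value, defaultUnitMs = 1000L) / 1000L
}
```

With a value of `"10"`, `getTimeAsMs` yields 10 ms while `getTimeAsSeconds(...) * 1000` yields 10000 ms; that divergence on unit-less values is exactly the semantic change debated in this thread.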

@xueyumusic xueyumusic changed the title [SPARK-24560][SS][MESOS] Fix some getTimeAsMs as getTimeAsSeconds [SPARK-24560][CORE][MESOS] Fix some getTimeAsMs as getTimeAsSeconds Jun 14, 2018

  private[this] val killPollingIntervalMs: Long =
-   conf.getTimeAsMs("spark.task.reaper.pollingInterval", "10s")
+   conf.getTimeAsSeconds("spark.task.reaper.pollingInterval", "10s") * 1000L
Member


plz use DurationConversions.
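For reference, the `DurationConversions` suggestion refers to the implicits in `scala.concurrent.duration`, which make the seconds-to-millis conversion explicit instead of a hand-written `* 1000L` (the later revisions of this diff use exactly this `.seconds.toMillis` form):

```scala
import scala.concurrent.duration._

// Hypothetical value standing in for conf.getTimeAsSeconds(...): 10 seconds.
val pollingIntervalSeconds: Long = 10L

// DurationConversions lifts the Long into a Duration, then converts units.
val killPollingIntervalMs: Long = pollingIntervalSeconds.seconds.toMillis
```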

@maropu
Member

maropu commented Jun 15, 2018

Can you add tests in SparkConfSuite?


  /** How long to wait before killing the python worker if a task cannot be interrupted. */
- private val taskKillTimeout = env.conf.getTimeAsMs("spark.python.task.killTimeout", "2s")
+ private val taskKillTimeout = env.conf.getTimeAsSeconds("spark.python.task.killTimeout", "2s") * 1000L
Member


Nit: probably better to add an `Ms` suffix, => `taskKillTimeoutMs`

@xueyumusic
Contributor Author

I have made some modification, @maropu please review the code, thanks

@ueshin
Member

ueshin commented Jun 16, 2018

ok to test.

@SparkQA

SparkQA commented Jun 16, 2018

Test build #91959 has finished for PR 21567 at commit a8dc241.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@maropu
Member

maropu commented Jun 16, 2018

cc @jiangxb1987

@maropu
Member

maropu commented Jun 16, 2018

@xueyumusic Can you fix the error first?

@SparkQA

SparkQA commented Jun 16, 2018

Test build #91964 has finished for PR 21567 at commit a69df11.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@maropu
Member

maropu commented Jun 16, 2018

retest this please

@SparkQA

SparkQA commented Jun 16, 2018

Test build #91971 has finished for PR 21567 at commit a69df11.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jun 17, 2018

Test build #91984 has started for PR 21567 at commit 0dd57ea.

@maropu
Member

maropu commented Jun 18, 2018

retest this please

@SparkQA

SparkQA commented Jun 18, 2018

Test build #92008 has finished for PR 21567 at commit 0dd57ea.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


  /** How long to wait before killing the python worker if a task cannot be interrupted. */
- private val taskKillTimeout = env.conf.getTimeAsMs("spark.python.task.killTimeout", "2s")
+ private val taskKillTimeoutMs = env.conf.getTimeAsSeconds("spark.python.task.killTimeout",
Contributor


The conf is not documented, but I think it's designed to accept values like `1.5s`.
cc @zsxwing

  // Timeout to wait for when trying to terminate a driver.
  private val DRIVER_TERMINATE_TIMEOUT_MS =
-   conf.getTimeAsMs("spark.worker.driverTerminateTimeout", "10s")
+   conf.getTimeAsSeconds("spark.worker.driverTerminateTimeout", "10s").seconds.toMillis
Contributor


ditto


  private[this] val killPollingIntervalMs: Long =
-   conf.getTimeAsMs("spark.task.reaper.pollingInterval", "10s")
+   conf.getTimeAsSeconds("spark.task.reaper.pollingInterval", "10s").seconds.toMillis
Contributor


Actually I believe this changes things from right to wrong. If you want a value in seconds, then you should use something like `1s`.

  // Update period of progress bar, in milliseconds
  private val updatePeriodMSec =
-   sc.getConf.getTimeAsMs("spark.ui.consoleProgress.update.interval", "200")
+   sc.getConf.getTimeAsMs("spark.ui.consoleProgress.update.interval", "200ms")
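This particular change is behavior-preserving rather than a bug fix: under `getTimeAsMs` a bare number is already read as milliseconds, so `"200"` and `"200ms"` parse the same, and the explicit suffix merely documents the unit. A minimal check with a hypothetical parser following that rule (not Spark's actual code):

```scala
// Hypothetical mini-parser following getTimeAsMs's rule:
// honor an explicit suffix, otherwise assume milliseconds.
def asMs(v: String): Long =
  if (v.endsWith("ms")) v.dropRight(2).toLong
  else if (v.endsWith("s")) v.dropRight(1).toLong * 1000L
  else v.toLong // bare number: already milliseconds
```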
Contributor


We shall document this config.

@jiangxb1987
Contributor

Overall I don't think the current logic should be modified. However, it would be useful to document some of the configs mentioned in this PR.

@xueyumusic
Contributor Author

I see, thanks for your review and guidance, @jiangxb1987 @maropu. I will try to add the related configs to the docs and close this PR. Thank you.
