
Conversation

@zjffdu (Contributor) commented Jan 26, 2016

…anager in local mode

Contributor (inline review comment):

Remove this empty line.

@zjffdu (Contributor, Author) commented Jan 26, 2016

Thanks @jerryshao

@SparkQA commented Jan 26, 2016

Test build #50069 has finished for PR 10914 at commit 90118ca.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jan 26, 2016

Test build #50073 has finished for PR 10914 at commit 0467617.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@jerryshao (Contributor):

All the dynamic allocation related unit tests run in local mode, so they will fail with this change; we should fix them.

@SparkQA commented Jan 26, 2016

Test build #50085 has finished for PR 10914 at commit 69d6086.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@JoshRosen (Contributor):

Just curious, what's the compelling reason for not creating this in local mode? Performance?

@zjffdu (Contributor, Author) commented Jan 27, 2016

@JoshRosen No compelling reason for that. I just saw the following message in PySpark, which is a little confusing since dynamic executor allocation is not supported in local mode:

16/01/27 07:57:56 WARN spark.ExecutorAllocationManager: Unable to reach the cluster manager to kill executor driver,or no executor eligible to kill!

@JoshRosen (Contributor):

Ping @andrewor14; this one-line change should be quick to review, I think.

Contributor (inline review comment):

This is hard to read. Why not just make the change in Utils.isDynamicAllocationEnabled so that all usages of it will see this change as well?

Contributor Author (inline review comment):

Makes sense. I updated the PR to move it into Utils.isDynamicAllocationEnabled and did some refactoring accordingly.

@andrewor14 (Contributor):

I agree with the change, but I think it could be made in a better place.

@zjffdu (Contributor, Author) commented Feb 2, 2016

@andrewor14 Makes sense to move it into Utils.isDynamicAllocationEnabled (adding isLocal as another parameter, and setting isLocal to false explicitly in the YARN related classes). I updated the patch and did some refactoring accordingly.
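As a rough sketch of what that intermediate shape might look like (SparkConf is stood in for by a plain Map so the snippet is self-contained; names and details are illustrative, not the actual patch):

```scala
object DynamicAllocationSketch {
  // isLocal is threaded through as an explicit parameter; YARN-side
  // callers would pass isLocal = false at every call site.
  def isDynamicAllocationEnabled(conf: Map[String, String], isLocal: Boolean): Boolean = {
    val enabled = conf.getOrElse("spark.dynamicAllocation.enabled", "false").toBoolean
    enabled && !isLocal
  }
}
```

With this shape, every existing caller has to be updated to pass the flag explicitly.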

Contributor (inline review comment):

Can we add a default value isLocal: Boolean = false to avoid changing several other parts?

Contributor Author (inline review comment):

I thought about it, but I feel it would be better to set it explicitly, because it doesn't make sense to assume that the default mode is non-local.

Contributor (inline review comment):

Actually, it doesn't look very useful to pass in a flag that always decides the return value. It appears that SparkContext is the only place where we need to read the flag, so let's leave it out of this method.

@SparkQA commented Feb 2, 2016

Test build #50543 has finished for PR 10914 at commit 555a149.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Feb 2, 2016

Test build #50562 has finished for PR 10914 at commit fc0a8dd.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@zjffdu (Contributor, Author) commented Feb 2, 2016

The failed tests seem unrelated.

Contributor (inline review comment):

This warning doesn't belong here anymore; you need to move it to Utils too.

@andrewor14 (Contributor):

@zjffdu the latest changes look more complicated than before. I think it would be best to achieve the following:

  • all checks involving the conf should be in Utils.dynamicAllocationEnabled
  • on second thought the isLocal check should just be in SparkContext; it doesn't really make sense for it to be everywhere like the way it is currently in this patch

Ideally we'd have something like:

// In SparkContext
val dynamicAllocationEnabled = Utils.dynamicAllocationEnabled(_conf)

val executorAllocationManager =
  if (dynamicAllocationEnabled && !isLocal) {
    Some(...)
  } else {
    None
  }

Everything else can go into Utils.dynamicAllocationEnabled. Does that make sense?

@zjffdu (Contributor, Author) commented Feb 3, 2016

@andrewor14 I added a method Utils.isLocal, kept the signature of Utils.dynamicAllocationEnabled unchanged, and put all the logic in Utils.dynamicAllocationEnabled.
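A minimal sketch of that shape, again modeling SparkConf as a plain Map so the snippet is self-contained (the helper name and exact checks in the merged code may differ):

```scala
object UtilsSketch {
  // True if the master URL denotes local mode: "local", "local[4]", "local[*]", etc.
  def isLocal(conf: Map[String, String]): Boolean = {
    val master = conf.getOrElse("spark.master", "")
    master == "local" || master.startsWith("local[")
  }

  // The local-mode check lives inside the predicate itself, so the
  // signature stays a one-argument function of the conf.
  def isDynamicAllocationEnabled(conf: Map[String, String]): Boolean = {
    conf.getOrElse("spark.dynamicAllocation.enabled", "false").toBoolean &&
      !isLocal(conf)
  }
}
```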

@SparkQA commented Feb 3, 2016

Test build #50633 has finished for PR 10914 at commit 8261c14.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member (inline review comment):

Asserting === true is redundant.

@JoshRosen (Contributor):

Hey, any chance that we can update this patch and get it merged? This change seems conceptually simple and I can't imagine that it'll be too much more work to get it across the finish line.

@srowen (Member) commented Feb 18, 2016

Ping @zjffdu ?

@srowen (Member) commented Feb 23, 2016

@zjffdu do you mind closing this PR if you're not going to update it?

@zjffdu (Contributor, Author) commented Feb 23, 2016

Sorry for the late response; the patch is updated.

Member (inline review comment):

This def is redundant now.

Contributor Author (inline review comment):

I kept using def, because val causes test failures (a val is evaluated when declared, while a def is evaluated when invoked). Although I could use config instead of _conf, all the variables around isLocal use _conf, so I don't want to make it inconsistent and introduce any potential issue.

Member (inline review comment):

This needs a better name and description. It says whether the master is local. Use @return.

Contributor Author (inline review comment):

Where did you find it unclear? I think it is straightforward, and I think @return is only mandatory for a public API.

Member (inline review comment):

It is just called "isLocal" and is a function of SparkConf. What is local? Why be opaque? "isLocalMaster" would be clearer. Hmm, does it even belong as a helper in SparkConf? (I don't feel strongly about that.)

You're already writing "Returns .."; why not just use the actual scaladoc tag?

@SparkQA commented Feb 23, 2016

Test build #51770 has finished for PR 10914 at commit e3c0f9a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Feb 24, 2016

Test build #51831 has finished for PR 10914 at commit c15212b.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Feb 24, 2016

Test build #51838 has finished for PR 10914 at commit bffa7a4.

  • This patch fails SparkR unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen (Member) commented Feb 26, 2016

Jenkins, retest this please

@SparkQA commented Feb 26, 2016

Test build #52071 has finished for PR 10914 at commit bffa7a4.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


/**
*
* @return whether it is local mode
Member (inline review comment):

No need to change this, but you have an extra blank line here

@srowen (Member) commented Feb 27, 2016

isLocalMaster could be private, but that's fairly minor.

@srowen (Member) commented Feb 29, 2016

Merged to master

@asfgit asfgit closed this in 99fe899 Feb 29, 2016
roygao94 pushed a commit to roygao94/spark that referenced this pull request Mar 22, 2016
…anager in local mode

Author: Jeff Zhang <[email protected]>

Closes apache#10914 from zjffdu/SPARK-12994.