Conversation

@dragos (Contributor) commented Sep 9, 2015

Currently, only the coarse-grained mode observes spark.cores.max. The fine-grained mode should follow the same limit and not acquire more than the defined maximum number of cores.

There's some duplication of logic between the coarse-grained and fine-grained modes. I created SPARK-10444 to track fixing that.
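The intent of the change can be sketched in a few lines. This is a hypothetical illustration, not the actual patch; names like `totalCoresAcquired` and `maxCores` are assumptions for the example.

```scala
// Hypothetical sketch: cap the CPUs taken from a Mesos offer so that the
// running total never exceeds spark.cores.max.
object CoreBudget {
  // offeredCores: CPUs in the current Mesos offer
  // totalCoresAcquired: CPUs already held across all executors
  // maxCores: the spark.cores.max limit
  def coresToAcquire(offeredCores: Int, totalCoresAcquired: Int, maxCores: Int): Int = {
    // Take at most the remaining budget, never more than offered,
    // and never a negative amount if we are already at (or over) the limit.
    math.max(0, math.min(offeredCores, maxCores - totalCoresAcquired))
  }
}
```

With `maxCores = 8` and `totalCoresAcquired = 6`, an offer of 4 CPUs would yield only 2, and once the limit is reached further offers yield 0.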

@SparkQA commented Sep 9, 2015

Test build #42206 has finished for PR 8671 at commit deca1d3.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dragos force-pushed the issue/mesos/fine-grained-maxCores branch from deca1d3 to 6f748c5 on September 9, 2015 at 15:12
@SparkQA commented Sep 9, 2015

Test build #42207 has finished for PR 8671 at commit 6f748c5.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@dragos (Contributor, Author) commented Sep 10, 2015

/cc @tnachen @andrewor14

@dragos (Contributor, Author) commented Sep 15, 2015

@tnachen @andrewor14, would you have some time to take a look at this one?

Review comment from a Contributor:

Will availableCores become negative and then just start adding the actual cores? I think it's probably safer to check whether availableCores < cores and, if so, just return None.
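The reviewer's suggested guard could look roughly like this. It is a sketch under assumed names (`availableCores`, `requiredCores`), not the code from the patch.

```scala
// Hypothetical sketch of the suggested guard: if the offer cannot satisfy
// the required cores, decline it by returning None rather than letting the
// available count go negative.
object OfferGuard {
  def cpusToUse(availableCores: Double, requiredCores: Double): Option[Double] =
    if (availableCores < requiredCores) None
    else Some(requiredCores)
}
```

Returning `Option` makes the "offer rejected" case explicit at the call site instead of relying on the caller to notice a negative remainder.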

Reply from the PR author (Contributor):

OK, I'll refactor this condition.

3 participants