Conversation

@yanboliang
Contributor

Change numPartitions() to getNumPartitions() to be consistent with Scala/Python.
Note: If we cannot make the 1.6 release, this will be a breaking change in 1.7 that we also need to explain in the release notes.

cc @sun-rui @felixcheung @shivaram

@felixcheung
Member

This is actually not exported from SparkR - since it was first integrated in Spark 1.4, SparkR has exported a smaller/different set of APIs.
You can see in https://github.com/apache/spark/blob/master/R/pkg/NAMESPACE

While it is possible to access this with SparkR:::numPartitions(), it has been available since Spark 1.4, so this rename would actually be a breaking change (of an internal API).
So I'd vote for no change in SparkR.

@SparkQA

SparkQA commented Dec 3, 2015

Test build #47121 has finished for PR 10123 at commit 6870073.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen
Member

srowen commented Dec 3, 2015

Yeah, it's unfortunate: until a change 3 days ago, we had 3 different method names across 4 languages for this simple function. Now everything but R uses getNumPartitions. It's not worth breaking something, but it may be worth adding a method and deprecating the old one.

@yanboliang
Contributor Author

@felixcheung Thanks for your comments. I understand it's not an exposed API, but I think providing a consistent function name is necessary, especially if we want to expose the RDD API someday. Another solution is to add getNumPartitions as an alias of numPartitions, which would not be a breaking change, and we could expose getNumPartitions when we expose the RDD API. Looking forward to other members' comments.

@yanboliang
Contributor Author

@srowen I think we agree: provide both getNumPartitions and numPartitions on the SparkR side, and mark numPartitions as deprecated.
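
The alias-plus-deprecation approach discussed here could look roughly like the following R sketch. This is a hypothetical illustration, not the actual patch (the real code lives in R/pkg/R/RDD.R, and the method bodies that call into the JVM are elided):

```r
# Hypothetical sketch of the alias-plus-deprecation pattern.
# The generics and the RDD class are assumed to exist in SparkR.

setGeneric("getNumPartitions", function(x) standardGeneric("getNumPartitions"))
setGeneric("numPartitions", function(x) standardGeneric("numPartitions"))

# New name, consistent with Scala/Python.
setMethod("getNumPartitions", "RDD", function(x) {
  # ... real implementation queries the JVM-side RDD ...
})

# Old name kept working, but steering callers to the new one.
setMethod("numPartitions", "RDD", function(x) {
  .Deprecated("getNumPartitions", old = "numPartitions")
  getNumPartitions(x)
})
```

This way existing callers of numPartitions keep working (with a warning), which is exactly the non-breaking path proposed above.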

@sun-rui
Contributor

sun-rui commented Dec 3, 2015

+1 @yanboliang

@SparkQA

SparkQA commented Dec 3, 2015

Test build #47135 has finished for PR 10123 at commit ed691eb.

  • This patch fails some tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Dec 3, 2015

Test build #47137 has finished for PR 10123 at commit 94c596d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@shivaram
Contributor

shivaram commented Dec 4, 2015

Yeah, this looks fine. Since it wasn't a publicly exposed API, I don't think backwards compatibility matters that much -- but for now I'm +1 on just deprecating numPartitions.

R/pkg/R/RDD.R Outdated
Contributor

Could you add a note that this is deprecated here?

Also, this is the title of the document, so it's better to have it be something like "Gets the number of partitions of an RDD". cc @felixcheung

Member

It is, although we are not generating docs for internal APIs (via @noRd below)...
Still, better to be descriptive.

Member

In fact, if we are deprecating, consider adding

.Deprecated(newFuncSig, old = oldFuncSig)

call to the deprecated version
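
For illustration, base R's `.Deprecated()` emits a one-time deprecation warning naming the replacement; a minimal standalone sketch (the function names here are made up):

```r
# Hypothetical old/new function pair to show .Deprecated() behavior.
new_fn <- function() 42

old_fn <- function() {
  # Warns along the lines of: 'old_fn' is deprecated. Use 'new_fn' instead.
  .Deprecated("new_fn", old = "old_fn")
  new_fn()
}

old_fn()  # still returns the result, after warning
```

The `old =` argument controls the name shown in the warning, which is useful when the deprecated function is called through dispatch.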

@SparkQA

SparkQA commented Dec 4, 2015

Test build #47179 has finished for PR 10123 at commit c17ce80.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@shivaram
Contributor

shivaram commented Dec 6, 2015

LGTM. Merging this. Thanks @yanboliang

asfgit pushed a commit that referenced this pull request Dec 6, 2015
… be consistent with Scala/Python

Change `numPartitions()` to `getNumPartitions()` to be consistent with Scala/Python.
<del>Note: If we can not catch up with 1.6 release, it will be breaking change for 1.7 that we also need to explain in release note.</del>

cc sun-rui felixcheung shivaram

Author: Yanbo Liang <[email protected]>

Closes #10123 from yanboliang/spark-12115.

(cherry picked from commit 6979edf)
Signed-off-by: Shivaram Venkataraman <[email protected]>
@asfgit asfgit closed this in 6979edf Dec 6, 2015
@yanboliang yanboliang deleted the spark-12115 branch December 7, 2015 01:29