
Conversation

@koeninger
Contributor

What changes were proposed in this pull request?

During the sbt unidoc task, skip the streamingKafka010 subproject and filter the Kafka 0.10 classes from the classpath, so that at least the existing Kafka 0.8 docs can be included in unidoc without error.

How was this patch tested?

sbt spark/scalaunidoc:doc | grep -i error
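As a rough sketch of the approach the description outlines, assuming the sbt-unidoc plugin's `unidocProjectFilter` / `unidocAllClasspaths` keys (this is not the actual Spark `SparkBuild.scala`; the project reference and the jar-name pattern are assumptions for illustration):

```scala
// Hypothetical sbt build fragment: exclude the Kafka 0.10 subproject from
// unidoc and drop its jars from the classpath unidoc hands to scaladoc,
// so the Kafka 0.8 sources no longer resolve against 0.10's changed methods.
import sbtunidoc.Plugin.UnidocKeys._

unidocProjectFilter in (ScalaUnidoc, unidoc) :=
  inAnyProject -- inProjects(streamingKafka010)

unidocAllClasspaths in (ScalaUnidoc, unidoc) := {
  (unidocAllClasspaths in (ScalaUnidoc, unidoc)).value.map { cp =>
    // jar-name pattern is a guess; the real build may match differently
    cp.filterNot(_.data.getName.contains("kafka_2.11-0.10"))
  }
}
```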

@srowen
Member

srowen commented Jul 4, 2016

It doesn't work to only keep the 0.10 classes? I was hoping they'd be a superset of 0.8 for these purposes. Can we skip generating javadoc for one of them rather than modify the classpath, if necessary?

@SparkQA

SparkQA commented Jul 4, 2016

Test build #61708 has finished for PR 14041 at commit 5312215.

  • This patch fails from timeout after a configured wait of 250m.
  • This patch merges cleanly.
  • This patch adds no public classes.

@koeninger
Contributor Author

Keeping the 0.10 classes might work if we want to skip publishing 0.8, but trying to skip publishing 0.10 did not work until I modified the classpath for the unidoc task: 0.8 would error, apparently due to changed methods in 0.10. Skipping both might work.

Either way, there's still the question of where and how to publish the API doc for the Kafka connector. The only options I can see are to publish it separately, or to run unidoc with the exclusions, run scaladoc for the excluded project(s), and then script merging them.

@srowen
Member

srowen commented Jul 4, 2016

I suppose publishing one version's docs is better than none. Is it possible to publish 0.10 only?

@shivaram
Contributor

shivaram commented Jul 5, 2016

I just noticed that our nightly docs build has been failing with an error related to Kafka (example [1]). Will this PR fix it, or should we open a new JIRA?

[1] https://amplab.cs.berkeley.edu/jenkins/view/Spark%20Packaging/job/spark-branch-2.0-docs/209/consoleFull

@srowen
Member

srowen commented Jul 5, 2016

Same error, yes.

@tdas
Contributor

tdas commented Jul 5, 2016

Does anyone know why unidoc fails? Is it because unidoc combines both Kafka 0.8 and 0.10 on the compile classpath, and that causes problems?

@tdas
Contributor

tdas commented Jul 5, 2016

In any case, we have to release rc2 soon, and that cannot be done with a broken unidoc. Between 0.8 and 0.10, 0.8 is the higher priority for having docs because its API is stable. So LGTM for this PR; will merge to master and 2.0 as soon as tests pass.

@SparkQA

SparkQA commented Jul 5, 2016

Test build #3161 has finished for PR 14041 at commit 5312215.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@koeninger
Contributor Author

That build error looks Hive-related... I can try merging the latest master.

@tdas yes, unidoc is failing because it throws the dependencies of all subprojects onto one classpath when running scaladoc/javadoc.
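A tiny, self-contained Scala sketch of that classpath problem and the filtering fix; the jar names below are made up for illustration:

```scala
object UnidocClasspathSketch {
  // unidoc concatenates every subproject's dependencies into one classpath,
  // so the Kafka 0.8 and 0.10 jars end up side by side and scaladoc resolves
  // 0.8 sources against 0.10's changed method signatures.
  val merged: Seq[String] = Seq(
    "spark-core_2.11.jar",
    "kafka_2.11-0.8.2.1.jar",
    "kafka_2.11-0.10.0.0.jar", // the offending newer artifact
    "spark-streaming_2.11.jar"
  )

  // The fix sketched in this PR: drop the 0.10 artifacts before invoking
  // scaladoc, so only one Kafka version is visible on the classpath.
  def withoutKafka010(cp: Seq[String]): Seq[String] =
    cp.filterNot(_.contains("kafka_2.11-0.10"))

  def main(args: Array[String]): Unit =
    println(withoutKafka010(merged).mkString(","))
}
```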

@SparkQA

SparkQA commented Jul 5, 2016

Test build #61775 has finished for PR 14041 at commit 2ea2b17.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@koeninger
Contributor Author

koeninger commented Jul 5, 2016

From this, plus a look at Jenkins, it seems master itself is broken.

Looks like maybe #13818.

@tdas
Contributor

tdas commented Jul 5, 2016

Yeah, @zsxwing is looking into it.

@SparkQA

SparkQA commented Jul 5, 2016

Test build #3162 has finished for PR 14041 at commit 2ea2b17.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@tdas
Contributor

tdas commented Jul 5, 2016

Since build passed, merging this to master and 2.0

asfgit pushed a commit that referenced this pull request Jul 5, 2016
## What changes were proposed in this pull request?
During the sbt unidoc task, skip the streamingKafka010 subproject and filter the Kafka 0.10 classes from the classpath, so that at least the existing Kafka 0.8 docs can be included in unidoc without error.

## How was this patch tested?
sbt spark/scalaunidoc:doc | grep -i error

Author: cody koeninger <[email protected]>

Closes #14041 from koeninger/SPARK-16359.

(cherry picked from commit 1f0d021)
Signed-off-by: Tathagata Das <[email protected]>
@asfgit asfgit closed this in 1f0d021 Jul 5, 2016