Conversation

@liancheng
Contributor

JIRA issue: SPARK-1429

Spark shell fails to start after `sbt clean assemble-deps package`. A `*` should be added at line 55 of compute-classpath.sh to make sure both spark-assembly-xxx-deps.jar and spark-hive-assembly-xxx-deps.jar are taken into account.

https://issues.apache.org/jira/browse/SPARK-1429

Spark shell fails to start after "sbt clean assemble-deps package"
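The proposed fix can be illustrated with a small sketch. The jar names, version string, and the exact glob below are assumptions for illustration, not the literal contents of compute-classpath.sh; the point is only that widening the glob with an extra `*` picks up the hive deps jar as well:

```shell
# Illustrative only: names and globs are assumptions, not the real script.
ASSEMBLY_DIR=$(mktemp -d)
touch "$ASSEMBLY_DIR/spark-assembly-1.0.0-deps.jar" \
      "$ASSEMBLY_DIR/spark-hive-assembly-1.0.0-deps.jar"

# A narrow glob like this misses the hive deps jar:
ls "$ASSEMBLY_DIR"/spark-assembly*-deps.jar

# With an extra *, both deps jars are matched for the classpath:
DEPS_JARS=$(ls "$ASSEMBLY_DIR"/spark*-assembly*-deps.jar)
echo "$DEPS_JARS"
```

With only the narrow pattern, the shell's classpath would omit spark-hive-assembly-xxx-deps.jar, which is why the shell fails to start after an assemble-deps build that produces both jars.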
@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished. All automated tests passed.

@AmplabJenkins

All automated tests passed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/13819/

@pwendell
Contributor

pwendell commented Apr 6, 2014

I believe this is fixed in #237 (but we should check). Also this issue has been reported already as:
https://issues.apache.org/jira/browse/SPARK-1309

@aarondav
Contributor

aarondav commented Apr 6, 2014

This check assumes the existence of the -hive- jars, which #237 is removing, so I think we should defer to that one.

@pwendell
Contributor

pwendell commented Apr 7, 2014

@liancheng mind closing this?

@liancheng liancheng closed this Apr 7, 2014
andrewor14 pushed a commit to andrewor14/spark that referenced this pull request Apr 7, 2014
Mllib 16 bugfix

Bug fix: https://spark-project.atlassian.net/browse/MLLIB-16

Hi, I fixed the bug and added a test suite for `GradientDescent`. There are two checks in the test case. First, the final loss must be lower than the initial one. Second, the trend of the loss sequence should be decreasing, i.e., at least 80% of iterations must have a lower loss than the previous iteration.

Thanks!
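The two checks described in the commit message can be sketched as follows. The loss sequence here is a made-up example standing in for the loss history a real `GradientDescent` run would produce (the actual suite is written in Scala against MLlib):

```python
# Hypothetical loss history; a real test would obtain this from GradientDescent.
losses = [10.0, 8.0, 6.5, 6.6, 5.0, 4.2, 3.9, 3.1, 2.9, 2.8]

# Check 1: the final loss must be lower than the initial loss.
assert losses[-1] < losses[0]

# Check 2: the trend is decreasing -- at least 80% of iterations
# improve on the previous iteration's loss.
improvements = sum(1 for prev, cur in zip(losses, losses[1:]) if cur < prev)
assert improvements >= 0.8 * (len(losses) - 1)
```

Allowing up to 20% non-improving steps makes the check robust to the occasional loss increase that stochastic gradient descent can produce, while still rejecting a run that fails to converge.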
@liancheng liancheng deleted the assemble-deps branch July 3, 2014 21:28
mccheah pushed a commit to mccheah/spark that referenced this pull request Oct 3, 2018
bzhaoopenstack pushed a commit to bzhaoopenstack/spark that referenced this pull request Sep 11, 2019
Remove duplicate job define "cloud-provider-openstack-test"
turboFei added a commit to turboFei/spark that referenced this pull request Nov 6, 2025
… for 2.3.1 compatibility (apache#337)

* [HADP-45041] Allow specifying location in temporary table for 2.3.1 compatibility (apache#103)

Allow specifying a location in a temporary table, restoring the related part removed in https://github.corp.ebay.com/hadoop/spark-longwing3/pull/21.

This keeps compatibility with 2.3.1, where users load data with a temporary table.

No.

Existing UT updated.

Co-authored-by: tianlzhang <[email protected]>
Co-authored-by: Wang, Fei <[email protected]>