
Conversation

@kayousterhout
Contributor

This commit moves developer-specific information from the release-
specific documentation in this repo to the developer tools page on
the main Spark website. This commit relies on this PR on the
Spark website: apache/spark-website#33.

@SparkQA

SparkQA commented Feb 21, 2017

Test build #73237 has finished for PR 17018 at commit ee666c6.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Feb 23, 2017

Test build #73356 has finished for PR 17018 at commit c8a094a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • public class TransportChannelHandler extends ChannelInboundHandlerAdapter
  • trait Logging
  • public class JavaLinearSVCExample
  • class LinearSVCWrapperWriter(instance: LinearSVCWrapper) extends MLWriter
  • class LinearSVCWrapperReader extends MLReader[LinearSVCWrapper]
  • class LSHParams(Params):
  • class LSHModel(JavaModel):
  • class BucketedRandomProjectionLSH(JavaEstimator, LSHParams, HasInputCol, HasOutputCol, HasSeed,
  • class BucketedRandomProjectionLSHModel(LSHModel, JavaMLReadable, JavaMLWritable):
  • class MinHashLSH(JavaEstimator, LSHParams, HasInputCol, HasOutputCol, HasSeed,
  • class MinHashLSHModel(LSHModel, JavaMLReadable, JavaMLWritable):
  • class YarnProxyRedirectFilter extends Filter with Logging
  • class NoSuchDatabaseException(val db: String) extends AnalysisException(s"Database '$db' not found")
  • class ResolveBroadcastHints(conf: CatalystConf) extends Rule[LogicalPlan]
  • case class UnresolvedRelation(tableIdentifier: TableIdentifier) extends LeafNode
  • case class JsonToStruct(
  • case class StructToJson(
  • case class Hint(name: String, parameters: Seq[String], child: LogicalPlan) extends UnaryNode
  • case class InnerOuterEstimation(conf: CatalystConf, join: Join) extends Logging
  • case class LeftSemiAntiEstimation(conf: CatalystConf, join: Join)
  • case class NumericRange(min: JDecimal, max: JDecimal) extends Range
  • sealed abstract class HiveStringType extends AtomicType
  • case class CharType(length: Int) extends HiveStringType
  • case class VarcharType(length: Int) extends HiveStringType
  • case class StreamingExplainCommand(
  • case class SaveIntoDataSourceCommand(
  • abstract class JsonDataSource[T] extends Serializable
  • class FileStreamOptions(parameters: CaseInsensitiveMap[String]) extends Logging

@kayousterhout
Contributor Author

Ignored the AppVeyor failure based on advice from @shivaram that it was broken by the SQL folks (and it does seem clearly unrelated to this PR), and merged this into master. Thanks for the review, @srowen.

@asfgit asfgit closed this in f87a6a5 Feb 23, 2017
@shivaram
Contributor

Just to clarify, what I meant is that a similar error due to Hadoop / SQL issues was fixed by @HyukjinKwon in #16927 - I'll see if this happens again and file an issue if it does.

@shivaram
Contributor

Hmm, I think this specific one timed out, so it might not be worth investigating: https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark/build/824-master/messages

@HyukjinKwon
Member

Let me run a build in my account to double-check, and I'll create a JIRA cc'ing you if it is broken.

@HyukjinKwon
Member

Thanks for cc'ing me.

@HyukjinKwon
Member

HyukjinKwon commented Feb 23, 2017

Ah, this one seems to have timed out because it exceeded AppVeyor's 1-hour limit on execution time. It seems the R tests, including the build, can now occasionally take longer than an hour.

This limit can be increased a bit by asking. My account has an hour and a half (I asked the AppVeyor team when I tried to run some Scala/Java tests). Let me try asking them to increase it first, if you all think that is fine.

If it still becomes a problem even with the increased time, let me try splitting the tests as I did for the Scala/Java tests in my account, e.g. https://ci.appveyor.com/project/spark-test/spark/build/632-20170219-windows-test
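
For illustration only (this is not Spark's actual AppVeyor configuration), a minimal sketch of the bucketing idea behind splitting a test run across CI jobs; the environment variable names and the test directory are assumptions made for this sketch:

```python
# Illustration only: split a set of test files across CI jobs so each job stays
# under a per-job time limit. NOT Spark's actual AppVeyor setup; the environment
# variable names and test directory below are assumptions for this sketch.
import hashlib
import os
from pathlib import Path

# Hypothetical variables a CI build matrix might provide.
JOB_INDEX = int(os.environ.get("TEST_JOB_INDEX", "0"))
JOB_COUNT = int(os.environ.get("TEST_JOB_COUNT", "2"))

def bucket_for(path: Path) -> int:
    """Deterministically map a test file to one job, so every file runs exactly once."""
    digest = hashlib.md5(path.name.encode("utf-8")).hexdigest()
    return int(digest, 16) % JOB_COUNT

# Assumed location of the SparkR testthat files, for the purpose of this sketch.
test_dir = Path("R/pkg/inst/tests/testthat")
selected = sorted(p for p in test_dir.glob("test_*.R") if bucket_for(p) == JOB_INDEX)

print(f"Job {JOB_INDEX + 1}/{JOB_COUNT} runs {len(selected)} test file(s):")
for p in selected:
    print("  ", p)
```

Each CI job would set a different TEST_JOB_INDEX, and the deterministic hash keeps the split stable across runs.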

@shivaram
Contributor

Yeah, asking them for 1.5 hours seems like a good first step. We can also see what fraction of the time goes to the build vs. the tests. I guess we can't make the builds faster without some effort, but we could look at making the tests faster or running a subset of them (for example, the RDD tests in SparkR are probably not high priority as the API is private, etc.).

@HyukjinKwon
Member

HyukjinKwon commented Feb 23, 2017

I see. Sure. Just as a side note, if you download the log, it prints the times.

In this case, it was very close, judging from the last log :).

[00:59:47] DONE ===========================================================================
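
As a rough, unofficial sketch of the "build vs. tests" breakdown @shivaram mentioned, the elapsed [HH:MM:SS] prefixes in a downloaded log (as in the DONE line above) can be compared at phase boundaries. The marker strings below are hypothetical placeholders, not anything the real log is guaranteed to print:

```python
# Rough sketch (not an official tool): estimate how long the build and tests took
# from a downloaded CI log, assuming each relevant line starts with an elapsed
# [HH:MM:SS] timestamp like the DONE line above. The marker strings passed in
# are hypothetical placeholders for whatever the real log prints at phase ends.
import re
from datetime import timedelta
from typing import Optional, Tuple

TIMESTAMP = re.compile(r"^\[(\d{2}):(\d{2}):(\d{2})\]")

def elapsed(line: str) -> Optional[timedelta]:
    """Return the elapsed time printed at the start of a log line, if any."""
    m = TIMESTAMP.match(line)
    if not m:
        return None
    h, mi, s = (int(g) for g in m.groups())
    return timedelta(hours=h, minutes=mi, seconds=s)

def phase_times(log_path: str, build_marker: str, done_marker: str) -> Tuple[Optional[timedelta], Optional[timedelta]]:
    """Find the elapsed time at the end of the build and at the end of the whole run."""
    build_done = run_done = None
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            t = elapsed(line)
            if t is None:
                continue
            if build_marker in line:
                build_done = t
            elif done_marker in line:
                run_done = t
    return build_done, run_done

# Hypothetical usage:
# build_t, total_t = phase_times("appveyor.log", "Build completed", "DONE ====")
# if build_t and total_t:
#     print("build:", build_t, " tests:", total_t - build_t)
```

Numbers like these would make it easier to tell whether the 90-minute budget is being eaten by the build or by the R tests themselves.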

@HyukjinKwon
Member

I have confirmation that it has been increased to 90 minutes now.

@shivaram
Contributor

Thanks - I'll keep an eye out to see if the timeout errors are gone.

Yunni pushed a commit to Yunni/spark that referenced this pull request Feb 27, 2017
This commit moves developer-specific information from the release-
specific documentation in this repo to the developer tools page on
the main Spark website. This commit relies on this PR on the
Spark website: apache/spark-website#33.

Author: Kay Ousterhout <[email protected]>

Closes apache#17018 from kayousterhout/SPARK-19684.
@kayousterhout kayousterhout deleted the SPARK-19684 branch April 11, 2017 22:03