diff --git a/dev/create-release/release-build.sh b/dev/create-release/release-build.sh
index 96f9b5714ebb8..4487cd39f9cb2 100755
--- a/dev/create-release/release-build.sh
+++ b/dev/create-release/release-build.sh
@@ -81,7 +81,7 @@ BASE_DIR=$(pwd)
 MVN="build/mvn --force"
 PUBLISH_PROFILES="-Pmesos -Pyarn -Phive -Phive-thriftserver -Phadoop-2.2"
-PUBLISH_PROFILES="$PUBLISH_PROFILES -Pspark-ganglia-lgpl -Pkinesis-asl"
+PUBLISH_PROFILES="$PUBLISH_PROFILES -Pspark-ganglia-lgpl"
 
 rm -rf spark
 git clone https://git-wip-us.apache.org/repos/asf/spark.git
diff --git a/docs/streaming-kinesis-integration.md b/docs/streaming-kinesis-integration.md
index 96198ddf537b6..44abe5121a960 100644
--- a/docs/streaming-kinesis-integration.md
+++ b/docs/streaming-kinesis-integration.md
@@ -119,7 +119,9 @@ A Kinesis stream can be set up at one of the valid Kinesis endpoints with 1 or m
 
 3. **Deploying:** As with any Spark applications, `spark-submit` is used to launch your application. However, the details are slightly different for Scala/Java applications and Python applications.
 
-    For Scala and Java applications, if you are using SBT or Maven for project management, then package `spark-streaming-kinesis-asl_{{site.SCALA_BINARY_VERSION}}` and its dependencies into the application JAR. Make sure `spark-core_{{site.SCALA_BINARY_VERSION}}` and `spark-streaming_{{site.SCALA_BINARY_VERSION}}` are marked as `provided` dependencies as those are already present in a Spark installation. Then use `spark-submit` to launch your application (see [Deploying section](streaming-programming-guide.html#deploying-applications) in the main programming guide).
+    You'll first need to perform a custom build of Spark. **Note that by linking to this library, you will include [ASL](https://aws.amazon.com/asl/)-licensed code in your application.** Download Spark source and follow the [instructions](building-spark.html) to build Spark with profile *-Pkinesis-asl*.
+
+    For Scala and Java applications, if you are using SBT or Maven for project management, you'll need to package `spark-streaming-kinesis-asl_{{site.SCALA_BINARY_VERSION}}` and its dependencies into the application JAR. Make sure `spark-core_{{site.SCALA_BINARY_VERSION}}` and `spark-streaming_{{site.SCALA_BINARY_VERSION}}` are marked as `provided` dependencies as those are already present in a Spark installation. Then use `spark-submit` to launch your application (see [Deploying section](streaming-programming-guide.html#deploying-applications) in the main programming guide).
 
     For Python applications which lack SBT/Maven project management, `spark-streaming-kinesis-asl_{{site.SCALA_BINARY_VERSION}}` and its dependencies can be directly added to `spark-submit` using `--packages` (see [Application Submission Guide](submitting-applications.html)). That is,
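
As context for the "custom build of Spark" step the doc change introduces, here is a minimal sketch of building Spark from source with the `-Pkinesis-asl` profile. The clone URL and profile name come from the diff above; the extra Maven flags (`-DskipTests clean package`) are assumptions, so defer to the linked building-spark.html instructions for the authoritative command.

```sh
# Sketch: build Spark from source with the Kinesis ASL profile enabled.
# Flags other than -Pkinesis-asl are assumptions; see building-spark.html.
git clone https://git-wip-us.apache.org/repos/asf/spark.git
cd spark
./build/mvn -Pkinesis-asl -DskipTests clean package
```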
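For the Python path described in the last paragraph of the doc hunk, a submission might look like the sketch below. Only the `--packages` mechanism and the `spark-streaming-kinesis-asl` artifact name are taken from the doc; the group ID, Scala binary version, Spark version, and the script name `my_kinesis_app.py` are illustrative placeholders.

```sh
# Sketch: run a Python Kinesis app, pulling the ASL artifact via --packages.
# Replace the Scala binary version (2.11) and Spark version (2.0.1) with the
# versions matching your Spark installation; my_kinesis_app.py is hypothetical.
spark-submit \
  --packages org.apache.spark:spark-streaming-kinesis-asl_2.11:2.0.1 \
  my_kinesis_app.py
```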