Conversation

@foxish (Member) commented Jan 4, 2018

runner.sh is useful in general for anyone looking to test against an arbitrary k8s cluster.
e2e-prow.sh has some specific setup for the prow testing environment.
e2e-minikube.sh has setup for local minikube environments.

Tested with:

./e2e/runner.sh -m https://x.y.z.w -i docker.io/foxish

cc @liyinan926 @kimoonkim
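
(A minimal sketch of the kind of option parsing behind the -m and -i flags above; this is hypothetical and only illustrates the interface, not the actual contents of runner.sh.)

```bash
#!/usr/bin/env bash
# Hypothetical flag handling matching the invocation above; the real
# runner.sh may use different flag names and defaults.
MASTER=""
IMAGE_REPO="docker.io/kubespark"

while getopts "m:i:" opt; do
  case "$opt" in
    m) MASTER="$OPTARG" ;;      # k8s master URL, e.g. https://x.y.z.w
    i) IMAGE_REPO="$OPTARG" ;;  # image repo to build against, e.g. docker.io/foxish
    *) echo "usage: $0 -m <k8s-master-url> -i <image-repo>" >&2; exit 1 ;;
  esac
done

echo "Running e2e tests against $MASTER with images from $IMAGE_REPO"
```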

@foxish (Member, Author) commented Jan 5, 2018

CI is now working against GKE (https://k8s-testgrid.appspot.com/sig-big-data#spark-k8s-periodic), using the changes in this branch.
I'll point it at master after this merges.

Looks like there are a couple of failures in the integration test runs that need to be addressed as well.

@foxish force-pushed the cloud-testing-scripts branch from 4f7022f to 1f9ebc3 on January 5, 2018 07:50
@foxish (Member, Author) commented Jan 5, 2018

@kimoonkim, the integration test failure looks like a minikube issue on the Jenkins node.

@kimoonkim (Member)

Yes, there was a minikube issue. I don't know the root cause, but it seems to have disappeared.

@kimoonkim (Member)

rerun integration tests please

@kimoonkim (Member) left a review

LGTM overall. I like the change. A few minor suggestions below. PTAL.

}

### Basic Validation ###
if [ ! -d "integration-test" ]; then
@kimoonkim (Member):

This is a good check. I like it.
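
(The hunk above is truncated; a minimal sketch of how such a guard is typically completed, with the message and exit code being assumptions:)

```bash
### Basic Validation ###
if [ ! -d "integration-test" ]; then
  # Assumed completion of the truncated check: fail fast when the script is
  # not invoked from the repository root.
  echo "Please run this script from the repository root (integration-test/ not found)." >&2
  exit 1
fi
```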

e2e/e2e-cloud.sh Outdated
tag=$(git rev-parse HEAD | cut -c -6)
echo "Spark distribution built at SHA $tag"

cd dist && ./sbin/build-push-docker-images.sh -r $IMAGE_REPO -t $tag build
@kimoonkim (Member):

I like this separation of docker image building and running integration tests.
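
(For a cloud run, the push step reuses the same tag; build and push are separate arguments to the script, as discussed further down in this review. A sketch of how the two steps pair up, not the actual diff:)

```bash
# Build the images locally, then push them so the remote cluster can pull
# $IMAGE_REPO/<image>:$tag. "build" and "push" are the script's own arguments.
cd dist && ./sbin/build-push-docker-images.sh -r $IMAGE_REPO -t $tag build
./sbin/build-push-docker-images.sh -r $IMAGE_REPO -t $tag push
```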

# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

@kimoonkim (Member):

Add a one-liner comment for what this script does?

# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

@kimoonkim (Member):

Add a comment describing what this script does and what the requirements are for running it?

@foxish (Member, Author):

Done


# Install basic dependencies
echo "deb http://http.debian.net/debian jessie-backports main" >> /etc/apt/sources.list
apt-get update && apt-get install -y curl wget git tar
@kimoonkim (Member):

Hmm. So this works only for Debian? Maybe mention this as a requirement in the file-level comment?

@foxish (Member, Author):

Mentioned above - this is specific to the k8s testing environment used by the Kubernetes project. It's not intended to be used elsewhere.

e2e/e2e-prow.sh Outdated
root=$(pwd)
master=$(kubectl cluster-info | head -n 1 | grep -oE "https?://[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}(:[0-9]+)?")
repo="https://github.com/apache/spark"
image_repo="gcr.io/spark-testing-191023"
@kimoonkim (Member):

What is this magic number 191023? Is it ok to hard code?
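
(Aside on the master= line in this hunk, assuming the typical kubectl cluster-info output of that era:)

```bash
# kubectl cluster-info prints a first line such as:
#   Kubernetes master is running at https://35.192.0.1
# head -n 1 keeps that line, and grep -oE extracts only the
# http(s)://<ipv4>[:port] portion, which becomes the -m argument below.
master=$(kubectl cluster-info | head -n 1 \
  | grep -oE "https?://[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}(:[0-9]+)?")
```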


cd "$(dirname "$0")"/../
./e2e/e2e-cloud.sh -m $master -r $repo -i $image_repo
ls -1 ./integration-test/target/surefire-reports/*.xml | cat -n | while read n f; do cp "$f" "/workspace/_artifacts/junit_0$n.xml"; done
@kimoonkim (Member):

Add a comment on why we create the junit_ files?
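
(Assumed rationale, based on standard Prow conventions rather than anything stated in this PR: Prow collects junit_*.xml files from the job's artifacts directory and renders per-test results on testgrid. A commented sketch of the line above:)

```bash
# Prow picks up junit_*.xml files from /workspace/_artifacts and surfaces
# per-test results on testgrid, so copy each surefire report over under a
# junit_0<n>.xml name.
ls -1 ./integration-test/target/surefire-reports/*.xml | cat -n | \
  while read n f; do cp "$f" "/workspace/_artifacts/junit_0$n.xml"; done
```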

@foxish (Member, Author) commented Jan 5, 2018

@kimoonkim, added comments and a new script to run for minikube. Should just work out of the box. PTAL.

e2e/runner.sh Outdated
### Set sensible defaults ###
REPO="https://github.com/apache/spark"
IMAGE_REPO="docker.io/kubespark"
DEPLOY_MODE="cloud"
@kimoonkim (Member):

I would assume minikube is a better default, since it makes fewer assumptions about the environment than GCE.

@foxish (Member, Author):

sgtm, will flip that.

@foxish (Member, Author):

Done

fi
else
# -m option for minikube.
cd dist && ./sbin/build-push-docker-images.sh -m -r $IMAGE_REPO -t $tag build
@kimoonkim (Member):

Hmm. Where do this build dir and the build script come from? What if they don't exist?

@foxish (Member, Author), Jan 5, 2018:

build and push are arguments that the script takes.

@foxish (Member, Author):

They need not be valid directories. The build script is part of the upstream spark distribution, and exists on our fork as well.

@kimoonkim (Member):

I see. This is the spark/dist dir inside a Spark repo.
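
(If one wanted to guard against the concern raised above, a check along these lines could precede the cd; this is only a sketch and not part of the PR:)

```bash
# Fail fast with a clear message if the Spark distribution has not been
# built yet, instead of letting the cd/build invocation fail obscurely.
if [ ! -x "dist/sbin/build-push-docker-images.sh" ]; then
  echo "dist/sbin/build-push-docker-images.sh not found; build the Spark distribution first." >&2
  exit 1
fi
```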

@kimoonkim (Member)

LGTM and I approved just now. Thanks for writing this change!

@foxish (Member, Author) commented Jan 5, 2018

Thanks for reviewing. Merging now

@foxish merged commit 1e5b290 into master on Jan 5, 2018
@foxish deleted the cloud-testing-scripts branch on January 5, 2018 21:19
@foxish restored the cloud-testing-scripts branch on January 5, 2018 21:19
@foxish (Member, Author) commented Jan 5, 2018

(not deleting the branch for now because the cloud jobs are still using it)
