From 5643ab9e652b3d20c335a6d4e7545da0f115d774 Mon Sep 17 00:00:00 2001
From: Yuming Wang
Date: Mon, 12 Jun 2017 10:54:11 +0800
Subject: [PATCH 1/8] Improve docs

---
 docs/building-spark.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/building-spark.md b/docs/building-spark.md
index 0f551bc66b8c9..8a0589909ac2e 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -219,7 +219,7 @@ The run-tests script also can be limited to a specific Python version or a speci
 ## Running R Tests
 
 To run the SparkR tests you will need to install the R package `testthat`
-(run `install.packages(testthat)` from R shell). You can run just the SparkR tests using
+(run `install.packages("testthat")` from R shell). You can run just the SparkR tests using
 the command:
 
     ./R/run-tests.sh

From d715ae89fba24bb56a2d2ca7fd0e0c1d438851af Mon Sep 17 00:00:00 2001
From: Yuming Wang
Date: Mon, 12 Jun 2017 12:53:31 +0800
Subject: [PATCH 2/8] Be consistent with examples-unit-tests and more cleaner

---
 docs/building-spark.md | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/docs/building-spark.md b/docs/building-spark.md
index 8a0589909ac2e..30d737498107d 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -218,9 +218,11 @@ The run-tests script also can be limited to a specific Python version or a speci
 
 ## Running R Tests
 
-To run the SparkR tests you will need to install the R package `testthat`
-(run `install.packages("testthat")` from R shell). You can run just the SparkR tests using
-the command:
+To run the SparkR tests you will need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first:
+
+    R -e 'install.packages("testthat", repos="http://cran.us.r-project.org")'
+
+You can run just the SparkR tests using the command:
 
     ./R/run-tests.sh

From 926137bc1e16b0d25268bd4eb31f862f783298cb Mon Sep 17 00:00:00 2001
From: Yuming Wang
Date: Tue, 13 Jun 2017 15:12:32 +0800
Subject: [PATCH 3/8] Add more packages

---
 R/README.md            | 4 ++--
 docs/building-spark.md | 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/R/README.md b/R/README.md
index 4c40c5963db70..3084cb38a8985 100644
--- a/R/README.md
+++ b/R/README.md
@@ -66,9 +66,9 @@ To run one of them, use `./bin/spark-submit `. For example:
 ```bash
 ./bin/spark-submit examples/src/main/r/dataframe.R
 ```
-You can also run the unit tests for SparkR by running. You need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first:
+You can also run the unit tests for SparkR by running. You need to install the [knitr](https://cran.r-project.org/package=knitr), [rmarkdown](https://cran.r-project.org/package=rmarkdown), [testthat](https://cran.r-project.org/package=testthat), [e1071](https://cran.r-project.org/package=e1071) and [survival](https://cran.r-project.org/package=survival) packages first:
 ```bash
-R -e 'install.packages("testthat", repos="http://cran.us.r-project.org")'
+R -e 'install.packages(c("knitr", "rmarkdown", "testthat", "e1071", "survival"), repos="http://cran.us.r-project.org")'
 ./R/run-tests.sh
 ```

diff --git a/docs/building-spark.md b/docs/building-spark.md
index 30d737498107d..c44fab4408c8b 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -218,9 +218,9 @@ The run-tests script also can be limited to a specific Python version or a speci
 
 ## Running R Tests
 
-To run the SparkR tests you will need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first:
+To run the SparkR tests you will need to install the [knitr](https://cran.r-project.org/package=knitr), [rmarkdown](https://cran.r-project.org/package=rmarkdown), [testthat](https://cran.r-project.org/package=testthat), [e1071](https://cran.r-project.org/package=e1071) and [survival](https://cran.r-project.org/package=survival) packages first:
 
-    R -e 'install.packages("testthat", repos="http://cran.us.r-project.org")'
+    R -e 'install.packages(c("knitr", "rmarkdown", "testthat", "e1071", "survival"), repos="http://cran.us.r-project.org")'
 
 You can run just the SparkR tests using the command:

From a2fd28a545e9ea0158345704e18582a25034bbac Mon Sep 17 00:00:00 2001
From: Yuming Wang
Date: Tue, 13 Jun 2017 23:54:11 +0800
Subject: [PATCH 4/8] Remove duplicate info

---
 R/README.md | 6 +-----
 1 file changed, 1 insertion(+), 5 deletions(-)

diff --git a/R/README.md b/R/README.md
index 3084cb38a8985..f5cf32160ca53 100644
--- a/R/README.md
+++ b/R/README.md
@@ -66,11 +66,7 @@ To run one of them, use `./bin/spark-submit `. For example:
 ```bash
 ./bin/spark-submit examples/src/main/r/dataframe.R
 ```
-You can also run the unit tests for SparkR by running. You need to install the [knitr](https://cran.r-project.org/package=knitr), [rmarkdown](https://cran.r-project.org/package=rmarkdown), [testthat](https://cran.r-project.org/package=testthat), [e1071](https://cran.r-project.org/package=e1071) and [survival](https://cran.r-project.org/package=survival) packages first:
-```bash
-R -e 'install.packages(c("knitr", "rmarkdown", "testthat", "e1071", "survival"), repos="http://cran.us.r-project.org")'
-./R/run-tests.sh
-```
+You can run the unit tests following [running-r-tests](http://spark.apache.org/docs/latest/building-spark.html#running-r-tests).
 
 ### Running on YARN

From 0da2f5b5b3ac122a2ba5b295b4ae8c4f4af02758 Mon Sep 17 00:00:00 2001
From: Yuming Wang
Date: Wed, 14 Jun 2017 08:06:06 +0800
Subject: [PATCH 5/8] Update to "You can run R unit tests by following the instructions under Running R Tests"

---
 R/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/R/README.md b/R/README.md
index f5cf32160ca53..1152b1e8e5f9f 100644
--- a/R/README.md
+++ b/R/README.md
@@ -66,7 +66,7 @@ To run one of them, use `./bin/spark-submit `. For example:
 ```bash
 ./bin/spark-submit examples/src/main/r/dataframe.R
 ```
-You can run the unit tests following [running-r-tests](http://spark.apache.org/docs/latest/building-spark.html#running-r-tests).
+You can run R unit tests by following the instructions under [Running R Tests](http://spark.apache.org/docs/latest/building-spark.html#running-r-tests).
 
 ### Running on YARN

From 4bc320e87707b2fcc19586f3436ce6000fad7e88 Mon Sep 17 00:00:00 2001
From: Yuming Wang
Date: Wed, 14 Jun 2017 11:05:54 +0800
Subject: [PATCH 6/8] Update WINDOWS doc

---
 R/WINDOWS.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/R/WINDOWS.md b/R/WINDOWS.md
index 9ca7e58e20cd2..5d9d6b7f9580b 100644
--- a/R/WINDOWS.md
+++ b/R/WINDOWS.md
@@ -34,10 +34,10 @@ To run the SparkR unit tests on Windows, the following steps are required —ass
 
 4. Set the environment variable `HADOOP_HOME` to the full path to the newly created `hadoop` directory.
 
-5. Run unit tests for SparkR by running the command below. You need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first:
+5. Run unit tests for SparkR by running the command below. You need to install the [knitr](https://cran.r-project.org/package=knitr), [rmarkdown](https://cran.r-project.org/package=rmarkdown), [testthat](https://cran.r-project.org/package=testthat), [e1071](https://cran.r-project.org/package=e1071) and [survival](https://cran.r-project.org/package=survival) packages first:
 
     ```
-    R -e "install.packages('testthat', repos='http://cran.us.r-project.org')"
+    R -e "install.packages(c('knitr', 'rmarkdown', 'testthat', 'e1071', 'survival'), repos='http://cran.us.r-project.org')"
     .\bin\spark-submit2.cmd --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
     ```

From 75cf0d8d3093917f66ccbf5ee876874891109187 Mon Sep 17 00:00:00 2001
From: Yuming Wang
Date: Wed, 14 Jun 2017 17:13:26 +0800
Subject: [PATCH 7/8] Link WINDOWS dependency packages.

---
 R/WINDOWS.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/R/WINDOWS.md b/R/WINDOWS.md
index 5d9d6b7f9580b..69f68cc7d05eb 100644
--- a/R/WINDOWS.md
+++ b/R/WINDOWS.md
@@ -34,7 +34,7 @@ To run the SparkR unit tests on Windows, the following steps are required —ass
 
 4. Set the environment variable `HADOOP_HOME` to the full path to the newly created `hadoop` directory.
 
-5. Run unit tests for SparkR by running the command below. You need to install the [knitr](https://cran.r-project.org/package=knitr), [rmarkdown](https://cran.r-project.org/package=rmarkdown), [testthat](https://cran.r-project.org/package=testthat), [e1071](https://cran.r-project.org/package=e1071) and [survival](https://cran.r-project.org/package=survival) packages first:
+5. Run unit tests for SparkR by running the command below. You need to install the [dependency packages](http://spark.apache.org/docs/latest/building-spark.html#running-r-tests) first:
 
     ```
    R -e "install.packages(c('knitr', 'rmarkdown', 'testthat', 'e1071', 'survival'), repos='http://cran.us.r-project.org')"
     .\bin\spark-submit2.cmd --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
     ```

From ff6415fecfd1542eaf4eb35e1b2ba418da856761 Mon Sep 17 00:00:00 2001
From: Yuming Wang
Date: Thu, 15 Jun 2017 17:36:53 +0800
Subject: [PATCH 8/8] Unified

---
 R/WINDOWS.md           | 3 +--
 docs/building-spark.md | 2 +-
 2 files changed, 2 insertions(+), 3 deletions(-)

diff --git a/R/WINDOWS.md b/R/WINDOWS.md
index 69f68cc7d05eb..124bc631be9cd 100644
--- a/R/WINDOWS.md
+++ b/R/WINDOWS.md
@@ -34,10 +34,9 @@ To run the SparkR unit tests on Windows, the following steps are required —ass
 
 4. Set the environment variable `HADOOP_HOME` to the full path to the newly created `hadoop` directory.
 
-5. Run unit tests for SparkR by running the command below. You need to install the [dependency packages](http://spark.apache.org/docs/latest/building-spark.html#running-r-tests) first:
+5. Run unit tests for SparkR by running the command below. You need to install the needed packages following the instructions under [Running R Tests](http://spark.apache.org/docs/latest/building-spark.html#running-r-tests) first:
 
     ```
-    R -e "install.packages(c('knitr', 'rmarkdown', 'testthat', 'e1071', 'survival'), repos='http://cran.us.r-project.org')"
     .\bin\spark-submit2.cmd --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
     ```

diff --git a/docs/building-spark.md b/docs/building-spark.md
index c44fab4408c8b..777635a64f83c 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -220,7 +220,7 @@ The run-tests script also can be limited to a specific Python version or a speci
 To run the SparkR tests you will need to install the [knitr](https://cran.r-project.org/package=knitr), [rmarkdown](https://cran.r-project.org/package=rmarkdown), [testthat](https://cran.r-project.org/package=testthat), [e1071](https://cran.r-project.org/package=e1071) and [survival](https://cran.r-project.org/package=survival) packages first:
 
-    R -e 'install.packages(c("knitr", "rmarkdown", "testthat", "e1071", "survival"), repos="http://cran.us.r-project.org")'
+    R -e "install.packages(c('knitr', 'rmarkdown', 'testthat', 'e1071', 'survival'), repos='http://cran.us.r-project.org')"
 
 You can run just the SparkR tests using the command:
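
For anyone trying the final documented workflow locally, a minimal R sketch is below (it is not part of the patches). The package list and CRAN mirror are taken from the patches above; the skip-already-installed check is an added convenience, not something the docs prescribe:

```r
# Sketch: install the SparkR test dependencies named in the final docs
# (knitr, rmarkdown, testthat, e1071, survival), skipping any already present.
pkgs <- c("knitr", "rmarkdown", "testthat", "e1071", "survival")
missing <- pkgs[!pkgs %in% rownames(installed.packages())]
if (length(missing) > 0) {
  install.packages(missing, repos = "http://cran.us.r-project.org")
}
```

With the packages installed, the tests are run as the updated docs describe: `./R/run-tests.sh` on Linux/macOS, or `.\bin\spark-submit2.cmd --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R` on Windows.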