
Commit c76457c

felixcheung authored and shivaram committed
[SPARK-10903][SPARKR] R - Simplify SQLContext method signatures and use a singleton
Eliminate the need to pass sqlContext to each method, since it is a singleton and we don't want to support multiple contexts in an R session. Changes are done in a backward-compatible way, with a deprecation warning added. Method signatures for the S3 methods are defined in a concise, clean way so that in the next release the deprecated signatures can be removed easily and cleanly (just delete a few lines per method). Custom method dispatch is implemented to allow for multiple JVM reference types that are all 'jobj' in R, and to avoid having to add 30 new exports.

Author: felixcheung <[email protected]>

Closes #9192 from felixcheung/rsqlcontext.
1 parent 6d506c9 commit c76457c

File tree

6 files changed: +450 −297 lines


R/pkg/R/DataFrame.R

Lines changed: 4 additions & 16 deletions
@@ -2213,13 +2213,7 @@ setMethod("write.df",
           signature(df = "SparkDataFrame", path = "character"),
           function(df, path, source = NULL, mode = "error", ...){
             if (is.null(source)) {
-              if (exists(".sparkRSQLsc", envir = .sparkREnv)) {
-                sqlContext <- get(".sparkRSQLsc", envir = .sparkREnv)
-              } else if (exists(".sparkRHivesc", envir = .sparkREnv)) {
-                sqlContext <- get(".sparkRHivesc", envir = .sparkREnv)
-              } else {
-                stop("sparkRHive or sparkRSQL context has to be specified")
-              }
+              sqlContext <- getSqlContext()
               source <- callJMethod(sqlContext, "getConf", "spark.sql.sources.default",
                                     "org.apache.spark.sql.parquet")
             }
@@ -2281,15 +2275,9 @@ setMethod("saveAsTable",
           signature(df = "SparkDataFrame", tableName = "character"),
           function(df, tableName, source = NULL, mode="error", ...){
             if (is.null(source)) {
-              if (exists(".sparkRSQLsc", envir = .sparkREnv)) {
-                sqlContext <- get(".sparkRSQLsc", envir = .sparkREnv)
-              } else if (exists(".sparkRHivesc", envir = .sparkREnv)) {
-                sqlContext <- get(".sparkRHivesc", envir = .sparkREnv)
-              } else {
-                stop("sparkRHive or sparkRSQL context has to be specified")
-              }
-              source <- callJMethod(sqlContext, "getConf", "spark.sql.sources.default",
-                                    "org.apache.spark.sql.parquet")
+              sqlContext <- getSqlContext()
+              source <- callJMethod(sqlContext, "getConf", "spark.sql.sources.default",
+                                    "org.apache.spark.sql.parquet")
             }
             jmode <- convertToJSaveMode(mode)
             options <- varargsToEnv(...)
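The diff replaces the duplicated inline context lookup with a single `getSqlContext()` helper, whose definition lives elsewhere in this commit and is not shown in this excerpt. Based purely on the removed lines, a minimal sketch of what such a helper would look like (the real implementation may differ):

```r
# Hypothetical sketch, reconstructed from the code this commit removes;
# the actual getSqlContext() defined in the commit may differ.
getSqlContext <- function() {
  if (exists(".sparkRSQLsc", envir = .sparkREnv)) {
    # plain SQL context registered by sparkRSQL.init()
    get(".sparkRSQLsc", envir = .sparkREnv)
  } else if (exists(".sparkRHivesc", envir = .sparkREnv)) {
    # Hive-enabled context registered by sparkRHive.init()
    get(".sparkRHivesc", envir = .sparkREnv)
  } else {
    stop("sparkRHive or sparkRSQL context has to be specified")
  }
}
```

Centralizing the lookup means callers such as `write.df` and `saveAsTable` no longer need to know which flavor of context was initialized, which is what makes the singleton approach and the later cleanup ("just delete a few lines per method") possible.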
