Conversation

@rxin rxin commented Dec 5, 2016

What changes were proposed in this pull request?

Many Spark developers often want to test the runtime of some function in interactive debugging and testing. This patch adds a simple time function to SparkSession:

```
scala> spark.time { spark.range(1000).count() }
Time taken: 77 ms
res1: Long = 1000
```
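For context, a helper like this is typically written with a by-name parameter so the wrapped expression is evaluated inside the method. A minimal sketch, assuming only what the example above shows (the `TimeSketch` wrapper object is hypothetical, used here just to make the snippet self-contained):

```scala
// Minimal sketch of a time helper like the one described above.
// The by-name parameter `f: => T` delays evaluation until inside the method,
// so the wall-clock measurement covers exactly the wrapped expression.
object TimeSketch {
  def time[T](f: => T): T = {
    val start = System.nanoTime()
    val result = f // evaluate the wrapped block
    val end = System.nanoTime()
    println(s"Time taken: ${(end - start) / 1000000} ms")
    result // return the block's value, so timing is transparent to callers
  }
}
```

Because the block's value is returned unchanged, the REPL still prints `res1: Long = 1000` after the timing line in the example.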

How was this patch tested?

I tested this interactively in spark-shell.

rxin commented Dec 5, 2016

cc @marmbrus

SparkQA commented Dec 5, 2016

Test build #69662 has started for PR 16140 at commit 76d2f53.

@dongjoon-hyun
Member

Retest this please

rxin commented Dec 5, 2016

It's OK, no need to retest. The change is fine.

@dongjoon-hyun
Member

I see. BTW, @rxin, do we need a `spark` session to measure time? Is it possible to add that to the SparkSession companion object?

rxin commented Dec 5, 2016

That's much more difficult to type though.

@dongjoon-hyun
Member

True. Never mind. I just thought we would be able to use something like the following.

    SparkSession.time { SparkSession.builder.appName("Spark Pi").getOrCreate() }
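For illustration, the companion-object style being discussed would look roughly like this; `SparkTimer` is a hypothetical stand-in, not Spark's actual API:

```scala
// Hypothetical standalone timer object: no live session value is needed,
// so even session construction itself could be wrapped and timed.
object SparkTimer {
  def time[T](f: => T): T = {
    val start = System.nanoTime()
    val result = f
    println(s"Time taken: ${(System.nanoTime() - start) / 1000000} ms")
    result
  }
}
// e.g. SparkTimer.time { expensiveSetup() }
```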

SparkQA commented Dec 5, 2016

Test build #69664 has finished for PR 16140 at commit 76d2f53.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

srowen commented Dec 5, 2016

*shrug* It seems pretty unrelated to SparkSession. I get that it's a convenience method, but is it something Spark really needs to expose as an API method?

rxin commented Dec 6, 2016

The cost to maintain this seems very small, though, and I'd definitely use it all the time in the REPL. In Databricks this is not an issue since the environment always appends the time, but I really miss this in the REPL.

@hvanhovell
Contributor

LGTM - merging to master/2.1. Thanks!

asfgit pushed a commit that referenced this pull request Dec 6, 2016
## What changes were proposed in this pull request?
Many Spark developers often want to test the runtime of some function in interactive debugging and testing. This patch adds a simple time function to SparkSession:

```
scala> spark.time { spark.range(1000).count() }
Time taken: 77 ms
res1: Long = 1000
```

## How was this patch tested?
I tested this interactively in spark-shell.

Author: Reynold Xin <[email protected]>

Closes #16140 from rxin/SPARK-18714.

(cherry picked from commit cb1f10b)
Signed-off-by: Herman van Hovell <[email protected]>
@asfgit asfgit closed this in cb1f10b Dec 6, 2016
robert3005 pushed a commit to palantir/spark that referenced this pull request Dec 15, 2016
uzadude pushed a commit to uzadude/spark that referenced this pull request Jan 27, 2017