Commit ace4079c authored by Reynold Xin, committed by Herman van Hovell

[SPARK-18714][SQL] Add a simple time function to SparkSession


## What changes were proposed in this pull request?
Many Spark developers often want to test the runtime of some function in interactive debugging and testing. This patch adds a simple time function to SparkSession:

```
scala> spark.time { spark.range(1000).count() }
Time taken: 77 ms
res1: Long = 1000
```

## How was this patch tested?
I tested this interactively in spark-shell.

Author: Reynold Xin <rxin@databricks.com>

Closes #16140 from rxin/SPARK-18714.

(cherry picked from commit cb1f10b4)
Signed-off-by: Herman van Hovell <hvanhovell@databricks.com>
parent e362d998
```diff
@@ -618,6 +618,22 @@ class SparkSession private(
   @InterfaceStability.Evolving
   def readStream: DataStreamReader = new DataStreamReader(self)

+  /**
+   * Executes some code block and prints to stdout the time taken to execute the block. This is
+   * available in Scala only and is used primarily for interactive testing and debugging.
+   *
+   * @since 2.1.0
+   */
+  @InterfaceStability.Stable
+  def time[T](f: => T): T = {
+    val start = System.nanoTime()
+    val ret = f
+    val end = System.nanoTime()
+    // scalastyle:off println
+    println(s"Time taken: ${(end - start) / 1000 / 1000} ms")
+    // scalastyle:on println
+    ret
+  }
+
   // scalastyle:off
   // Disable style checker so "implicits" object can start with lowercase i
```
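The pattern in the patch can be sketched as a standalone Scala function, with no Spark dependency. This is an illustrative sketch (the object name `TimeExample` is made up here, not part of the patch): it takes a by-name argument, measures wall-clock time with `System.nanoTime()`, prints the elapsed milliseconds, and returns the block's result.

```scala
// Minimal standalone sketch of the same timing pattern used by SparkSession.time.
// `TimeExample` is a hypothetical name for illustration; the real method lives on SparkSession.
object TimeExample {
  def time[T](f: => T): T = {
    val start = System.nanoTime()
    val ret = f                                        // force the by-name block exactly once
    val elapsedMs = (System.nanoTime() - start) / 1000 / 1000
    println(s"Time taken: $elapsedMs ms")
    ret                                                // hand back the block's result unchanged
  }

  def main(args: Array[String]): Unit = {
    // Analogous to `spark.time { spark.range(1000).count() }` in the commit message.
    val n = time { (1L to 1000L).sum }
    println(n)
  }
}
```

Because the parameter is by-name (`f: => T`), the block is not evaluated until `f` is referenced inside `time`, so the measurement brackets exactly one execution of the caller's code.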