Commit d17e5f2f authored by Dongjoon Hyun, committed by Shivaram Venkataraman

[SPARK-16233][R][TEST] ORC test should be enabled only when HiveContext is available.

## What changes were proposed in this pull request?

The ORC read/write test should run only when a HiveContext is available; otherwise it should be skipped.
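
The fix wraps the ORC test with the suite's `setHiveContext(sc)` / `unsetHiveContext()` helpers (see the diff below). As a rough sketch of how such a guard can work with testthat — assuming `sparkRHive.init` is used to create the HiveContext; this is an illustration, not the actual helper body from test_sparkSQL.R:

```r
library(testthat)

# Sketch only: skip the current test when Spark was built without Hive.
# The real setHiveContext in test_sparkSQL.R may differ; the body below is
# an assumption for illustration.
setHiveContext <- function(sc) {
  hiveCtx <- tryCatch({
    sparkRHive.init(sc)   # throws if the Hive classes are not on the classpath
  }, error = function(err) {
    skip("Hive is not build with SparkSQL, skipped")
  })
  assign(".hiveCtx", hiveCtx, envir = .GlobalEnv)
  hiveCtx
}

unsetHiveContext <- function() {
  # Drop the HiveContext reference so later tests run against the plain SQLContext.
  if (exists(".hiveCtx", envir = .GlobalEnv)) {
    rm(".hiveCtx", envir = .GlobalEnv)
  }
  invisible(NULL)
}
```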

## How was this patch tested?

Manual.
```
$ R/run-tests.sh
...
1. create DataFrame from RDD (test_sparkSQL.R#200) - Hive is not build with SparkSQL, skipped

2. test HiveContext (test_sparkSQL.R#1021) - Hive is not build with SparkSQL, skipped

3. read/write ORC files (test_sparkSQL.R#1728) - Hive is not build with SparkSQL, skipped

4. enableHiveSupport on SparkSession (test_sparkSQL.R#2448) - Hive is not build with SparkSQL, skipped

5. sparkJars tag in SparkContext (test_Windows.R#21) - This test is only for Windows, skipped

DONE ===========================================================================
Tests passed.
```

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #14019 from dongjoon-hyun/SPARK-16233.
parent d601894c
test_sparkSQL.R:

```diff
@@ -1725,6 +1725,7 @@ test_that("mutate(), transform(), rename() and names()", {
 })
 test_that("read/write ORC files", {
+  setHiveContext(sc)
   df <- read.df(jsonPath, "json")
   # Test write.df and read.df
@@ -1741,6 +1742,7 @@ test_that("read/write ORC files", {
   expect_equal(count(orcDF), count(df))
   unlink(orcPath2)
+  unsetHiveContext()
 })
 test_that("read/write Parquet files", {
```
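
With `setHiveContext(sc)` at the start of the test and `unsetHiveContext()` at the end, the ORC read/write assertions run only when a HiveContext can be created; on a build without Hive support, testthat reports the test as skipped, as in the run shown above.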