Commit 99386fe3 authored by Kevin Yu's avatar Kevin Yu Committed by Wenchen Fan

[SPARK-15804][SQL] Include metadata in the toStructType

## What changes were proposed in this pull request?
The helper function `toStructType` in the `AttributeSeq` class doesn't include metadata when it builds each `StructField`, which causes the problem reported in https://issues.apache.org/jira/browse/SPARK-15804 when Spark writes a DataFrame carrying metadata to the Parquet data source.

The code path: when Spark writes a DataFrame to the Parquet data source through `InsertIntoHadoopFsRelationCommand`, it builds the `WriteRelation` container and calls the helper function `toStructType` to create the `StructType` of `StructField`s. The metadata should be included there; otherwise the user-provided metadata is lost.
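To make the failure mode concrete, here is a simplified, self-contained sketch of the before/after behavior. The `Metadata`, `Attribute`, and `StructField` types below are minimal stand-ins for illustration, not the real Spark Catalyst classes; the point is that omitting the fourth constructor argument silently falls back to empty metadata.

```scala
// Simplified stand-ins, NOT the real Spark classes (illustration only).
case class Metadata(entries: Map[String, String])
case class Attribute(name: String, dataType: String, nullable: Boolean, metadata: Metadata)
case class StructField(name: String, dataType: String, nullable: Boolean,
                       metadata: Metadata = Metadata(Map.empty))

object ToStructTypeDemo {
  // Before the fix: a.metadata is never passed, so every StructField
  // silently falls back to the empty-metadata default.
  def toStructTypeBefore(attrs: Seq[Attribute]): Seq[StructField] =
    attrs.map(a => StructField(a.name, a.dataType, a.nullable))

  // After the fix: a.metadata is forwarded into each StructField.
  def toStructTypeAfter(attrs: Seq[Attribute]): Seq[StructField] =
    attrs.map(a => StructField(a.name, a.dataType, a.nullable, a.metadata))
}
```

With an attribute carrying `Map("key" -> "value")`, `toStructTypeBefore` yields a field with empty metadata while `toStructTypeAfter` preserves the entry, mirroring the one-argument change in the diff below.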

## How was this patch tested?

Added a test case in ParquetQuerySuite.scala.

Author: Kevin Yu <qyu@us.ibm.com>

Closes #13555 from kevinyu98/spark-15804.
parent 147c0208
@@ -91,7 +91,7 @@ package object expressions {
   implicit class AttributeSeq(val attrs: Seq[Attribute]) extends Serializable {
     /** Creates a StructType with a schema matching this `Seq[Attribute]`. */
     def toStructType: StructType = {
-      StructType(attrs.map(a => StructField(a.name, a.dataType, a.nullable)))
+      StructType(attrs.map(a => StructField(a.name, a.dataType, a.nullable, a.metadata)))
     }

     // It's possible that `attrs` is a linked list, which can lead to bad O(n^2) loops when
@@ -625,6 +625,21 @@ class ParquetQuerySuite extends QueryTest with ParquetTest with SharedSQLContext
       }
     }
   }

+  test("SPARK-15804: write out the metadata to parquet file") {
+    val df = Seq((1, "abc"), (2, "hello")).toDF("a", "b")
+    val md = new MetadataBuilder().putString("key", "value").build()
+    val dfWithmeta = df.select('a, 'b.as("b", md))
+
+    withTempPath { dir =>
+      val path = dir.getCanonicalPath
+      dfWithmeta.write.parquet(path)
+
+      readParquetFile(path) { df =>
+        assert(df.schema.last.metadata.getString("key") == "value")
+      }
+    }
+  }
 }

 object TestingUDT {