How to Convert Map Data into Key-Value Columns in a DataFrame in Scala
Hi Friends,
In this post, I'd like to explain how to convert a Map into a Spark DataFrame with two columns, key and value, holding the keys and values of the given Map.
Input Data: Map("Id" -> "111", "Name" -> "Anamika Singh", "City" -> "Bangalore")
Output DataFrame:
Below is the code, with explanation, to produce the above output from the given Map.
import org.apache.spark.sql.SparkSession
import org.apache.spark.SparkConf
import org.apache.spark.sql.functions._
object ConvertMapToColumn extends App {
//Creating SparkSession
lazy val conf = new SparkConf().setAppName("map-to-key-value-column").setIfMissing("spark.master", "local[*]")
lazy val sparkSession = SparkSession.builder().config(conf).getOrCreate()
import sparkSession.implicits._
//Creating Input Map to Convert into DataFrame
val inputMap = Map( "Id" -> "111" , "Name" -> "Anamika Singh", "City" -> "Bangalore" )
//UDF to look up a key's value in the Map (returns null for missing keys)
val getValueFromMap = udf((x: String) => inputMap.getOrElse(x, null))
//Creating a single-column DataFrame of the Map's keys
val mapDF = Seq("Id", "Name", "City").toDF("mapData")
mapDF.show(false)
//Building key and value columns from the above DataFrame: mapDF
val keyValueDF = mapDF.select(col("mapData").as("key"), getValueFromMap($"mapData").as("value"))
keyValueDF.show(false)
}
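As a side note, the UDF is not strictly necessary for this task: a Map can be flattened directly into (key, value) tuples, which Spark's toDF can consume. Below is a minimal sketch of that flattening in plain Scala; the final toDF call is commented out because it assumes a live SparkSession with its implicits imported, as in the program above.

```scala
object MapToPairs extends App {
  val inputMap = Map("Id" -> "111", "Name" -> "Anamika Singh", "City" -> "Bangalore")

  // Flatten the Map into (key, value) tuples; this is exactly the shape
  // that toDF("key", "value") expects via sparkSession.implicits._
  val pairs: Seq[(String, String)] = inputMap.toSeq

  // With a SparkSession in scope:
  // val keyValueDF = pairs.toDF("key", "value")

  // No data is lost in the conversion: rebuilding the Map round-trips.
  println(pairs.toMap == inputMap)
}
```

This avoids defining and registering a UDF entirely, at the cost of building the row data on the driver rather than looking values up per row.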
Output:
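Running the program prints the two DataFrames with show(false). The output should look roughly like this (row order may vary with Spark version and partitioning):

```
+-------+
|mapData|
+-------+
|Id     |
|Name   |
|City   |
+-------+

+----+-------------+
|key |value        |
+----+-------------+
|Id  |111          |
|Name|Anamika Singh|
|City|Bangalore    |
+----+-------------+
```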
I hope this post was helpful. Please like, comment, and share.
Thank You!