Monday, September 4, 2017

Spark job - Caused by: java.lang.OutOfMemoryError: Java heap space

A user was reading a large JSON file (280 GB) using sqlContext.jsonFile.

// read the 280 GB JSON file into a DataFrame
val loadJsonFile = sqlContext.jsonFile("/data/filename.txt")
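
As a side note, on Spark 1.4 and later jsonFile is deprecated in favor of the DataFrameReader API; a minimal sketch of the equivalent call, keeping the same path and variable name:

// equivalent read using the DataFrameReader API (Spark 1.4+)
val loadJsonFile = sqlContext.read.json("/data/filename.txt")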

Error

org.apache.spark.SparkException: Job aborted
Caused by: java.lang.OutOfMemoryError: Java heap space 

Executing the application with increased driver memory fixed the issue:

spark-shell --driver-memory 3g
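
The same setting can be applied when submitting a packaged application or made the default for all jobs; a minimal sketch, where the jar and main class names are placeholders and 3g is assumed to be enough heap for this dataset:

# when submitting a packaged application
spark-submit --driver-memory 3g --class com.example.LoadJson app.jar

# or as a default for all applications, in conf/spark-defaults.conf
spark.driver.memory  3g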
