In this video I have explained how you can track null values anywhere in a Spark DataFrame. The .isNull function works well when you have a limited number of columns, but for a DataFrame with a huge number of columns you may need a generic solution that can track null values anywhere in the DataFrame.
End to End pipeline Introduction Videos:
Pyspark End to End Pipeline
https://youtu.be/upDxnBhYwHI
Spark + Scala End to End Pipeline
https://youtu.be/wZgQthFG6Vw
Complete end-to-end big data pipeline courses are available now!
Course Links:
Spark + Scala course available at:
https://courses.gkcodelabs.com/product/big-data-batch-processing-pipeline-for-beginners-end-to-end-spark-scala/
PySpark course available at:
https://courses.gkcodelabs.com/product/big-data-batch-processing-pipeline-for-beginners-end-to-end-pyspark/
Starter Pack available at just: ₹549 (For Indian Payments) or $9 (For non-Indian payments)
Extended Pack available at just: ₹1299 (For Indian Payments) or $19 (For non-Indian payments)
Queries? Write to us at: [email protected]
Website: https://www.gkcodelabs.com