
Track NULL values anywhere in a Spark DataFrame | Important Spark Use case | Interview Question

4.6K views
Oct 30, 2020
23:33

In this video I explain how to track null values anywhere in a Spark DataFrame. The .isNull function works fine when you have a limited number of columns, but for a DataFrame with a huge number of columns you may need a generic solution that can track null values anywhere in the DataFrame.

End to End Pipeline Introduction Videos:
PySpark End to End Pipeline: https://youtu.be/upDxnBhYwHI
Spark + Scala End to End Pipeline: https://youtu.be/wZgQthFG6Vw

Complete end-to-end big data pipeline courses are available now!

Course Links:
Spark + Scala course available at: https://courses.gkcodelabs.com/product/big-data-batch-processing-pipeline-for-beginners-end-to-end-spark-scala/
PySpark course available at: https://courses.gkcodelabs.com/product/big-data-batch-processing-pipeline-for-beginners-end-to-end-pyspark/

Starter Pack available at just ₹549 (for Indian payments) or $9 (for non-Indian payments)
Extended Pack available at just ₹1299 (for Indian payments) or $19 (for non-Indian payments)

Queries? Write to us at: [email protected]
Website: https://www.gkcodelabs.com

