Job Location: Gurgaon
Selected intern’s day-to-day responsibilities include:
1. Install, configure, and maintain big data technologies (Hadoop, Hive, Spark, and Airflow), both standalone and as clusters
2. Work with the equivalent cloud-managed components, products, and services
3. Clean, transform, and analyze vast amounts of raw data from various source systems
4. Design workflows and data processing pipelines
5. Produce unit tests for big data programs and helper methods
6. Write Scaladoc-style documentation for all code
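The cleaning and testing responsibilities above (items 3 and 5) can be sketched as a tiny, framework-free Python example. The record layout and the function name `clean_record` are illustrative assumptions, not details taken from this posting; in practice such logic would run inside a Spark or Airflow task.

```python
def clean_record(raw: dict):
    """Normalize one raw source record; return None if it is unusable.

    A hypothetical transform step: trims the identifier, parses the
    amount, and drops records that fail either check.
    """
    user_id = str(raw.get("user_id", "")).strip()
    if not user_id:
        return None  # drop records with no identifier
    try:
        amount = float(raw.get("amount", "0"))
    except ValueError:
        return None  # drop records with a malformed amount
    return {"user_id": user_id, "amount": round(amount, 2)}


# Unit tests in the spirit of item 5, kept as plain asserts for brevity.
assert clean_record({"user_id": " 42 ", "amount": "19.999"}) == {
    "user_id": "42",
    "amount": 20.0,
}
assert clean_record({"amount": "5"}) is None
assert clean_record({"user_id": "7", "amount": "oops"}) is None
```

A real pipeline would apply this per-record function across a distributed dataset (for example via a Spark `map`), keeping the transform itself small and unit-testable in isolation.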