Terragig Llp | Hadoop+GCP Developer – GCP/Big Query Developer | Bhagya Nagar, Chennai, Bengaluru | Bharat | BigDataKB.com | 2023-03-09


Job Location: Bhagya Nagar, Chennai, Bengaluru

Job Detail:

We have an urgent opening for a Hadoop+GCP Developer and a GCP/BigQuery Developer at our Chennai location.

Client : DTech.

Location : Chennai.


Experience : 5 To 8 Years.

Roles & Responsibilities :

Role: Hadoop+GCP

Necessary to have:

  • Professional experience with a cloud platform
  • Sound knowledge of Apache Spark and Python programming.
  • Deep experience developing data processing tasks in PySpark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations.
  • Ability to design, build and unit test the application in Spark/Pyspark.
  • In-depth knowledge of Hadoop, Spark, and similar frameworks.
  • Ability to understand existing ETL logic and convert it into Spark/PySpark/Spark SQL.
  • Knowledge of Unix shell scripting, RDBMS, Hive, the HDFS file system, HDFS file types, and HDFS compression codecs.
  • Experience in processing large amounts of structured and unstructured data, including integrating data from multiple sources.
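The read → merge → enrich → load flow described above can be sketched in plain Python; a PySpark job follows the same shape, with DataFrames in place of dicts and `spark.read` / `df.write` in place of the helpers here. All source, field, and rate values below are illustrative assumptions, not part of the posting:

```python
import csv
import io

# Illustrative stand-ins for external sources (in PySpark these would be
# spark.read.csv(...) or spark.read.jdbc(...) calls; names are hypothetical).
orders_csv = "order_id,customer_id,amount\n1,c1,120.50\n2,c2,75.00\n"
customers_csv = "customer_id,region\nc1,APAC\nc2,EMEA\n"

def read_source(text):
    """Read a CSV source into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def merge(orders, customers):
    """Join orders with customers on customer_id (a DataFrame join in PySpark)."""
    by_id = {c["customer_id"]: c for c in customers}
    return [{**o, **by_id[o["customer_id"]]} for o in orders]

def enrich(rows):
    """Add a derived column, e.g. a tax-inclusive total (the rate is an assumption)."""
    for r in rows:
        r["amount_with_tax"] = round(float(r["amount"]) * 1.18, 2)
    return rows

def load(rows):
    """'Load' into the target destination; here we simply return the rows
    (in PySpark this would be df.write.parquet(...) or similar)."""
    return rows

result = load(enrich(merge(read_source(orders_csv), read_source(customers_csv))))
print(result[0]["region"], result[0]["amount_with_tax"])  # → APAC 142.19
```

The four stages are kept as separate functions on purpose: that is the same unit-test boundary the "design, build and unit test" bullet above asks for.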

Role: GCP/BigQuery Developer

Necessary to have:

  • Relevant industry work experience (5+ years for Dev and Lead roles)
  • Experience extracting data from a variety of sources, and a desire to expand those skills (excellent knowledge of SQL and Spark is mandatory).
  • Strong knowledge of Google BigQuery and of architecting data pipelines from on-prem to GCP.
  • Experience building applications using Google Cloud Platform frameworks such as Dataproc and GCS, at a minimum.
  • Excellent communication skills to understand and pass on requirements.
  • Excellent data analysis skills; must be comfortable querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark.
  • Professional experience with a cloud hosting platform (GCP preferred).
  • GCP certification is preferred.
  • Knowledge of Power BI, Tableau, or other BI tools is preferred.
  • Experience working with finance/treasury datasets.
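As a flavor of the BigQuery work above, a pipeline step often amounts to composing partitioned-table Standard SQL from Python. Everything below is a hedged sketch: the project, dataset, table, and column names are hypothetical, and a real job would submit the statement via the google-cloud-bigquery client (`bigquery.Client().query(sql)`) rather than just printing it:

```python
# Sketch of composing a BigQuery Standard SQL statement in Python.
# Dataset, table, and column names are hypothetical stand-ins for a
# treasury-style dataset like the one the posting mentions.
project, dataset = "my-project", "treasury"

sql = f"""
CREATE OR REPLACE TABLE `{project}.{dataset}.daily_positions`
PARTITION BY position_date AS
SELECT
  position_date,
  instrument_id,
  SUM(notional) AS total_notional
FROM `{project}.{dataset}.raw_positions`
GROUP BY position_date, instrument_id
"""

print(sql.strip().splitlines()[0])
```

Partitioning by the date column keeps downstream queries (and their cost) bounded to the partitions they actually scan, which is the usual first design decision when landing on-prem data in BigQuery.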

Interested candidates can share their CV at the email ID below:

[email protected]

Apply Here

