
GlobalLogic | Data Engineer with Bigdata expertise IRC178760 | Bengaluru | Bharat | BigDataKB.com | 2023-03-07


Job Location: Bengaluru

Job Detail:

Job:
IRC178760

Location:
India – Bangalore

Designation:
Associate Consultant

Experience:
5-10 years

Function:
Engineering

Skills:
Airflow, Apache Kafka, Apache Spark, Hive, NoSQL DB, Python, SQL

Remote:
Yes

Description:

Join GlobalLogic and become a valued part of the team working on a large software project for a world-class company that provides M2M/IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. Through our engagement, we help the customer develop the end-user modules’ firmware, implement new features, maintain compatibility with the latest telecommunication and industry standards, and perform analysis and estimation of customer requirements.

Requirements:

1. Data engineer with 6 to 8 years of hands-on experience working on Big Data platforms.
2. Experience building and optimizing Big Data pipelines and data sets, from data ingestion through processing to data visualization.
3. Good experience writing and optimizing Spark jobs and Spark SQL; should have worked on both batch and streaming data processing.
4. Good experience in at least one programming language (Scala or Python; Python preferred).
5. Experience writing and optimizing complex Hive and SQL queries to process huge data sets; good with UDFs, tables, joins, views, etc.
6. Experience using Kafka or other message brokers.
7. Configuring, monitoring, and scheduling jobs using Oozie and/or Airflow.
8. Processing streaming data directly from Kafka using Spark jobs; experience with Spark Streaming is a must.
9. Should be able to handle different file formats (ORC, Avro, and Parquet) as well as unstructured data.
10. Should have experience with at least one NoSQL database or object store, such as Amazon S3.
11. Should have worked with data warehouse tools such as AWS Redshift, Snowflake, or BigQuery.
12. Work experience on at least one cloud: AWS, GCP, or Azure.
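Requirement 5 asks for complex SQL with UDFs, views, and joins. As a rough illustration of those concepts, here is a minimal sketch using Python's built-in sqlite3 as a stand-in for Hive; the table names, columns, and sample data are invented for the example:

```python
import sqlite3

# In-memory database standing in for a Hive warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders(order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers(customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 120.0), (2, 10, 80.0), (3, 20, 40.0);
    INSERT INTO customers VALUES (10, 'south'), (20, 'north');
    -- A view that joins and aggregates, as one might define in Hive
    CREATE VIEW revenue_by_region AS
        SELECT c.region, SUM(o.amount) AS revenue
        FROM orders o JOIN customers c ON o.customer_id = c.customer_id
        GROUP BY c.region;
""")

# Register a scalar UDF, analogous to a Hive UDF.
conn.create_function("to_upper", 1, lambda s: s.upper())

rows = conn.execute(
    "SELECT to_upper(region), revenue FROM revenue_by_region ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('SOUTH', 200.0), ('NORTH', 40.0)]
```

In Hive itself a UDF would typically be a Java class registered with CREATE TEMPORARY FUNCTION rather than a Python lambda, but the join/view/aggregation pattern carries over directly.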


Good to have skills:

1. Experience with AWS cloud services such as EMR, S3, Redshift, and EKS/ECS.
2. Experience with GCP cloud services such as Dataproc and Google Cloud Storage.
3. Experience working with huge Big Data clusters holding millions of records.
4. Experience working with the ELK stack, especially Elasticsearch.
5. Experience with Hadoop MapReduce, Apache Flink, Kubernetes, etc.
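The batch-versus-streaming requirements above (items 3 and 8) come down to windowed aggregation over an unbounded event stream. Below is a dependency-free sketch of the core idea, with a plain list standing in for a Kafka topic; the function name and event schema are invented for the example:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_sec=60):
    """Aggregate (timestamp, key) events into per-window counts.

    events: iterable of (epoch_seconds, key) pairs, e.g. what a Kafka
    consumer might yield. Spark Structured Streaming expresses the same
    grouping with window() over an event-time column.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its tumbling window.
        window_start = ts - (ts % window_sec)
        counts[(window_start, key)] += 1
    return dict(counts)

stream = [(0, "click"), (30, "click"), (61, "view"), (65, "click")]
print(tumbling_window_counts(stream))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

The same function works for a finite batch or, applied incrementally, for a live stream, which is the essential distinction the posting draws between batch and streaming processing.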


What We Offer

Exciting Projects: We focus on industries like high-tech, communications, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.


Collaborative Environment:
You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!

Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.


Professional Development:
Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.


Excellent Benefits:
We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.


Fun Perks:
We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can enjoy coffee or tea with your colleagues over a game of table tennis, and we offer discounts at popular stores and restaurants!

About GlobalLogic

GlobalLogic is a leader in digital engineering. We help brands across the globe design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise—we help our clients imagine what’s possible, and accelerate their transition into tomorrow’s digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world, extending our deep expertise to customers in the automotive, communications, financial services, healthcare and life sciences, manufacturing, media and entertainment, semiconductor, and technology industries. GlobalLogic is a Hitachi Group Company operating under Hitachi, Ltd. (TSE: 6501) which contributes to a sustainable society with a higher quality of life by driving innovation through data and technology as the Social Innovation Business.

Apply Here
