Job Location: Hyderabad/Secunderabad, Pune
We at Datametica Solutions Private Limited are looking for a Big Data Lead with a passion for the cloud and knowledge of on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks, and the like. Ideal candidates should have technical experience with migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience: 5 to 13 years
Location: Pune / Hyderabad
- 5+ years of overall experience developing, testing, and implementing Big Data projects using Hadoop, Spark, and Hive.
- Hands-on experience playing a lead role in Big Data projects: implementing one or more tracks within a project, identifying and assigning tasks within the team, and providing technical guidance to team members.
- Experience setting up Hadoop services and implementing ETL/ELT pipelines, ingesting and processing terabytes of data from varied systems.
- Experience working in an onshore/offshore model, leading technical discussions with customers, mentoring and guiding teams on technology, and preparing HDD (high-level design) and LDD (low-level design) documents.
Required Skills and Abilities:
Mandatory skills: Spark, Scala/PySpark, the Hadoop ecosystem including Hive, Sqoop, Impala, Oozie, Hue, and Flume; Java, Python, SQL, and Bash (shell scripting)
Secondary skills: Apache Kafka, Storm, distributed systems, a good understanding of networking and security (platform and data) concepts, Kerberos, and Kubernetes
Understanding of data governance concepts and experience implementing metadata capture, lineage capture, and a business glossary
Experience implementing CI/CD pipelines and working with SCM tools such as Git, Bitbucket, etc.
Ability to assign and manage tasks for team members, provide technical guidance, and work with architects on HDDs, LDDs, and POCs
Hands-on experience writing data ingestion and data processing pipelines using Spark and SQL, including implementing SCD Type 1 and Type 2, auditing, and exception-handling mechanisms (see the sketch after this list)
Experience implementing data warehousing projects with a Java- or Scala-based Hadoop programming background
Proficiency with development methodologies such as Waterfall and Agile/Scrum
Exceptional communication, organization, and time management skills
Collaborative approach to decision-making and strong analytical skills
Good to have: certifications in any of GCP, AWS, Azure, or Cloudera
Ability to work on multiple projects simultaneously, prioritizing appropriately
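As a concrete illustration of the SCD Type 2 requirement above, here is a minimal PySpark sketch, not Datametica's actual pipeline; the table names (dw.customer_dim, staging.customer_updates) and the columns (customer_id, address, load_date, effective_from, effective_to, is_current) are assumptions chosen for the example:

# Minimal SCD Type 2 sketch in PySpark. Assumes the dimension table has
# exactly these five columns: customer_id, address, effective_from,
# effective_to, is_current; the staging batch has customer_id, address, load_date.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

dim = spark.table("dw.customer_dim").alias("d")            # dimension with history
stg = spark.table("staging.customer_updates").alias("s")   # incoming batch

# Current dimension rows whose tracked attribute differs in the new batch.
# (A production job would use eqNullSafe for null-safe comparison.)
changed = (
    dim.filter(F.col("d.is_current"))
       .join(stg, F.col("d.customer_id") == F.col("s.customer_id"))
       .filter(F.col("d.address") != F.col("s.address"))
)

# Step 1: expire the superseded versions (close effective_to, clear the flag).
expired = (
    changed.select("d.*", F.col("s.load_date").alias("batch_date"))
           .withColumn("effective_to", F.col("batch_date"))
           .withColumn("is_current", F.lit(False))
           .drop("batch_date")
)

# Step 2: open new versions for the changed keys.
opened = (
    changed.select("s.customer_id", "s.address",
                   F.col("s.load_date").alias("effective_from"))
           .withColumn("effective_to", F.lit(None).cast("date"))
           .withColumn("is_current", F.lit(True))
)

# Step 3: reassemble the dimension -- prior history, current rows that did
# not change, the expired rows, and the newly opened versions.
history = dim.filter(~F.col("d.is_current"))
unchanged = (
    dim.filter(F.col("d.is_current"))
       .join(changed.select(F.col("d.customer_id").alias("customer_id")),
             "customer_id", "left_anti")
)
result = history.unionByName(unchanged).unionByName(expired).unionByName(opened)

On storage layers that support it (e.g. Delta Lake or Iceberg), the same logic is typically expressed as a single MERGE statement rather than a full-table rewrite.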
About Us!
A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum platforms, along with ETL tools such as Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with additional capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers: Google, Microsoft, Amazon, and Snowflake.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com