Job Location: Pune
Brief about the Company:
AdZapier Corporation is a global technology and enablement services company with a vision to
transform data into value for everyone. Through a simple, open approach to connecting systems
and data, we provide the data foundation for the world's best marketers, making it safe and
easy to activate, validate, enhance, and unify data. We give marketers the ability to
deliver relevant messages at scale and tie those messages back to actual results. Our products
and services enable individual-based marketing, allowing our clients to generate a higher ROI
and drive better omni-channel customer experiences.
Position Description:
Join our Information Technology team, where you will work with new technologies and find ways to meet our customers' needs and make it easy for them to do business with us.
You will use functional expertise to act as an advisor to management and make recommendations on more complex projects. You will apply professional concepts and company policies and procedures to solve a wide range of difficult problems creatively and practically.
Responsibilities
- You will be responsible for the operation and administration of the Cloudera Hadoop platform.
- You will work independently on day-to-day monitoring and operations of the Data Analytics platform.
- You will develop automation using scripting languages. After initial training, you will handle critical operational tasks as well as on-demand requests.
What we're looking for:
You are curious about new technology and the possibilities it creates. You like solving problems and quickly resolving challenging issues.
Minimum Requirements:
- 3+ years of experience in software development, including Big Data analytics
- Experience in Hadoop Big Data platform operations and administration
- High proficiency with the Hadoop platform, including Hadoop, Hive, Spark/Scala, Java, Kafka, Flume, etc.
- Experience with a scripting language such as Bash, Scala, or Python
- Good understanding of file formats including JSON, Parquet, Avro, and others
Even better if you have:
- Experience in monitoring and supporting large-scale, highly available systems in a 24/7 environment
- Knowledge of container technologies (Docker/Kubernetes)
- Proficiency in Hive and Kafka
- Strong skills in UNIX/Linux and shell scripting, Perl, PHP, JavaScript, JSON, Python
- Teamwork & collaboration skills to work across organizations and lead cross-functional teams.
- Problem-solving skills to develop quick, effective solutions to complex issues.
- Understanding of architecture and design across all systems
Qualification:
B.E/B.TECH/M.TECH/MS
Job Location – Mumbai/Pune
Salary – Competitive, as per industry standard.
Shift timing – 2:30 pm to 11:30 pm, 5 days a week (US shift). Work from home during lockdown; timings after Covid will be subject to change.

