Job Location: Bhopal
Important Things You Will Do:
- Participate in requirements gathering, analysis, and systems design.
- Collaborate with team and clients to produce efficient software design and architecture.
- Write clean, scalable code with an emphasis on efficiency and reusability.
- Test and deploy applications, systems, and features.
- Revise, update, refactor, and debug code.
- Improve existing systems and applications.
- Assist in documentation development throughout the software development life cycle.
- Maintain code quality, organization, and automation.
- Participate in the Agile process, including Sprint Planning, Stand-ups, and Retros.
Our Ideal Candidate (with a combination of education and work experience):
- 3+ years of experience with Apache Spark and Python.
- Hands-on experience with big data stacks such as Spark and Hadoop, using Python.
- Experience ingesting data from different data sources and pre-processing it using PySpark/Data Factory.
- Good knowledge of Spark Streaming and Spark batch processing.
- Good knowledge of Kafka/Event Hubs/Kinesis.
- Hands-on experience with a cloud platform such as AWS.
- Working experience with Agile methodology.
- Experience with non-relational and relational databases.
- Ability to problem-solve and break complex problems down into solvable pieces.
- Ability to deliver quick, accurate results for each analysis/task.
- Ability to work within tight timelines.