Job Location: Hyderābād
Discover the Unexpected
Experian is the world’s leading global information services company. We are listed on the London Stock Exchange (EXPN) and are a constituent of the FTSE 100 Index. We’re passionate about unlocking the power of data in order to transform lives and create opportunities for consumers, businesses and society. For more than 125 years, we’ve helped businesses grow, consumers and small businesses gain access to financial services, and economies and communities flourish – and we’re not done.
Our 18,000 amazing employees in more than 40 countries believe the possibilities for you, and the world, are growing. We’re investing in the future, through new technologies, talented people and innovation so we can help create a better tomorrow.
To do this we employ the greatest and brightest minds that share our purpose and want to make a difference. Experian Asia Pacific’s culture, people and environments are key differentiators. We focus on what truly matters: diversity and inclusion, work/life balance, flexible working, development, equity, engagement, collaboration, wellness, reward & recognition, volunteering… the list goes on. We’re committed to fostering a strong sense of belonging and a place where you can bring your true self to work.
Our uniqueness is that we truly value yours. We’re an award-winning organisation thanks to our strong people-first focus, with accreditations including Top Employer™ and Great Place To Work™.
Learn more at www.experianplc.com
What you’ll bring
- BS degree in computer science, computer engineering or equivalent
- 5–6 years of experience delivering enterprise software solutions (8+ years for a lead role)
- Proficient in Spark, Scala, Python, AWS Cloud technologies
- 3+ years of experience across multiple Hadoop/Spark ecosystem technologies such as MapReduce, HDFS, HBase, Hive, Flume, Sqoop and Kafka
- A flair for data, schemas and data modelling, and for bringing efficiency to the big data life cycle
- Ability to quickly understand technical and business requirements and translate them into technical implementations
- Experience with Agile Development methodologies
- Experience with data ingestion and transformation
- Solid understanding of secure application development methodologies
- Experience developing microservices using the Spring Framework is a plus
- Experience with Airflow and Python is preferred
- Understanding of automated QA needs for big data systems
- Strong object-oriented design and analysis skills
- Excellent written and verbal communication skills
What you’ll be doing
- Utilize your software engineering skills, including Java, Spark, Python and Scala, to analyze disparate, complex systems and collaboratively design new products and services
- Integrate new data sources and tools
- Implement scalable and reliable distributed data replication strategies
- Mentor and provide architecture and design direction to onsite/offshore developers
- Collaborate with other teams to design, develop and deploy data tools that support both operations and product use cases
- Perform analysis of large data sets using components from the Hadoop ecosystem
- Own product features from development and testing through to production deployment
- Evaluate big data technologies and prototype solutions to improve our data processing architecture
- Automate everything
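The Hadoop-ecosystem analysis work above builds on the classic map/shuffle/reduce pattern. As a rough, library-free illustration only (the function and variable names here are ours, not from any specific framework), a word count, the canonical MapReduce example, can be sketched in plain Python:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record: str):
    # Map: emit a (key, 1) pair for every word in the input record.
    return [(word.lower(), 1) for word in record.split()]

def shuffle_phase(pairs):
    # Shuffle: group all values by key, as the framework would between stages.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate the values for each key.
    return {key: sum(values) for key, values in groups.items()}

records = ["big data big pipelines", "data pipelines at scale"]
pairs = chain.from_iterable(map_phase(r) for r in records)
counts = reduce_phase(shuffle_phase(pairs))
# e.g. counts["big"] == 2 and counts["scale"] == 1
```

In Spark or MapReduce the same three stages run in parallel across a cluster, with the shuffle handled by the framework rather than an in-memory dict.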
Additional Information
Experian Careers – Creating a better tomorrow together
Find out what it's like to work for Experian by clicking here