Job Location: Delhi / NCR
Roles and Responsibilities
Role: AVP Architect Big Data
Reporting to: SVP - Software Development
Function: Technology / Services
Experience: 10-14 years
Location: Delhi
Role Description
The Principal Big Data Architect shall be responsible for driving the data management strategy and architecture for the Business Intelligence and Fraud Analytics vertical. The incumbent shall create strategies for acquiring, staging, processing, storing and, finally, reporting data in both visual and text modes. The data management responsibility entails handling petabyte-scale data and creating frameworks and workflows to process and report in real-time/near-real-time modes. The incumbent must have a proven hands-on record with the Hadoop ecosystem and related technologies, and extensive experience with open-source technologies and toolsets.
Key Skillsets
The incumbent must have strong knowledge of software design patterns to design a massive-scale data platform for performance. S/he shall build solutions using the latest technologies and modern software algorithms to meet high-performance, real-time analytics requirements, and shall be responsible for data-driven architecture, applying approaches that reduce query complexity and response time. Responsibilities include (but are not limited to):
- Manage the end-to-end life cycle of the complete data analytics solution.
- Understand all sources of data and devise a solution for integrating, staging, warehousing and marting the data.
- Maintain an end-to-end vision of the project: how an analytical model will translate into system design, development and operation, and how data will flow through the successive stages of the project.
- Design relational and non-relational databases; develop strategies for data acquisition, archival and recovery, and database implementation; and maintain databases by removing and deleting old data.
- Be responsible for integration with identified external and internal applications/agencies/departments in the future.
- Design the feedback-mechanism architecture used to refine the models created by data experts.
- Write complex queries, scale them across multiple machines, and ensure backups.
- Work closely with Data Scientists to ensure the engineering put in place handles machine learning models correctly.
Key Interfaces:
External:
- MSPs
- System Integrators
- Product and Services Vendors
Internal:
- Internal departments, i.e. Services; Procurement and Contracts; Customer Service; Technology
Key Attributes & Skills:
- Should have 10+ years of experience working on popular Hadoop distribution platforms such as Apache Hadoop, Cloudera, Hortonworks, etc.
- B.E./B.Tech/M.Tech/MCA with at least 10+ years of experience, including 4 years as a Data Architect, of which 3 years should be as a Big Data Architect.
- Should have sound knowledge of various MDM and ETL/ELT tools and of data design.
- Should have prior experience handling specialized and complex architectural issues involving applications working on petabyte-scale data.
- Should have very strong fundamentals in data structures and algorithms.