Job Location: Bangalore/Bengaluru
Roles and Responsibilities
- Review and understand the target-state data strategy and end-to-end data management systems architecture (including metadata management, data quality/profiling, data integration, master data management, data provisioning, and Data Lake solution components)
- Work with Azure Public Cloud subject matter experts to build DEV, QA, and PROD Data Lake environments, including Disaster Recovery, ensuring configuration is aligned with the architecture and business requirements.
- Provide 2nd-level support to the Enterprise Data Platforms team, management, and end users to identify root causes and remediate environment availability and performance incidents.
- Implement dynamic environment provisioning services within Azure Public Cloud platforms for Digital and Business Sandboxes.
- Test and validate automation scripts. Coordinate with Production Support to ensure that dynamic provisioning services are ready to be transitioned under production control
- Work with the data ingestion and integration tool vendor and subject matter experts to install, configure, and test tool function checks and transformation logic within the pipeline
- Implement integration hooks between the data ingestion tool and other components, such as Azure Data Lake and EDP (Enterprise Data Platform) tools
- Build and execute functional, integration, and load/performance tests
- Coordinate with the data integration tool vendor to address defects, and install and validate hot-fixes and patches
- Coordinate with Release Management to promote Data ingestion software releases from DEV to UAT and PROD
- Provide in-depth analysis of the hybrid cloud solutions to ensure high quality systems and performance.
- Design, build, maintain and support server landscape, including Operating Systems and Hypervisors on all platforms
- Provide 2nd and 3rd level support to end-users as related to all deployed platforms and Cloud services
- May be asked to be a technical lead on some teams/projects
- Manage personal performance, incorporating a commitment to ongoing professional development and continuous learning
- Act as an escalation point of contact for GIS Service delivery, issues, requests and concerns following the ITSM processes
- Provide accurate and complete documentation of processes and procedures to ensure that systems can be supported
- May be responsible for assigning personnel to various tasks, directing their activities, and evaluating their work
- Will be required to be on call; travel may be required to any of the Apotex and/or affiliate sites.
- Provide input into the development/amendment of standard operating procedures
- Write programmatic scripts for process and task automation, with knowledge of at least one scripting language (such as VBScript or PowerShell)
- Monitor and administer the Apotex Enterprise Data Platform tools stack (Azure, Informatica and Zettalabs)
- Work in a safe manner, collaborating as a team member to achieve all outcomes.
- Demonstrate behaviours that reflect our organizational values: Collaboration, Courage, Perseverance, and Passion.
- Ensure personal adherence to all compliance programs, including the Global Business Ethics and Compliance Program, Global Quality policies and procedures, Safety and Environment policies, and HR policies.
- All other relevant duties as assigned.
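As a hedged sketch of the scripting-for-automation duty listed above (the monitored path and alert threshold here are hypothetical choices for illustration, not values from this posting):

```python
import shutil

def check_disk_usage(path="/", threshold_pct=80.0):
    """Report disk usage for the filesystem containing `path`.

    `threshold_pct` is a hypothetical alert threshold; a real
    monitoring script would read it from configuration.
    Returns (used percentage, whether the threshold was exceeded).
    """
    usage = shutil.disk_usage(path)
    used_pct = usage.used / usage.total * 100
    return used_pct, used_pct >= threshold_pct

if __name__ == "__main__":
    pct, alert = check_disk_usage("/")
    print(f"used: {pct:.1f}%  alert: {alert}")
```

The same pattern (measure, compare against a threshold, report) extends naturally to the kinds of environment-availability checks described above.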
Job Requirements:
- Education
- Undergraduate degree or post-secondary diploma in Computer Science, Engineering, or equivalent related work experience. Relevant technology certifications are an asset.
- Knowledge, Skills and Abilities
- Knowledge of EDP architecture, Hadoop, HDFS, and Linux directory and file systems
- Good understanding of Microsoft Azure services and data storage options (ADLS, HDInsight, storage structure, and namespaces)
- Understanding of software development lifecycle (SDLC) and release management processes
- Knowledge of infrastructure security standards and how to install and deploy solutions in the most secure configuration
- Experience with data ingestion and analytic tools that run on Hadoop and Spark
- Experience with Azure products such as ADLS Gen2, HDInsight, Databricks, and Synapse Analytics, including installation and configuration
- Experience with Azure networking, security, storage, compute, analytics, and databases, as well as DevOps processes and frameworks
- Ability to prioritize workload based on risk and business impact with a strong focus on the customer.
- Proven ability to function in a fast-paced and continuously changing environment, with excellent organizational and time-management skills
- Demonstrated ability to make decisions and to solve and troubleshoot problems based on analysis, experience, and judgement
- Concise and clear written and verbal communication skills
- Strong understanding of hybrid and multi-cloud platforms as well as virtualization and hypervisor technologies (such as VMware and Hyper-V), Microsoft Active Directory, Windows Server, and Unix/Solaris and Linux operating systems
- Scripting and automation skills (SQL, Hive, Scala, Python, VB, PowerShell, etc.)
- Experience interacting with vendors to install, configure, test, and deploy third-party solutions
- Some experience in the pharmaceutical/process industry, consumer packaged goods, or information systems would be an asset
- Experience
- Designing, building, deploying, maintaining, and troubleshooting Big Data platforms.
- Minimum 3 years' experience building, supporting, and troubleshooting hybrid, private, and public cloud systems.
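As a hedged illustration of the pipeline function checks and transformation logic mentioned in the responsibilities (the record schema and field names below are hypothetical, chosen only to show the shape of such a check):

```python
def transform(record):
    """Normalize a raw ingestion record (hypothetical schema: id, name)."""
    return {
        "id": int(record["id"]),
        "name": record["name"].strip().lower(),
    }

def validate(record):
    """Functional check: required fields are present and id is positive."""
    return {"id", "name"} <= set(record) and record["id"] > 0

# Example: a raw record as it might arrive from an ingestion tool.
raw = {"id": "42", "name": "  Acme Corp "}
clean = transform(raw)
```

In practice the transformation logic lives inside the vendor tool's pipeline, and checks like `validate` become the functional and integration tests that run before a release is promoted from DEV to UAT and PROD.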

