Job Location: Gurgaon/Gurugram
The Associate would be responsible for liaising with various business SMEs to understand business requirements and translating them into functional specifications.
The candidate should be able to participate in Requirements Definition and Functional Specification exercises for system enhancements and new projects, and to understand the threat landscape through analysis. The work combines surveillance with modern technology concepts.
The ultimate goal of an organization is to avoid emerging threats, but the more realistic way to beat them is to alert the relevant stakeholders beforehand. This role focuses on a cognitive process that enables effective analysis and supports firm decision-making. The team applies technical and non-technical methods to identify general patterns and sequences in raw data for analysis.
Responsibilities
- Design, develop, test, implement and support Oracle applications.
- Maintain SQL/PLSQL processes
- Ensure database availability
- Actively seek to optimize and simplify architecture
- Execute data migration jobs and scripts as required.
- Support and collaborate with product developers
- Design, develop, test and support new capabilities and ongoing changes within the various application data marts
- Collaborate with other IT specialists to rapidly develop and deliver solutions that meet changing business needs
- Work with data owners to document data mappings and transformations to support effective downstream analytics and alerts
- Make recommendations and advise on data refresh, optimization of data, data storage, and data integration
- Extensive experience developing Oracle and ETL programs to support data extraction, transformation, and loading using Informatica PowerCenter.
- Create UNIX shell scripts to run Informatica workflows and control the ETL flow.
- Experience in extraction, transformation, and loading (ETL) of data from various sources into data marts and data warehouses using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, and Workflow Monitor).
- Good experience creating transformations and mappings in Informatica Designer, and processing tasks in Workflow Manager, to move data from multiple sources into targets.
- Should have good knowledge in ETL process, standards, and best practices.
- Hands-on experience in batch processing, job scheduling, and programming/querying (stored procedures, functions, complex joins, triggers, indexing, DML, and DDL)
- Performing database diagnostics, query optimizations, performance tuning and monitoring
- Experience in data modelling
- Write Sqoop scripts to import, export, and update data in an RDBMS.
- Strong object-oriented design and analysis skills.
- Write cron jobs to feed downstream systems.
- Performance tuning for jobs/reports.
- Troubleshoot errors, complete data exploration, and address any data anomalies, gaps, or issues.
- Document within the code itself and prepare documentation for modules/procedures.
- Responsible for development projects and, occasionally, for triaging and fixing application production support issues with a quick turnaround.
- Contribute to large and complex projects as part of the development team.
- Participate in end-to-end application lifecycle development, including business requirements analysis, design, development, testing, and production support, and deliver small/medium-sized modules with good quality.
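The extract-transform-load pattern that runs through the responsibilities above can be sketched in Python. This is a minimal illustration only: sqlite3 stands in for the Oracle source and the data-mart target, and the table and column names (`transactions`, `txn_mart`) are hypothetical, not from the role.

```python
import sqlite3

def run_etl(src: sqlite3.Connection, tgt: sqlite3.Connection) -> int:
    """Extract rows from a source table, transform them, load into a mart table."""
    rows = src.execute("SELECT id, amount FROM transactions").fetchall()      # extract
    transformed = [(i, round(a * 1.18, 2)) for i, a in rows]                  # transform: add 18% tax
    tgt.executemany(
        "INSERT INTO txn_mart (id, gross_amount) VALUES (?, ?)", transformed  # load
    )
    tgt.commit()
    return len(transformed)

# Set up stand-in source and target databases with sample data.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE transactions (id INTEGER, amount REAL)")
src.executemany("INSERT INTO transactions VALUES (?, ?)", [(1, 100.0), (2, 250.0)])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE txn_mart (id INTEGER, gross_amount REAL)")

loaded = run_etl(src, tgt)
print(f"rows loaded: {loaded}")
```

In practice the same extract/transform/load steps would be built as Informatica PowerCenter mappings and workflows, with UNIX shell scripts or cron handling orchestration, as the bullets above describe.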
Requirements:
Education:
- B.E/ B.Tech
Certifications (if any):
- Graph DB/ Informatica/ Hadoop
Foundational skills:
- The ability to analyze, model and interpret data.
- Hands-on experience in Oracle is a MUST; Informatica and HiveQL are add-ons
- Oracle Experience should include DB design, capacity planning and performance tuning
- Oracle management tools (Data Guard, RMAN, Data Pump)
- Familiarity with data loading tools like Flume and Sqoop.
- Knowledge of workflow/schedulers like Oozie.
- Analytical and problem-solving skills, applied to Big Data domain
- Proven understanding of Hadoop, HBase, Hive, and Pig.
- Good aptitude in multi-threading and concurrency concepts.
- Proficiency with Python, Linux/UNIX shell scripts, MySQL, and Hadoop ecosystem components such as HDFS, Hive, Sqoop, Scala, HBase, MapReduce, ETL, and visualization.
- Write Linux shell scripts to load data from different interfaces into Hadoop.
- Write scripts and business logic using Hive/SQL queries and Sqoop jobs, and in scripting languages like PHP.
- A flair for data, schemas, and data models, and for bringing efficiency to the big-data lifecycle
- Experience in Analytics using Informatica/Python.
- Knowledge of Oracle/SQL/other data stores, or a strong ability to conceptualize and develop tools with data analysts.
- Interpret data, analyze results using statistical techniques and provide ongoing reports.
- Strong Analytic and critical thinking with problem solving ability.
- Experience in descriptive and diagnostic analytics using R/SAS/other analytic tools.
- Oracle/SQL/NoSQL data stores
- Python programming and tools
- Informatica
- Hadoop
- Neo4j
- Graph DB
- Graph Visualization
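As a small illustration of the descriptive-analytics skills listed above, here is a sketch using Python's standard statistics module. The daily-alert-count series is made up for the example, and the two-standard-deviation outlier rule is one common convention, not a prescribed method.

```python
import statistics

# Hypothetical daily alert counts, the kind of series an ongoing report
# might summarize for stakeholders.
alert_counts = [12, 15, 11, 30, 14, 13, 16]

mean = statistics.mean(alert_counts)      # central tendency
median = statistics.median(alert_counts)  # robust to the spike
stdev = statistics.stdev(alert_counts)    # sample standard deviation

# Flag days more than two sample standard deviations above the mean.
outliers = [x for x in alert_counts if x > mean + 2 * stdev]

print(f"mean={mean:.2f} median={median} stdev={stdev:.2f} outliers={outliers}")
```

The same summary could equally be produced in R or SAS; the point is the descriptive-then-diagnostic workflow (summarize, then flag anomalies for investigation) that the role calls for.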