Job Location: Bangalore
Job Description:
Life at Grab
At Grab, every Grabber is guided by The Grab Way, which spells out our mission, how we believe we can achieve it, and our operating principles – the 4Hs: Heart, Hunger, Honour and Humility. These principles guide and help us make decisions as we work to create economic empowerment for the people of Southeast Asia.
Get to know the Team
We are looking for an insider-threat-savvy Data Engineer to join our growing team of investigation and analytics experts. The new hire will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for the Insider Threat Technical Investigation team.
Get to know the Role
The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimising data systems and/or building them from the ground up. It will also be preferable if the candidate has a good understanding of investigative processes and insider threats.
This Senior Engineer will support and be an integral part of the Insider Threat Technical Investigation team, which provides the business with investigative services and monitors for fraud and insider threats. To protect Grab’s assets, we rely on a diversified insider threat team and detection strategy rather than a single solution. Our detection process combines several tools and processes to monitor insider behavior and to filter a large volume of alerts so that false positives are eliminated. For example, we use Machine Learning (ML) applications to help analyse the data stream and prioritise the most relevant alerts, alongside digital forensics and analytics tools, and we are currently exploring User and Event Behavior Analytics (UEBA) to help detect, analyse, and alert on potential insider threats. The role will be integral to the Insider Threat Team and its systems and products. The right candidate will be excited by the prospect of optimising, and helping design, our team’s data architecture to support our current and next generation of products, data initiatives, and investigations.
The Day-to-Day Activities
Big data manipulation, extraction, and cleaning:
To run experiments and build new risk analytic systems, you will need to access, clean, and structure data, and to work with imbalanced datasets. Class imbalance is a common obstacle in fraud analysis: the number of legitimate users or transactions is far greater than the number of fraudulent ones. Knowing how class imbalance can bias your model, and how to counter some of its negative effects (resampling, SMOTE*, etc.), is therefore critical. (* Synthetic Minority Oversampling Technique, or SMOTE for short – an oversampling technique commonly applied via Python libraries such as imbalanced-learn.)
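For illustration, here is a minimal sketch of countering class imbalance with SMOTE, assuming the open-source scikit-learn and imbalanced-learn Python packages; the synthetic data and the roughly 1% fraud rate below are illustrative assumptions, not a description of Grab’s systems.

    # A minimal sketch of countering class imbalance with SMOTE.
    # Assumes the open-source scikit-learn and imbalanced-learn packages;
    # the synthetic data and ~1% "fraud" rate are illustrative only.
    from collections import Counter

    from imblearn.over_sampling import SMOTE
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for a transaction table where fraud is the rare class.
    X, y = make_classification(
        n_samples=10_000, n_features=20,
        weights=[0.99, 0.01],  # roughly 1% positive ("fraud") labels
        random_state=42,
    )
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, stratify=y, random_state=42
    )

    print("before:", Counter(y_train))  # heavily skewed toward the majority class
    X_res, y_res = SMOTE(random_state=42).fit_resample(X_train, y_train)
    print("after: ", Counter(y_res))    # classes now balanced

    # Oversample only the training split; the test split keeps the real-world
    # imbalance so evaluation stays honest.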
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources (a minimal sketch follows this list).
- Build analytics tools that utilise the data pipeline to provide actionable insights into stakeholder requirements, operational efficiency, and other key areas of investigation.
- Work with the insider threat investigators and data analysts on data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and regions.
- Create data tools for analytics and insider threat investigation team members that help them deliver high-quality results.
- Work with data and insider threat analytics experts to strive for greater functionality in our insider threat data systems.
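To ground the extraction, transformation, and loading duties above, the sketch below pulls raw events from an operational database, aggregates them, and appends the result to a warehouse table. It assumes pandas and SQLAlchemy; the connection strings, table names, and columns are hypothetical placeholders, not Grab’s actual infrastructure.

    # A minimal, hypothetical ETL sketch: extract raw access events from an
    # operational store, transform them into per-user daily counts, and load
    # the result into a warehouse table. Uses pandas and SQLAlchemy; the
    # connection strings, table names, and columns are placeholders.
    import pandas as pd
    from sqlalchemy import create_engine

    source = create_engine("postgresql://user:pass@source-db/app")            # placeholder
    warehouse = create_engine("postgresql://user:pass@warehouse/analytics")   # placeholder

    # Extract: read the last day of access events from the operational store.
    events = pd.read_sql(
        "SELECT user_id, resource, accessed_at FROM access_events "
        "WHERE accessed_at >= current_date - interval '1 day'",
        source,
    )

    # Transform: normalise timestamps and aggregate accesses per user and resource.
    events["accessed_at"] = pd.to_datetime(events["accessed_at"], utc=True)
    daily = (
        events.groupby(["user_id", "resource"])
        .size()
        .reset_index(name="access_count")
    )

    # Load: append into the warehouse table that investigation tooling reads from.
    daily.to_sql("daily_access_counts", warehouse, if_exists="append", index=False)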
The Must-Haves
- Advanced working SQL knowledge and experience with relational databases, including query authoring and familiarity with a variety of databases.
- Fluency in programming languages such as SQL, Python, and other relevant languages.
- Adept at finding warehousing solutions, using ETL (Extract, Transform, Load) tools, and understanding basic machine learning and algorithms.
- Experience building and optimising ‘big data’ pipelines, architectures, and data sets.
- Ability to use statistical analysis to segment user behavior and develop targeted solutions to prevent sophisticated fraud and other insider threat activity.
- Experience performing root cause analysis on internal and external data and processes to answer specific investigative questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ stores.
- Strong project management and organisational skills.
- Experience building risk-based monitoring systems.
- Excellent oral and written communication skills.

