Publicis Re:Sources India | Data Architect for Marcel Product | Gurugram | Bharat | BigDataKB.com | 14 Oct 2022


Job Location: Gurugram

Company Description

Re:Sources is the backbone of Publicis Groupe, the world’s third-largest communications group. Formed in 1998 as a small team to service a few Publicis Groupe firms, Re:Sources has grown to 4,000+ people servicing a global network of prestigious advertising, public relations, media, healthcare and digital marketing agencies. We provide technology solutions and business services including finance, accounting, legal, benefits, procurement, tax, real estate, treasury and risk management to help Publicis Groupe agencies do what they do best: create and innovate for their clients.

In addition to providing essential, everyday services to our agencies, Re:Sources develops and implements platforms, applications and tools to enhance productivity, encourage collaboration and enable professional and personal development. We continually transform to keep pace with our ever-changing communications industry and thrive on a spirit of innovation felt around the globe. With our support, Publicis Groupe agencies continue to create and deliver award-winning campaigns for their clients.


About Marcel Product:

Marcel is the AI platform that connects more than 80,000 employees at Publicis Groupe across geographies, agencies and capabilities. Marcel helps our employees learn, share, create, connect and grow more than ever before. Marcel connects employees to our culture, helps them master new skills, inspires them and tackles diversity and inclusion head-on to help build a better world together. It's a place where we come together every day to amplify each other as one global team.
All of this employee engagement creates over 100 million data points that power our AI-enabled knowledge graph, making the experience even more relevant for employees. And for our clients, our knowledge graph makes Marcel one of the most powerful tools ever invented for finding exactly the right expertise, teams and knowledge that we need to win in the Platform World.
Marcel is a strategic investment in our people and is aimed at being their personal growth engine in this hybrid world. This role is joining the dynamic Marcel team in helping build and evolve this product.

Job Description


Working Location: Pune, Bangalore, Gurgaon.


The key accountabilities for this role include, but are not limited to:

  • Ensure the data models of the Marcel program are managed efficiently and that model enhancements align with data modelling principles, standards and metadata practices
  • Ensure that all data lifecycle events are efficiently managed by the Marcel platform, aligning technology and feature teams accordingly.
  • Ensure that data quality in production is measured, maintained and operationally supported
  • Work closely with feature teams to ensure that all analytics, data and architectures are in alignment with the Data Strategy.
  • Act as a point of contact and advisor on all data related features of Marcel and where relevant drive enhancements from concept through to production delivery.
  • Coach and mentor others on best practices, data principles and performance.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.


Specific responsibilities:

  • Responsible for overall Data Architecture of the platform
  • Responsible for leading the team of data engineers to build data pipelines using a combination of Azure Data Factory and Databricks
  • Accountable for delivery of team commitments
  • Responsible for training and development of team members
  • Responsible for the design and architecture of feeds and data integrations
  • Responsible for sign off of deliverables
  • Responsible for establishing best practices and standards
  • Write maintainable and effective data feeds and pipelines
  • Follow best practices for test-driven development and continuous integration.
  • Design, develop, test and implement end-to-end requirements
  • Contribute to all phases of the development life cycle
  • Perform unit testing and troubleshoot applications
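The pipeline responsibilities above can be illustrated with a minimal sketch of a maintainable, unit-testable extract-transform-load feed in Python. This is a hypothetical illustration, not Marcel's actual codebase: field names and rules are invented, and in practice the extract and load steps would be Azure Data Factory activities or Databricks jobs rather than in-memory lists.

```python
# Minimal ETL sketch (hypothetical example; names are illustrative only).

def extract(raw_rows):
    """Extract step: in production this might read from ADF or Blob storage."""
    return [row for row in raw_rows if row]  # drop empty records

def transform(rows):
    """Transform step: normalise names and keep only well-formed records."""
    out = []
    for row in rows:
        name = row.get("name", "").strip()
        if name:
            out.append({"name": name.title(), "agency": row.get("agency", "unknown")})
    return out

def load(rows, sink):
    """Load step: append to a sink (a list here; a database table in production)."""
    sink.extend(rows)
    return len(rows)

def run_pipeline(raw_rows, sink):
    """Wire the stages together; each stage stays independently testable."""
    return load(transform(extract(raw_rows)), sink)
```

Because each stage is a pure function, each can be unit-tested in isolation, which is the shape the test-driven-development bullet above asks for.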

Qualifications


Minimum Experience (relevant): 5 years (overall experience of at least 10 years)


Maximum Experience (relevant): 15


Must have skills:

  • Strong written and verbal communication skills
  • Strong experience implementing property-graph database technologies such as Neo4j
  • Strong experience leading data modelling activities for a production graph database solution
  • Strong experience in Cypher (or TinkerPop Gremlin) with an understanding of query tuning
  • Strong experience working with data integration technologies, specifically Azure services: ADF, ETLs, JSON, Hop or other ETL orchestration tools
  • Strong experience using PySpark, Scala and Databricks
  • 3-5+ years’ experience in design and implementation of complex distributed systems architectures
  • Strong experience with Master Data Management solutions
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Strong knowledge of Azure-based services
  • Strong understanding of RDBMS data structures, Azure Tables, Blob storage and other data sources
  • Experience with GraphQL
  • Experience in high availability and disaster recovery solutions
  • Experience with test driven development
  • Understanding of Jenkins and CI/CD processes using ADF and Databricks.
  • Strong analytical skills related to working with unstructured datasets.
  • Strong analytical skills necessary to triage and troubleshoot issues
  • Results-oriented and able to work across the organization as an individual contributor
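As a rough illustration of the property-graph modelling the skills above refer to, the sketch below builds a tiny in-memory graph in Python and shows the kind of Cypher statement that would answer an expertise-finding question on Neo4j. The labels, properties and relationship types (Employee, Skill, KNOWS) are hypothetical, not Marcel's actual schema.

```python
# Hypothetical property-graph sketch: nodes carry labels and properties,
# relationships are typed and directed, as in Neo4j's property-graph model.

class Node:
    def __init__(self, label, **props):
        self.label = label
        self.props = props

class Graph:
    def __init__(self):
        self.nodes = []
        self.rels = []  # (start_node, rel_type, end_node)

    def add_node(self, label, **props):
        node = Node(label, **props)
        self.nodes.append(node)
        return node

    def relate(self, start, rel_type, end):
        self.rels.append((start, rel_type, end))

    def expand(self, start, rel_type):
        """Follow outgoing relationships of one type from a node."""
        return [e for s, t, e in self.rels if s is start and t == rel_type]

g = Graph()
ada = g.add_node("Employee", name="Ada")
neo = g.add_node("Skill", name="Neo4j")
g.relate(ada, "KNOWS", neo)

# The equivalent lookup in Cypher (illustrative; labels are hypothetical):
cypher = (
    "MATCH (e:Employee)-[:KNOWS]->(s:Skill {name: $skill}) "
    "RETURN e.name"
)
```

A real deployment would run the Cypher statement through the Neo4j driver; the in-memory `expand` call mimics the single-hop traversal that makes graph databases a natural fit for "find the right expertise" questions.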


Good to have skills:

  • Knowledge in graph data science, such as graph embedding
  • Knowledge in Neo4J HA Architecture for Critical Applications (Clustering, Multiple Data Centers, etc.)
  • Experience working with EventHub and streaming data.
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with Redis
  • Understanding of ML models and experience building ML pipelines with MLflow and Airflow.
  • Bachelor’s degree in engineering, computer science, information systems, or a related field from an accredited college or university; Master’s degree from an accredited college or university is preferred. Or equivalent work experience.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Experience building processes supporting data transformation, data structures, metadata, dependency tracking and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable Azure based data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Understanding of Node.js is a plus, but not required.
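The stream-processing and message-queuing items above come down to processing events incrementally rather than in batch. Below is a minimal sketch of a tumbling-window count over an event stream in plain Python, as a stand-in for what Spark Streaming or an EventHub consumer does at scale; it is illustrative only and omits the late-data handling, checkpointing and distribution a real stream processor provides.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences of each key per window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}
```

For example, with a 5-second window, events at t=0 and t=3 land in the same window while an event at t=7 starts a new one.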

Additional Information


Attributes/behaviours

  • Ability to design, develop and implement complex requirements.
  • Building reusable components and front-end libraries for future use
  • Translating designs and wireframes into high-quality code

Pro-active support to the business is a key attribute for this role, with a customer-service focus that links systems requirements to business outcomes.




Apply Here

