Cint | Data Engineer II – Core Team | Remote | United States | BigDataKB.com | 17 Oct 2022


Job Location: Remote

Cint is a global software leader in technology-enabled insights. The Cint platform automates the insights-gathering process so that companies can gain access to insights faster and at unparalleled scale. Cint has one of the world’s largest consumer networks for digital survey-based research, made up of over 160 million engaged respondents across more than 130 countries. Over 3,200 insights-driven companies use Cint to accelerate how they gather consumer insights and supercharge business growth.


The Opportunity

Cint is seeking a Data Engineer to join our core data integrations team. This team develops real-time data structures and solves big data problems using modern technologies such as Python, Java/Scala, SQL, and the AWS cloud environment. You will own backend applications and services for the ingestion, storage, and processing of data hosted on AWS.


The Team


The Core Team owns backend applications and services for the ingestion, storage, and processing of data. You will work with the streams of data the team owns to ensure structured and unstructured data is maintained in data lakes and data warehouses. You will write code and build pipelines to create data sets for our partners.


This role is open to candidates living in the US to work remotely or from one of our US-based office locations (New Orleans, New York).


What You Will Do

  • Create and maintain optimal data pipeline architecture, ETL design, complex data modeling and systems, and component integration using large databases in a business environment
  • Implement data ingestion, transformation, processing, and data quality frameworks in real-time and batch
  • Deliver various types of data services written in Python, Java/Scala, Go, and other languages
  • Build tools and frameworks for data security, data governance, data privacy protection, and data anonymization
  • Improve data quality and integrity by using internal tools and frameworks to automatically detect data quality issues and discrepancies
  • Build high-quality alerting and monitoring systems to keep the various components of the data pipeline in a healthy state
  • Collaborate with DBAs on scheduled downtime; investigate and debug issues in CDC pipelines, data migration solutions, and similar services
  • Work with the Data Governance Committee and legal team to obtain approvals for data collection, storage, retention, and deletion
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Write reliable, maintainable, well-documented code that will scale to support millions of respondents
  • Collaborate with Product Managers to refine and modify requirements; support the business teams and product managers with data extracts and data analysis
  • Follow strong unit testing and integration testing practices
  • Participate in the on-call rotation to monitor the health of all data applications and services

What We Are Looking For

  • Bachelor’s or Master’s degree in Computer Science
  • A couple of years of hands-on software development experience writing data algorithms and pipelines using Python, Java/Scala, or other modern big data and cloud computing technologies
  • Competency in data structures, algorithms, and software design
  • Proficient SQL and database skills
  • Ability to rapidly debug and solve problems in unfamiliar areas
  • Experience in working in an agile environment with rapidly changing requirements and tight deadlines

Bonus Points If You Have

  • Familiarity with containerization technologies like Docker and Kubernetes
  • Familiarity with Data Ops pipelines and CI/CD implementations
  • Familiarity with AWS or other cloud platforms
  • Experience with AWS Kinesis, EMR, EC2, EKS, Apache Spark, Redshift and Airflow
  • Familiarity with Databricks, Snowflake, Terraform, Jenkins, and AWS Lambda
  • Advanced working SQL knowledge and experience with relational and NoSQL databases, data modeling, and query authoring
  • Demonstrated efficiency in handling data, tracking data lineage, ensuring data quality, and improving the discoverability of data
  • Sound knowledge of distributed systems and data architecture (e.g., the Lambda architecture): designing and implementing batch and stream data processing pipelines, and optimizing the distribution, partitioning, and MPP of high-level data structures

Our Values


  • We are accountable
    We do what we say, and say what we do. We believe in transparency. We drive results.

  • We work together
    We listen to understand. We collaborate to find the best solutions. We help each other to succeed.

  • We drive new ideas forward
    We are passionate about innovation. We are curious learners. We take smart risks.

  • We think beyond ourselves
    We are respectful and compassionate. We champion diversity and equality. We promote a sense of belonging.

More About Cint


In June 2021, Cint acquired Berlin-based GapFish – the world’s largest ISO certified online panel community in the DACH region – and in December, completed the acquisition of US-based Lucid – a programmatic research technology platform that provides access to first-party survey data in over 110 countries.


Cint Group AB (publ), listed on Nasdaq Stockholm, has a rapidly growing team across its many global offices, including Stockholm, London, New York, New Orleans, Singapore, Tokyo and Sydney (www.cint.com).

Benefits

  • Flexible working/remote working
  • Health insurance (full benefits)
  • Gym membership (depending on country)
  • 401K or pension (depending on country)

Hiring Process

  • Phone screen with the recruiter
  • Interview with the hiring manager
  • Panel interview with the team

Python, AWS, Airflow, SQL, Spark, Go, AWS Lambda, Docker, Kubernetes, Terraform, ETL




Apply Here
