Brillio | Hiring | Data Engineer | Santa Clara, CA | BigDataKB.com | 5 Oct 2022

Job Location: Santa Clara, CA

Data Engineer

About Brillio

Brillio delivers disruptive digital solutions across capabilities such as Design Thinking, Product Engineering, Data Analytics, Digital Front Office, and Digital Infrastructure through a team of 4,000+ experts who are technologically elevating 200 companies globally. Our core values of Customer Success, We Care, Entrepreneurial Mindset, and Excellence drive everything we do, from the very first day.

Find your passion and build your career with Brillio.

Role: Data Engineer

Employment Type: Permanent

Location: Remote (USA)

The Opportunity:

As a Data Engineer, you will have the opportunity to work closely with business teams and other application owners to understand the core functionality of banking, credit, risk, and finance applications and their associated data.

Financial Domain Knowledge (nice to have):

  • Understanding of basic banking processes and products; a few years of related experience in the financial services industry is preferred
  • Good knowledge of at least one area is preferred (Risk, Regulatory, Credit, Deposits, Cards, Investments, Loans, Capital, AML (Anti-Money Laundering), KYC, etc.)
  • Understanding of policies and processes governing how data can be collected, analyzed, and utilized
  • Understanding of banking regulations and regulatory reporting compliance for medium-to-large financial institutions is good to have

As a Data Engineer,

  • You will build, extend, and enhance the existing enterprise data warehouse.
  • You will build data pipelines, tools, and reports that support analysts, product managers, and business executives.

JOB DESCRIPTION:

  • Design and build ETL jobs to support the customer’s enterprise data warehouse.
  • Write Extract-Transform-Load (ETL) jobs using standard tools, as well as Spark/Hadoop/AWS Glue jobs, to calculate business metrics (see the sketch after this list)
  • Partner with business teams to understand the business requirements and the impact on existing systems, and design and implement new data provisioning pipeline processes for the Finance / External Reporting domains.
  • You will also have the opportunity to apply your skills in AWS Cloud and Big Data technologies to design, implement, and build our enterprise data platform (EDP).
  • Design and develop data models for SQL/NoSQL database systems
  • Monitor and troubleshoot operational or data issues in the data pipelines
  • Drive architectural plans and implementation for future data storage, reporting, and analytic solutions
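
To illustrate the kind of ETL work described above, here is a minimal sketch of a Spark job that computes a daily spend metric. All paths, table names, and column names are hypothetical placeholders for the example, not details of the actual role or environment.

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical locations -- the real source and warehouse targets will differ.
    SOURCE_PATH = "s3://example-raw-bucket/transactions/"
    TARGET_PATH = "s3://example-warehouse-bucket/metrics/daily_spend/"

    spark = SparkSession.builder.appName("daily-spend-metric").getOrCreate()

    # Extract: read raw transaction records (Parquet assumed for this sketch).
    transactions = spark.read.parquet(SOURCE_PATH)

    # Transform: aggregate posted spend per customer per day as an example business metric.
    daily_spend = (
        transactions
        .filter(F.col("status") == "POSTED")
        .groupBy("customer_id", F.to_date("transaction_ts").alias("txn_date"))
        .agg(F.sum("amount").alias("total_spend"))
    )

    # Load: write the metric partitioned by date for downstream reporting and analytics.
    daily_spend.write.mode("overwrite").partitionBy("txn_date").parquet(TARGET_PATH)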

SKILLS REQUIRED

  • 5+ years of relevant work experience in analytics, data engineering, business intelligence, or a related field
  • 2+ years of experience implementing big data processing technologies: AWS / Azure / GCP, Hadoop, Apache Spark, Python; an understanding of Redshift and Snowflake is good to have
  • Experience writing and optimizing SQL queries in a business environment with large-scale, complex datasets
  • Detailed knowledge of databases like Oracle / DB2 / SQL Server, data warehouse concepts and technical architecture, infrastructure components, and ETL and reporting/analytics tools and environments
  • Hands-on experience with major ETL tools such as Informatica IICS, SAP BODS, and/or any cloud-based ETL tools
  • Hands-on experience with scheduling tools such as Redwood, Control-M, or Tidal; a good understanding of and experience with reporting tools such as Tableau, BOXI, etc. is expected
  • Hands-on experience with cloud technologies (AWS / Google Cloud / Azure) related to data ingestion tools (both real-time and batch-based), CI/CD processes, cloud architecture, and big data implementations
  • AWS certification is a plus, as is working knowledge of Glue, Lambda, S3, Athena, and Redshift (see the brief sketch after this list)
  • Coding proficiency in at least one modern programming language (Python, Ruby, Java, etc.)
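
As a small illustration of the AWS-side skills mentioned above, the following sketch submits an Athena query with boto3; the bucket, database, table, and column names are assumptions made for the example.

    import boto3

    # Athena client in a hypothetical region.
    athena = boto3.client("athena", region_name="us-east-1")

    # Start a query against an assumed "analytics_db" database; results land in an
    # example S3 bucket that would need to exist in a real account.
    response = athena.start_query_execution(
        QueryString="SELECT product, SUM(amount) AS total_amount FROM sales GROUP BY product",
        QueryExecutionContext={"Database": "analytics_db"},
        ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
    )
    print(response["QueryExecutionId"])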

Preferred Qualifications

  • Bachelor’s degree in Computer Science, Mathematics, Statistics, Finance, a related technical field, or equivalent work experience
  • Strong ability to communicate effectively with both business and technical teams
  • Basic experience with cloud technologies
  • Experience in the banking domain is a plus

Know more about

DAE: https://www.brillio.com/services-data-analytics/

Know what it’s like to work and grow at Brillio: https://www.brillio.com/join-us/

Equal Employment Opportunity Declaration

EOE/Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity/National Origin

Brillio is an equal opportunities employer; it welcomes applications from all sections of society and does not discriminate on the grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital status, sexual orientation, gender identity, or any other basis protected by applicable law.

Apply Here
