Job Location: Remote
Job Detail:
- Your primary responsibility will be to build reliable ETL data pipelines using AWS services such as Glue, Lambda, S3, EventBridge, CloudFormation, AppFlow, and SNS.
- Strong knowledge of Python and PySpark is required.
- A good understanding of structured and semi-structured data, such as nested JSON schemas, is required.
- A good understanding of AWS Redshift and Oracle is required, since the data pipelines will be designed to stage data into Redshift.
- Should be able to design and build ETL data pipelines in the AWS Cloud with knowledge of S3, Glue, Lambda, Python, and PySpark.
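As an illustration of the semi-structured data handling the role describes, here is a minimal Python sketch of flattening a nested JSON record before staging it into a relational store such as Redshift. This is a plain-Python stand-in for the PySpark transformations the job involves, and the record shape is hypothetical:

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested dict into a single-level dict,
    joining nested keys with `sep` (e.g. customer.address.city -> customer_address_city)."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# Hypothetical nested JSON record, as might land in S3 via AppFlow
raw = '{"id": 1, "customer": {"name": "Ada", "address": {"city": "NYC"}}}'
flat = flatten(json.loads(raw))
print(flat)  # {'id': 1, 'customer_name': 'Ada', 'customer_address_city': 'NYC'}
```

In a Glue/PySpark job, the analogous step would operate on DataFrame columns rather than dicts, but the idea of collapsing nested schemas into flat, column-friendly keys is the same.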
Skills:
- Database: AWS Redshift, Oracle, SQL Server, SQL, stored procedures (PL/SQL)
- AWS: S3, Glue, Lambda, EventBridge, CloudWatch, and other AWS services
- Programming: Python, PySpark
Job Type: Full-time
Pay: $80,000.00 – $100,000.00 per year
Benefits:
- 401(k)
- Dental insurance
- Health insurance
- Paid time off
- Vision insurance
Schedule:
- 8 hour shift
Experience:
- Oracle: 5 years (Required)
- PySpark: 1 year (Required)
- AWS Redshift: 3 years (Required)
Work Location: Remote