Job Location: Dallas, TX
Job Description:
Data Science Platform Engineer
You will be part of the Data Science Engineering Team, which supports the Enterprise Data Science Center and distributed Data Science business teams across the organization. Our team architects, implements, and manages the cloud-based (AWS) and on-premises Data Science platforms. As a Platform Engineer, you will be responsible for the design and development of platform capabilities that deliver Data Science/ML at scale. The platform combines AWS services with custom components to provide a complete solution covering data pipelines, model development, model training, and production deployment, while remaining compliant with enterprise policies.
About The Job
Design and develop platform capabilities, centered around AWS SageMaker, for self-service use by Data Scientists
Lead the design and implementation of governance best practices for design, security, development, usability, cost control, and forensics across cloud and on-prem infrastructure to support hybrid solutions
Design and automate ML deployment pipelines (MLOps) to define a clear path to production for Data Science Projects
Provide L3 engineering support and lead the delivery of infrastructure solutions on the platform.
Minimum Qualifications:
The ideal candidate has 5+ years of experience in IT infrastructure, network and server engineering, and data center operations in a high-tech environment.
3+ years designing and implementing successful public cloud (AWS) solutions.
Proven work experience as a Platform Engineer, or in a similar role, in a cloud-native environment.
Experience designing and deploying scalable cloud-based solution architectures for PaaS, IaaS, or SaaS.
Deep hands-on experience leading the design and deployment of technology infrastructure: network, compute, storage, and virtualization.
Experience with cloud analytics services and data engineering is a plus.
Skills:
Good understanding of the data science model lifecycle stages
Proficiency in Python development
AWS cloud experience, especially in the data and analytics space: Athena, Redshift, Glue, SageMaker
AWS CloudFormation deployment – Infrastructure as Code
GitLab and the AWS CI/CD tool stack; GitOps experience is mandatory
AWS SageMaker experience – build, train, test, deploy
MLOps best practices
Solid experience in Docker
Experience with SQL and Spark
Experience working with Machine learning projects
Enterprise security policies (IAM, SecureAuth, Ping)
R (optional)
Java (optional)
Airline experience (optional)
Required Skills (first value = Sr Level resource, second value = Tech Lead; N = Nice to Have, M = Must Have):
DevOps CI/CD: M / M
GitLab: M / M
Jenkins: N / N
Monitoring, Metrics & Observability: N / M
Languages – Python / Scala / Javascript / Java: M / M (primarily Python; Java for on-prem services; Javascript rarely)
Frameworks – Serverless / AWS CDK or SAM: N / M
Cloud Tools – Stacker/Terraform: N / M
CloudFormation: M / M
Cloud Experience (AWS, Azure) – Infrastructure as Code (IaC) via CloudFormation: M / M
Identity and Access Management (IAM): M / M
Cloud Storage – S3: M / M
Glue – Crawler, Catalog, Glue Registry: N / N
Step Functions: N / N
Serverless Functions – Lambda: N / M
Docker: M / M
Basic Qualification:
Additional Skills: Remote role
Background Check: Yes
Drug Screen: Yes
Notes:
Selling points for candidate: Remote role
Project Verification Info:
Candidate must be your W2 Employee: No
Exclusive to Apex: No
Face to face interview required: No
Candidate must be local: No
Candidate must be authorized to work without sponsorship: No
Interview times set: No
Type of project: Development/Engineering
Master Job Title: Data Scientist
Branch Code: Dallas
Job Type: Full-time

