Job Location: Hartford, CT
Job Detail:
Must Have: AWS Databricks, Python, Spark, PySpark
Location: Hartford, CT or St. Paul, MN
The candidate must relocate to one of the onsite locations.
Roles & Responsibilities
- Acts as the single point of contact for the customer's data migration to AWS projects
- Provides innovative and cost-effective solutions using AWS, Spark, Python, and the customer-suggested toolset
- Optimizes the use of all available resources
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit
- As a leader in Cloud Engineering, you will be responsible for overseeing development
- Learns and adapts quickly to new technologies as business needs require
- Builds an Operations Excellence team, developing tools and capabilities that development teams leverage to maintain high levels of performance, scalability, security, and availability
Skills
- The candidate must have 3-5 years of experience in PySpark and Python
- Hands-on experience with the AWS Cloud platform, especially S3, Lambda, EC2, and EMR (see the illustrative sketch after this list)
- Experience with Spark scripting
- Working knowledge of migrating relational and dimensional databases to the AWS Cloud platform
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
- Strong experience with relational databases and data access methods, especially SQL.
- Knowledge of Amazon AWS architecture and design
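As a rough illustration of the PySpark, S3, and ETL skills listed above, here is a minimal sketch. It assumes a Spark runtime (for example on EMR) with S3 access; the bucket names, paths, and column names are hypothetical placeholders, not part of this job description.

# Minimal PySpark ETL sketch, assuming a Spark cluster (e.g. EMR) with S3 access.
# Bucket names, paths, and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-etl-example").getOrCreate()

# Read raw relational extracts landed in S3 as CSV
orders = spark.read.csv("s3://example-bucket/raw/orders/", header=True, inferSchema=True)

# Simple transformation: aggregate order totals per customer
totals = (
    orders
    .groupBy("customer_id")
    .agg(F.sum("order_amount").alias("total_amount"))
)

# Write curated results back to S3 in Parquet for downstream consumption
totals.write.mode("overwrite").parquet("s3://example-bucket/curated/customer_totals/")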