EPSoft | AWS Data Analytics | Franklin, TN | United States | BigDataKB.com | 2023/01/18


Job Location: Franklin, TN

Job Detail:

Position: AWS Data Analytics

Location: Franklin, TN (Day 1 onsite)

Duration: Long Term

Technical Skillset (Mandatory)

Big Data Tools: Hadoop, Cloudera (CDP), PySpark, Kafka, etc.

Data Pipeline & Integration: Airflow, Kafka, Informatica Cloud (IICS), PowerCenter

Data Governance: EDC, AXON

Stream-Processing Systems: Storm, Spark-Streaming, AWS Kinesis

Databases: Relational databases, Amazon DynamoDB, MongoDB, Snowflake, PostgreSQL

AWS Cloud Services: EC2, S3, EMR, RDS, Lambda, DMS, Kinesis, Glue

Programming Languages: Python, R

Technical Skillset (Optional)

Visualization Tool: SAP Business Objects, Tableau

Role And Responsibilities

Enhance our cloud capability by creating and implementing cloud application patterns

Develop and implement ways to move apps and workloads to the cloud

Work closely with business leads and product owners to understand solution requirements and identify architectural patterns

Write and develop cloud automation playbooks for managing and scaling containers, hosts, cloud services, and applications

Monitor cloud resources for compliance with industry guardrails and best practices

Help other development and engineering teams resolve application-to-platform integration issues for Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) offerings

Research and propose solutions for AWS data transformation, data connections, operational frameworks, and application integration

Work closely with lead architects and engineers to create and maintain architectural templates and build/operational documents

Work with DevOps and engineering teams to develop service catalogs

Based on the customer's technology landscape, arrive at solutions that are agile, scalable, and cost-effective

Provide proactive proposals to customers on cloud strategy

Understand customer problem statements and leverage AI/ML capabilities in solutions

Build reusable artifacts and accelerators in the cloud

Collaborate with technical and business users to develop and maintain enterprise-wide solutions and standards that provide the data required for metrics and analysis

Project Specific Requirement (if Any)

Data-oriented mindset, good communication skills, and an excellent eye for detail

Certifications in AWS Cloud and machine learning programs

Relevant Experience Required

12+ years' experience in enterprise solutions and 5+ years in cloud, preferably AWS

Proficient understanding of distributed computing principles

Strong knowledge and practical experience with AWS

Strong programming skills with experience in Webhook and API development using Node.js, Ruby, Python, Shell, and PowerShell

Familiarity with modern cloud application architecture

Exposure to cloud managed services and microservices, such as Function as a Service, containers, and managed databases

Thorough understanding of ML, data analysis, data visualization, and event-driven architecture

Familiarity working with large systems

Experience with setting up load balancers, cloud networks, and virtual servers

Capable of working under tight deadlines

Proficient in solutioning for real-time analytics




Apply Here

