
Art of Living HR | Python Developer | Greater Bengaluru Area | Bharat | BigDataKB.com | 2023-02-28

Job Location: Greater Bengaluru Area

Job Detail:

Python Developer Job Description :

We are looking for an experienced Python developer to join our engineering team and help us create dynamic software applications for our clients. In this role, you will be responsible for writing and testing scalable code, developing back-end components, and integrating user-facing elements in collaboration with front-end developers.

To be successful as a Python developer, you should possess in-depth knowledge of object-relational mapping, experience with server-side logic, and above-average knowledge of Python programming. Ultimately, a top-class Python developer is able to design highly responsive web applications that perfectly meet the needs of the client.

Python Developer Responsibilities:

Coordinating with development teams to determine application requirements.

Writing scalable code using Python programming language.

Testing and debugging applications.

Developing back-end components.

Integrating user-facing elements using server-side logic.

Assessing and prioritizing client feature requests.

Integrating data storage solutions.

Coordinating with front-end developers.

Reprogramming existing databases to improve functionality.

Developing digital tools to monitor online traffic.

Python Developer Requirements:

Bachelor’s degree in computer science, computer engineering, or related field.

2-3 years of experience as a Python developer.

Expert knowledge of Python and related frameworks, plus container tooling such as Docker and Kubernetes.

A deep understanding of multi-process architecture and the threading limitations of Python.

Knowledge of Java, C++, or C# is optional.

Ability to integrate multiple data sources into a single system.

Familiarity with testing tools.

Ability to collaborate on projects and work independently when required.
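The "threading limitations of Python" called out above refer to the Global Interpreter Lock (GIL). A minimal standard-library sketch of the issue, with an illustrative (hypothetical) CPU-bound workload:

```python
# Sketch: why CPU-bound Python work is usually moved to a multi-process
# architecture. The GIL lets only one thread execute Python bytecode at a
# time, so threads do not parallelize CPU-bound functions like this one.
from concurrent.futures import ThreadPoolExecutor

def cpu_bound(n: int) -> int:
    # Deliberately CPU-heavy: threads serialize on the GIL for this workload.
    return sum(i * i for i in range(n))

def run_threaded(tasks):
    # Threads still help for I/O-bound work (network, disk), where the GIL
    # is released while waiting; shown here only to contrast the two models.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(cpu_bound, tasks))

results = run_threaded([10_000] * 4)
```

For genuine CPU parallelism, the usual fix is to swap `ThreadPoolExecutor` for `ProcessPoolExecutor`: each worker process gets its own interpreter, and therefore its own GIL.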

Experience: 2-3 yrs

Salary: As per industry standards

Location: Bangalore

Apply Here

Submit CV To All Data Science Job Consultants Across Bharat For Free

Apollo Hospital | Data Analyst | Tiruchirappalli, Tamil Nadu, The Great Bharat | Bharat | BigDataKB.com | 2023-02-28

Job Location: Tiruchirappalli, Tamil Nadu, The Great Bharat

Job Detail:

A data analyst is responsible for organizing data related to sales numbers, market research, logistics, linguistics, or other behaviors. They utilize technical expertise to ensure data is accurate and high-quality. Data is then analyzed, designed, and presented in a way that helps individuals, businesses, and organizations make better decisions.

Roles and Responsibilities:

Using automated tools to extract data from primary and secondary sources

Removing corrupted data and fixing coding errors and related problems

Developing and maintaining data systems – reorganizing data in a readable format 

Performing analysis to assess the quality and meaning of data

Filtering data by reviewing reports and performance indicators to identify and correct code problems

Using statistical tools to identify, analyze, and interpret patterns and trends in complex data sets that could be helpful for diagnosis and prediction

Assigning numerical value to essential business functions so that business performance can be assessed and compared over periods of time.

Analyzing local, national, and global trends that impact both the organization and the industry

Preparing reports for the management stating trends, patterns, and predictions using relevant data

Working with programmers, engineers, and management heads to identify process improvement opportunities, propose system modifications, and devise data governance strategies. 

Preparing final analysis reports for the stakeholders to understand the data-analysis steps, enabling them to take important decisions based on various facts and trends. 
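The extract → clean → analyze steps above can be sketched with the standard library alone. The column names and sample rows below are hypothetical:

```python
# Minimal sketch of "removing corrupted data" then "performing analysis":
# parse a raw CSV extract, drop rows whose numeric field is missing or
# malformed, then compute a simple summary statistic on the clean rows.
import csv
import io
import statistics

raw = """\
region,sales
North,120
South,not_a_number
East,95
West,
North,140
"""

clean = []
for row in csv.DictReader(io.StringIO(raw)):
    try:
        clean.append({"region": row["region"], "sales": int(row["sales"])})
    except (TypeError, ValueError):
        continue  # corrupted or missing value: drop the row

mean_sales = statistics.mean(r["sales"] for r in clean)
```

In practice a library like pandas would replace the hand-rolled loop, but the shape of the work (validate, drop, summarize) is the same.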

Skills:
Cleansing and preparing data
Analyzing and exploring data
Expertise in statistics
Analyzing and visualizing data
Reports and dashboards
Communication and writing
Expertise in the domain
Solution-oriented

Experience: 1-4 yrs
Immediate joiners preferred

Apply Here


Aegan Technologies | Gcp Data Engineer | Bhagya Nagar, Chennai | Bharat | BigDataKB.com | 2023-02-28

Job Location: Bhagya Nagar, Chennai

Job Detail:

Job Description

We at Aegan Technologies are hiring GCP Data Analyst resources for a CMMI Level 5 client.


Role : GCP Data Analyst
Experience : 4 – 6 yrs
Location : Hyderabad/Chennai

Mode : Work from Office

Primary Skill : Kafka

Secondary Skill : GCP

Experience Required:
• Strong knowledge of the GCP platform and R or Python
• Preferred:
a. Build Apache Kafka data pipelines
b. Maintain Apache Kafka data pipelines
c. Troubleshoot Kafka data pipelines
• Strong experience with at least one programming language: R or Python.
• Good experience with any cloud provider such as AWS, Azure, or GCP (preferably GCP)
• Knowledge of cloud development (CI/CD)
• Strong knowledge of network, storage, and cloud services
• Infrastructure planning, testing, and development (e.g., creating new instances on the cloud)
• Knowledge of automation using Python
• Experience with reporting, automation, and monitoring of environments
• Willingness to work around the clock
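As a broker-free illustration of the build/maintain/troubleshoot cycle for the Kafka pipelines named above, here is a toy produce → transform → consume sketch using a standard-library queue as a stand-in for a topic. The record fields are hypothetical; a real pipeline would use a Kafka client library against a running broker:

```python
# Toy sketch of a Kafka-style pipeline shape. A queue.Queue stands in for
# one topic partition; messages are serialized to bytes, as Kafka requires.
import json
import queue

topic = queue.Queue()  # stand-in for a Kafka topic partition

def produce(records):
    # Producer side: serialize each record and append it to the "topic".
    for rec in records:
        topic.put(json.dumps(rec).encode())

def consume_all():
    # Consumer side: drain the "topic", applying a trivial transform step.
    out = []
    while not topic.empty():
        rec = json.loads(topic.get())
        rec["value_doubled"] = rec["value"] * 2  # hypothetical transform
        out.append(rec)
    return out

produce([{"id": 1, "value": 10}, {"id": 2, "value": 20}])
processed = consume_all()
```

Troubleshooting a real pipeline mostly means inspecting the same three stages: what was produced, what transform was applied, and what the consumer actually received.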

Interested candidates please share your resume to raj@aegan-global.com, nancy@aegan-global.com

Contact Number- 9087527698

Apply Here


Techforeus | Data Science Trainer | Mysore | Bharat | BigDataKB.com | 2023-02-28

Job Location: Mysore

Job Detail:

Data Science Trainer Job Description

We are looking for experienced Data Science Trainers who can work with us.

  • Who can train freshers/professionals at our location.
  • Experience: Min 2 years or above as Data Science Trainer
  • Location: Mysore

Responsibilities:

  • Training freshers/working professionals on in-demand skills like data analysis, machine learning, etc.
  • Delivering highly interactive lectures online that are in line with INSAID’s teaching methodology
  • Develop cutting edge and innovative content for classes to help facilitate delivery of classes in an interesting way
  • Continuously improve the delivery experience to ensure that experience of students is world class

Minimum requirement:

  • Should have minimum of 2+ years of technical training experience
  • Should have strong command over any one programming language: Python, R, .NET, Java
  • Should ideally be conversant with Machine Learning, Data analysis concepts etc.
  • Passion for teaching and training is a must
  • Strong communication skills and ability to deliver highly interactive lectures is a must

Job Type: Freelance
Contract length: 3 months

Salary: ₹448,566.66 – ₹1,878,900.27 per year

Schedule:

  • Day shift

Speak with the employer
+91 9035420598
Expected Start Date: 11/03/2023

Apply Here


Softility | Big Data Engineer | Bhagya Nagar | Bharat | BigDataKB.com | 2023-02-28

Job Location: Bhagya Nagar

Job Detail:

Big Data Developer

● Participate in the design and implementation phases for data projects to help translate high level business requirements into technical solutions

● Provide technically sound solutions for the ingestion, storage, transformation, and presentation of enterprise data, including ETL design, data storage strategies, data access and security

● Driving the design of scalable solutions while considering recoverability and resiliency requirements

● Participate in DevOps related activities, including involvement with continuous integration, automated deployment, automated testing, and continuous monitoring to enhance performance and product quality

● Provide timely support for troubleshooting and resolving production issues
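The ingestion → transformation → presentation flow described above can be sketched as a minimal ETL pass, here with the standard library's sqlite3 as the storage layer. The table, columns, and sample data are hypothetical:

```python
# Minimal extract -> transform -> load sketch illustrating the ETL design
# responsibilities above, using an in-memory SQLite database as storage.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# Extract: rows as they might arrive from a hypothetical source system.
raw_rows = [("north", "120.5"), ("south", "80.0")]

# Transform: normalize region names, cast amounts to numbers.
cleaned = [(region.upper(), float(amount)) for region, amount in raw_rows]

# Load: write the cleaned rows into the target table.
conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)

# Presentation: an aggregate a downstream report might consume.
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

A production pipeline would swap the in-memory database for a warehouse or data lake and batch the loads, but the three stages keep the same structure.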

Required Technical and Professional Expertise

● Experience with any of the following: Big Data technologies, Analytics, Hadoop ecosystems, Data Lakes, or Data Migration

● Experience with cloud platforms such as AWS, Azure, or Google Cloud

● Experience with Java, Scala, Spark

● Strong database integration and management skills

● Strong ETL skills, including database performance tuning and optimization

● Familiarity with shell scripting and Unix/Linux

● Exceptional analytical and problem-solving skills

● Running diagnostic tests and performing debugging procedures

● Documenting application development processes, procedures, and standards

● Excellent leadership and interpersonal skills

● Strong communication and team working skills

● Able to analyze complex situations and derive workable actions

● Flexibility to adjust to multiple demands, shifting priorities, ambiguity and rapid change

Preferred Technical And Professional Expertise

● Knowledge of DevOps and related tooling such as Jenkins, Ansible etc.

● Experience with Big Data technologies such as Apache Hadoop, HBase, Hive, Spark, Flink or Flume

● Strong coding skills in one of Python/Java/Scala

● Familiar with event management technologies such as Apache Kafka or Apache Airflow

● Familiar with container technologies such as Docker or Kubernetes

● Demonstrable experience coaching junior members of your teams

Job Type: Full-time

Salary: ₹500,000.00 – ₹2,200,000.00 per year

Benefits:

  • Health insurance
  • Provident Fund

Schedule:

  • Monday to Friday

Supplemental pay types:

  • Shift allowance

Ability to commute/relocate:

  • Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Preferred)

Speak with the employer
+91 7207461112

Apply Here


Randstad | Urgent Opportunity For Manager – MIS Sales Operations(Distiller) | Indraprasth / NCR | Bharat | BigDataKB.com | 2023-02-28

Job Location: Indraprasth / NCR

Job Detail:

Job Responsibilities

Manage both tracking and reporting of MIS, data analysis, and sales-automation software, based out of the head office in Delhi.

Responsible for

1. Work closely with State Heads for data requirements & Analysis.

2. Manage Bizom (sales-automation software) data updates and checks on sales-team targets and attendance.

3. Share & track pending targets for all executives on a daily basis.

4. Tracking of payment reconciliation & submission on a daily basis.

5. Daily/Weekly Sales Target planning & tracking

6. Daily tracking & reporting of Tertiary Sales of all executives.

7. Historical Data Analysis for all outlets at SKU level.

8. Competition Data Tracking

Skills Required:

1. BCOM or equivalent degree

2. 3+ years experience in MIS in AlcoBev industry

3. Strong skills in MS Excel, Tally, Tableau

4. Excellent written and verbal skills to communicate at all levels of the business

What's on offer?

1. The opportunity to work on an innovative new age whiskey brand

2. Working & growing with the company as it scales up

Apply Here


Manpowergroup Services India | Data Engineer | Chennai | Bharat | BigDataKB.com | 2023-02-28

Job Location: Chennai

Job Detail:

Greetings from ManpowerGroup!

We are hiring for one of our clients for the Chennai location.

About Client

An AWS cloud data engineering company focused on the life-science industry. It provides its customers with the top 1% of data engineers and top-quality cloud and data talent, and is working toward building a 5,000-strong certified global workforce.

Position Data Engineer – Job description

  • 3+ years of experience in IT programming and application/product development.
  • Experience with one or more big data tools such as Hadoop, Kafka, Spark, Beam, etc.
  • Proficiency in at least one programming language, such as Python.
  • Proficiency in SQL/RDBMS.
  • Good understanding of ETL.
  • Working experience with semi-structured data.
  • Very good communication and team-player skills.

Primary Skills:

1. Understanding of ETL concepts
2. Knowledge of any one ETL tool
3. Python
4. PySpark
5. AWS
6. SQL

Apply Here


Manpowergroup Services India | SAP SD Business Systems Analyst II Product Based II Remote | Permanent Remote | Bharat | BigDataKB.com | 2023-02-28

Job Location: Permanent Remote

Job Detail:

Greetings from ManpowerGroup!

We are hiring for one of our clients for a remote opportunity.

Location: Remote

About Client:

Born in 2011 through a merger of five of Japan’s most successful building materials and housing companies, we draw on our Japanese heritage to create world-leading technology and innovate to make high-quality products that transform homes. Today, we are a global enterprise with approximately 55,000 employees in more than 150 countries worldwide.

Responsibilities:

  • SAP Sales and Distribution support for end-users; provides support for all SAP OTC area
  • Engage in configuring and customizing of SAP Sales & Distribution (SD) & Logistics execution module.
  • Work with ABAP team to provide customized and enhanced SAP solutions.
  • Provide a Gap analysis and convert the user requirements into formal requirements and design documents leading into systems solutions.
  • Develop process documentation for processes and procedures.
  • Prepare and execute test plans for new functionality.
  • Participate in daily support activities; own and resolve day-to-day support tickets.
  • Conduct business-requirements-gathering meetings and prepare functional specifications.
  • Be well versed in integration areas such as interfaces and running LSMW uploads.
  • Support pricing and EDI IDoc error monitoring in all locations.
  • Engage in month-end invoice-clearing issues and troubleshooting of credit-card processing.
  • Engage in EDI and IDoc related support.
  • Be proficient in SAP queries and LSMWs.
  • Engage in learning and supporting other third-party tools in OTC areas.

Requirements:

  • At least 6 years of experience in any two (or all) of the SAP SD, CRM, and Logistics Execution modules, along with FI integration experience.
  • S4 HANA experience is a plus.
  • Working knowledge of integration, with experience integrating CRM, web services, and tax software.
  • Strong verbal and written communication skills
  • Strong analytical, problem-solving and conceptual skills
  • Strong interpersonal skills; ability to work well on cross-functional, cross-cultural teams and foster team commitment to tasks
  • Good organization, documentation, and prioritization skills
  • Experience with IDocs, ALE, and EDI; knowledge of API integrations a plus.

Apply Here


Randstad | senior devops engineer | Bengaluru | Bharat | BigDataKB.com | 2023-02-28

Job Location: Bengaluru

Job Detail:

summary


  • Bengaluru, Karnataka
  • A client of Randstad India
  • Contract
  • Reference number: JPC – 81389

job details
Title/Role: Senior DevOps Engineer
Location: Bengaluru / Pune / Gurgaon / Hybrid-Remote

Join a growing engineering team that is building the next generation of solutions for our clients. As part of this team, you will:

  • Develop cutting-edge, cloud-native digital products and solutions, powered by advanced analytics
  • Learn from seasoned thought leaders and domain experts
  • Partner with our global consultants across industries to drive strategy definition and create value for our clients in ever-changing market conditions
  • Collaborate with leading system integrators and cloud providers in a variety of delivery models to define, standardize, and mature the product delivery life cycle

About the role

For the Senior DevOps Engineer position, we are seeking an enthusiastic and passionate Cloud DevOps professional with extensive hands-on experience in Azure cloud infrastructure, security and network set-up, and building CI/CD pipelines. Expertise is needed in containerized and serverless deployments with IaC using Terraform. Additional experience with AWS, GCP, and on-prem/hybrid implementations is highly preferred.

This role will drive the overall definition and standardization of DevOps practice and will lead and support a suite of cloud-native SaaS products. The role holder will act as the go-to SME and lead the technology efforts for all products of Solutions Factory, collaborating with Infrastructure and Network Engineering teams, Tech Leads, and Solution Architects. The role is expected to stay up to date on security posture and on new developments and releases in order to maintain existing environments and the underlying cloud infrastructure, covering software component analysis, vulnerability scans, penetration testing, and data security.

Key Requirements

  • 6-12 years of established track record implementing scalable, distributed, and highly available systems in cloud environments
  • System administration with Linux/Unix, including installation/management of services, resource management, root cause analysis, and capacity planning
  • Knowledge of cloud technologies such as VPC, Security Groups, S3/Blob, IAM, Azure Functions, App Services, API Gateways, WAF, Azure AD/Cognito, ARM templates, Azure DevOps (as an integrated Agile workflow management, code repository, and CI/CD solution), Azure Kubernetes Service, Azure Container Registry, embedded Power BI, etc.
  • Experience implementing configuration management tools (e.g., Terraform, Ansible), authoring tasks for CI/CD pipelines, and source control with Git
  • Composition, deployment, and management of Docker containers on cloud platforms
  • Hands-on experience with Terraform, Docker, and Git
  • Hands-on experience with programming languages such as NodeJS, PHP, Python, shell scripting, and databases
  • Hands-on experience with relational (MySQL desirable) and NoSQL (e.g., MongoDB, Neo4j) databases
  • Hands-on experience supporting RESTful APIs, event-driven workflows, and microservices architectures
  • Hands-on experience with queueing and service orchestration technologies (e.g., Kafka, Storm, RabbitMQ)
  • Configuration and use of enterprise monitoring tools (e.g., Nagios, SolarWinds, Zabbix)
  • Background in network and security architectures
  • Strong communication skills; can engage with both technical and non-technical team members
  • Experience working with and leading remote teams
  • Experience working in an Agile Scrum environment
  • Drive to master emerging technologies and share experiences with team members
  • Proven problem-solving and critical-thinking skills
  • Front-end and back-end frameworks and programming languages; in-memory data stores/caching
  • Bachelor’s degree in computer science, engineering, or equivalent technical experience

Desirable Requirements

  • Industry certifications in relevant technologies
  • Experience setting up DR and cross-region replication
  • Experience working with Snowflake and Databricks

Infrastructure

Azure Cloud; Windows and Linux servers and OS; containerized and serverless



  • Experience: 7

Apply Here


Randstad | senior data engineer-snowflake | Bengaluru | Bharat | BigDataKB.com | 2023-02-28

Job Location: Bengaluru

Job Detail:

summary


  • Bangalore, Karnataka
  • A client of Randstad India
  • Permanent
  • Reference number: JPC – 81372

job details

About Client:

Our client is a privately held American food corporation offering a range of farmer services and risk-management solutions.

Position:

This role will work closely with Product Owners and the Data/Solution Architect to understand the product backlog, overall architecture, and vision, and will provide technical solutions to the scrum development team to deliver high-quality features/outcomes as per sprint plans.

This position will also work closely with the Data Lead and Solution Architect to continually improve the technical design, leverage available tools within the client and outside, optimize performance, etc.

Accountability

Product Development and Testing:

  • Design, build, and test product features for one or more scrum teams
  • Provide guidance and mentor the team on the development standards and methods
  • Identifies dependencies across scrum teams and solves conflicts
  • Provides solution for the given requirements, and impacts of changing requirements
  • Own technical design for more complex areas
  • Ensure the solution designed and built is supportable as part of a DevOps model
  • Work with Scrum Master / Business Analyst to define the product backlog based on given requirements
  • Work with the Scrum Master and Product Owner on bi-weekly sprint planning
  • Perform integration development to move data from production systems to CDP using patterns and standards defined by the CDP team
  • Perform data transformation, harmonization as needed
  • Deliver high quality solutions with no defects

Solution Analysis and Design:

  • Use technical expertise and knowledge to benchmark with third party organizations to identify and incorporate best practices into the overall architecture strategy
  • Proactively appraise and modify current technical solutions to identify deficiencies, research alternatives, and determine improvements
  • Assist in the decision-making process related to architecting solutions
  • Ensure the solution designed and built is supportable as part of a DevOps model
  • Creating documents that ensure consistency in development across the client organization.

Miscellaneous:

  • Coach Data Engineers
  • Miscellaneous duties as assigned

Required Qualifications

  • Bachelor’s Degree in MIS, Statistics, Business or related field
  • 7+ years of IT and business/industry work experience developing data or software applications including: analysis, design, coding, testing, deploying and supporting of applications
  • 3+ years of experience working with Snowflake building solutions that deliver value through leveraging data as an asset
  • 2+ years experience in SQL Scripting
  • 2+ years applied experience with Python
  • 1+ year experience with development using GitHub, TSVS or TFS
  • 1+ year experience with Agile methodology
  • Strong interpersonal skills; demonstrated ability to build trust and strong relationships
  • Result orientation and ability to work in ambiguous situations where requirements are unclear and specifications lack detail
  • Strong conceptual strength, strategic thinking, problem solving, technical, and analytical skills.
  • Demonstrated application of strong market awareness and customer focus.
  • Ability to ask next level questions anticipating business inquiries and performing root cause analysis

Preferred Qualifications

  • Applied experience in Hadoop
  • Applied experience in Reporting Tools like Power BI , Tableau etc
  • Applied experience with Cloud platforms including AWS and/or Azure
  • Experience working with various types of data sources such as SAP, JDE, legacy ERP systems
  • Understanding trading and risk management concepts

Work Timings: 12PM to 9PM IST

Work mode: Hybrid (3 days from office – Tue, Wed & Thursday)

Salary: Commensurate with experience


  • Experience: 10

Apply Here
