Job Location: Kochi/Cochin
Job Description:
Greetings from Sorice Solutions.
Job Purpose
As a Data Engineer at Information Asset, you will be responsible for expanding and optimizing our data and data pipeline architecture, as well as streamlining data flow and collection for cross-functional teams.
You will leverage your experience as a data pipeline builder and data wrangler to optimize data systems. In this role, you will support software developers, database architects, and data analysts on data initiatives and will ensure that optimal data delivery architecture is consistent across ongoing projects.
At Information Asset you are required to have superior communication and writing skills, technical problem-solving skills, and a customer orientation such that you can handle any issue that arises. You will advise and guide customers, helping them through their journey alone or with a team on several different engagements at once. You can understand complex technical and business concepts and explain them lucidly to others. Your ultimate mission is to deliver a first-rate client experience and positive brand awareness for Information Asset.
Working independently, or as part of a team, you will have the opportunity to work at the front end in a rapidly growing business.
Essential Functions
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics for Information Asset and its customers.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics teams to help build and optimize our offering into an industry-leading solution.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Experience
- Advanced working knowledge of SQL, including query authoring, and experience working with a variety of relational databases.
- Experience building and optimizing "big data" data pipelines, architectures, and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
- A successful history of manipulating, processing and extracting value from large, disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable "big data" data stores.
- Strong project management and organizational skills.
- Excellent writing and verbal communication skills and a high level of customer orientation.
Education
- Bachelor's degree in computer science or information technology, or equivalent work experience.
- 3+ years of experience as a Data Engineer.
- Must have technical expertise with data models, data mining, and segmentation techniques.
- Experience using one or more of the following software/tools:
- Relational SQL and NoSQL databases, including Postgres and Cassandra.
- Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Data storage technologies like Amazon S3, Snowflake/Redshift, Hive, and HDFS.
- Stream-processing systems: Storm, Spark-Streaming, etc.
- Object-oriented/functional scripting languages: Python/PySpark, Java, C++, Scala, etc.
- Data warehousing using Oracle/Snowflake.
- ETL technologies like Apache Spark, AWS Glue, Databricks, Apache Airflow, and Apache Hadoop.
- Experience supporting and working with cross-functional teams in a dynamic environment.
Certifications, Accreditations, Licenses
A data engineering certification (e.g., IBM Certified Data Engineer, AWS Certified Data Analytics – Specialty) is a plus.
Work Environment
Working hours are 9 am – 5 pm IST, with potential overlap with the onsite team; the incumbent should be prepared to work seamlessly with the US-based team.
Supervisory Responsibilities
No supervisory responsibilities.