Job Location: Arlington, VA
100% Remote
Job Description
Required Skills & Experience
- Degree in Computer Science, Engineering, Mathematics, or a related field, or 5+ years of industry experience
- 7+ years of experience with demonstrated strength in ETL/ELT, data modeling, data warehouse technical architecture, infrastructure components, and reporting/analytics tools.
- 5+ years of hands-on experience writing complex, highly optimized SQL queries across large data sets.
- 3+ years of experience with scripting languages such as Python.
- A good candidate has strong analytical skills and enjoys working with large, complex data sets.
- Good knowledge of advanced SQL and AWS Redshift or other MPP databases
- A good candidate can partner with business owners directly to understand their requirements and provide data that helps them observe patterns and spot anomalies.
Preferred
- Apache Airflow
- AWS skills
Projects
As a Data Engineer, you will provide technical leadership, lead data engineering initiatives, and build end-to-end analytical solutions that are highly available, scalable, stable, secure, and cost-effective. You strive for simplicity and demonstrate creativity and sound judgment. You deliver data solutions that are customer focused, easy to consume, and create business impact. You are passionate about working with huge datasets and have experience organizing and curating data for analytics. You take a strategic, long-term view on architecting advanced data ecosystems. You are experienced in building efficient and scalable data services and can integrate data systems with AWS tools and services to support a variety of customer use cases and applications.
Interaction With Team
As a Data Engineer with Workforce Intelligence, you will partner with Software Engineers, Data Scientists and Business Intelligence Engineers. You will gain a deep understanding of our services and the data they produce, and become our resident expert in transforming that data into a useful format for analytics and business intelligence. You will proactively help to identify new data for integration with our platform, and propose and implement new technologies to help us better understand our data.
In this role, you will serve as the expert in designing, implementing, and operating a stable, scalable, low-cost environment that moves information from source systems through the data warehouse into end-user-facing reporting applications such as Tableau or AWS QuickSight. Above all, you will bring large datasets together to answer business questions and drive data-driven decision making.
Typical Tasks
- Design, implement, and operate large-scale, high-volume, high-performance data structures for analytics and data science.
- Implement real-time and batch data ingestion routines using best practices in data modeling and ETL/ELT, leveraging AWS technologies and big data tools.
- Gather business and functional requirements and translate them into robust, scalable, operable solutions with a flexible and adaptable data architecture.
- Collaborate with engineers to help adopt best practices in data system creation, data integrity, test design, analysis, validation, and documentation.
- Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service modeling and production support for customers.

