YSI | Data Analyst/Cloud ETL Developer | Richmond, VA | United States | BigDataKB.com | 2024-07-28

Job Location: Richmond, VA

Job Detail:

Yakshna Solutions, Inc. (YSI) is a CMMI Level 3 assessed, ISO 9001, ISO 20000-1, and ISO 27001 certified, woman-owned small business enterprise headquartered in Herndon, Virginia, USA. YSI provides professional IT solutions and services to business corporations and government organizations. YSI is committed to serving its business communities as a leading IT vendor providing innovative, quality, and cost-effective IT business solutions and services.

We offer a competitive benefits package that includes the following: 401(k), health, dental, and vision insurance, Life insurance, short-term and long-term disability insurance, paid time off, training, and professional development assistance.

YSI is seeking a highly qualified Data Analyst/Cloud ETL Developer. The selected candidate will be able to communicate effectively (written/verbal), possess strong interpersonal skills, be self-motivated, and be innovative in a fast-paced environment.

· Local Richmond, VA candidates ONLY due to the onsite requirement

· This position requires onsite work 3 days a week, with 2 days remote.

· The contractor will be responsible for purchasing parking through VDOT's Parking Management Office or procuring their own parking.

Job Description: Data Analyst/Sr. ETL Developer

· YSI is seeking a Master Data Analyst with demonstrated experience in data analytics to work as a key member of the Enterprise Data Asset team. This analyst will support teams working in Agile (Sprint) to analyze datasets to be made available in a cloud-based data management platform that will enable the agency to produce master data with data governance.

Responsibilities include analyzing source systems that contain a spatial component for candidate datasets; documenting business processes and the data lifecycle; developing data requirements, user stories, and acceptance criteria; and developing testing strategies. The analyst will develop ETL processes to extract business and spatial data and load it into a data warehousing environment, design and test the performance of the system, and consult with various teams to understand the company's data storage needs and develop data warehousing options. The role requires deep knowledge of coding languages such as Python, Java, XML, and SQL, and familiarity with warehousing architecture techniques such as MOLAP, ROLAP, ODS, DM, and EDW.

· VDOT is a fast-paced organization with very high standards for work quality and efficiency. This position is expected to handle multiple projects and remain flexible and productive despite changing priorities and processes. Ongoing improvement and efficiency are part of our culture, and each team member is expected to proactively contribute to process improvements.

Responsibilities:

· Work with the project team members and business stakeholders to understand business processes and pain points

· Develop expertise in source system datasets and the data lifecycle

· Profile source data, which may contain a spatial component; review source data and compare content and structure to dataset requirements; identify conflicts and determine recommendations for resolution

· Conduct entity resolution to identify matching, merging, and semantic conflicts

· Elicit, record, and manage metadata

· Diagram current processes and proposed modifications using process flows, context diagrams, and data flow diagrams

· Decompose requirements into Epics and Features and create clear, concise user stories that are easy for technical staff to understand and implement

· Utilize progressive elaboration; map stories to data models and architectures to be used by internal staff to facilitate master data management

· Identify and group related user stories into themes; document dependencies and associated business processes

· Discover and document requirements and user stories with a focus on improving both business and technical processing

· Assist the Product Owner in maintaining the product backlog

· Create conceptual prototypes and mock-ups

· Collaborate with staff, vendors, consultants, and contractors as they are engaged on tasks to formulate, detail, and test potential and implemented solutions

· Perform Quality Analyst functions such as defining test objectives, test plans, and test cases, and executing test cases

· Coordinate and facilitate User Acceptance Testing with the business, and keep Project Managers/Scrum Masters informed of progress

· Design and develop systems for the maintenance of the Data Asset Program (Data Hub), ETL processes (including ETL processes for spatial data), and business intelligence

· Develop new data engineering processes that leverage a new cloud architecture, and extend or migrate existing data pipelines to this architecture as needed

· Design and support the data warehouse database and table schemas for new and existing data sources for the data hub and warehouse; design and develop Data Marts

· Work closely with data analysts, data scientists, and other data consumers within the business to gather requirements and populate data hub and data warehouse table structures optimized for reporting

· Partner with the data modeler and data architect to refine the business's data requirements, which must be met for building and maintaining Data Assets

Qualifications Required:

· The candidate must have a minimum of 10 years of experience delivering business data analysis artifacts

· 5+ years of experience as an Agile Business Analyst; strong understanding of Scrum concepts and methodology

· Experience organizing and maintaining Product and Sprint backlogs

· Experience translating client and product strategy requirements into dataset requirements and user stories

· Proficient with defining acceptance criteria and managing the acceptance process

· Exceptional experience writing complex SQL queries for SQL Server and Oracle

· Experience with Azure Databricks

· Experience with ESRI ArcGIS

· Experience with enterprise data management

· Expertise with Microsoft Office products (Word, Excel, Access, Outlook, Visio, PowerPoint, Project Server)

· Experience with reporting systems – operational data stores, data warehouses, data lakes, and data marts

· The candidate must have exceptional written and oral communication skills and a proven ability to work well with a diverse set of peers and customers

Preferred Skills:

· Advanced understanding of data integrations

· Strong knowledge of database architectures

· Strong analytical and problem-solving skills

· Ability to build strong relationships both internally and externally

· Ability to negotiate and resolve conflicts

· Ability to effectively prioritize and handle multiple tasks and projects

· Strong written and verbal communication skills

· Desire to learn, innovate, and evolve technology

Computer Skills/MS Office/Software:

· Excellent computer skills and high proficiency in MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server, all of which are necessary to create visually and verbally engaging ETL and data designs and tables, and to communicate documentation and reporting

· Deep passion for data analytics technologies as well as analytical and dimensional modeling. The candidate must be extensively familiar with ETL (Extraction, Transformation & Load), data warehousing, and business intelligence tools such as Business Objects, Power BI, and Tableau

· The candidate must also have extensive knowledge of database design and modeling in the context of data warehousing

· Experience with key data warehousing architectures, including Kimball and Inmon, and broad experience designing solutions using a variety of data stores (e.g., HDFS, Azure Data Lake Store, Azure Blob Storage, Azure SQL Data Warehouse, Azure Cosmos DB)

Technologies Required:

· Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse

· IBM DataStage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, Azure SQL Data Warehouse

· Operating system environments (Windows, Unix, etc.)

· Scripting experience with Windows and/or Python, and Linux shell scripting

Job Types: Full-time, Contract

Pay: $125,000.00 – $130,000.00 per year

Benefits:

  • 401(k)
  • Dental insurance
  • Health insurance

Schedule:

  • 8 hour shift
  • Monday to Friday

Experience:

  • Informatica: 10 years (Required)
  • SQL: 10 years (Required)
  • Data warehouse: 10 years (Required)
  • delivering business data analysis artifacts: 10 years (Required)
  • Agile Business Analyst: 5 years (Required)
  • organizing, maintaining Product, Sprint backlogs: 10 years (Required)
  • Azure Databricks: 5 years (Required)
  • ESRI ArcGIS: 5 years (Required)
  • enterprise data management: 5 years (Required)
  • Azure Cloud engineering: 10 years (Required)
  • Python, Linux Shell scripting: 10 years (Required)
  • IBM DataStage, Erwin, SQL Server: 10 years (Required)

Ability to Commute:

  • Richmond, VA 23219 (Required)

Ability to Relocate:

  • Richmond, VA 23219: Relocate before starting work (Required)

Work Location: In person

Apply Here
