Job Location: Madison, WI
At American Family Insurance, we believe people are an organization's most valuable asset, and their ideas and experiences matter. From our CEO to our agency force, we're committed to growing a diverse and inclusive culture that empowers innovation that will inspire, protect, and restore our customers' dreams in ways never imagined.
American Family Insurance is driven by our customers and employees. That's why we provide more than just a job; we provide opportunity. Whether you're already part of our team in search of a new challenge or new to our company and ready for what's next, you're in the right place. Every dream is a journey that starts with a single step. Start your journey right here. Join our team. Bring your dreams.
Job ID: R27870 Data Engineer III/Senior Data Engineer – AmFam Labs (open to remote) (Open)
Compensation may vary based on the job level and your geographic work location.
Compensation Minimum: $103,500
Compensation Maximum: $165,700
Summary:
The Data Science and Analytics Lab (DSAL) at American Family is a high-performing team of data scientists and software engineers dedicated to innovating in the insurtech space and delivering cloud-native solutions driven by machine learning. We set a high bar for our software engineers in terms of depth and breadth of experience, with a focus on cloud architecture, microservices, data pipelining, and DevOps.
American Family's Data Science and Analytics Lab (DSAL) is the result of American Family Insurance's investment in innovation; visit https://amfamlabs.com/ for more information.
Job Description:
Duties and Responsibilities:
- Build and deploy cloud native APIs and architectures to deliver machine-learning solutions at scale
- Continuously work to improve and model industry best practices as they relate to system architecture, tooling, CI/CD, testing, and software design
- Work with business stakeholders and collaborate with other embedded engineering teams to deliver solutions to complex customer problems on time
- Be flexible, forward-thinking, and able to pick up new skills, tools, and languages as needs evolve and change
- Deploy applications to the cloud professionally and in an automated fashion
Skills and Qualifications:
- Strong proficiency in at least one major general-purpose language (e.g., Python, Java, C#); Python preferred
- Minimum of 2 years' experience developing services on a major cloud platform (e.g., AWS, GCP, Azure)
- Understanding of CI/CD systems and Infrastructure as Code (IaC)
- Extensive experience with containerization; Kubernetes experience is a plus
- Data pipelining experience preferred (e.g. Spark, Dask, Luigi, Airflow, etc.)
- API design and execution (thorough understanding of REST is a must; gRPC and GraphQL are a plus; see the sketch after this list)
- Database design and understanding of both relational and NoSQL data stores
- Exposure to Data Science frameworks (Numpy, Pandas, Tensorflow, Pytorch, etc.) is a plus
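As a rough illustration of the REST requirement above, here is a minimal sketch of the kind of prediction endpoint this role involves, using FastAPI (one of the frameworks listed in the tech stack below). The route, input fields, and scoring logic are hypothetical placeholders, not the team's actual API.

    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class PolicyFeatures(BaseModel):
        # Hypothetical input schema for a risk-scoring model
        driver_age: int
        vehicle_value: float

    @app.post("/v1/risk-score")
    def risk_score(features: PolicyFeatures) -> dict:
        # Stand-in for a real model call (e.g., a loaded sklearn pipeline)
        score = min(1.0, features.vehicle_value / (10_000 * max(features.driver_age, 1)))
        return {"risk_score": round(score, 4)}

Served with an ASGI server such as uvicorn, an endpoint like this would typically be containerized and deployed to Kubernetes, as the tech stack section below describes.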
Perks:
- A high level of trust in engineers is the default
- Fully remote, partially remote, or on-site: up to you
- Subsidized conference attendance and training
- Flexible vacation policy
- Excellent benefits package included
About Our Team:
DSAL engineers fulfill a number of responsibilities, so we seek out candidates with a high degree of engineering aptitude, an appetite for learning, and a willingness to collaborate. Our team balances the desire to stay on the cutting edge with the need to deliver solutions that impact the future of the insurance industry. To that end, when choosing tooling or designing systems, we favor the mantra of “strong opinions, loosely held.” DSAL operates much like a start-up, bringing new products from scratch to MVP and either “failing fast” or launching systems that are used by our companies or spun out as new ventures. We expect every engineer on our team to be able to work on any of the varied components across our tech stack, with the understanding that each engineer will have their own specialties.
About Our Tech Stack:
- Python (various web frameworks including FastAPI, Flask, Falcon, and Django)
- Services containerized and deployed to Kubernetes (and sometimes Fargate)
- Data pipelines orchestrated by Luigi (see the sketch after this list)
- Data parallelization done in containers using AWS Batch/Kubernetes
- Some applications deployed on AWS, others on GCP
- Utilization of AWS Lambda, AWS Batch, GCP Cloud Functions, and GCP Cloud Run
- Code repositories and issue backlogs stored in Gitlab
- ArgoCD and Terraform Enterprise incorporated into our CI system and utilized for deployment
- Extensive use of the HashiStack (Consul, Vault, Terraform)
- Datadog for tracing, monitoring, and logging
- Use of Pomerium Identity-Aware Proxy (IAP) for authentication
- Foxpass VPNs to connect to our VPCs, though we are moving towards a zero-trust model
- Slack and Zoom for team communication
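Because Luigi orchestrates the pipelines listed above, here is a minimal sketch of a two-task Luigi dependency chain. The task names, file paths, and toy transformation are hypothetical, not the team's actual pipeline.

    import luigi

    class ExtractPolicies(luigi.Task):
        # Hypothetical extract step: writes raw records to a local file
        def output(self):
            return luigi.LocalTarget("data/policies_raw.csv")

        def run(self):
            with self.output().open("w") as f:
                f.write("policy_id,premium\n1001,820.50\n")

    class CleanPolicies(luigi.Task):
        # Declares ExtractPolicies as a dependency; Luigi runs it first
        def requires(self):
            return ExtractPolicies()

        def output(self):
            return luigi.LocalTarget("data/policies_clean.csv")

        def run(self):
            with self.input().open() as fin, self.output().open("w") as fout:
                for line in fin:
                    fout.write(line.strip().lower() + "\n")

    if __name__ == "__main__":
        # local_scheduler avoids needing a central luigid daemon for a demo
        luigi.build([CleanPolicies()], local_scheduler=True)

In production, tasks like these would more likely read from and write to cloud storage and run inside the containers described above.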
Education and Licenses
- Bachelor's degree in computer science or related field, or equivalent combination of education and experience.
Specialized Knowledge & Skills Requirements
- Demonstrated experience providing customer-driven solutions, support or service.
- In-depth knowledge of SQL or NoSQL and experience using a variety of data stores (e.g., RDBMS, analytic databases, scalable document stores)
- Extensive hands-on Python programming experience, with an emphasis on building ETL workflows and data-driven solutions; a minimal ETL sketch follows this list. Able to employ design patterns and generalize code to address common use cases. Capable of authoring robust, high-quality, reusable code and contributing to the division's inventory of libraries.
- Expertise in big data batch computing tools (e.g. Hadoop or Spark), with demonstrated experience developing distributed data processing solutions.
- Applied knowledge of cloud computing (AWS, GCP, Azure).
- Knowledge of open source machine learning toolkits, such as sklearn, SparkML, or H2O.
- Solid data understanding and business acumen in data-rich industries such as insurance or finance.
- Applied knowledge of data modeling principles (e.g. dimensional modeling and star schemas).
- Strong understanding of database internals, such as indexes, binary logging, and transactions.
- Experience using tools for infrastructure-as-code (e.g. Docker, CloudFormation, Terraform, etc.)
- Experience with software engineering tools and workflows (e.g., Jenkins, CI/CD, git).
- Practical experience authoring and consuming web services.
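To ground the ETL emphasis noted earlier in this list, here is a minimal extract-transform-load sketch in Python using pandas. The file paths and the premium column are hypothetical, and a production version would sit behind an orchestrator such as Luigi rather than a __main__ block.

    import pandas as pd

    def extract(path: str) -> pd.DataFrame:
        # Read raw policy records from CSV (path is a placeholder)
        return pd.read_csv(path)

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Normalize column names and drop rows missing a premium value
        df = df.rename(columns=str.lower)
        return df.dropna(subset=["premium"])

    def load(df: pd.DataFrame, path: str) -> None:
        # Persist the cleaned frame as Parquet for downstream analytics
        # (requires a Parquet engine such as pyarrow)
        df.to_parquet(path, index=False)

    if __name__ == "__main__":
        load(transform(extract("policies.csv")), "policies_clean.parquet")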
Travel Requirements
- This position requires travel up to 10% of the time.
Additional Job Information:
- There are multiple openings.
- Depending on qualifications, candidates may be considered at different levels.
- The teams are based in Chicago, IL and Madison, WI, so those locations are preferable; however, we are open to remote employees in the United States.
- The offer to the selected candidate will be made contingent on the results of applicable background checks.
- The offer to the selected candidate is contingent on signing a non-disclosure agreement covering proprietary information, trade secrets, and inventions.
#LI-Remote
When you work at American Family, you can expect benefits that support your physical, emotional, and financial wellbeing. You will have access to comprehensive medical, dental, vision, and wellbeing benefits that enable you to take care of your health. We also offer a competitive 401(k) contribution, a pension plan, an annual incentive, and a paid-time-off program. In addition, our student loan repayment program and paid family leave are available to support our employees and their families. Interns and contingent workers are not eligible for American Family Enterprise benefits.
We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law.
Stay connected: Join Our Enterprise Talent Community!
#LI-DB1