Senior Data Engineer
- Location(s): 2795 E Cottonwood Pkwy, Salt Lake City, UT 84121, United States of America
- Date Posted: 01/21/2026
- Job ID: R-76532
*** PLEASE NOTE: This is a hybrid role requiring on-site work at our corporate headquarters in Salt Lake City, UT. We are unable to sponsor or take over sponsorship of an employment visa at this time. ***
Job Summary
The Senior Data Engineer serves as a technical expert within the team, owns critical data systems, mentors others, and drives reliability and excellence in modern data engineering. The role focuses on optimizing data architectures for scalability, integrating advanced tools beyond traditional warehousing, building proficiency in Python while supporting Fabric migration efforts, and applying Kimball dimensional modeling expertise to ensure robust, performant data solutions. It also designs, codes, tests, debugs, and documents complex databases.
Primary Responsibilities
Architect scalable data pipelines and dimensional models across hybrid environments, applying Kimball methodology (e.g., bus architecture, star/snowflake schemas, fact table granularity, slowly changing dimensions, surrogate keys); see the Type 2 sketch following this list.
Lead technical execution of data projects, including migrations to Microsoft Fabric (e.g., refactoring Synapse Pipelines to Fabric equivalents while preserving dimensional integrity).
Mentor Associates and Mid-level engineers; review pipeline designs, Python code, dimensional models, and implementations.
Proactively identify and resolve data performance, quality, security, or scalability issues.
Ensure adherence to data governance standards, security practices (e.g., encryption, access controls), and compliance requirements.
Break down complex data initiatives into actionable plans, incorporating Kimball principles and Fabric components (e.g., Dataflows, Notebooks, Lakehouse).
Implement and maintain Git-based workflows for data pipelines, notebooks, and transformations, including branching strategies for safe development.
Configure and execute promotions in Fabric Deployment Pipelines, handling environment-specific rules and content.
Conduct data quality and pipeline tests during the development cycle, ensuring changes are reliable before cross-environment deployment (a quality-check sketch follows this list).
Support migration-related CI/CD activities, such as refactoring Synapse Pipelines to Fabric equivalents with version-controlled artifacts.
Independently translate business requests into technical requirements through direct engagement with requestors.
Deliver end-to-end solutions for complex or high-impact requests, including proactive suggestions for improvements.
Provide technical guidance during requirement refinement and feedback sessions with requesting teams.
Design and implement archive, recovery, and load strategies.
Determine database structural requirements by analyzing client or internal operations, applications, and programming.
Review objectives with clients or internal users and evaluate current systems.
Coordinate new data development, ensuring consistency and integration with the existing warehouse structure.
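For context on the Kimball concepts referenced above, here is a minimal sketch of a Type 2 slowly changing dimension update in Python with pandas. The column names (surrogate_key, effective_date, end_date, is_current) and the helper itself are illustrative assumptions, not a prescribed implementation; in this stack the same pattern would more likely live in a T-SQL MERGE or a Fabric notebook.

```python
# Sketch of a Kimball Type 2 slowly changing dimension update in pandas.
# All column names are hypothetical illustrations of the pattern.
import pandas as pd

def apply_scd2(dim, incoming, natural_key, tracked_cols):
    """Return dim updated with Type 2 history for an incoming snapshot.

    Expected dim columns: surrogate_key, <natural_key>, <tracked_cols>,
    effective_date, end_date, is_current.
    """
    dim = dim.copy()
    today = pd.Timestamp.today().normalize()
    current = dim[dim["is_current"]]

    # Compare each incoming row against the current version of that member.
    merged = incoming.merge(
        current[[natural_key, "surrogate_key"] + tracked_cols],
        on=natural_key, how="left", suffixes=("", "_cur"))

    is_new = merged["surrogate_key"].isna()
    changed = pd.Series(False, index=merged.index)
    for col in tracked_cols:
        changed |= merged[col].ne(merged[f"{col}_cur"])
    changed &= ~is_new  # "changed" only applies to members already tracked

    # Expire the superseded current rows.
    expired = merged.loc[changed, "surrogate_key"]
    dim.loc[dim["surrogate_key"].isin(expired),
            ["end_date", "is_current"]] = [today, False]

    # Append new versions (and brand-new members) under fresh surrogate keys.
    inserts = merged.loc[changed | is_new, [natural_key] + tracked_cols].copy()
    start = int(dim["surrogate_key"].max()) + 1 if len(dim) else 1
    inserts["surrogate_key"] = range(start, start + len(inserts))
    inserts["effective_date"] = today
    inserts["end_date"] = pd.NaT
    inserts["is_current"] = True
    return pd.concat([dim, inserts], ignore_index=True)
```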
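Likewise, a small illustration of the kind of data-quality check run before cross-environment deployment. The table, columns, and checks are hypothetical; the point is validating the declared grain and dimension lookups before promotion.

```python
# Illustrative pre-deployment data-quality checks for a fact table.
# Table name, columns, and rules are hypothetical examples.
import pandas as pd

def check_fact_sales(df: pd.DataFrame) -> list[str]:
    """Return failed-check messages; an empty list means all checks pass."""
    failures = []

    # Grain check: one row per (order_id, line_number).
    if df.duplicated(subset=["order_id", "line_number"]).any():
        failures.append("duplicate rows at declared grain")

    # Referential integrity: every fact row carries a dimension surrogate key.
    if df["customer_key"].isna().any():
        failures.append("null customer_key (broken dimension lookup)")

    # Basic sanity: no negative quantities.
    if (df["quantity"] < 0).any():
        failures.append("negative quantity values")

    return failures

if __name__ == "__main__":
    facts = pd.read_parquet("fact_sales.parquet")  # hypothetical extract
    problems = check_fact_sales(facts)
    assert not problems, f"data-quality failures: {problems}"
```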
Key Tools & Technologies
Advanced SQL for querying, transformations, performance tuning, and Kimball dimensional modeling.
Python for scripting, automation, custom logic, and Fabric notebooks.
Azure Synapse Pipelines, SQL Server, and Azure Functions.
Microsoft Fabric (Lakehouse, OneLake, Pipelines, Notebooks, Dataflows); a notebook sketch follows this list.
Git integration and Fabric Deployment Pipelines for version control and CI/CD workflows.
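As a flavor of the Fabric notebook work referenced above, a brief sketch of a notebook cell landing raw files in a Lakehouse Delta table. The paths and table names are hypothetical; the `spark` session is assumed to be supplied by the notebook runtime, as it is in Fabric notebooks.

```python
# Sketch of a Fabric notebook cell: raw CSV files -> curated Delta table.
# Paths and names are hypothetical; `spark` is provided by the runtime.
from pyspark.sql import functions as F

raw = spark.read.option("header", "true").csv("Files/raw/orders/")

# Light typing/cleanup before landing in the curated zone.
orders = (raw
          .withColumn("order_date", F.to_date("order_date"))
          .withColumn("quantity", F.col("quantity").cast("int")))

# Write as a managed Delta table in the attached Lakehouse.
orders.write.mode("overwrite").format("delta").saveAsTable("curated_orders")
```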
Job Specifications
Demonstrates proficiency in all areas of data engineering, with deep specialization in dimensional modeling, pipeline architecture, and modern cloud platforms.
Participates in developing technical/business approaches and new or enhanced technical tools, including CI/CD best practices.
Has advanced knowledge of scalable data pipelines, lakehouse architectures, and high-volume processing in Azure Synapse and Microsoft Fabric environments.
Education and Experience
Typically requires 5+ years of related experience and a bachelor’s degree (or equivalent experience).
Strong hands-on experience with Microsoft Azure data services, SQL Server, Python, and data modeling (Kimball methodology preferred).
Experience supporting or leading migrations to modern platforms like Microsoft Fabric is highly desirable.
Work Environment & Physical Requirements
Performs sedentary work in an office environment, with limited lifting (less than 10 pounds) and limited walking. Close visual acuity is required to perform work at a computer terminal. No exposure to adverse environmental conditions. Requires repetitive typing motion, talking, hearing, grasping, and feeling.
Disclaimer
The job description outlines the general nature and scope of work employees perform in this role. It's not intended to be an exhaustive list of all duties, responsibilities, or qualifications required for the position. The company reserves the right to modify, revise, or update the job description to meet business needs.