UKG

Lead Data Engineer

Lowell, Massachusetts, US · Full-time · 6 days ago · via The Muse

Salary

$129,500.00 to $186,100.00 (base pay range)

Job type

Full-time

Location

Lowell, Massachusetts, US

Remote

No

Posted

6 days ago

Job description

Why UKG

At UKG, the work you do matters. The code you ship, the decisions you make, and the care you show a customer all add up to real impact. Today, tens of millions of workers start and end their days with our workforce operating platform, which helps people get paid, grow in their careers, and shape the future of their industries. That's what we do. We never stop learning. We never stop challenging the norm. We push for better, and we celebrate the wins along the way. Here, you'll get flexibility that's real, benefits you can count on, and a team that succeeds together. Because at UKG, your work matters, and so do you.

Overview

We are seeking a Lead Data Engineer to design, build, and lead the delivery of scalable data solutions on our Cloud Data Platform (GCP/Azure). This is a critical hands-on leadership role, owning complex data pipelines and models while guiding a small team of data engineers through design, development, and best practices. The ideal candidate is a strong technical contributor who can translate architectural guidance into execution, partner closely with Product Owners and Architects, and ensure high-quality, reliable data solutions aligned to business needs.

Essential Duties & Responsibilities

Design & Build Data Solutions
  • Design, develop, and maintain end-to-end data pipelines and transformations using Azure (ADF, Databricks, ADLS Gen2, Synapse) and GCP (Composer, Data Fusion, Dataform, DataProc, BigQuery, GCS).
  • Implement fact and dimension models, CDC patterns, and cloud-native ELT pipelines following established architectural standards.
  • Optimize data pipelines for performance, scalability, and cost efficiency.

Technical Leadership & Delivery
  • Serve as the technical lead for one or more data domains or initiatives.
  • Lead design discussions, perform code reviews, and ensure adherence to engineering standards and best practices.
  • Mentor and guide senior and mid-level data engineers through hands-on coaching and technical feedback.
  • Act as the first point of escalation for complex technical issues within owned initiatives.

Collaboration & Alignment
  • Partner with Data Architects, Product Owners, Analysts, and Platform teams to translate business requirements into effective data solutions.
  • Contribute to data modeling decisions to ensure alignment with analytics and reporting use cases.
  • Support cross-team initiatives by implementing shared frameworks and reusable components defined at the platform level.

Operational Excellence
  • Ensure reliability and maintainability of data pipelines through monitoring, alerting, and automated testing.
  • Implement and maintain CI/CD pipelines using Azure DevOps and/or GitHub Actions.
  • Create and maintain technical documentation for pipelines, models, and processes.
  • Support data quality, governance, and metadata standards within owned solutions.

Qualifications

Experience
  • 7-9+ years of experience in data engineering and data warehousing, including cloud-based solutions.
  • Proven experience delivering complex data pipelines and warehouse solutions in Azure and/or GCP.
  • Strong hands-on experience with Python, SQL, Spark, and distributed data processing.

Technical Proficiency
  • Azure: Data Factory, Databricks, Synapse, ADLS Gen2, Azure DevOps
  • GCP: Composer, Data Fusion, Dataform, DataProc, BigQuery, GCS, GitHub
  • Solid understanding of dimensional modeling, ELT, CDC, and modern data lakehouse concepts.

Leadership & Collaboration
  • Demonstrated ability to lead technical delivery and influence design decisions within a team.
  • Experience mentoring engineers through code reviews, pairing, and design guidance.
  • Strong communication skills with the ability to explain technical concepts to non-technical stakeholders.

Preferred
  • Experience integrating data from SaaS platforms (Salesforce, D365, Qualtrics, Pendo, etc.).
  • Familiarity with DataOps practices, orchestration, and monitoring tools.
  • Exposure to data quality and governance concepts.
  • Cloud certification (Azure or GCP) is a plus.

Company Overview

UKG is the workforce operating platform that puts workforce understanding to work. With the world's largest collection of workforce insights and people-first AI, our ability to reveal unseen ways to build trust, amplify productivity, and empower talent is unmatched. It's this expertise that equips our customers with the intelligence to solve any challenge in any industry, because great organizations know their workforce is their competitive edge. Learn more at ukg.com.

Equal Opportunity Employer

UKG is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, disability, religion, sex, age, national origin, veteran status, genetic information, and other legally protected categories. View the EEO Know Your Rights poster. UKG participates in E-Verify; view the E-Verify posters here. It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.

Disability Accommodation in the Application and Interview Process

For individuals with disabilities who need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com.

Compensation

The pay range for this position is $129,500.00 to $186,100.00. The actual base pay offered may vary depending on skills, experience, job-related knowledge, and work location. In addition to base pay, employees may be eligible to participate in a performance-based bonus plan and to receive restricted stock unit awards as part of total compensation. Learn more about UKG's benefits and rewards at https://www.ukg.com/about-us/careers/benefits

Responsibilities

  • This role plays a critical hands-on leadership function, owning complex data pipelines and models while guiding a small team of data engineers through design, development, and best practices
  • The ideal candidate is a strong technical contributor who can translate architectural guidance into execution, partner closely with Product Owners and Architects, and ensure high-quality, reliable data solutions aligned to business needs
  • Design, develop, and maintain end-to-end data pipelines and transformations using Azure (ADF, Databricks, ADLS Gen2, Synapse) and GCP (Composer, Data Fusion, Dataform, DataProc, BigQuery, GCS)
  • Implement Fact and Dimension models, CDC patterns, and cloud-native ELT pipelines following established architectural standards
  • Optimize data pipelines for performance, scalability, and cost efficiency
  • Serve as the technical lead for one or more data domains or initiatives
  • Lead design discussions, perform code reviews, and ensure adherence to engineering standards and best practices
  • Mentor and guide Senior and Mid-level Data Engineers through hands-on coaching and technical feedback
  • Act as the first point of escalation for complex technical issues within owned initiatives
  • Partner with Data Architects, Product Owners, Analysts, and Platform teams to translate business requirements into effective data solutions
  • Contribute to data modeling decisions to ensure alignment with analytics and reporting use cases
  • Support cross-team initiatives by implementing shared frameworks and reusable components defined at the platform level
  • Ensure reliability and maintainability of data pipelines through monitoring, alerting, and automated testing
  • Implement and maintain CI/CD pipelines using Azure DevOps and/or GitHub Actions
  • Create and maintain technical documentation for pipelines, models, and processes
  • Support data quality, governance, and metadata standards within owned solutions
  • 7-9+ years of experience in data engineering and data warehousing, including cloud-based solutions
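To make the CDC and dimensional-modeling duties above concrete, here is a minimal sketch of a CDC-style upsert into a dimension table. It uses SQLite purely as a stand-in; the table, columns, and sample rows are hypothetical, and a real pipeline on this platform would target BigQuery or Databricks with a MERGE statement rather than SQLite's ON CONFLICT clause.

```python
import sqlite3

# Hypothetical dimension table; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_employee (
        employee_id INTEGER PRIMARY KEY,  -- business key
        name TEXT,
        department TEXT,
        updated_at TEXT
    )
""")

# Existing dimension rows.
cur.executemany(
    "INSERT INTO dim_employee VALUES (?, ?, ?, ?)",
    [(1, "Ada", "Payroll", "2024-01-01"),
     (2, "Grace", "Scheduling", "2024-01-01")],
)

# An incoming CDC batch: one update (id 2) and one new row (id 3).
changes = [(2, "Grace", "Analytics", "2024-02-01"),
           (3, "Alan", "Payroll", "2024-02-01")]

# Upsert the batch; ON CONFLICT plays the role a warehouse MERGE would.
cur.executemany(
    """
    INSERT INTO dim_employee (employee_id, name, department, updated_at)
    VALUES (?, ?, ?, ?)
    ON CONFLICT(employee_id) DO UPDATE SET
        name = excluded.name,
        department = excluded.department,
        updated_at = excluded.updated_at
    """,
    changes,
)
conn.commit()

rows = cur.execute(
    "SELECT employee_id, department FROM dim_employee ORDER BY employee_id"
).fetchall()
print(rows)  # [(1, 'Payroll'), (2, 'Analytics'), (3, 'Payroll')]
```

The same idempotent keyed-merge shape underlies most CDC loads, whatever the engine: apply each change once, keyed on the business key, updating matched rows and inserting new ones.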

Qualifications

  • Proven experience delivering complex data pipelines and warehouse solutions in Azure and/or GCP
  • Strong hands-on experience with Python, SQL, Spark, and distributed data processing
  • Azure: Data Factory, Databricks, Synapse, ADLS Gen2, Azure DevOps
  • GCP: Composer, Data Fusion, Dataform, DataProc, BigQuery, GCS, GitHub
  • Solid understanding of dimensional modeling, ELT, CDC, and modern data lakehouse concepts
  • Demonstrated ability to lead technical delivery and influence design decisions within a team
  • Experience mentoring engineers through code reviews, pairing, and design guidance
  • Strong communication skills with the ability to explain technical concepts to non-technical stakeholders
  • Exposure to data quality and governance concepts
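The dimensional-modeling qualification above can likewise be sketched in a few lines: a star schema joins a fact table to its dimensions for aggregation. Again SQLite is only a stand-in, and the schema and values are hypothetical, chosen to illustrate the concept rather than any actual UKG model.

```python
import sqlite3

# Hypothetical star schema: one fact table, one dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE fact_hours (
        date_key INTEGER REFERENCES dim_date(date_key),
        employee_id INTEGER,
        hours_worked REAL
    );
    INSERT INTO dim_date VALUES (20240115, '2024-01'), (20240215, '2024-02');
    INSERT INTO fact_hours VALUES
        (20240115, 1, 8.0),
        (20240115, 2, 7.5),
        (20240215, 1, 8.0);
""")

# A typical analytics query: aggregate facts by a dimension attribute.
monthly = cur.execute("""
    SELECT d.month, SUM(f.hours_worked)
    FROM fact_hours f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
print(monthly)  # [('2024-01', 15.5), ('2024-02', 8.0)]
```

Keeping measures in narrow fact tables and descriptive attributes in dimensions is what lets the reporting use cases mentioned in the posting slice the same facts by date, employee, or any other dimension.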

Benefits

  • Disability accommodation: individuals with disabilities who need additional assistance at any point in the application and interview process may email UKGCareers@ukg.com
  • The pay range for this position is $129,500.00 to $186,100.00
  • The actual base pay offered may vary depending on skills, experience, job-related knowledge and work location
  • In addition to base pay, employees may be eligible to participate in a performance-based bonus plan and to receive restricted stock unit awards as part of total compensation
  • Learn more about UKG's benefits and rewards at https://www.ukg.com/about-us/careers/benefits
