Jobs via Dice

Senior Data Integration & Data Quality Analyst

McLean, Virginia, US · Full-time · 2 days ago · via LinkedIn

Salary: -

Job type: Full-time

Location: McLean, Virginia, US

Remote: No

Posted: 2 days ago

Job description

Dice is the leading career destination for tech experts at every stage of their careers. Our client, Cyber Sphere LLC, is seeking the following. Apply via Dice today!

Senior Data Integration & Data Quality Analyst

McLean, VA (100% onsite; in-person interview)

Long-term contract

We are seeking a Senior Data Integration & Data Quality Analyst with deep expertise in advanced SQL, data profiling, and end-to-end data integration/ETL analysis. The ideal candidate will reverse-engineer legacy pipelines and ensure strong data quality across enterprise data platforms, including Snowflake, Db2, and MongoDB. Experience with Snowflake Cortex is desired to support AI-assisted analytics, automation, and data quality use cases, and Python proficiency is strongly preferred to accelerate automation and validation. Experience in the mortgage or financial domain is highly desirable.

Qualifications

Required

  • 5+ years of experience in data integration, data quality, data warehousing, or related roles.
  • Expert-level SQL with strong experience in Snowflake and Db2 within EDW/ODS environments.
  • Experience integrating and analyzing data from MongoDB (document structures, nested fields, schema drift considerations).
  • Proven experience with data profiling and data quality analysis.
  • Demonstrated ability to reverse-engineer ETL/data pipelines and document transformation logic from existing jobs and code.
  • Strong experience producing source-to-target mappings and data lineage documentation.
  • Experience with data modeling concepts in enterprise data warehouses (e.g., dimensional modeling, defining grain, keys, relationships, and conformed dimensions).
  • Excellent analytical skills and strong technical documentation/writing ability; comfortable working with incomplete or legacy documentation.
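To make the source-to-target mapping requirement above concrete, here is a minimal Python sketch of a machine-readable mapping artifact. The column names (`BRWR_NM`, `LN_AMT`) and the transformation rules are purely illustrative assumptions, not anything specified by this role:

```python
# Hypothetical source-to-target mapping spec: each entry records the source
# column, the target column, and the transformation rule connecting them.
SOURCE_TO_TARGET = [
    {"source": "BRWR_NM", "target": "borrower_name", "rule": lambda v: v.strip().title()},
    {"source": "LN_AMT",  "target": "loan_amount",   "rule": float},
]

def apply_mapping(source_row: dict) -> dict:
    """Apply each mapping rule to a source row to produce the target-side row."""
    return {m["target"]: m["rule"](source_row[m["source"]]) for m in SOURCE_TO_TARGET}

# Example: a raw source record with padding and string-typed amounts.
target = apply_mapping({"BRWR_NM": "  JANE DOE ", "LN_AMT": "250000"})
```

Keeping mappings in a structured form like this (rather than prose-only documents) lets the same spec drive both documentation and automated validation of the target tables.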

Preferred

  • Snowflake Cortex experience (or comparable AI/LLM capabilities within a data platform) applied to analytics, automation, or documentation workflows.
  • Python proficiency (e.g., Pandas, SQLAlchemy; plus cloud/utility libraries as needed).
  • ETL / Data Integration Tools: IBM DataStage, Informatica IICS, Talend, Nexus EBM, SSIS, dbt, or similar.
  • Cloud familiarity: AWS (Lambda, S3, Glue) or Azure/Google Cloud Platform equivalents.
  • Governance/metadata tools: Collibra; diagramming tools such as draw.io, Lucidchart, or Erwin.
  • Mortgage domain experience.

Responsibilities

  • Develop and optimize complex SQL for profiling, validation, reconciliation, anomaly investigation, and root-cause analysis across Snowflake, Db2, and MongoDB; build reusable query assets and repeatable validation patterns.
  • Data Profiling & Data Quality: Perform profiling for completeness, uniqueness, format conformance, outliers, and referential integrity; document data quality issues and recommend remediation strategies.
  • ETL / Data Integration Reverse Engineering: Analyze and reverse-engineer existing ETL/data integration pipelines (ETL tools, stored procedures, scripts) to reconstruct transformation logic, dependencies, and embedded business rules, especially where documentation is missing.
  • Mapping & Lineage Documentation: Produce detailed source-to-target mappings including column-level lineage, transformation logic, business rules, and handling for incremental loads and SCD Type 1/2 where applicable.
  • Data Modeling: Partner with data engineering and analytics teams to design and refine data models for ODS/EDW and downstream consumption, including dimensional and normalized models.
  • Python Automation: Build scripts and utilities to automate profiling, reconciliation, ETL validation/testing, lineage extraction, file parsing, audit trail generation, and incremental load checks.
  • Snowflake Cortex: Apply Cortex capabilities to accelerate data understanding and quality workflows in alignment with governance and security standards.
  • Governance & Standards: Support enterprise standards (naming, typing, null handling, audit columns), contribute profiling/lineage artifacts to governance processes, and assist with traceability patterns and CDC/audit logging approaches.
  • Collaboration: Partner with data engineering, BI/reporting, governance, and business stakeholders to validate logic, confirm requirements, and support modernization/migration initiatives.
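The profiling and Python-automation duties above can be sketched as a small Pandas utility. This is an illustrative example only; the column name `loan_id` and the regex rule are invented for the sketch, and real profiling would add checks (outliers, referential integrity) per table:

```python
from typing import Optional

import pandas as pd

def profile_column(series: pd.Series, pattern: Optional[str] = None) -> dict:
    """Return basic data-quality metrics for one column:
    completeness (non-null share), uniqueness (distinct share),
    and optional format conformance against a regex."""
    total = len(series)
    non_null = int(series.notna().sum())
    metrics = {
        "completeness": non_null / total if total else 0.0,
        "uniqueness": series.nunique(dropna=True) / total if total else 0.0,
    }
    if pattern is not None:
        # Format conformance: fraction of non-null values fully matching the regex.
        matches = int(series.dropna().astype(str).str.fullmatch(pattern).sum())
        metrics["format_conformance"] = matches / non_null if non_null else 0.0
    return metrics

# Example: one null and one duplicate in a four-row column.
df = pd.DataFrame({"loan_id": ["L001", "L002", "L002", None]})
result = profile_column(df["loan_id"], pattern=r"L\d{3}")
```

Wrapping checks like this in reusable functions is what makes the "repeatable validation patterns" mentioned above practical across Snowflake, Db2, and MongoDB extracts.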

Munesh, Cyber Sphere LLC
