Data Engineer (Core SQL, VaultSpeed) at Unison Group
Hyderabad, Telangana, India
Full Time


Start Date

Immediate

Expiry Date

15 Dec, 25

Salary

0.0

Posted On

16 Sep, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Core SQL, VaultSpeed, Data Vault 2.0, Data Modeling, ETL, ELT, Data Governance, Metadata Management, Data Quality, Performance Tuning, Data Warehousing, Batch Processing, Real-Time Processing, Version Control, CI/CD, Orchestration Frameworks

Industry

Business Consulting and Services

Description
Job Summary

We are looking for a highly skilled Data Engineer with strong expertise in Core SQL and hands-on experience with VaultSpeed to join our growing data engineering team. You will be responsible for designing, developing, and maintaining scalable data pipelines and data warehouse solutions based on the Data Vault 2.0 methodology. Your work will directly support enterprise data integration, governance, and analytics initiatives.

Key Responsibilities

- Design and implement data models and pipelines using VaultSpeed to automate Data Vault architecture.
- Write and optimize complex SQL queries for data extraction, transformation, and loading (ETL/ELT).
- Work closely with data architects, analysts, and business stakeholders to ensure accurate and reliable data solutions.
- Automate data ingestion from multiple source systems (batch and real-time).
- Maintain and optimize data warehouse solutions using modern best practices.
- Monitor, troubleshoot, and enhance data processes for reliability and performance.
- Contribute to data governance and metadata management efforts.
- Ensure data quality, lineage, and security throughout the data lifecycle.

Requirements

- Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field.
- 4+ years of professional experience in Data Engineering or a similar role.
- Strong expertise in Core SQL (joins, CTEs, window functions, performance tuning, etc.).
- Hands-on experience with VaultSpeed and an understanding of Data Vault 2.0 concepts.
- Proficiency with modern data warehousing platforms (e.g., Snowflake, Azure Synapse, BigQuery, Redshift).
- Familiarity with ETL/ELT tools and orchestration frameworks (e.g., dbt, Airflow, Talend).
- Understanding of data modeling principles (3NF, Star Schema, Data Vault).
- Experience with version control systems (e.g., Git) and CI/CD pipelines.