Data Engineering & Analytics Architect at NTT DATA
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

27 Feb, 26

Salary

Not disclosed

Posted On

29 Nov, 25

Experience

10 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Architecture, ETL, ELT, Informatica, Spark, AWS Glue, Azure Data Factory, Databricks, Power BI, Tableau, SSRS, SSAS, Data Vault Modeling, Streaming, Batch Orchestration, Security, Governance

Industry

IT Services and IT Consulting

Description
- Define data-first architecture, standards, and blueprints across ingestion, cleansing, enrichment, modeling, and serving layers.
- Lead the design of batch and streaming pipelines (e.g., Informatica, Spark, AWS Glue, Azure Data Factory/Databricks); choose fit-for-purpose storage and compute.
- Establish semantic/serving models and integration patterns for Power BI, Tableau, SSRS, and SSAS (tabular models, datasets, data marts).
- Set data quality and validation frameworks aligned to CLIENT benchmarks; champion lineage, metadata, and observability.
- Enforce security, privacy, and compliance (PII handling per CLIENT data classification; encryption, RBAC, secrets management).
- Drive audit readiness (controls, documentation, evidence) and lead internal security reviews.
- Partner with data governance and platform teams; mentor developers, review designs/PRs, and guide cost/performance optimization.
- Handle PII/customer data per CLIENT classification policies; projects are subject to internal audits and security reviews.
- Although data projects make up roughly 10-15% of the footprint, they are high-complexity and multi-sprint; close collaboration with data governance and platform teams is critical.

Requirements

- 10+ years in data/analytics platforms, including 5+ years in data architecture.
- Expertise in ETL/ELT with Informatica and/or Spark/PySpark, plus native cloud data services (AWS Glue, Azure ADF/Databricks).
- Proven integration with Power BI, Tableau, SSRS, and SSAS; strong dimensional and Data Vault modeling.
- Hands-on experience with streaming (Kafka/Kinesis/Event Hubs) and batch orchestration (Airflow/ADF/Glue Workflows).
- Deep knowledge of security, governance, and audit practices; excellent stakeholder leadership.
Responsibilities
Define data-first architecture and lead the design of batch and streaming pipelines. Establish data quality frameworks and enforce security, privacy, and compliance standards.