Data Engineer Solution Architect | 2026HP02004/#8ut45PH2 at Mindverse Consulting Services Limited
Gujarat, India
Full Time


Start Date

Immediate

Expiry Date

27 May, 2026

Salary

0.0

Posted On

26 Feb, 2026

Experience

10 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Snowflake, Databricks, dbt, Matillion, Python, Airflow, Control-M, AWS, Azure, GCP, Data Warehousing, Data Lake, ETL, Data Modeling, PySpark, Power BI

Industry

IT Services and IT Consulting

Description
Job Summary

We are looking for an experienced Data Engineering Solutions Architect to join our growing Data Practice. The ideal candidate will have 8–12 years of hands-on experience designing, architecting, and delivering large-scale data warehousing, data lake, ETL, and reporting solutions across modern and traditional data platforms. You will play a key role in defining scalable, secure, and cost-effective architectures that enable advanced analytics and AI-driven insights for our clients. This role demands a balance of technical depth, solution leadership, and a consulting mindset: helping customers solve complex data engineering challenges while also building internal capability and best practices within the organization.

Job Responsibilities

· Design and architect end-to-end data solutions using technologies such as Snowflake, Databricks, dbt, Matillion, Python, Airflow, Control-M, and cloud-native services on AWS/Azure/GCP.
· Define and implement data ingestion, transformation, integration, and orchestration frameworks for structured and semi-structured data.
· Architect data lakes and data warehouses with an emphasis on scalability, cost optimization, performance, and governance.
· Support real-time and API-based data integration scenarios; design solutions for streaming, micro-batch, and event-driven ingestion.
· Lead the design and delivery of data visualization and reporting solutions using tools such as Power BI, Tableau, and Streamlit.
· Collaborate with business and technical stakeholders to define requirements, design architecture blueprints, and ensure alignment with business objectives.
· Establish and enforce engineering standards, frameworks, and reusable assets to improve delivery efficiency and solution quality.
· Mentor data engineers and help build internal capability on emerging technologies.
· Provide thought leadership on modern data platforms, AI/ML integration, and data modernization strategies.
Essential Skills

· 8–12 years of experience in data engineering and architecture, including hands-on solution delivery.
· Deep expertise with Snowflake or Databricks, with strong working knowledge of tools like dbt, Matillion, SQL, and Python or PySpark.
· Experience designing and implementing data pipelines and orchestration using tools like Airflow, Control-M, or equivalent.
· Familiarity with cloud-native data engineering services (such as AWS Glue, Redshift, Athena, GCP BigQuery, Dataflow, Pub/Sub) or similar.
· Strong understanding of data modeling, ELT/ETL design, and modern architecture frameworks (medallion, layered, or modular architectures).
· Experience integrating and troubleshooting APIs and real-time data ingestion technologies (Kafka, Kinesis, Pub/Sub, REST APIs).
· Familiarity with traditional ETL and data integration tools (Informatica, SSIS, Oracle Data Integrator, etc.).
· Excellent understanding of data governance, performance tuning, and DevOps for data (CI/CD, version control, monitoring).
· Strong communication, problem-solving, and stakeholder management skills.

Preferred Qualifications

· Certifications such as Snowflake SnowPro, Databricks Certified Architect, AWS Data Analytics Specialty, or Google Professional Data Engineer.
· Prior consulting or client-facing experience.
· Exposure to AI/ML, data quality, or metadata management frameworks.
· Experience leading solution design across multi-cloud or hybrid environments.

Background Check Required

· No criminal record

Other Details

· This is a 5-days work-from-office role.
· Office location: Ambawadi, Ahmedabad
· Interview rounds: 3–4 rounds of interviews
Responsibilities
The role involves designing and architecting end-to-end data solutions using modern technologies like Snowflake, Databricks, and cloud services, while defining frameworks for data ingestion, transformation, and orchestration. Responsibilities also include leading the design of reporting solutions, collaborating with stakeholders, establishing engineering standards, and mentoring data engineers.