Intermediate AI Engineer at Enable Data Incorporated
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

12 Mar, 26

Salary

0.0

Posted On

12 Dec, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Azure Databricks, Python, SQL, Azure Cloud, ETL, CI/CD, Data Lake, Blob Storage, Machine Learning, Generative AI, MLOps, Monitoring, Error Handling, Model Governance, Vector Search, Azure Services

Industry

IT Services and IT Consulting

Description
Primary Responsibilities

This role focuses on building production-ready AI applications and deploying them on Azure Databricks and Azure cloud infrastructure. You will work end-to-end: from data ingestion and model integration to scalable deployment, monitoring, and ongoing optimization. The expectation is to convert AI ideas into reliable, governed, and cost-efficient applications that run in production. You will design data and AI pipelines, integrate models (including ML and Generative AI), and deploy them using Databricks workflows and Azure-native services. Success in this role requires strong hands-on experience with Azure Databricks, Python, SQL, and Azure services, along with a clear understanding of how AI systems fail in production and how to prevent those failures. You will collaborate closely with data scientists, platform engineers, and business stakeholders to ensure AI applications are usable, scalable, and maintainable beyond the first release.

Key Responsibilities

- Design and build end-to-end data and AI pipelines using Azure Databricks.
- Develop robust ETL/ELT workflows using Python (PySpark) and SQL (see the first sketch after this section).
- Implement CI/CD pipelines for Databricks deployments (jobs, notebooks, workflows).
- Integrate Databricks with Azure services (Data Lake, Blob Storage, Key Vault, Azure OpenAI, Azure Functions, etc.).
- Optimize jobs for performance, cost, and reliability.
- Build reusable, modular code.
- Collaborate with data scientists and platform teams to move models from experimentation to production.
- Implement logging, monitoring, and error handling for production pipelines.
- Develop and deploy ML and Generative AI models (LLMs, embeddings, RAG pipelines) for NLP, computer vision, and predictive analytics.
- Fine-tune LLMs using LoRA/QLoRA and integrate with Azure OpenAI or Hugging Face models.
- Implement vector search and retrieval pipelines using FAISS or Azure Cognitive Search (see the second sketch after this section).
- Ensure responsible AI practices, including bias detection and model governance.

Good to Have (Strong Advantage)

- Experience with ML and Generative AI workloads on Databricks.
- RAG, embeddings, or inference pipelines.
- Terraform / ARM / Bicep for infrastructure.
- Databricks Asset Bundles.
- Airflow or ADF orchestration.
- Production monitoring and cost optimization experience.
- Knowledge of LangChain or similar frameworks for AI application development.
- Experience with Azure AI services (Azure Machine Learning, Azure Cognitive Services).

Required Skills

- Azure Databricks (jobs, workflows, clusters, Unity Catalog preferred).
- Python (PySpark-heavy, not just pandas).
- SQL (complex joins, window functions, analytical queries).
- Azure Cloud (ADLS Gen2, ADF, Key Vault, IAM concepts).
- Pipeline orchestration and deployment (CI/CD, environment promotion).
- Azure DevOps.
- Strong understanding of the ML lifecycle and MLOps best practices.
- Experience with model deployment using MLflow or similar frameworks.
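To give a sense of the ETL/ELT work described above, here is a minimal PySpark sketch of the pattern: read raw files from a data lake, clean them, and write a Delta table. The storage path, table name, and column names are hypothetical placeholders, not part of the posting.

```python
# Minimal PySpark ETL sketch: raw files in ADLS Gen2 -> cleaned Delta table.
# All paths, table names, and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"  # hypothetical container
orders = spark.read.format("json").load(raw_path)

cleaned = (
    orders
    .dropDuplicates(["order_id"])                         # remove duplicate events
    .withColumn("order_ts", F.to_timestamp("order_ts"))   # normalize the timestamp column
    .filter(F.col("amount") > 0)                          # drop malformed rows
)

# Writing to a governed Delta table gives downstream jobs and models one source of truth.
cleaned.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_clean")
```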
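For the vector search and retrieval bullet, a minimal FAISS sketch might look like the following. The embedding dimension, document vectors, and query are stand-ins; any encoder that returns fixed-size float32 vectors (for example an Azure OpenAI embedding model) would fit the same pattern.

```python
# Minimal FAISS retrieval sketch for a RAG-style pipeline.
# Vectors here are random stand-ins for real document and query embeddings.
import numpy as np
import faiss

dim = 384                                                   # embedding dimension (assumption)
doc_vectors = np.random.rand(1000, dim).astype("float32")   # stand-in for document embeddings

index = faiss.IndexFlatL2(dim)   # exact L2 search; IVF/HNSW indexes scale better for large corpora
index.add(doc_vectors)

query = np.random.rand(1, dim).astype("float32")            # stand-in for an embedded user query
distances, ids = index.search(query, 5)                     # top-5 nearest documents
print(ids[0])                                               # indices of the retrieved documents
```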
Responsibilities
The role involves designing and building end-to-end data and AI pipelines using Azure Databricks, as well as developing robust ETL/ELT workflows. You will also integrate models and deploy them using Databricks workflows and Azure-native services.
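Since the required skills call out model deployment with MLflow, here is a minimal sketch of tracking a run and registering a model so it can be promoted through environments. The experiment path, model name, and toy training data are hypothetical; they only illustrate the shape of the workflow.

```python
# Minimal MLflow sketch: log a training run and register the resulting model.
# Experiment path, model name, and data are hypothetical placeholders.
import mlflow
import mlflow.sklearn
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=500, n_features=10, random_state=42)

mlflow.set_experiment("/Shared/churn-demo")      # hypothetical Databricks experiment path
with mlflow.start_run():
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # Registering under a model name lets CI/CD promote versions between environments.
    mlflow.sklearn.log_model(model, "model", registered_model_name="churn_classifier")
```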