Databricks Architect at Scicom Infrastructure Services
Atlanta, Georgia, USA
Full Time


Start Date

Immediate

Expiry Date

05 Dec, 25

Salary

0.0

Posted On

06 Sep, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Security, SQL, Python, Scala, Spark

Industry

Information Technology/IT

Description

Seeking a Databricks Architect to lead the design and delivery of large-scale data platforms using Databricks.

Key Responsibilities

  • Architect and implement Databricks lakehouse solutions (Delta Lake, Spark, Unity Catalog, MLflow).
  • Design and optimize ETL/ELT pipelines, data ingestion, and real-time streaming frameworks (a brief sketch follows this list).
  • Integrate Databricks with cloud services (AWS, Azure, GCP) and enterprise data systems.
  • Establish data governance, security, and compliance frameworks.
  • Lead migration efforts from legacy platforms to Databricks.
  • Collaborate with business stakeholders, data engineers, and data scientists to deliver end-to-end solutions.
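
For illustration only, a minimal PySpark sketch of the kind of incremental Delta Lake ingestion this role would design; the path, table, and key names (raw_events_path, main.sales.events, event_id) are hypothetical and not taken from this posting.

    # Minimal, illustrative PySpark job: incremental load of raw JSON events
    # into a Delta Lake table. All names below are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

    raw_events_path = "/mnt/raw/events/"   # hypothetical landing zone
    target_table = "main.sales.events"     # hypothetical Unity Catalog table

    # Read the newest raw batch and stamp it with an ingestion time.
    incoming = (
        spark.read.json(raw_events_path)
             .withColumn("ingested_at", F.current_timestamp())
    )

    if spark.catalog.tableExists(target_table):
        # Upsert into the existing Delta table, keyed on a hypothetical event_id column.
        (DeltaTable.forName(spark, target_table).alias("t")
             .merge(incoming.alias("s"), "t.event_id = s.event_id")
             .whenMatchedUpdateAll()
             .whenNotMatchedInsertAll()
             .execute())
    else:
        # First run: create the managed Delta table.
        incoming.write.format("delta").saveAsTable(target_table)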

Required Skills

  • Proven hands-on experience as a Databricks Architect / Senior Data Engineer.
  • Strong experience with data pipelines, including Cribl, Confluent, and other high-capacity data interchange platforms.
  • Strong expertise in Databricks ecosystem (Spark, Delta Lake, SQL Analytics, MLflow).
  • Cloud experience (Azure Databricks preferred, AWS/GCP acceptable).
  • Strong coding background in Python, PySpark, SQL, Scala.
  • Knowledge of data security, compliance, and governance standards (see the sketch after this list).
  • U.S. work authorization; must be based onshore.
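
Likewise, a hedged sketch of the governance side, assuming a Unity Catalog setup; the catalog, schema, and group names (main, raw, curated, data-engineers, analysts) are hypothetical.

    # Illustrative Unity Catalog grants; catalog, schema, and group names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Engineers own the raw layer; analysts get read-only access to curated data.
    spark.sql("GRANT USE CATALOG ON CATALOG main TO `data-engineers`")
    spark.sql("GRANT ALL PRIVILEGES ON SCHEMA main.raw TO `data-engineers`")
    spark.sql("GRANT USE CATALOG ON CATALOG main TO `analysts`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA main.curated TO `analysts`")
    spark.sql("GRANT SELECT ON SCHEMA main.curated TO `analysts`")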

Preferred

  • Databricks certifications.
  • Experience with BI tools (Power BI, Tableau) and automation/DevOps (Terraform, GitHub Actions, Jenkins).
  • Prior consulting/enterprise implementation experience.

How To Apply:

In case you would like to apply to this job directly from the source, please click here.
