Senior Databricks Solution Architect

Job Details

Working Arrangement: On-site

Location: Santa Clara, California

Salary: $90/hr - $130/hr

Senior Databricks Solution Architect (Lakehouse Platform | AWS/Azure)
Location: Santa Clara, CA (On-site 5 days a week)
Contract: 9–12 months

The Opportunity

We’re supporting a team building and evolving a modern data platform on Databricks, with a focus on scalability, governance, and production-grade architecture.

This role is not pipeline development; it is end-to-end platform design.

They need someone who can own the architecture of the Databricks environment, define best practices, and ensure the platform is built correctly from the ground up.

What You’ll Be Doing

  • Lead the end-to-end architecture of a Databricks Lakehouse platform, including data ingestion, transformation, storage, and serving layers
  • Define and implement best practices across Medallion architecture (Bronze/Silver/Gold) and scalable data design patterns
  • Design and enforce data governance frameworks using Unity Catalog, RBAC, and data lineage strategies
  • Architect data pipelines and workflows using Delta Live Tables, Databricks Jobs, and orchestration tools
  • Optimize platform performance through cluster configuration, workload management, and cost optimization strategies
  • Establish data quality, observability, and monitoring frameworks across the platform
  • Collaborate with engineering and business teams to translate requirements into scalable, production-ready data solutions
  • Provide technical leadership and guidance to engineers, ensuring adherence to architectural standards

What They’re Looking For

Must-Have:

  • Proven experience as a Databricks Solution Architect (not just engineer-level usage)
  • Deep expertise in Databricks Lakehouse architecture, including Delta Lake, Unity Catalog, Delta Live Tables, and Medallion architecture
  • Strong experience designing end-to-end data platforms in Databricks (not just pipelines)
  • Hands-on expertise with Spark (PySpark / Spark SQL)
  • Experience with cloud platforms (AWS and/or Azure) in a data platform context
  • Strong understanding of data modeling, governance, and scalable architecture patterns

Nice-to-Have:

  • Experience with multi-tenant or platform-style data environments
  • Exposure to real-time or streaming architectures (Kafka, Structured Streaming)
  • Experience integrating Databricks with Snowflake or other downstream systems
  • Background in R&D, manufacturing, or high-scale enterprise environments

What Makes This Role Different

  • True architecture ownership, not just pipeline development
  • Opportunity to shape a modern Databricks platform from a design perspective
  • High visibility with leadership and direct impact on platform strategy
  • Blend of technical depth and architectural leadership