
Senior Databricks Engineer
Posted 22 hours ago

Responsibilities
• Design and build large-scale data platforms using Databricks (Delta Lake, Spark, Unity Catalog) on Azure.
• Build and maintain batch and streaming data pipelines for complex, high-volume data sources.
• Build medallion/lakehouse architectures from the ground up in greenfield environments.
• Design and evolve data models that support analytics, reporting, and downstream applications.
• Integrate Databricks with enterprise systems such as APIs, event streams, warehouses, and ML workflows.
• Optimize Spark jobs and pipelines for performance, reliability, and cost efficiency at scale.
• Support production deployments, including CI/CD pipelines, testing, and release management.
• Partner directly with enterprise clients to translate requirements into working technical solutions.
• Work alongside architects, engineers, and data scientists across various workstreams.
• Balance speed and quality, knowing when to move fast and when to harden a solution.
• Make pragmatic decisions under ambiguity, especially on greenfield projects.
• Contribute hands-on while also guiding design and strategy across the team.
• Clearly communicate tradeoffs to both technical and non-technical stakeholders.
• Work within modern engineering practices such as version control, code reviews, and automated testing.

Qualifications
• Proven ability to mentor and guide data engineers and analysts.
• Deep expertise in Databricks, including experience architecting and implementing end-to-end lakehouse solutions primarily or entirely on Databricks.
• Advanced knowledge of modern Databricks architecture patterns, including declarative pipelines / Delta Live Tables, Unity Catalog, Delta Lake, workflow orchestration, governance, performance tuning, and operational monitoring.
• Familiarity with infrastructure-as-code tools (Terraform, Bicep), environment provisioning, and CI/CD automation (GitHub, Azure DevOps) for Databricks-based platforms.
• Strong learning agility, technical curiosity, and comfort with AI-enabled development workflows or automation tools to enhance delivery and improve quality.
• Knowledge of other contemporary cloud data architectures and tools, including cloud-native data warehouses (Snowflake, BigQuery, Redshift), data lakes, orchestration frameworks (Airflow/Astronomer), transformation tools (dbt), governance platforms, and scalable batch or streaming data processing services (Kafka, Kinesis).

Benefits
• Medical
• Dental
• Vision
• 401k
• Holiday pay
• Vacation
• Personal and family sick leave
• And more.