Remotery

Senior Data Engineer

at Bridgeway Benefit Technologies · United States · Full-time · Data Engineer · Senior

Posted 22 hours ago

📋 Description

• Design, build, and maintain a scalable lakehouse architecture, featuring a medallion (bronze/silver/gold) data model tailored for analytics and AI/ML workloads.

• Design, implement, and manage ELT pipelines, including workflow orchestration, scheduling, and monitoring, to ensure reliable and scalable operations.

• Implement data quality, testing, and observability protocols, while actively monitoring and addressing data and automation challenges to uphold platform reliability and confidence.

• Ensure data security and compliance, applying role-based access controls, encryption, masking, and governance best practices for the compliant management of sensitive data.

• Enhance the performance of data workflows and storage for cost-effectiveness and speed.

• Collaborate with engineers, analysts, and stakeholders to address data requirements; balance cost, performance, simplicity, and time-to-value while mentoring teams and documenting standards.

• Offer technical leadership and mentorship to team members—promoting best practices, skill enhancement, and cross-functional collaboration.

• Facilitate AI/ML applications through well-structured data models, feature availability, and platform integrations utilizing tools like Databricks Vector Search and Model Serving.

• Create and maintain data pipelines using version control and CI/CD best practices in a collaborative engineering environment.

• Work within an Agile-Scrum framework and produce extensive technical design documentation to ensure efficient and successful delivery.

• Act as a reliable expert on organizational data domains, processes, and best practices.


⛳️ Requirements

• Over 5 years of practical data engineering experience is essential.

• More than 3 years of experience in building and managing data pipelines on a contemporary lakehouse platform (e.g., Databricks – Unity Catalog, Delta Live Tables, Asset Bundles), including data modeling, governance, and CI/CD deployment methodologies.

• At least 3 years of experience with analytical SQL (ANSI SQL/T-SQL/Spark SQL) and Python for data engineering, covering pipeline construction, transformation logic, and automation is required.

• Excellent communication skills with the capacity to collaborate and influence across engineering, analytics, and business stakeholders are necessary.

• Familiarity with streaming and ingestion tools such as Kafka, Kinesis, Event Hubs, Debezium, or Fivetran is preferred.

• Knowledge of DAX, LookML, or dbt; Airflow, Dagster, or Prefect; Terraform; Azure DevOps; Power BI, Looker, or Tableau; and GitHub Copilot is a plus.

• A Bachelor’s degree in Computer Science, Information Technology, or a related field is required, with a Master’s degree preferred.


🏝️ Benefits

• Preference will be given to candidates located on the East Coast.

