
Lead Engineer – Data Streaming
Posted May 6

• Create and deploy scalable data platform solutions, encompassing data streaming, Change Data Capture (CDC), data warehouses, data lakes, and ETL/ELT pipelines.
• Contribute to and uphold platform standards, best practices, and architectural patterns for cloud-based data technologies.
• Develop and maintain comprehensive data integration pipelines from source systems via Kafka/Striim (or similar technologies) into analytical and operational platforms.
• Facilitate technical design discussions and participate in architecture reviews for intricate data solutions.
• Ensure the reliability and performance of data platforms through observability, monitoring, and proactive issue resolution.
• Serve as a senior escalation point for troubleshooting production issues and enhancing system performance.
• Collaborate with platform engineering, security, and data governance teams to ensure implementations adhere to enterprise standards.
• Advocate for an automation-first approach by establishing CI/CD pipelines, infrastructure as code, and testing frameworks.
• Enhance operational resilience through effective error handling, replay capabilities, and validation strategies.
• Assess and incorporate new tools and technologies to continually advance the data platform ecosystem.
• Mentor and support engineers, promoting strong engineering practices and technical development within the team.
• Bachelor's degree in Computer Science or equivalent experience.
• Over 8 years of software or data engineering experience demonstrating technical leadership.
• Extensive hands-on experience in building and managing large-scale data platforms in production settings.
• Proficient in designing and maintaining CI/CD pipelines for data and platform engineering workflows.
• Familiar with databases and data platforms such as Oracle, PostgreSQL, Redshift, Snowflake, and Kafka (or similar technologies).
• Knowledge of containerization and infrastructure tools such as Docker, Kubernetes, Terraform, and Ansible.
• Strong programming and querying expertise (e.g., Python, SQL).
• Comprehensive understanding of distributed systems, cloud architectures, and data engineering design patterns.
• Excellent problem-solving, communication, and collaboration skills.
• Flexible work environment.
• Internal mobility opportunities.
• Focus on purpose and well-being.
• Emphasis on work-life balance.
• Commitment to an inclusive environment.
• Opportunities for volunteering.
Smartsheet