
Data Engineer
Posted 1 day ago

Responsibilities
• Design, develop, and maintain robust data pipelines using ETL and ELT methodologies.
• Implement and oversee real-time data streaming solutions using Kafka, Debezium, and Kafka Connect.
• Build, schedule, and maintain custom workflows with Apache Airflow.
• Work with various database technologies, including MySQL, PostgreSQL, MongoDB, and analytical/big data systems.
• Use Terraform, Kubernetes, and Helm for efficient infrastructure management.
• Create and maintain continuous integration and deployment (CI/CD) pipelines.
• Perform unit and integration testing to ensure high code quality, data integrity, and system reliability.
• Collaborate with cross-functional teams to comprehend data requirements and deliver tailored solutions.
• Maintain clear and thorough documentation of data processes, workflows, and systems.
• Monitor system performance and resolve issues in production environments.
Requirements
• Bachelor’s degree in Computer Science, Engineering, or a related discipline.
• Strong proficiency in Python (essential).
• Knowledge of JavaScript and Scala is a plus.
• At least 3 years of experience in data engineering positions.
• A minimum of 2 years of experience in software and/or application development roles.
• Practical experience with Kafka, Debezium, and Kafka Connect.
• Proficient in a data pipeline orchestration tool such as Apache Airflow (preferred).
• Strong understanding and hands-on experience with MySQL, PostgreSQL, MongoDB, and Redshift.
• Experience with Terraform, Kubernetes, and Helm.
• Solid grasp of cloud computing concepts.
• Ability to compose complex SQL queries.
• Familiarity with unit and integration testing practices.
• Experience in establishing and maintaining CI/CD pipelines.
• Exposure to self-service reporting tools like Tableau, Looker, and DOMO.
Benefits
• Competitive salary
• Flexible working hours
• Professional development budget