
Director, Data Engineering – Automation
Posted May 2

• Oversee a team of data engineers tasked with transforming data from various systems to provide insights and analytics for business stakeholders.
• Develop technical roadmaps and propose strategies for data pipeline development and integration.
• Utilize cloud-based infrastructure to design scalable, resilient, and efficient data engineering solutions.
• Work collaboratively with data analysts, data scientists, database administrators, cross-functional teams, and business stakeholders to address challenges.
• Shape architectural decisions and design patterns throughout the data platform.
• Offer technical leadership throughout the software development lifecycle, from design to deployment, including hands-on involvement.
• Create project plans, manage priorities and timelines, allocate resources, and take ownership of assigned technical projects in a dynamic environment.
• Conduct code reviews and ensure that data engineers adhere to best-practice coding standards.
• Define and validate test cases to ensure data quality, reliability, and confidence in downstream analytics.
• Continuously enhance the quality, efficiency, and scalability of data pipelines, minimizing gaps and inconsistencies.
• Bachelor of Science in Computer Science or a related field.
• 7+ years of professional experience post-degree.
• 4+ years of experience in building and maintaining ETL pipelines within a data warehouse environment.
• 5+ years of experience in Python development.
• Proven experience in hiring and leading a team of 3+ data engineers, including supervision, goal-setting, and fostering professional development.
• Excellent communication and interpersonal skills to initiate and lead projects.
• Familiarity with AWS services such as Kinesis, Firehose, Aurora, Redshift (including Spectrum and UNLOAD), Elastic MapReduce (EMR), SageMaker, and Lambda.
• Experience provisioning data sets for BI tools such as Tableau or QuickSight, along with familiarity with analysis tools such as R, Plotly, and Python pandas.
• Advanced SQL skills (including performance tuning, indexing, and materialized views) and proficiency in designing and querying NoSQL databases to optimize big-data storage and retrieval.
• Experience with API integrations with external vendors for data exchange and familiarity with data orchestration pipelines using Argo or Airflow.
• Medical, dental, vision, and life insurance.
• Retirement savings – 401(k) plan with generous company matching contributions (up to 6%), financial advisory services, potential company discretionary contributions, and a wide investment portfolio.
• Tuition reimbursement up to $5,250 per year.
• Business-casual work environment that allows for jeans.
• Generous paid time off upon hire – including a paid time off program, ten paid company holidays, and three floating holidays each calendar year.
• Paid volunteer time — 16 hours per calendar year.
• Leave of absence programs – including paid parental leave, paid short- and long-term disability, and Family and Medical Leave (FMLA).
• Business Resource Groups (BRGs) – BRGs promote inclusion and collaboration within our business and the communities we serve. Participation in BRGs is open to all.