
Senior Engineer – Data
Posted 1 day ago

Responsibilities:
• Define, design, and develop scalable and resilient distributed systems.
• Leverage Python and SQL, along with NoSQL databases, Apache Spark for data processing, dbt for data transformation, container orchestration technologies such as Docker and Kubernetes, and a variety of Azure tools and services.
• Employ your technical skills to influence product definitions and pursue optimal solutions.
• Collaborate cross-functionally throughout the entire development lifecycle.
• Facilitate design discussions and code reviews with colleagues to enhance the quality of engineering across the organization.
• Create, define, and maintain reusable data components and patterns that meet both business and technical needs.
• Develop a premier analytics platform to fulfill reporting requirements.
• Provide mentorship to fellow engineers.
• Regularly share best practices and enhance processes within and across teams.
Qualifications:
• Extensive programming experience in Python and SQL, plus proficiency with big data and platform technologies including dbt, Spark, Kafka, Git, and containerization (Docker and Kubernetes).
• Familiarity with Apache Iceberg for managing large-scale tabular data in data lakes is an advantage.
• Experience with orchestration tools like Apache Airflow or similar technologies for automating and managing complex data pipelines.
• Proficiency in business intelligence tools, preferably Power BI or Superset.
• Demonstrated understanding of microservices architecture, REST APIs, and GraphQL.
• Experience in architecting and designing both new and existing systems.
• Advanced knowledge of DevOps principles, including the Azure DevOps framework and tools.
• Experience with CI/CD processes to support seamless integration and deployment of data solutions.
• Advanced skills in PowerShell scripting.
• In-depth understanding of monitoring concepts and tools.
• Strong comprehension of security protocols and products.
• Extensive knowledge of computer science data structures and algorithms.
• Familiarity with developer tools throughout the data development lifecycle, including task management, source code control, building, deployment, operations, and real-time communication.
• Strong analytical and problem-solving skills.
• Capacity to thrive in a fast-paced environment.
• A minimum of 4 years of professional experience in data engineering, programming, and big data technologies.
• At least 3 years of experience in architecture and design.
• A minimum of 3 years of experience with AWS, GCP, Azure, or another cloud service.
• At least 2 years of experience with big data tools such as Spark and Databricks.
Benefits:
• A comprehensive Total Rewards program offering personalized benefits tailored to your and your family's overall well-being.
• Financial perks including competitive market compensation; a 401K savings plan with immediate vesting and a 6% match; performance-based incentives; and tuition assistance.
• Access to additional benefits such as mental health support as well as fertility and adoption assistance.
• Flexibility support - We offer workplace flexibility along with our GEICO Flex program, allowing the option to work from anywhere in the US for up to four weeks each year.