
Senior Data Engineer – AWS
Posted 18 hours ago

Responsibilities:
• Create reusable, metadata-driven data pipelines with a strong emphasis on database architecture and SQL optimization.
• Engage extensively with AWS-based datasets and infrastructure, including Redshift, RDS, S3, Glue, Athena, and DMS.
• Assist in comprehensive ETL development and enhancement using advanced SQL techniques and query performance tuning.
• Automate and refine data platform processes by leveraging the AWS data services ecosystem.
• Establish robust integrations with data sources and consumers utilizing AWS native solutions.
• Proactively address performance and data quality challenges through SQL optimization and database tuning.
• Utilize DBT for data transformation modeling and Jenkins/GitHub Actions for CI/CD automation.
• Contribute to platform documentation and runbooks with a focus on database best practices.
• Suggest and implement enhancements to the data platform architecture using AWS data services.
Requirements:
• Strong database expertise: advanced SQL capabilities, extensive data modeling experience, and proven query optimization skills.
• Proficiency in AWS data services: practical experience with Redshift, RDS, S3, Glue, Athena, and DMS.
• ETL development: solid background in building and optimizing ETL workflows using database-centric methodologies.
• Programming skills: proficiency in Python/PySpark and advanced SQL for developing data pipelines.
• Spark expertise: expert-level capabilities in constructing data pipelines using Databricks on AWS or EMR/EMR Serverless.
• Experience in large-scale data processing: substantial background in managing complex datasets and optimizing database performance.
• Library development: adept at creating reusable data transformation modules as Python packages.
• CI/CD knowledge: familiarity with GitHub Actions, Jenkins, and AWS CodePipeline for automated deployments.
• Understanding of AWS networking and security: knowledge of VPC, subnets, security groups, and IAM.
• Agile methodology: experience with Scrum development practices.
Nice to have:
• DBT familiarity: experience with DBT for data transformation and modeling.
• Jenkins/GitHub Actions: advanced automation capabilities extending beyond basic CI/CD.
• Experience with Infrastructure as Code (Terraform, AWS CloudFormation).
• Knowledge of the AWS-native data services ecosystem (Glue, Lambda, S3, Redshift, RDS, Athena).
Benefits:
• Flexible employment opportunities and remote work arrangements.
• Involvement in international projects with prominent global clients.
• Opportunities for international business travel.
• A non-corporate work environment.
• Language classes provided.
• Internal and external training opportunities.
• Access to private healthcare and insurance.
• Multisport card available.
• Initiatives focused on well-being.