AWS Data Engineer

Salary: £50K per annum

Location: London

Experience: 5+ Years

Key Responsibilities
Design, develop, and maintain robust ETL/ELT data pipelines using Python and PySpark (a minimal sketch follows this list)
Build and optimize data models and schemas for analytics and reporting
Work extensively with Amazon Redshift for data warehousing, performance tuning, and query optimization
Develop complex SQL queries for data transformation and analysis
Integrate data from multiple sources (batch and streaming) into the data warehouse
Ensure data quality, reliability, and scalability of data pipelines
Collaborate with analytics, product, and engineering teams to understand data requirements
Implement monitoring, logging, and error-handling mechanisms for pipelines
Follow best practices for data security, governance, and compliance in AWS
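
As a minimal sketch of the pipeline work described in the first responsibility above, the following PySpark job reads raw events from S3, cleans them, and writes a curated table. All bucket paths, table names, and columns are hypothetical placeholders, not details of this role's actual stack:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Illustrative batch ETL: raw JSON in S3 -> cleaned, partitioned Parquet.
    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Extract: read raw order events (path is a placeholder).
    raw = spark.read.json("s3://example-bucket/raw/orders/")

    # Transform: drop malformed rows, normalise types, de-duplicate.
    orders = (
        raw.dropna(subset=["order_id", "order_ts"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
           .dropDuplicates(["order_id"])
    )

    # Load: write partitioned Parquet for downstream use
    # (eg, a Redshift COPY or a Glue-catalogued table).
    (orders.write
           .mode("overwrite")
           .partitionBy("order_date")
           .parquet("s3://example-bucket/curated/orders/"))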

Required Skills & Experience
Strong experience in Python for data engineering and pipeline development
Hands-on experience with PySpark/Apache Spark
Advanced proficiency in SQL (query optimization, window functions, CTEs; an example follows this list)
Experience with AWS data services, including:
  - Amazon Redshift & RDS
  - AWS Glue, Lambda & Airflow
  - Amazon S3, EventBridge & Kinesis
Solid understanding of data warehousing concepts (star/snowflake schemas, fact/dimension tables)
Experience handling large-scale datasets and distributed processing
Familiarity with version control systems (Git)
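
As a sketch of the SQL proficiency listed above (CTEs and window functions), the following Spark SQL query keeps each customer's most recent order. It assumes a catalogued table named curated.orders; all table and column names are illustrative, not part of this role's actual schema:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("sql-example").getOrCreate()

    # The CTE ranks each customer's orders by recency; the outer query
    # keeps only the most recent one (rn = 1). Names are hypothetical.
    latest_orders = spark.sql("""
        WITH ranked AS (
            SELECT customer_id,
                   order_id,
                   amount,
                   ROW_NUMBER() OVER (
                       PARTITION BY customer_id
                       ORDER BY order_ts DESC
                   ) AS rn
            FROM curated.orders
        )
        SELECT customer_id, order_id, amount
        FROM ranked
        WHERE rn = 1
    """)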

Nice to Have
Experience with Airflow or other workflow orchestration tools
Knowledge of CI/CD pipelines for data platforms
Experience with real-time or streaming data (Kafka, Kinesis)
Exposure to data visualization or BI tools
AWS certifications (eg, AWS Certified Data Engineer or Solutions Architect)

Job Details

Company: Saki Soft Limited
Location: London, United Kingdom
Employment Type: Permanent
Salary: GBP 50,000 per annum