Data Engineer (SC cleared)
Guildford, Surrey, United Kingdom
Hybrid / WFH Options
Stott and May
Responsibilities:
- Build robust ETL/ELT data pipelines using Apache Airflow
- Build ingestion processes from internal systems and APIs using Kafka, Spark, and AWS
- Develop and maintain data lakes and warehouses (AWS S3, Redshift)
- Ensure governance using automated testing tools
- Collaborate with DevOps to manage CI/CD pipelines for data deployments and ensure version control of DAGs
- Apply best practice in … security and compliance

Required Tech Skills:
- Python and SQL for data processing
- Apache Airflow: writing Airflow DAGs and configuring Airflow jobs
- AWS cloud platform and services such as S3 and Redshift
- Familiarity with big data processing using Apache Spark
- Knowledge of data modelling, schema design, and partitioning strategies
- Understanding of batch vs streaming data paradigms
- Docker or Kubernetes (containerisation)
Employment Type: Permanent
Salary: GBP Annual
Posted: