robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? • Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch • Proficiency in cloud-native technologies such as containerization and Kubernetes • Strong …
and Snowflake Marketplace. Familiarity with Snowpark for Python/Java-based transformations. Understanding of role-based access control, data masking, and time travel features. Databricks: Hands-on experience with Apache Spark and Databricks Runtime. Proficiency in Delta Lake for ACID-compliant data lakes. Experience with Structured Streaming and Auto Loader. Familiarity with MLflow, Feature Store, and Model Registry. Use …
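To make the Delta Lake items above concrete, here is a minimal, hedged PySpark sketch of an ACID write followed by a time-travel read. The table path and columns are invented for illustration, and it assumes the delta-spark package is installed and on the Spark classpath.

```python
# Minimal sketch (assumes delta-spark is installed): ACID writes to a Delta
# table, then a time-travel read of an earlier version. Paths are illustrative.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("delta-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.write.format("delta").mode("overwrite").save("/tmp/events")  # version 0
df.write.format("delta").mode("append").save("/tmp/events")     # version 1

# Time travel: read the table as it was at version 0.
v0 = spark.read.format("delta").option("versionAsOf", 0).load("/tmp/events")
v0.show()
```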
production issues. Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes, and … Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing … large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills: Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge: Exposure …
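For the workflow-orchestration item above, a minimal, hedged Airflow sketch (assuming Airflow 2.4+ and its TaskFlow API) might look like the following; the DAG name and task bodies are placeholders, not a real pipeline.

```python
# Minimal Airflow 2.x TaskFlow sketch; task logic is a placeholder.
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_pipeline():
    @task
    def extract():
        return [1, 2, 3]

    @task
    def transform(rows):
        return sum(rows)

    transform(extract())  # declares the extract -> transform dependency

example_pipeline()  # instantiating at module level registers the DAG
```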
Snowflake, Databricks) Strong DevOps mindset with experience in CI/CD pipelines, monitoring, and observability tools (Grafana or equivalent). Exposure to analytics, reporting, and BI tools such as Apache Superset, Lightdash, or OpenSearch. Willingness to work across the stack by contributing to API development and, at times, UI components (Vue.js, Zoho, or similar). Excellent communication and collaboration …
London, South East, England, United Kingdom Hybrid / WFH Options
Involved Solutions
customer data. Continuously improve existing systems, introducing new technologies and methodologies that enhance efficiency, scalability, and cost optimisation. Essential Skills for the Senior Data Engineer: Proficient with Databricks and Apache Spark, including performance tuning and advanced concepts such as Delta Lake and streaming • Strong programming skills in Python with experience in software engineering principles, version control, unit testing and …
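As an illustration of the unit-testing expectation above, a pytest sketch for a small PySpark transformation could look like this; the transformation and the values are invented for the example.

```python
# Sketch: unit-testing a toy PySpark transformation with pytest.
import pytest
from pyspark.sql import SparkSession, functions as F

def add_total(df):
    """Transformation under test: total = price * quantity (illustrative)."""
    return df.withColumn("total", F.col("price") * F.col("quantity"))

@pytest.fixture(scope="session")
def spark():
    # Local single-threaded session is enough for fast unit tests.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

def test_add_total(spark):
    df = spark.createDataFrame([(2.0, 3)], ["price", "quantity"])
    assert add_total(df).collect()[0]["total"] == 6.0
```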
City of London, London, United Kingdom Hybrid / WFH Options
Tata Consultancy Services
with AWS Cloud-native data platforms, including: AWS Glue, Lambda, Step Functions, Athena, Redshift, S3, CloudWatch • AWS SDKs, Boto3, and serverless architecture patterns • Strong programming skills in Python and Apache Spark • Proven experience in Snowflake data engineering, including: Snowflake SQL, Snowpipe, Streams & Tasks, and performance optimization • Integration with AWS services and orchestration tools • Expertise in data integration patterns, ETL …
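A hedged sketch of the AWS SDK/Boto3 skills listed above: starting an AWS Glue job run and polling its state. The job name and region are hypothetical; credentials are assumed to come from the standard AWS configuration.

```python
# Sketch: trigger an AWS Glue job with Boto3 and wait for it to finish.
import time
import boto3

glue = boto3.client("glue", region_name="eu-west-2")  # region is a placeholder

run = glue.start_job_run(JobName="nightly-etl")  # hypothetical job name
run_id = run["JobRunId"]

# Poll until the run leaves its transient states.
while True:
    job_run = glue.get_job_run(JobName="nightly-etl", RunId=run_id)["JobRun"]
    state = job_run["JobRunState"]
    if state not in ("STARTING", "RUNNING", "STOPPING"):
        break
    time.sleep(30)

print(f"Glue job finished with state: {state}")
```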
bradford, yorkshire and the humber, united kingdom
Fruition Group
in AWS. Strong expertise with AWS services, including Glue, Redshift, Data Catalog, and large-scale data storage solutions such as data lakes. Proficiency in ETL/ELT tools (e.g. Apache Spark, Airflow, dbt). Skilled in data processing languages such as Python, Java, and SQL. Strong knowledge of data warehousing, data lakes, and data lakehouse architectures. Excellent analytical and …
Luton, England, United Kingdom Hybrid / WFH Options
easyJet
indexing, partitioning. Hands-on IaC development experience with Terraform or CloudFormation. Understanding of ML development workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. Experience with data quality …
systems. Nice to haves: Experience with NoSQL databases (MongoDB, Cassandra, Redis). Familiarity with message brokers (Kafka, SQS/SNS, RabbitMQ). Knowledge of real-time streaming (Kafka Streams, Apache Flink, etc.). Exposure to big-data or machine-learning frameworks (TensorFlow, PyTorch, Hugging Face, LangChain). Experience working with AI-driven development tools such as Cursor, Copilot, or …
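To make the message-broker item concrete, here is a minimal consumer sketch using the kafka-python package; the broker address, topic, and group id are placeholders.

```python
# Minimal kafka-python consumer sketch; topic and broker are placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                              # hypothetical topic name
    bootstrap_servers="localhost:9092",    # placeholder broker address
    group_id="demo-group",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

# Each message is a ConsumerRecord with offset, key, and deserialized value.
for message in consumer:
    print(message.offset, message.value)
```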
Linux-based, concurrent, high-throughput, low-latency software systems. Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster). Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases. Have a Bachelor's or advanced degree in Computer Science, Mathematics, Statistics, Physics, Engineering, or equivalent work experience. For more information about DRW's processing activities …
frameworks, and clear documentation within your pipelines. Experience in the following areas is not essential but would be beneficial: Data Orchestration Tools: Familiarity with modern workflow management tools like Apache Airflow, Prefect, or Dagster. Modern Data Transformation: Experience with dbt (Data Build Tool) for managing the transformation layer of the data warehouse. BI Tool Familiarity: An understanding of how …
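For the dbt item above, one way to drive a dbt run from Python is the programmatic dbtRunner API, assuming dbt-core 1.5 or later; the model selector below is hypothetical.

```python
# Sketch (assumes dbt-core >= 1.5): invoke dbt programmatically from Python.
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()
# Equivalent to `dbt run --select staging.orders` on the CLI;
# "staging.orders" is a made-up model selector.
result: dbtRunnerResult = runner.invoke(["run", "--select", "staging.orders"])

if not result.success:
    raise SystemExit(f"dbt run failed: {result.exception}")
```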
Luton, England, United Kingdom Hybrid / WFH Options
easyJet
field. Technical Skills Required • Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). • Experience with Apache Spark or any other distributed data programming frameworks. • Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. • Experience with cloud infrastructure like AWS or …
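As a sketch of the "efficient SQL on cloud warehouses like Databricks SQL" item, the databricks-sql-connector package can run a query from Python; the hostname, HTTP path, and token below are placeholders read from the environment.

```python
# Sketch using databricks-sql-connector; connection details are placeholders.
import os
from databricks import sql

with sql.connect(
    server_hostname=os.environ["DATABRICKS_HOST"],       # e.g. xxx.cloud.databricks.com
    http_path=os.environ["DATABRICKS_HTTP_PATH"],        # SQL warehouse HTTP path
    access_token=os.environ["DATABRICKS_TOKEN"],
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT current_date() AS today")
        print(cursor.fetchall())
```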