production issues. Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes … Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing … large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge …
robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes …
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data … platform integrates Python and Snowflake and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able to develop and maintain ELT and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer who is … a Snowflake enthusiast who can write solid SQL queries, within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow - from a Cloud perspective, good AWS exposure. Naturally, you will have a good understanding of AWS. I'd love you to be an advocate of Agile too - these guys are massive …
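For illustration, a minimal sketch of the kind of Airflow-orchestrated Kafka-to-Snowflake pipeline this role describes (assumes Airflow 2.4+; the DAG and task names are hypothetical, and the Kafka read and Snowflake write are stubbed rather than real integrations):

```python
from datetime import datetime, timedelta

from airflow.decorators import dag, task


@dag(
    schedule="@hourly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
)
def kafka_to_snowflake():
    @task
    def extract() -> list:
        # Stand-in for consuming a Kafka topic (e.g. via confluent-kafka);
        # stubbed so the sketch stays self-contained.
        return [{"event_id": 1, "payload": "example"}]

    @task
    def load(rows: list) -> None:
        # A real task would write via snowflake-connector-python or the
        # Snowflake provider; here we only report the batch size.
        print(f"Would load {len(rows)} rows into Snowflake")

    load(extract())


kafka_to_snowflake()
```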
you'll have the opportunity to grow the data function and step into a leadership role. The Role: Designing and building robust ETL pipelines using tools like dbt or Apache Airflow Integrating data from APIs, databases, and SaaS platforms into BigQuery Structuring clean, queryable data models to support analytics and reporting Collaborating with analysts to deliver insightful dashboards … via Looker Establishing data governance and quality processes Requirements: GCP (BigQuery), but open to other cloud backgrounds ETL: dbt, Apache Airflow, or similar BI: Looker (preferred), or other BI tools Languages: SQL, Python, Java Experienced data engineer, with strong ETL and cloud data warehouse experience Proficiency in SQL and data modelling best practices Experience with BI tools and …
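As a hedged sketch of the "clean, queryable data models in BigQuery" duty above (dataset, table, and column names are invented; assumes the google-cloud-bigquery package and application-default credentials):

```python
from google.cloud import bigquery

client = bigquery.Client()  # picks up application-default credentials

# Materialise a derived model from raw data, similar in spirit to a dbt model.
sql = """
CREATE OR REPLACE TABLE analytics.daily_orders AS
SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
FROM raw.orders
GROUP BY order_date
"""
client.query(sql).result()  # .result() blocks until the query job completes
```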
data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS, Azure, or GCP for … deploying and managing data solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or …
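For the Spark items above, a small PySpark sketch of batch distributed processing (the S3 paths and column names are placeholders, not from the listing):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

# Read raw events, aggregate per day, and write the result back out.
events = spark.read.parquet("s3://example-bucket/events/")  # hypothetical path
daily = events.groupBy(F.to_date("event_ts").alias("day")).agg(
    F.count("*").alias("events")
)
daily.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")
```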
systems, with a focus on data quality and reliability. Design and manage data storage solutions, including databases, warehouses, and lakes. Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and … ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical and non-technical teams. … Additional Strengths Experience with orchestration tools like Apache Airflow. Knowledge of real-time data processing and event-driven architectures. Familiarity with observability tools and anomaly detection for production systems. Exposure to data visualization platforms such as Tableau or Looker. Relevant cloud or data engineering certifications. What we offer: A collaborative and transparent company culture founded on Integrity, Innovation and …
London (City of London), South East England, United Kingdom
HCLTech
data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices including IAM, KMS, and data access policies. Experience with monitoring and …
data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity …
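As an illustrative sketch of a scraping pipeline under orchestration, here using Prefect 2.x (the URL and "parsing" are placeholders; an Airflow DAG would take the equivalent shape):

```python
import httpx  # installed as a Prefect dependency
from prefect import flow, task


@task(retries=3, retry_delay_seconds=10)
def fetch(url: str) -> str:
    # Retries handle transient network failures during scraping.
    return httpx.get(url, timeout=30).text


@task
def parse(html: str) -> int:
    # Stand-in for real extraction logic (e.g. BeautifulSoup parsing).
    return len(html)


@flow
def scrape():
    html = fetch("https://example.com")  # hypothetical target site
    print(f"Parsed {parse(html)} characters")


if __name__ == "__main__":
    scrape()
```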
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and dbt is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience …
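A hedged illustration of moving between the formats above in a medallion-style flow (CSV landing to typed Parquet; file and column names are invented, and pandas needs pyarrow installed for Parquet output):

```python
import pandas as pd

# Bronze: raw CSV as landed.
orders = pd.read_csv("landing/orders.csv")

# Silver: cleaned and typed before writing to Parquet.
orders["order_date"] = pd.to_datetime(orders["order_date"])
orders.to_parquet("silver/orders.parquet", index=False)
```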
problem-solving skills We'd love to see: Knowledge of Database Systems and trade-offs in distributed systems Familiarity with API Designs Familiarity with Orchestration Frameworks such as Apache Airflow, Argo Workflows, Conductor etc. Experience working with and designing systems utilizing AWS Bloomberg is an equal opportunity employer and we value diversity at our company. We do …
following engineering disciplines: Cloud Engineering Data Engineering (not building pipelines but designing and building the framework) DevOps MLOps/LLMOps Often work with the following technologies: Azure, AWS, GCP Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith …
computing, with experience using frameworks like Databricks Strong experience with Azure or other cloud platforms (AWS) Strong experience building data pipelines or ETL tools and orchestration frameworks such as Airflow Strong experience and appreciation of CI/CD implementation Good experience with implementing automated testing Good experience with Infrastructure as Code with Terraform Strong understanding of Agile delivery methodologies …
Troubleshooting: Oversee pipeline performance, address issues promptly, and maintain comprehensive data documentation. What You'll Bring Technical Expertise: Proficiency in Python and SQL; experience with data processing frameworks such as Airflow, Spark, or TensorFlow. Data Engineering Fundamentals: Strong understanding of data architecture, data modelling, and scalable data solutions. Backend Development: Willingness to develop proficiency in backend technologies (e.g., Python with … Django) to support data pipeline integrations. Cloud Platforms: Familiarity with AWS or Azure, including services like Apache Airflow, Terraform, or SageMaker. Data Quality Management: Experience with data versioning and quality assurance practices. Automation and CI/CD: Knowledge of build and deployment automation processes. Experience within MLOps A 1st class Data degree from one of the UK's top …
and Analytics to ensure alignment and impact. Champion a culture of learning, innovation, and continuous improvement within the team. Tech Stack: Python | SQL | Snowflake | AWS (S3, EC2, Terraform, Docker) | Airflow | dbt | Apache Spark | Apache Iceberg | Postgres Requirements: Proven experience in a hands-on data engineering leadership role. Strong background in modern data engineering (pipelines, modelling, transformations, governance … . Experience leading or mentoring small teams, with a desire to develop people as much as technology. Solid AWS cloud experience and exposure to modern tooling such as dbt, Airflow, and Snowflake. Strong communication skills with the ability to work cross-functionally and balance technical and business priorities. Curiosity around AI and how it can be used to boost …
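For the Python-plus-Snowflake stack above, a minimal connectivity sketch (assumes the snowflake-connector-python package; all connection parameters are placeholders):

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",   # hypothetical account identifier
    user="example_user",
    password="...",              # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
)
with conn.cursor() as cur:
    cur.execute("SELECT CURRENT_VERSION()")
    print(cur.fetchone())
conn.close()
```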
London, South East, England, United Kingdom Hybrid / WFH Options
Office Angels
data modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands-on experience with data pipeline orchestration tools (Airflow, dbt, Prefect, or similar). Benefits: Unlimited holiday Annual Wellbeing Allowance Flexible work culture Monthly socials and events Complimentary snack bar Employer pension contribution If you're a data …
Analysis. SQL & Python: schema design, transformations, query optimisation, automation, testing. Track record of building ETL/ELT pipelines into modern warehouses (BigQuery, Snowflake, Redshift). Familiar with tools like Dagster, Airflow, Prefect, dbt, Dataform, SQLMesh. Cloud experience (we're on GCP) + containerisation (Docker, Kubernetes). Strong sense of ownership over data standards, security, and roadmap. A collaborator at heart …