… production issues. Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes … Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing … large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills: Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge …
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
Vivedia Ltd
… Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and collaboration — translating complex technical ideas into …
Stockport, England, United Kingdom Hybrid/Remote Options
Gravitas Recruitment Group (Global) Ltd
… and other squads to ensure smooth releases and integration. Key Skills: Data Modelling; Python & SQL; AWS/Redshift; 3–5+ years of experience in data engineering. Nice to Have: Airflow, Tableau, Power BI, Snowflake, Databricks; data governance/data quality tooling; degree preferred; Atlassian/Jira, CI/CD, Terraform. Why Join? Career Growth: Clear progression to Tech Lead.
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
… (e.g. Databricks, Snowflake). Collaborating with multidisciplinary teams to deliver real business value. What we're looking for: Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow; proven background in data modelling, warehousing, and performance optimisation; hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.); a consultancy mindset – adaptable, collaborative, and delivery-focused …
Manchester, England, United Kingdom Hybrid/Remote Options
Client Server
As a Senior Data Engineer you will take ownership of the data platform, optimising it for scalability to ensure successful client onboarding. You'll use modern tools (such as Airflow, Prefect, Dagster or AWS Step Functions) for ETL design and orchestration, work on transformation logic to clean, validate and enrich data (including handling missing values, standardising formats and duplication …
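For illustration of the kind of transformation logic this role describes (cleaning, validating and de-duplicating records), a minimal pandas sketch follows; the column names and rules are hypothetical, not taken from the listing.

    import pandas as pd

    def clean_client_records(df: pd.DataFrame) -> pd.DataFrame:
        """Illustrative cleaning pass: standardise formats, handle missing values, drop duplicates."""
        out = df.copy()
        # Standardise formats: trim whitespace and normalise case on a hypothetical email column
        out["email"] = out["email"].str.strip().str.lower()
        # Parse dates into a consistent dtype, coercing bad values to NaT
        out["signup_date"] = pd.to_datetime(out["signup_date"], errors="coerce")
        # Handle missing values: fill a hypothetical country column with a sentinel
        out["country"] = out["country"].fillna("unknown")
        # De-duplicate on a hypothetical natural key, keeping the latest record
        out = out.sort_values("signup_date").drop_duplicates(subset="email", keep="last")
        return out

An orchestrator such as Airflow, Prefect or Dagster would then run a step like this as one task in the wider ETL flow.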
… with internal stakeholders to drive innovation and efficiency. Requirements: 3–6 years of experience in Data Engineering or a related role; strong skills in Python, SQL, AWS, Git, and Airflow; proven ability to improve data quality and system performance; excellent communication and problem-solving skills; minimum 2:1 degree from a Russell Group or Ivy League university in Computer …
… /frameworks (pandas, numpy, sklearn, TensorFlow, PyTorch); strong understanding and experience in implementing end-to-end ML pipelines (data, training, validation, serving); experience with ML workflow orchestration tools (e.g., Airflow, Prefect, Kubeflow) and ML feature or data platforms (e.g., Tecton, Databricks, etc.); experience with cloud platforms (AWS, GCP/Vertex, Azure), Docker, and Kubernetes; solid coding practices (Git, automated …
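As a rough sketch of what an end-to-end pipeline with the libraries named above can look like (toy data and an illustrative model choice, not a prescribed approach), here data preparation, training and validation are bundled with scikit-learn.

    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score
    from sklearn.datasets import make_classification

    # Toy data standing in for a real feature set
    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
    X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

    # Data prep and training bundled in one pipeline; validation on a held-out split
    model = Pipeline([("scale", StandardScaler()), ("clf", LogisticRegression(max_iter=1000))])
    model.fit(X_train, y_train)
    print("validation accuracy:", accuracy_score(y_val, model.predict(X_val)))

In production the same pipeline object would typically be versioned, served behind an API, and retrained on a schedule driven by an orchestrator such as Airflow or Kubeflow.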
… in Computer Science or related field, ideally from a Russell Group university). Hedge fund experience is essential. Proficiency in SQL, Python, AWS, and modern data tools; experience with Git, Airflow, and data orchestration; strong troubleshooting skills and a track record of improving data reliability; excellent communication and teamwork skills; a self-starter mindset with a passion for learning and …
Manchester, North West England, United Kingdom Hybrid/Remote Options
Inter-Quest
… hands-on with AWS, Azure, or GCP, working in a collaborative environment that values innovation, quality, and teamwork. What you'll bring: Solid experience with Python, SQL, Spark, and Airflow; confident working across AWS, Azure, or GCP; proven experience working in a consultancy environment — able to manage multiple clients and projects; great communication skills — able to work with both …
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Noir
Data Engineer - FinTech - Newcastle (Tech stack: Data Engineer, SQL, Python, AWS, Git, Airflow, Data Pipelines, Data Platforms, Programmer, Developer, Architect, Data Engineer) Our client is a trailblazer in the FinTech space, known for delivering innovative technology solutions to global financial markets. They are expanding their engineering capability in Newcastle and are looking for a talented Data Engineer to join …
Skills: Snowflake, DBT, Airflow, Data Modelling, SQL. Mode: Hybrid, 4 days. Location: Manchester. Mandatory Skills: Possess good knowledge in cloud computing; very good working knowledge in data models, viz. Dimensional Data Model, ER Data Model and Data Vault; very good working knowledge in writing SQL queries; very good working knowledge in Snowflake Architecture; very good working knowledge in Snowflake …
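For context on the dimensional modelling and SQL skills listed, a minimal sketch of a star-schema query issued from Python via the Snowflake connector (the snowflake-connector-python package); the account details, table and column names are placeholders invented for illustration.

    import snowflake.connector

    # Placeholder connection details, not real credentials
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="ANALYTICS_WH", database="ANALYTICS", schema="MART",
    )
    cur = conn.cursor()
    # A typical dimensional-model query: a fact table joined to its dimensions, aggregated by dimension attributes
    cur.execute("""
        SELECT d.calendar_month, p.category, SUM(f.sales_amount) AS total_sales
        FROM fact_sales f
        JOIN dim_date d ON f.date_key = d.date_key
        JOIN dim_product p ON f.product_key = p.product_key
        GROUP BY d.calendar_month, p.category
        ORDER BY d.calendar_month
    """)
    for row in cur.fetchall():
        print(row)
    cur.close()
    conn.close()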
Manchester, England, United Kingdom Hybrid/Remote Options
ECOM
… hands-on with AWS, Azure, or GCP, working in a collaborative environment that values innovation, quality, and teamwork. 💡 What you'll bring: Solid experience with Python, SQL, Spark, and Airflow; confident working across AWS, Azure, or GCP; proven experience working in a consultancy environment — able to manage multiple clients and projects; great communication skills — able to work with both …
Leeds, England, United Kingdom Hybrid/Remote Options
ECOM
… hands-on with AWS, Azure, or GCP, working in a collaborative environment that values innovation, quality, and teamwork. 💡 What you'll bring: Solid experience with Python, SQL, Spark, and Airflow; confident working across AWS, Azure, or GCP; proven experience working in a consultancy environment — able to manage multiple clients and projects; great communication skills — able to work with both …
Knutsford, Cheshire, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
… collaborate across teams. What We're Looking For: Proven experience with BigQuery, dbt/dataform, and Tableau. Strong SQL and modern data architecture knowledge. Familiarity with orchestration tools (Prefect, Airflow), Git, CI/CD, and Python. Excellent communication and stakeholder engagement skills. Ready to make an impact? Apply now.
Job Title: Airflow/AWS Data Engineer Location: Manchester Area (3 days per week in the office) Rate: Up to £400 per day inside IR35 Start Date: 03/11/2025 Contract Length: Until 31st December 2025 Job Type: Contract Company Introduction: An exciting opportunity has become available with one of our sector-leading financial services clients. They … to join their growing data engineering function. This role will play a key part in designing, deploying, and maintaining modern cloud infrastructure and data pipelines, with a focus on Airflow, AWS, and data platform automation. Key Responsibilities: Deploy and manage cloud infrastructure across Astronomer Airflow and AccelData environments. Facilitate integration between vendor products and core systems, including data … Establish and enforce best practices for cloud security, scalability, and performance. Configure and maintain vendor product deployments, ensuring reliability and optimized performance. Ensure high availability and fault tolerance for Airflow clusters. Implement and manage monitoring, alerting, and logging solutions for Airflow and related components. Perform upgrades, patches, and version management for platform components. Oversee capacity planning and resource …
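To give a flavour of the Airflow work described, a minimal DAG sketch with retries and a failure-alerting callback, assuming Airflow 2.4+ (for the schedule argument); the DAG, task and callback are illustrative stand-ins, not the client's actual pipelines.

    from datetime import datetime, timedelta
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def notify_on_failure(context):
        # Placeholder alerting hook; in practice this might push to Slack, PagerDuty, or CloudWatch
        print(f"Task {context['task_instance'].task_id} failed")

    def extract_and_load():
        # Placeholder body standing in for a real extract/load step
        print("pulling data from source and loading it to the platform")

    with DAG(
        dag_id="example_platform_pipeline",  # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={
            "retries": 2,
            "retry_delay": timedelta(minutes=5),
            "on_failure_callback": notify_on_failure,
        },
    ) as dag:
        PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)

Retries, failure callbacks and external log shipping are the usual building blocks for the monitoring, alerting and fault-tolerance responsibilities the listing mentions.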
… skills: We're excited if you have 7+ years of experience delivering multi-tier, highly scalable, distributed web applications; experience working with distributed computing frameworks: Hive/Hadoop, Apache Spark, Kafka, Airflow; working with programming languages Python, Java, SQL; working on building ETL (Extraction, Transformation and Loading) solutions using PySpark; experience in SQL/NoSQL database design …
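A rough illustration of an ETL (extract, transform, load) job built with PySpark, as mentioned above; the paths and column names are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_etl").getOrCreate()

    # Extract: read raw events from a hypothetical landing path
    raw = spark.read.json("s3://example-bucket/raw/events/")

    # Transform: drop records without a user id, derive an event date, aggregate per user per day
    daily = (
        raw.filter(F.col("user_id").isNotNull())
           .withColumn("event_date", F.to_date("event_ts"))
           .groupBy("user_id", "event_date")
           .agg(F.count("*").alias("events"), F.sum("amount").alias("total_amount"))
    )

    # Load: write the curated output partitioned by date
    daily.write.mode("overwrite").partitionBy("event_date").parquet("s3://example-bucket/curated/daily_user_activity/")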
Better Placed Ltd - A Sunday Times Top 10 Employer!
… across multiple sources. Integrate services via RESTful APIs and manage structured/unstructured data formats. Data Visualisation & Automation: Build interactive dashboards in BI tools such as Power BI or Apache Superset. Automate KPI tracking and reporting to streamline workflows. Partner with teams to identify opportunities for process optimisation. Apply best visualisation principles for clarity and impact. Ensure dashboard performance … tools such as Power BI, Superset, or Tableau. Experience with AWS data tools and governance. Strong analytical mindset with a focus on business outcomes. Nice-to-Have: Python or Airflow experience for automation. Knowledge of data warehouse best practices (Snowflake, BigQuery, Redshift). Experience with MS Business Central or similar ERP systems. eCommerce or omni-channel retail background. Exposure …
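As an illustration of automating KPI tracking with Python (one of the nice-to-haves above), a minimal pandas sketch that aggregates raw orders into metrics a dashboard can read; the file layout and column names are assumptions made for the example.

    import pandas as pd

    def refresh_kpis(orders_csv: str, out_path: str) -> pd.DataFrame:
        # Hypothetical input: one row per order with order_id, order_date and order_value columns
        orders = pd.read_csv(orders_csv, parse_dates=["order_date"])
        kpis = (
            orders.assign(month=orders["order_date"].dt.to_period("M").astype(str))
                  .groupby("month")
                  .agg(revenue=("order_value", "sum"),
                       orders=("order_id", "count"),
                       avg_order_value=("order_value", "mean"))
                  .reset_index()
        )
        # A BI tool (Power BI, Superset, Tableau) can be pointed at this output, or it can be loaded to a warehouse
        kpis.to_csv(out_path, index=False)
        return kpis

Scheduled via cron or an Airflow DAG, a refresh like this keeps the dashboard's source data current without manual exports.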
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Opus Recruitment Solutions Ltd
… SC cleared Software Developers (Python & AWS) to join a contract till April 2026. Inside IR35. SC cleared. Weekly travel to Newcastle. Around £400 per day. Contract till April 2026. Skills: Python, AWS Services, Terraform, Apache Spark, Airflow, Docker.
Sunderland, Tyne and Wear, England, United Kingdom
Reed
… data solutions. Provide hands-on technical guidance on data design, modelling, and integration, ensuring alignment with architectural standards. Drive the adoption of tools such as Alation, Monte Carlo, and Airflow to improve data lineage, quality, and reliability. Ensure data security, privacy, and compliance are integral to all architecture and integration designs. Act as a bridge between business and technology … Glue, Azure Blob Storage, Google BigQuery). Expertise in data modelling (Dimensional, Data Vault, Enterprise). Experience designing and implementing modern data architectures. Proficiency with integration/orchestration tools (Airflow, dbt, Glue). Strong communication and stakeholder management skills. Experience with metadata, cataloguing, and data quality tools, and knowledge of data governance and GDPR. Benefits: Opportunity to work in …