teams (Data Scientists, Quality, Manufacturing, IT) to define data requirements and deliver reliable data solutions. Develop and maintain ETL/ELT workflows using modern orchestration and integration tools (e.g., Airflow, Fivetran, or similar). Implement data governance, lineage, and documentation best practices. Support business intelligence tools (e.g., Tableau, Power BI) by ensuring data accessibility and consistency. Continuously evaluate emerging … hands-on experience with:
- Snowflake (architecture, performance tuning, cost optimisation)
- DBT (model development, testing, documentation, deployment)
- SQL (advanced query optimisation and debugging)
Experience with data integration and orchestration tools (Airflow, Dagster, Prefect, or similar). Familiarity with cloud data platforms (AWS, Azure, or GCP).
Contract: Initial 6-month contract. Hybrid: 1 day per week on-site in London.
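For context on the stack this listing names, here is a minimal sketch (not taken from the ad) of how dbt runs against Snowflake are commonly orchestrated with Airflow. The DAG id, schedule, and project path are assumptions for illustration, and Snowflake credentials are presumed to live in dbt's profiles.yml.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",      # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    # Build the dbt models; dbt resolves Snowflake credentials via profiles.yml
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",   # path is an assumption
    )
    # Execute dbt's schema/data tests once the models have materialised
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test",
    )
    dbt_run >> dbt_test
```

Running `dbt test` as a downstream task keeps failed data tests from silently propagating into the BI tools the role supports.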
troubleshoot data workflows and performance issues.
Essential Skills & Experience:
- Proficiency in SQL, Python, or Scala
- Experience with cloud platforms such as AWS, Azure, or GCP
- Familiarity with tools like Apache Spark, Kafka, and Airflow
- Strong understanding of data modelling and architecture
- Knowledge of CI/CD pipelines and version control systems
Additional Information: This role requires active SC clearance.
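As a hedged illustration of the Spark + Kafka combination this ad asks for, the snippet below sketches a PySpark structured-streaming read from a Kafka topic. The broker address and topic name are invented, and the console sink stands in for a real lake or warehouse target.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_ingest_demo").getOrCreate()

# Read a Kafka topic as a streaming DataFrame
# (requires the spark-sql-kafka connector package on the classpath)
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast the payload to a string for parsing
parsed = events.select(col("value").cast("string").alias("payload"))

# Console sink for demonstration only; production jobs would write to a table
query = parsed.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```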
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Opus Recruitment Solutions Ltd
SC cleared Software developers (Python & AWS) to join a contract till April 2026. Inside IR35. Weekly travel to Newcastle. Around £400 per day.
Skills:
- Python
- AWS Services
- Terraform
- Apache Spark
- Airflow
- Docker
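Purely to illustrate the "Python + AWS Services" pairing in the skills list above (none of this comes from the ad), a minimal boto3 call that writes an object to S3. The bucket, key, and payload are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
import boto3

# Requires AWS credentials in the environment; names below are placeholders
s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-data-bucket",   # hypothetical bucket
    Key="daily/report.txt",         # hypothetical key
    Body=b"rows=42",                # toy payload for demonstration
)
```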
and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools. Day-to-day responsibilities include:
- Designing and developing DBT models and Airflow pipelines within a modern data stack.
- Building robust data ingestion pipelines across multiple sources, including external partners, internal platforms, and APIs.
- Implementing automated testing and CI/CD pipelines … on forecasting and predictive analytics initiatives.
- Bringing modern engineering practices, testing frameworks, and design patterns to the wider data function.
Tech Stack & Skills. Core skills:
- Strong experience with DBT, Airflow, Snowflake, and Python
- Proven background in automated testing, CI/CD, and test-driven development
- Experience building and maintaining data pipelines and APIs in production environments
Nice to have …
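To make the "automated testing and test-driven development" requirement concrete, here is a small pytest sketch. The deduplication step it tests is a hypothetical pipeline transform invented for illustration, not anything from the listing.

```python
import pytest

def deduplicate_events(events: list[dict]) -> list[dict]:
    """Keep the first occurrence of each event id (hypothetical pipeline step)."""
    seen, unique = set(), []
    for event in events:
        if event["id"] not in seen:
            seen.add(event["id"])
            unique.append(event)
    return unique

def test_deduplicate_events_drops_repeats():
    raw = [{"id": 1}, {"id": 1}, {"id": 2}]
    assert deduplicate_events(raw) == [{"id": 1}, {"id": 2}]

def test_deduplicate_events_handles_empty_input():
    assert deduplicate_events([]) == []
```

Tests like these typically run in the CI/CD pipeline on every change, so a broken transform fails the build before it reaches production data.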
Role: JAVA Full Stack Developer. Job Type: Glasgow, Hybrid (3-5 days onsite). The Role: As a JAVA Full Stack Lead Software Engineer with our client, you will play a crucial role in an agile team, focusing on the enhancement …
code and testing principles. Develop tools and frameworks for data governance, privacy, and quality monitoring, ensuring full compliance with data protection standards. Create resilient data workflows and automation within Airflow, Databricks, and other modern big data ecosystems. Implement and manage data observability and cataloguing tools (e.g., Monte Carlo, Atlan, DataHub) to enhance visibility and reliability. Partner with ML engineers … deploy, and scale production-grade data platforms and backend systems.
- Familiarity with data governance frameworks, privacy compliance, and automated data quality checks.
- Hands-on experience with big data tools (Airflow, Databricks) and data observability platforms.
- Collaborative mindset and experience working with cross-functional teams including ML and analytics specialists.
- Curiosity and enthusiasm for continuous learning - you stay up to date …
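As a hedged sketch of the "automated data quality checks" this role mentions, the function below flags a column whose null rate exceeds a threshold. The column name and threshold are illustrative assumptions; a production version would pull data from Databricks or the warehouse and report failures to an observability platform such as those named above.

```python
import pandas as pd

def check_null_rate(df: pd.DataFrame, column: str, max_null_rate: float = 0.01) -> None:
    """Raise if the share of nulls in `column` exceeds the allowed rate."""
    null_rate = df[column].isna().mean()
    if null_rate > max_null_rate:
        raise ValueError(
            f"{column}: null rate {null_rate:.2%} exceeds limit {max_null_rate:.2%}"
        )

# Toy example; real checks would run as a scheduled Airflow task
frame = pd.DataFrame({"user_id": [1, 2, None, 4]})
check_null_rate(frame, "user_id", max_null_rate=0.5)  # passes: 25% <= 50%
```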