of the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB - Good experience with dbt, Apache Iceberg, Docker, Microsoft BI stack (nice to have) - Experience in data warehouse design (Kimball, lakehouse, medallion and data vault) is a definite preference, as is knowledge of …
engineering disciplines: Cloud Engineering, Data Engineering (not building pipelines but designing and building the framework), DevOps, MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and …
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
DCS Recruitment
data tools such as Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage. Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform. What you get in return: up to £60,000 per annum + benefits; hybrid working (3 days in office); opportunity to lead and mentor …
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
AJ Bell
policies. Maintain the data dictionary. Maintain the business-level data model. Recommend and introduce new technology where needed. Core: cloud data platforms (e.g. Snowflake, BigQuery, Redshift); data transformation technology such as dbt; Visual Studio Code; Python; CI automation systems such as Jenkins; a git-based source control system such as Bitbucket; Data Warehouse/Kimball methodology; data replication technology such as Fivetran …
platforms, time series databases, and tools like Grafana. Understanding of MLOps pipelines and tools for automated model management and deployment. Experience with orchestration tools such as Apache Airflow or dbt. Application Process: Ready to take the next step in your career with beON? Send your CV with the subject line "Data Science & ML Engineer." We respect your time - no cover …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
systems. Good understanding of security, compliance, and agile delivery practices. Passionate about learning new technologies and collaborating with teams, including mentoring junior engineers. Nice-to-have experience with Snowflake, dbt, Terraform, and Azure DevOps CI/CD pipelines. We are proud to be a Disability Confident Committed employer. If you have a disability and would like to apply to one …
of experience in a data engineering or similar role. Proven expertise in building and managing ETL/ELT processes, data lakes, and data warehouses. Strong proficiency in Python, SQL, dbt, and modern data platforms (e.g., Snowflake, BigQuery, Databricks, or similar). Experience with cloud environments (preferably AWS, Azure, or GCP). Familiarity with orchestration tools like Airflow, Prefect, or Dagster. …
Advanced proficiency in Python and SQL, with experience in spatial SQL and PostGIS. Familiarity with spatial tools and libraries (GeoPandas, QGIS) and feature engineering concepts. Experience with data modelling, dbt, and version control (Git). Knowledge of spatial datasets (MasterMap, AddressBase, Land Registry). Desired: experience with WMS/WFS services, graph theory (NetworkX), GDAL, and Snowflake. Nice to have …
Terraform). Linux, containers, Kubernetes, and serverless. JavaScript, Node.js, React and Flutter. Knowledge of NoSQL and relational databases (AWS RDS). Solid understanding of networking concepts. Integration services (Informatica Cloud, Airflow, dbt, DLT). Knowledge of messaging solutions (EventBridge, SNS, SQS, RabbitMQ, Kafka). Knowledge of data warehousing solutions (Snowflake). Experience in working within compliance (e.g. quality, regulatory - data privacy, GxP, SOX) and …
Edinburgh, City of Edinburgh, United Kingdom Hybrid/Remote Options
Cathcart Technology
overall data strategy. The ideal person for this role will have a strong background in data engineering, with experience building modern data solutions using technologies like Kafka, Spark, Databricks, dbt, and Airflow. You'll know your way around cloud platforms (AWS, GCP, or Azure) and be confident coding in Python, Java, or Scala. Most importantly, you'll understand …
Employment Type: Permanent
Salary: £80000 - £100000/annum Bonus, Pension and Shares
cloud-based platforms (e.g., Azure Synapse, AWS Redshift, Google BigQuery) with legacy systems, ensuring performance and scalability. Deep knowledge of ETL/ELT processes, leveraging tools like Apache Airflow, dbt, or Informatica, with a focus on ensuring data quality, lineage, and integrity across the data lifecycle. Practical expertise in data and AI governance, including implementing frameworks for data privacy, ethical …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
Ensure high-quality raw datasets to enable accurate analytics and data modeling. Deploy and maintain data tools on Kubernetes (Airflow, Superset, RStudio Connect). Support Data Analytics initiatives through dbt, DevOps, and deployment governance. You will work closely with a small, focused team, contributing directly to strategic data initiatives. Candidate Profile: The ideal candidate will have: significant experience in Data …
experience in developing architectural strategies and blueprints for hybrid and cloud-native solutions. ELT/ETL Frameworks & Pipelines - Essential: Develop robust ELT/ETL pipelines using tools like Apache Airflow, dbt, AWS Glue, Azure Data Factory, or Kafka Connect. Desirable: Optimize data transformations for performance, reusability, and modular design (e.g., using SQL/Scala/Python). Disclosure and Barring Service …
South East London, London, United Kingdom Hybrid/Remote Options
Stepstone UK
and in event-driven and eventually consistent systems using Kafka, .NET, Java, REST APIs, AWS, Terraform and DevOps. Nice to have: experience in data pipelines and modelling using SQL, dbt, ETL, data warehousing, Redshift and Python, and an ecommerce and mobile applications background. Additional Information: We're a community here that cares as much about your life outside work as how you …
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid/Remote Options
Cathcart Technology
foster a collaborative, forward-thinking culture where engineers work on large-scale datasets, distributed systems, and modern cloud platforms. Teams leverage cutting-edge tools such as Spark, Airflow, Kafka, dbt, and Databricks to build resilient, scalable, and high-quality data solutions. Why this role? - Lead a talented team of data engineers delivering high-quality, reliable data systems. - Shape the architecture …
London, South East, England, United Kingdom Hybrid/Remote Options
Oscar Technology
Strong expertise in Azure cloud services and Databricks. Advanced proficiency in Python and SQL for data engineering. Experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, dbt). Strong knowledge of data warehousing and cloud-based data architectures. Understanding of BI tools such as Power BI or Tableau. Strong problem-solving and debugging skills. Ability to communicate …
You'll Bring: 6+ years of experience in data engineering or similar roles. Proficiency in Python and SQL for data processing and performance tuning. Hands-on experience with Databricks, dbt, Airflow, or other modern data orchestration tools. Strong grasp of data modeling, distributed systems, and cloud data services. Familiarity with DevOps practices, including CI/CD, IaC (Terraform), and monitoring …
ETL processes, and data governance practices. Hands-on experience with data tools and platforms like Snowflake, Redshift, BigQuery, Databricks, or similar. Familiarity with data pipeline orchestration tools (e.g., Airflow, dbt). Excellent presentation and communication skills are required for this role. If you are interested in more information, reach out to Mathew at Reperio or apply below now. Reperio Human Capital acts …
Point • You enjoy sharing knowledge • You experiment with new tools and ideas • You prefer honest, straightforward engineering cultures. Some of the Perks & Benefits: • A "Tech Playground" with Databricks, Bicep, dbt, OpenAI and more • Unlimited learning: academies, workshops, lunch & learns • A mix of fun internal clubs (climbing, padel, Dutch club, or create your own) • Wellbeing budget each month • Monthly WFH, travel …