…develop and deploy Feature Engineering and Modeling applications to data platforms built on Databricks or similar platforms and components (e.g., Snowflake, MLflow, Airflow). Demonstrated experience using Azure-based cloud applications, services and infrastructure, or significant transferable experience with other cloud providers (e.g., AWS or …)
…technical documentation
Comfortable working with Agile, TOGAF, or similar frameworks
Desirable:
· Experience with Python and data libraries (Pandas, scikit-learn)
· Knowledge of ETL tools (Airflow, Talend, NiFi)
· Familiarity with analytics platforms (SAS, Posit)
· Prior work in high-performance or large-scale data environments
Why Join? This is more than …
Bath, South West England, United Kingdom · Hybrid / WFH Options
GMA Consulting
…async
· An expert working with databases and database architecture, SQL and stored procedures
· Proven experience with data integration technologies such as Azure Data Factory, Apache Airflow and Databricks Auto Loader
· Ability to build CI/CD pipelines and infrastructure as code
· Experience with event-driven data flows and message aggregators (e.g., Event Hub, Apache Kafka)
· Good understanding of current and emerging technologies and their potential to deliver business benefits
· Hands-on experience working with data matching processes would be highly desirable
· Ability to translate business strategy into technical solutions and business requirements into technical design
As a precondition of employment …
Bath, South West England, United Kingdom · Hybrid / WFH Options
Primis
…platform.
What You’ll Be Doing:
· Designing cloud-based data platform infrastructure (AWS, Azure, or GCP)
· Working with modern data tools like DBT, Airflow and Snowflake
· Using Terraform for Infrastructure as Code to automate deployment and configuration
· Implementing CI/CD pipelines with tools such as Azure …
… high-performing.
Tech Stack Includes:
· Infrastructure as Code (Terraform)
· Cloud databases (Snowflake, Redshift, etc.)
· Docker and/or Linux tooling
· Modern data stack: DBT, Airflow
· Agile working environment
What’s in it for You:
· Fully remote working, permanently
· Annual performance bonus
· 5% pension contribution
· 45 days holiday including bank holidays …
…including data pipelines, orchestration and modelling.
· Lead the team in building and maintaining robust data pipelines, data models and infrastructure using tools such as Airflow, AWS Redshift, DBT and Looker
· Ensuring the team follows agile methodologies to improve delivery cadence and responsiveness
· Contribute to hands-on coding, particularly in areas …
· … foster team growth and development
· Strong understanding of the data engineering lifecycle, from ingestion to consumption
· Hands-on experience with our data stack (Redshift, Airflow, Python, DVT, MongoDB, AWS, Looker, Docker)
· Understanding of data modelling, transformation and orchestration best practices
· Experience delivering both internal analytics platforms and external data …