Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
all backgrounds and encourage applications from those with transferable experience. You’ll ideally bring: Experience building ETL pipelines in Python. Familiarity with analytical data warehouses (e.g. Redshift preferred, or Snowflake/BigQuery). Understanding of orchestration tools such as AWS Step Functions, Airflow, or AWS Batch. Experience working with Agile methodologies. Awareness of automated testing and delivery pipelines. Understanding of …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Enso Recruitment
role in developing and optimising data pipelines and infrastructure to ensure efficient data flow across the business. Key Responsibilities: Develop, support and optimise robust data solutions using tools like Snowflake, dbt, Fivetran, and Azure Cloud services. Collaborate with cross-functional teams to translate business needs into actionable data architecture. Design and manage data pipelines and integration workflows, ensuring performance and …
of key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines. Experience working with one or more of Spark, Kafka, or Snowflake. NICE TO HAVE: DP-203 Azure Data Engineering; Microsoft Certified: Fabric Analytics Engineer Associate. SKILLS AND EXPERIENCE: A high level of drive with the ability to work to tight deadlines. …
Delivery experience using SAS (SAS EG, SAS DI, SAS Viya). Ability to write complex SQL queries. Project experience with Tableau, Python, Power BI, Cloud platforms (Azure, AWS, GCP, Snowflake, Databricks). Experience leading end-to-end projects and familiarity with methodologies like Agile, Waterfall, Scrum, DevOps, Testing. Experience leading technical and/or project teams. Technical Business Analysis experience. …
be great if you have: Experience of relevant cloud services within AWS, Azure or GCP. Experience working in an Agile environment. Experience working with common vendor products such as Snowflake or Databricks. Experience working with CI/CD tooling. What you'll get in return: 25 days' annual leave, rising to 30 days with each year of service. …
London, England, United Kingdom Hybrid / WFH Options
Noir
experience as a Data Engineer (3-5 years), preferably in the energy sector. Right to work in the UK. Strong proficiency in SQL and database technologies (e.g., MS SQL, Snowflake). Hands-on experience with ETL/ELT tools such as Azure Data Factory, DBT, AWS Glue, etc. Proficiency in Power BI and Advanced Analytics for insightful data visualisation. Strong …
and mutual respect; accepting differences and treating everyone fairly. The ability to think outside of the immediate task and constantly look at ways of improving processes. Desirable: Experience with Snowflake, Azure, AWS. Experience with GA4, Search Console and Meta data sets. Experience with property management software (MRI Qube, Yardi or similar). Experience with complex databases. Experience in system support, monitoring …
solving, debugging, and performance-tuning skills. Preferred Qualifications: Strong communication skills and demonstrated ability to engage with business stakeholders and product teams. Experience in data modeling, data warehousing (e.g., Snowflake, AWS Glue, EMR, Apache Spark), and working with data pipelines. Leadership experience, whether technical mentorship, team leadership, or managing critical projects. Familiarity with Infrastructure as Code (IaC) tools like …
or equivalent. Expert experience building data warehouses and ETL pipelines. Expert experience of SQL, Python, Git, dbt (incl. query efficiency and optimization). Expert experience of Cloud Data Platforms (AWS, Snowflake and/or Databricks). Qualification preferred, not mandatory: Significant experience of automation and integration tools (Fivetran, Airflow, Astronomer or similar). Significant experience with IaC tools (Terraform, Docker, Kubernetes or similar) …
Big Data technologies (Spark, Impala, Hive, Redshift, Kafka, etc.). Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues. Experience with Databricks, Snowflake and Iceberg is required. Preferred qualifications, capabilities, and skills: Experience in application and data design disciplines with an emphasis on real-time processing and delivery (e.g. Kafka) is preferable. Understanding of …
solid understanding of key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines. Experience working with one or more of Kafka, Snowflake, Azure Data Factory, Azure Synapse or Microsoft Fabric is highly desirable. Knowledge of data modelling and data architectures: Inmon, Kimball, Data Vault. About You: A high level of drive with the …
Gloucester, England, United Kingdom Hybrid / WFH Options
Benefact Group
optimization of data pipelines and resources. Knowledge, skills and experience (Essential): Cloud Platforms: Experience with Azure, AWS, or Google Cloud for data engineering. Cloud Data Tools: Expertise in Databricks, Snowflake, BigQuery, or Synapse Analytics. Programming & Scripting: Strong knowledge of Python, SQL, Spark, or similar. Data Modelling & Warehousing: Experience with cloud-based data architecture. CI/CD & DevOps: Knowledge of Terraform …
London, England, United Kingdom Hybrid / WFH Options
Automata
to ingest, transform and process large volumes of structured and unstructured data from diverse sources. Strong knowledge of SQL/NoSQL databases and cloud data warehouse technology such as Snowflake/Databricks/Redshift/BigQuery, including performance tuning and optimisation. Understanding of best practices for designing scalable and efficient data models, leveraging dbt for transformations. Familiarity with CircleCI, Terraform …
APIs for data extraction and interacting with cloud resources via APIs/CLIs/SDKs (e.g. boto3). Experience building out a data warehouse on platforms such as Redshift, Snowflake, or Databricks. Comfortable working with Git for source control (in Azure DevOps repos or equivalent). Experience working in an Agile (Scrum) environment for product delivery using Azure DevOps or …
Oriented Design and Data modelling is desirable. Understanding of Salesforce concepts is desirable. Experience with providing technical solutions and supporting documentation. PREFERRED SKILLS AND EXPERIENCE: Must have experience with Snowflake Storage and Database. Should have experience of working with cloud-native applications using AWS/Azure. Understanding of the SDLC and agile delivery methodology. Experience working with databases and …
Numpy/Pandas) and SQL. Proven experience designing and building robust ETL/ELT pipelines (dbt, Airflow). Strong knowledge of data pipelining, schema design, and cloud platforms (e.g., Snowflake, AWS). Excellent communication skills and the ability to translate technical concepts for diverse audiences. Familiarity with software architecture, containerisation, and modern DevOps practices is a plus. A forward-thinking …
Gloucester, England, United Kingdom Hybrid / WFH Options
Benefact Group plc
pipelines and cloud resources. Knowledge, skills and experience (Essential): Cloud Platforms: Demonstrable experience with Azure, AWS, or Google Cloud for data engineering. Cloud-Based Data Tools: Expertise in Databricks, Snowflake, BigQuery, or Synapse Analytics for scalable data solutions. Programming & Scripting: Strong knowledge of Python, SQL, Spark, or equivalent cloud technologies. Data Modelling & Warehousing: Experience with modern cloud-based data architecture …
Chester, England, United Kingdom Hybrid / WFH Options
Forge Holiday Group Ltd
of ETL/ELT processes. Exposure to Python or other object-oriented programming languages. Experience with modern data stack tools and cloud-based data warehouses like MS Fabric, Snowflake, Databricks, Teradata or AWS. Experience in designing and constructing effective reports and dashboards that transform data into actionable insights with Tableau or Power BI. Proven ability to manage work within …
London, England, United Kingdom Hybrid / WFH Options
Workato
data transformation tools (DBT, Coalesce) and orchestration frameworks (Airflow, Dagster) to build scalable pipelines. Knowledge of real-time data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). Familiarity with emerging data technologies such as open table formats (Apache Iceberg) and their impact on enterprise data strategies. Hands-on experience with data virtualization and analytics platforms …
data pipelines and cloud resources. Knowledge, skills and experience: Cloud Platforms: Demonstrable experience with Azure, AWS, or Google Cloud for data engineering. Cloud-Based Data Tools: Expertise in Databricks, Snowflake, BigQuery, or Synapse Analytics for scalable data solutions. Programming & Scripting: Strong knowledge of Python, SQL, Spark, or equivalent cloud technologies. Data Modelling & Warehousing: Experience with modern cloud-based data architecture …
Manchester, England, United Kingdom Hybrid / WFH Options
Dept Agency
with cloud platforms (GCP, AWS, Azure) and their data-specific services. Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT). Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.). Strong understanding of data modeling, data governance, and data quality principles. Excellent communication skills with the ability to translate complex technical concepts for business stakeholders. Strategic thinking with …
of-breed vendor tools. We deploy team members from on-shore, near-shore and off-shore teams, and often work alongside our major alliance partners, such as Microsoft, IBM, Snowflake, Moody's, ServiceNow and Pega, to deploy solutions. Increasingly we collaborate with FinTech firms too. We are passionate about keeping pace with the latest emerging technology. We have recently …
SSIS, Data Factory, Alteryx). Power BI development including the data model, DAX, and visualizations. Relational and Dimensional (Kimball) data modelling. Desirable: Databricks (or an alternative modern data platform such as Snowflake). Experience working in a regulated environment and knowledge of the risk and compliance requirements associated with this. Oracle Database. MongoDB. Cloud Data Technologies (mainly Azure - SQL Database, Data Lake, HD …