Edgbaston, Birmingham, West Midlands (County), United Kingdom
Network IT
relationship building to understand and meet business needs. Experience: We are looking for a Data Engineer who has experience designing and implementing data warehouses, with strong technical competency in Snowflake (preferably certified), Azure Data Factory for building cloud ETL pipelines, Power BI, and dbt (Data Build Tool). Other elements of your experience that are desirable to our client include …
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
effectively. Desirable Skills: GCP Professional Data Engineer certification; exposure to Agentic AI systems or intelligent/autonomous data workflows; experience with BI tools such as Looker; exposure to Databricks, Snowflake, AWS, Azure or dbt; academic background in Computer Science, Mathematics or a related field. This is an opportunity to work in a forward-thinking environment with access to cutting-edge …
oil and gas. Preferred Qualifications: • Certifications in relevant technologies or methodologies. • Proven experience in building, operating, and supporting robust and performant databases and data pipelines. • Experience with Databricks and Snowflake. • Solid understanding of web performance optimization, security, and best practices. • Experience supporting Power BI dashboards …
as Terraform or Ansible for deployment and infrastructure management. Hands-on experience with: ETL/ELT orchestration and pipeline tools (Airflow, Airbyte, dbt, etc.); data warehousing tools and platforms (Snowflake, Iceberg, etc.); SQL databases, particularly MySQL. Desired Experience: Experience with cloud-based services, particularly AWS. Proven ability to manage stakeholders and their expectations, and to explain complex problems or solutions in a …
adept in ETL tools like Informatica, Glue, Databricks and DataProc, with strong coding skills in Python, PySpark and SQL. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required. Job …
your own ideas; your voice will be heard. Qualifications: Degree in Computer Science, Information Technology or a related field. Skills & Experience: 3-5 years of SQL experience (bonus: NoSQL or Snowflake). 2-3 years of hands-on Python (scripting and development). Experience in a fast-paced startup or agile environment. Strong background in schema design and dimensional data modeling.
Experience with data lakes/lakehouses (Databricks, Unity Catalog). Familiarity with Data Mesh, Data Fabric, and product-led data strategies. Expertise in cloud platforms (AWS, Azure, GCP, Snowflake). Technical Skills: Proficiency in big data tools (Apache Spark, Hadoop). Programming knowledge (Python, R, Java) is a plus. Understanding of ETL/ELT, SQL, NoSQL, and data visualisation …
Market PAS platforms (e.g., OpenTWINS, DXC Assure, Sequel, IRIS). Knowledge of BI/MI tooling (e.g., Power BI, Tableau, Qlik). Familiarity with data warehouse technologies (e.g., SQL, Snowflake, Azure, Informatica, etc.). Exposure to Agile delivery and use of tools such as Jira or Azure DevOps.
Must-Have: 2+ years' experience in an analytics or data consultancy role; proficiency in SQL and data modelling (preferably with dbt); hands-on experience with cloud data warehouses (BigQuery, Snowflake, Redshift); familiarity with BI tools (Looker, Power BI, Tableau, etc.); excellent communication skills - able to simplify technical concepts for non-technical stakeholders. Nice-to-Have: Experience working in client-facing …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Noir
testing including unit, performance, stress, and security testing. Experience working in an agile team. Experience working in a highly regulated industry and with highly sensitive data. Exposure to large data solutions like Snowflake, Trino, Synapse, Azure Data Lake, and Databricks. Experience in data science using R, Stata, or Python. Familiarity with Atlassian tools such as JIRA, Confluence, and BitBucket. Understanding of clinical trials, GCP, and GxP. What …
City of London, London, United Kingdom Hybrid / WFH Options
Anvil Analytical
role. Advanced Power BI skills, including DAX, Power Query (M), and custom visuals. Strong experience in data transformation and modelling. Proven ability in data integration across multiple sources (Snowflake, SQL, APIs, Excel, etc.). Experience working in or with the defence, government, or public sector is highly desirable. Knowledge of Data Warehousing concepts and practices. Familiarity with Agile methodologies …
Manchester Area, United Kingdom Hybrid / WFH Options
Noir
performance, stress, and security testing. Experience working in an agile team. Experience working in a highly regulated industry and with highly sensitive data. Exposure to large data solutions like Snowflake, Trino, Synapse, Azure Data Lake, and Databricks. Experience in data science using R, Stata, or Python. Familiarity with Atlassian tools such as JIRA, Confluence, and BitBucket. Understanding of clinical trials …
as: Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores. Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery). Expertise in building data architectures that support batch and streaming paradigms. Experience with standards such as JSON, XML, YAML, Avro, Parquet. Strong communication skills. Open to learning new …
Manchester, North West, United Kingdom Hybrid / WFH Options
Anson Mccade
for large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building data marts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Anson Mccade
for large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building data marts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to …
Central London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
for large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building data marts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to …
West Midlands, United Kingdom Hybrid / WFH Options
Anson Mccade
for large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building data marts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
for large-scale platforms and diverse stakeholder groups. Strong data modelling skills across OLTP and OLAP systems, with experience building data marts and warehouses. Proficiency with cloud-native tools (e.g. Snowflake, Databricks, Azure Data Services, etc.). High-level SQL skills and experience with tools like SSMS, SSIS, or SSRS. Strong understanding of data governance, RBAC, and information security practices. Able to …
with tooling, guidance, and best practices. About You: Strong technical foundation in data governance architecture and tooling. You've worked with tools such as DataHub, Apache Airflow, AWS, dbt, Snowflake, BigQuery, or similar. Hands-on experience building and maintaining centralized data inventories, business glossaries, and data mapping frameworks. Proficient in automating data classification and lineage using scripting languages like Python …
ETL tools, data migration, and data cleansing methodologies. Moderate to advanced SQL skills, with experience writing complex queries. Experience integrating with cloud-based data warehouses/data lakes (e.g., Snowflake, AWS, Databricks, BigQuery) and data analytics tools (e.g., Tableau). Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders. Ability to thrive in …
data engineering roles, preferably for a customer-facing data product. Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go or Scala. Demonstrable use, and an understanding of the effective use, of AI tooling in your development process …
Experience in managing a team of Data Engineers. Experience with data modelling, data warehousing, and building ETL pipelines. Experience with AWS (S3, EKS, EC2, RDS) or similar cloud services, Snowflake, Fivetran, Airbyte, dbt, Docker, Argo. Experience in SQL, Python, and Terraform. Experience with building data pipelines and applications to stream and process datasets. A robust understanding of DevOps principles is required.