systems for operational decision support and analyzing unstructured data (e.g., text, imagery)
• Ability to architect and maintain scalable data lakes, data warehouses, or distributed storage systems (e.g., Delta Lake, Snowflake, Hadoop, or NoSQL solutions)
• Demonstrated understanding of data security, privacy, and sovereignty issues, particularly in military or international environments, ensuring compliance with NATO operational and ethical standards
• Experience building visually
Server.
• Familiarity with Azure Data Lake Storage (Gen2) and Azure Blob Storage.
• Knowledge of Power BI integration and data modelling.
• Understanding of Azure Functions and Logic Apps for automation.
Snowflake:
• Strong SQL skills and experience with Snowflake's architecture (virtual warehouses, storage, cloud services).
• Proficiency in Snowflake Streams & Tasks for CDC and automation.
• Experience with Snowflake Secure Data Sharing … and Snowflake Marketplace.
• Familiarity with Snowpark for Python/Java-based transformations (see the illustrative sketch below).
• Understanding of role-based access control, data masking, and time travel features.
Databricks:
• Hands-on experience with Apache Spark and Databricks Runtime.
• Proficiency in Delta Lake for ACID-compliant data lakes.
• Experience with Structured Streaming and Auto Loader.
• Familiarity with MLflow, Feature Store, and Model Registry.
• Use of … across business and technology stakeholders.
• Financial Services industry background or experience as a bonus.
• Preparation and delivery of MI/BI reporting in Power BI or SAS a bonus.
• Snowflake experience a bonus.
What we look for: Someone who is passionate about reaching their full potential and excelling in their career. Someone with energy, enthusiasm and courage who enjoys solving
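For readers unfamiliar with Snowpark, the following is a minimal sketch of the kind of Python-based transformation the Snowflake bullets above refer to. The connection parameters, the RAW_ORDERS source table, the ORDER_STATUS filter, and the CURATED_ORDERS target are all hypothetical placeholders, not details from the listing.

```python
# Minimal Snowpark sketch: read a raw table, apply a simple transformation,
# and persist the result. Connection details and table/column names are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

connection_parameters = {
    "account": "<account_identifier>",   # placeholder
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<virtual_warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Lazily reference a source table and derive a curated version of it.
orders = session.table("RAW_ORDERS")                      # hypothetical table
curated = (
    orders
    .filter(col("ORDER_STATUS") == "COMPLETE")            # hypothetical column/value
    .select("ORDER_ID", "CUSTOMER_ID", "ORDER_TOTAL")
)

# Write the transformed data back to Snowflake as a new table.
curated.write.mode("overwrite").save_as_table("CURATED_ORDERS")

session.close()
```

The transformation executes inside Snowflake's virtual warehouses; the Python client only builds and submits the query plan.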
Kubernetes).
Data Management: Understanding of data modeling, metadata management, and data lineage. Experience implementing CI/CD pipelines for data workflows. Familiarity with modern storage and query engines (Snowflake, Redshift, BigQuery, Delta Lake).
Soft Skills: Strong analytical and problem-solving abilities; ability to work with large, complex datasets. Excellent verbal and written communication skills; ability to explain technical
London, South East, England, United Kingdom Hybrid / WFH Options
CV TECHNICAL LTD
Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience, with the ability to mentor, influence, and set technical
City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
level and translating requirements into solution blueprints, proposals, and presentations.
Enterprise Solution Design – Architect end-to-end data and cloud solutions across platforms such as Azure, AWS, GCP, Databricks, Snowflake, Synapse, or Azure Fabric.
Cloud Strategy & Adoption – Define and lead cloud migration, modernisation, and optimisation strategies using tools such as Terraform, Azure DevOps, and CI/CD pipelines.
Data Products
backbone of great data analysis, and as such the data engineer is crucial to the success of the data department overall. TLA works with the modern data stack, utilising Snowflake for our data warehouse, dbt to transform data across our medallion architecture, and Apache Airflow for orchestration. Microsoft Azure is our choice of cloud provider for hosting infrastructure. Within the
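To make the Snowflake-plus-dbt-plus-Airflow stack described above concrete, here is a minimal Airflow DAG sketch that runs dbt models layer by layer. It assumes Airflow 2.4+, a hypothetical dbt project path, and that the models are tagged bronze, silver, and gold; none of these details come from the listing.

```python
# Minimal Airflow sketch: orchestrate dbt runs across medallion layers.
# The dbt project path and the bronze/silver/gold tags are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt/analytics_project"   # hypothetical project location

with DAG(
    dag_id="medallion_dbt_run",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    bronze = BashOperator(
        task_id="dbt_run_bronze",
        bash_command=f"cd {DBT_DIR} && dbt run --select tag:bronze",
    )
    silver = BashOperator(
        task_id="dbt_run_silver",
        bash_command=f"cd {DBT_DIR} && dbt run --select tag:silver",
    )
    gold = BashOperator(
        task_id="dbt_run_gold",
        bash_command=f"cd {DBT_DIR} && dbt run --select tag:gold",
    )

    # Enforce layer ordering: each layer rebuilds only after the one beneath it.
    bronze >> silver >> gold
```

The explicit bronze >> silver >> gold dependency mirrors the medallion flow, so a failed bronze run blocks downstream layers rather than silently propagating stale data.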
and business stakeholders.
Nice to have:
• Exposure to Microsoft Fabric specifically.
• Experience with distributed data processing (e.g. Spark).
• Familiarity with equivalent cloud platforms (AWS, GCP).
• Exposure to Snowflake or Databricks.
How to apply: To apply, simply complete a CV profile and submit your application. If shortlisted, one of our recruitment team will be in touch to arrange a
p/day
Key Responsibilities:
• Designing and maintaining scalable ETL/ELT pipelines.
• Integrating data from multiple sources into a centralised warehouse (SQL Server, PostgreSQL, or Snowflake).
• Working with Azure Data Factory and cloud-native tooling for data orchestration.
• Structuring data models to support Power BI dashboards and reports.
• Implementing data governance, quality frameworks, and security controls.
in analytics engineering, data engineering, or closely related roles
• Strong proficiency in SQL, Python, and dbt (strongly preferred)
• Hands-on experience with Azure Databricks and cloud-based data platforms (Snowflake experience also valued)
• Solid understanding of dimensional modelling, lakehouse/warehouse design, and modern data stack
• Familiarity with Git, CI/CD, and software engineering best practices
• Experience with Power
time streaming architectures, to support advanced analytics, AI, and business intelligence use cases. Proven experience in designing architectures for structured, semi-structured, and unstructured data, leveraging technologies like Databricks, Snowflake, Apache Kafka, and Delta Lake to enable seamless data processing and analytics. Hands-on experience in data integration, including designing and optimising data pipelines (batch and streaming) and integrating cloud
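As an illustration of the real-time pattern mentioned above (Kafka feeding Delta Lake), here is a minimal PySpark Structured Streaming sketch. The broker address, topic name, and storage paths are placeholders, and it assumes a Spark environment with the Kafka connector and Delta Lake already available.

```python
# Minimal PySpark Structured Streaming sketch: read events from Kafka and
# land them in a Delta table. Broker, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_to_delta").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .load()
)

# Kafka keys/values arrive as bytes; cast to strings before writing.
decoded = events.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

query = (
    decoded.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")   # placeholder path
    .outputMode("append")
    .start("/tmp/delta/events")                                # placeholder path
)

query.awaitTermination()
```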
orchestration (e.g., Airflow, dbt, Talend, AWS Glue, GCP Dataform/Cloud Composer)
• Proven ability to design, deploy, and optimize data warehouses and lakehouse architectures using technologies like BigQuery, Redshift, Snowflake, and Databricks
• Experience with Infrastructure as Code tools (e.g., Terraform, AWS CloudFormation, GCP Deployment Manager) for cloud resource provisioning and management
• Proficiency with CI/CD pipelines and DevOps practices
EMR, or Kafka
• Experience working on real-time data and streaming applications
• Experience with NoSQL implementation, including MongoDB or Cassandra
• Experience with data warehousing using AWS Redshift, MySQL, or Snowflake
• Experience with Agile engineering practices
• TS/SCI clearance with a polygraph
Clearance: Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for
tools and big data platforms.
• Knowledge of data modeling, replication, and query optimization.
• Hands-on experience with SQL and NoSQL databases is desirable.
• Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) would be beneficial.
Data Platform Management:
• Comfortable operating in hybrid environments (cloud and on-prem).
• Experience integrating diverse data sources and systems.
• Understanding of secure data transfer
s Degree in a relevant field (data science, analytics, mathematics, economics, business, statistics, engineering, or finance) or equivalent experience
• Experience or knowledge of cloud-based data analytics (e.g., AWS, Azure, Snowflake)
• Experience in medical data, including hospital, ancillary, and physician data
• Experience in Generative AI use cases leveraging LLMs, with exposure to frameworks and models such as LangChain, RAG, ChatGPT-4, and ChatGPT-4o
stakeholders, including business teams. Build relationships across the bank, establishing a strong peer network and helping to strengthen collaboration.
Skills and Experience (Essential):
• Advanced proficiency in databases - SQL Server or Snowflake
• Advanced experience with a low-code/no-code data engineering/ETL tool, preferably Markit EDM (v19.2 or above); similar tools such as Informatica Power Centre may be acceptable
Position: Lead ETL Developer with Snowflake. Location: Columbus, OH (Day 1 Onsite). Duration: Long Term. We are looking for a Lead ETL Developer in our Enterprise Data Warehouse. In this role you will be part of a team working to develop
Snowflake Developer
We're looking for a skilled and passionate Snowflake Developer to join our team in a permanent, full-time capacity. This is a hybrid role based in Manchester, requiring you to be in the office 3 days a week.
About the Role
Job Title: Snowflake Developer
Location: Manchester
Duration: 3 Days a Week
Job Type: Permanent/… FTE
As a Snowflake Developer, you'll be a key player in designing, developing, and maintaining our data solutions. You'll work with cutting-edge technologies, leveraging your expertise in Snowflake to build robust and scalable data pipelines and architectures. This role is perfect for someone who is a self-starter, a collaborator, and is eager to learn and adapt … to new technologies. We're seeking a positive, proactive, and pro-team individual who can manage relationships with stakeholders and product owners effectively.
Key Responsibilities and Requirements
Mandatory Skills
Snowflake & ANSI-SQL: Possess a deep understanding of Snowflake's architecture and internals, including roles, dynamic tables, streams, and tasks. You'll need excellent skills in writing complex SQL queries. Data
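To make the streams-and-tasks requirement above concrete, here is a minimal sketch using the Snowflake Python connector. The RAW_ORDERS and ORDERS_REPORTING tables, the TRANSFORM_WH warehouse, and the 15-minute schedule are assumptions for illustration only.

```python
# Minimal sketch of Snowflake streams and tasks driven from Python.
# Connection details, object names, and the schedule are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="TRANSFORM_WH",   # placeholder warehouse
    database="<database>",
    schema="<schema>",
)
cur = conn.cursor()

# Capture changes on a hypothetical RAW_ORDERS table.
cur.execute("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE RAW_ORDERS")

# A task that periodically folds captured inserts into a reporting table.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_ORDERS_TASK
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE  = '15 MINUTE'
    AS
      INSERT INTO ORDERS_REPORTING
      SELECT ORDER_ID, CUSTOMER_ID, ORDER_TOTAL
      FROM ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK LOAD_ORDERS_TASK RESUME")

cur.close()
conn.close()
```

Because the task's INSERT reads from the stream, each run consumes the captured changes and advances the stream offset, so only new changes are processed on the next run.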
be doing:
• Leading technical teams to design and deliver large-scale, end-to-end data solutions
• Architecting modern data platforms (data lakes, lakehouses, warehouses) with Databricks and related tools (Snowflake, Synapse, Azure Fabric)
• Applying data governance and master data management practices to ensure quality and compliance
• Advising clients on strategy, future-state architectures, and cloud migrations
• Driving collaboration, innovation, and