systems for operational decision support and analyzing unstructured data (e.g., text, imagery). Ability to architect and maintain scalable data lakes, data warehouses, or distributed storage systems (e.g., Delta Lake, Snowflake, Hadoop, or NoSQL solutions). Demonstrated understanding of data security, privacy, and sovereignty issues, particularly in military or international environments, ensuring compliance with NATO operational and ethical standards. Experience building visually …
Server. Familiarity with Azure Data Lake Storage (Gen2) and Azure Blob Storage. Knowledge of Power BI integration and data modelling. Understanding of Azure Functions and Logic Apps for automation. Snowflake: Strong SQL skills and experience with Snowflake's architecture (virtual warehouses, storage, cloud services). Proficiency in Snowflake Streams & Tasks for CDC and automation. Experience with Snowflake Secure Data Sharing … and Snowflake Marketplace. Familiarity with Snowpark for Python/Java-based transformations. Understanding of role-based access control, data masking, and time travel features. Databricks: Hands-on experience with Apache Spark and Databricks Runtime. Proficiency in Delta Lake for ACID-compliant data lakes. Experience with Structured Streaming and Auto Loader. Familiarity with MLflow, Feature Store, and Model Registry. Use of … across business and technology stakeholders. Financial Services industry background or experience as a bonus. Preparation and delivery of MI/BI reporting in Power BI or SAS a bonus. Snowflake experience a bonus. What we look for: Someone who is passionate about reaching their full potential and excelling in their career. Someone with energy, enthusiasm and courage who enjoys solving …
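The Streams & Tasks requirement above refers to Snowflake's change-data-capture mechanism: a stream records row-level inserts, updates, and deletes on a table, and a scheduled task applies them to a target (typically via MERGE). The apply step can be sketched in plain Python — an illustrative model only, not Snowflake's actual API; all table and field names here are invented:

```python
# Apply a batch of CDC change records to a target keyed by primary key.
# This mirrors what a Snowflake Task consuming a Stream does with MERGE.
def apply_changes(target: dict, changes: list) -> dict:
    """target: {pk: row}; changes: [{"op": "INSERT"|"UPDATE"|"DELETE", "pk": ..., "row": ...}]"""
    for change in changes:
        op, pk = change["op"], change["pk"]
        if op == "DELETE":
            target.pop(pk, None)   # remove the row if present
        else:                      # INSERT and UPDATE both upsert
            target[pk] = change["row"]
    return target

customers = {1: {"name": "Ada"}, 2: {"name": "Grace"}}
stream_batch = [
    {"op": "UPDATE", "pk": 1, "row": {"name": "Ada L."}},
    {"op": "DELETE", "pk": 2, "row": None},
    {"op": "INSERT", "pk": 3, "row": {"name": "Edsger"}},
]
customers = apply_changes(customers, stream_batch)
```

Because inserts and updates are both upserts, replaying the same batch twice leaves the target unchanged, which is the idempotency property a scheduled task relies on.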
Kubernetes). Data Management: Understanding of data modeling, metadata management, and data lineage. Experience implementing CI/CD pipelines for data workflows. Familiarity with modern storage and query engines (Snowflake, Redshift, BigQuery, Delta Lake). Soft Skills: Strong analytical and problem-solving abilities; ability to work with large, complex datasets. Excellent verbal and written communication skills; ability to explain technical …
and business stakeholders. Nice to have: • Exposure to Microsoft Fabric specifically. • Experience with distributed data processing (e.g. Spark). • Familiarity with equivalent cloud platforms (AWS, GCP). • Exposure to Snowflake or Databricks. How to apply: To apply simply complete a CV profile and submit your application. If shortlisted, one of our recruitment team will be in touch to arrange a …
London, South East, England, United Kingdom Hybrid / WFH Options
CV TECHNICAL LTD
Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience, with the ability to mentor, influence, and set technical …
backbone of great data analysis, and as such the data engineer is crucial to the success of the data department overall. TLA works with the modern data stack, utilising Snowflake for our data warehouse, dbt to transform data across our medallion architecture, and Apache Airflow for orchestration. Microsoft Azure is our choice of cloud provider for hosting infrastructure. Within the …
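The medallion architecture mentioned above refines data in layers: bronze holds raw records as landed, silver deduplicates and quality-checks them, and gold aggregates to the grain the business reports on. A minimal pure-Python sketch of the flow that dbt models typically implement (all field names and rules are illustrative):

```python
# Bronze: raw events as landed, including duplicates and bad rows.
bronze = [
    {"order_id": 1, "amount": "100.0", "country": "UK"},
    {"order_id": 1, "amount": "100.0", "country": "UK"},  # duplicate
    {"order_id": 2, "amount": None, "country": "UK"},     # fails quality check
    {"order_id": 3, "amount": "40.5", "country": "FR"},
]

# Silver: deduplicate on the key, drop rows failing checks, cast types.
seen = set()
silver = []
for row in bronze:
    if row["order_id"] in seen or row["amount"] is None:
        continue
    seen.add(row["order_id"])
    silver.append({**row, "amount": float(row["amount"])})

# Gold: aggregate to the reporting grain (revenue by country).
gold = {}
for row in silver:
    gold[row["country"]] = gold.get(row["country"], 0.0) + row["amount"]
```

In dbt each layer would be its own SQL model selecting from the previous one; the Python above only shows the data flow between layers.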
City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
level and translating requirements into solution blueprints, proposals, and presentations. Enterprise Solution Design – Architect end-to-end data and cloud solutions across platforms such as Azure, AWS, GCP, Databricks, Snowflake, Synapse, or Azure Fabric. Cloud Strategy & Adoption – Define and lead cloud migration, modernisation, and optimisation strategies using tools such as Terraform, Azure DevOps, and CI/CD pipelines. Data Products …
orchestration (e.g., Airflow, dbt, Talend, AWS Glue, GCP Dataform/Cloud Composer). Proven ability to design, deploy, and optimize data warehouses and lakehouse architectures using technologies like BigQuery, Redshift, Snowflake, and Databricks. Experience with Infrastructure as Code tools (e.g., Terraform, AWS CloudFormation, GCP Deployment Manager) for cloud resource provisioning and management. Proficiency with CI/CD pipelines and DevOps practices …
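Orchestrators such as Airflow and dbt resolve a DAG of task dependencies into a valid run order; the core mechanism is a topological sort. The sketch below uses Python's standard-library `graphlib` (3.9+); the task names are invented:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on — the same
# dependency model Airflow's `upstream >> downstream` edges build.
deps = {
    "extract": set(),
    "clean": {"extract"},
    "load_dim": {"clean"},
    "load_fact": {"clean"},
    "report": {"load_dim", "load_fact"},
}

# static_order() yields tasks so that every dependency runs first.
run_order = list(TopologicalSorter(deps).static_order())
```

Real orchestrators add scheduling, retries, and parallel execution of independent tasks (here `load_dim` and `load_fact`) on top of this ordering.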
tools and big data platforms. • Knowledge of data modeling, replication, and query optimization. • Hands-on experience with SQL and NoSQL databases is desirable. • Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) would be beneficial. Data Platform Management: • Comfortable operating in hybrid environments (cloud and on-prem). • Experience integrating diverse data sources and systems. • Understanding of secure data transfer …
stakeholders, including business teams. Build relationships across the bank, establishing a strong peer network and helping to strengthen collaboration. Skills and Experience: Essential: Advanced proficiency in databases - SQL Server or Snowflake. Advanced experience with a low-code/no-code data engineering/ETL tool, preferably Markit EDM (v19.2 or above); however, similar tools such as Informatica PowerCenter may be acceptable …
p/day. Key Responsibilities: Designing and maintaining scalable ETL/ELT pipelines. Integrating data from multiple sources into a centralised warehouse (SQL Server, PostgreSQL, or Snowflake). Working with Azure Data Factory and cloud-native tooling for data orchestration. Structuring data models to support Power BI dashboards and reports. Implementing data governance, quality frameworks, and security controls. …
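Pipelines of the kind described above typically gate loads behind validation: rows that pass quality checks reach the warehouse, while rejects are quarantined for review. A hedged sketch of that pattern in plain Python (the source rows and rules are invented):

```python
def validate(row: dict) -> bool:
    """Quality gate: required key present and amount parses as a non-negative number."""
    try:
        return row.get("id") is not None and float(row["amount"]) >= 0
    except (KeyError, TypeError, ValueError):
        return False

# Extracted rows as they might arrive from an upstream source.
extracted = [
    {"id": 1, "amount": "9.99"},
    {"id": None, "amount": "5.00"},  # rejected: missing key
    {"id": 3, "amount": "oops"},     # rejected: not numeric
]

# Split into rows to load and rows to quarantine.
loaded = [r for r in extracted if validate(r)]
rejected = [r for r in extracted if not validate(r)]
```

In a production pipeline the `rejected` list would land in a quarantine table with a reason code, so data-quality metrics can be reported alongside the load.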
in analytics engineering, data engineering, or closely related roles. Strong proficiency in SQL, Python, and dbt (strongly preferred). Hands-on experience with Azure Databricks and cloud-based data platforms (Snowflake experience also valued). Solid understanding of dimensional modelling, lakehouse/warehouse design, and the modern data stack. Familiarity with Git, CI/CD, and software engineering best practices. Experience with Power …
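Dimensional modelling, listed above, separates measures (facts) from descriptive attributes (dimensions); queries join a fact table to its dimensions via surrogate keys and roll up by dimension attributes. A toy star-schema rollup in plain Python (tables and values invented):

```python
# Dimension table: surrogate key -> descriptive attributes.
dim_product = {
    10: {"name": "Widget", "category": "Hardware"},
    11: {"name": "Gadget", "category": "Hardware"},
}

# Fact table at the order-line grain: foreign keys and measures only.
fact_sales = [
    {"product_key": 10, "qty": 2, "revenue": 20.0},
    {"product_key": 11, "qty": 1, "revenue": 15.0},
    {"product_key": 10, "qty": 1, "revenue": 10.0},
]

# Join fact to dimension and roll up revenue by category,
# as a BI tool issues against a star schema.
revenue_by_category = {}
for fact in fact_sales:
    category = dim_product[fact["product_key"]]["category"]
    revenue_by_category[category] = revenue_by_category.get(category, 0.0) + fact["revenue"]
```

Keeping attributes out of the fact table is what lets the (usually much larger) fact stay narrow while dimensions change independently.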
in analytics engineering, data engineering, or closely related roles Strong proficiency in SQL, Python, and dbt (strongly preferred) Hands-on experience with Azure Databricks and cloud-based data platforms (Snowflake experience also valued) Solid understanding of dimensional modelling, lakehouse/warehouse design, and modern data stack Familiarity with Git, CI/CD, and software engineering best practices Experience with Power More ❯
Understanding of Azure services (e.g., Azure Synapse, Azure Data Factory, Azure SQL). Knowledge of Power Automate and Power Apps integration with Power BI. Knowledge of non-Microsoft data sources (Snowflake, Oracle, PostgreSQL, MySQL, Synapse, BigQuery). Performance optimisation (partitioning, understanding of columnar database concepts, SQL indexing). Troubleshooting (SQL Profiler or other profiler tools, Tabular Editor, DAX Studio, Power BI datasets …
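The partitioning point above matters because a query engine can skip (prune) partitions whose key cannot match the filter, reading only a fraction of the data. The pruning idea in plain Python (the layout and data are invented):

```python
# Rows stored partitioned by year, as a warehouse would lay them out.
partitions = {
    2022: [{"id": 1}, {"id": 2}],
    2023: [{"id": 3}],
    2024: [{"id": 4}, {"id": 5}],
}

scanned_partitions = []  # track which partitions the "engine" actually read

def query(year_filter: int) -> list:
    """Return rows for one year, scanning only the matching partition."""
    out = []
    for year, rows in partitions.items():
        if year != year_filter:
            continue  # pruned: this partition is never read
        scanned_partitions.append(year)
        out.extend(rows)
    return out

result = query(2024)
```

Columnar storage gives a similar skip effect per column: a query touching two columns of a wide table never reads the others, which is why both techniques appear together in the requirement.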
EMR, or Kafka. Experience working on real-time data and streaming applications. Experience with NoSQL implementation, including MongoDB or Cassandra. Experience with data warehousing using AWS Redshift, MySQL, or Snowflake. Experience with Agile engineering practices. TS/SCI clearance with a polygraph. Clearance: Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for …
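Real-time streaming applications of the kind listed above commonly compute tumbling-window aggregates over a Kafka-style event stream. The windowing logic alone can be sketched in plain Python (event shape and window size are invented; real systems also handle late and out-of-order events):

```python
WINDOW_SECONDS = 60  # fixed (tumbling) window size

def window_counts(events):
    """Count events per tumbling window, keyed by the window's start time."""
    counts = {}
    for ts, _value in events:
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[window_start] = counts.get(window_start, 0) + 1
    return counts

# (timestamp_seconds, payload) pairs as they might arrive from a topic.
events = [(5, "a"), (30, "b"), (61, "c"), (119, "d"), (120, "e")]
counts = window_counts(events)
```

Each event lands in exactly one window, so window results can be emitted and persisted independently as the stream advances.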
s Degree in a relevant field (data science, analytics, mathematics, economics, business, statistics, engineering, or finance) or equivalent experience. Experience or knowledge of cloud-based data analytics (e.g., AWS, Azure, Snowflake). Experience in medical data including hospital, ancillary, and physician data. Experience in Generative AI use cases leveraging LLMs, with exposure to frameworks and models such as LangChain, RAG, ChatGPT-4 and ChatGPT-4o …
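The RAG (retrieval-augmented generation) pattern named above retrieves the documents most relevant to a question and places them in the model's prompt, so answers are grounded in source material rather than guessed. A toy keyword-overlap retriever shows the retrieval half; the corpus and scoring are invented, and real systems use embedding similarity instead of word overlap:

```python
# Tiny document store; production RAG would hold embedded chunks in a vector DB.
corpus = {
    "doc1": "claims processing for hospital inpatient admissions",
    "doc2": "physician fee schedules and billing codes",
    "doc3": "pipeline orchestration with airflow",
}

def retrieve(question: str, k: int = 1) -> list:
    """Rank documents by word overlap with the question; return top-k ids."""
    q_words = set(question.lower().split())
    scored = sorted(
        corpus.items(),
        key=lambda item: len(q_words & set(item[1].split())),
        reverse=True,
    )
    return [doc_id for doc_id, _text in scored[:k]]

top = retrieve("how are hospital admissions processed", k=1)
# The retrieved text is then prepended to the LLM prompt as grounding context.
prompt = f"Answer using this context: {corpus[top[0]]}"
```

Only the retrieval step is shown; the generation step would pass `prompt` to an LLM API.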
time streaming architectures, to support advanced analytics, AI, and business intelligence use cases. Proven experience in designing architectures for structured, semi-structured, and unstructured data, leveraging technologies like Databricks, Snowflake, Apache Kafka, and Delta Lake to enable seamless data processing and analytics. Hands-on experience in data integration, including designing and optimising data pipelines (batch and streaming) and integrating cloud …
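Handling the semi-structured data mentioned above usually means flattening nested records into flat, columnar rows before analytics (Snowflake's `FLATTEN` and Spark's schema inference serve the same purpose). A small recursive flattener in plain Python, with an invented record shape:

```python
def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten nested dicts into a single level with dot-separated column names."""
    out = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, prefix=f"{name}."))  # recurse into nesting
        else:
            out[name] = value
    return out

# A nested JSON-style event as it might land from an API or message queue.
event = {"id": 7, "user": {"name": "Ada", "geo": {"country": "UK"}}}
row = flatten(event)
```

The dot-separated names map directly onto warehouse column names, which is why this shape is the usual handoff point between ingestion and SQL analytics.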