City of London, London, United Kingdom – Hybrid / WFH Options
twentyAI
data capability and building a new data platform.

About the Role: You will be part of a diverse group of engineers and machine learning experts, working with cutting-edge Azure cloud technologies, including Microsoft Fabric and related services. Your mission is to design reliable, efficient data pipelines that enable the business to access trusted, well-structured data. You will … also focus on building and scaling the core data infrastructure that supports advanced analytics and machine learning efforts across the business.

Responsibilities: Design and develop end-to-end data pipelines that ingest, transform, and prepare data for analytics and machine learning workflows. Work with Infrastructure as Code, primarily Terraform, to automate and manage cloud infrastructure, enabling repeatable and … scientists, MLEs, and business teams in an agile environment to deliver data solutions that support key firm initiatives. Build scalable and efficient batch and streaming data workflows within the Azure ecosystem. Apply distributed processing techniques using Apache Spark to handle large datasets effectively. Help drive improvements in data quality, implementing validation, cleansing, and monitoring frameworks. Contribute to the firm …
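To make the pipeline work described above concrete, here is a minimal PySpark batch sketch of an ingest–transform–prepare flow. It is illustrative only and not taken from the employer's stack: the paths, column names, and table layout are hypothetical assumptions.

```python
# Minimal PySpark sketch: ingest raw files, apply simple transformations,
# and write an analytics-ready table. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-batch-pipeline").getOrCreate()

# Ingest: read raw CSV files from a landing area (hypothetical path).
raw = spark.read.option("header", True).csv("/landing/transactions/*.csv")

# Transform: enforce types, drop invalid rows, and stamp the load date.
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Prepare for analytics/ML: write a partitioned, columnar dataset.
curated.write.mode("overwrite").partitionBy("load_date").parquet("/curated/transactions")

spark.stop()
```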
We are looking for a skilled Senior Cloud and Data Solution Architect in the UK (based out of London) with a strong delivery and solutions background in Data Analytics and Cloud. The ideal candidate should have expertise in next-gen data technologies such as Microsoft, AWS, GCP, Snowflake, and Databricks. Sales, delivery, presales, and solutioning experience is mandatory within … Data Analytics, along with the ability to deliver cloud-based enterprise platforms; data consulting skills are a must. Responsibilities: 16-18+ years of total experience in DWBI, Big Data, and cloud technologies. Implementation and hands-on experience in at least two of the following cloud technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience on at … least 2 Hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc. Excellent consulting experience and ability to …
Responsibilities: 16-18+ years of total experience in DWBI, Big Data, and cloud technologies. Implementation and hands-on experience in at least two of the following cloud technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam … or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc. Excellent consulting experience and the ability to design and build solutions and actively contribute to RFP responses. Ability to be a SPOC for all technical discussions across industry … communication skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor the architects. Required/Mandatory Skills (at least 2 Hyperscalers): GCP, AWS, Azure, Big Data, Apache Spark/Beam, BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Preferred Skills …
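As a rough illustration of the Dataflow/Airflow/ADF-style orchestration named above, the sketch below shows a minimal Apache Airflow DAG chaining an ingest, transform, and publish step. It is a generic example under assumed names and an assumed Airflow 2.x environment, not a reference to any specific platform in the role.

```python
# Illustrative Airflow DAG: a daily ingest -> transform -> publish flow.
# Task bodies are stubbed; DAG id, schedule, and task names are hypothetical.
# Assumes Airflow 2.4+ (the `schedule` argument and this operator import path).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    # e.g. land files from object storage or consume a Kafka / Pub/Sub topic
    print("ingesting raw data")


def transform():
    # e.g. trigger a Spark or Beam job that cleans and models the data
    print("transforming data")


def publish():
    # e.g. load curated tables into BigQuery / Redshift / Synapse
    print("publishing curated tables")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    publish_task = PythonOperator(task_id="publish", python_callable=publish)

    ingest_task >> transform_task >> publish_task
```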
low-carbon energy in the UK. As our Data Engineering Lead, you'll be responsible for designing and leading the development of a secure, scalable data platform on Microsoft Azure, helping turn complex construction data into actionable insight. You'll manage the full data lifecycle - from ingestion and transformation to warehousing and modelling - all while ensuring data quality, integrity … for live environments. To succeed in this role, you'll bring a blend of strong technical expertise and people leadership. We're looking for someone with experience in: Microsoft Azure data services (Data Factory, Databricks, Azure SQL, Synapse); SQL, NoSQL, and API-based data pipelines; Agile methodologies such as Scrum or Kanban; leading engineering teams in complex …
Which statement best describes your hands-on responsibility for architecting and tuning cloud-native data lake/warehouse solutions (e.g., AWS S3 + Glue/Redshift, GCP BigQuery, Azure Synapse)? What best reflects your experience building ETL/ELT workflows with Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines (Docker, GitHub …
This fantastic institution located in SW London is seeking a dynamic and versatile Data Engineer/Analytics Engineer to contribute to the development, enhancement, and management of their data systems, pipelines, and reporting infrastructure. This role will focus on designing and maintaining efficient data pipelines while building a scalable data warehouse architecture. A major aspect of this position will … involve integrating new data sources to create a unified Single Customer View. Additionally, collaboration with the Analytics and Insights teams is essential to ensure they have seamless access to the necessary data, as well as assisting in optimizing SQL reporting processes. The ideal candidate will: Possess a strong curiosity and passion for exploring, transforming, and integrating data from diverse … multiple data sources, automating the transition from UAT to Production. Maintain and evolve the Single Customer View to consolidate data from various sources. Demonstrate hands-on experience with Microsoft Azure technologies (including Synapse Analytics, Data Lake, Azure SQL), as well as familiarity with DevOps and GitHub practices. Contract Details: Initially a Fixed Term Contract, likely transitioning to …
A prestigious organization based in London is seeking a dynamic and versatile Data Engineer/Analytics Engineer to contribute to the development, enhancement, and management of their data systems, pipelines, and reporting infrastructure. This role will focus on designing and maintaining efficient data pipelines while building a scalable data warehouse architecture. A major aspect of this position will involve … integrating new data sources to create a unified Single Customer View. Additionally, collaboration with the Analytics and Insights teams is essential to ensure they have seamless access to the necessary data, as well as assisting in optimizing SQL reporting processes. The ideal candidate will: Possess a strong curiosity and passion for exploring, transforming, and integrating data from diverse sources. … multiple data sources, automating the transition from UAT to Production. Maintain and evolve the Single Customer View to consolidate data from various sources. Demonstrate hands-on experience with Microsoft Azure technologies (including Synapse Analytics, Data Lake, Azure SQL), as well as familiarity with DevOps and GitHub practices. This role could accommodate 1 visit a month/fully …
mission to help our customers connect data, software, and purpose to create extraordinary outcomes. You could say we are a digital transformation business. We specialize in software product development, analytics, data science, IoT solutions, machine learning, DevOps optimization, and modernization of applications, data, and platforms. We work with incredible clients in all types of industries such as smart home … sense of belonging by facilitating easy access to the office. Join us in shaping a workplace where proximity enhances collaboration while inclusivity strengthens our community. We are a Microsoft Azure-focused consultancy and services company, specializing in App Innovation, Data, AI, and Cloud Infrastructure solutions. Recognized as a Fabric Featured Partner by Microsoft, we build and deploy data platforms … for our customers on Fabric, Databricks, and Synapse Analytics, offering the opportunity to shape cutting-edge data initiatives for a wide range of clients. About the role: Our Data Architects play a crucial role in shaping and optimising our customers' data landscape, ensuring robust, scalable, governed solutions built on the latest Azure technology using best-practice process …
way that is consistent with achieving good outcomes for consumers; and to comply with the FCA and PRA's Conduct Rules. Key Responsibilities: The development of data ingest, transformation, analytics, and data publishing pipelines, facilitating complex data transformations to meet business requirements and ensuring optimal performance and efficiency of the data platform. Support for the live platform day to day … Manager. Person Specification – Knowledge/Experience/Skills: Strong communicator with both technical and non-technical communities; experience of mentoring less-experienced developers; significant hands-on experience with the Azure Data Stack, critically ADF and Synapse (experience with Microsoft Fabric is a plus); highly developed Python and data pipeline development knowledge, which must include substantial PySpark experience; demonstrable DevOps … of data pipeline testing, including automated testing, data validation, and code assurance; demonstrable experience of working within Agile delivery projects; an understanding of data formats for ingest, transformation, and analytics, data security, access control and authorisation, GDPR, data privacy, and information security; awareness of data models in a Medallion Architecture; experience building semantic, metric, or analytic models; experience of …
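To give a flavour of the validation and Medallion-style layering described above, here is a minimal PySpark sketch that promotes records from a bronze (raw) layer to a silver (validated) layer and quarantines failures. The lake paths, column names, and quality rules are hypothetical assumptions, not the firm's actual pipeline.

```python
# Hypothetical PySpark sketch: promote bronze (raw) records to a silver
# (validated) layer, quarantining rows that fail basic data-quality rules.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

bronze = spark.read.parquet("/lake/bronze/policies")  # hypothetical path

# Null-safe validation rule: mandatory key and a sane date range.
valid = (
    F.col("policy_id").isNotNull()
    & F.col("start_date").isNotNull()
    & F.col("end_date").isNotNull()
    & (F.col("start_date") <= F.col("end_date"))
)

silver = bronze.filter(valid)
quarantine = bronze.filter(~valid)

# Validated rows go to silver; failures go to a quarantine area for monitoring.
silver.write.mode("append").parquet("/lake/silver/policies")
quarantine.write.mode("append").parquet("/lake/quarantine/policies")

# A crude quality metric that could feed a monitoring dashboard or alert.
print(f"rejected {quarantine.count()} of {bronze.count()} rows")

spark.stop()
```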
investment in data initiatives; excellent career development opportunities within a growing team; flexible, modern working environment. What you’ll do: Design, build, and optimise SQL Server databases to support analytics and operational workloads; write high-quality, efficient T-SQL procedures, views, and functions; develop ETL processes and support the integration of data from core insurance systems; work closely with … relational data modelling and data warehousing principles; experience working with business users, analysts, and other developers in a collaborative setup. Nice to have: Exposure to cloud data environments (e.g. Azure SQL, Synapse, Snowflake); familiarity with ETL tooling, SSIS, or modern orchestration platforms; understanding of how database structures support BI and reporting tools (e.g. Power BI). This is a …
with data scientists and stakeholders. Follow CI/CD and code best practices (Git, testing, reviews). Tech Stack & Experience: Strong Python (Pandas), PySpark, and SQL skills; cloud data tools (Azure Data Factory, Synapse, Databricks, etc.); data integration experience across formats and platforms; strong communication and data literacy. Nice to Have: Commodities/trading background; experience with data lakes …
Job Title: Solutions Architect (Data Analytics) – Pre-sales, RFP creation. Location: London (3 days/week onsite). Duration: Permanent. Responsibilities: 16-18+ years of total experience in DWBI, Big Data, and cloud technologies. Implementation and hands-on experience in at least two of the following cloud technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have hands-on experience on at least … Hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF, etc. Excellent consulting experience and the ability to design and … and verbal communication skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills (at least 2 Hyperscalers): GCP, AWS, Azure, Big Data, Apache Spark/Beam, BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable Skills …
innovation, collaboration, recognition and inclusivity, and offer a wide range of benefits to support staff wellbeing. Your Future Starts Here. PURPOSE OF JOB: We’re looking for an experienced Azure Data & AI Engineer with a strong focus on advanced analytics, machine learning, and applied AI - particularly Generative AI. This role will suit a technically capable professional who combines … real-world AI/ML solution delivery experience with a solid understanding of the Azure ecosystem. The ideal candidate will bring hands-on expertise in designing and building AI-driven solutions using Azure-native tools and frameworks such as Azure OpenAI, Prompt Flow, Semantic Kernel, Azure AI SDKs (including azure-ai-projects, azure-ai … inference), and open-source frameworks like LangChain. Familiarity with traditional Azure AI services (e.g. Document Intelligence, Vision, Language, and Speech) is expected, alongside a developer-oriented, code-first mindset and experience integrating AI workloads into secure, scalable Azure environments. The role requires deep platform knowledge - especially across Azure security, identity, and networking (e.g., IAM, Private Endpoints …
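For a code-first flavour of the Azure OpenAI work mentioned above, here is a minimal Python sketch using the OpenAI SDK's Azure client. The endpoint, API version, deployment name, and environment variable are placeholders and assumptions; the posting does not prescribe this particular SDK or pattern.

```python
# Minimal sketch of calling an Azure OpenAI chat deployment via the openai
# Python SDK (>= 1.0). Endpoint, API version, and deployment name are
# placeholders and must be replaced with real values for your resource.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key=os.environ["AZURE_OPENAI_API_KEY"],                 # assumed env var
    api_version="2024-02-01",                                   # example version
)

response = client.chat.completions.create(
    model="<your-gpt-deployment>",  # the *deployment* name, not the model family
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarise what a data pipeline does."},
    ],
)

print(response.choices[0].message.content)
```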
data projects to improve speed, reduce cost, and maximize value. Contribute to internal communities of excellence and practice (CoE, CoP). Skills Needed: Experience with cloud platforms including AWS, Azure, or SAP; ETL/ELT Development; Data Modeling; Data Integration & Ingestion; Data Manipulation & Processing; Version Control & DevOps: skilled in GitHub, GitHub Actions, Azure DevOps; Azure Data Factory … Databricks, SQL DB, Synapse, Stream Analytics; Glue, Airflow, Kinesis, Redshift; SonarQube, PyTest. If you're ready to take on a new challenge and shape data engineering in a trading-first environment, submit your CV today to be considered.