Enterprise Data Architect/Senior Solution Data Architect/Head of Data Architecture

The role is responsible for shaping, governing, and enabling data-centric solution design across the client's global technology landscape. The role ensures that all data and analytics solutions align with enterprise architecture principles, support the North … and governed data. Working across programmes such as UNIFY (SAP S/4HANA), Integrated Supply Chain (Blue Yonder), and Sales Excellence (Salesforce CGC), the role bridges business strategy, data architecture, and solution delivery to ensure a unified enterprise data fabric and analytics capability.

Key Accountabilities

Solution Leadership
- Lead end-to-end data solution … of data modelling, data integration, metadata management, and data governance.
- Experience with modern data platforms such as Azure Data Lake, Databricks, Power BI, and SAP BTP.
- Solid grasp of enterprise integration patterns (APIs, streaming, ETL/ELT, event-driven architectures).
- Ability to translate complex data concepts
Location - London, Bristol or Manchester (1 day a month onsite)
Duration - 6 months
Rate - £550 - £600pd (Inside IR35)

As a Data Engineer in the Cyber and Domains Protection Team you will:
- Work within an Agile team to support the development of dashboards and build automated reports to meet the needs of technical and non-technical users
- Work with … the data analyst and user researcher to update relevant data models to allow business intelligence data to meet the organisation's specific needs
- Develop business intelligence reports that can be automated, reused and shared with users directly
- Implement data flows to connect operational systems, data for analytics and business intelligence
… like PostgreSQL, MySQL, or similar
Cloud data ecosystem (AWS): hands-on experience with core AWS data services. Key services include:
- S3 for data lake storage
- AWS Glue for ETL and data cataloguing
- Amazon Redshift or Athena for data warehousing and analytics
- Lambda for event-driven data processing
City Of Westminster, London, United Kingdom Hybrid / WFH Options
Additional Resources
An opportunity has arisen for a Senior Data Engineer to join a well-established biotech company using large-scale genetic data and AI to predict disease risk and advance precision healthcare. As a Senior Data Engineer, you will be responsible for developing, automating, and optimising scalable data pipelines using modern cloud technologies. … with hybrid/remote working options, offering a rate of £500 - £650 per day (Inside IR35) and benefits.

You Will Be Responsible For:
- Designing and implementing cloud-based data architectures using Azure services.
- Building robust and scalable data pipelines to support complex, high-volume processing.
- Deploying and managing containerised workloads through Kubernetes, Helm, and Docker.
- Automating … Code tools (Terraform, Ansible).
- Hands-on experience with PostgreSQL and familiarity with lakehouse technologies (e.g. Apache Parquet, Delta Tables).
- Exposure to Spark, Databricks, and data lake/lakehouse environments.
- Understanding of Agile development methods, CI/CD pipelines, GitHub, and automated testing.
- Practical experience monitoring live services using tools such as Grafana, Prometheus, or New