etc.) Develop logical data models that unify disparate data sources into a single framework. Align new data structures with an existing legacy GCP lake that integrates with SharePoint. Collaborate with the multiple teams who will own the High-Level Design (HLD) and deployment activities. Data Implementation & Integration: Lead the ingestion and transformation of data into the new centralized Data Lake on Google Cloud Platform (GCP). Work alongside the team responsible for Low-Level Design (LLD) and implementation. Integrate and manage data flow using Cloud Data Fusion, ingesting into BigQuery for analytics and reporting. Ensure smooth integration of new data sources with Snowflake, supporting expanded business intelligence capabilities. Leverage tools such as the client's Analytics Product and ensure compatibility with the Neuron-compatible Data Hub. Data Lake & Cloud Strategy: Design scalable, cloud-native data pipelines in GCP. Work closely with cloud architects to ensure solutions are aligned with enterprise cloud strategy. Ensure all integrations meet performance …
working on high-impact projects for clients across sectors, delivering intuitive, visually engaging tools that connect directly to data platforms like Microsoft Fabric's Data Lake, Power BI, Qlik Sense, SQL Server, Airtable, and Databricks. The role calls for someone equally comfortable coding clean, scalable applications as working with APIs, shaping UI/UX … that deliver real business value. RESPONSIBILITIES Develop, test, and deploy custom web applications and dashboards, connecting to multiple enterprise data sources (e.g., Microsoft Fabric Data Lake, Power BI, Qlik Sense, SQL Server, Airtable, Databricks) Build advanced data visualisations using D3.js and other JavaScript libraries to create highly interactive, responsive interfaces Integrate applications with APIs … development and integration (REST, GraphQL) Familiarity with database technologies and query languages (SQL, NoSQL) Experience working with data sources such as Microsoft Fabric's Data Lake, Power BI, Qlik Sense, Airtable, and Databricks Good grasp of DevOps practices, including CI/CD pipelines and code management workflows Cloud deployment experience with Azure and/or AWS …
Data Modeler x 2 + 3-6 month + contracts available + Fully remote - occasional trip into London office + £750-£850 per day - Inside IR35 Key Skills: + Data Transformation & ETL - QLIK + AWS, other Cloud Data Modeler - London Markets We're looking for a dedicated Data Modeler to join a … on contract role (not an Architect, not a Developer-who-dabbles, not a Manager-in-hiding). You'll be laser-focused on designing and delivering enterprise-grade data models across warehouses, lakes, and vaults. The Role You'll be responsible for: Designing and developing conceptual, logical, and physical data models. Implementing models across RDBMS … ODS, data marts, and data lakes (SQL/NoSQL). Translating business needs into robust, long-term data models. Applying Data Vault 2.0 techniques in real-world implementations. Supporting metadata management, governance, and best practices with tools like Erwin or ER/Studio. What We're Looking For (Must-Haves) 5+ …
Data Engineer - Power BI Inside IR35 Fully Remote £350-£400 per day Initial 2 month contract with chance of extension We are looking for an experienced and solutions-focused Data Engineer with strong Power BI expertise. The role includes leading the delivery of scalable, high-quality data pipelines and analytics platforms that power business intelligence … autonomy in contributing to architectural decisions and engineering standards, with a key focus on Power BI implementation, DAX development, and dashboard optimisation. It suits someone capable of owning data engineering and reporting workstreams, mentoring junior staff, and aligning technical output with strategic business goals. Your Responsibilities · Design, develop, and maintain production-grade data pipelines that support … Strong programming skills in Python. · Exposure to orchestration tools (Airflow, dbt, Dataform) and integration with Power BI. · Experience with Azure-based data services (Azure Data Lake, Synapse, Data Factory) and their integration with Power BI. · Knowledge of data modelling techniques including star/snowflake schema design for BI solutions. · Understanding of …
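The star/snowflake schema design this advert asks for can be sketched in a few lines. This is a minimal illustration using Python's built-in SQLite — a hypothetical sales mart (table and column names are invented for the example, not taken from the role):

```python
import sqlite3

# Star schema sketch: one fact table keyed by surrogate IDs
# that reference two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    quantity    INTEGER,
    amount      REAL
);
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", 2024), (20240102, "2024-01-02", 2024)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 20240101, 3, 30.0), (2, 20240101, 1, 50.0), (1, 20240102, 2, 20.0)])

# Typical BI query: aggregate the fact table, slicing by a dimension attribute.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category
""").fetchall()
print(rows)  # [('Hardware', 100.0)]
```

The same shape (a central fact table joined to conformed dimensions) is what a Power BI model built on this schema would consume.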
Contract Opportunity: Lead Azure Data Engineer (Remote - £500/day Outside IR35) We're hiring for a Lead Azure Data Engineer to join our team on a hybrid contract based in London, supporting key finance stakeholders and transforming our data platform. This role offers the chance to shape the future of financial reporting through … cutting-edge cloud engineering and data architecture. Location: Central London (UK-based candidates only) Rate: £500/day IR35 Status: Outside IR35 Start Date: ASAP - Interviews next week Duration: 6 months with potential for extension The Role You'll lead the design and delivery of scalable data products that support financial analysis and decision-making. Working … Essential Skills Azure Databricks: Expert in Spark (SQL, PySpark), Databricks Workflows Data Pipeline Design: Proven experience in scalable ETL/ELT development Azure Services: Data Lake, Blob Storage, Synapse Data Governance: Unity Catalog, access control, metadata management Performance Tuning: Partitioning, caching, Spark job optimization Cloud Architecture: Infrastructure-as-code, monitoring, automation Finance Domain …
Job Description: We are seeking a highly skilled Senior Data Engineer with strong expertise in Azure cloud technologies and ETL processes. The ideal candidate will have a deep understanding of data architecture, data modeling, and data integration to support scalable and high-performance data solutions. Key Responsibilities: Design and implement end-to-end Azure-based data solutions using services such as Azure SQL Database, Data Factory, Synapse Analytics, Data Lake, DevOps, and Storage. Develop and optimize ETL processes using Informatica for data integration and migration across UNIX and Windows environments.
Data Analyst Inside IR35 Remote Start Date: 03/09/2025 End Date: 31/12/2025 Hours per week: 37.5 hours Max Number of days: 5 days per week Day Rate: £555.56 Vetting Required: BPSS Role Overview My client is seeking a Senior Data Analyst to join the Data, AI and … Advanced Analytics Team. Reporting to the Principal Data Analyst, this is a senior role responsible for bridging business requirements with technical data solutions. The role involves working closely with business users, subject matter experts, project managers, technical teams, and end users to deliver high-quality data analysis and insights. Key Responsibilities * Conduct in-depth … data analysis to uncover trends, patterns, and actionable insights. * Assist in forecasting and predictive modeling to support strategic planning. * Translate business questions into analytical frameworks and data queries. * Understand and document complex business processes across multiple enterprise applications. * Collaborate with cross-functional teams and key stakeholders to drive clarity and effective collaboration. Qualifications & Experience Professional * 10+ …
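The "forecasting and predictive modeling" responsibility above can be as simple as a trailing moving average. The sketch below is an illustrative baseline only (the series and window are invented for the example), not a method specified by the role:

```python
def moving_average_forecast(series, window=3):
    """Naive forecast: next value = mean of the last `window` observations."""
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window

# Hypothetical monthly order counts.
monthly_orders = [120, 130, 125, 140, 150, 145]
forecast = moving_average_forecast(monthly_orders, window=3)
print(round(forecast, 1))  # 145.0
```

In practice an analyst would compare such a baseline against richer models (seasonal decomposition, regression) before using it for strategic planning.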
Belfast, County Antrim, United Kingdom Hybrid / WFH Options
Adecco
Business Data Analyst Contract Contract End Date: June 30, 2026 Daily Rate: £300 - £350 (inside IR35 via umbrella) Location: BELFAST (3 days on site, non-negotiable) Hybrid Working Environment. Are you an experienced Business Data Analyst looking for your next challenge? Join our client's Markets Transformation Team, where you will play a pivotal role in … executing technical data-centric deliverables that drive business success. About the Role As a Business Data Analyst, you will leverage your expertise in data analysis, solution … design, and project management to lead critical technical projects. Your primary focus will be on delivering Front Office Critical Data Elements into a central Data Lake, ensuring that all requirements are met, and projects are delivered on time. Primary Responsibilities: Lead technical projects to implement Front Office Critical Data Elements into a centralised …
SAP Finance Data & Analytics Specialist (Interim State Management) Luton - 12 months - outside IR35 - £580 per day The SAP Finance Data & Analytics Specialist (Interim State Management) will be responsible for ‘keeping the lights on’ for all the critical Finance global & local reporting solutions. Apply data, analytics, process, and object expertise to support interim state and … S/4 design, acting as an integral part of the Data and Analytics design team, considering both business process and analytics requirements. The SAP Finance Data & Analytics Specialist (Interim State Management) will be managing 7 non-SAP analytics applications and responding to changes driven by the ongoing S/4 Transformation. Essential Skills – 15+ years of experience … S/4HANA and ECC in the FI/CO module functionalities. Experience and knowledge of Analytics tools and Data Warehouses like PowerBI, Qlik, Azure Data Lake, Snowflake, SAP B4H, SAP Analytics Cloud etc. SAP Finance Data & Analytics Specialist (Interim State Management) Osirian Consulting is committed to working with our clients to promote equality …
will be initially 6 months but likely to extend, paying £525 per day In order to be considered for this role you will need to have - Experience with data … management and data architectures (SQL/NoSQL), database systems, modern data warehouse platforms (Snowflake, Databricks, BigQuery), ETL/ELT pipeline development, and data lake/warehouse implementations for integrating structured and unstructured laboratory and sequencing data sources. Mastery of Full-Stack technologies, such as Python, JavaScript/TypeScript, Cloud (preferably AWS), SQL …
Your new company We are currently collaborating with one of the largest global pharmaceutical companies to recruit an SAP Finance Data & Analytics Specialist on a contract basis. This organisation is currently undergoing significant change and growth projects, and to effectively manage these transitions, they are seeking a qualified finance professional with experience in a SAP S/… role Duration: 12 Months Hybrid Working: 2 days per week on-site, 3 days per week remote Outside IR35 [Ltd Comp] or Inside IR35 [Umbrella] As an SAP Finance Data & Analytics Specialist, you will play a pivotal role in ensuring the continuity of critical Finance reporting solutions while supporting the design and implementation of the S/4 HANA … S/4 HANA vs ECC differences in FI/CO modules. Experience with analytics tools and data platforms such as PowerBI, Qlik, Azure Data Lake, Snowflake, SAP B4H, and SAP Analytics Cloud. Bachelor's or Master's degree in Finance or Accounting. What you'll get in return The role offers a competitive day …
Employment Type: Contract
Rate: £475 - £575 per day (up to £575 daily rate, in or outside IR35 scope)
Data Quality Analyst Hybrid - two days per week in the office (Luton) Are you ready to play a key role in transforming data quality as part of a major ERP implementation? We're seeking a Data Quality Analyst to support the transition to a modern Enterprise Resource Planning (ERP) system. This role is critical … in improving data accuracy, governance, and overall decision-making across key finance functions, including Record to Report (R2R), Order to Cash (O2C), Source to Pay (S2P), and Financial Planning & Analytics (FP&A). What you'll be doing: Define and document critical data elements in collaboration with business stakeholders. Develop and apply data quality … frameworks, profiling tools, and dashboards. Perform data cleansing and validation to support ERP migration activities. Identify inconsistencies and risks through detailed data profiling and analysis. Work cross-functionally with Finance, IT, and other teams to align data requirements and implement governance best practices. Support continuous improvement initiatives to enhance data processes and …
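Data profiling of the kind this role describes — completeness and uniqueness checks ahead of an ERP migration — can be sketched in a few lines. The record fields and rules below are invented for illustration, not taken from the advert:

```python
def profile_records(records, key_field, required_fields):
    """Flag duplicate keys and missing required values; return (row_index, issue) pairs."""
    issues = []
    seen = set()
    for i, rec in enumerate(records):
        key = rec.get(key_field)
        if key in seen:
            issues.append((i, f"duplicate {key_field}: {key}"))
        seen.add(key)
        for field in required_fields:
            if not rec.get(field):
                issues.append((i, f"missing {field}"))
    return issues

# Hypothetical invoice extract with one gap and one duplicate key.
invoices = [
    {"invoice_no": "INV-1", "supplier": "Acme",   "amount": "100.00"},
    {"invoice_no": "INV-2", "supplier": "",       "amount": "55.50"},
    {"invoice_no": "INV-1", "supplier": "Globex", "amount": "70.00"},
]
issues = profile_records(invoices, "invoice_no", ["supplier", "amount"])
print(issues)  # [(1, 'missing supplier'), (2, 'duplicate invoice_no: INV-1')]
```

Real profiling tools add type, range, and referential checks, but the output shape — row-level findings a stakeholder can action — is the same.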
London, South East, England, United Kingdom Hybrid / WFH Options
Oscar Technology
reporting. Consolidate multiple customer-specific reports into more efficient, scalable solutions. Support integration of Power BI with Fabric (experience here highly beneficial). Work with Azure Data Lake and the wider Azure ecosystem for data sourcing and performance improvements. Provide best practice guidance for managing a full reporting suite in a SaaS environment. What We're … APIs and driving automation. Background in scaling BI solutions across multiple customers/environments. Familiarity with Fabric (hands-on experience a big plus). Experience with Azure and Data Lakes. Someone who has managed the full lifecycle of reporting suites, including integrations. If this sounds like you and you're looking for your next contract - apply now! Power BI … possibility of extension) | ASAP Oscar Associates (UK) Limited is acting as an Employment Business in relation to this vacancy. To understand more about what we do with your data please review our privacy policy in the privacy section of the Oscar website.
with a strong focus on Azure-based architecture, Machine Learning, and MLOps. Deep experience with Azure services, especially Azure Machine Learning, Azure Kubernetes Service (AKS), Azure Data Lake, and Azure Synapse. Hands-on experience with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn. Strong understanding of MLOps concepts, including continuous integration/continuous deployment (CI … ML, model versioning, monitoring, and retraining. Proficiency with scripting and programming languages (Python, R, SQL, etc.). Experience with containerization (Docker) and orchestration (Kubernetes) for ML models. Knowledge of data engineering.
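The model versioning concept listed above amounts to keeping every trained artifact addressable by name and version. This toy in-memory registry is an illustration of the idea only — real platforms such as Azure Machine Learning provide this as a managed service, and all names here are invented:

```python
class ModelRegistry:
    """Toy model registry: each register() call creates a new immutable version."""

    def __init__(self):
        self._versions = {}

    def register(self, name, artifact, metrics):
        # Append a new version record; versions are 1-based and never overwritten.
        self._versions.setdefault(name, []).append(
            {"artifact": artifact, "metrics": metrics}
        )
        return len(self._versions[name])

    def latest(self, name):
        return self._versions[name][-1]

registry = ModelRegistry()
v1 = registry.register("churn", artifact="weights-v1.bin", metrics={"auc": 0.81})
v2 = registry.register("churn", artifact="weights-v2.bin", metrics={"auc": 0.84})
print(v2, registry.latest("churn")["metrics"]["auc"])  # 2 0.84
```

Retraining pipelines then compare the candidate's metrics against `latest()` before promoting a new version — the monitoring/retraining loop the advert refers to.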
Alexander Mann Solutions - Public Sector Resourcing
in XSIAM correlation/playbooks. Excellent communication skills, able to articulate technical concepts to diverse audiences. Technical requirements: Proven experience with Cortex XDR, Cortex Data Lake, and Cortex XSOAR. Solid understanding of cloud security, network/system security fundamentals, and scripting for automation. Desirable experience: Familiarity with compliance and security standards (GDPR, HIPAA …
via Docker/Kubernetes and integrate with orchestration systems (e.g., Airflow, custom schedulers). Work with platform engineers to embed Spark jobs into InfoSum's platform APIs and data pipelines. Troubleshoot job failures, memory and resource issues, and execution anomalies across various runtime environments. Optimize Spark job performance and advise on best practices to reduce cloud compute and … In-depth knowledge of AWS Glue, including job authoring, triggers, and cost-aware configuration. Familiarity with distributed data formats (Parquet, Avro), data lakes (Iceberg, Delta Lake), and cloud storage systems (S3, GCS, Azure Blob). Hands-on experience with Docker, Kubernetes, and CI/CD pipelines. Strong documentation and communication skills, with the ability to support and coach internal teams. Key Indicators of Success: Spark jobs are performant, fault-tolerant, and integrated into InfoSum's platform with minimal overhead. Cost of running data processing workloads is optimized across cloud environments. Engineering teams are equipped with best practices for writing, deploying, and monitoring Spark workloads. Operational issues are rapidly identified and resolved, with root …
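Making jobs "fault-tolerant" against transient failures (lost executors, throttled APIs) is often handled with retry-and-backoff around submission. The sketch below is a generic pattern, not InfoSum's actual API — the submitter and error type are invented for illustration:

```python
import time

def submit_with_retries(submit, max_attempts=4, base_delay=1.0, sleep=time.sleep):
    """Call a flaky job-submission function, retrying with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return submit()
        except RuntimeError:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the failure to the scheduler
            sleep(base_delay * 2 ** (attempt - 1))  # 1s, 2s, 4s, ...

# Simulated submitter that fails twice before succeeding.
calls = {"n": 0}
def flaky_submit():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("executor lost")
    return "job-42"

# Inject a no-op sleep so the example runs instantly.
result = submit_with_retries(flaky_submit, sleep=lambda s: None)
print(result, calls["n"])  # job-42 3
```

Orchestrators like Airflow offer the same behaviour declaratively (task `retries` and `retry_delay`), which is usually preferable to hand-rolled loops in production.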
Skills/requirements Deploy comprehensive cloud infrastructure for various products, including Astronomer Airflow and Acceldata environments. Facilitate cross-functional integration between vendor products and other systems, such as data lakes, storage, and compute services. Establish best practices for cloud security, scalability, and performance. Manage and configure vendor product deployments, ensuring the setup and maintenance of environments. Ensure high … control. Collaborate with cloud providers (e.g., AWS) for pipeline integration and scaling requirements. Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. Develop infrastructure for optimal extraction, transformation, and loading of data from various sources using AWS and SQL technologies. Work with stakeholders, including … design, product, and executive teams, to address platform-related technical issues. Build analytical tools to leverage the data pipeline, providing actionable insights into key business performance metrics, such as operational efficiency and customer acquisition. All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to …
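The extraction, transformation, and loading described above follows a standard three-step shape regardless of scale. This minimal sketch uses an in-memory CSV and SQLite purely for illustration — the data and table are invented, and a real pipeline would target the SQL/AWS services the advert lists:

```python
import csv
import io
import sqlite3

# Extract: read raw records from a source (here, an in-memory CSV).
raw = io.StringIO("region,orders\nEMEA,10\nAPAC,\nEMEA,5\n")
rows = list(csv.DictReader(raw))

# Transform: drop incomplete rows and cast string fields to proper types.
clean = [(r["region"], int(r["orders"])) for r in rows if r["orders"]]

# Load: write the cleaned records into a SQL target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, orders INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)

total = conn.execute(
    "SELECT SUM(orders) FROM orders WHERE region = 'EMEA'"
).fetchone()[0]
print(total)  # 15
```

In an Airflow deployment each of the three steps would typically become its own task, so failures can be retried at the step that broke rather than rerunning the whole pipeline.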
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
deliver critical design work for a platform transformation in the financial services sector. The role focuses on designing cloud-native, secure web applications and API frameworks across distributed data environments. Key Details: Start: ASAP (2-week onboarding window) Location: Hybrid - City of London (1-2 days onsite) Duration: 6-month contract, likely extension Rate: £600.00 per day via … company admin fees.) Requirements: Deep experience designing scalable enterprise web applications across financial services Strong AWS cloud experience across application and integration layers Familiarity with design of APIs, data lakes, BPM processes, and microservices Track record of working with engineering and risk functions to deliver secure solutions Comfortable designing for high-resilience, regulated environments in financial platforms Preferred … Background in financial data platforms, index providers, or investment/asset management Experience in BPM tooling and leading application modernisation efforts Working knowledge of real-time data services or trading platforms Additional: Full financial/criminal checks will be carried out. Robert Half Ltd acts as an employment business for temporary positions and an employment agency …