target state from current DWH estate towards a data products/marketplace model on AWS/Snowflake. Review AWS infrastructure component design and usage, and implement enhancements. Design and implement an ETL (Extract, Transform, Load) engine using AWS EMR (Elastic MapReduce) for efficient data processing. Design, review, and implement reporting solutions integrating Tableau with AWS services for seamless data visualization. Design and … tools, and practices. Troubleshoot and resolve infrastructure-related issues, providing technical support and guidance.

Your Profile

Essential Skills/Knowledge/Experience: Extensive AWS service knowledge; Lambda; Avaloq experience; ETL (Extract, Transform, Load); integrating Tableau with AWS services; Amazon EKS (Elastic Kubernetes Service); Infrastructure as Code; scripting (Python/Bash); Helm charts; Docker; Kubernetes; tools like Terraform, Ansible, and Jenkins …
London, South East, England, United Kingdom Hybrid/Remote Options
Akkodis
WFH
Duration: 3 months rolling contract
Type of contract: Freelance, Inside IR35
Level: Mid-Senior

Duties and Tasks: Develop and optimize data pipelines using Databricks and Spark. Design and implement data models and ETL processes in Snowflake. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and security across platforms. Monitor and troubleshoot data workflows and performance issues.

Requirements: Proven …
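The duties above describe a classic extract-transform-load loop. As a hedged illustration only — the real stack in this role would be Databricks/Spark and Snowflake; this sketch uses Python's stdlib `sqlite3` as a stand-in warehouse, and all table and column names are invented:

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; in the real pipeline this would be a
# Databricks/Spark read from cloud storage, not an in-memory CSV.
RAW = "trade_id,amount,currency\n1,100.5,GBP\n2,,GBP\n3,99.0,USD\n"

def extract(text):
    """Parse the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Basic data-quality rule: drop rows with a missing amount, cast types."""
    return [
        {"trade_id": int(r["trade_id"]), "amount": float(r["amount"]),
         "currency": r["currency"]}
        for r in rows if r["amount"]
    ]

def load(rows, conn):
    """Idempotent load into the stand-in warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS trades "
                 "(trade_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
    conn.executemany("INSERT OR REPLACE INTO trades "
                     "VALUES (:trade_id, :amount, :currency)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0])  # 2 valid rows survive
```

The row with the empty amount is rejected by the transform step, which is the "data quality" duty in miniature; a production pipeline would quarantine such rows rather than silently drop them.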
standards, models, and frameworks. Design data solutions leveraging Azure services such as Azure Data Lake, Azure SQL Database, Azure Synapse Analytics, Azure Data Factory, and Azure Databricks.

Data Integration & ETL: Develop and optimize data pipelines for ingestion, transformation, and storage using Azure Data Factory and Databricks.

Governance & Security: Implement data governance, security, and compliance practices aligned with financial services regulations …
Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling); DAX, Visual Studio and data transformation logic; Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies; ETL/ELT orchestration for structured and unstructured data.

Proficiency in: PySpark, T-SQL, Notebooks and advanced data manipulation; performance monitoring and orchestration of Fabric solutions; Power BI semantic models and …
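The SCD handling named above usually means Type 2 slowly changing dimensions: when a tracked attribute changes, the current row is expired and a new version appended. A minimal sketch of that merge logic — pure Python with invented column names, no Fabric or DAX specifics:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today=None):
    """Type 2 SCD merge: expire the current version and append a new one
    when a tracked attribute changes. `dim_rows` is the dimension table as
    a list of dicts keyed by the business key `customer_id` (invented)."""
    today = today or date.today().isoformat()
    current = {r["customer_id"]: r for r in dim_rows if r["is_current"]}
    for row in incoming:
        cur = current.get(row["customer_id"])
        if cur and cur["segment"] == row["segment"]:
            continue  # attribute unchanged: nothing to do
        if cur:  # expire the existing version
            cur["valid_to"], cur["is_current"] = today, False
        dim_rows.append({"customer_id": row["customer_id"],
                         "segment": row["segment"],
                         "valid_from": today, "valid_to": None,
                         "is_current": True})
    return dim_rows

dim = [{"customer_id": 1, "segment": "retail", "valid_from": "2024-01-01",
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"customer_id": 1, "segment": "wealth"}],
                 today="2025-01-01")
print(len(dim))  # 2 versions: the expired retail row plus the current wealth row
```

In a warehouse this would be a `MERGE` statement rather than a loop, but the versioning rule — close the old row, open the new one — is identical.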
design, build, and optimise scalable data pipelines and lakehouse architectures on Azure, enabling advanced analytics and data-driven decision making across the business.

Key Responsibilities: Design, develop, and maintain ETL/ELT pipelines using Azure Databricks, PySpark, and Delta Lake. Build and optimise data lakehouse architectures on Azure Data Lake Storage (ADLS). Develop high-performance data solutions using Azure …
technical solutions. Maintain clear documentation and contribute to internal best practices.

Requirements: Strong hands-on experience with PySpark (RDDs, DataFrames, Spark SQL). Proven ability to build and optimise ETL pipelines and dataflows. Familiar with Microsoft Fabric or similar lakehouse/data platform environments. Experience with Git, CI/CD pipelines, and automated deployment. Knowledge of market data, transactional systems …
metadata management, and data governance. Experience with modern data platforms such as Azure Data Lake, Databricks, Power BI, and SAP BTP. Solid grasp of enterprise integration patterns (APIs, streaming, ETL/ELT, event-driven architectures). Ability to translate complex data concepts into clear, value-focused business outcomes. Excellent stakeholder management and communication skills across technical and non-technical audiences.
Ability to write Spark code for large-scale data processing, including RDDs, DataFrames, and Spark SQL. Hands-on experience with lakehouses, dataflows, pipelines, and semantic models. Ability to build ETL workflows. Familiarity with time-series data, market feeds, transactional records, and risk metrics. Familiarity with Git, DevOps pipelines, and automated deployment. Strong communication skills with a collaborative mindset to work …
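The RDD and DataFrame operations this role names are built from map/filter/aggregate primitives. As a rough, Spark-free sketch of the pattern — in PySpark itself this would be `sc.parallelize(...).map(...).filter(...).reduceByKey(...)`, which is not shown runnable here; the data and threshold are invented:

```python
from collections import Counter
from functools import reduce

# Risk-metric style aggregation, sketched with stdlib primitives that
# mirror Spark's map -> filter -> reduceByKey flow.
trades = [("GBP", 100.0), ("USD", 50.0), ("GBP", -20.0), ("EUR", 5.0)]

# map + filter: keep (currency, exposure) pairs, drop tiny positions
pairs = [(ccy, amt) for ccy, amt in trades if abs(amt) >= 10.0]

# reduceByKey analogue: sum exposure per currency key
exposure = reduce(lambda acc, kv: acc.update({kv[0]: kv[1]}) or acc,
                  pairs, Counter())
print(dict(exposure))  # {'GBP': 80.0, 'USD': 50.0}
```

The point of the analogy is only the shape of the computation; Spark distributes each stage across executors, which is what makes the same three-step flow viable on large-scale data.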
/R) and Spotfire APIs. Working knowledge of Power BI report development and the differences between Spotfire and Power BI capabilities. Proficient in SQL, data integration (flat files, APIs, databases), and ETL logic interpretation. Understanding of functional and visual parity considerations between BI tools. Strong analytical, debugging, and communication skills to interface with stakeholders and migration engineers.

The Role: Act as the technical …
London, South East, England, United Kingdom Hybrid/Remote Options
Sanderson
a Fabric Data/BI Engineer to help create and drive analytics solutions. BI/Data Engineer, key skills:
Microsoft Fabric experience
Proven data engineering experience - setting up complex ETL processes with ADF pipelines
Data visualisation and reporting using Power BI
Data modelling experience - conceptual, logical and physical
Proficient in SQL
Extensive experience with Microsoft cloud stack
Data Engineer, BI Engineer …
South West London, London, England, United Kingdom
Sanderson
retrieval and ensure efficient performance across AWS environments. Collaborate with stakeholders to translate business requirements into actionable insights. Work with AWS services such as Glue, Lambda, and S3 for ETL and data processing. Ensure compliance with government data security and governance standards.

Essential skills: Strong proficiency in SQL and AWS Athena. Hands-on experience with AWS data services (Glue, S3 …
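The SQL/Athena proficiency above is largely about reporting-style aggregation. Athena itself runs standard SQL serverlessly over files in S3; the query pattern can be sketched locally with `sqlite3` standing in for Athena — the schema and rows here are invented:

```python
import sqlite3

# Stand-in for an Athena table over S3 data; schema and values invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (dept TEXT, status TEXT, cost REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("ops", "done", 10.0), ("ops", "failed", 4.0), ("hr", "done", 7.5),
])

# Typical reporting query: spend per department on completed work only.
rows = conn.execute(
    "SELECT dept, SUM(cost) FROM events WHERE status = 'done' "
    "GROUP BY dept ORDER BY dept"
).fetchall()
print(rows)  # [('hr', 7.5), ('ops', 10.0)]
```

On Athena the performance lever the role alludes to is different: because you pay per byte scanned, the equivalent query would lean on partitioned tables and columnar formats (e.g. Parquet) rather than indexes.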
London, South East, England, United Kingdom Hybrid/Remote Options
Searchability NS&D
Ensure security and compliance with DV-level clearance standards.

Skills & Experience: Current DV clearance (essential). Proven experience working with Palantir Foundry in complex environments. Strong skills in data engineering, ETL processes, and data modelling. Proficiency in relevant programming/scripting languages (e.g. Python, SQL). Experience working with large-scale datasets in secure environments. Strong problem-solving skills and stakeholder engagement …
ll need: Strong proficiency in Power BI, Power Query, and PPM software. Advanced skills in data visualisation, dashboard design, and analytics. Experience using SQL, DAX, and Power Query for ETL processes. High-level analytical and data management capabilities. Background in PMO and data analytics, ideally within a large, global organisation. Familiarity with MS Office 365 suite and collaboration tools.

About …
London, South East, England, United Kingdom Hybrid/Remote Options
Morgan McKinley
Analytics or Data Science, ideally in a Customer Success, Operations or Digital team. Strong SQL skills and experience with dashboarding tools such as Tableau. Familiarity with Snowflake, Databricks, Spark, and ETL processes. Python or automation is a plus. Knowledge of A/B testing and statistics. Strong communication and storytelling skills to influence diverse audiences. Experience with call centre or digital …
or similar modeling tools. Working knowledge of TOGAF, LeanIX, or enterprise architecture frameworks. Understanding of Master Data Management (MDM), data catalogs (e.g., Atlan), and data quality frameworks. Experience in ETL pipeline development (Matillion preferred) and Agile/Scrum environments. Knowledge of CI/CD pipelines, versioning (Bitbucket), and automated testing tools. Awareness of AI and Machine Learning integration concepts, including …
City of London, London, United Kingdom Hybrid/Remote Options
Experis
or similar modeling tools. Working knowledge of TOGAF, LeanIX, or enterprise architecture frameworks. Understanding of Master Data Management (MDM), data catalogs (e.g., Atlan), and data quality frameworks. Experience in ETL pipeline development (Matillion preferred) and Agile/Scrum environments. Knowledge of CI/CD pipelines, versioning (Bitbucket), and automated testing tools. Awareness of AI and Machine Learning integration concepts, including …
analyst or data analyst)
* Proficiency in working with data formats such as XML, JSON, CSV, EDI, XLSX
* Strong communication and collaboration skills
* Fluent in English

Desirable Skills:
* Familiarity with ETL tools or data integration platforms (e.g., IBM DataStage, Palantir)
* Experience with documentation tools and specifications (e.g., Excel-based mapping documents, BPMN, UML)
* Proficiency in XML tooling, XSLT, and DataStage Designer …
What You'll Bring: 8+ years in Data Analytics/Science. Expertise in SQL and dashboarding tools (Tableau/Qlik). Familiarity with big data tools (Snowflake, Databricks) and ETL. Experience with A/B testing and Python/R is preferred.

Contract Details:
Location: London, UK
Duration: 10 Months
Rate: Up to £277 (Umbrella)

Ready to turn global …
London, South East, England, United Kingdom Hybrid/Remote Options
Next Best Move
in Microsoft Office skills, including Outlook, Word, Excel and PowerPoint, and other applications such as Microsoft 365, SharePoint, Teams and OneDrive. Power Apps and Power Automate experience. Experience with ETL tools such as SSIS. Business Central experience. Good knowledge and development experience of MS SQL. Experience in creating UAT scripts. Technical writing experience with the ability to present technical information …
London, South East, England, United Kingdom Hybrid/Remote Options
Executive Facilities
in SQL for data extraction, transformation, and pipeline development. Experience with dashboarding and visualization tools (Tableau, Qlik, or similar). Familiarity with big data tools (Snowflake, Databricks, Spark) and ETL processes. Useful experience: Python or R for advanced analytics, automation, or experimentation support. Knowledge of statistical methods and experimentation (A/B testing) preferred. Machine learning and Generative AI experience …
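The A/B testing knowledge asked for above typically boils down to a significance check on two conversion rates. A standard approach is the two-proportion z-test, sketched here with the stdlib only — the conversion counts are invented:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates, the usual
    A/B-test significance check. Returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented example: variant A converts 100/1000, variant B 120/1000.
z, p = two_proportion_ztest(100, 1000, 120, 1000)
print(round(z, 2), round(p, 3))
```

With these numbers the difference is not significant at the conventional 0.05 level, illustrating why sample-size planning matters as much as the test itself.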
will leverage Palantir Foundry to design, develop, and maintain data pipelines, models, and applications that enable advanced analytics and operational insights across the organization.

Skills: Palantir Gotham, Foundry, SQL, ETL, Python, Linux, Java, Systems Engineering

Job Title: Palantir Developer
Location: London, UK
Job Type: Contract

Trading as TEKsystems. Allegis Group Limited, Bracknell, RG12 1RT, United Kingdom. No Allegis Group Limited …
develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark. Collaborate with frontend/backend developers using Node.js or React. Implement best practices in data modelling, ETL processes and performance optimisation. Contribute to containerised deployments (Docker/Kubernetes) - experience in this area is desirable, not essential. Operate within Agile teams and support DevOps practices.

What We're …
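Airflow, mentioned in the role above, executes pipeline steps as a DAG in dependency order. The core scheduling idea can be sketched with the stdlib's `graphlib` — the task names are invented, and a real Airflow pipeline would declare them with its own `DAG` and operator classes rather than a plain dict:

```python
from graphlib import TopologicalSorter

# Invented task graph: each task maps to the set of tasks it depends on,
# mirroring how an orchestrator like Airflow orders pipeline steps.
deps = {
    "extract_trades": set(),
    "extract_refdata": set(),
    "transform": {"extract_trades", "extract_refdata"},
    "load_warehouse": {"transform"},
    "refresh_reports": {"load_warehouse"},
}

# static_order yields a valid execution order respecting all dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)  # both extracts first, refresh_reports last
```

Airflow adds retries, scheduling, and parallel execution of independent branches (here, the two extracts) on top of exactly this ordering guarantee.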