W1S, St James's, Greater London, United Kingdom Hybrid / WFH Options
MFK Recruitment
is seeking a software engineer to join their team in Mayfair, London. We are looking for a Senior Backend Software Engineer with strong data engineering skills to join a small, agile team developing software solutions for our energy supply and trading functions. Hybrid working is in play, with … days at home. Software Engineer - About the role: My client’s energy business is growing rapidly with a strong focus on using advanced data systems and analytics to deliver exceptional service. We are looking for someone to take ownership of the backend architecture that underpins our analytics applications … API development; RabbitMQ/message queues; PostgreSQL; Databricks; containerisation (Docker, Kubernetes); CI/CD (Azure DevOps, GitHub Actions); relational databases and data lake architecture; model and data pipeline integration (e.g. MLflow); Microsoft Azure (Functions, Storage, Compute); monitoring tools (Grafana, Prometheus, etc.); mentoring and knowledge sharing.
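For illustration only, a minimal sketch of one item from the stack above: publishing a message to a RabbitMQ queue with the pika client. The host, queue name, and payload are invented placeholders, not details from the posting.

```python
# Minimal sketch: publish a message to a RabbitMQ queue using pika.
# Host, queue name, and payload are hypothetical placeholders.
import json

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Durable queue so messages survive a broker restart.
channel.queue_declare(queue="trade_events", durable=True)

channel.basic_publish(
    exchange="",  # default exchange routes by queue name
    routing_key="trade_events",
    body=json.dumps({"trade_id": 123, "volume_mwh": 50.0}),
    properties=pika.BasicProperties(delivery_mode=2),  # persistent message
)
connection.close()
```

A consumer would subscribe to the same queue via channel.basic_consume; durable queues with persistent delivery are a common default for trading event streams.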
Senior Data Engineer – Snowflake Platform Build. Location: London (2 days a week on site). Contract: 6-month sign-on (18–24 month project). We’re building the next version of our data platform, with Snowflake at the core. The PoC is in place — now we need … before to lead the implementation and help scale the platform across the business. The project involves migrating our current on-prem data lake to Snowflake, while keeping storage on a private cloud. You’ll be setting the foundations: building out ETL pipelines, establishing best practices, and … in Python, PySpark, and Spark. Hands-on with platform setup – ideally with a DevOps-first approach. Exposure to AWS environments. Experience working with data from trading platforms or within commodities, banking, or financial services. Tech environment: Primary Platform: Snowflake. Other Tech: DBT, Databricks, Spark, PySpark, Python. Cloud: AWS.
to £800/day. Duration: 6-month rolling contract (multi-year project). Location: London. About the Role: We're seeking an exceptional Senior Data Engineer to lead our client's journey into cloud-based data solutions. This is an opportunity to shape the future of data engineering within a global financial organisation. What You'll Do: Lead end-to-end data engineering projects from conception to deployment, delivering high-quality, scalable solutions. Transform complex data into actionable business insights using our internal data platforms. Develop and implement data … and share best practices and methodologies across the engineering teams. Shape the data architecture and technology stack within their new cloud-based data lakehouse. Foster a culture of continuous learning, improvement, and automation. What You Bring: Solid understanding of data modelling, data warehousing principles …
of the Master Systems Integration service. This might involve ensuring that projects contribute to the development and implementation of ontologies, technical submittal reviews, data migration processes, and program development. Compliance with Security Standards: The Project Manager is responsible for ensuring that data migration processes are conducted … Good understanding of networking principles. Knowledge of working in AWS and GCP, and of migrating data from building outputs to a data lake, an advantage. Must have excellent verbal and written communication skills. Strong interpersonal skills and an ability to deal with both internal and external customers. … project budgets effectively and efficiently allocate resources. Technical Proficiency: Understanding of MSI Concepts: Familiarity with Master Systems Integration concepts and practices, including ontologies, data migration, and technical submittal review. Technical Background: A background in a technical field related to systems integration, information technology, or a relevant domain. Security …
Senior Data Engineer - Contract - London. A Senior Data Engineer is required by a reinsurance broker to design, implement, and maintain data solutions in their Azure cloud platforms. You will need extensive experience in SQL, NoSQL, Data Lakes, Data Mining, DML … DQL, Data Modelling, and Scripting. This is a hybrid contract role. Sound like a bit of you? Then get involved.
Exchange (PMX), Performics, Publicis Sport & Entertainment, Publicis Media Content and NextTECHnow. Together they combine deep expertise in media investment, strategy, insights and analytics, data and technology, commerce, performance marketing and content. Publicis Media is part of Publicis Groupe and is present in more than 100 countries with over … the growth of our business. Responsibilities: Guide a team of engineers in developing applications that empower our clients to optimize marketing campaigns through data-driven insights and automated actions, with a specific focus on leveraging LLMs and AI. Own the technical roadmap for your team, aligning it with … and asynchronous APIs. Deep understanding of cloud infrastructure (AWS, GCP) and experience deploying and managing applications at scale. Strong understanding of data lake architectures, including experience with data ingestion, storage, processing, and retrieval of large volumes of structured and unstructured data. Familiarity with containerization technologies …
Imagine being part of a team where your data analysis skills can impact millions of consumers worldwide. In Fulfillment by Amazon (FBA), you will help third-party businesses (our Selling Partners) leverage the robust data and analytics capabilities that Amazon has built over the last … using FBA, generating over 3 billion units served in the EU last year alone. In this role, you will use your expertise in data engineering, business analytics, and problem-solving to provide critical insights that drive the success of the FBA program and our selling partners. Are you … Key job responsibilities: In a typical day, as a Support Engineer, you will: Trace Seller-specific defects across upstream systems and the data lake to detect, investigate, and fix the root causes (e.g., algorithmic bottlenecks, data quality issues, software flaws). Develop the tools for the …
digital analytics, technologies and tools to integrate, digitalise and analyse fragmented, complex processes - turning them into simplified, actionable information. And as a result, data is transformed into intelligence for the right people, in the right place, at the right time. By bringing clarity to risks, we empower teams … mitigate risks across the entire organisation. We do this by helping them to: Transform current business processes into fully digitalised applications to collect data efficiently. Bring data to life by integrating operational processes with culture, risks, and incidents through Artificial Intelligence (AI) and Machine Learning (ML). … feedback from leadership, project teams, and clients based on project delivery. Channel feedback and market requirements for continuous improvement. Leverage the dss+ data lake to derive thought leadership content. Collaborate with the dss+ Marketing team to manage internal and external marketing efforts across various channels. Who you are: With …
Senior GCP Data Engineer. Start: ASAP. Duration: initial 12-month contract. Pay: inside IR35, negotiable. Location: central London (3 days per week in office). We are looking for a Senior GCP Data Engineer to join our client's Cyber Security team on a contract basis. You'll be responsible for delivering high-quality data engineering solutions using Google Cloud Platform, supporting large-scale data ingestion, transformation, and integration projects. Key Responsibilities: - Build and maintain APIs and backend systems - Develop and optimise GCP-based data pipelines (BigQuery, Dataflow, Composer) - Design scalable data models and cloud data lakes - Integrate data from cloud and on-prem sources - Automate infrastructure with Terraform and CI/CD tools - Collaborate with product teams to deliver business value Requirements: - 8+ years in data engineering, with strong GCP expertise - Proficient in …
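As a hedged illustration of the BigQuery side of such pipelines (not the client's actual setup), here is a minimal load job using the official google-cloud-bigquery Python client; the project, dataset, table, and bucket names are placeholders.

```python
# Minimal sketch: load newline-delimited JSON from Cloud Storage into BigQuery.
# Project, dataset, table, and bucket names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumes ADC credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # infer the schema from the source data
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/events/*.json",
    "example-project.analytics.events",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows")
```

In a managed pipeline this load step would typically be orchestrated by Composer (Airflow) rather than run ad hoc.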
include design and development experience with various relevant Microsoft platforms, tools, technologies, patterns, and techniques related to MSFT suites, Azure tools, integration, and data is required. Experience with industry/domains like pharmaceutical, finance, HR, sales, marketing, and manufacturing is highly preferred. Experience with healthcare industry regulations, data … experience using Azure integration tools like Data Factory, Logic Apps, etc., along with knowledge of utilizing Azure SQL Server, Data Lake, etc. In-depth knowledge of and experience using Visual Studio, with one of the programming languages C#/Java/JavaScript/Python, and PowerShell. Thorough … Queries, Datatype conversions, etc.). Experience working with APIs, Postman/SoapUI tools, MS D365, ServiceNow, Azure Analytics tools, Azure Synapse, Azure BYOD and Dataverse, Azure DevOps, Informatica, BI tools, etc. is a big plus. General business skills: Excellent communication to translate and explain business requirements to technology …
Senior Data Engineer | Music Industry | Up to £85K | AWS, Python, SQL. Are you passionate about music and data? We're looking for a Senior Data Engineer to build and scale a cutting-edge data platform in the music industry. What You’ll Do: 🎵 Design & develop robust data pipelines (ETL/ELT), ensuring accuracy & reliability. ☁️ Own & evolve our AWS-powered data platform, integrating multiple sources (databases, APIs, external datasets). 🚀 Optimise performance — tune queries, enhance scalability & streamline infrastructure. 🔍 Work with data modelling & integration, collaborating closely with analysts … stakeholders. 🛠 Tech stack: AWS, Python, SQL, Data Lakes/Warehouses, APIs. What We’re Looking For: ✅ Strong expertise in Python & SQL for data processing. ✅ Experience with AWS services (Lambda, S3, Glue, Redshift, etc.). ✅ Background in building & optimising scalable data platforms. ✅ Knowledge of …
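By way of a hedged example of the AWS/Python/SQL work such a role describes, a minimal ETL sketch with boto3 and pandas; the bucket names, keys, and column names are invented.

```python
# Minimal ETL sketch: extract CSV from S3, transform with pandas, load Parquet back.
# Bucket and key names are hypothetical placeholders.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Extract: pull the raw CSV object into memory.
raw = s3.get_object(Bucket="example-raw-bucket", Key="streams/2024-01.csv")
df = pd.read_csv(io.BytesIO(raw["Body"].read()))

# Transform: basic cleaning before loading to the curated zone.
df = df.dropna(subset=["track_id"])
df["played_at"] = pd.to_datetime(df["played_at"], utc=True)

# Load: write Parquet to the curated bucket (requires pyarrow).
buf = io.BytesIO()
df.to_parquet(buf, index=False)
s3.put_object(
    Bucket="example-curated-bucket",
    Key="streams/2024-01.parquet",
    Body=buf.getvalue(),
)
```

At scale, the same extract/transform/load shape would usually run inside a Glue job or a Lambda rather than a local script.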
Overview: 3 contract data engineers to supplement the existing team during the implementation phase of a new data platform. Main Duties and Responsibilities: Write clean and testable code using the PySpark and SparkSQL scripting languages, to enable our customer data products and business applications. Build and manage data … a structured, trackable and safe manner. Effectively create, optimise and maintain automated systems and processes across a given project(s) or technical domain. Analyse, profile and plan data work, aligned with project priorities. Perform reviews of code, refactoring where necessary. Deploy code in a structured, trackable and safe manner. Document your data developments and operational procedures. Ensure adherence to data/software delivery standards and effective delivery. Help monitor, troubleshoot and resolve production data issues when they occur. Contribute to the continuous improvement of the team. Contribute to the team's ability …
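As an illustrative sketch of the PySpark and SparkSQL style of work described, under the assumption of a simple orders dataset (paths, table names, and columns are invented):

```python
# Minimal, testable PySpark sketch mixing the DataFrame API and SparkSQL.
# Paths, table names, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer-data-product").getOrCreate()

orders = spark.read.parquet("/data/raw/orders")  # placeholder path

# DataFrame API: keep completed orders and derive a revenue column.
completed = (
    orders.filter(F.col("status") == "COMPLETED")
    .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
)

# SparkSQL: aggregate per customer via a temporary view.
completed.createOrReplaceTempView("completed_orders")
summary = spark.sql(
    """
    SELECT customer_id, SUM(revenue) AS total_revenue, COUNT(*) AS order_count
    FROM completed_orders
    GROUP BY customer_id
    """
)

summary.write.mode("overwrite").parquet("/data/curated/customer_revenue")
```

Keeping transformations as pure functions over DataFrames, as here, is what makes this style of pipeline code unit-testable.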
This fantastic institution located in SW London is seeking a dynamic and versatile Data Engineer/Analytics Engineer to contribute to the development, enhancement, and management of their data systems, pipelines, and reporting infrastructure. This role will focus on designing and maintaining efficient data pipelines while building a scalable data warehouse architecture. A major aspect of this position will involve integrating new data sources to create a unified Single Customer View. Additionally, collaboration with the Analytics and Insights teams is essential to ensure they have seamless access to … evolve the Single Customer View to consolidate data from various sources. Demonstrate hands-on experience with Microsoft Azure technologies (including Synapse Analytics, Data Lake, Azure SQL), as well as familiarity with DevOps and GitHub practices. Contract Details: Initially a Fixed Term Contract, likely transitioning to Permanent. 1-2 days …
Strong knowledge of Python frameworks (e.g. Flask, Django) and data libraries (Pandas, NumPy). Cloud-native experience, especially Microsoft Azure (Functions, Data Lake, etc.). Familiarity with DevOps practices (Git, CI/CD, testing frameworks). Experience with SQL/NoSQL databases and API integration. Proven experience working in …
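A minimal sketch of the Flask-plus-pandas combination listed above; the route, columns, and in-memory data are invented for illustration.

```python
# Minimal sketch: a Flask API endpoint serving an aggregate computed with pandas.
# The route, column names, and in-memory data are hypothetical placeholders.
import pandas as pd
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for data that would normally come from SQL/NoSQL or a data lake.
df = pd.DataFrame({"region": ["UK", "UK", "DE"], "sales": [120.0, 80.0, 95.5]})

@app.route("/sales/summary")
def sales_summary():
    totals = df.groupby("region")["sales"].sum()
    return jsonify(totals.to_dict())

if __name__ == "__main__":
    app.run(debug=True)  # development server only
```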
A prestigious organization based in London is seeking a dynamic and versatile Data Engineer/Analytics Engineer to contribute to the development, enhancement, and management of their data systems, pipelines, and reporting infrastructure. This role will focus on designing and maintaining efficient data pipelines while building a scalable data warehouse architecture. A major aspect of this position will involve integrating new data sources to create a unified Single Customer View. Additionally, collaboration with the Analytics and Insights teams is essential to ensure they have seamless access to the necessary data … evolve the Single Customer View to consolidate data from various sources. Demonstrate hands-on experience with Microsoft Azure technologies (including Synapse Analytics, Data Lake, Azure SQL), as well as familiarity with DevOps and GitHub practices. This role could accommodate 1 visit a month/fully remote for the right candidate.
will be responsible for a range of backend systems, including software applications covering areas such as Payment Gateway, Settlement, Transaction Fraud Monitoring, Data Lake Service and the Core Banking System. You will oversee their deployment, development, enhancements and production operations. You should have extensive experience and skills in Software …
Employment Type: Permanent
Salary: £125,000 - £145,000 per annum + Equity and Benefits
promise of AI. GX has built specialised knowledge AI assistants for the banking and insurance industry. Our assistants are fed by sector-specific data and knowledge and are easily adaptable through ontology layers to reflect institution-specific rules. GX AI assistants are designed for Individual Investors, Credit and Claims … professionals, and deliver 10x improvements by supporting them in their day-to-day tasks. Responsibilities: Helping to architect, design, implement, and optimise our data ingestion, transformation, and spreading pipelines and processes. Developing data models, processing pipelines, and back-end services supporting the data science … building integrations, and analytics. Desired skills: A university degree in Mathematics, Computer Science, Engineering, Physics or similar. 5+ years of relevant experience in Data Engineering, warehousing, ETL, automation, cloud technologies, or Software Engineering in data-related areas. Ability to write clean, scalable, maintainable code in Python …
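As a hedged example of a small Python ingestion step of the kind described (the API URL, pagination scheme, and field names are invented):

```python
# Minimal ingestion sketch: pull records from a REST API and stage them as JSON lines.
# URL, pagination scheme, and field names are hypothetical placeholders.
import json

import requests

def ingest(url: str, out_path: str) -> int:
    """Fetch paginated records and append them to a JSON-lines staging file."""
    written = 0
    page = 1
    with open(out_path, "a", encoding="utf-8") as out:
        while True:
            resp = requests.get(url, params={"page": page}, timeout=30)
            resp.raise_for_status()
            records = resp.json().get("results", [])
            if not records:
                break
            for record in records:
                out.write(json.dumps(record) + "\n")
                written += 1
            page += 1
    return written

if __name__ == "__main__":
    count = ingest("https://api.example.com/claims", "staging/claims.jsonl")
    print(f"Ingested {count} records")
```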
Other Skills/Technologies that you will use and develop in the role: Managing and optimizing Azure Synapse Analytics and Azure Data Lake for large-scale data processing. Overseeing Dynamics 365 (D365) applications and integrating them with other Azure services. Facilitating connectivity between Azure, Azure …
key applications such as Rydoo, Unit4, Cash Up App and Anaplan. ABP also builds and maintains integrations between these applications, our data lake and pipelines for processing financial and property data. With Finance, our continued mission is to transform and modernise their financial processes, be that via …
re leading it. Since our beginning in Paris in 2013, we've been pioneering the future of AI with a platform that makes data actionable and accessible. With over 1,000 teammates across 25 countries and backed by a renowned set of investors, we're the architects of … Everyday AI, enabling data experts and domain experts to work together to build AI into their daily operations, from advanced analytics to Generative AI. Why Engineering at Dataiku? Dataiku's SaaS, cloud or on-premise deployed platform connects many Data Science technologies. Our technology stack reflects our commitment to quality and innovation. We integrate the best of data and AI tech, selecting tools that truly enhance our product. From the latest LLMs to our dedication to open source communities, you'll work with a dynamic range of technologies and contribute to the collective knowledge …