Job Description: Are you passionate about Data and Analytics (D&A) and excited about how it can transform enterprise operations? Do you possess strategic vision, technical expertise, and leadership skills to drive data-driven solutions? If so, you might be the ideal candidate for the Data and Analytics role within Mars's Global Pet … achieve growth, profitability, and customer centricity. This role requires demonstrating thought leadership, technical proficiency, and the ability to navigate complex challenges while leading a team of top-tier data and analytics professionals. The role involves transitioning to a product-based model to develop digital capabilities. Responsibilities include developing and maintaining robust data pipelines and storage solutions … to support analytics and machine learning initiatives. You will report to the Director of Data Engineering Solutions and collaborate globally with engineering teams across core products. Key Responsibilities: Technical Leadership: Provide leadership to data and DevOps engineers. Collaborate on designing and evolving scalable data platforms. Promote best practices and foster a high-performance culture.
Numi Location: Slough, United Kingdom Job Category: Other - EU work permit required: Yes Posted: 31.05.2025 Expiry Date: 15.07.2025 Job Description: Lead Backend Engineers (Platform Experience & Data Warehousing teams) Location: UK, Ireland, Poland, Lithuania, Spain, Portugal, Finland Numi is proud to partner with a leading data integration technology provider serving some of the biggest … clients in the industry! We are seeking experienced Lead Backend Engineers (PHP) to join either the Platform Experience or Data Warehousing teams. The Platform Experience team focuses on optimizing end-to-end platform journeys and creating impactful customer experiences, including developing data visualization dashboards and building features from scratch. The Data Warehousing team handles data storage and transfer solutions, scaling and refactoring complex data transfer systems for large data volumes, and enhancing internal storage. As a Lead Backend Engineer, you will lead feature development, ensure high code quality, provide technical guidance, and collaborate with product managers, designers, and frontend engineers. You will identify system bottlenecks, improve monitoring, and …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Role: Senior Product Manager (Data Foundations) Location: London (2-3 days per week in our Holborn office) Employment Type: Full-time, as an employee proSapient is a global Expert Network firm. We provide clients with rapid access to high-quality, relevant expert insights through a combination of world-class … the UK's fastest-growing and most successful female co-founded businesses. At proSapient, we're building the foundations of next-generation expert intelligence. As we grow our data capabilities, we're looking for a data-savvy product leader to shape how proprietary and third-party data powers our platform. From driving integrations to … building scalable pipelines, you'll play a central role in turning raw data into meaningful, actionable insights that enable faster, smarter decisions for our users. The key focus of the role includes: · Owning our data foundations product domain end-to-end: setting vision, defining strategy, and delivering tangible impact to deepen the breadth and quality of …
Python Developer (Data Pipelines) | Top Systematic Hedge Fund, Slough Client: Selby Jennings Location: Slough, United Kingdom Job Category: Other - EU work permit required: Yes Posted: 31.05.2025 Expiry Date: 15.07.2025 Job Description: Our client, a top systematic hedge fund … are seeking an experienced Python developer to join their Data Team (Alpha Data), which is responsible for delivering vast quantities of data to users worldwide. The team is heavily responsible for data pipelines. This role involves becoming a technical subject matter expert and developing strong relationships with quant researchers, traders, and colleagues across the Technology … organisation. The Data teams deploy valuable data quickly, ensuring ingestion pipelines and data transformation jobs are resilient and maintainable, with data models designed in collaboration with researchers for efficient query construction and alpha generation. The team builds frameworks, libraries, and services to enhance quality of life, throughput, and code quality. They value …
Description of Role and Client: The Head of Data Engineering will lead the technical development and operational management of data engineering solutions for a prominent insurance organization. The client requires a data engineering expert capable of managing complex environments and collaborating with senior technical teams. … This role falls inside IR35. Responsibilities: Design, develop, and manage data pipelines, warehouses, and data lakes. Optimize and maintain enterprise data engineering platforms. Lead strategic management of data engineering solutions across cloud and hybrid environments. Collaborate closely with Data Architects, CIO teams, and senior stakeholders. Continuously enhance the efficiency, reliability … and scalability of data engineering landscapes. Candidate Requirements/Profile: Extensive data engineering experience within insurance sectors. Proficiency with AWS, Azure, GCP, and hybrid data environments. Relevant certifications in cloud platforms or big data technologies. Demonstrated ability to strategically own and enhance enterprise data platforms. Contract duration: Initial 6-month …
Freelance Data Operations Engineer - Market Data (Hedge Fund) Overview We are hiring a Data Operations Engineer to join one of the world's most successful hedge funds, supporting the Market Data and Reference Data team during a critical coverage period. This is a hands-on role focused on real-time … data quality, vendor integration, and operational ownership - ideal for professionals with strong market data and commodities experience in fast-paced financial environments. Key Responsibilities Manage and validate reference data across securities, pricing, accounts, and client data. Ensure data accuracy and integrity from vendor feeds (e.g., Bloomberg, Reuters, IHS Markit, ICE). Handle … Security Master data updates, monitoring, and exception resolution. Collaborate with internal stakeholders to support real-time data needs across trading desks. Develop lightweight Python scripts for automation, validation, and monitoring tasks. Support client-driven enhancements and maintain vendor relationships. Monitor and troubleshoot real-time and static data pipelines. Assist in the integration of data …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Security Cleared Data Engineer - Inside IR35 - Fabric/Azure/Python/C#, Slough Client: Methods Location: Slough, United Kingdom Job Category: Other - EU work permit required: Yes Posted: 31.05.2025 Expiry Date: 15.07.2025 Job Description: Methods is looking for Data Engineers for a new contract assignment on a hybrid working policy. The client … will be based in London and will require on-site presence 1-2 days per week. The Data Engineer must hold a live Security Clearance used on a Security Cleared site within the last 12 months or be actively using the clearance in their current role. You Will: Design, build, and maintain data pipelines. Ensure data accuracy, consistency, and reliability by implementing validation methods. Implement and enforce data governance policies and ensure compliance with regulations. Build infrastructure for optimal extraction, transformation, and loading (ETL) of data from various sources. Collaborate with stakeholders including data, design, product, and executive teams, assisting with data-related technical issues. This …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
customer segmentation, profiling and personalisation projects. Help shape the data roadmap from early foundations to advanced maturity. What you'll bring: Strong SQL and data pipeline understanding (you will work with data engineering teams rather than being hands-on coding pipelines). Proven experience building dashboards and defining KPIs. Comfortable with data …
Bicester, England, United Kingdom Hybrid / WFH Options
Fido Tech Limited
focus on firmware integration and ensure robust communication between IoT devices and backend systems, whether through Azure IoT Hub or CoIoTe. Your contributions will enable real-time data processing and AI-powered analytics for our advanced IoT solutions. Responsibilities Design and maintain backend services using Python, Kafka, and FastAPI. Integrate IoT devices through platforms like Azure … pipelines for PSD data and audio retrieval, ensuring compliance with SLAs. Trigger AI model predictions on new leak profiles, integrating seamlessly with the backend. Monitor pipeline health and resolve issues using tools such as Sentry and Azure Monitoring. Ensure robust system security with practices like mTLS authentication and DDoS protection. Qualifications: Advanced backend development … expertise in Python, Kafka, and cloud-based systems. Proven experience with Azure IoT Hub, CoIoTe, or similar IoT platforms. Familiarity with scalable microservices architectures and real-time data processing. Strong understanding of database systems like MySQL and MongoDB. Knowledge of IoT security practices, including SAS tokens and VNET integrations. Experience in Agile development environments and strong collaboration …
Oxford, England, United Kingdom Hybrid / WFH Options
Randstad Staffing
Preferably, you will have certifications in AWS or cloud architecture: AWS Cert DevOps Engineer Professional, AWS Cert Solutions Architect Professional, AWS Cert Developer Associate, AWS Cert Big Data, AWS Cert Cloud Practitioner. Essential Skills: Over 10 years of cloud development experience in languages and platforms such as Java, JavaScript, Python, and AWS. Expertise in building enterprise-level integration solutions. … AWS using services like EventBridge, Lambda, SNS/SQS, API Gateway, Transfer Family, AppFlow, Glue, Step Functions, S3, Kinesis, MQ, and DynamoDB Streams. Expertise in designing and implementing data integration workflows using AWS services such as AWS Glue, Amazon S3, AWS Lambda, and Amazon Kinesis for both batch and real-time processing, along with monitoring and troubleshooting data … an emphasis on driving iterative development and continuous delivery of integration solutions. Ability to manage third-party integrations, including working with external vendors and partners to ensure successful data and system integration. Key Duties of Role: Design and implement enterprise integration solutions: Architect integration frameworks that connect disparate systems, ensuring smooth and reliable flow between applications and services.
a Principal Python Engineer to act as a quantitative overlay on global equity trading operations. The engineering team's goal is to continually improve the entire ecosystem, spanning data, research, trading, and post-trade analytics. As a Principal Engineer, you will own end-to-end delivery of new features and projects, engaging in everything from trading framework design … closely with quants, traders, and other engineering teams to build a robust and scalable CRB platform. Responsibilities: Platform Development: Enhance and maintain the CRB ecosystem, including research tools, data pipelines, trading frameworks, back-testing infrastructure, and post-trade analysis systems. End-to-End Project Ownership: Collaborate with stakeholders to define requirements, design solutions, implement code, and oversee production … and other essential tooling. Performance Optimization: Identify and fix performance bottlenecks in multi-threaded systems, ensuring high availability and low latency. Technical Exploration: Investigate emerging technologies, particularly in data analysis and machine learning, and adapt them to evolving platform needs. Ideal Candidate Background: Experience: 7+ years of professional software development with exposure to capital markets or trading (equities …
from AI agents to APIs to workflows to integrations Set up and manage the cloud infrastructure using AWS, Docker containerisation, and Infrastructure-as-Code with Terraform Build robust data integration infrastructure to connect the platform with various customer systems (TMS, RMS, etc.) and external APIs Implement the workflow orchestration layer to ensure reliable execution of complex agentic behaviours … workflows Skills and Qualifications Experience building end-to-end platform solutions that integrate workflow orchestration systems (like Airflow, Temporal, AWS Step Functions) with real-world business processes and data pipelines Strong background in integration engineering and data modelling Exceptional Python skills for building APIs, services, and data processing pipelines Experience with cloud infrastructure (AWS …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
You will ensure our customer works with updated and correct data in our data analytics tools by being the first to respond to data health failures on key pipelines, reading code and making changes. About us Hexegic is a leading technical consultancy providing agile multi … projects for those keen to develop and build a successful career. Core Responsibilities Maintain build schedules to ensure pipelines run effectively Set up and maintain health checks on data pipelines Respond to, triage, and debug broken pipelines Read and modify code, and update monitoring … setups Communicate outages to end users What we are looking for Proficiency in reading and writing code in Python, PySpark, and Java Basic understanding of Spark Ability to navigate pipeline development tools What's in it for you? Base salary of £50,000-£60,000 £5000 annual professional development budget Wellness program 25 days of annual leave Hybrid working
We are seeking an experienced Principal Data Engineer to lead a team in developing and maintaining robust, scalable data pipelines, bridging on-premises and cloud environments, and delivering real-time analytics systems. This role requires deep expertise in data engineering and streaming technologies, combined with … to drive the team towards achieving business objectives. You will collaborate with cross-functional teams including architecture, product, and software engineering to ensure the delivery of high-quality data solutions aligned with company goals. Requirements: 5+ years of hands-on experience in data engineering, including expertise in Python, Scala, or Java. Deep understanding of Apache Kafka … as PostgreSQL, MySQL, MSSQL. Familiarity with analytical databases. Familiarity with both cloud solutions (AWS preferably) and on-premises environments as part of cost-optimization efforts. Knowledge of additional data tools and frameworks such as Flink, Redis, RabbitMQ, Superset, Cube.js, Minio, and Grafana (optional but beneficial). Strong leadership and mentoring skills, with the ability to guide a team …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Hays
JOB DETAILS - £500-£550 PER DAY - OUTSIDE IR35 - REMOTE ROLE - 3-MONTH CONTRACT WITH POTENTIAL FOR EXTENSION - NPPV3 AND SC CLEARANCE REQUIRED SKILLS - Extensive experience in Azure Data Factory, Databricks and Synapse. - Knowledge of Oracle. - Understanding of security protocols, dealing with policing data and clearance requirements. RESPONSIBILITIES - Strong collaboration skills with other teams and colleagues within … the organisation. - Ability to communicate effectively with non-technical and junior colleagues. - Taking a leading role in data transformation, building data pipelines and data modelling. What you need to do now If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us …
Oxford, England, United Kingdom Hybrid / WFH Options
Beautyk Creative
Manager will build and lead a team of software engineers across the full stack, as they work with bioinformatics pipelines and build and deliver exceptional analytical tools processing data from multiple sources such as sequencing machines. Working in a "hands-on" capacity, the Software Engineering Manager will work to develop a scalable architecture/infrastructure, introducing DevOps practices … across the full stack, with proven experience of leading and developing teams Experience of working within a start-up/scale-up environment preferred Experience of working with data at scale/data pipelines, with any experience of bioinformatics/genome data advantageous. Experience of data processing platforms such as Kafka and …
At Dexory, we believe that real-time data will revolutionise the logistics industry. We are building the ultimate data insights platform that provides companies with unprecedented, real-time access to their operations. Our autonomous data capture technology and insights generation capabilities help to measure, track and find goods across warehouses in real time, while … of facilities. Join us at a very exciting time of growth as we're significantly ramping up all areas of the business to lead the way in logistics data globally, backed by some of Europe's best VCs and driven by passion, curiosity and teamwork. What does this role involve? As Senior Perception Engineer at Dexory, you will … be responsible for the design and implementation of high-performance systems for gathering and analysing environmental data from LiDARs, cameras, and other sensors. You will lead a small perception team, focusing on creating robust, scalable, and efficient data collection and processing systems to support our warehouse integrity platform. You will have a unique opportunity to help …
will play a key role in developing and maintaining high-impact applications that support our global scientific and commercial operations. You will contribute to systems that power our data analysis platforms and next-generation telemetry infrastructure, helping us deliver world-class sequencing technology to researchers worldwide. Key Responsibilities: Design, build and maintain data-intensive web applications … with: Strong experience in production-level Python development A background in building and maintaining web applications or APIs Experience with automated testing, deployment, and containerised environments Knowledge of data pipelines, workflow engines, and cloud-native architectures This role is ideal for someone who thrives in a dynamic, fast-paced environment, embraces change, and is motivated by meaningful work
is bringing AI design into the real world by enabling generative engineering design for physical products. Our focus is creating millions more engineers globally and giving them the data and knowledge necessary to make efficient decisions quickly, one of the main challenges of the physical engineering industry today. Our team has a background in scaling software to millions … Task orchestration frameworks (e.g. Luigi, Dask, Airflow + Celery, etc.) Experience owning or being involved longer-term in an open-source project Demonstrable Rust experience or keen interest Data pipelines and big data tech Docker: both building and running Wide AWS and infrastructure knowledge, including production support Scientific computing e.g. Numpy/scipy/pandas
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
with internal stakeholders and external vendors Manage deployments and configuration changes using version control and infrastructure-as-code principles Develop reusable patterns for managing user access, automation, and data pipelines across SaaS applications Own documentation for integration, data handling, and process automation across the application landscape Ensure secure management of sensitive data within cloud …
to have broad knowledge of D365 modules including finance, trade and logistics, stock, warehousing, procurement and sourcing, etc. An understanding of the wider Microsoft stack including Azure functionality, data pipelines and integrations is also desirable. Travel to customer sites to support optimizations or consultancy requirements will be required on occasion. Key Responsibilities: Liaise with clients in order to … implementation with documentation in the role of Senior Consultant Worked in a solution delivery capacity in the past Understanding of the wider Microsoft Stack, e.g. Azure functionality, CE, data pipelines Willing to expand and train in other areas to cover demand as required Benefits Salary: We offer a competitive, market-aligned salary that reflects the skills and experience …
looking for a Legal AI Engineer who's passionate about building intelligent systems at the intersection of law and machine learning. You'll work closely with legal experts, data scientists, and product managers to develop AI features that tackle real legal challenges through the use of agentic GenAI. As Legal AI Engineer you will: Design and implement AI … data. Collaborate with legal SMEs to translate domain knowledge into scalable machine learning solutions. Continuously evaluate model performance, ensuring accuracy, fairness, and compliance. Help shape the data pipeline and MLOps practices for handling sensitive legal content securely. Required Experience: Solid experience with Python and ML/NLP libraries (e.g., spaCy, Hugging Face, TensorFlow/PyTorch). Experience …