CONTRACT DATA ENGINEER - eDV CLEARED

NEW OUTSIDE IR35 CONTRACT OPPORTUNITY AVAILABLE WITHIN A LEADING NATIONAL SECURITY SME FOR A DATA ENGINEER WITH eDV CLEARANCE

Contract job opportunity for a Data Engineer
National Security client
Palantir Foundry
Outside IR35
Central London based organisation in an easily accessible location
To apply please call or email

WHO WE ARE....
We are recruiting a contract Data Engineer to work with a National Security SME in central London. Due to the nature of the work, you must hold enhanced DV Clearance.

WE NEED THE DATA ENGINEER TO HAVE....
Current enhanced DV Security Clearance
Experience with big data tools such as Hadoop, Cloudera or Elasticsearch
Experience with Palantir
Scrum environment with tools such as Confluence/Jira
Experience in design, development, test and integration of software
Willingness to learn new technologies

IT WOULD BE NICE FOR THE DATA ENGINEER TO HAVE....
Cloud-based architectures
Microservice architecture or serverless architecture
Messaging/routing technologies such as Apache NiFi/RabbitMQ

TO BE CONSIDERED....
Please either apply by …
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Position: Analytics Engineer
Location: London (hybrid 3 times a week)
Department: IT
Type: 3 months rolling contract, Outside IR35

The client is seeking an experienced Analytics Engineer to join its Data & Analytics team. The role focuses on building scalable data pipelines, transforming raw data into clean, reusable datasets, and enabling data-driven decision-making.

Key Responsibilities
Design and build data products, with proficiency throughout the data lifecycle.
Develop robust data models through close collaboration with business users and the engineering team.
Partner with senior management, operational leads, and other stakeholders, coaching and supporting a data-driven culture, including KPI definition and reporting frameworks.
Accountable for data extraction and transforming JSON & XML, utilising strong experience in metadata management.
Collaborate with data engineers to develop data, enrich product design and integrate data for predictive models or machine learning.
Deliver well-defined, transformed, tested, documented, and code-reviewed datasets for analysis.
Evaluate and recommend improvements in data flow, influencing and supporting architects and engineers.
Work independently and manage multiple data …
Azure Data Engineer

VIQU has partnered with a leading Telecommunications organisation seeking an experienced Azure Data Engineer to design, develop, and manage scalable data solutions on the Microsoft Azure platform. You’ll play a key role in architecting and implementing robust data integration solutions, as well as building ETL pipelines to support data ingestion, transformation, and loading.

The Role: As an Azure Data Engineer, you’ll collaborate with cross-functional teams in an agile environment, contributing to sprint planning, daily stand-ups, and other agile ceremonies. Strong communication skills are essential, as you’ll often translate complex data concepts into clear insights for non-technical stakeholders.

Key Responsibilities:
Architect, design, and implement scalable Azure-based data solutions.
Develop and maintain ETL processes for data ingestion, transformation, and loading.
Ensure data governance, integrity, and quality throughout the data lifecycle.
Implement robust data security, compliance, and privacy standards.
Document data architectures, data flows, and processes for knowledge sharing and audit readiness.
Continuously enhance workflows using Agile best practices …
London, South East, England, United Kingdom Hybrid/Remote Options
Crimson
Senior Data Engineer - Azure Databricks/SC Clearance - Contract

Active SC Clearance is required for this position
Hybrid working - 3 days/week on site required
Up to £620/day - Inside IR35

We are currently recruiting for a highly experienced Senior Data Engineer, required for a leading global transformation consultancy based in London. The Senior Data Engineer will be responsible for providing hands-on, technical expertise within an agile team. The ideal candidate will be well experienced in building and optimising data pipelines, implementing data transformations, and ensuring data quality and reliability. This role requires a strong understanding of data engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets.

Key skills and responsibilities:
Design, build, and maintain scalable ETL pipelines for ingesting, transforming, and loading data from APIs, databases, and financial data sources into Azure Databricks.
Optimise pipelines for performance, reliability, and cost, incorporating data quality checks.
Develop complex transformations and processing logic using Spark (PySpark/Scala) for cleaning …
the skills and experience required:
Degree in Computer Science, Information Systems, or related field. Professional certifications (e.g., TOGAF, DAMA, CDMP) desirable.
Demonstrable experience (ideally 5+ years) working in a data or architecture domain across a range of different technologies (including business intelligence, data warehousing, data management and data delivery, database and ETL technologies).
Expertise in data concepts and tools, such as Big Data, MLOps and Kafka.
Confident communicator - able to present complex technical issues in a clear manner to technical and non-technical audiences.
Knowledge of cloud infrastructure, such as AWS, Azure and GCP.
Ability to multi-task and prioritise across a number of projects and initiatives.
Ability to work independently and collaborate effectively across the …
applications using C#, .NET Core, and AWS.
Build and maintain well-structured RESTful APIs to integrate with various systems.
Work with SQL and NoSQL databases to store and retrieve data efficiently.
Implement event-driven architectures using messaging systems like RabbitMQ and AWS Kinesis/SQS.
Develop modern web interfaces using TypeScript and Angular.
Contribute to code reviews and ensure …

… proficiency in C# programming, with a focus on building scalable applications using .NET Core.
Web API Development: Expertise in designing, developing, and maintaining well-structured RESTful APIs to facilitate data exchange between systems.
Database Management: Solid understanding of SQL databases and experience working with NoSQL databases, particularly MongoDB.
Cloud Architecture: Proven ability to architect and implement scalable cloud solutions … handle asynchronous communication.

Desirable Skills
Code Quality and Testing: A commitment to writing clean, maintainable code and a strong understanding of testing methodologies (unit, integration, end-to-end).
Big Data: Familiarity with big data concepts, tools, and techniques for handling and processing large datasets.

Rullion celebrates and supports diversity and is committed to ensuring equal …
minimum 5 years residency)
Location: Remote
Duration: 7 months
Rate: £350 per day (Inside IR35)

Role Overview
We are seeking an experienced DataStage Developer to join our Hybrid Cloud & Data team within Big Data Services. This role focuses on designing, developing, and implementing DataStage mappings and supporting data transformation processes in an Agile environment.

Key Responsibilities
Design, develop, and implement DataStage mappings aligned with project requirements and best practices.
Apply MOSAIC or IBM 10-step Data Migration Methodologies for effective data transformation.
Collaborate with cross-functional teams to understand business needs and deliver robust solutions.
Implement XML, XSLT, and other relevant technologies for data manipulation and transformation.
Use Jira for project management and …

Required Skills
UK National with minimum 5 years residency (BPSS clearance required).
Proven experience in DataStage Designer and mapping development.
Strong understanding of MOSAIC or IBM 10-step Data Migration Methodologies.
Proficiency in XML, XSLT, and related data manipulation technologies.
Familiarity with Agile methodologies and Jira.
Excellent problem-solving and communication skills.
Degree in Computer Science …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
for the right candidate. A market-leading global e-commerce client is urgently seeking a Senior MLOps Lead to establish and drive operational excellence within their largest, most established data function (60+ engineers). This is a mission-critical role focused on scaling their core on-site advertising platform from daily batch processing to real-time capability. This role … large-scale Spark/Python codebases for production efficiency, focusing on minimising latency and cost.
Knowledge Transfer: Act as the technical lead to embed MLOps standards into the core Data Engineering team.

Key Skills:
Must Have:
MLOps: Proven experience designing and implementing end-to-end MLOps processes in a production environment.
Cloud ML Stack: Expert proficiency with Databricks and MLflow.
Big Data/Coding: Expert Apache Spark and Python engineering experience on large datasets.
Core Engineering: Strong experience with Git for version control and building CI/CD/release pipelines.
Data Fundamentals: Excellent SQL skills.

Nice-to-Have/Desirable Skills
DevOps/CI/CD (pipeline experience)
GCP (familiarity with Google Cloud Platform)
Data …
have but not essential: Experience/Knowledge of automation

IT WOULD BE NICE FOR THE SENIOR SOFTWARE ENGINEER TO HAVE....
Cloud-based experience
Microservice architecture or serverless architecture
Big Data/Messaging technologies such as Apache NiFi/MiNiFi/Kafka

TO BE CONSIDERED....
Please either apply by clicking online or emailing me directly to … For further …