Contract Data Ingestion Jobs in the UK

11 of 11 Contract Data Ingestion Jobs in the UK

AWS data engineer - Ireland

United Kingdom
LA International Computer Consultants Ltd
Systems Architecture - AWS Data Engineer

Job Description - Location: UK, will require travel to customer site (Belfast)

Job Summary: We are seeking a skilled and experienced AWS Data Engineer to join our team. The successful candidate will be responsible for implementing and managing data architecture solutions for our customers, with a strong emphasis on Cloud technologies (preferably AWS … and tooling). This role requires understanding of data modelling, database design, and data integration techniques.

Key Responsibilities:
- Data Ingestion and Extraction: Design and implement efficient data ingestion pipelines to and from databases and file storage services.
- Data Transformation and Cleaning: Transform and clean raw data to ensure data quality and consistency.
- … Data Pipelines: Build, maintain, and optimise data pipelines to automate data flows and enable real-time data processing.
- Data Quality Assurance: Monitor data quality and implement measures to ensure data accuracy and completeness.
- Database Administration: Manage and maintain databases (e.g., SQL, NoSQL) to ensure optimal performance and security.
- Cloud Infrastructure: Deploy and manage …
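As an illustration of the ingestion-and-cleaning responsibilities this listing describes, here is a minimal sketch of a pipeline stage that extracts CSV records, cleans them, and loads them into a database. It is not from the listing: the file contents, field names, and the use of SQLite as a stand-in for a managed cloud database are all assumptions for illustration.

```python
import csv
import io
import sqlite3

def clean_record(row):
    """Normalise one raw record: trim whitespace, coerce types, drop blanks."""
    name = row["name"].strip()
    if not name:
        return None  # incomplete record -> excluded, keeps the table consistent
    return {"name": name, "amount": float(row["amount"])}

def ingest(csv_text, conn):
    """Extract rows from CSV text, clean them, and load them into the DB."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (name TEXT, amount REAL)")
    rows = csv.DictReader(io.StringIO(csv_text))
    cleaned = [r for r in (clean_record(row) for row in rows) if r is not None]
    conn.executemany(
        "INSERT INTO payments (name, amount) VALUES (:name, :amount)", cleaned
    )
    return len(cleaned)

raw = "name,amount\n Alice ,10.5\n,3.0\nBob,7\n"
conn = sqlite3.connect(":memory:")
loaded = ingest(raw, conn)
```

The same extract/clean/load split applies whether the sink is SQLite, RDS, or S3; only the connector changes.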
Employment Type: Contract
Rate: £400 - £650 per day + Inside IR35
Posted:

Data Engineer - Retail banking

London, United Kingdom
Alexander Mann Solutions
The bank's expertise and services span Business Services, Corporate Banking, Wealth Management, Group Functions, Retail and Investment Banking. On behalf of this organisation, AMS is looking for a Data Engineer for a 12-month contract based in London (hybrid, 2 days in the office per week).

Purpose of the role: The Data Engineer will help drive … the build of effortless, digital-first customer experiences, simplifying the organisation by developing innovative data-driven solutions through data pipelines, modelling and ETL design, aspiring to be commercially successful through insight while keeping customers' and the organisation's data safe and secure.

What you'll do:
- Build advanced automation of data engineering pipelines through removal of … manual stages.
- Embed new data techniques through role modelling, training and experiment design oversight.
- Deliver a clear understanding of data platform costs to meet the department's cost-saving and income targets.
- Source new data using the most appropriate tooling for the situation.
- Develop solutions for streaming data ingestion and transformations in line with the client's …
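The streaming-ingestion-and-transformation responsibility mentioned above can be sketched with plain Python generators standing in for a real streaming framework (Kafka consumers, Spark Structured Streaming, etc.); the event shape and field names are invented for illustration:

```python
def transform(events):
    """Lazily clean and enrich a stream of transaction events."""
    for event in events:
        if event.get("amount") is None:
            continue  # drop malformed events rather than halt the stream
        amount = float(event["amount"])
        yield {
            "account": event["account"],
            "amount": amount,
            "flagged": amount > 1000,  # simple illustrative risk enrichment
        }

# A generator pipeline processes records one at a time, so memory use stays
# flat no matter how long the stream runs.
stream = iter([
    {"account": "A1", "amount": "12.5"},
    {"account": "A2", "amount": None},
    {"account": "A3", "amount": "2500"},
])
results = list(transform(stream))
```

In a production pipeline the same per-record transform would be registered with the streaming engine instead of driven by `list()`.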
Employment Type: Contract
Rate: GBP Annual
Posted:

Data Engineer NPPV3

London, South East, England, United Kingdom
Hybrid / WFH Options
Hays Specialist Recruitment Limited
Job Title: Data Engineer (NPPV3 + SC Clearance Required)
Location: Hybrid (UK-based)
Contract: 3 months (with potential extension/move to other projects)
Rate: £550 per day (Outside IR35)

About the Role: We are seeking an experienced Data Engineer with NPPV3 + SC clearance to join our dynamic team on a 3-month contract basis. This … role offering the flexibility of working both remotely and onsite. The contract has potential for extension or transition onto other exciting projects.

Key Responsibilities:
* Design, build, and maintain scalable data pipelines using Azure Data Factory, Databricks, and Azure Synapse.
* Collaborate with cross-functional teams to ensure data quality and integration.
* Support data ingestion, transformation, and … storage in cloud environments.
* Troubleshoot and optimise data workflows for performance and reliability.
* Maintain compliance with security protocols and clearance requirements.

Essential Skills & Experience:
* Must hold NPPV3 + SC clearance (this is a mandatory requirement).
* Proven expertise in Azure Data Factory, Databricks, and Azure Synapse Analytics.
* Strong experience in building and managing cloud-based data solutions. …
Employment Type: Contractor
Rate: £500 - £550 per day
Posted:

Databricks Engineer

London, United Kingdom
Tenth Revolution Group
Data Pipeline Development:
- Design and implement end-to-end data pipelines in Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services.
- Write efficient and standardised Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across … the pipeline.
- Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g., Apache Airflow, Azure Data Factory).

Data Ingestion & Transformation:
- Build scalable data ingestion processes to handle structured, semi-structured, and unstructured data from various sources (APIs, databases, file systems).
- Implement data transformation logic using Spark, ensuring data is cleaned, transformed, and enriched according to business requirements.
- Leverage Databricks features such as Delta Lake to manage and track changes to data, enabling better versioning and performance for incremental data loads.

Data Publishing & Integration:
- Publish clean, transformed data to Azure Data Lake or other cloud storage solutions for consumption by analytics and reporting …
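The incremental-load pattern this listing attributes to Delta Lake (upserting changed records rather than reloading everything) can be sketched in plain Python. In a real Databricks pipeline this would be a Delta `MERGE INTO` statement, but the key-matching semantics are the same; the table contents and key names here are invented:

```python
def upsert(target, changes, key="id"):
    """Merge a batch of changed rows into the target table by key.

    Rows whose key already exists are updated in place; new keys are
    inserted. This mirrors the semantics of Delta Lake's MERGE INTO.
    """
    by_key = {row[key]: row for row in target}
    for row in changes:
        by_key[row[key]] = row  # update if present, insert if not
    return sorted(by_key.values(), key=lambda r: r[key])

current = [{"id": 1, "qty": 5}, {"id": 2, "qty": 9}]
batch = [{"id": 2, "qty": 11}, {"id": 3, "qty": 4}]  # one update, one insert
merged = upsert(current, batch)
```

Because only the changed batch is processed, the cost of each load scales with the delta, not with the full table.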
Employment Type: Contract
Rate: £400 - £500/day
Posted:

Databricks Engineer

London, South East, England, United Kingdom
Tenth Revolution Group
Data Pipeline Development:
- Design and implement end-to-end data pipelines in Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services.
- Write efficient and standardised Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across … the pipeline.
- Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g., Apache Airflow, Azure Data Factory).

Data Ingestion & Transformation:
- Build scalable data ingestion processes to handle structured, semi-structured, and unstructured data from various sources (APIs, databases, file systems).
- Implement data transformation logic using Spark, ensuring data is cleaned, transformed, and enriched according to business requirements.
- Leverage Databricks features such as Delta Lake to manage and track changes to data, enabling better versioning and performance for incremental data loads.

Data Publishing & Integration:
- Publish clean, transformed data to Azure Data Lake or other cloud storage solutions for consumption by analytics and reporting …
Employment Type: Contractor
Rate: £400 - £500 per day
Posted:

ETL Developer

London, Coleman Street, United Kingdom
Deerfoot Recruitment Solutions Limited
… you will play a key role in ensuring that critical business services have the capacity and resilience to meet regulatory and operational demands. You will focus on developing robust data pipelines and solutions that support capacity monitoring, reporting, and forecasting across complex IT environments.

Role Overview:
- Develop ETL processes to extract and normalise capacity data from infrastructure sources.
- Automate data ingestion and processing into SQL Server databases.
- Normalise raw infrastructure and monitoring data into structured formats suitable for graphing and trending.
- Collaborate with DBAs and technical teams to ensure transparency and supportability.
- Build data flows that support capacity reporting and trending, using modern technologies.

Skills & Experience Required:
- 3+ years' experience in ETL/data processing roles.
- Proficiency with ETL tools, SQL Server, RESTful APIs.
- Experience working with infrastructure data (e.g., Nutanix, VMware, Dell EMC desirable).
- Scripting experience in Python, SQL (T-SQL), Excel VBA.
- Knowledge of data optimisation, large datasets, JSON/XML formats.
- Experience of setting up SQL DTSS and Control-M batch jobs useful.
- Excellent communication, analytical …
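The "normalise raw monitoring data" step this listing describes might look like the following sketch, which flattens nested JSON from a hypothetical monitoring API into flat rows ready for loading into SQL Server; the payload structure and field names are invented for illustration:

```python
import json

def normalise(payload_text):
    """Flatten nested per-host capacity JSON into flat rows for a SQL table."""
    payload = json.loads(payload_text)
    rows = []
    for host in payload["hosts"]:
        for metric, value in host["metrics"].items():
            rows.append({
                "host": host["name"],
                "metric": metric,
                "value": float(value),  # one consistent type for trending queries
            })
    return rows

# Example payload of the kind a monitoring API might return (invented).
raw = json.dumps({
    "hosts": [
        {"name": "esx01", "metrics": {"cpu_pct": 71, "mem_pct": "88.5"}},
        {"name": "esx02", "metrics": {"cpu_pct": 34}},
    ]
})
rows = normalise(raw)
```

One row per host/metric pair is the shape that graphing and trending queries want: grouping and time-series charts fall out of plain `GROUP BY host, metric`.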
Employment Type: Contract
Posted:

GCP Data Engineer (Java, Spark, ETL)

London, United Kingdom
Hybrid / WFH Options
Staffworx Limited
GCP Data Engineer (Java, Spark, ETL) - Future Talent Pool. GCP Data Engineer, London, hybrid role; new workstreams on a digital Google Cloud transformation programme.

Skills and experience:
- Proficiency in programming languages such as Python and Java; PySpark and Java to develop ETL processes for data ingestion and preparation; Spark SQL.
- Google Cloud Platform services: Cloud Run, Dataflow, Cloud Storage, BigQuery, Data Studio.
- Unix/Linux platforms; version control tools (Git, GitHub); automated deployment tools.
- Pub/Sub, BigQuery Streaming and related technologies; deep understanding of real-time data processing and event-driven architectures.
- Familiarity with data orchestration tools, e.g. Google Cloud Platform Cloud Composer.
- Google Cloud Platform certification(s) is a strong advantage.
- Develop, implement, and optimise real-time data processing workflows using Google Cloud Platform services such as Dataflow, Pub/Sub, and BigQuery Streaming.

6 months initial, likely long-term extensions. This advert was posted by Staffworx Limited - a UK-based recruitment consultancy supporting the global e-commerce, software and consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment …
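The core of the real-time workflows this listing describes (Dataflow reading from Pub/Sub) is windowed aggregation over an event stream. Here is a minimal, framework-free sketch of tumbling-window counts per key; the event shape and window size are assumptions, and a real Dataflow/Beam job would add watermarks and late-data handling:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group events into fixed (tumbling) windows and count per key.

    Equivalent in spirit to a Beam FixedWindows + per-key Count transform,
    minus watermarks and late-data handling.
    """
    counts = defaultdict(int)
    for event in events:
        # Floor the timestamp to the start of its window.
        window_start = (event["ts"] // window_secs) * window_secs
        counts[(window_start, event["key"])] += 1
    return dict(counts)

events = [
    {"ts": 5, "key": "click"},
    {"ts": 59, "key": "click"},
    {"ts": 61, "key": "click"},   # falls into the next 60s window
    {"ts": 62, "key": "view"},
]
counts = tumbling_window_counts(events)
```

Keying results by `(window_start, key)` is what lets downstream sinks such as BigQuery append one aggregate row per window.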
Employment Type: Contract, Work From Home
Posted:

Technical Programme Manager

London, South East, England, United Kingdom
Harnham - Data & Analytics Recruitment
TECHNICAL PROGRAMME MANAGER - DATA INGESTION (PHARMA/SNOWFLAKE)
UP TO £560 PER DAY
HYBRID (1-2 DAYS PER WEEK IN SPAIN & GERMANY)
6 MONTHS

THE COMPANY: A global data and analytics consultancy is delivering a large-scale data ingestion programme for a leading pharmaceutical client.

THE ROLE: As a Technical Programme Manager, your key responsibilities will include:
- Leading client engagement and delivery governance for the data ingestion workstream.
- Acting as the bridge between offshore engineering teams (India) and European stakeholders (Germany/Spain).
- Overseeing ingestion of 500+ data sets into the client's Snowflake environment.
- Driving delivery oversight, issue/risk management, and programme direction.

YOUR SKILLS AND EXPERIENCE: The successful Technical Programme Manager will have:
- A strong programme or project management background in data engineering or data platform environments.
- Experience working with Snowflake, AWS (S3, Glue), DBT, SnapLogic, and PySpark.
- A background in the pharmaceutical industry.
- Proven success in managing distributed/offshore teams and client-side stakeholders.
- Excellent communication and stakeholder management skills, with an assertive …
Employment Type: Contractor
Rate: £500 - £560 per day
Posted:

Junior DAT Developer

London, South East, England, United Kingdom
FDM Group
… role that will be based in London. Our client is seeking a detail-oriented and dedicated DAT Developer to support day-to-day client requests within a multi-tenant, data-driven, cloud-based reporting platform. In this role, you'll gather user requirements, adapt data ingestion methods to meet client needs, investigate data issues, and contribute … work closely with a collaborative team to ensure the system remains scalable and robust. The ideal candidate will have hands-on experience with Microsoft SQL Server development, ETL processes, data modelling, and version control systems. Experience in the financial services industry is a plus.

- Develop, implement, and optimise database objects to meet client-specific and system-wide requirements.
- Collaborate … quality support and solutions for clients.
- Create clear and detailed documentation and reports for both internal and client use.
- Develop and maintain procedures and scripts for release controls and data migrations.
- Provide ongoing data management and general product support to clients.
- Perform quality assurance testing to ensure data integrity and system reliability.

About You: Minimum of …
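The quality-assurance-testing duty in this listing can be illustrated with a small data-integrity check of the kind run before releasing a reporting dataset. SQLite stands in for SQL Server, and the table and column names are invented:

```python
import sqlite3

def integrity_report(conn, table, key_col, required_cols):
    """Return basic data-quality findings: duplicate keys and NULLs in required columns."""
    report = {}
    dup_sql = f"SELECT COUNT(*) - COUNT(DISTINCT {key_col}) FROM {table}"
    report["duplicate_keys"] = conn.execute(dup_sql).fetchone()[0]
    for col in required_cols:
        null_sql = f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        report[f"null_{col}"] = conn.execute(null_sql).fetchone()[0]
    return report

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INTEGER, client TEXT, notional REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", [
    (1, "Acme", 100.0),
    (1, "Acme", 100.0),   # duplicate key -> should be flagged
    (2, None, 250.0),     # missing required client -> should be flagged
])
report = integrity_report(conn, "trades", "trade_id", ["client", "notional"])
```

A report of all zeros becomes a cheap release gate; any non-zero count blocks the data migration until investigated.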
Employment Type: Contractor
Rate: Competitive salary
Posted:

Technical Programme Manager

London, South East, England, United Kingdom
Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… with potential 3-month extensions and permanent option)
Day Rate: £500-560pd (Outside IR35)
Start Date: Immediate

Overview: You'll play a pivotal role in ensuring the successful ingestion of 500+ data sets into Snowflake, coordinating sprint-based delivery, managing cross-border teams, and maintaining high-quality communication with senior stakeholders across Data, IT, Compliance, and Change.

Key Responsibilities:
- Oversee phased ingestion of 500+ data sources into the client's Snowflake environment.
- Liaise between offshore engineering teams (based in Bangalore) and EU client stakeholders.
- Monitor delivery timelines, report progress, flag risks, and ensure quality governance.
- Work across multi-disciplinary teams including data, compliance, change, and technology.
- Maintain momentum in delivery while ensuring stakeholder expectations are managed.
- Support issue identification and help shape future data opportunities within the platform.

What Success Looks Like:
- Timely and accurate delivery of phased ingestion milestones.
- Strong stakeholder alignment and minimal delivery blockers.
- High-quality programme oversight across offshore/onshore boundaries.
- Ability to push back where necessary and steer project direction assertively.
- Foundation laid for long …
Employment Type: Contractor
Rate: £500 - £560 per day
Posted:

Solutions Architect

London, United Kingdom
Staffworx Limited
… the Braze platform, ensuring it aligns with business objectives and integrates seamlessly with our existing marketing technology stack.
- End-to-end implementation of Braze, including SDK integration, API connections, data ingestion, and the configuration of campaigns, Canvases, and Content Cards.
- Serve as the primary technical consultant for all things Braze, providing strategic advice and best-practice recommendations to marketing, product, and engineering teams.
- Collaborate with stakeholders to understand their needs and translate them into technical requirements and actionable use cases within Braze.
- Work closely with data and engineering teams to design and manage data flows, ensuring the right data is available in Braze to power personalised customer journeys.

Essential:
- Hands-on experience with the Braze … architecting and leading at least two full-cycle Braze implementations.
- Proven experience with Docker, Kubernetes, Terraform, or similar IaC technologies.
- Experience with MongoDB, Redis, Kafka, Postgres, or similar data technologies.
- Experience with data integration, including ETL processes and customer data platforms (CDPs).
- Comprehensive awareness of various marketing technology platforms like CRM, DAM, dynamic creative optimization …
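As a sketch of the API-connection and data-ingestion work this listing describes: Braze ingests user data through its REST `/users/track` endpoint, which accepts batches of attribute and event objects. The example below only builds the request body and sends nothing; the payload shape follows Braze's public API docs, while the user ID, attribute names, and event name are invented:

```python
import json
from datetime import datetime, timezone

def build_users_track_payload(user_id, attributes, event_name, event_time):
    """Build a /users/track request body: one attributes object, one custom event."""
    return {
        "attributes": [{"external_id": user_id, **attributes}],
        "events": [{
            "external_id": user_id,
            "name": event_name,
            "time": event_time.isoformat(),  # Braze expects ISO 8601 datetimes
        }],
    }

payload = build_users_track_payload(
    user_id="user-123",                      # hypothetical external ID
    attributes={"first_name": "Ada", "email_subscribe": "opted_in"},
    event_name="completed_onboarding",       # hypothetical custom event
    event_time=datetime(2024, 5, 1, 12, 0, tzinfo=timezone.utc),
)
body = json.dumps(payload)  # this string would be POSTed to /users/track
```

Keeping payload construction as a pure function makes it unit-testable without touching the network, which matters when the same ingestion path feeds live customer journeys.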
Employment Type: Contract
Posted:
Data Ingestion day-rate percentiles:
- 10th percentile: £400
- 25th percentile: £450
- Median: £500
- 75th percentile: £598
- 90th percentile: £684