Contract Data Ingestion Jobs

1 to 5 of 5 Contract Data Ingestion Jobs

Data Engineer (Python and Azure) - Remote - £500 per day - Inside IR35

Nationwide, United Kingdom
Hybrid / WFH Options
Exalto Consulting
Data Engineer - Python - Azure - Inside IR35 - £500 per day - 6 months. Exalto Consulting are currently recruiting a contract Data Engineer for a client: 100% remote working, inside IR35, paying £500 per day, initially for 6 months. Skills required for the role: big data experience; data inventory and data familiarisation; efficient data ingestion and ingestion pipelines; data cleaning and transformation; Databricks (ideally with Unity Catalog); Python and PySpark; CI/CD (ideally with Azure DevOps); unit testing (PyTest). A short PySpark sketch of this kind of ingestion and cleaning step follows this listing. If you have the above experience and are looking for a new contract role, please send your CV for immediate consideration, as our client is looking to hire ASAP.
Employment Type: Contract
Rate: £500 - £510/day
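
To make the ingestion and cleaning work above concrete, here is a minimal, hedged PySpark sketch: one cleaning step for a raw feed written to a Unity Catalog table. The file path, column names, and catalog.schema.table name are illustrative assumptions, not details taken from the ad.

# clean_and_load.py -- illustrative only; the path, columns, and table name are made up.
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F


def clean_customer_feed(raw: DataFrame) -> DataFrame:
    """Drop duplicate customers, discard rows without an id, normalise emails."""
    return (
        raw.dropDuplicates(["customer_id"])
           .filter(F.col("customer_id").isNotNull())
           .withColumn("email", F.lower(F.trim(F.col("email"))))
           .withColumn("loaded_at", F.current_timestamp())
    )


if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    raw = spark.read.option("header", "true").csv("/mnt/landing/customers/")
    # Unity Catalog uses three-part names (catalog.schema.table); this one is hypothetical.
    clean_customer_feed(raw).write.mode("append").saveAsTable("main.bronze.customers")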

Data Engineer

City of London, Cordwainer, United Kingdom
Xpertise Recruitment
Azure Data Engineer. Contract. Remote. Python/PyTest/Azure/Databricks Notebooks. Our client, a leading consultancy, is looking to bring in an Azure Data Engineer to support their data transformation and wondered if you could be open to a chat. If so, please find more … contract. Location: remote, with very occasional travel to any of their offices across the UK (Scotland, Manchester, Midlands, London, Bristol and others). Experience required: data inventory and data familiarisation; efficient data ingestion and ingestion pipelines; data cleaning and transformation; Databricks (ideally with Unity Catalog) … A PyTest sketch for this kind of pipeline code follows this listing.
Employment Type: Contract
Rate: £500 - £501/day
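
Both Azure listings above name unit testing with PyTest. As a rough sketch, assuming the cleaning function from the previous listing lives in a hypothetical ingestion.cleaning module, a test could look like this:

# test_cleaning.py -- a sketch; the module path and sample data are hypothetical.
import pytest
from pyspark.sql import SparkSession

from ingestion.cleaning import clean_customer_feed  # hypothetical module


@pytest.fixture(scope="session")
def spark():
    # A local, single-threaded session keeps the test self-contained.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_duplicates_and_null_ids_are_removed(spark):
    raw = spark.createDataFrame(
        [("1", " A@Example.com "), ("1", " A@Example.com "), (None, "b@example.com")],
        ["customer_id", "email"],
    )
    cleaned = clean_customer_feed(raw)
    assert cleaned.count() == 1
    assert cleaned.first()["email"] == "a@example.com"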

Senior Golang Developer

City, London, United Kingdom
Talent International
… London office on a hybrid basis. As a Contract Senior Golang Developer, you will play a crucial role in assessing, analysing, and enhancing our data ingestion, transformation, and storage layers. Your primary focus will be on Go programming and DBT, with secondary skills in Google Cloud services. The successful candidate will bring 5 to 10 years of experience in developing robust data pipelines, conducting unit testing, and providing support for production deployments. Key responsibilities: assess and analyse the data ingestion, transformation, and storage layers, including the RAW, Exploratory, Curated, and application layers; understand the existing codebase and make alterations or fixes as per design requirements; develop and optimize data ingestion pipelines using Go and DBT; conduct unit testing and validation of the developed code to ensure reliability and performance; collaborate with cross-functional teams to support production deployments and resolve any issues that may arise. A small DBT-orchestration sketch follows this listing.
Employment Type: Contract
Rate: GBP Daily
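
The responsibilities above centre on DBT-built layers (RAW, Exploratory, Curated). As a rough illustration, kept in Python for consistency with the rest of this page even though the role itself calls for Go, dbt-core's programmatic invocation (available from dbt 1.5) could drive a layer-by-layer build; the tag selectors below are assumptions, not names from the listing.

# build_layers.py -- a sketch using dbt-core's programmatic invocation; tags are hypothetical.
from dbt.cli.main import dbtRunner, dbtRunnerResult


def build_selection(select: str) -> None:
    """Run `dbt build` for one selection and fail loudly if it does not succeed."""
    res: dbtRunnerResult = dbtRunner().invoke(["build", "--select", select])
    if not res.success:
        raise RuntimeError(f"dbt build failed for selection '{select}'")


if __name__ == "__main__":
    # Promote data through the layers named in the ad: raw -> exploratory -> curated.
    for layer in ("tag:raw", "tag:exploratory", "tag:curated"):
        build_selection(layer)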

Data Engineer

London, United Kingdom
Movement8 Ltd
Azure Data Engineer - Contract - Inside IR35. This is an Azure Data Engineer position with a financial services end client, helping them deliver and implement new data pipelines. You will need prior financial services experience as well as strong Azure Data Factory and Databricks experience. Work involved: implementing and delivering data pipelines; providing mentorship to junior members of the team; working with the Platform team to define capabilities for the lake framework; developing business conceptual data models; producing new frameworks using design patterns and data ingestion (a sketch of one such framework pattern follows this listing). Requirements: strong data pipeline experience; Databricks; Azure Data Factory; Spark; Azure cloud; financial services. If you're interested, get in touch ASAP.
Employment Type: Contract
Rate: £500 - £550/day
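
The listing above mentions producing new frameworks using design patterns for data ingestion. One common shape for that, sketched here with hypothetical source and table names, is a template-method base class so that each new feed only supplies its own read and transform steps:

# ingestion_framework.py -- a minimal sketch; sources and table names are made up.
from abc import ABC, abstractmethod

from pyspark.sql import DataFrame, SparkSession


class IngestionJob(ABC):
    """Template method: read -> transform -> write; new feeds override only the gaps."""

    def __init__(self, spark: SparkSession, target_table: str):
        self.spark = spark
        self.target_table = target_table

    @abstractmethod
    def read(self) -> DataFrame:
        ...

    def transform(self, df: DataFrame) -> DataFrame:
        # Optional hook; the default is a pass-through.
        return df

    def write(self, df: DataFrame) -> None:
        df.write.mode("append").saveAsTable(self.target_table)

    def run(self) -> None:
        self.write(self.transform(self.read()))


class TradesCsvJob(IngestionJob):
    def read(self) -> DataFrame:
        return self.spark.read.option("header", "true").csv("/mnt/landing/trades/")


if __name__ == "__main__":
    TradesCsvJob(SparkSession.builder.getOrCreate(), "lake.bronze.trades").run()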

Kafka Admin (Poland-based)

Warsaw, Poland
Brabers Consultancy
… maintenance of our Kafka ecosystem, ensuring its scalability, reliability, and performance to meet the evolving needs of our organization. You will collaborate closely with data engineers, software developers, and other stakeholders to architect robust solutions and streamline data pipelines. Your key responsibilities will include: installing, configuring, and managing … health, performance metrics, and throughput, and proactively identifying and addressing potential bottlenecks or issues; implementing security measures, access controls, and encryption protocols to safeguard data privacy and integrity within the Kafka ecosystem; managing Kafka topics, partitions, replication, and consumer groups, and optimizing configurations for efficient resource utilization and high availability; collaborating with cross-functional teams to design and implement data ingestion pipelines, real-time processing workflows, and event-driven architectures; performing capacity planning, scaling, and disaster recovery planning to ensure scalability, fault tolerance, and business continuity; automating routine tasks such as cluster provisioning, deployment, monitoring, and alerting … A short topic-administration sketch follows this listing.
Employment Type: Contract
Rate: GBP Annual
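
For the topic, partition, and replication management this role describes, a minimal sketch using the confluent-kafka Python client's AdminClient might look like the following; the broker address, topic name, and settings are illustrative assumptions, not details from the ad.

# manage_topics.py -- illustrative only; broker, topic, and config values are made up.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})

# Create a topic with an explicit partition count and replication factor.
new_topic = NewTopic(
    "events.ingest.raw",
    num_partitions=6,
    replication_factor=3,
    config={"retention.ms": "604800000"},  # 7 days
)
for topic, future in admin.create_topics([new_topic]).items():
    try:
        future.result()  # raises if the broker rejected the request
        print(f"created {topic}")
    except Exception as exc:
        print(f"failed to create {topic}: {exc}")

# A basic health check: pull cluster metadata and count brokers and topics.
metadata = admin.list_topics(timeout=10)
print(f"brokers: {len(metadata.brokers)}, topics: {len(metadata.topics)}")
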
Data Ingestion contract rates (per day):
10th percentile: £400
25th percentile: £425
Median: £525
75th percentile: £613
90th percentile: £732