Data Engineer - Python - Azure - Inside IR35 - £500 per day - 6 months

Exalto Consulting are currently recruiting a contract Data Engineer for a client. 100% remote working, inside IR35, paying £500 per day, initially 6 months.

Skills required for the role:
- Big data experience
- Data inventory … and data familiarisation
- Efficient data ingestion and ingestion pipelines
- Data cleaning and transformation
- Databricks (ideally with Unity Catalog)
- Python and PySpark
- CI/CD (ideally with Azure DevOps)
- Unit testing (PyTest)

If you have the above experience and are looking for a new contract role … please send your CV for immediate consideration, as our client is looking to hire ASAP.
Azure Data Engineer. Contract. Remote. Python/PyTest/Azure/Databricks Notebooks

Our client, a leading consultancy, is looking to bring in an Azure Data Engineer to support their data transformation and wondered if you could be open to a chat? If so, please find more … contract. Location: Remote with very occasional travel to any of their offices across the UK (Scotland, Manchester, Midlands, London, Bristol + others).

Experience required:
- Data inventory and data familiarisation
- Efficient data ingestion and ingestion pipelines
- Data cleaning and transformation
- Databricks (ideally with Unity Catalog)
London office on a hybrid basis. As a Contract Senior Golang Developer, you will play a crucial role in assessing, analysing, and enhancing our data ingestion, transformation, and storage layers. Your primary focus will be Go programming and DBT, with secondary skills in Google Cloud services. The … successful candidate will bring 5 to 10 years of experience in developing robust data pipelines, conducting unit testing, and supporting production deployments.

Key Responsibilities:
- Assess and analyse data ingestion, transformation, and storage layers, including the RAW, Exploratory, Curated, and application layers.
- Understand the existing codebase and make … alterations or fixes as per design requirements.
- Develop and optimize data ingestion pipelines using Go and DBT.
- Conduct unit testing and validation of the developed code to ensure reliability and performance.
- Collaborate with cross-functional teams to support production deployments and resolve any issues that may arise.
Azure Data Engineer - Contract - Inside IR35

This is an Azure Data Engineer position with a financial services end client, helping them deliver and implement new data pipelines. You will need prior financial services experience, as well as strong Azure Data Factory and Databricks experience. …

Work involved:
- Implementing and delivering data pipelines
- Providing mentorship to junior members of the team
- Working with the Platform team to define capabilities for the lake framework
- Developing business conceptual data models
- Producing new frameworks using design patterns and data ingestion

Requirements:
- Strong data pipeline experience …
- Databricks
- Azure Data Factory
- Spark
- Azure cloud
- Financial services

If you're interested, get in touch ASAP.
maintenance of our Kafka ecosystem, ensuring its scalability, reliability, and performance to meet the evolving needs of our organization. You will collaborate closely with data engineers, software developers, and other stakeholders to architect robust solutions and streamline data pipelines. Your key responsibilities will include:
- Installing, configuring, and managing … health, performance metrics, and throughput, and proactively identifying and addressing potential bottlenecks or issues.
- Implementing security measures, access controls, and encryption protocols to safeguard data privacy and integrity within the Kafka ecosystem.
- Managing Kafka topics, partitions, replication, and consumer groups, and optimizing configurations for efficient resource utilization and high availability.
- Collaborating with cross-functional teams to design and implement data ingestion pipelines, real-time processing workflows, and event-driven architectures.
- Performing capacity planning, scaling, and disaster recovery planning to ensure scalability, fault tolerance, and business continuity.
- Automating routine tasks, such as cluster provisioning, deployment, monitoring, and alerting