services. Knowledge of using PySpark in notebooks for data analysis and manipulation. Strong proficiency with SQL and data modelling. Experience with modern ELT/ETL tools within the Microsoft ecosystem. Solid understanding of data lake and lakehouse architectures. Hands-on experience with Power BI for data integration and visualisation. Familiarity …
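To make the PySpark-in-notebooks skill this listing asks for concrete, here is a minimal sketch of the kind of exploratory analysis involved; the file path and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# In Databricks/Synapse notebooks a `spark` session is already provided;
# this line is only needed when running standalone.
spark = SparkSession.builder.appName("exploration").getOrCreate()

# Hypothetical sales extract; path and schema are placeholders.
orders = spark.read.option("header", True).csv("/data/orders.csv")

# Typical notebook-style profiling: revenue and order counts per region.
summary = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
    .orderBy(F.desc("revenue"))
)
summary.show()
```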
and deliver sustainable solutions. Monitor and troubleshoot data pipeline issues to maintain data integrity and accuracy. Assist in the development, maintenance, and optimization of ETL (Extract, Transform, Load) processes for efficiency and reliability. Project & Improvement: Assist in gathering, documenting, and managing data engineering requirements and workflows. Contribute to the development … reviews of designs, prototypes, and other work products to ensure requirements are met. Skills & Experience: Essential: Basic understanding of data engineering concepts, such as ETL processes, data pipelines, and data quality management. Hands-on experience with SQL (e.g., writing queries, basic database management). Familiarity with data tools and platforms …
IF YOU ARE Experienced with Python/PySpark Proficient working with Databricks Lakehouse architecture and principles Having 2+ years of experience designing data models, building ETL pipelines, and wrangling data to solve business problems Experienced with Azure cloud technologies (modern data estate) such as Azure Data Factory, Azure DevOps, Azure Synapse …
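As a hedged illustration of the Databricks Lakehouse pattern this listing references, the sketch below promotes raw (bronze) data to a cleaned (silver) Delta table; the table and column names are hypothetical, and the `spark` session is assumed to be the one Databricks provides:

```python
from pyspark.sql import functions as F

# Read a hypothetical bronze (raw) Delta table.
bronze = spark.read.table("bronze.customer_events")

# Basic cleansing: drop duplicates, standardise a timestamp, filter bad rows.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_id").isNotNull())
)

# Persist as a managed Delta table for downstream consumers.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.customer_events")
```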
collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE A technology professional focused on Data Warehouses, ETL, and BI solutions development Experienced in eliciting business requirements to address the customer’s data visualization needs Ready to dive into a customer's subject …
public sector. Required education None Preferred education Bachelor's Degree Required technical and professional expertise Responsibilities: Design, build, and maintain scalable data pipelines and ETL processes. Collaborate with data scientists and engineers to integrate complex data systems. Ensure data quality, accuracy, and reliability through testing and validation procedures. Develop and …
Leicester, Midlands, United Kingdom Hybrid / WFH Options
Ingentive
On a daily basis your varied role will include, but will not be limited to: Design, build, and optimize high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks or Microsoft Fabric. Implement scalable solutions to ingest, store, and transform vast datasets, ensuring data availability and …
reporting development (advanced/expert levels only). Key Tools & Technologies Visualization Tools: Tibco Spotfire, Tableau, Power BI, or QlikView Data Management & Querying: SQL, ETL pipelines, Data Warehousing (e.g., ODW) Scripting/Programming: IronPython, R, R Shiny, SAS Collaboration & Platforms: SharePoint, clinical trial data platforms Qualifications Entry Level: Bachelor …
to offices and client sites Employment: Permanent, Full-time What You'll Actually Be Doing: Designing, developing, and deploying robust Azure data solutions including ETL pipelines, data warehouses, and real-time analytics. No fluff, just solid engineering. Turning complex client requirements into clear, scalable Azure solutions alongside experienced Architects and …
practices, and best practices for data modeling, warehousing, and interoperability. Lead the evaluation and implementation of data tools, platforms, and technologies (e.g., cloud services, ETL tools, data lakes). Ensure data security, compliance, and privacy policies are embedded in all architecture designs. Provide technical guidance and mentorship to data engineers …
meet current best practices and internal standards. Work closely with project managers and technical leads to integrate new enterprise data sources into ongoing projects. ETL Development: Develop robust, automated ETL (Extract, Transform, Load) pipelines using industry-standard tools and frameworks, prioritizing scalability, reliability, and fault tolerance. Essential Skills & Experience: Strong … ESRI, 3GIS, Bentley, Hexagon, Crescent Link, CadTel, etc.). Experience with business requirement analysis and the development of reporting and analytics structures. Familiarity with ETL solutions, including experience with SAFE FME, is highly desirable. Strong knowledge of data privacy regulations and practices. Exposure to analytics and reporting tools is considered …
in: Azure Data & AI services (e.g., Azure Machine Learning, Azure OpenAI, Cognitive Services, Synapse) Programming with Python for data and AI workloads Data pipelines, ETL/ELT processes, and analytics foundations App development techniques to integrate AI capabilities Working in secure, enterprise-ready cloud environments Consulting fundamentals and effective customer …
business/data analyst role, ideally in a consultancy or commercial setting. - Strong analytical, problem-solving, and communication skills. - Experience with operational data processes, ETL, data warehouse migration, schema mapping, and MI/BI reporting. - Proficient in tools such as JIRA, Confluence, Asana, Miro, and Excel. - Familiarity with Agile (SCRUM …
shape something from the ground up — this is for you. What you’ll do: Design and build a cloud-native data warehouse Develop scalable ETL/ELT pipelines and dimensional models (Kimball, Data Vault, etc.) Integrate multiple data sources (cloud & on-prem) Ensure high data quality, performance and reliability Collaborate …
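For context on the Kimball-style dimensional modelling this listing mentions, here is a minimal PySpark sketch that assigns surrogate keys to a dimension and joins them onto a fact table; all table and column names are illustrative, and a provided `spark` session is assumed:

```python
from pyspark.sql import functions as F, Window

# Hypothetical cleaned source table.
orders = spark.read.table("silver.orders")

# Build a customer dimension with a surrogate key (Kimball style).
# Note: an unpartitioned window is fine for a small dimension but
# collapses to a single partition on large data.
dim_customer = (
    orders.select("customer_id", "customer_name", "country")
    .dropDuplicates(["customer_id"])
    .withColumn("customer_sk", F.row_number().over(Window.orderBy("customer_id")))
)

# The fact table references the dimension via the surrogate key,
# not the natural key.
fact_orders = (
    orders.join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
    .select("customer_sk", "order_id", "order_date", "amount")
)

dim_customer.write.mode("overwrite").saveAsTable("gold.dim_customer")
fact_orders.write.mode("overwrite").saveAsTable("gold.fact_orders")
```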
data sets, produce dashboards and drive actionable insights. SQL Development: Write and optimise complex Microsoft SQL Server queries for data extraction, transformation and loading (ETL). Data Governance: Implement master data management and governance policies to maintain data quality, compliance and lineage. Stakeholder Management: Communicate effectively with project managers and …
Leicester, Midlands, United Kingdom Hybrid / WFH Options
Ocho
of opportunities for ownership and innovation. Key Responsibilities Design and deploy cloud-based data platforms using Snowflake, DBT, and related tools Develop performant, scalable ETL/ELT pipelines across varied data sources Build and maintain dimensional models using Kimball, Data Vault, or Data Mesh methodologies Collaborate with cross-functional teams …
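As a sketch of the Snowflake side of this stack (dbt models themselves are SQL, so this example uses Snowflake's Python connector instead; the credentials and object names are placeholders):

```python
import snowflake.connector

# Placeholder credentials; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="STAGING",
)

try:
    cur = conn.cursor()
    # ELT pattern: raw data is already loaded, so the transform
    # runs inside the warehouse itself.
    cur.execute("""
        CREATE OR REPLACE TABLE STAGING.DAILY_ORDERS AS
        SELECT order_id, customer_id, order_date, amount
        FROM RAW.ORDERS
        WHERE order_date = CURRENT_DATE()
    """)
finally:
    conn.close()
```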
Leicester, Midlands, United Kingdom Hybrid / WFH Options
Bison Global Technology Search
needs and business strategies. Cloud Architecture: Design and implement scalable, cost-effective architectures across AWS, Azure, or GCP environments. Data Engineering: Develop and manage ETL/ELT pipelines, data integration workflows, and implement CDC and delta load design for effective data management. SAP Tools: Lead SAP BW to SAP Datasphere …
Leicester, Midlands, United Kingdom Hybrid / WFH Options
Realtime Recruitment
Knowledge (Snowflake, PostgreSQL, MSSQL, etc.). Scripting languages (Python, PowerShell). Data quality management and processes for resolving related issues. Cloud Data Warehousing solutions and ELT/ETL solutions (e.g., Snowflake, DBT). Experience in working within an Agile environment. Experience with Automation, including Unit and Integration Testing. Knowledge of Cloud Concepts. For …
Leicester, Midlands, United Kingdom Hybrid / WFH Options
Harnham
and dynamic team. Responsibilities: Collaborate with various squads within the data team on project-based work. Develop and optimize data models, warehouse solutions, and ETL processes. Work with Scala, Spark, and Java to handle large-scale data processing. Contribute to manual Databricks-like data processing solutions. Requirements: Minimum of …
Exposure to multi-platform integration (MS tools preferred). On-premise SQL environments, legacy SSIS, SSRS and other SQL-related technologies employed in complex ETL or ELT patterns Synapse Link for Dataverse Dataverse, Data Flows, Cloud Flows, DAX and Power Platform implementations Synchronisation methods for Synapse and Fabric from D365 …
experience writing technical documentation Comfortable working with Agile, TOGAF, or similar frameworks Desirable: Experience with Python and data libraries (Pandas, Scikit-learn) Knowledge of ETL tools (Airflow, Talend, NiFi) Familiarity with analytics platforms (SAS, Posit) Prior work in high-performance or large-scale data environments Why Join? This is more …
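Since this listing names Airflow among the desirable ETL tools, here is a minimal DAG sketch showing how an ETL job is typically scheduled; the DAG name and task logic are placeholders:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull data from a source system.
    print("extracting")

def load():
    # Placeholder: write transformed data to the warehouse.
    print("loading")

with DAG(
    dag_id="daily_etl",             # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",              # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task       # run extract before load
```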
Leicester, Midlands, United Kingdom Hybrid / WFH Options
Bright Purple
and working with Azure (Azure DevOps, Azure ML, Azure SQL, ADF etc.) Able to work with databases like SQL Server, NoSQL, etc. Experience with ETL processes, data pipelines, DevOps This role is hybrid, with a requirement to be in one of their UK offices in either Scotland or England on occasion …
Leicester, Midlands, United Kingdom Hybrid / WFH Options
Eutopia Solutions
delivery approach within a DevOps environment Extensive experience in creating technical specifications and code for data migration Proven expertise in extracting, transforming, and loading (ETL) financial data during migration from on-premise systems to the cloud Demonstrated ability to track, report, and improve data migration quality metrics It would be …
an agile framework preferable, including defining functional and non-functional requirements and sprint tasks. Understanding of data engineering, some experience with building production-grade ETL pipelines, as well as backend web development, backend-for-frontend, GraphQL, and FastAPI. Strong communication skills, able to communicate with both technical and commercial people. …
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
Accountabilities: Ensure the existing design, development, and expansion of a near real-time data platform integrating AWS, Databricks and on-prem JDE systems. Develop ETL processes to integrate and transform data from multiple sources into a centralised data platform. Develop and optimise complex queries and data pipelines. Optimise and manage … data requirements. Design and build interactive and insightful dashboards and reports for internal and external stakeholders. Develop and maintain comprehensive documentation for data models, ETL processes, and BI solutions. Ensure data accuracy, integrity, and consistency across the data platform. Knowledge, Skills and Experience: Essential: Strong expertise in Databricks and … analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience with ETL/ELT processes for integrating diverse data sources. Experience in gathering, documenting, and refining requirements from key business stakeholders to align BI solutions with business …
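To illustrate the Delta Lake optimisation skill this listing names, here is a minimal sketch run from PySpark; the table and column are hypothetical, and `OPTIMIZE ... ZORDER BY` is Databricks-specific SQL:

```python
# Compact small files and co-locate rows that share customer_id,
# which speeds up selective reads on that column (Databricks-specific).
spark.sql("OPTIMIZE silver.customer_events ZORDER BY (customer_id)")

# Remove data files no longer referenced by the table
# (the default retention window applies).
spark.sql("VACUUM silver.customer_events")
```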
They utilize tools such as Informatica, Ab Initio software, and DataStage (formerly Ascential) - IBM's WebSphere Data Integration Suite. This role involves creating and implementing Extract, Transform, and Load (ETL) processes, ensuring the seamless flow of data throughout the business intelligence solution's lifecycle. Required education None Preferred education Bachelor …