services. Experience using PySpark in notebooks for data analysis and manipulation. Strong proficiency with SQL and data modelling. Experience with modern ELT/ETL tools within the Microsoft ecosystem. Solid understanding of data lake and lakehouse architectures. Hands-on experience with Power BI for data integration and visualisation. Familiarity …
IF YOU ARE
- Experienced with Python/PySpark
- Proficient working with Databricks Lakehouse architecture and principles
- Experienced (2+ years) in designing data models, building ETL pipelines, and wrangling data to solve business problems
- Experienced with Azure cloud technologies in a Modern Data Estate, such as Azure Data Factory, Azure DevOps, Azure Synapse …
collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE
- A technology professional focused on Data Warehouse, ETL, and BI solution development
- Experienced in eliciting business requirements to address the customer's data visualization needs
- Ready to dive into a customer's subject …
leicester, midlands, United Kingdom Hybrid / WFH Options
Intec Select
and deliver sustainable solutions. Monitor and troubleshoot data pipeline issues to maintain data integrity and accuracy. Assist in the development, maintenance, and optimization of ETL (Extract, Transform, Load) processes for efficiency and reliability. Project & Improvement: Assist in gathering, documenting, and managing data engineering requirements and workflows. Contribute to the development … reviews of designs, prototypes, and other work products to ensure requirements are met. Skills & Experience: Essential: Basic understanding of data engineering concepts, such as ETL processes, data pipelines, and data quality management. Hands-on experience with SQL (e.g., writing queries, basic database management). Familiarity with data tools and platforms …
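The Extract, Transform, Load pattern referenced throughout these listings can be sketched in a few lines of plain Python. This is a minimal illustration only; the table, field, and function names are hypothetical, and real pipelines would use a proper orchestration tool:

```python
import sqlite3

def extract(rows):
    # Stand-in for reading from a source system (API, file, database)
    return rows

def transform(rows):
    # Basic data-quality rule: drop records missing an id, normalise names
    return [(r["id"], r["name"].strip().title()) for r in rows if r.get("id") is not None]

def load(conn, rows):
    # Idempotent load into the target table
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
source = [{"id": 1, "name": "  alice "}, {"id": None, "name": "bad row"}, {"id": 2, "name": "BOB"}]
load(conn, transform(extract(source)))
print(conn.execute("SELECT * FROM customers ORDER BY id").fetchall())
# → [(1, 'Alice'), (2, 'Bob')]
```

The row with a missing id is rejected at the transform step, which is the kind of data-quality gate the listing above describes.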
public sector. Required education: None. Preferred education: Bachelor's Degree. Required technical and professional expertise. Responsibilities: Design, build, and maintain scalable data pipelines and ETL processes. Collaborate with data scientists and engineers to integrate complex data systems. Ensure data quality, accuracy, and reliability through testing and validation procedures. Develop and …
leicester, midlands, United Kingdom Hybrid / WFH Options
Ingentive
On a daily basis your varied role will include, but will not be limited to: Design, build, and optimize high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks or Microsoft Fabric. Implement scalable solutions to ingest, store, and transform vast datasets, ensuring data availability and …
reporting development (advanced/expert levels only). Key Tools & Technologies Visualization Tools: Tibco Spotfire, Tableau, Power BI, or QlikView Data Management & Querying: SQL, ETL pipelines, Data Warehousing (e.g., ODW) Scripting/Programming: Iron Python, R, R Shiny, SAS Collaboration & Platforms: SharePoint, clinical trial data platforms Qualifications Entry Level: Bachelor …
to offices and client sites Employment: Permanent, Full-time What You'll Actually Be Doing: Designing, developing, and deploying robust Azure data solutions including ETL pipelines, data warehouses, and real-time analytics. No fluff, just solid engineering. Turning complex client requirements into clear, scalable Azure solutions alongside experienced Architects and …
meet current best practices and internal standards. Work closely with project managers and technical leads to integrate new enterprise data sources into ongoing projects. ETL Development: Develop robust, automated ETL (Extract, Transform, Load) pipelines using industry-standard tools and frameworks, prioritizing scalability, reliability, and fault tolerance. Essential Skills & Experience: Strong … ESRI, 3GIS, Bentley, Hexagon, Crescent Link, CadTel, etc.). Experience with business requirement analysis and the development of reporting and analytics structures. Familiarity with ETL solutions, including experience with SAFE FME, is highly desirable. Strong knowledge of data privacy regulations and practices. Exposure to analytics and reporting tools is considered …
in:
- Azure Data & AI services (e.g., Azure Machine Learning, Azure OpenAI, Cognitive Services, Synapse)
- Programming with Python for data and AI workloads
- Data pipelines, ETL/ELT processes, and analytics foundations
- App development techniques to integrate AI capabilities
- Working in secure, enterprise-ready cloud environments
- Consulting fundamentals and effective customer …
business/data analyst role, ideally in a consultancy or commercial setting.
- Strong analytical, problem-solving, and communication skills.
- Experience with operational data processes, ETL, data warehouse migration, schema mapping, and MI/BI reporting.
- Proficient in tools such as JIRA, Confluence, Asana, Miro, and Excel.
- Familiarity with Agile (SCRUM …
OAC). Should be aware of the process of creating a semantic data model in FDI. Proven experience in data modeling, data extraction, transformation, and loading (ETL) using Oracle Cloud tools. Hands-on experience in creating dashboards, KPIs, custom reports, and visual analytics for Finance, Supply Chain, and HCM modules. Strong understanding …
Solid understanding of investment data flows and financial instruments across asset classes. Hands-on proficiency with tools and languages such as SQL, Python, and ETL/data integration platforms. Strong communication, presentation, and stakeholder engagement capabilities. Advanced skills in MS Office Suite and the creation of professional deliverables.
shape something from the ground up — this is for you. What you'll do:
- Design and build a cloud-native data warehouse
- Develop scalable ETL/ELT pipelines and dimensional models (Kimball, Data Vault, etc.)
- Integrate multiple data sources (cloud and on-prem)
- Ensure high data quality, performance, and reliability
- Collaborate …
data sets, produce dashboards and drive actionable insights. SQL Development: Write and optimise complex Microsoft SQL Server queries for data extraction, transformation and loading (ETL). Data Governance: Implement master data management and governance policies to maintain data quality, compliance and lineage. Stakeholder Management: Communicate effectively with project managers and …
leicester, midlands, United Kingdom Hybrid / WFH Options
Ocho
of opportunities for ownership and innovation. Key Responsibilities:
- Design and deploy cloud-based data platforms using Snowflake, DBT, and related tools
- Develop performant, scalable ETL/ELT pipelines across varied data sources
- Build and maintain dimensional models using Kimball, Data Vault, or Data Mesh methodologies
- Collaborate with cross-functional teams …
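Several of these roles mention Kimball-style dimensional modelling, which splits data into fact tables (measures at a declared grain) and dimension tables (descriptive attributes with surrogate keys). A minimal sqlite3 sketch of a star schema; the table and column names are illustrative, not drawn from any of these roles:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes keyed by a surrogate key
conn.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, sku TEXT, category TEXT)")
# Fact table: numeric measures at one-row-per-sale grain, referencing the dimension by key
conn.execute("CREATE TABLE fact_sales (sale_date TEXT, product_key INTEGER, qty INTEGER, revenue REAL)")

conn.execute("INSERT INTO dim_product VALUES (1, 'SKU-1', 'Books'), (2, 'SKU-2', 'Games')")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [("2024-01-01", 1, 3, 30.0), ("2024-01-01", 2, 1, 50.0), ("2024-01-02", 1, 2, 20.0)])

# A typical star-schema query: aggregate facts grouped by a dimension attribute
rows = conn.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)
# → [('Books', 50.0), ('Games', 50.0)]
```

The same pattern scales to warehouse platforms like Snowflake; the fact/dimension split is what makes these grouped aggregations simple and fast.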
leicester, midlands, United Kingdom Hybrid / WFH Options
Bison Global Technology Search
needs and business strategies. Cloud Architecture: Design and implement scalable, cost-effective architectures across AWS, Azure, or GCP environments. Data Engineering: Develop and manage ETL/ELT pipelines, data integration workflows, and implement CDC and delta load design for effective data management. SAP Tools: Lead SAP BW to SAP Datasphere …
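The CDC and delta load design mentioned above typically means loading only rows changed since the last successful run, filtered on a high-watermark column. A minimal sketch, assuming the source exposes a last-modified timestamp (the row shape and function name are hypothetical):

```python
# Source rows with a last-modified timestamp (hypothetical shape);
# ISO-8601 strings compare correctly as plain strings
source = [
    {"id": 1, "value": "a", "modified": "2024-01-01T10:00"},
    {"id": 2, "value": "b", "modified": "2024-01-03T09:00"},
    {"id": 3, "value": "c", "modified": "2024-01-04T12:00"},
]

def delta_load(rows, watermark):
    """Return only rows changed since the last load, plus the new watermark."""
    changed = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in changed), default=watermark)
    return changed, new_watermark

changed, wm = delta_load(source, "2024-01-02T00:00")
print([r["id"] for r in changed], wm)
# → [2, 3] 2024-01-04T12:00
```

Persisting the returned watermark between runs is what makes the load incremental; a second run with no new source changes returns an empty delta.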
leicester, midlands, United Kingdom Hybrid / WFH Options
Realtime Recruitment
Knowledge (Snowflake, PostgreSQL, MSSQL, etc.). Scripting languages (Python, PowerShell). Data quality management and resolution of related issues. Cloud Data Warehousing solutions and ELT/ETL solutions (e.g., Snowflake, DBT). Experience in working within an Agile environment. Experience with Automation, including Unit and Integration Testing. Knowledge of Cloud Concepts. For …
leicester, midlands, United Kingdom Hybrid / WFH Options
Harnham
and dynamic team. Responsibilities:
- Collaborate with various squads within the data team on project-based work.
- Develop and optimize data models, warehouse solutions, and ETL processes.
- Work with Scala, Spark, and Java to handle large-scale data processing.
- Contribute to manual Databricks-like data processing solutions.
Requirements: Minimum of …
Exposure to multi-platform integration (MS tools preferred). On-premises SQL environments, legacy SSIS, SSRS, and other SQL-related technologies employed in complex ETL or ELT patterns. Synapse Link for Dataverse. Dataverse, Data Flows, Cloud Flows, DAX, and Power Platform implementations. Synchronisation methods for Synapse and Fabric from D365 …
pipelines, ensuring data quality, and supporting a reliable, cloud-native data platform. What You'll Do:
- Design, build, and maintain scalable data pipelines and ETL/ELT workflows
- Work with Snowflake, SQL, and Python to transform and manage large datasets
- Leverage AWS services (e.g., S3, Lambda, Glue) to build modern …
leicester, midlands, United Kingdom Hybrid / WFH Options
Bright Purple
and working with Azure (Azure DevOps, Azure ML, Azure SQL, ADF, etc.). Able to work with databases like SQL Server, NoSQL, etc. Experience with ETL processes, data pipelines, and DevOps. This role is hybrid, with a requirement to be in one of their UK offices in either Scotland or England on occasion …
delivery approach within a DevOps environment Extensive experience in creating technical specifications and code for data migration Proven expertise in extracting, transforming, and loading (ETL) financial data during migration from on-premise systems to the cloud Demonstrated ability to track, report, and improve data migration quality metrics It would be …
an agile framework preferable, including defining functional and non-functional requirements and sprint tasks. Understanding of data engineering, some experience with building production-grade ETL pipelines, as well as backend web development, backend-for-frontend, GraphQL, and FastAPI. Strong communication skills, able to communicate with both technical and commercial people.
Key aspects of the role include:
- Data Pipeline Development: Architect, develop, and optimise data extraction, transformation, and loading (ETL) processes for high-volume healthcare data.
- Integration of NHS Systems Data: Work directly with NHS data sources and ensure seamless integration of systems like UDAL, NCDR, SUS+, HES, and ODS into … and ODS, and to address the unique challenges these datasets present.
- Data Engineering Proficiency: Advanced proficiency in building and maintaining robust data pipelines using ETL tools and frameworks. Familiarity with data warehousing concepts and experience in developing scalable, high-performance data architectures.
- Technical Expertise: Strong command of SQL and scripting …