Newbury, Berkshire, United Kingdom Hybrid / WFH Options
Viavi
frameworks (PyTorch, TensorFlow) Solid understanding of CI/CD, Git, testing, and agile methodologies Hands-on experience with Linux, Docker, and containerized deployments Familiarity with data engineering concepts (SQL, ETL, data lakes) Experience with orchestration frameworks like LangChain, LangGraph, or Semantic Kernel Cloud deployment experience (AWS, GCP, Azure) Work on high-impact AI projects that shape the future of telecoms …
London (City of London), South East England, United Kingdom
MathCo
Job Responsibilities Leading Partner with CPG stakeholders to align data engineering solutions with supply chain, sales, and marketing priorities. Define solution architectures and oversee end-to-end data pipelines, ETL/ELT processes, and data platform implementations. Guide teams in applying best practices across Python, SQL, Databricks, and Delta Lake to deliver high-performance outcomes. Manage client programmes, ensuring projects …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
in event-driven and eventual consistency systems using Kafka, .Net, Java, REST APIs, AWS, Terraform and DevOps Nice to have: experience in data pipelines and modelling using SQL, DBT, ETL, Data Warehousing, Redshift and Python, and an ecommerce and mobile applications background Additional Information We're a community here that cares as much about your life outside work as how you feel …
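The Stepstone listing above centres on event-driven, eventually consistent services built around Kafka. As a rough illustration only (the role itself is .NET/Java based, and the broker address, topic name and event shape below are invented), a minimal producer/consumer pair using the Python confluent-kafka client could look like this:

import json
from confluent_kafka import Consumer, Producer

# Assumptions: a local broker on localhost:9092 and a hypothetical "orders" topic.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def publish_order(order: dict) -> None:
    # Keyed by order_id so all events for one order land on the same partition,
    # preserving per-order ordering in an eventually consistent system.
    producer.produce("orders", key=str(order["order_id"]), value=json.dumps(order).encode("utf-8"))
    producer.flush()

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "order-projection",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

def consume_once() -> None:
    msg = consumer.poll(timeout=1.0)
    if msg is not None and msg.error() is None:
        event = json.loads(msg.value())
        print("applying event", event)  # e.g. update a downstream read model

if __name__ == "__main__":
    publish_order({"order_id": 1, "status": "created"})
    consume_once()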
Stroud, England, United Kingdom Hybrid / WFH Options
Ecotricity
as a Data Analyst Experience managing and manipulating data Strong working knowledge of SQL for data analysis (MS-SQL/Databricks - T-SQL, View and table design, Experience of ETL) Expert with Power BI Advanced MS Excel (Power Query, Power Pivots) Stakeholder management and facilitation of decisions of all sizes Strong presentation skills for engaging senior stakeholders Ability to handle high …
software engineering experience. Experience in full-stack development, with working knowledge of frontend (e.g., JavaScript) and backend (e.g., Python, Java) technologies. Experience with databases, data analytics (SQL/NoSQL, ETL/ELT), and APIs (REST, GraphQL). Experience working on cloud-native architectures in a public cloud environment, ideally AWS. Strong oral and written communication skills and the ability to …
with senior stakeholders. Architect & Build Scalable Data Solutions Collaborate closely with senior product stakeholders to understand data needs and architect end-to-end ingestion pipelines Design and build robust ETL/ELT processes and data architectures using modern tools and techniques Lead database design, data modelling, and integration strategies to support analytics at scale Drive Data Integration & Management Design and
… of software engineering best practices - code reviews, testing frameworks, CI/CD, and code maintainability Experience deploying applications into production environments, including packaging, monitoring, and release management Ability to extract insights from complex and disparate data sets and communicate clearly with stakeholders Hands-on experience with cloud platforms such as AWS, Azure, or GCP Familiarity with traditional ETL tools (e.g. …
Uckfield, East Sussex, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
and non-technical audiences What You'll Bring Strong experience in Power BI, SQL, and DAX Hands-on experience with Azure Cloud Services and Azure Data Factory Knowledge of ETL processes and working with multiple data sources (Azure, SQL, Excel, etc.) Excellent attention to detail and a proactive approach to solving problems Experience working in an agile, fast-paced environment …
performance-optimized schema design. Advanced knowledge of Customer Data Platforms (Segment.io), including event stream management, identity resolution, and building a "Golden Profile." Hands-on experience designing and implementing Reverse ETL pipelines (e.g., with Census or Hightouch) to activate customer data in marketing systems like Iterable or CRMs. Strong collaboration skills, able to partner with architects and stakeholders to translate business …
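Reverse ETL tools such as Census or Hightouch are mostly configured rather than hand-coded, but the pattern the listing above describes, reading a modelled "Golden Profile" out of the warehouse and pushing it into a marketing system, can be sketched in Python. The connection string, table, endpoint and field names here are all hypothetical:

import requests
import sqlalchemy

# Assumed warehouse connection string and table; replace with real credentials and objects.
engine = sqlalchemy.create_engine("postgresql://user:pass@warehouse:5432/analytics")

MARKETING_API = "https://api.example-marketing-tool.com/v1/users"  # hypothetical endpoint

def sync_profiles() -> None:
    with engine.connect() as conn:
        rows = conn.execute(sqlalchemy.text(
            "SELECT user_id, email, lifetime_value FROM golden_profiles "
            "WHERE updated_at > now() - interval '1 day'"
        ))
        for row in rows.mappings():
            # Push each recently updated profile to the downstream marketing system.
            payload = {"id": row["user_id"], "email": row["email"], "ltv": float(row["lifetime_value"])}
            resp = requests.post(MARKETING_API, json=payload, timeout=10)
            resp.raise_for_status()

if __name__ == "__main__":
    sync_profiles()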
Manchester, North West, United Kingdom Hybrid / WFH Options
ABS Group Events Limited T/A ABS Talent
with SQL (across common RDBMS platforms). Solid knowledge of Python for data engineering tasks; interest in AI/ML. Hands-on experience with Alteryx or similar workflow/ETL tools. Experience with servers and managing data in a data lake environment. Good communication skills and ability to work with both technical and business users. Working Pattern Based at Whitefield …
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Corecom Consulting
looking for: Proven experience in data engineering at a senior level Strong hands-on knowledge of Databricks Experience with Azure or AWS cloud platforms Expertise in data warehousing and ETL processes Ability to work both independently and as part of a collaborative team Why join us? £65,000 salary Flexible hybrid working (2 days a week in our Leeds office …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Azure-based architecture, with a focus on performance, scalability, and reliability. Responsibilities Design and implement robust data migration pipelines using Azure Data Factory, Synapse Analytics, and Databricks Develop scalable ETL processes using PySpark and Python Collaborate with stakeholders to understand legacy data structures and ensure accurate mapping and transformation Ensure data quality, governance, and performance throughout the migration lifecycle Document …
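For the PySpark part of a migration pipeline like the one described above, a minimal Databricks-style job might read a legacy extract, apply the column mapping, and write the result as a Delta table. The paths, column names and date format below are assumptions for illustration only:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("legacy-migration").getOrCreate()

# Assumed source location for the legacy extract.
legacy = (
    spark.read.option("header", "true")
    .csv("/mnt/raw/legacy_customers.csv")
)

mapped = (
    legacy
    .withColumnRenamed("CUST_NO", "customer_id")              # hypothetical legacy-to-target mapping
    .withColumn("created_date", F.to_date("CREATED", "dd/MM/yyyy"))
    .filter(F.col("customer_id").isNotNull())                 # basic data-quality gate
    .dropDuplicates(["customer_id"])
)

# Assumed curated landing zone, written as Delta for downstream analytics.
mapped.write.format("delta").mode("overwrite").save("/mnt/curated/customers")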
and implementation of low-code applications across the Microsoft Power Platform - Improving efficiency through automation - Driving AI strategy - Overseeing system enhancements and API integrations - Overseeing data engineering processes - ETL/ELT pipelines with Azure Data Factory - Driving development of the Snowflake Data Warehouse - Team management and leading Data Engineers/Developers - Digital roadmap development - IT projects To be considered suitable …
various data modelling techniques and tools. Deep understanding of different data storage solutions (e.g., relational, NoSQL, data lakes) and their appropriate use cases. Strong knowledge of data integration patterns, ETL/ELT processes, and data pipeline orchestration. Experience with AWS data platforms and their respective data services. Solid understanding of data governance principles, including data quality, metadata management, and access …
hoc data requests and provide technical expertise to stakeholders. Profile A successful Data Engineer should have: Proven experience with data engineering tools and technologies. Strong knowledge of database systems, ETL processes, and data modelling. Proficiency in programming languages such as Python, SQL, or similar. Experience with cloud-based data platforms and services. Ability to work collaboratively in a team environment …
tools and dashboards to communicate key partner data Handle and scope new data requirements from external partners Investigate and resolve data issues (e.g., product availability errors) Build and manage ETL pipelines and APIs to improve data delivery and accuracy Contribute to the overall data strategy alongside BI and analytics colleagues Tech You’ll Use SQL Python Databricks Experience with web …
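As a loose sketch of the "ETL pipelines and APIs" responsibility in the listing above, the snippet below pulls partner availability data from a hypothetical REST endpoint, applies a basic validity check, and lands it in a local table (SQLite stands in for the real warehouse, e.g. Databricks, in this sketch):

import pandas as pd
import requests
import sqlite3

PARTNER_API = "https://partner.example.com/api/availability"  # hypothetical endpoint

def run_pipeline() -> None:
    resp = requests.get(PARTNER_API, timeout=30)
    resp.raise_for_status()
    df = pd.DataFrame(resp.json())  # assumes the API returns a JSON list of records

    # Basic validation before loading: drop rows with missing product IDs.
    bad_rows = df[df["product_id"].isna()]
    if not bad_rows.empty:
        print(f"Dropping {len(bad_rows)} rows with missing product_id")
        df = df.dropna(subset=["product_id"])

    with sqlite3.connect("partner_data.db") as conn:
        df.to_sql("product_availability", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    run_pipeline()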
Design end-to-end technical solutions integrating infrastructure and data components Architect virtualized infrastructure solutions and data integration strategies Create architectural documentation, diagrams, and technical specifications Design data pipelines, ETL processes, and BI solutions Provide technical guidance to development and infrastructure teams Engage with stakeholders to translate business requirements into technical solutions Essential Skills & Experience SQL: Advanced database design, optimization …
Telford, Shropshire, West Midlands, United Kingdom
Sanderson Recruitment
please note this is not a typical Data Engineering role. SAS 9.4 (DI), SAS Viya 3.x (SAS Studio, VA, VI) Platform LSF, Jira, Platform Support GIT Strong expertise in ETL tools: Pentaho, Talend Experience with data virtualization using Denodo Proficiency in SAS for data analytics and reporting Oracle (good to have) Solid understanding of Agile and Scrum frameworks Hands-on …
play a key role in developing and delivering advanced AI solutions for a Government client. Responsibilities include: Designing, building, and maintaining data processing pipelines using Apache Spark Implementing ETL/ELT workflows for large-scale data sets Developing and optimising Python-based data ingestion tools Collaborating on the design and deployment of machine learning models Ensuring data quality, integrity …
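The data quality and integrity point in the listing above could, under one common approach, be handled by a small PySpark check that runs after an ETL step; the input path, columns and failure threshold below are made up for the example:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("/data/curated/events")  # assumed output of an upstream ETL step

total = df.count()
null_ids = df.filter(F.col("event_id").isNull()).count()
duplicates = total - df.dropDuplicates(["event_id"]).count()

# Fail the run if more than 1% of rows are unusable (illustrative threshold).
if total == 0 or (null_ids + duplicates) / total > 0.01:
    raise ValueError(
        f"Data quality check failed: {null_ids} null ids, {duplicates} duplicates out of {total} rows"
    )
print("Data quality checks passed")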
London, South East, England, United Kingdom Hybrid / WFH Options
Next Best Move
in Microsoft Office skills, including Outlook, Word, Excel and PowerPoint, and other applications such as Microsoft 365, SharePoint, Teams and OneDrive. Power Apps and Power Automate experience. Experience with ETL tools such as SSIS. Business Central experience. Good knowledge and development experience of MS SQL. Experience in creating UAT scripts. Technical writing experience with the ability to present technical information …