Milton Keynes, Buckinghamshire, United Kingdom Hybrid / WFH Options
The Boeing Company
of MOD and Government-based programs. Provide technical expertise in a variety of technologies to a multi-site program team. The successful candidate will be involved in the design, development, testing, implementation and support of a sophisticated integration solution and the management of associated system components. You will possess technical experience with SAP Data Services, ideally with experience in designing … and developing integration applications using the product. The position requires cross-team coordination and leadership with separate, but tightly coupled, development and architecture teams, infrastructure support teams, key suppliers, and other Boeing programs … applying the same technologies. Preferred Skills/Experience: The ideal candidate has the following skills: ETL Development: Design and implement Extract, Transform, Load (ETL) processes using SAP Data Services. Data Quality Management: Develop and implement data quality rules and validation processes. Job Scheduling: Create and manage job schedules for data integration tasks. Performance …
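The data-quality responsibilities listed above can be illustrated with a minimal sketch; the rule names and records below are hypothetical and do not reflect SAP Data Services' actual rule engine:

```python
# Minimal sketch of ETL-style data quality rules: each rule is a
# predicate applied to a record; failing records are collected for review.
# Rule names and record fields are invented for illustration.

RULES = {
    "part_number_present": lambda r: bool(r.get("part_number")),
    "quantity_non_negative": lambda r: r.get("quantity", 0) >= 0,
}

def validate(records):
    """Return (passed, failed) lists; failed entries carry the broken rule names."""
    passed, failed = [], []
    for rec in records:
        broken = [name for name, rule in RULES.items() if not rule(rec)]
        (failed if broken else passed).append((rec, broken))
    return passed, failed

passed, failed = validate([
    {"part_number": "A-100", "quantity": 5},
    {"part_number": "", "quantity": -1},
])
```

In practice such rules would run as a validation step inside the ETL job, routing failures to a quarantine table rather than the target system.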
pipelines and architectures, ideally integrating data from web analytics, content management systems (CMS), subscription platforms, ad tech, and social media. Ability to automate and optimise data workflows, using modern ETL/ELT tools (e.g., Airflow, dbt, Apache Spark) to ensure timely and reliable delivery of data. Experience building robust data models and reporting layers to support performance dashboards, user … meet functional requirements. Process Automation and Optimisation: Identify, design, and implement improvements to automate manual processes, enhance data delivery performance, and re-architect infrastructure for improved scalability and resilience. ETL Development and Infrastructure Building: Build and manage the infrastructure necessary for optimal ETL or ELT of data using Python, SQL, and Google Cloud Platform (GCP) big data … issues and uncover opportunities for operational or strategic improvements. Unstructured Data Handling: Capability to work with unstructured and semi-structured datasets, transforming raw information into actionable insights. Data Workflow Development: Skilled in developing and maintaining data transformation processes, managing data structures, metadata, workload dependencies, and orchestration frameworks. Large-scale Data Processing: A demonstrated history of manipulating, processing, and extracting …
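The extract → transform → load separation these workflows rely on can be sketched in plain Python — a toy stand-in for what Airflow or dbt would orchestrate; the web-analytics CSV schema is invented:

```python
# Toy ETL flow mirroring the extract -> transform -> load stages that
# orchestrators like Airflow schedule as separate tasks.
import csv
import io
import sqlite3

RAW = "page,views\nhome,120\npricing,45\n"   # stand-in for a web-analytics export

def extract(raw):
    # Read the raw export into dict rows (all values are strings at this point).
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    # Coerce types and shape rows for the target table.
    return [(r["page"], int(r["views"])) for r in rows]

def load(rows, conn):
    # Land the transformed rows in the warehouse (SQLite stands in here).
    conn.execute("CREATE TABLE IF NOT EXISTS page_views (page TEXT, views INTEGER)")
    conn.executemany("INSERT INTO page_views VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(views) FROM page_views").fetchone()[0]
```

In an orchestrator each function would become its own task with retries and dependency edges between them; the data flow itself is unchanged.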
Datasphere, and integration of water utility data. Key Responsibilities Develop and maintain data models using SAP BW/4HANA, CDS Views, ADSO, and Composite Providers. Build and optimise ETL data flows from SAP S/4HANA, legacy systems, and real-time data sources (e.g. SCADA, telemetry). Deliver robust BI reporting layers to support business-critical dashboards (Power … as a Data Engineer or BI Developer on SAP S/4HANA programmes. Deep understanding of SAP BW/4HANA, HANA modelling, and CDS views. Hands-on ETL development experience using SLT, SDI, or third-party tools. Proficiency in SQL, ABAP, or Python for data transformation & automation. Background in utility or regulated industries (ideally water) highly …
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
Reed Technology
that's driving transformation through data. This is a unique opportunity to step into a role originally designed for a niche integration specialist - now reimagined for someone with strong ETL, API, and data pipeline experience who's ready to innovate. Senior Data Integration Developer Location: Hybrid/North East England Salary: £47,000 - £54,000 The Opportunity You'll … lead the design, development, and optimisation of data integration flows across a complex digital ecosystem. Working with modern standards like REST and XML, you'll ensure seamless data exchange between internal systems and third-party platforms. You'll also collaborate closely with data platform teams to align on data standards and architecture. This role is ideal for someone who … Join Us? You'll be part of a digitally ambitious organisation that invests in its people and technology. Expect: Generous holiday entitlement Flexible working options Access to training and development Staff perks including salary-sacrifice schemes and wellbeing support …
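Integration work of the kind described — exchanging XML payloads with third-party platforms over REST — reduces to parsing and mapping structured documents; a minimal stdlib sketch, with an invented payload shape:

```python
# Sketch of consuming an XML payload such as a third-party platform
# might return from a REST endpoint; the <orders> schema is hypothetical.
import xml.etree.ElementTree as ET

PAYLOAD = """<orders>
  <order id="1001"><status>shipped</status></order>
  <order id="1002"><status>pending</status></order>
</orders>"""

def parse_orders(xml_text):
    # Map each <order> element to its id -> status for downstream systems.
    root = ET.fromstring(xml_text)
    return {o.get("id"): o.findtext("status") for o in root.findall("order")}

orders = parse_orders(PAYLOAD)
```

A real integration flow would fetch the payload over HTTP, validate it against a schema, and hand the mapped records to the data platform; the parsing step is the same.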
ideally within internal consultancy or transformation environments. Strong experience with PostgreSQL, Power BI, and low-code platforms (Budibase or similar). Solid programming skills, preferably in Python, especially for ETL development and support. Proficiency with version control systems (e.g., GitHub), with an understanding of best practices for collaboration, review, and deployment. Familiarity with REST APIs, including how to … Familiarity with cloud data infrastructure (e.g. Azure, AWS). Background in business process design or improvement. And what do we have to offer you? Opportunities for continuous professional development thanks to a company focused on career-long learning and training. Access to innovative projects, offering a stimulating and rewarding experience. A corporate culture based on collaboration, mutual help …
well-established knowledge-sharing community. What you'll do Design and implement data solutions using Snowflake across cloud platforms (Azure and AWS) Build and maintain scalable data pipelines and ETL processes Optimise data models, storage, and performance for analytics and reporting Ensure data integrity, security, and best practice compliance Serve as a subject matter expert in Snowflake engineering efforts … you'll need Proven experience designing and delivering enterprise-scale data warehouse solutions using Snowflake In-depth understanding of Snowflake architecture, performance optimisation, and best practices Strong experience in ETL development, data modelling, and data integration Proficient in SQL, Python, and/or Java for data processing Hands-on experience with Azure and AWS cloud environments Familiarity with …
our teams to make rapid, informed decisions that enhance overall performance. Responsibilities include, but are not limited to: Maintain existing and develop new features of the data platform. Deliver development projects efficiently. Apply established software engineering practices and principles. Take ownership of BAU processes and develop domain expertise. Ensure compliance with relevant standards and policies. Utilize CI/CD … an edge Experience using Python on Google Cloud Platform for Big Data projects, including BigQuery, DataFlow (Apache Beam), Cloud Run, Cloud Functions, Cloud Workflows, and Cloud Composer. Strong SQL development skills. Proven expertise in data modeling, ETL development, and data warehousing. Knowledge of data management fundamentals and storage principles. Familiarity with statistical models or data mining algorithms …
Baginton, Warwickshire, United Kingdom Hybrid / WFH Options
Arden University
and ecosystems such as Microsoft Synapse/Fabric, Hadoop, Spark, Kafka, and others. Skills in data modelling and data warehousing solutions. Experience of dimensional modelling (Kimball). Proven experience in designing and developing ETL/ELT processes. Knowledge of data pipeline tools such as Data Factory, Airflow, and NiFi. Knowledge of version control systems such as Git. Ability to analyse problems and design better solutions. A familiarity …
ensure data is reliable, accessible, and valuable across all areas of the business. What you’ll be doing: Designing, building, and owning high-performance data pipelines and APIs Developing ETL processes to support analytics, reporting, and business operations Assembling complex datasets from a wide variety of sources using tools like SQL, Python, dbt, and Azure Supporting and improving data … we’re looking for: Strong experience with Azure cloud technologies, particularly around data services Proficient in SQL and experienced with Python for data transformation Hands-on experience with dbt, ETL development, and data warehousing best practices Comfortable with deploying infrastructure as code and building CI/CD pipelines (e.g., using GitHub, Azure DevOps) Ability to manage large, unstructured …
Stratford-upon-avon, Warwickshire, United Kingdom Hybrid / WFH Options
Big Red Recruitment
and grow, a data estate designing and creating modern data platforms to fundamentally transform the data environment. The projects are varied, from setting up new data pipelines, integration/ETL across all the different disparate business systems and all the relevant data flows to BI strategy around AI. You'll be working with all the different business divisions and … to a new Azure data solution to release the power and potential of the data across the organisation. We need a few basics from you: Data pipeline creation Strong ETL development experience with Azure Data Factory Data warehouse and data storage concepts - star schema/snowflake/Kimball Azure Databricks and AI Power BI reporting skills would be advantageous …
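The star-schema concepts named above (Kimball-style fact and dimension tables) can be shown with a tiny in-memory example; the table names and values are illustrative only, not from any real estate:

```python
# Minimal Kimball-style star schema: one fact table keyed to a date
# dimension, queried with the usual fact-to-dimension join.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    amount   REAL
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales VALUES (20240101, 20.0), (20240101, 5.5);
""")
row = conn.execute("""
    SELECT d.iso_date, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d USING (date_key)
    GROUP BY d.iso_date
""").fetchone()
```

The same shape scales up in a warehouse: narrow integer surrogate keys on the fact table, descriptive attributes pushed into the dimensions, and reporting queries that aggregate facts grouped by dimension attributes.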
are looking for someone who can demonstrate experience in the following areas: Commercial experience with implementing Fabric Strong Azure experience - ideally using ADF, Databricks, ADLS etc. Data Engineering background - ETL development, data storage platforms such as Data Warehouse, Lake, or Lakehouse architectures You will ideally have come from a consultancy background, and therefore understand how to balance multiple …
tasked with building and enhancing a modern Azure data platform, with expertise in Databricks. About the Role: In this role, you will play a key part in developing ETL pipelines, optimizing data integration processes, and supporting data-driven decision-making across the business. You'll collaborate with stakeholders to understand data requirements, contribute to architectural decisions, and support … the migration of reporting services to Azure. Key Responsibilities: Design, build, and maintain ETL pipelines using Azure Data Factory, Azure Data Lake, Synapse, and Databricks. Design and build a greenfield Azure data platform to support business-critical data needs. Collaborate with stakeholders across the organization to gather and define data requirements. Assist in the migration of reporting services … validate design decisions to ensure optimal data architecture. Work closely with marketing and insights teams to provide actionable data outputs. Opportunity to transition into a leadership role, supporting the development of junior team members. Key Skills & Experience: 5+ years of experience in Azure (ADF, Data Lake, Synapse). Strong SQL skills for data integration and reporting. Experience with Power …
Essential Core Technical Experience 5 to 10+ years of experience in SQL Server data warehouse or data provisioning architectures. Advanced SQL query writing and stored procedure experience. Experience developing ETL solutions in SQL Server, including SSIS & T-SQL. Experience with Microsoft BI technologies (SQL Server Management Studio, SSIS, SSAS, SSRS). Knowledge of data/system integration and dependency … identification. Proficiency in MS Excel & VBA. Knowledge of Insurance or Banking/Finance domains. Exposure to DW development practices (code & configuration management, build processes). Understanding of data warehouse design patterns (e.g., Data Modelling). Ability to elicit business requirements and collaborate on end-to-end solutions. Supporting Core Technical Experience Experience with Jira, Confluence, MS Teams, or similar …
Experience maintaining and/or contributing to bug bounty and responsible disclosure programs Understanding of language models and transformers Rich understanding of vector stores and search algorithms Large-scale ETL development Direct engineering experience of high-performance, large-scale ML systems Hands-on MLOps experience, with an appreciation of the end-to-end CI/CD process Have … day. We want the best talent around the world to be energized to join us, motivated to stay and empowered to thrive. Job Family Group: Technology Job Family: Applications Development Time Type: Full time Most Relevant Skills Please see the requirements listed above. Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. Citi …
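The "vector stores and search algorithms" requirement above reduces, at its core, to similarity search over embeddings; a toy cosine-similarity sketch with invented vectors (not any production vector store's API):

```python
# Toy nearest-neighbour search over an in-memory "vector store" using
# cosine similarity; document names and vectors are invented.
import math

STORE = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.0, 1.0, 0.0],
    "doc_c": [0.7, 0.7, 0.0],
}

def cosine(u, v):
    # Cosine similarity: dot product normalised by both vector magnitudes.
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def nearest(query, store):
    # Brute-force scan; real stores replace this with an ANN index (e.g. HNSW).
    return max(store, key=lambda k: cosine(query, store[k]))

best = nearest([1.0, 0.1, 0.0], STORE)
```

Production systems swap the brute-force scan for an approximate index and batch the dot products, but the ranking criterion is the same.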
but not necessary Experience with Google Platforms: Google Analytics, Google Ads, Google Search Experience with Snowflake or any other relational database Strong SQL skills + Experience with design and development of ETL processes Experience with user-facing analytics platforms: Tableau, Superset, Microstrategy Advanced Excel skills required, including proficiency with pivot tables and VLOOKUP; equivalent Google Sheets experience preferred …
and to understand the business and strategic impact of your great engineering work - to whatever extent suits you. WHAT YOU'LL BE DOING Build innovative data solutions Support the development and rollout of an industry-first global analytics programme Develop and deploy automated code pipelines, from data acquisition through cleaning and preparing data for modelling, through to visualisation Help … HAVES OR EXCITED TO LEARN: Some experience designing, building and maintaining SQL databases (and/or NoSQL) Some experience with designing efficient physical data models/schemas and developing ETL/ELT scripts Some experience developing data solutions in cloud environments such as Azure, AWS or GCP - Azure Databricks experience a bonus 30-minute video interview with the People …