technical concepts into clear, actionable insights. ● Collaborate cross-functionally to align data strategy with business objectives. What we are looking for: ● 2-3+ years' experience in Data Engineering, ETL Development, or Database Administration. ● Prior experience working with business intelligence, analytics, or machine learning teams. ● Experience in cloud-native data solutions and real-time data processing. ● Proficiency in … Databricks, Python, SQL (for ETL & data transformation). ● Knowledge of GDPR, data security best practices, and access control policies. ● Strong problem-solving and analytical skills to optimise data processes. ● Excellent collaboration and communication skills to work with cross-functional teams. ● Ability to translate business requirements into technical solutions. ● Strong attention to detail and commitment to data quality.
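As a rough illustration of the Databricks/Python/SQL skill set this listing asks for (not code from the employer), here is a minimal PySpark transformation sketch; the paths, table, and column names are all invented for the example.

```python
# Hypothetical sketch of a Databricks-style batch transformation.
# All paths, tables, and columns below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Read raw events (placeholder path; on Databricks this might be a mounted lake path).
raw = spark.read.json("/mnt/raw/events/")

# Basic transformation: drop malformed rows and derive an event date.
clean = (
    raw.dropna(subset=["customer_id", "event_ts"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Aggregate per customer per day for downstream analytics.
daily = clean.groupBy("customer_id", "event_date").agg(
    F.count("*").alias("event_count")
)

# Persist as a Delta table (Databricks' default table format).
daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_events")
```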
well-established knowledge-sharing community. What you'll do Design and implement data solutions using Snowflake across cloud platforms (Azure and AWS) Build and maintain scalable data pipelines and ETL processes Optimise data models, storage, and performance for analytics and reporting Ensure data integrity, security, and best practice compliance Serve as a subject matter expert in Snowflake engineering efforts … you'll need Proven experience designing and delivering enterprise-scale data warehouse solutions using Snowflake In-depth understanding of Snowflake architecture, performance optimisation, and best practices Strong experience in ETL development, data modelling, and data integration Proficient in SQL, Python, and/or Java for data processing Hands-on experience with Azure and AWS cloud environments Familiar with …
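For context on what "scalable data pipelines and ETL processes" in Snowflake often look like in practice, a minimal batch-load sketch using the official Python connector is shown below; the credentials, stage, and table names are placeholders, not details from the listing.

```python
# Hypothetical Snowflake batch load using the official connector
# (pip install snowflake-connector-python). All object names are invented.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load new files from a pre-configured external stage (e.g. S3 or Azure Blob).
    cur.execute(
        "COPY INTO staging.orders_raw FROM @orders_stage "
        "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
    )
    # Simple in-warehouse transformation: deduplicate into the modelled table.
    cur.execute("""
        INSERT INTO marts.orders
        SELECT DISTINCT order_id, customer_id, order_ts, amount
        FROM staging.orders_raw
        WHERE order_id IS NOT NULL
    """)
finally:
    conn.close()
```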
ETL Developer Inside IR35 - Hybrid initially but 5 days onsite required We are seeking an experienced and detail-oriented ETL Application Developer to join our data engineering team. This is a high-impact role dedicated to enhancing our organisation’s operational resilience by ensuring comprehensive and accurate capacity monitoring across critical systems. Your work will help mitigate potential … functional teams to deliver infrastructure-based data acquisition solutions that support our Capacity Management function and ensure compliance with key regulatory requirements. Key Responsibilities: Design, develop, and maintain ETL processes to ingest, transform, and store capacity-related metrics. Integrate data from diverse infrastructure sources using APIs and direct connectors. Ensure data quality and integrity through validation, normalisation, and … data trending, analysis, and reporting. Optimise database storage and performance for large datasets. Support control and automation frameworks, including Control-M and SQL DTSS. Required Skills & Experience: Experience in ETL development, ideally in a complex financial or enterprise environment. Proficiency with ETL tools and data extraction using RESTful APIs. Solid experience with Microsoft SQL Server, including …
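The responsibilities above describe a common pattern: pull capacity metrics from a REST endpoint, validate them, and load them into SQL Server. A minimal sketch of that pattern follows; the endpoint, metric fields, and target table are hypothetical, and requests/pyodbc are assumed choices rather than anything the listing specifies.

```python
# Illustrative only: REST API -> validation -> SQL Server load.
# Endpoint, fields, and table are invented for the example.
import requests
import pyodbc

API_URL = "https://monitoring.example.internal/api/v1/capacity"  # placeholder

resp = requests.get(API_URL, timeout=30)
resp.raise_for_status()
metrics = resp.json()  # assume a list of {"host": ..., "cpu_pct": ..., "sampled_at": ...}

# Basic validation/normalisation before load: require a host name
# and a CPU percentage in a sane range.
rows = [
    (m["host"], float(m["cpu_pct"]), m["sampled_at"])
    for m in metrics
    if m.get("host") and 0.0 <= float(m.get("cpu_pct", -1)) <= 100.0
]

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dbhost;DATABASE=Capacity;Trusted_Connection=yes"
)
with conn:  # commits on success, rolls back on error
    cur = conn.cursor()
    cur.fast_executemany = True  # bulk-friendly inserts for large metric batches
    cur.executemany(
        "INSERT INTO dbo.CapacityMetrics (HostName, CpuPct, SampledAt) VALUES (?, ?, ?)",
        rows,
    )
```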
Our client is looking for an Azure Data Developer to join their team and help drive innovation in the automotive industry. If you have hands-on experience in Data Development and a passion for building data solutions, we want to hear from you! You’ll be part of the Enterprise Data team, responsible for creating and maintaining data integration … Your Mission: Design and build data integration solutions following technical standards. Develop application interfaces for data and analytics products. Work on big data projects, leveraging real-time technologies. Support ETL development and data transformations based on business requirements. Ensure data work aligns with governance and security policies. Collaborate with the wider data and analytics team. Maintain and troubleshoot production data pipelines. Document data development using technical documents and DevOps tools. Skills & Experience: Solid experience with the Azure data integration stack. Proficiency with SQL, ETL, Python, and APIs. Strong knowledge of relational databases (MySQL, SQL Server, PostgreSQL). Experience with NoSQL databases (e.g., Cosmos DB). Familiar with data integration methods (real-time, periodic, batch). Strong understanding …
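As a generic illustration of the "periodic/batch" integration method this listing mentions, the sketch below shows a watermark-driven incremental load; sqlite3 is used only so the example runs anywhere, and every table and column name is invented.

```python
# Hypothetical watermark-based incremental batch load.
# sqlite3 stands in for the real source/target databases.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE source_orders (id INTEGER, updated_at TEXT);
    CREATE TABLE target_orders (id INTEGER, updated_at TEXT);
    CREATE TABLE etl_watermark (last_loaded TEXT);
    INSERT INTO etl_watermark VALUES ('1970-01-01T00:00:00+00:00');
    INSERT INTO source_orders VALUES (1, '2024-05-01T10:00:00+00:00'),
                                     (2, '2024-05-02T11:30:00+00:00');
""")

# 1. Read the watermark left by the previous run.
(last_loaded,) = cur.execute("SELECT last_loaded FROM etl_watermark").fetchone()

# 2. Extract only rows changed since then (ISO timestamps compare lexically).
new_rows = cur.execute(
    "SELECT id, updated_at FROM source_orders WHERE updated_at > ?",
    (last_loaded,),
).fetchall()

# 3. Load, then advance the watermark so the next run stays incremental.
cur.executemany("INSERT INTO target_orders VALUES (?, ?)", new_rows)
cur.execute(
    "UPDATE etl_watermark SET last_loaded = ?",
    (max((r[1] for r in new_rows), default=last_loaded),),
)
conn.commit()
print(f"Loaded {len(new_rows)} rows")
```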
looking for someone who can demonstrate experience in the following areas: Very strong Azure experience - Ideally using ADF, Databricks, ADLS etc - Fabric experience is very desirable Data Engineering background - ETL development, data storage platforms such as Data Warehouse, Lake, or Lakehouse architectures You will ideally have come from a consultancy background, and therefore understand how to balance multiple …
are looking for someone who can demonstrate experience in the following areas: Commercial experience with implementing Fabric Strong Azure experience - Ideally using ADF, Databricks, ADLS etc Data Engineering background - ETL development, data storage platforms such as Data Warehouse, Lake, or Lakehouse architectures You will ideally have come from a consultancy background, and therefore understand how to balance multiple …
Milton Keynes, Buckinghamshire, United Kingdom Hybrid / WFH Options
The Boeing Company
of MOD and Government-based programs. Provide technical expertise in a variety of technologies to a multi-site program team. The successful candidate will be involved in the design, development, testing, implementation and support of a sophisticated integration solution and management of associated system components. You will possess technical experience of SAP Data Services, ideally with experience in designing … and developing integration applications using the product. The position requires cross-team coordination and leadership with separate, but tightly coupled development and architecture teams, infrastructure support teams, key suppliers, and other Boeing programs … applying the same technologies. Preferred Skills/Experience: The ideal candidate we're looking for has the following skills: ETL Development: Design and implement Extract, Transform, Load (ETL) processes using SAP Data Services. Data Quality Management: Develop and implement data quality rules and validation processes. Job Scheduling: Create and manage job schedules for data integration tasks. Performance …
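SAP Data Services implements validation rules through its own designer rather than hand-written code, so the following is only a language-agnostic sketch of the "data quality rules" idea in plain Python; every field name, rule, and site code here is invented.

```python
# Hypothetical data quality rules: each rule is a name plus a predicate,
# applied to records before load. Rejects would be routed to an error table.
from typing import Callable

Rule = tuple[str, Callable[[dict], bool]]

RULES: list[Rule] = [
    ("part_number present", lambda r: bool(r.get("part_number"))),
    ("quantity non-negative",
     lambda r: isinstance(r.get("quantity"), (int, float)) and r["quantity"] >= 0),
    ("site code known", lambda r: r.get("site") in {"SITE_A", "SITE_B"}),
]

def validate(record: dict) -> list[str]:
    """Return the names of all rules the record fails (empty list = clean)."""
    return [name for name, check in RULES if not check(record)]

records = [
    {"part_number": "A-100", "quantity": 5, "site": "SITE_A"},
    {"part_number": "", "quantity": -1, "site": "SITE_X"},
]

# Clean rows go to the target; failures carry their rule names for review.
clean = [r for r in records if not validate(r)]
rejects = [(r, validate(r)) for r in records if validate(r)]
print(clean)    # [{'part_number': 'A-100', 'quantity': 5, 'site': 'SITE_A'}]
print(rejects)  # the second record with all three rule names
```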