ensure understanding and adherence to architectural and modeling standards. Stakeholder Engagement: Partner with the Data Management team to drive the group's data strategy. Collaborate with business units to extract greater value from data assets. Engage with key stakeholders to identify technical opportunities for enhancing data product delivery. Provide consultative advice to business leaders and organizational stakeholders with actionable recommendations … EA Sparx). Strong knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud). Experience with data integration and ETL tools (e.g., Talend, Informatica). Excellent analytical and technical skills. Excellent planning and organizational skills. Knowledge of all components of holistic enterprise architecture. What we offer: Colt is a growing …
Easter Howgate, Midlothian, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
and maintaining data pipelines, data warehouses, and leveraging data services. Proficient in DataOps methodologies and tools, including experience with CI/CD pipelines, containerisation, and workflow orchestration. Familiar with ETL/ELT frameworks, and experienced with Big Data processing tools (e.g. Spark, Airflow, Hive). Knowledge of programming languages (e.g. Java, Python, SQL). Hands-on experience with SQL/NoSQL …
Lambda), Snowflake, Databricks, and Reltio to design and implement scalable data solutions. Key Responsibilities: Lead data architecture design using AWS, Snowflake, Databricks, and Reltio. Manage data migration, data modeling, ETL processes, and data integration. Work with reporting tools such as Power BI or Tableau. Implement and maintain data governance frameworks. Ensure compliance with GDPR and other data privacy regulations. Maintain … plus. Required Skills & Experience: 15-20 years of experience in data architecture. Expertise in AWS, Snowflake, Databricks, and Reltio. Experience with data governance, data privacy regulations, and ETL tools. Familiarity with reporting tools like Power BI and Tableau. Experience with motor fleet insurance is a plus. Additional Information: Remote work with occasional travel to Belgium …
working closely with senior leadership. The team has already laid the foundations for a modern data platform using Azure and Databricks and is now focused on building out scalable ETL processes, integrating AI tools, and delivering bespoke analytics solutions across the organisation. THE ROLE As a Data Engineer, you'll play a pivotal role in designing and implementing robust data … This role combines hands-on development with collaborative architecture design, and offers the opportunity to contribute to AI readiness within a fast-paced business. KEY RESPONSIBILITIES Develop and maintain ETL pipelines, including manual and semi-manual data loads Connect and integrate diverse data sources across cloud platforms Collaborate with analytics and design teams to create bespoke, scalable data solutions Support … ready for future AI use cases REQUIRED SKILLS & EXPERIENCE Strong experience with Azure and Databricks environments Advanced Python skills for data engineering (pandas, PySpark) Proficiency in designing and maintaining ETL pipelines Experience with Terraform for infrastructure automation Track record of working on cloud migration projects, especially Azure to Databricks Comfortable working onsite in London 2 days/week and engaging …
moving/transforming data across layers (Bronze, Silver, Gold) using ADF, Python, and PySpark. Must have hands-on experience with PySpark, Python, AWS, and data modelling. Must have experience in ETL processes. Must have hands-on experience in Databricks development. Good to have experience in developing and maintaining data integrity and accuracy, data governance, and data security policies and procedures. Must …
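The Bronze/Silver/Gold layering mentioned above is commonly called a medallion architecture: raw data lands in Bronze, is deduplicated and conformed in Silver, and aggregated into reporting-ready shapes in Gold. A minimal sketch of the idea, using plain Python structures in place of PySpark DataFrames (the cleaning rules and field names here are illustrative assumptions, not taken from the listing):

```python
# Minimal sketch of medallion-style layering, with plain Python
# structures standing in for PySpark DataFrames.

# Bronze: raw records as ingested, duplicates and bad rows included.
bronze = [
    {"order_id": "1", "amount": "10.50", "region": "uk"},
    {"order_id": "1", "amount": "10.50", "region": "uk"},   # duplicate
    {"order_id": "2", "amount": "bad",   "region": "eu"},   # unparseable
    {"order_id": "3", "amount": "7.25",  "region": "uk"},
]

def to_silver(rows):
    """Silver: deduplicate on order_id and drop rows that fail typing."""
    seen, silver = set(), []
    for row in rows:
        if row["order_id"] in seen:
            continue
        try:
            amount = float(row["amount"])
        except ValueError:
            continue
        seen.add(row["order_id"])
        silver.append({"order_id": row["order_id"],
                       "amount": amount,
                       "region": row["region"].upper()})
    return silver

def to_gold(rows):
    """Gold: aggregate to a reporting-ready shape (revenue per region)."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'UK': 17.75}
```

In a real Databricks pipeline each layer would be a Delta table and the transforms would be PySpark jobs orchestrated by ADF, but the layer responsibilities are the same.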
Your Mission: Design and build data integration solutions following technical standards. Develop application interfaces for data and analytics products. Work on big data projects, leveraging real-time technologies. Support ETL development and data transformations based on business requirements. Ensure data work aligns with governance and security policies. Collaborate with the wider data and analytics team. Maintain and troubleshoot production data … pipelines. Document data development using technical documents and DevOps tools. Skills & Experience: Solid experience with the Azure data integration stack. Proficiency with SQL, ETL, Python, and APIs. Strong knowledge of relational databases (MySQL, SQL Server, PostgreSQL). Experience with NoSQL databases (e.g., Cosmos DB). Familiar with data integration methods (real-time, periodic, batch). Strong understanding of application interfaces (REST …
SSIS) packages and new cloud-native solutions within a microservice and containerised architecture. Key accountabilities include: Developing near real-time integration services using cloud technologies. Building and optimising SSIS ETL packages. Selecting appropriate integration approaches (ETL/ELT) aligned to business outcomes. Collaborating across product and analytics teams to integrate with platforms such as Azure Data Lake and Synapse. Managing … through innovation, prototyping, and cross-functional collaboration. We are looking for a technically skilled and proactive professional with: Strong experience in Microsoft SQL, Azure Data Factory, Synapse, and related ETL/ELT tools. Proven capability in managing Azure-based integration services and data solutions. A solid understanding of microservices, data modelling, and DevOps principles. Experience in handling large, complex datasets …
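The ETL vs ELT choice mentioned above comes down to where the transformation runs: ETL reshapes data in application code before loading it into the target, while ELT loads raw data first and transforms it inside the target engine with SQL (the usual pattern for Synapse or a lakehouse). A compact illustration of both patterns, using SQLite as a stand-in target (table and column names are invented for the example):

```python
import sqlite3

raw = [("widget", "10"), ("gadget", "20"), ("widget", "5")]

# --- ETL: transform in application code, then load the result. ---
etl_db = sqlite3.connect(":memory:")
etl_db.execute("CREATE TABLE sales_by_product (product TEXT, total INTEGER)")
totals = {}
for product, qty in raw:                 # transform happens before the load
    totals[product] = totals.get(product, 0) + int(qty)
etl_db.executemany("INSERT INTO sales_by_product VALUES (?, ?)", totals.items())

# --- ELT: load the raw rows first, then transform with SQL in the target. ---
elt_db = sqlite3.connect(":memory:")
elt_db.execute("CREATE TABLE sales_raw (product TEXT, qty TEXT)")
elt_db.executemany("INSERT INTO sales_raw VALUES (?, ?)", raw)
elt_db.execute("""
    CREATE TABLE sales_by_product AS
    SELECT product, SUM(CAST(qty AS INTEGER)) AS total
    FROM sales_raw GROUP BY product
""")

print(sorted(elt_db.execute("SELECT * FROM sales_by_product")))
# [('gadget', 20), ('widget', 15)]
```

ELT tends to win when the target engine scales better than the integration layer; ETL wins when the target should only ever see cleaned, conformed data.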
and digital currency trading and custody solutions. Main Duties & Responsibilities: Design, develop, and maintain scalable data infrastructure to support our BI/MI workloads. Manage and optimize data pipelines, ETL/ELT processes, and data warehousing solutions. Ensure high availability, performance, and reliability of our data platforms and services. Conduct code reviews, mentor others, and enforce best practices in data … Proficiency in Python or Java for data retrieval and pipeline development. Experience with IaC tools such as Terraform or Ansible for deployment and infrastructure management. Hands-on experience with: ETL/ELT orchestration and pipeline tools (Airflow, Airbyte, DBT, etc.), data warehousing tools and platforms (Snowflake, Iceberg, etc.), SQL databases, particularly MySQL. Desired Experience: Experience with cloud-based services, particularly …
helping businesses make better decisions. Someone who thrives in a collaborative environment and values transparency and integrity. Strong analytical and problem-solving skills. Basic knowledge of data modelling and ETL processes. Proficiency in Excel and other Microsoft Office applications. Ability to communicate clearly and effectively with technical and non-technical audiences. Willingness to learn and adapt in a fast-paced … with stakeholders to define requirements, develop BI solutions and give advice on data strategy, visualization best practices and performance optimization. In addition, you'll assist with data integration and ETL processes with the IT and data teams, identifying opportunities for analytics-driven improvements across the business. You will support the design and implementation of BI solutions, including data models and …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
with your team and other teams to optimise data solutions. This is a fantastic opportunity for a highly skilled and motivated Lead Data Engineer with strong expertise in data architecture, ETL pipelines, cloud technologies and big data solutions. In this role, you will have the following: - Technical Leadership - Team Management - Leading by Example (100% hands-on) - Data Strategy & Solutions - Cloud (Azure … minimum of 2). Extensive Big Data hands-on experience across coding/configuration/automation/monitoring/security is necessary. Significant AWS or Azure hands-on experience. ETL tools such as Azure Data Factory (ADF) and Databricks, or similar. Data Lakes: Azure Data, Delta Lake, Data Lake or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are … our client is NOT offering sponsorship for this role. KEYWORDS Lead Data Engineer, Senior Lead Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, On-Prem, Cloud, ETL, Azure Data Factory, ADF, Databricks, Azure Data, Delta Lake, Data Lake. Please note that due to a high level of applications, we can only respond to applicants whose skills and …
and problem-solving abilities. Bring thought leadership to your area of responsibility and enjoy staying ahead in your field. You possess the following skills and experiences: Solid understanding of ETL tooling to perform data transformation tasks. Strong understanding of data design principles and dimensional data modeling. Advanced SQL skills and understanding of query optimization strategies. Preferred skills and experience across … the following: ETL tools – Informatica IICS, Unix/Linux shell scripting, SQL Server & stored procedures (SSMS), SSMA (SQL Server Migration Assistant), GitHub, API integration, Alteryx, data visualization, automation & scheduling, documentation & communication. What’s Next? If you have the skills and passion to take on this DATA ENGINEER position then APPLY NOW for immediate consideration.
Employment Type: Permanent
Salary: £60000 - £67000/annum Hybrid, Great Benefits
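Dimensional data modeling, as asked for in the listing above, usually means star schemas: a fact table of measures joined to dimension tables of descriptive attributes. A toy example in SQLite (the schema and data are invented purely for illustration):

```python
import sqlite3

db = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes, one row per product.
db.execute("CREATE TABLE dim_product "
           "(product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)")
# Fact table: one row per sale, with measures plus a foreign key
# pointing into the dimension.
db.execute("CREATE TABLE fact_sales (product_key INTEGER, amount REAL)")
db.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
               [(1, "Laptop", "Electronics"), (2, "Desk", "Furniture")])
db.executemany("INSERT INTO fact_sales VALUES (?, ?)",
               [(1, 900.0), (1, 1100.0), (2, 250.0)])

# The typical dimensional query: slice measures by a dimension attribute.
rows = db.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('Electronics', 2000.0), ('Furniture', 250.0)]
```

The same shape carries over directly to warehouse engines like SQL Server or Snowflake; the design principle is that facts stay narrow and numeric while descriptive churn lives in the dimensions.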
for the full end-to-end work, pulling and cleaning data, creating dashboards in Power BI and presenting insights back to stakeholders. YOUR SKILLS AND EXPERIENCE • Advanced SQL skills • ETL experience (Azure Data Factory or SSIS preferred) • Strong data visualisation skills (ideally Power BI) • Proven ability to interpret and communicate data insights • Experience managing workflows or mentoring peers • Agile mindset …
consumable information for all stakeholders. * Reviewing, evaluating and designing data services. * Implementing high-quality data architectures. * Advising on data system design, applied to complex financial data. * Designing and implementing ETL processes for efficient data extraction. * Planning and organising stakeholder workshops to gather requirements. BI Developer - What will you need? * Strong storytelling skills in Power BI. * Advanced SQL skills. * Ability …
QUALIFICATIONS - Experience in analyzing and interpreting data with Redshift, Oracle, NoSQL etc. - Experience with data visualization using Tableau, Quicksight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in Statistical Analysis packages such as R, SAS and Matlab - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process … for modeling - Experience with SQL - Experience in the data/BI space PREFERRED QUALIFICATIONS - Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift - Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to …
the board - all while building slick, automated data pipelines in Python. Key Responsibilities: Lead data migration and integration tasks, ensuring quality and alignment across systems. Design, implement, and optimise ETL processes to streamline data movement from diverse sources. Build and enhance a robust Snowflake data warehouse to support evolving reporting needs. Troubleshoot and resolve data quality issues, supporting business users … data lifecycle - from ingestion and transformation to validation and analysis. Strong Python skills and confidence building automation for large-scale data tasks. Hands-on Snowflake experience. Deep understanding of ETL/ELT pipelines and data governance standards. A solid grasp of financial data or lending domain knowledge is highly desirable. A passion for mentoring and upskilling teammates in a supportive …
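The data-quality work described above typically starts with a validation gate: each record is checked against explicit rules before it is loaded into the warehouse, and violations are routed aside for triage. A minimal Python sketch of that pattern (the loan-record fields and rules are invented for illustration; a real Snowflake pipeline would load `good` and quarantine `bad`):

```python
# Illustrative data-quality gate of the kind run before a warehouse load.
# Field names and rules are hypothetical, chosen to echo a lending domain.
from datetime import date

def validate(row):
    """Return the list of rule violations for one record (empty = clean)."""
    errors = []
    if not row.get("loan_id"):
        errors.append("missing loan_id")
    if not isinstance(row.get("principal"), (int, float)) or row["principal"] <= 0:
        errors.append("principal must be a positive number")
    try:
        date.fromisoformat(row.get("origination_date", ""))
    except ValueError:
        errors.append("origination_date not ISO formatted")
    return errors

rows = [
    {"loan_id": "L1", "principal": 5000, "origination_date": "2024-01-31"},
    {"loan_id": "",   "principal": -10,  "origination_date": "31/01/2024"},
]
good = [r for r in rows if not validate(r)]
bad  = [(r, validate(r)) for r in rows if validate(r)]
print(len(good), len(bad))  # 1 1
```

Keeping the rules as data-returning checks (rather than exceptions) makes it easy to report every violation on a row at once, which shortens the triage loop with business users.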
years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL etc. - Experience with data visualization using Tableau, Quicksight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in Statistical Analysis packages such as R, SAS and Matlab - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process … data for modeling PREFERRED QUALIFICATIONS - Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift - Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during …
support strategic decision-making and network improvements. BASIC QUALIFICATIONS - Experience with SQL - Experience with data visualization using Tableau, Quicksight, or similar tools - Experience with data modeling, warehousing and building ETL pipelines - Experience in analyzing and interpreting data with Redshift, Oracle, NoSQL etc. - Experience using SQL to pull data from a database or data warehouse and scripting experience (Python) to process … data for modeling PREFERRED QUALIFICATIONS - Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift - Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your …
Feel free to reach out and apply today! Responsibilities: Design and build robust, scalable data pipelines that serve business-critical applications and analytics use cases. Modernise and migrate legacy ETL processes into cloud-native solutions using tools such as Azure Data Factory, Snowflake, and Databricks. Collaborate with cross-functional teams including data architects, analysts, and platform engineers to deliver production …
and the mapping of data lineage/inventory models to support business practices. A high level of proficiency with data management and analysis tools (e.g., Azure data products, SQL, ETL, APIs, Power BI, DQ analysis tools, Excel, etc.). Experience in financial services and an understanding of investment, insurance and pension products. Sound report-writing skills, able to appropriately select …
of data, enjoys the challenge of highly complex technical contexts, and, above all else, is passionate about data and analytics. He/she is an expert in data modeling, ETL design and business intelligence tools, and has hands-on knowledge of columnar databases such as Redshift and other related AWS technologies. He/she passionately partners with customers to identify … business analysis, benchmarking, and optimization efforts. BASIC QUALIFICATIONS - Bachelor's degree or equivalent - Experience defining requirements and using data and metrics to draw business insights - Experience with SQL or ETL - Knowledge of Python, VBA, Macros, Selenium scripts PREFERRED QUALIFICATIONS - Experience working with Tableau - Experience using very large datasets - Knowledge of data visualization tools such as QuickSight, Tableau, Power BI …
Manchester, North West, United Kingdom Hybrid / WFH Options
InterQuest Group (UK) Limited
Troubleshoot integration issues and optimize pipeline performance Document workflows and maintain best practices for SnapLogic development Requirements: Proven hands-on experience with SnapLogic (Enterprise Integration Cloud) Strong understanding of ETL/ELT concepts and integration patterns Experience working with APIs, cloud platforms (e.g., AWS, Azure, GCP), and databases (SQL/NoSQL) Familiarity with REST, JSON, XML, and data mapping/ …
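The data-mapping side of integration work named above (REST/JSON payloads in, a different schema out) amounts to declaring how each target field is derived from the source document. SnapLogic expresses this visually in its pipeline designer; the underlying idea looks like this plain-Python sketch, where the source and target field names are invented for the example:

```python
import json

# Source payload as it might arrive from a hypothetical REST API.
source = json.loads("""{
    "customer": {"firstName": "Ada", "lastName": "Lovelace"},
    "contact":  {"email": "ada@example.com"},
    "accountStatus": "ACTIVE"
}""")

# Declarative field mapping: target field -> how to derive it from the source.
MAPPING = {
    "full_name": lambda s: f'{s["customer"]["firstName"]} {s["customer"]["lastName"]}',
    "email":     lambda s: s["contact"]["email"],
    "is_active": lambda s: s["accountStatus"] == "ACTIVE",
}

def map_record(source_doc):
    """Apply every mapping rule to produce the target-schema record."""
    return {field: extract(source_doc) for field, extract in MAPPING.items()}

print(map_record(source))
# {'full_name': 'Ada Lovelace', 'email': 'ada@example.com', 'is_active': True}
```

Keeping the mapping as a data structure, rather than inline code, is what makes such integrations easy to document and review, which is the same property the visual pipeline tools trade on.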
Hybrid Role with 2 days per week onsite in Central London. Skillset required: * Data Pipeline Expertise: Extensive experience in designing and implementing scalable ETL/ELT data pipelines in Azure Databricks, transforming raw data into usable datasets for analysis. * Azure Databricks Proficiency: Strong knowledge of Spark (SQL, PySpark) for data transformation and processing within Databricks, along with experience building workflows …