London, England, United Kingdom Hybrid / WFH Options
Axis Capital
our data solutions are secure, efficient, and optimized. Key Responsibilities: Design and implement data solutions using Azure services, including Azure Databricks, ADF, and Data Lake Storage. Develop and maintain ETL/ELT pipelines to process structured and unstructured data from multiple sources. Automate loads using Databricks Workflows and Jobs. Develop, test, and build CI/CD pipelines using Azure DevOps More ❯
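For illustration, the extract → transform → load flow these listings keep describing can be sketched in plain Python, with SQLite standing in for the target store. All names here (the CSV columns, the `staging` table) are invented for the example, not taken from any listed employer's systems:

```python
# Minimal ETL sketch: extract CSV rows, transform them, load into SQLite.
import csv
import io
import sqlite3

RAW = """id,amount,region
1,10.5,uk
2,-3.0,us
3,7.25,uk
"""

def extract(text):
    # Extract: parse CSV text into a list of dicts.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast types, drop negative amounts, normalise region codes.
    return [
        {"id": int(r["id"]), "amount": float(r["amount"]), "region": r["region"].upper()}
        for r in rows
        if float(r["amount"]) >= 0
    ]

def load(rows, conn):
    # Load: insert cleaned rows into a staging table and report the row count.
    conn.execute("CREATE TABLE staging (id INTEGER, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO staging VALUES (:id, :amount, :region)", rows)
    return conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
```

In a real Databricks or ADF pipeline each stage would be a separate activity or task; the point of the sketch is only the shape of the flow.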
Manassas, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data. This involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis. This involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance Machine learning model development and maintenance Data integration processes Data security and privacy regulations Data visualization tools development Data warehouse and data mart design and development ETL (Extract, Transform, Load) processes Data governance and compliance Proficiency in SQL and Python Knowledge of Big Data technologies (Hadoop, Spark) Cloud computing (AWS, Azure, GCP) Data modeling and architecture Advanced More ❯
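The cleansing and aggregation step this listing describes can be illustrated with a minimal sketch. The event fields (`src_ip`, `severity`) are hypothetical examples, not from any real Security Data Lake schema:

```python
# Sketch of a cleanse + aggregate step for ingested security events.
from collections import Counter

events = [
    {"src_ip": "10.0.0.1", "severity": "high"},
    {"src_ip": "10.0.0.1", "severity": "HIGH "},   # inconsistent label
    {"src_ip": "10.0.0.2", "severity": "low"},
    {"src_ip": None, "severity": "low"},           # malformed record
]

def cleanse(evts):
    # Drop records missing a source address and normalise severity labels.
    return [
        {"src_ip": e["src_ip"], "severity": e["severity"].strip().lower()}
        for e in evts
        if e["src_ip"]
    ]

def aggregate(evts):
    # Enrichment-style rollup: count events per (source, severity) pair.
    return Counter((e["src_ip"], e["severity"]) for e in evts)

summary = aggregate(cleanse(events))
```

After cleansing, the two differently-labelled "high" events from the same source collapse into one count, which is exactly the kind of consistency the listing's quality-and-integrity requirement is about.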
Richmond, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data. This involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis. This involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance Machine learning model development and maintenance Data integration processes Data security and privacy regulations Data visualization tools development Data warehouse and data mart design and development ETL (Extract, Transform, Load) processes Data governance and compliance Proficiency in SQL and Python Knowledge of Big Data technologies (Hadoop, Spark) Cloud computing (AWS, Azure, GCP) Data modeling and architecture Advanced More ❯
Virginia Beach, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data. This involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis. This involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance Machine learning model development and maintenance Data integration processes Data security and privacy regulations Data visualization tools development Data warehouse and data mart design and development ETL (Extract, Transform, Load) processes Data governance and compliance Proficiency in SQL and Python Knowledge of Big Data technologies (Hadoop, Spark) Cloud computing (AWS, Azure, GCP) Data modeling and architecture Advanced More ❯
Fairfax, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data. This involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis. This involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance Machine learning model development and maintenance Data integration processes Data security and privacy regulations Data visualization tools development Data warehouse and data mart design and development ETL (Extract, Transform, Load) processes Data governance and compliance Proficiency in SQL and Python Knowledge of Big Data technologies (Hadoop, Spark) Cloud computing (AWS, Azure, GCP) Data modeling and architecture Advanced More ❯
Newport News, Virginia, United States Hybrid / WFH Options
Centene
information into the Security Data Lake. Data Storage and Management: Store and manage the ingested data. This involves optimizing data schemas and ensuring data quality and integrity. ETL (Extract, Transform, Load) Processes: Design ETL pipelines to transform raw data into a format suitable for analysis. This involves data cleansing, aggregation, and enrichment, ensuring the data is usable for … quality and governance Machine learning model development and maintenance Data integration processes Data security and privacy regulations Data visualization tools development Data warehouse and data mart design and development ETL (Extract, Transform, Load) processes Data governance and compliance Proficiency in SQL and Python Knowledge of Big Data technologies (Hadoop, Spark) Cloud computing (AWS, Azure, GCP) Data modeling and architecture Advanced More ❯
London, England, United Kingdom Hybrid / WFH Options
Morgan Advanced Materials
Azure-hosted analytical platform, ensuring it meets current and future business requirements. Optimize the platform for performance, security, and scalability, ensuring alignment with best practices. Design and develop scalable ETL pipelines and data solutions using Azure Data Factory, Synapse Analytics, and related tools. Work closely with stakeholders to understand business needs and translate them into actionable data solutions. Lead and … data pipelines for seamless integration of ERP and other critical data sources into Azure Synapse. Ensure data workflows are optimized for performance, scalability, and reliability. Lead efforts to standardize ETL/ELT processes and enforce data quality standards across the platform. Implement data transformation logic to create clean, usable datasets for reporting and analytics. Platform Administration and Optimization: Monitor and … future needs. Key Skills: Leadership: Experience managing and mentoring data engineers or technical teams. Technical Expertise: Advanced knowledge of Azure Synapse Analytics, Azure Data Factory, and SQL. Proficiency in ETL/ELT pipeline design and optimization. Familiarity with automation tools (Azure Logic Apps, Power Automate, Python, or PowerShell scripting). Knowledge of cloud performance optimization and cost management. Collaboration: Strong More ❯
Bracknell, England, United Kingdom Hybrid / WFH Options
Evelyn Partners
needs. • Extensive use of and fully conversant in SQL. • Experience working with programming languages like C#, Python, Java, Spark. • Create and maintain ETL/ELT processes to extract, transform, and load data from various sources into the data platforms. • Design, develop, and deploy SSIS packages and ADF pipelines. • Manage and troubleshoot SSIS packages and ADF pipelines, ensuring data integrity and … queries and data extracts to support business needs. • Work closely with other engineers, DBAs, and Tech staff to ensure seamless data integration and system compatibility. • Document data warehouse architecture, ETL processes, and database configurations. • Maintain effective relationships with stakeholders and ensure communication is effective and timely. • Provide time estimates for work so stakeholder expectations and resources can be managed. • Collaborate … MS SQL Server, including T-SQL programming, stored procedures, and functions. • Extensive experience with SQL Server Integration Services (SSIS) & Visual Studio and Azure Data Factory (ADF) and GitHub, for ETL development. • Experience building and maintaining reports using SQL Server Reporting Services (SSRS). • Solid understanding of data warehousing concepts, including star/snowflake schemas, dimensional modelling, and data vault 2.0 More ❯
Greater London, England, United Kingdom Hybrid / WFH Options
Aventum Group
of Excellence for Insight and analysis for the whole Aventum Group Identify data quality issues and support the data governance initiative by developing exceptional reports Model business logic, extract key performance metrics, and create reusable data structures. Maintain data so that it remains available and usable, e.g. SQL (Basic, Advanced modelling, Efficient, Big Data and Programmatic) Support Data munging … such as reshaping, aggregating, joining disparate sources, small-scale ETL, API interaction, and automation, e.g. Python Support the data normalisation and modelling as part of the transform step of ETL/ELT pipelines Dive deep to create trustworthy and understandable data for our DA to create insights on how well Aventum is taking care of the customer as it relates … Azure App service, Azure ML is a plus Languages: Python, SQL, T-SQL, SSIS DB: Azure SQL Database, Cosmos DB, NoSQL Methodologies: Agile, DevOps must have Concepts: ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and Integration testing Management Duties Yes We are an equal opportunity employer, and we are proud to More ❯
Blackpool, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
with business priorities Data Architecture & Design: Architect and implement robust designs, ensuring optimal performance, scalability, and security Translate high-level solution designs into detailed technical specifications, including data models, ETL processes, and database schemas Establish and enforce data modelling best practices, promoting data integrity and consistency across all platforms ETL Development & Optimization: Develop and optimize complex ETL pipelines for diverse … data sources (on-premise and cloud-based), utilizing best practices for data extraction, transformation, and loading Implement data quality checks and validation processes within ETL pipelines, ensuring data accuracy and reliability Optimize ETL performance for speed and efficiency, addressing bottlenecks and improving data processing times Data Platform Management & Governance: Maintain and enhance the company's data platforms, ensuring high availability … communicator, particularly in presenting technical information to non-technical stakeholders, and ensuring alignment with business objectives The Person Essential: Minimum 5+ years of experience in designing and implementing ETL processes and data designs Expert-level proficiency in T-SQL, SSIS, and the Microsoft data stack (SQL Server, Azure SQL Database) Proven proficiency in Azure Data Factory Proven experience in designing More ❯
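The in-pipeline data quality checks this role calls for can be sketched minimally in Python. The specific checks and column names (`id`, `amount`) are assumptions for the example; a real pipeline would drive them from a rules catalogue:

```python
# Illustrative pre-load data quality checks for an ETL pipeline.
def run_checks(rows):
    """Return a list of failed check names; empty means the batch may load."""
    failures = []
    ids = [r["id"] for r in rows]
    if len(ids) != len(set(ids)):
        failures.append("duplicate ids")          # uniqueness check
    if any(r["amount"] is None for r in rows):
        failures.append("null amounts")           # completeness check
    if any(r["amount"] < 0 for r in rows if r["amount"] is not None):
        failures.append("negative amounts")       # validity check
    return failures

good = [{"id": 1, "amount": 5.0}, {"id": 2, "amount": 0.0}]
bad = [{"id": 1, "amount": 5.0}, {"id": 1, "amount": None}]
```

A pipeline would typically fail fast (or quarantine the batch) when `run_checks` returns anything, rather than loading suspect data and reconciling later.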
London, England, United Kingdom Hybrid / WFH Options
ScanmarQED
programming within a dynamic (software) project environment. Data Infrastructure and Engineering Foundations: Data Warehousing: Knowledge of tools like Snowflake, Databricks, ClickHouse and traditional platforms like PostgreSQL or SQL Server. ETL/ELT Development: Expertise in building pipelines using tools like Apache Airflow, dbt, Dagster. Cloud providers: Proficiency in Microsoft Azure or AWS. Programming and Scripting: Programming Languages: Strong skills in More ❯
data challenges. Your Job's Key Responsibilities Are: Designing, developing, and maintaining robust data pipelines using Databricks, Spark, and Python Building efficient and scalable ETL processes to ingest, transform, and load data from various sources (databases, APIs, streaming platforms) into cloud-based data lakes and warehouses Leveraging the Databricks ecosystem (SQL, Delta Lake, Workflows, Unity Catalog) to deliver reliable and … data manipulation libraries (e.g., PySpark, Spark SQL) Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables Solid understanding of data warehousing principles, ETL/ELT processes, data modeling and techniques, and database systems Proven experience with at least one major cloud platform (Azure, AWS, or GCP) Excellent SQL skills for data querying, transformation More ❯
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
database standards. Collaborate with team members to support data workflows for bronze, silver, and gold data tables. Help identify recurring issues and suggest improvements for operational efficiency. Support basic ETL (Extract, Transform, Load) processes to maintain data pipelines. Execute simple data quality checks to ensure the accuracy and consistency of data. Assist in troubleshooting database errors and performance bottlenecks. Escalate … as needed. Adhere to security protocols and compliance requirements in database operations. Assist in Data Migration and Integration Efforts - Support the transition of data between systems by helping with Extract, Transform, Load (ETL) processes and ensuring data consistency across different platforms. Monitor and Troubleshoot Database Performance Issues - Identify potential bottlenecks, perform root cause analysis, and work with senior architects to More ❯
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
part in ongoing training and knowledge-sharing sessions with more experienced architects Assist in Data Migration and Integration Efforts - Support the transition of data between systems by helping with Extract, Transform, Load (ETL) processes and ensuring data consistency across different platforms. Monitor and Troubleshoot Database Performance Issues - Identify potential bottlenecks, perform root cause analysis, and work with senior architects to … in either Python or SQL, with knowledge in building and optimizing data pipelines or analytics solutions. Familiarity with popular relational databases (e.g., MySQL, PostgreSQL, SQL Server) Basic understanding of ETL processes and tools (e.g., Talend, Informatica, or similar) Exposure to cloud-based data services (e.g., AWS RDS, Azure SQL Database, or Google Cloud SQL) Understanding of fundamental networking concepts and More ❯
data manipulation libraries (e.g., PySpark, Spark SQL) • Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables • Solid understanding of data warehousing principles, ETL/ELT processes, data modeling and techniques, and database systems • Proven experience with at least one major cloud platform (Azure, AWS, or GCP) • Excellent SQL skills for data querying, transformation … platform, supporting clients in solving complex data challenges. • Designing, developing, and maintaining robust data pipelines using Databricks, Spark, and Python • Building efficient and scalable ETL processes to ingest, transform, and load data from various sources (databases, APIs, streaming platforms) into cloud-based data lakes and warehouses • Leveraging the Databricks ecosystem (SQL, Delta Lake, Workflows, Unity Catalog) to deliver reliable and … Staying up to date with advancements in Databricks, data engineering, and cloud technologies to continuously improve tools and approaches Technologies: AI AWS Azure CI/CD Cloud Databricks DevOps ETL GCP Support Machine Learning Power BI Python PySpark SQL Spark Terraform Unity GameDev Looker SAP More: NETCONOMY has grown over the past 20 years from a startup to a More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Harding Retail
life assurance, healthcare cash plan, 65 days leave annually (including Christmas shutdown) What you will be doing: Data Warehouse Optimisation: Performance Tuning: Analyse, optimise, and refactor existing stored procedures, ETL jobs, and data workflows to improve performance and reduce load times on the current data warehouse Capacity Management: Clean up and enhance the efficiency of the current Power BI Premium … of data jobs to ensure timeliness and reliability, improving operational processes to meet evolving business needs Data Quality & Governance: Implement data quality checks, governance, and performance monitoring for both ETL processes and reporting outputs Operational Support: Provide ongoing support for the existing data platform, responding to and resolving performance issues, data quality concerns, and user queries Migration Leadership & Futureproofing: Cloud … and actively execute the long-term migration of the on-premises data warehouse to Microsoft Fabric, including hands-on development and implementation of a robust strategy for transitioning datasets, ETL pipelines, and reporting solutions to the cloud. ETL & Pipeline Development: Build scalable ETL processes using Azure Data Factory, Azure Synapse, and other Fabric-native tools, ensuring a smooth integration between More ❯
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
for business intelligence and reporting. We integrate data from Dynamics 365 (CE & F&O) and other sources, building efficient data structures to support analytical insights. By developing and optimizing ETL/ELT pipelines, we ensure data accuracy, consistency, and performance across the warehouse. Leveraging Azure Data Services such as Synapse, Data Factory, and SQL Server, we provide scalable solutions that … skilled Data Warehouse Developer to design, build, and maintain an internal data warehouse using the Kimball methodology. The ideal candidate will have expertise in creating fact and dimension tables, ETL processes, and ensuring data integrity. Experience with Dynamics 365 CE & F&O is highly desirable. KEY DUTIES AND RESPONSIBILITIES Design and implement a Kimball-style data warehouse architecture, including fact … and dimension tables. Develop and optimize ETL/ELT pipelines to integrate data from Dynamics 365 (CE & F&O) and other sources. Collaborate with business stakeholders to define key business metrics and reporting needs. Ensure data quality, consistency, and performance across the warehouse. Work with Azure Data Services (Synapse, Data Factory, Data Lake, SQL DB, etc.) to build scalable solutions. More ❯
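As an illustration of the Kimball-style fact/dimension split this listing asks for, here is a toy star schema in SQLite. The table names, columns, and the sales metric are invented for the example and are not from the Creditsafe warehouse:

```python
# A toy star schema: two dimension tables and one fact table keyed to them.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    iso_date TEXT
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme Ltd');
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales VALUES (1, 20240101, 250.0), (1, 20240101, 99.5);
""")

# A typical business metric: total sales per customer, joined via surrogate keys.
total = conn.execute("""
    SELECT c.customer_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c USING (customer_key)
    GROUP BY c.customer_name
""").fetchone()
```

The fact table holds only keys and measures; descriptive attributes live in the dimensions, which is what keeps reporting queries simple and the grain of each fact row unambiguous.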
London, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Energy Company - London (Tech Stack: Data Engineer, Databricks, Python, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) Company Overview: Join a dynamic team, a leading player in the energy sector, committed to innovation and sustainable solutions. Our client is seeking a talented Data Engineer to help build and optimise their data infrastructure, enabling them to harness the … years), preferably in the energy sector. Right to work in the UK. Strong proficiency in SQL and database technologies (e.g., MS SQL, Snowflake). Hands-on experience with ETL/ELT tools such as Azure Data Factory, DBT, AWS Glue, etc. Proficiency in Power BI and Advanced Analytics for insightful data visualisation. Strong programming skills in Python for data processing More ❯
to ensure data infrastructure is optimized for performance, scalability, and reliability. Provide leadership and mentorship to data architecture and engineering teams. Data Integration and Management: Design and implement robust ETL processes to integrate data from various sources into the data warehouse and big data platforms. Oversee the management of metadata, master data, and data lineage across systems. Ensure data consistency … Proven track record of designing and implementing large-scale data architectures in complex environments. CI/CD and DevOps experience is a plus. Skills: Strong expertise in data modeling, data integration (ETL/ELT), and database design. Proficiency in SQL, PL/SQL, and performance tuning in Teradata, Oracle, and other databases. Strong experience with cloud data platform tools and services (e.g. More ❯
Gloucester, England, United Kingdom Hybrid / WFH Options
Benefact Group
performance environment, supporting the Head of Data Operations in coordinating activities and achieving milestones. Cloud Data Engineering: Lead the design, build, and maintenance of cloud-native data pipelines using ETL/ELT, streaming, and modern data processing techniques. Data Visualisation & Analytics: Oversee scalable, cloud-based analytics and data services to enable business decision-making. Operational Support & Troubleshooting: Ensure high availability More ❯
Gloucester, England, United Kingdom Hybrid / WFH Options
Benefact Group plc
team activities, set clear goals, and monitor progress to achieve project milestones and objectives. Cloud Data Engineering: Lead the design, build, and maintenance of cloud-native data pipelines using ETL/ELT, streaming, and modern data processing techniques. Data Visualisation & Analytics: Oversee the provision of scalable, cloud-based analytics and data services that empower business decision-making. Operational Support & Troubleshooting More ❯
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
Jira (or similar agile tool), ensuring timely and high-quality deliverables. Experience with cloud platforms (AWS, Azure, or GCP), including infrastructure provisioning, configuration, and cost optimization. Solid understanding of ETL/ELT processes and tools, with a track record of designing and managing reliable data pipelines. Expertise in CI/CD tools (e.g., Jenkins, GitLab CI, or Azure DevOps), including More ❯
London, England, United Kingdom Hybrid / WFH Options
Endeavour Recruitment Solutions
Technologies: Data Engineer ETL SQL Power BI Azure Data Warehouse We have an exciting Hybrid working opportunity for a Data Engineer to join our client's growing Data team, playing a key role in surfacing data within their fast-growing Finance business on the South Coast. The role: Responsibility for designing, building, and implementing a robust and scalable data warehouse that … functions with the Microsoft Azure Platform. This requires defining data models and ensuring data integrity and accuracy and enabling integrations with other platforms. Automating data pipelines to extract, transform, and load data from various sources into the data warehouse, identifying and implementing appropriate tools such as Databricks, designing, and testing data transformations, and ensuring data quality and consistency. Having … use their initiative to bring about positive change and take initiative in spotting opportunities and finding ways to solve challenges with data. Has hands-on experience with data warehousing and ETL tools and packages which integrate with Azure Data Factory. Has a comprehensive understanding of ETL technical design, data quality testing, cleansing, and monitoring. You should have experience with data More ❯
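The pipeline automation these listings describe, i.e. running dependent steps in order and retrying transient failures, can be sketched in a few lines of Python. This stands in for what orchestrators such as Azure Data Factory or Databricks Workflows provide as managed services; every step name and failure mode here is illustrative:

```python
# Minimal orchestration sketch: run named pipeline steps in order, with retries.
import time

def run_pipeline(steps, retries=2, delay=0.0):
    """steps is a list of (name, callable); returns (name, result) pairs."""
    results = []
    for name, fn in steps:
        for attempt in range(retries + 1):
            try:
                results.append((name, fn()))
                break                      # step succeeded, move on
            except Exception:
                if attempt == retries:     # out of retries: fail the pipeline
                    raise
                time.sleep(delay)          # back off before retrying
    return results

# Simulate a source that fails once with a transient error, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient source error")
    return 42

out = run_pipeline([("extract", flaky_extract), ("load", lambda: "ok")])
```

Real orchestrators add scheduling, dependency graphs, alerting, and per-activity logging on top of this basic loop, but the retry-then-escalate shape is the same.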