our data solutions are secure, efficient, and optimized. Key Responsibilities: Design and implement data solutions using Azure services, including Azure Databricks, ADF, and Data Lake Storage. Develop and maintain ETL/ELT pipelines to process structured and unstructured data from multiple sources. - Automate loads using Databricks Workflows and Jobs. - Develop, test, and build CI/CD pipelines using Azure DevOps.
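Automating loads with Databricks Workflows typically means describing the job as a JSON payload for the Jobs API and creating it from a CI/CD step. A minimal sketch follows; the job name, notebook path, cluster spec, and schedule are illustrative placeholders, not values from this posting:

```python
import json

def build_job_payload(job_name, notebook_path):
    """Assemble a minimal Databricks Jobs API 2.1 job definition.

    The notebook path, cluster sizing, and cron schedule below are
    placeholder values for illustration only.
    """
    return {
        "name": job_name,
        "tasks": [
            {
                "task_key": "nightly_load",
                "notebook_task": {"notebook_path": notebook_path},
                "new_cluster": {
                    "spark_version": "13.3.x-scala2.12",
                    "node_type_id": "Standard_DS3_v2",
                    "num_workers": 2,
                },
            }
        ],
        # Quartz cron: run every day at 02:00 UTC.
        "schedule": {
            "quartz_cron_expression": "0 0 2 * * ?",
            "timezone_id": "UTC",
        },
    }

payload = build_job_payload("nightly-sales-load", "/Repos/etl/load_sales")
print(json.dumps(payload, indent=2))
```

In an Azure DevOps pipeline, a payload like this can be POSTed to the workspace's `/api/2.1/jobs/create` endpoint, which keeps job definitions under version control alongside the notebooks they run.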
Transformation: Integrate and transform data from multiple organisational SQL databases and SaaS applications using end-to-end dependency-based data pipelines, to establish an enterprise source of truth. Create ETL and ELT processes using Azure Databricks, ensuring audit-ready financial data pipelines and secure data exchange with Databricks Delta Sharing and SQL Warehouse endpoints. Governance and Compliance: Ensure compliance with … including Storage, ADLS Gen2, Azure Functions, Kubernetes. Background in cloud platforms and data architectures, such as Corporate Data Lake, Medallion Architecture, Metadata-Driven Platform, Event-driven architecture. Proven experience of ETL/ELT, including Lakehouse, Pipeline Design, Batch/Stream processing. Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark, Spark SQL. Good working knowledge of data warehouse and …
appropriate architecture design, opting for modern architectures where possible. Data Modeling: Design and optimize data models and schemas for efficient storage, retrieval, and analysis of structured and unstructured data. ETL Processes: Develop, optimize, and automate ETL workflows to extract data from diverse sources, transform it into usable formats, and load it into data warehouses, data lakes, or lakehouses. Big Data … teams, including data scientists, analysts, and software engineers, to understand requirements, define data architectures, and deliver data-driven solutions. Documentation: Create and maintain technical documentation, including data architecture diagrams, ETL workflows, and system documentation, to facilitate understanding and maintainability of data solutions. Best Practices: Stay current with emerging technologies and best practices in data engineering, cloud architecture, and DevOps. Mentoring … and Flink. Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as AWS …
Jira). • Experience creating BI models and dashboards, ideally in Power BI. • Excellent verbal and written communication skills. Technical Skills: • Familiarity with SQL Server. • Advanced SQL scripting. • Familiarity with ETL/ELT tools and experience navigating data pipelines. • Experience using scripting languages (e.g. Python, PowerShell) to extract insights from file-based storage. • Familiarity with Git or other source control …
Data Engineer - Leading Energy Company - London (Tech Stack: Data Engineer, Databricks, Python, PySpark, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) Company Overview: Join a dynamic team, a leading player in the energy sector, committed to innovation and sustainable solutions. Our client is seeking a talented Data Engineer to help build and optimise their data infrastructure, enabling them to … years), preferably in the energy sector. Right to work in the UK. Strong proficiency in SQL and database technologies (e.g., MS SQL, Snowflake). Hands-on experience with ETL/ELT tools such as Azure Data Factory, DBT, AWS Glue, etc. Proficiency in Power BI and Advanced Analytics for insightful data visualisation. Strong programming skills in Python for data processing …
closely collaborate with data scientists, analysts, and software engineers to ensure efficient data processing, storage, and retrieval for business insights and decision-making. Their expertise in data modelling, ETL (Extract, Transform, Load) processes, and big data technologies makes it possible to develop robust and reliable data solutions. RESPONSIBILITIES Data Pipeline Development: Design, implement, and maintain scalable data pipelines for … sources using tools such as Databricks, Python, and PySpark. Data Modeling: Design and optimize data models and schemas for efficient storage, retrieval, and analysis of structured and unstructured data. ETL Processes: Develop and automate ETL workflows to extract data from diverse sources, transform it into usable formats, and load it into data warehouses, data lakes, or lakehouses. Big Data Technologies … teams, including data scientists, analysts, and software engineers, to understand requirements, define data architectures, and deliver data-driven solutions. Documentation: Create and maintain technical documentation, including data architecture diagrams, ETL workflows, and system documentation, to facilitate understanding and maintainability of data solutions. Best Practices: Continuously learn and apply best practices in data engineering and cloud computing. QUALIFICATIONS Proven experience as …
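The extract-transform-load cycle described above can be reduced to a minimal sketch. In production this would be PySpark on Databricks; here the standard library and an in-memory SQLite database stand in, and the CSV layout is invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: a raw CSV feed (inlined here; normally read from a source system).
RAW = """order_id,amount,currency
1,100.0,GBP
2,250.5,GBP
3,,GBP
"""

def extract(text):
    """Read the raw feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and cast fields to usable types."""
    return [
        (int(r["order_id"]), float(r["amount"]), r["currency"])
        for r in rows
        if r["amount"]
    ]

def load(rows, conn):
    """Load cleaned rows into the warehouse table; return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
print(loaded)  # 2: one row is dropped by the quality filter
```

The same three-stage shape scales up directly: extract becomes a Spark reader, transform becomes DataFrame operations, and load becomes a Delta table write.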
associated roadmaps towards unlocking these in alignment with the analytics product development strategy and priorities. The role bridges strategy, operations, and governance, focusing on enabling the organization to extract value from data while maintaining compliance and quality standards. What will be your Key Responsibilities? Strategic Data Management Execute the vision, strategy, and roadmap for the assigned data domain in … Qualifications Strong expertise in data management, data integration, and data engineering best practices. Proficiency in SQL, Python, or other data-focused programming languages. Experience with big data technologies and ETL processes. Project management skills with the ability to operate in an Agile framework. Good communication skills, with a track record of aligning technical solutions to business needs. Knowledge of data governance …
Engineer **Experience Required:** 7-8 Years **Role Overview:** We are seeking a Senior Data Architect/Data Engineer to design and implement scalable data solutions, including real-time and ETL/ELT data pipelines. Responsibilities include optimizing our data warehouse, creating Power BI reports, and developing AI-driven solutions using OpenAI APIs. **Key Responsibilities:** - Design and implement robust real-time … teams to align data strategy with business goals. - Ensure data quality, consistency, and compliance. **Required Qualifications:** - 7-8 years in Data Engineering, Data Architecture, or related roles. - Experience with ETL/ELT pipelines and complex data ecosystems. - Proficiency in Python and strong SQL skills. - Familiarity with Azure services (Data Factory, Logic Apps, etc.). - Expertise in Power BI for data …
Technical Business Analysis experience. A proactive awareness of industry standards, regulations, and developments. Ideally, you'll also have: Experience of Relational Databases and Data Warehousing concepts. Experience of Enterprise ETL tools such as Informatica, Talend, Datastage or Alteryx. Project experience using any of the following technologies: Hadoop, Spark, Scala, Oracle, Pega, Salesforce. Cross and multi-platform experience. Team building …
and Databricks. Proficiency in working with cloud environments and various platforms, including Azure and SQL Server; NoSQL database experience is good to have. Hands-on experience with data pipeline development, ETL processes, and big data technologies (e.g., Hadoop, Spark, Kafka). Experience with DataOps practices and tools, including CI/CD for data pipelines. Experience in medallion data architecture and other …
Azure Machine Learning Studio. Data Storage & Databases: SQL & NoSQL Databases: Experience with databases like PostgreSQL, MySQL, MongoDB, and Cassandra. Big Data Ecosystems: Hadoop, Spark, Hive, and HBase. Data Integration & ETL: Data Pipelining Tools: Apache NiFi, Apache Kafka, and Apache Flink. ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow. AI & Machine Learning: Frameworks: TensorFlow, PyTorch, Scikit-learn, Keras …
such as Tray.io, Azure Data Factory. • Implement data lakes and data warehouses using Azure Synapse Analytics, Azure Data Fabric or similar tools. Data Integration & Transformation • Develop and maintain ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes to support the Data Platform development, enabling best-practice reporting and analytics. • Integrate data from multiple systems, including legal practice/matter … and reporting teams by ensuring access to clean and structured data. • Document processes and provide training on data tools and workflows. Skills and experience • Experience in building ELT/ETL pipelines and managing data workflows. • Proficiency in programming languages such as PySpark, Python, SQL, or Scala. • Solid understanding of data modelling and relational database concepts. • Knowledge of GDPR and UK …
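The ELT variant distinguished above lands raw data first and transforms it inside the warehouse with SQL, rather than cleaning it in flight. A minimal sketch with SQLite standing in for the platform; the staging schema and feed values are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Load: land the raw feed untouched in a staging table first
# (the "L" happens before the "T" in ELT).
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("10", "9.99"), ("11", "20.01"), ("12", "")],
)

# Transform: clean and type the data in-place with warehouse-side SQL.
conn.execute("""
    CREATE TABLE orders AS
    SELECT CAST(order_id AS INTEGER) AS order_id,
           CAST(amount AS REAL) AS amount
    FROM raw_orders
    WHERE amount <> ''
""")

rows = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(rows)  # 2: the malformed staging row never reaches the clean table
```

Keeping the untouched staging copy is what makes ELT pipelines auditable: the clean table can always be rebuilt, and rejected rows remain inspectable.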
with senior stakeholders. Architect & Build Scalable Data Solutions Collaborate closely with senior product stakeholders to understand data needs and architect end-to-end ingestion pipelines. Design and build robust ETL/ELT processes and data architectures using modern tools and techniques. Lead database design, data modelling, and integration strategies to support analytics at scale. Drive Data Integration & Management Design and … of software engineering best practices - code reviews, testing frameworks, CI/CD, and code maintainability. Experience deploying applications into production environments, including packaging, monitoring, and release management. Ability to extract insights from complex and disparate data sets and communicate clearly with stakeholders. Hands-on experience with cloud platforms such as AWS, Azure, or GCP. Familiarity with traditional ETL tools (e.g. …
London, England, United Kingdom Hybrid / WFH Options
Inspiring Search
a few in-person meetings in shared co-working spaces on an ad hoc basis. Role Description We are looking for an SQL Developer (Snowflake), specializing in data modelling, ETL processes, and cloud-based data solutions. This position requires expertise in Snowflake, Azure, Python, and Power BI, with a strong focus on building semantic models and supporting analytics. Key Responsibilities … Develop and optimise complex SQL queries, views, and stored procedures in Snowflake. Design and maintain efficient ETL/ELT pipelines using modern data integration platforms. Create and manage Python-based stored procedures in Snowflake to support advanced transformations and automation. Build and maintain Power BI datasets, data models, and semantic models to support business intelligence needs. Work closely with … of data warehousing, dimensional modelling, and ELT best practices. Knowledge of version control and Agile development methodologies. Qualifications: Strong experience in Data Engineering, with a focus on data modelling, ETL, and Snowflake. Snowflake certification (e.g., SnowPro Core) is a strong plus. Proficiency in Snowflake for data warehousing, including semantic modelling and Python-based stored procedures. Experience with Azure Data Factory …
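A Python stored procedure in Snowflake is a handler function that receives a Snowpark session and runs SQL through it. The sketch below separates the SQL-building logic (testable locally) from the handler; the table and column names are invented placeholders, and the Snowflake registration step is deliberately omitted so the module stays runnable without a connection:

```python
def build_refresh_sql(target, source):
    """Build a MERGE that upserts source rows into the target table.

    Table and key names here are placeholders, not from a real schema.
    """
    return (
        f"MERGE INTO {target} t USING {source} s ON t.id = s.id "
        "WHEN MATCHED THEN UPDATE SET t.amount = s.amount "
        "WHEN NOT MATCHED THEN INSERT (id, amount) VALUES (s.id, s.amount)"
    )

def refresh_handler(session):
    """Handler body for a Snowflake Python stored procedure.

    In Snowflake this would be registered via session.sproc.register(...)
    or CREATE PROCEDURE ... LANGUAGE PYTHON; that step is omitted here.
    """
    session.sql(build_refresh_sql("analytics.orders", "staging.orders")).collect()
    return "refreshed"

sql = build_refresh_sql("analytics.orders", "staging.orders")
print(sql)
```

Keeping the statement builder as a plain function means the procedure's core logic can be unit-tested in CI without a live Snowflake account.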
intelligence and reporting tools like Tableau, Power BI or similar. Experience with version control systems (e.g. Git). Ability to work in an Agile environment. Experience with Microsoft SQL. Experience with ETL Tools and Data Migration. Experience with Data Analysis, Data Mapping and UML. Experience with programming languages (Python, Ruby, C++, PHP, etc.). The ability to work with large datasets across …
Job Title: QA Tester – ETL Testing (Informatica) & Azure Data Engineering Location: Westminster, London Type: Full-time Department: Quality Assurance/Data Engineering Reporting to: Data Engineering Lead Job Summary: We are looking for a skilled and detail-oriented QA Tester with hands-on experience in ETL testing (Informatica or Microsoft Fabric) and a strong understanding of Azure Data Engineering tools … will possess good hands-on knowledge of writing and executing SQL queries to validate data across systems. Experience in Microsoft Fabric is considered a strong bonus. Key Responsibilities: Perform ETL/data pipeline testing using Informatica PowerCenter or IICS. Validate data ingestion, transformation, and loading processes across Azure services. Execute source-to-target data validation, data profiling, and data … analysts. Contribute to daily Agile ceremonies and maintain clear and detailed QA documentation. Required Skills & Experience: 3–7 years of software testing experience, with at least 2+ years in ETL testing using Informatica. Strong hands-on experience in writing and executing complex SQL queries. Experience testing cloud-based data pipelines built on Azure, specifically: Azure SQL Database, Azure …
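Source-to-target validation of the kind this role performs usually reduces to comparing row counts and numeric checksums between the two ends of the pipeline. A minimal sketch, with one SQLite connection standing in for both the source system and the Azure SQL target; the table layout is invented for illustration:

```python
import sqlite3

def validate_counts_and_sums(conn, source, target, amount_col):
    """Compare row count and a numeric checksum between two tables.

    Returns (ok, details). In real ETL testing the two queries would
    run against the source system and the Azure target respectively.
    """
    src_n, src_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {source}"
    ).fetchone()
    tgt_n, tgt_sum = conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {target}"
    ).fetchone()
    ok = (src_n == tgt_n) and abs(src_sum - tgt_sum) < 1e-9
    return ok, {"source": (src_n, src_sum), "target": (tgt_n, tgt_sum)}

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")
ok, details = validate_counts_and_sums(conn, "src", "tgt", "amount")
print(ok, details)
```

Count-plus-checksum is a quick smoke test; full validation would add row-level minus queries (source EXCEPT target and vice versa) to pinpoint exactly which records drifted.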