Bath, England, United Kingdom Hybrid / WFH Options
Noir
Job Reference: NC/RG/DE_1745802035 Job Views: 50 Posted: 28.04.2025 Expiry Date: 12.06.2025 Job Description: Data Engineer - FinTech Company - Newcastle (Tech Stack: Data Engineer, Databricks, Python, Azure, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) I’m working with a leading Software House in the FinTech industry, based in Newcastle, who are looking to … working with data and delivering value to stakeholders. * Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. * Experience with Databricks and Microsoft Azure is highly desirable. * Financial Services experience is a plus but not essential. * Excellent communication skills, with the ability to explain complex data concepts in a clear and …
with the ability to engage technical and non-technical stakeholders alike. The Head of BI will work within a modern, cloud-based BI ecosystem, including:
• Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4
• Data Lake & Storage: Databricks Delta Lake, Amazon S3
• Data Transformation: dbt Cloud
• Data Warehouse: Snowflake
• Analytics & Reporting: Power BI, Excel, Snowflake SQL, REST API
• Advanced Analytics: Databricks (AI & Machine Learning)
• Governance & Infrastructure: Centralised Data Catalogue & Access Control (Okta), Job Scheduling & Monitoring (AWS, Splunk), Agile Data Engineering with centralised code repositories
• BI Data Portal: Power BI
However, this coexists with a legacy tech stack; the Head of BI will support the transition to the new architecture already in place. Qualifications/Education: Degree …
ability to engage technical and non-technical stakeholders alike. The Head of Data Engineering & Insight will work within a modern, cloud-based BI ecosystem, including:
• Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4
• Data Lake & Storage: Databricks Delta Lake, Amazon S3
• Data Transformation: dbt Cloud
• Data Warehouse: Snowflake
• Analytics & Reporting: Power BI, Excel, Snowflake SQL, REST API
• Advanced Analytics: Databricks (AI & Machine Learning)
• Governance & Infrastructure: Centralised Data Catalogue & Access Control (Okta), Job Scheduling & Monitoring (AWS, Splunk), Agile Data Engineering with centralised code repositories
• BI Data Portal: Power BI
However, this coexists with a legacy tech stack; the Head of Data Engineering & Insight will support the transition to the new architecture already in place. Qualifications/Education …
translating these into technical deliverables. Motivated to expand technical skills. Desirable: Experience with Microsoft BI Tools such as Power BI, SSRS & SQL Server. Experience with Azure Data Factory and Databricks (Python). Experience working with DevOps, IaC & CI/CD pipelines (e.g. Terraform and Databricks Asset Bundles).
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
pipelines and infrastructure, who can implement processes on our modern tech stack with robust, pragmatic solutions. Responsibilities: Develop and maintain ETL/ELT data pipelines using AWS services, Databricks and dbt. Manage and optimize data storage solutions such as Amazon S3, Redshift, RDS, and DynamoDB. Implement and manage infrastructure-as-code (IaC) using tools like Terraform or AWS CloudFormation. … tools like Terraform or CloudFormation. Experience with workflow orchestration tools (e.g., Airflow, Dagster). Good understanding of cloud providers – AWS, Microsoft Azure, Google Cloud. Familiarity with dbt, Delta Lake, Databricks. Experience working in Agile environments with tools like Jira and Git. About Us: We are Citation. We are far from your average service provider. Our colleagues bring their brilliant selves …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
IAT Level II Certification may be required (GSEC, GICSP, CND, CySA+, Security+ CE, SSCP or CCNA-Security). Advanced proficiency in Python, SQL, PySpark, MLOps, computer vision, NLP, Databricks, AWS, Data Connections Cluster Training, and tools such as GitLab, with proven success in architecting complex AI/ML pipelines. Demonstrated ability to write clean, efficient code in Python and … technical teams, providing guidance on project direction, technical challenges, and professional development. One or more of the following certifications are desired: AWS Certified Solutions Architect or Machine Learning Specialty; Databricks Certified Machine Learning Professional; Agile/Scrum Master Certification; specialized certifications in AI/ML tools or methodologies. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks … preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control …
London, England, United Kingdom Hybrid / WFH Options
Wiraa
solutions that drive organizational efficiency and sustainability goals. As a key member of the technology department, you will work with a variety of modern cloud-based data platforms, including Databricks on AWS, SQL, Power BI, and other data-centric tools. Your technical expertise will contribute to the delivery of projects, day-to-day business-as-usual operations, and continuous improvements … your skills further through training and certifications, with a supportive environment focused on innovation and sustainability. Qualifications: Proven experience as a Data Engineer or Data Analyst. Strong proficiency in Databricks/Apache Spark. Expertise in SQL and Python programming. Experience with version control tools such as BitBucket or GitHub. Knowledge of data modeling and ETL processes. Understanding of cloud platforms …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
reliable data movement and transformation.
• Data Modelling using Kimball, 3NF or Dimensional methodologies
• Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server.
• Design and implement metadata-driven pipelines to automate data processing tasks.
• Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions.
… are some obvious required Skills and Experience we are going to be seeking out. For this role we'd be expecting to see:
• Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server.
• Proficiency in SQL and Python languages.
• Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF).
• Familiarity with building …
Newbury, England, United Kingdom Hybrid / WFH Options
Intuita
reliable data movement and transformation.
• Data Modelling using Kimball, 3NF or Dimensional methodologies
• Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server.
• Design and implement metadata-driven pipelines to automate data processing tasks.
• Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions.
… are some obvious required Skills and Experience we are going to be seeking out. For this role we'd be expecting to see:
• Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server.
• Proficiency in SQL and Python languages.
• Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF).
• Familiarity with building …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
to detail with a willingness to document processes and changes. Effective communication skills and a collaborative mindset. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance … platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control frameworks. Other: Must hold an Active DOD Secret, Top Secret or TS/…
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
abilities, and the capacity to work productively in cross-functional teams while maintaining a continuous improvement mindset. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance … platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role or attribute-based access control frameworks. Other: Must hold an Active DOD Secret, Top Secret or TS/…
analytical skills and attention to detail. Preferred/Bonus Skills: Experience with Microsoft Fabric (OneLake, DirectLake, Data Warehouse, Data Activator) is a significant plus. Knowledge of Azure Synapse Analytics, Databricks, or Power BI. Experience with automated data validation or scripting (e.g., Python, PowerShell). Familiarity with CI/CD processes in data environments. Relevant certifications (e.g., Microsoft Certified: Azure …
for Azure services. Skills, Experience & Competencies Required Skills: Extensive experience in data engineering with Microsoft Azure. Proficiency in Azure services such as Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Synapse Analytics, and Azure SQL Database. Strong experience with ETL/ELT pipelines, data modelling, and data integration. Proficiency in SQL and programming languages like Python, Scala, or PowerShell.
practices. Collaborating with leadership and stakeholders to align data priorities. Qualifications and Experience: Expertise in Commercial/Procurement Analytics and SAP (S/4 Hana). Experience with Spark, Databricks, or similar tools. Strong proficiency in data modeling, SQL, NoSQL, and data warehousing. Hands-on experience with data pipelines, ETL, and big data technologies. Proficiency in cloud platforms like AWS …
using both on-premises and cloud-based technologies (Azure, AWS, or GCP). You'll design scalable data architectures, including data lakes, lakehouses, and warehouses, leveraging tools such as Databricks, Snowflake, and Azure Synapse. The ideal candidate will have a deep technical background in data engineering and a passion for leading the development of best-in-class data solutions. You …
London, England, United Kingdom Hybrid / WFH Options
McCabe & Barton
telco. The ideal candidate will have expertise in some of the following: Python, SQL, Scala, and Java for data engineering. Strong experience with big data tools (Apache Spark, Hadoop, Databricks, Dask) and cloud platforms (AWS, Azure, GCP). Proficient in data modelling (relational, NoSQL, dimensional) and DevOps automation (Docker, Kubernetes, Terraform, CI/CD). Skilled in designing scalable, fault …
London, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Fashion Company - London (Tech Stack: Data Engineer, Databricks, Python, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) We're recruiting on behalf of a leading fashion brand based in London that's recognised for combining creativity with cutting-edge technology. They're on the lookout for a talented Data Engineer to join their growing data team.
/ELT, Data Lakes, Warehousing, MDM, and BI. Engineering delivery practices: Knowledge of Agile, DevOps, Git, APIs, Containers, Microservices, and Data Pipeline orchestration. Broader platform experience (desirable): Familiarity with Databricks, Snowflake, Azure Data Factory, Azure Synapse, Microsoft SQL/SSIS. Certifications that are a plus: Microsoft Certified: Fabric Analytics Engineer Associate (DP-600), Microsoft Certified: Fabric Data Engineer Associate (DP …
Watford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Certification: DP-600 – Fabric Analytics Engineer Associate. Responsibilities: Daily tasks include, but are not limited to: Designing, building, and optimizing data pipelines and ETL workflows using Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation. Writing clean, efficient Python code for data engineering. Collaborating with data scientists and stakeholders on machine learning and …
SQL skills, including data transformation and profiling. Experience building scalable production ETL/ELT pipelines. 1-2 years of hands-on experience with Azure services such as Data Factory, Databricks, Synapse (DWH), Azure Functions, and other data analytics tools, including streaming. Experience with Airflow and Kubernetes. Programming skills in Python (PySpark) and scripting languages like Bash. Knowledge of Git, CI …
analytics user experience, unlocking growth and operational excellence. What are we looking for? Hands-on experience designing scalable, greenfield data platforms in cloud environments using the Azure D&A stack, Databricks, and Azure OpenAI. Proficiency in coding (Python, PL/SQL, Shell Script), relational and non-relational databases, ETL tools (e.g., Informatica), and scalable data platforms. Knowledge of Azure Data …