will be a hybrid set-up, requiring three days a week in the London HQ. What are we looking for? Advanced proficiency in SQL (T-SQL, PL/SQL, Databricks SQL); experience using scripting languages such as Python and PowerShell; data modelling, cleansing and enrichment; experience with Microsoft Azure; strong knowledge of ETL/ELT tools and experience navigating data …
within a modern Azure-based environment. We’re investing in our data platform, and this role offers the opportunity to work with tools like Azure Synapse, Data Lake, and Databricks, while contributing to the evolution of our data engineering practices. Here at Severn Trent, we embrace innovation, and you will be expected to adopt best practices in data engineering … Lake and Data Factory. You’ll also bring proven capability in PySpark, with experience building scalable data transformation workflows and working with large datasets in distributed environments. Experience with Databricks is a plus, as is familiarity with DevOps and MLOps practices that support collaboration with analytics and data science teams. You should be comfortable working in a fast-paced …
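For illustration, a minimal PySpark sketch of the kind of scalable data transformation workflow this role describes; the source path, column names, and cleansing rules are hypothetical rather than any specific project's pipeline:

```python
# Minimal PySpark sketch: read raw data, cleanse it, and write a curated Delta table.
# Paths and column names are hypothetical illustrations only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("transform-sketch").getOrCreate()

raw = spark.read.format("delta").load("/mnt/datalake/raw/meter_readings")

cleansed = (
    raw.dropDuplicates(["meter_id", "reading_ts"])           # remove duplicate readings
       .filter(F.col("reading_value").isNotNull())           # drop incomplete rows
       .withColumn("reading_date", F.to_date("reading_ts"))  # derive a partition column
)

(cleansed.write
         .format("delta")
         .mode("overwrite")
         .partitionBy("reading_date")
         .save("/mnt/datalake/curated/meter_readings"))
```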
London, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
be a bonus) SQL; PySpark; Delta Lake; Bash (both CLI usage and scripting); Git; Markdown; Scala (bonus, not compulsory); Azure SQL Server as a Hive Metastore (bonus). Technologies: Azure Databricks, Apache Spark, Delta Tables, data processing with Python, Power BI (integration/data ingestion), JIRA.
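As a small illustration of the Databricks, Delta Tables and Python processing listed above, a hedged sketch of a Delta upsert; the paths, table layout, and join key are hypothetical:

```python
# Minimal Delta Lake sketch: upsert a batch of updates into an existing Delta table.
# Paths, table layout, and the join key are hypothetical illustrations only.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("delta-upsert-sketch").getOrCreate()

updates = spark.read.parquet("/mnt/landing/customer_updates/")   # hypothetical incoming batch
target = DeltaTable.forPath(spark, "/mnt/delta/customers")        # existing Delta table

(target.alias("t")
       .merge(updates.alias("u"), "t.customer_id = u.customer_id")
       .whenMatchedUpdateAll()       # refresh rows we already have
       .whenNotMatchedInsertAll()    # add rows we have not seen before
       .execute())
```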
required to join major project teams to provide data analysis expertise. Requirements (Knowledge & Experience): Strong background in data analysis, modelling, and visualisation. Expert in DAX, Python, and data notebooks (Databricks, Jupyter, or Azure Synapse). Proficient in Power BI and Excel. Familiar with Microsoft Data Lake and data modelling best practices. Proven experience managing data projects and stakeholder expectations. Strong … ability to tell stories with data. Qualifications: Certification in Data Analytics Technology (preferred); ITIL v3 Foundation (desirable). Technical Skills: Power BI & DAX; Jupyter or Azure Synapse Notebooks; Databricks; Microsoft Data Lake; Python (or similar). What We Offer: Competitive salary; flexible working; travel covered to any of our sites (subject to HMRC advisory rates); extensive corporate benefits including Private Medical and Pension …
metadata-driven ingestion and transformation pipelines. Experienced in Master Data Management (MDM), data cataloguing, and managing data lineage. Familiarity with accelerators and extensive knowledge of Azure cloud services including Databricks, Azure Data Factory, Azure SQL, Cosmos DB, and other Azure services. Profound understanding of Microsoft SQL Server and skilled in designing, constructing, and maintaining data warehouses, data lakes, and lakehouses (preferably … utilising Databricks). Experienced with structured and unstructured datasets, capable of designing various architectural solutions including data applications and integration systems. Comprehensive experience in the technology delivery lifecycle from inception to delivery and maintenance. Developed architecture in agile environments and familiar with methodologies like Scrum, PRINCE2, and Lean. Strong problem-solving skills, able to design solutions that address conflicting requirements …
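To illustrate what "metadata-driven ingestion" typically means in practice, a minimal Python sketch in which a small configuration list drives which sources are loaded; the source names, paths, and target tables are hypothetical, and in a real pipeline the metadata would live in a control table rather than in code:

```python
# Minimal sketch of metadata-driven ingestion: configuration, not code, decides what gets loaded.
# Source names, formats, paths, and target tables are hypothetical illustrations only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata-driven-ingestion").getOrCreate()

# In practice this would be read from a control table (e.g. Azure SQL); it is
# inlined here only to keep the sketch self-contained.
ingestion_config = [
    {"source": "customers", "format": "csv",     "path": "/mnt/raw/customers/", "target": "bronze.customers"},
    {"source": "policies",  "format": "parquet", "path": "/mnt/raw/policies/",  "target": "bronze.policies"},
]

for entry in ingestion_config:
    df = (spark.read
               .format(entry["format"])
               .option("header", "true")
               .load(entry["path"]))
    df.write.mode("overwrite").saveAsTable(entry["target"])  # land each source in the bronze layer
```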
with Kharon’s platform goals. Establish and enforce best practices for data modeling, ETL/ELT development, pipeline orchestration, and observability. Drive adoption of modern data tooling (e.g., Airflow, Databricks, Spark, AWS-native services) and architectural patterns. Implement and oversee data governance practices including retention, monitoring and alerting, and deduplication. Promote a culture of data quality and stewardship, ensuring high integrity … Python, with a solid foundation in modern software development practices. Deep experience designing and scaling data pipelines, data warehousing solutions, and ETL/ELT workflows using tools like Airflow, Databricks, and AWS. Proven ability to lead high-performing teams with a focus on mentorship, technical rigor, and a strong engineering culture. Strategic thinker with a track record of delivering measurable …
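For a sense of the pipeline orchestration referred to above, a minimal Airflow sketch (recent Airflow 2.x style); the DAG id, schedule, and task bodies are hypothetical placeholders:

```python
# Minimal Airflow sketch of an extract -> transform -> load pipeline.
# The DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw records from the source system")

def transform():
    print("cleanse and enrich the extracted records")

def load():
    print("write curated records to the warehouse")

with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task  # enforce the run order
```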
and services company, specializing in App Innovation, Data, AI, and Cloud Infrastructure solutions. Recognized as a Fabric Featured Partner by Microsoft, we build and deploy data platforms on Fabric, Databricks, and Synapse Analytics, offering the opportunity to shape cutting-edge data initiatives for a wide range of clients. About The Role: Our Data Architects play a crucial role in shaping … projects. Documentation & Knowledge Transfer: Maintain high-quality technical documentation, ensuring clarity for both engineering teams and business stakeholders. Industry Awareness: Stay informed on Microsoft Fabric and comparable platforms like Databricks and Synapse, understanding their differences and relevance to projects. Experience & Skills: You should have strong soft skills in stakeholder engagement, with the ability to build technical credibility and take a …
IT teams to understand data requirements and deliver robust solutions. Optimize and enhance data architecture for performance and scalability. Mandatory Skills: Azure Data Factory (ADF), Data Lake, Snowflake, Python, Databricks, Fivetran; experience in the ETRM domain; integration with trading systems such as Allegro, RightAngle, and Endur. Preferred Qualifications: Strong problem-solving skills and attention to detail. Excellent communication and collaboration skills. Ability to …
and its native tech stack in designing and building data & AI solutions. - Experience in building cloud data pipelines using Azure-native programming techniques such as PySpark or Scala and Databricks for sourcing, enriching, and maintaining structured and unstructured data sets for analysis and reporting (secondary skill). - Experience working in an agile environment, leveraging DevOps and test-driven development techniques.
etc.) is a must. Significant AWS or Azure hands-on experience (coding/configuration/automation/monitoring/security). ETL tools such as Azure Data Factory (ADF) and Databricks or similar. Data lakes: Azure Data Lake, Delta Lake, or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are a plus. Nice-to-have skills: advanced database and …
London, England, United Kingdom Hybrid / WFH Options
Women in Data®
deliver a data roadmap aligned to business goals, with thoughtful prioritisation balancing today’s needs and tomorrow’s vision. Shape and deliver a single, reliable source of truth in Databricks, enabling smarter and more personalised decisions across the organisation. Cultivate a culture of curiosity and self-service by upskilling teams in data literacy and creating space for the team to … engagement and relationship-building, especially at senior levels. A thoughtful and empowering leader, dedicated to nurturing team wellbeing and growth. Deeply familiar with data transformation programmes and platforms like Databricks. A believer in strong data foundations, with a pragmatic approach to delivery. Holds a relevant degree or equivalent experience in statistics, mathematics, or related fields. It would be great if …
in concept design. The evolved prototype would be made production-ready (you would lead that). Solutions would come from a combination of SAP (the origin of business data), Microsoft Azure Databricks, and Microsoft Power Platform (principally Power Apps & Power Automate). Key to success is understanding the business process and context in order to provide advantageous solutions. Another aspect of this … Computing, Mathematics, Economics or similar. Proficiency in KNIME for data integration, processing, and manipulation. Experience with Power Platform for building custom applications and automating business processes. Familiarity with Azure Databricks for data storage and processing, or knowledge of data warehousing concepts. Knowledge of Python for data transformation and SQL for database querying. Right To Work: please ensure you have valid …
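As a small, hedged example of pairing SQL for querying with Python for transformation, a sketch using pandas and SQLAlchemy; the connection string, table, and columns are hypothetical and not tied to any SAP or Azure system mentioned above:

```python
# Minimal sketch: SQL filters the data at the source, pandas reshapes it for analysis.
# The connection string, table name, and columns are hypothetical illustrations only.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://user:password@server/database?driver=ODBC+Driver+17+for+SQL+Server"
)

orders = pd.read_sql(
    "SELECT order_id, plant, order_value, order_date "
    "FROM orders WHERE order_date >= '2024-01-01'",
    engine,
    parse_dates=["order_date"],
)

# Aggregate order value by plant and month for downstream reporting.
monthly_by_plant = (
    orders.assign(order_month=orders["order_date"].dt.to_period("M").astype(str))
          .groupby(["plant", "order_month"], as_index=False)["order_value"]
          .sum()
)
print(monthly_by_plant.head())
```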
Power BI (or similar tools); some Python knowledge; strong attention to detail and problem-solving skills; comfortable explaining data to non-technical stakeholders. Nice to have: experience with Azure, Databricks or ETL processes; understanding of data modelling concepts. What’s on offer: up to £40,000 salary; hybrid working – 3 days a week in the Manchester office; 25 days holiday …
quality data solutions. Ensure data quality through rigorous testing and validation. Apply data modeling techniques to structure data for optimal performance and usability. Profile: Strong knowledge of SQL, dbt, Databricks, Python, and Oracle. Strong testing mindset (quality-first principle). Star schema modeling (facts/dimensions). You have experience in the banking sector. You have good written and spoken communication …
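To make the "star schema (facts/dimensions)" point concrete, a minimal PySpark sketch that builds a customer dimension and an orders fact table; the staging and warehouse table names, keys, and columns are hypothetical:

```python
# Minimal star-schema sketch: a dimension table with surrogate keys and a fact
# table that references it. All table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

orders = spark.table("staging.orders")         # hypothetical staging table
customers = spark.table("staging.customers")   # hypothetical staging table

# Dimension: one row per customer, with a surrogate key for joins.
dim_customer = (
    customers.select("customer_id", "customer_name", "segment")
             .dropDuplicates(["customer_id"])
             .withColumn("customer_key", F.monotonically_increasing_id())
)

# Fact: one row per order, carrying the surrogate key plus the measures.
fact_orders = (
    orders.join(dim_customer.select("customer_id", "customer_key"), "customer_id", "left")
          .select("order_id", "customer_key", "order_date", "order_amount")
)

dim_customer.write.mode("overwrite").saveAsTable("warehouse.dim_customer")
fact_orders.write.mode("overwrite").saveAsTable("warehouse.fact_orders")
```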
East Hagbourne, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Insurance Company - London (NC/RG/DE). Didcot, England. Salary: GBP 45,000 - GBP 55,000 per annum. Data Engineer - Insurance Company - Didcot (Tech Stack: Data Engineer, Databricks, Python, Power BI, AWS QuickSight, AWS, T-SQL, ETL, Agile Methodologies). We’re working in partnership with a forward-thinking insurance company based in Didcot that’s heavily investing in data and technology to drive smarter decisions and better outcomes for their customers. As part of this growth, they are now seeking a Data Engineer to help shape their data infrastructure and analytics capabilities. The Role: As a Data Engineer, you’ll play a crucial role in designing and maintaining data pipelines, ensuring data is clean, structured, and accessible for analysis. You’ll work with modern data tools and cloud technologies to enable real-time insights and support strategic business decisions. Key Responsibilities: Develop and maintain scalable data pipelines and solutions for data ingestion, transformation, and delivery. Use Python to automate data workflows, support ETL processes, and enhance analytics capabilities. Work with relational databases such as …
Information Systems, or a related field. • Extensive relevant experience. • Demonstrable experience in implementing data solutions, preferably within the insurance industry. • Proficiency in modern data platform tools (e.g., Informatica, Snowflake, Databricks) and analytics platforms (e.g., Tableau, Power BI, Cognos). • Experience with building foundational and modern data mesh architectures with automated data governance capabilities, with consideration for emerging technologies. • Good knowledge …
London, England, United Kingdom Hybrid / WFH Options
Harnham
work end-to-end, making meaningful contributions within a small, agile team. Experience: We're looking for candidates with: Extensive experience in Data Engineering with a focus on Azure, Databricks, and Delta Lake. Proficiency in Kubernetes, Infrastructure as Code, and Terraform. Expertise in Azure DevOps and a commitment to best practices. A preference for simple, transparent solutions and a drive …
Qualifications: Proven experience in a Data Engineer role. Expertise in Microsoft and Azure technologies: SQL Server, ADF, Logic Apps, Data Lake, Power Query, Fabric, Function Apps, Power Automate, Spark, Databricks, SSIS. Strong understanding of data modelling techniques, including normalisation and Kimball methodologies. Proficient in SQL, with experience in programming languages such as Python or C#. Familiarity with low-code platforms …
London, England, United Kingdom Hybrid / WFH Options
SR2 | Socially Responsible Recruitment | Certified B Corporation™
a Data Engineering position. Strong experience with building data pipelines in the cloud (AWS, Azure or GCP). Excellent knowledge of PySpark, Python and SQL fundamentals. Familiar with Airflow, Databricks and/or BigQuery. Ability to work on messy, complex real-world data challenges. Comfortable working in a fast-paced environment. Previous experience working in finance would be beneficial but …
Leeds, England, United Kingdom Hybrid / WFH Options
La Fosse
tools, architecture repositories, and visual communication. Experience in assessing emerging technologies and their impact on enterprise architecture. Desirable Skills: Experience with Microsoft Azure technologies (e.g., Data Factory, Synapse, ADLS, Databricks, Power BI). Understanding of enterprise data architecture and analytics platforms.