with SQL and data manipulation across large datasets. Familiarity with data visualization tools (e.g., Matplotlib, Seaborn, Plotly, Tableau, or Power BI). Exposure to modern collaborative data platforms (e.g., Databricks, Snowflake, Palantir Foundry) is a plus. Excellent problem-solving skills, eagerness to learn, and ability to work in fast-paced, evolving environments. Strong written and verbal communication skills, with the …
have experience working with some of the following technologies: Power BI, Power Apps, Blob Storage, Synapse, Azure Data Factory (ADF), IoT Hub, SQL Server, Azure Data Lake Storage, Azure Databricks, Purview, Power Platform, Python …
skills; comfortable influencing C-suite clients. Advantageous competencies (but not essential): Exposure to AI/ML, NLP or advanced modelling. Exposure to modern data stack tools (e.g. Snowflake, Databricks). Experience managing P&L, setting go-to-market strategy or building consulting practices. Exposure to behavioural data sources (e.g. Google or Adobe Analytics). Benefits At Interpath, our …
skills to accelerate our clients' journeys to becoming more data-driven and less data-busy. Key Responsibilities Client Delivery Design, validate, and optimise Data Vault 2.0 architectures across Snowflake, Databricks, and BigQuery environments. Provide best-practice guidance on dbt modelling, testing frameworks, and macros. Define governance and metadata standards (naming, access, lineage, compliance) suited to regulated industries like higher education …
based data solutions (preferably in Azure). Expert-level SQL and database design (normalisation, indexing, query optimisation). Strong experience with ETL/ELT tools, e.g. Azure Data Factory, Databricks, Synapse Pipelines, SSIS. Experience with Python, PySpark, or Scala for data processing. Familiarity with CI/CD practices. Experience with data lake, data warehouse, and Medallion architectures. Understanding of …
Edinburgh, Midlothian, United Kingdom Hybrid/Remote Options
Aberdeen
as Azure Data Factory, Synapse Pipelines, and SQL/T-SQL, ensuring data quality, performance, and reliability. Contribute to the evolution of our cloud-native data architecture, leveraging Azure Databricks, Azure Data Lake, and Snowflake where appropriate. Apply strong data modelling and transformation skills to support analytics, regulatory reporting, and operational use cases. Promote and implement engineering best practices, including …
Greater Manchester, Lancashire, England, United Kingdom
Sagacity
Experience Educational Background: • Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field • Relevant certifications in data engineering or cloud platforms (e.g., AWS, Azure, Google Cloud, Databricks) are a strong plus Technical Skills: • Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL) • Familiarity with data pipeline and workflow management tools (e.g., Apache Airflow) • Experience with …
London (City of London), South East England, United Kingdom
Gen Re Corporation
managed responsibly throughout its lifecycle, aligned with regulatory standards and organizational policies. Qualifications & Experience • Working knowledge of data architecture and implementation of governed data solutions using Microsoft Purview, Azure, Databricks, Python, DevOps, Unity Catalog, Data Factory, and RDBMSs. Experience with metadata management, lineage tracking, and automation tools for governance workflows. • Experience in designing and maintaining end-to-end data architectures …
Strong knowledge of software development methodologies, tools, and frameworks, particularly Agile. · Proficiency in both SQL and NoSQL database management systems (e.g. SQL Server, Oracle, MongoDB, Cosmos DB, Snowflake, Databricks). · Hands-on experience with data modelling tools, data warehousing, ETL processes, and data integration techniques. · Experience with at least one cloud data platform (e.g. AWS, Azure, Google Cloud) and …
Senior Databricks Engineer - Banbury Hybrid - Salary £65-75K + Benefits Bibby Financial Services have an exciting opportunity available for a reliable Senior Databricks Engineer to join our team. You will join us on a full-time, permanent basis and, in return, you will receive a competitive salary of £65,000 - £75,000 per annum. About the role: As our Senior … Databricks Engineer, you will operate within an Agile delivery environment, working closely with the Data Product Manager and Data Architect to ensure your team maintains a pipeline of delivery against the Backlog; providing vital insight from our wide-ranging dataset to support executive and operational decision making that will underpin sustained growth of BFS business units domestically and internationally. You … coach, support and organise to ensure we sustain a predictable pipeline of delivery, whilst ensuring all appropriate governance and best practice is adhered to. Your responsibilities as our Senior Databricks Engineer will include: Understand the business/product strategy and supporting goals with the purpose of ensuring data interpretation aligns. Provide technical leadership on how to break down initiatives into …
Bristol, Avon, England, United Kingdom Hybrid/Remote Options
Proactive Appointments
data using tools such as SQL, dbt, PySpark and SSIS. Comfortable building out and administering data models, ensuring that they are accessible and used appropriately, leveraging platforms such as Databricks, Power BI, and Microsoft Fabric. Familiarity with data processing paradigms such as ETL/ELT, Kimball, and the Medallion Data Lakehouse. Passionate about data and information with a strong understanding of …
Warrington, Cheshire, England, United Kingdom Hybrid/Remote Options
Brookson
example - Computer Science, Mathematics, Engineering or other STEM. A strong team player with empathy, humility and dedication to joint success and shared development. Desirable Experience and Qualifications: Experience with Databricks or Delta Lake architecture. Experience building architecture and data warehousing within the Microsoft stack. Experience with development source control (e.g. Bitbucket, GitHub). Experience in low-code analytical tools (e.g. …
CI/CD workflows. · If you have previous exposure to geospatial data, that would be advantageous but is not a requirement for the position. · Familiarity with Apache Spark or Databricks. · Excellent communication and collaboration skills. Benefits About Prevail Partners Prevail Partners delivers strategic advice, intelligence, specialist capabilities, and managed services to clients ranging from governments and multinational corporations to non …
capacity to translate client requirements into engineered, scalable systems. Preferred Skills and Experience: Certifications across AWS, Azure or Google Cloud. Proven experience with data and AI platforms such as Databricks, Snowflake, Fabric or Elastic. Prior involvement in digital transformation or AI-native product development within enterprise settings. Experience with graph theory, knowledge representation or distributed systems. Contributions to technical or …
Reigate, Surrey, England, United Kingdom Hybrid/Remote Options
esure Group
Skilled in team collaboration and exhibits good interpersonal skills. Able to prioritise, multi-task, and deliver at pace with an iterative mindset. Experience with modern data platforms such as Databricks or Snowflake is advantageous; LLM API experience is a bonus. Additional Information What’s in it for you? Competitive salary that reflects your skills, experience and potential. Discretionary bonus scheme …
disciplines: Cloud Engineering; Data Engineering (not building pipelines but designing and building the framework); DevOps; MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar …
Develop clean, reusable Python code for data engineering tasks. Research and integrate the latest cloud-based technologies. Requirements: Proven experience in data engineering with Azure tools (Synapse, Data Factory, Databricks). Experience with Microsoft Fabric. Strong Python programming skills. Knowledge of data modelling and Kimball methodology. Excellent communication and problem-solving skills. Fabric certifications are highly desirable. Benefits: Competitive salary …
passionate about clean design, collaboration, and the pursuit of data excellence. Role and Responsibility In this role, you'll develop and maintain robust data models and transformation pipelines using Databricks, Azure, and Power BI to turn complex datasets into reliable, insight-ready assets. You'll apply strong skills in SQL, Python, and PySpark to build efficient ELT workflows and ensure …