Key Responsibilities
1. Technical Leadership
- Architect and deploy scalable ML models (e.g., dynamic pricing, demand forecasting, desirability scoring) using Python, PyTorch/TensorFlow, and cloud ML tools (AWS SageMaker, Databricks).
- Define best practices for model governance, monitoring, and retraining in production.
- Lead R&D into emerging techniques (e.g., graph neural networks for inventory routing, GenAI for buyer personalisation). …
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
experience in IaC and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes). Ability to create data pipelines on a cloud environment and integrate …
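The JSON-to-tabular transformation skill mentioned above can be sketched in plain Python; pandas stands in for Spark here purely for brevity, and the input data is invented for illustration:

```python
import io

import pandas as pd

# Hypothetical JSON-lines input, as raw records might land in a data lake.
raw = io.StringIO('{"id": 1, "amount": 10.5}\n{"id": 2, "amount": 3.2}\n')

# Parse JSON lines into a tabular structure.
df = pd.read_json(raw, lines=True)

# A simple transformation step: filter rows, then emit CSV for downstream use.
out = df[df["amount"] > 5].to_csv(index=False)
print(out)
```

In Spark the same shape of pipeline would use `spark.read.json(...)` and `DataFrame.filter(...).write.csv(...)`; the pandas version just keeps the sketch self-contained.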
Location: Manchester, United Kingdom · Job Category: Other · EU work permit required: Yes · Job Reference: a1edf758a33a · Job Views: 7 · Posted: 18.06.2025 · Expiry Date: 02.08.2025
This start-up focuses on healthcare AI and is looking for a Data Engineer to join their team. You will be joining a team of 45 people, including Data Scientists, ML …
Manchester, England, United Kingdom Hybrid / WFH Options
Matillion Limited
Matillion is The Data Productivity Cloud. We are on a mission to power the data productivity of our customers and the world, by helping teams get data business ready, faster. Our technology allows customers to load, transform, sync and orchestrate …
requirements with the need for data accessibility, security, and performance across the organization. Architect and oversee the integration of various data tools, including Azure Data Factory, Informatica Cloud Services, Databricks, and Dremio, to create a seamless, high-performance data environment that meets the organization's comprehensive data needs. Work closely with stakeholders, including data engineers, analysts, and business leaders, to … and implementation of the Common Data Model and associated data architectures across the organization, ensuring they meet global and regional data needs. Utilize advanced tools like Azure Data Factory, Databricks, and Power BI to integrate and transform data across multiple regions, ensuring consistency, accuracy, and scalability. Ensure that all data architectures and processes comply with relevant data residency regulations, optimizing … architecture, with a strong focus on designing and implementing Common Data Models. Expertise in cloud-based data platforms and tools, particularly within the Azure ecosystem (e.g., Azure Data Factory, Databricks, Power BI). Strong knowledge of data integration, transformation, and management practices, with the ability to design scalable and efficient data architectures. Deep understanding of data governance, compliance, and residency …
reports that are both insightful and easy to use. Helping to shape and optimise semantic models to improve performance and user experience. Working with cloud tools like Azure and Databricks to support data visualisation processes. Applying best practices in Power BI development, including DAX and visual design. Collaborating with colleagues to understand their needs and bring data to life in … tools) and a passion for creating great user experiences. Familiarity with SQL, and ideally some exposure to Python or PySpark. Experience with Azure Data Services and platforms like Databricks. A collaborative mindset and experience working in Agile teams. Strong communication skills and a proactive, problem-solving approach. A willingness to learn and grow with new tools, technologies …
frameworks (e.g., DoWhy, causalml) Programming & Data Tools: Python: Strong foundation in Pandas, NumPy, matplotlib/seaborn, scikit-learn, TensorFlow, PyTorch, etc. SQL: Advanced querying for large-scale datasets. Jupyter, Databricks, or notebooks-based workflows for experimentation. Data Access & Engineering Collaboration: Comfort working with cloud data warehouses (e.g., Snowflake, Databricks, Redshift, BigQuery). Familiarity with data pipelines and orchestration tools like Airflow …
modelling and data structure design across Royal London's Enterprise Data Platform (EDP) which is being built on Azure Databricks. The EDP is a Data Lakehouse built using the Databricks Medallion architecture and consists of a relational core 'silver' layer with a variety of dimensional, de-normalised and relational structured data products being exposed for consumption through the 'Gold' layer. … data models using data modelling tools (Idera ER/Studio or similar). Data engineering/data pipeline experience, with hands-on experience on integration tools such as Azure Databricks Notebooks, Azure Data Factory or PySpark. Python extremely beneficial. About Royal London: We're the UK's largest mutual life, pensions, and investment company, offering protection, long-term savings and …
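As an illustration of the silver-to-gold promotion this listing describes, here is a minimal pandas sketch. The table and column names are invented for illustration; a real EDP pipeline would run as PySpark in Databricks notebooks, but the medallion idea is the same: a cleaned relational 'silver' table feeds a de-normalised, consumption-ready 'gold' data product:

```python
import pandas as pd

# Hypothetical 'silver' layer: cleaned, conformed policy records.
silver_policies = pd.DataFrame({
    "policy_id": [1, 2, 3, 4],
    "product": ["pension", "pension", "protection", "protection"],
    "annual_premium": [1200.0, 900.0, 300.0, 450.0],
})

# 'Gold' layer: a simple dimensional aggregate exposed for consumption.
gold_premium_by_product = (
    silver_policies
    .groupby("product", as_index=False)
    .agg(policies=("policy_id", "count"),
         total_premium=("annual_premium", "sum"))
)

print(gold_premium_by_product)
```

In PySpark the equivalent would be `df.groupBy("product").agg(F.count(...), F.sum(...))` written out as a Delta table in the gold schema.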
informed on industry trends, competitor positioning, and emerging technologies, supporting the sales approach accordingly. Partner & Ecosystem Collaboration: Work alongside partners, including cloud providers (Azure, AWS) and data platform vendors (Databricks, Snowflake, Oracle), to enhance solution offerings. Support engagement with Version 1’s delivery and engineering teams to ensure sales commitments align with execution capabilities. Identify co-sell opportunities and collaborate … of Data & Analytics). Data & Technology Knowledge: Understanding of data platforms, data engineering, AI/ML, analytics, and cloud ecosystems. Familiarity with technologies such as Azure Data Services, AWS, Databricks, Snowflake, Oracle, and Power BI. Ability to articulate the business impact of data-driven decision-making and AI solutions. Stakeholder Engagement & Communication: Strong communication skills, able to simplify complex technical …
Manchester, England, United Kingdom Hybrid / WFH Options
Starr Underwriting
and presentation skills. Stakeholder management skills to understand and meet data needs. Ability to work independently and in teams. Experience with agile methodologies and Azure DevOps preferred. Experience with Databricks, SSRS, or QlikView is a plus. Our Benefits: We offer comprehensive benefits focusing on wellbeing, including hybrid working, competitive salary, pension, bonuses, health and dental insurances, and more. About Us …
and releases. Oversee projects involving data ingestion or production, providing technical advice and education on Data Engineering best practices. Skills needed: 5+ years in Data Engineering roles. Experience with Databricks, Data Factory, Azure SQL, Azure SQL DW. Ability to manage multiple priorities in a fast-paced environment. Knowledge of emerging technologies and their application to business problems. Experience with Kimball …
practical experience with generative AI applications, including prompt engineering and fine-tuning. Strong experience with cloud platforms (e.g., GCP, AWS, Azure) and their data/ML services (e.g., BigQuery, Databricks, Vertex AI, SageMaker). Experience with DevSecOps performance metrics, e.g., a strong background with DORA Metrics and other relevant performance indicators. We value diverse perspectives and are committed to creating inclusive …
Manchester, England, United Kingdom Hybrid / WFH Options
Canopius
the ability to work independently as well as in team environments. Experience of working within an agile squad and knowledge of Azure DevOps is preferable. Experience of developing with Databricks a bonus, but not essential. Experience of developing solutions in SSRS or QlikView a bonus, but not essential. About Us: Our benefits: We offer all employees a comprehensive benefits package …
data management best practices including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, Python. Additional Information: At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being …
our Data Strategy, embarking on a transformative journey from traditional on-premises infrastructure to a cloud-based architecture. Our cloud-native Data Platform utilises Microsoft Azure technology including Azure Databricks, Azure Data Factory and dbt. We seek an individual with a proven track record in Azure cloud Data Engineering to join our team, contributing their expertise to shape and execute …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
First Central Services
practices and tools (Azure DevOps preferred). Experience with microservices architecture, RESTful API development, and system integration. Prior experience in financial services or insurance sectors advantageous. Familiarity with AzureML, Databricks, related Azure technologies, Docker, Kubernetes, and containerization is advantageous. Advanced proficiency in Python, and familiarity with AI frameworks such as LangChain. Skilled in designing and operationalising AI Ops frameworks within …
external vendors to enhance data management capabilities. Provide expert-level troubleshooting, root cause analysis, and performance optimisation for data platforms, such as Azure SQL databases, Fabric Warehouse and OneLake, Databricks, and Azure Data Factory. Document technical solutions, best practices, and knowledge base articles to facilitate effective knowledge transfer and continuous improvement. Diagnose and resolve data-related incidents and problems escalated …
Manchester, England, United Kingdom Hybrid / WFH Options
HSO
HSO are a leading member of the Microsoft Dynamics Inner Circle, founded in 1987, and specialize in sectors such as Retail, Manufacturing, Professional Services, Financial Services and Local Government. We’ve won several prestigious awards over the last few years …
Solution Architect - Databricks Admin with GCP/AWS
Responsibilities will include designing, implementing, and maintaining the Databricks platform, and providing operational support. Operational support responsibilities include platform set-up and configuration, workspace administration, resource monitoring, providing technical support to data engineering, Data Science/ML, and Application/integration … the root causes of issues, and resolving issues. The position will also involve the management of security and changes. The position will work closely with the Team Lead, other Databricks Administrators, System Administrators, and Data Engineers/Scientists/Architects/Modelers/Analysts. Required Skills: 3+ years of production support of the Databricks platform. Preferred: 2+ years of experience …
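The resource-monitoring part of the admin role above can be sketched against the Databricks Clusters API (`GET /api/2.0/clusters/list`). The workspace URL and token below are placeholders, and the canned response exists only so the filtering logic can run without a live workspace:

```python
import json
from urllib.request import Request, urlopen

# Placeholder values -- a real workspace URL and personal access token go here.
WORKSPACE_URL = "https://example.cloud.databricks.com"
TOKEN = "dapi-XXXX"

def list_clusters(workspace_url: str, token: str) -> dict:
    """Fetch cluster metadata via the Databricks Clusters API."""
    req = Request(
        f"{workspace_url}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urlopen(req) as resp:
        return json.load(resp)

def running_clusters(clusters_response: dict) -> list:
    """Extract names of clusters currently in the RUNNING state."""
    return [
        c["cluster_name"]
        for c in clusters_response.get("clusters", [])
        if c.get("state") == "RUNNING"
    ]

# Canned response so the example runs without network access.
sample = {"clusters": [
    {"cluster_name": "etl-prod", "state": "RUNNING"},
    {"cluster_name": "ml-dev", "state": "TERMINATED"},
]}
print(running_clusters(sample))  # ['etl-prod']
```

In practice an admin would schedule a job like this and alert on clusters left running outside expected hours.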
our current data solution as well as advancing it to the next level. We have created an initial gem of a Data Lake and Lakehouse (Azure Data Lake, ADF, Databricks, Airflow, DBT) to enable Business Intelligence and Data Analytics (Superset, RStudio Connect). Our Data Lake is fully metadata driven, cost efficient, documented, and reproducible. We need our one-source … source data lives in different systems (including SQL servers, Google Analytics and Salesforce). If you have experience setting up data pipelines and in-depth knowledge of Azure/Databricks technologies, you are comfortable taking a lead, taking responsibility and you want to put your stamp on our evolving data solution, then this will be the job for you. We … Data space, and we need you to drive innovation in your solutions as we grow. Main responsibilities: Take ownership of the centralisation of different data sources into our Lakehouse (Databricks on Azure Data Lake) and its architecture. Be responsible for the reliability and quality of data in the Data Lake (including anomaly detection, data quality checks, reconciliations, access, permission, and …
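The data-quality responsibility in the listing above can be illustrated with a minimal check. This is a hedged sketch, not the team's actual framework: the `orders` table, column names, and the 5% null threshold are all invented for illustration:

```python
import pandas as pd

def null_rate_check(df: pd.DataFrame, column: str, max_null_rate: float = 0.05) -> bool:
    """Return True if the column's share of null values is within the threshold."""
    rate = df[column].isna().mean()
    return bool(rate <= max_null_rate)

# Hypothetical lake table with one missing customer reference.
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "customer_id": [10, None, 12, 13],
})

print(null_rate_check(orders, "order_id"))     # True: no nulls
print(null_rate_check(orders, "customer_id"))  # False: 25% null exceeds 5%
```

A production version of such a check would typically run per pipeline load (e.g. as a DBT test or a Databricks job step) and fail the load or raise an alert instead of printing.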
Azure Data Platform Engineer, you will be responsible for: Designing, building, and maintaining scalable data solutions on Microsoft Azure. Designing and implementing scalable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and other Azure services. Leading technical workstreams and supporting project delivery. Acting as the subject-matter expert for cloud-based data engineering. Ensuring data governance, security, and compliance … maintain best practices. If you possess a combination of some of the following skills, then LET'S TALK! Proven experience with Azure data services (SQL, Data Factory, Data Lake, Synapse, Databricks, Azure SQL and Cosmos DB). Strong proficiency in designing and operating scalable data solutions and pipelines. Familiarity with cloud security, performance optimisation and monitoring tools (Azure Monitor, Log Analytics) …