machines - both Windows and Linux. Familiarity with server patching and maintenance. * Strong understanding of security best practices within Azure and ideally AWS. * Experience of configuring cloud data services (preferably Databricks) in Azure and ideally AWS. * Excellent communication and collaboration skills, with the ability to work across multiple technical and non-technical teams. What happens now? After submitting your application for …
City of London, London, United Kingdom Hybrid / WFH Options
Fynity
You will be adaptable and able to work with different tools and technologies, depending on the client and project needs, but will ideally possess strength in: MS Fabric stack, Databricks, Azure SQL, Tableau/Power BI. A knowledge of data modelling and of general IT architecture and systems integration is also required. Other technologies such as Azure Data Factory, Redshift …
sets from multiple sources. Strong experience with SQL & database management. Experience with cloud platforms such as AWS (preferred), Azure, or Google Cloud. Experience with big data technologies, specifically Databricks & Spark. Experience with data visualization tools like AWS QuickSight, Tableau, Power BI, or similar. Experience with project management tools such as Jira, SharePoint, MS Teams, MS Word and MS Excel.
Liverpool, Merseyside, North West, United Kingdom Hybrid / WFH Options
Forward Role
scheme, long service recognition, and regular company events. What you'll need: Solid experience in data engineering, management and analysis. Strong experience with Azure Data Warehouse solutions and AWS Databricks platforms. Exceptional Python/PySpark + additional languages for data processing. Strong SQL with experience across both relational databases (SQL Server, MySQL) and NoSQL solutions (MongoDB, Cassandra). Hands-on knowledge …
London, South East, England, United Kingdom Hybrid / WFH Options
Avencia Consulting
within the insurance sector, ideally with London Market experience. Strong knowledge of insurance data models, including policy, claims, and risk-related structures. Expertise in Azure Cloud Services (Data Factory, Databricks, Key Vault, Azure SQL) and data warehousing methodologies (Kimball, Inmon). Proficiency in T-SQL for data transformation and summarisation, with Python skills for building data pipelines. Advanced Excel skills …
Experience building production-grade data pipelines (primarily in Python and SQL) in cloud environments, with an emphasis on scalability, code clarity, and long-term maintainability. Hands-on experience with Databricks and/or Spark, especially Delta Lake, Unity Catalog, and MLflow. Deep familiarity with cloud platforms, particularly AWS and Google Cloud. Proven ability to manage data architecture and production pipelines …
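As an illustration of the kind of Databricks/Spark pipeline work this listing describes, here is a minimal PySpark sketch that cleans a raw dataset and writes it as a Delta Lake table. The source path, table name and columns are hypothetical, and it assumes a Spark environment with Delta Lake available (as on Databricks).

```python
# Minimal sketch of a batch pipeline step on Databricks/Spark with Delta Lake.
# The source path, table name and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks a session already exists

# Read raw events, apply light cleaning, and de-duplicate.
raw = spark.read.json("/mnt/raw/events/")          # hypothetical source path
clean = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
       .filter(F.col("event_id").isNotNull())
)

# Write as a Delta table so downstream jobs get ACID guarantees and time travel.
(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("event_date")
      .saveAsTable("analytics.events_clean"))       # hypothetical catalog table
```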
Williamsburg, Virginia, United States Hybrid / WFH Options
National Center for State Courts
as causal inference (e.g., quasi-experimental and experimental designs) & machine learning methods (e.g., regression/classification); cloud-based services for DS (e.g., Microsoft Azure, Amazon AWS, Google Cloud Platform, Databricks or others); and (v) Able to travel domestically; potential for international travel. Job in Williamsburg, VA - remote work permitted with periodic travel to home office as needed. Full-time/…
Python and AI/ML frameworks such as PyTorch, LangChain, LangGraph, GraphRAG, and AutoGen. Experience with modern vector and graph databases (e.g., ChromaDB, Neo4j) and LLMOps platforms (e.g., Azure, Databricks, Azure OpenAI). Proven track record of delivering scalable AI solutions in enterprise settings, preferably in life sciences. Excellent communication and interpersonal skills, with the ability to lead projects and …
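Purely as an illustration of the vector-database side of this listing, below is a minimal ChromaDB sketch using its in-memory client; the collection name, documents and query text are made up, and it assumes the default embedding function that ships with the chromadb package.

```python
# Tiny ChromaDB sketch: store a couple of documents and run a similarity query.
# Collection name, documents and query are illustrative; uses the default embedding function.
import chromadb

client = chromadb.Client()                      # in-memory (ephemeral) client
docs = client.create_collection(name="papers")  # hypothetical collection

docs.add(
    ids=["doc-1", "doc-2"],
    documents=[
        "Trial results for compound A in oncology.",
        "Manufacturing process notes for compound B.",
    ],
)

# Retrieve the document most similar to the question.
results = docs.query(query_texts=["oncology trial outcomes"], n_results=1)
print(results["documents"])
```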
models. Experience using version control tools such as Git, Bitbucket or similar central repository tools. Familiarity with NoSQL databases. Familiarity with JIRA preferred. Preferred experience with Azure: Data Factory, Databricks, Data Lake Storage, Analysis Services, and API - PySpark. Ability to multi-task in a fast-paced environment. Ability to prioritize tasks and projects using a browser-based tracking system. Ability …
engineering roles, preferably for a customer-facing data product. Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go or Scala. Demonstrable, effective use of AI tooling in your development process. A …
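For the orchestration tooling named here (Airflow in particular), a minimal daily-DAG sketch might look like the following; the DAG id, task bodies and the Airflow 2.x `schedule` argument are assumptions for illustration, not taken from the listing.

```python
# Minimal Airflow 2.x DAG sketch; dag_id, schedule and task bodies are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")           # placeholder for real extraction logic

def load():
    print("load transformed data into the warehouse")   # placeholder for real load logic

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",      # Airflow 2.4+; earlier versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract) >> \
        PythonOperator(task_id="load", python_callable=load)
```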
strategy roles, ideally within a complex, regulated, or customer-centric environment. Degree in a quantitative field (e.g., Data Science, Economics, Engineering, Business Analytics). Deep understanding of data platforms (e.g., Databricks, Tableau Cloud), data governance, and AI/ML applications in business contexts. Strategic thinker with a bias for action and a passion for unlocking business value through data. Excellent communication …
Experience with go-to-market activities, RFI/RFP responses, and proposal materials. TOGAF or equivalent certification; cloud certifications (AWS/Azure); data & AI tooling (e.g., Azure Fabric, Snowflake, Databricks). Experience with Data Mesh and data governance platforms (e.g., Collibra, Informatica). Ability to develop a common language for data, data models, data dictionaries, vocabularies, taxonomies, or ontologies. Experience …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Deloitte LLP
Experience with go-to-market activities, RFI/RFP responses, and proposal materials. TOGAF or equivalent certification; cloud certifications (AWS/Azure); data & AI tooling (e.g., Azure Fabric, Snowflake, Databricks). Experience with Data Mesh and data governance platforms (e.g., Collibra, Informatica). Ability to develop a common language for data, data models, data dictionaries, vocabularies, taxonomies, or ontologies. Experience …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Deloitte LLP
Experience with go-to-market activities, RFI/RFP responses, and proposal materials. TOGAF or equivalent certification; cloud certifications (AWS/Azure); data & AI tooling (e.g., Azure Fabric, Snowflake, Databricks). Experience with Data Mesh and data governance platforms (e.g., Collibra, Informatica). Ability to develop a common language for data, data models, data dictionaries, vocabularies, taxonomies, or ontologies. Experience …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Deloitte LLP
Experience with go-to-market activities, RFI/RFP responses, and proposal materials. TOGAF or equivalent certification; cloud certifications (AWS/Azure); data & AI tooling (e.g., Azure Fabric, Snowflake, Databricks). Experience with Data Mesh and data governance platforms (e.g., Collibra, Informatica). Ability to develop a common language for data, data models, data dictionaries, vocabularies, taxonomies, or ontologies. Experience …
e.g., Pandas, NumPy) and deep expertise in SQL for building robust data extraction, transformation, and analysis pipelines. Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake, with a focus on scalability and performance optimization. PREFERRED QUALIFICATIONS: Solid understanding of cloud infrastructure, particularly AWS, with practical experience using Docker, Kubernetes, and implementing CI/CD …
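As a small sketch of the Pandas/NumPy/SQL pipeline work this listing refers to, the example below extracts rows with SQL, transforms them in Pandas, and writes the result back via SQLAlchemy; the connection string, tables and columns are hypothetical.

```python
# Tiny extract-transform-load sketch with Pandas, NumPy and SQLAlchemy.
# The connection string, tables and columns are assumed for illustration.
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@host:5432/analytics")  # hypothetical

# Extract: pull raw orders with plain SQL.
orders = pd.read_sql("SELECT order_id, amount, created_at FROM raw_orders", engine)

# Transform: clean types, clip obvious bad values, add a derived column.
orders["created_at"] = pd.to_datetime(orders["created_at"])
orders["amount"] = orders["amount"].clip(lower=0)
orders["log_amount"] = np.log1p(orders["amount"])

# Load: write the cleaned table back for analysts to query.
orders.to_sql("orders_clean", engine, if_exists="replace", index=False)
```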
disciplines: Cloud Engineering; Data Engineering (not building pipelines but designing and building the framework); DevOps; MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar …
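Where MLflow comes up in the tooling list above, experiment tracking is the typical entry point; below is a minimal tracking sketch in which the experiment name, parameters and metric are purely illustrative.

```python
# Minimal MLflow tracking sketch; experiment name, params and metric are illustrative.
import mlflow

mlflow.set_experiment("demo-experiment")   # hypothetical experiment name

with mlflow.start_run():
    # Record the configuration used for this (pretend) training run...
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_param("n_estimators", 200)
    # ...and the resulting evaluation metric, so runs can be compared later.
    mlflow.log_metric("rmse", 0.42)
```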
like web applications on Azure cloud infrastructure with Docker, Kubernetes, automation testing and DevOps (ideally Terraform + GitHub Actions). Create, monitor and maintain data pipelines and batch processes (Databricks, PySpark, lakehouses, Kafka). Productionise quant models into software applications, ensuring robust day-to-day operation, monitoring and back-testing are in place. Stakeholder engagement: elicit, validate and translate business …
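To illustrate the Databricks/PySpark/Kafka pipeline work described here, a minimal Structured Streaming sketch that reads a Kafka topic and appends to a Delta (lakehouse) table could look like this; the broker address, topic, checkpoint path and table name are assumptions, and on Databricks the Kafka connector is available out of the box.

```python
# Minimal PySpark Structured Streaming sketch: Kafka topic -> Delta (lakehouse) table.
# Broker, topic, checkpoint path and table name are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
         .option("subscribe", "prices")                       # hypothetical topic
         .load()
         .select(F.col("value").cast("string").alias("payload"),
                 F.col("timestamp"))
)

(stream.writeStream
       .format("delta")
       .option("checkpointLocation", "/mnt/checkpoints/prices")
       .outputMode("append")
       .toTable("market.prices_raw"))                         # hypothetical table
```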
Leicester, Leicestershire, England, United Kingdom
Harnham - Data & Analytics Recruitment
and manage ETL pipelines and APIs to improve data delivery and accuracy. Contribute to the overall data strategy alongside BI and analytics colleagues. Tech You'll Use: SQL, Python, Databricks. Experience with web scraping or similar techniques would be a bonus. Interview Process: Initial interview with the Data Manager; technical discussion/assessment (you'll walk through your approach to …
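As a small sketch of the ETL-and-API work this role mentions, the example below pulls JSON from a hypothetical REST endpoint and loads it into a SQL table; the URL, fields and table name are invented, and the same pattern extends to web-scraped sources.

```python
# Minimal API-to-SQL ETL sketch; the endpoint, fields and target table are assumptions.
import pandas as pd
import requests
from sqlalchemy import create_engine

engine = create_engine("sqlite:///warehouse.db")  # stand-in for the real warehouse

# Extract: call a REST endpoint and parse the JSON payload.
resp = requests.get("https://api.example.com/v1/products", timeout=30)  # hypothetical URL
resp.raise_for_status()
products = pd.DataFrame(resp.json())

# Transform: keep only the fields downstream reports need and drop bad rows.
products = products[["sku", "name", "price"]].dropna(subset=["sku"])

# Load: write into the reporting table for BI colleagues to query.
products.to_sql("products_latest", engine, if_exists="replace", index=False)
```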
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom Hybrid / WFH Options
True North Group
to meet business needs. Contribute to pre-sales, proposals, and thought leadership activities. Requirements: Strong experience designing and maintaining data platforms and ETL/ELT solutions. Solid knowledge of Databricks, Python, PySpark, Spark SQL, Azure and/or AWS. Data modelling expertise (Inmon, Kimball, Data Vault). Familiar with DataOps practices and pipeline monitoring. Experience with sensitive data and applying …