We work with some of the UK's biggest companies and government departments to provide a pragmatic approach to technology, delivering bespoke software solutions and expert advice. Our clients are increasingly looking to us to help them make the best …
London, South East, England, United Kingdom Hybrid / WFH Options
CODEVERSE LIMITED
Azure Databricks Platform Lead | Hybrid | Slough
Are you a Databricks expert with deep knowledge of both data infrastructure and engineering? We’re looking for a Databricks Champion to take ownership of an enterprise-scale Azure Databricks environment, driving performance, governance, connectivity, and cost optimisation across the platform. This is a client-facing, hands-on role that combines cloud engineering, data … practices across the board.
Location: Hybrid with travel to Slough (a few days per week)
Start: ASAP
Type: Full-time, permanent
What You’ll Be Doing:
* Manage & optimise Azure Databricks workspaces (VNet injection, private endpoints, cluster config, performance tuning, cost control)
* Design and implement Unity Catalog for governance, access control, data lineage, and auditing (a minimal example is sketched below)
* Integrate Entra ID for SSO, SCIM … credential passthrough, Key Vault, and ACLs
* Build secure networking (VNet peering, NSGs, DNS/routing, firewalls)
* Orchestrate data pipelines with ADF and Databricks
* Configure and troubleshoot ADLS Gen2 access, mounts, and credential passthrough
* Implement monitoring, alerting, DR, and platform health checks
* Define and share best practices, architecture documents, and runbooks
* Act as SME, collaborating across teams and representing the Databricks function …
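As a rough illustration of the Unity Catalog governance work this role describes, here is a minimal sketch. It assumes a Databricks notebook where a Unity Catalog-enabled spark session is already available; the catalog, schema, table, and group names are hypothetical, not taken from the advert.

```python
# Minimal Unity Catalog access-control sketch, assuming it runs in a Databricks
# notebook where a Unity Catalog-enabled `spark` session already exists.
# The catalog, schema, table, and group names below are hypothetical.

def grant_read_access(spark, catalog: str, schema: str, table: str, group: str) -> None:
    """Grant a workspace group read-only access to a single governed table."""
    spark.sql(f"GRANT USE CATALOG ON CATALOG {catalog} TO `{group}`")
    spark.sql(f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{group}`")
    spark.sql(f"GRANT SELECT ON TABLE {catalog}.{schema}.{table} TO `{group}`")

# Example usage (hypothetical names):
# grant_read_access(spark, "analytics", "sales", "daily_orders", "data-analysts")
```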
of clients. For this role, we are looking for someone who can demonstrate experience in the following areas:
* Commercial experience with implementing Fabric
* Strong Azure experience - ideally using ADF, Databricks, ADLS etc.
* Data Engineering background - ETL development and data storage platforms such as Data Warehouse, Lake, or Lakehouse architectures
You will ideally have come from a consultancy background, and therefore understand …
segmentation, forecasting, and marketing spend optimisation. Proficiency in Python, SQL, and Git, with hands-on experience in tools like Jupyter notebooks, Pandas, and PyTorch. Expertise in cloud platforms (AWS, Databricks, Snowflake) and containerisation tools (Docker, Kubernetes). Strong leadership skills with experience mentoring and managing data science teams. Deep knowledge of media measurement techniques, such as media mix modelling. Experience …
City of London, London, United Kingdom Hybrid / WFH Options
ITSS Recruitment
chance to be involved in all aspects of the project process from conception through to completion and launch.
Python Developer Experience:
* Python, including APIs, data structures, and async processing
* Databricks/Microsoft Fabric
* Cloud, preferably Azure (Data Lake, Functions, App Services)
* Containerisation with Docker and CI/CD pipelines
* MLOps tooling (MLflow, Git-based versioning, environment tracking; see the MLflow sketch below)
Desirable Skills & Interests …
Employment Type: Permanent
Salary: £65000 - £80000/annum Bonus, 26 days holiday, private heal…
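For context on the MLOps tooling listed in the role above, here is a minimal MLflow tracking sketch; it assumes the mlflow package is installed, logs to the default local store, and uses made-up run, parameter, and metric names.

```python
# Minimal MLflow experiment-tracking sketch; assumes the mlflow package is
# installed (pip install mlflow) and logs to the default local ./mlruns store.
# The run name, parameter, and metric values are hypothetical.
import mlflow

with mlflow.start_run(run_name="baseline-model"):
    mlflow.log_param("model_type", "baseline")  # record a configuration value
    mlflow.log_metric("mae", 1.23)              # record an evaluation metric
```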
intelligence tools and visualisation platforms. Promote technology best practices and scalable analytics processes. Key Skills & Experience: Strong academic background in a relevant field. Proficiency in ETL tools (e.g. Alteryx, Databricks); experience with Retool is a plus. Advanced SQL skills for querying and modelling relational databases. Solid understanding of financial concepts; tax reporting experience is beneficial. Prior experience in KPI modelling …
London, South East, England, United Kingdom Hybrid / WFH Options
ITSS Recruitment Ltd
the chance to be involved in all aspects of the project process from conception through to completion and launch.
Python Developer Experience:
* Python, including APIs, data structures, and async processing
* Databricks/Microsoft Fabric
* Cloud, preferably Azure (Data Lake, Functions, App Services)
* Containerisation with Docker and CI/CD pipelines
* MLOps tooling (MLflow, Git-based versioning, environment tracking)
Desirable Skills & Interests …
City of London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
Ensure solutions meet governance, compliance, and security standards. Skills & Experience: Proven experience as a Data Solution Architect or similar senior-level data architecture role. Strong knowledge of Kafka, Confluent, Databricks, Unity Catalog, and cloud-native architecture. Skilled in Data Mesh, Data Fabric, and product-led data strategy design. Experience with big data tools (e.g., Spark), ETL/ELT, SQL/…
background and a knack for translating data into actionable insights.
- Hands-on experience with Adobe Analytics, SQL and Power BI.
- Ability to communicate findings clearly to stakeholders.
- Experience with Databricks and Snowplow is preferred but not essential.
What you'll get in return: This is a 6-month contract offering the opportunity to make a tangible impact in a high …
of data to support ranking evolution
Assist Senior Data Scientist in data consultancy projects, including:
* Reviewing and maintaining codebase and ETL pipelines (Python, Excel)
* Building skills in bibliometrics analyses (Databricks, Python)
* Collaborating with experts to develop scalable analyses for the HE sector
Work with Data Engineer to disseminate data project results, including:
* Developing analyses and dashboards with the Product team …
models for digital media and marketing domains. Translate media buying and activation workflows into structured data assets. Collaborate with data engineers to implement models in cloud platforms (e.g., Azure, Databricks).
Domain Expertise: Understand and model data from platforms such as DV360, Meta, Amazon DSP, TikTok, IAS, Innovid, and others. Integrate campaign metadata, audience segments, impressions, clicks, conversions, and spend …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
to translate data into action
What You’ll Bring:
* Strong hands-on data science and ML experience
* Expert in Python and modern data science libraries
* Experience with AWS SageMaker, Databricks, or similar cloud tools
* Strong background in supervised/unsupervised learning, statistical modelling, and model deployment
* Solid understanding of NLP techniques for document processing
* Domain experience in insurance (e.g., underwriting …
forecasting and analytics frameworks across the business
What You'll Need:
* Excellent Python programming skills
* Experience with time series data, forecasting, and machine learning (e.g. Redis cache, AWS S3, Databricks, Grafana); see the sketch below
* Exposure to the German or European electricity market (e.g. EPEX Spot, Redispatch, TSOs)
* Experience building data pipelines and automating data workflows
* Ability to clearly communicate modelling approaches, including assumptions …
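As a minimal sketch of the kind of time-series forecasting work listed above, the example below uses only pandas with synthetic data; the series, frequency, and horizon are hypothetical and nothing in it is taken from the advert.

```python
# Minimal time-series forecasting sketch using only pandas and synthetic data.
import pandas as pd

# Two weeks of hourly demand (synthetic values)
idx = pd.date_range("2024-01-01", periods=24 * 14, freq="h")
demand = pd.Series(range(len(idx)), index=idx, dtype="float64")

# Seasonal-naive forecast: predict each hour with the value 24 hours earlier
forecast = demand.shift(24)

# Evaluate with mean absolute error over the overlapping period
mae = (demand - forecast).abs().mean()
print(f"MAE of the 24h seasonal-naive forecast: {mae:.2f}")
```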
bring to the role: 5 years in a Sales Engineering, Solutions Engineering, Consulting or similar role within the data space, ideally with experience in modern data tools like Snowflake, Databricks, Fivetran, or Tableau. Hands-on Python scripting skills for data pipeline support. Familiarity with core data engineering concepts such as orchestration, ELT, Git, and Role-Based Access Control (RBAC).
Expertise in causal inference methods and forecasting. Expertise in data querying languages (e.g. SQL) and scripting languages (e.g. Python, R). Experience with data architecture technologies such as Airflow, Databricks, and dbt.
Preferred qualifications: Experience in technology, financial services and/or a high growth environment. Experience with Excel and Finance systems (e.g. Oracle).
Equal opportunity: Airwallex is proud …
problems, take the initiative, and identify creative solutions to deliver outcomes in the face of obstacles. Knowledge of common data science tools around SQL-based data warehousing (e.g. Snowflake, Databricks, dbt), BI tools (e.g. Tableau, Looker), workflow orchestration, and MLOps. Excellent spoken and written English skills. Fluency with scripting in Python. Ability to work effectively across time zones. Teammates …
concise communication. Previous experience in aviation, transportation, logistics, or supply chain optimisation. Familiarity with airline planning domains (e.g. aircraft routing, slot planning, scheduling). Exposure to cloud platforms (e.g. Databricks, GCP, AWS and Azure) and version control (e.g. Git). Experience working in Agile product teams or similar collaborative environments.
What You'll Get in Return: A dynamic and creative …
and control analysis, solution design, AI governance, and end-to-end implementation management. Assist clients in executing their AI & Data strategy through deployment of modern Data platforms like Snowflake, Databricks, and Microsoft Fabric. Develop and implement data management artifacts by establishing data governance requirements including data lineage discovery, data quality control design and measurement solutions and data privacy related activities.
are willing to teach if you're willing to learn!
Required experience:
* Python
* Git
Nice to have:
* SQL
* dbt
* GitHub
* CircleCI
* Airflow
* Kubernetes
* Terraform
* A cloud warehouse provider, e.g. Databricks, GCP, Snowflake, AWS
We aren't necessarily looking for someone who is "10-out-of-10" in all these areas, but rather someone who has good experience in most of …
and control analysis, solution design, AI governance, and end-to-end implementation management.
• Assist clients in executing their AI & Data strategy through deployment of modern Data platforms like Snowflake, Databricks, and Microsoft Fabric.
• Develop and implement data management artifacts by establishing data governance requirements including data lineage discovery, data quality control design and measurement solutions and data privacy related activities.
mandatory; familiarity with Python-related programming languages (e.g. PySpark, Polars) is beneficial
* Proficiency in SQL for data extraction, transformation, and manipulation is beneficial
* Experience with data lakehouse paradigms (e.g. Databricks, Snowflake, implementations from major cloud providers) is beneficial
* Exposure to structured and unstructured data storage solutions in some capacity (e.g. SQL, Postgres, MongoDB, AWS S3) is beneficial
* Experience working in …