solutions on cloud platforms. Python: Proficient in writing clean, production-quality code. AI Model Management: Familiarity with platforms such as MLflow, Hugging Face, or LangChain. Data Processing: Experience with Databricks/Spark. SQL: Solid querying and data preparation skills. Data Architectures: Understanding of modern data systems (lakehouses, data lakes). Additional (nice-to-have) skills: Infrastructure as Code: Terraform or …
Expertise in causal inference methods and forecasting. Expertise in data querying languages (e.g. SQL) and scripting languages (e.g. Python, R). Experience with data architecture technologies such as Airflow, Databricks, and dbt. Preferred qualifications: Experience in technology, financial services and/or a high-growth environment. Experience with Excel and Finance systems (e.g. Oracle). Equal opportunity: Airwallex is proud …
middleware tools like Tray, Zapier, Make, n8n, and SuperBlocks, and extensive experience with rETL platforms and architecture (Polytomic, Hightouch, RudderStack, etc.). Familiarity with data warehouse platforms like Snowflake, BigQuery, Databricks, etc. Bonus if you have: Experience with other types of data storage, ranging from cache tools like Redis to S3. Built systems integrating data from a variety of tools and reconciling user …
are willing to teach if you're willing to learn! Required experience: Python, Git. Nice to have: SQL, dbt, GitHub, CircleCI, Airflow, Kubernetes, Terraform, a cloud warehouse provider (e.g. Databricks, GCP, Snowflake, AWS). We aren't necessarily looking for someone who is "10-out-of-10" in all these areas, but rather someone who has good experience in most of …
mandatory; familiarity with Python-related programming languages and libraries (e.g. PySpark, Polars) is beneficial. Proficiency in SQL for data extraction, transformation, and manipulation is beneficial. Experience with data lakehouse paradigms (e.g. Databricks, Snowflake, implementations from major cloud providers) is beneficial. Exposure to structured and unstructured data storage solutions in some capacity (e.g. SQL, Postgres, MongoDB, AWS S3) is beneficial. Experience working in …
play a key role in supporting and scaling our growing data infrastructure. In this role, you'll be responsible for building and maintaining scalable ETL/ELT pipelines using Databricks and modern cloud tools. You'll also step in to temporarily support our business intelligence needs, developing and maintaining reports and dashboards in ThoughtSpot (or a similar BI platform). … business metrics during the analyst's leave period. Translate complex datasets into usable, decision-support insights. Key Requirements (Essential): Strong experience building and managing ETL/ELT pipelines in Databricks or similar platforms. Proficiency in Python and SQL for data processing, transformation, and analysis. Deep knowledge of data modeling and warehousing concepts. Experience with BI tools, preferably ThoughtSpot (or Power …
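For context on the kind of Databricks ETL/ELT work the listing above describes, here is a minimal, hedged sketch of a batch step using PySpark and Delta Lake. The table and column names are invented for illustration and do not come from the posting; it assumes a Databricks runtime (or any Spark session with Delta Lake configured).

```python
# Hypothetical sketch of a simple ELT step: read a raw table, aggregate,
# and publish a curated Delta table. Table/column names are illustrative only.
from pyspark.sql import SparkSession, functions as F

def build_daily_sales(spark: SparkSession, source_table: str, target_table: str) -> None:
    """Aggregate raw order events into a daily sales table."""
    raw = spark.read.table(source_table)

    daily = (
        raw
        .filter(F.col("status") == "completed")           # keep finished orders only
        .withColumn("order_date", F.to_date("order_ts"))  # derive a date column
        .groupBy("order_date", "region")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.countDistinct("order_id").alias("order_count"),
        )
    )

    (
        daily.write
        .format("delta")
        .mode("overwrite")
        .option("overwriteSchema", "true")
        .saveAsTable(target_table)
    )

# Example usage on Databricks, where `spark` is already provided:
# build_daily_sales(spark, "raw.orders", "analytics.daily_sales")
```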
10+ years. Job Summary: We are seeking a highly experienced and visionary Data Architect to lead the design and implementation of the data architecture for our cutting-edge Azure Databricks platform focused on economic data. This platform is crucial for our Monetary Analysis, Forecasting, and Modelling efforts. The Data Architect will be responsible for defining the overall data strategy, data … within the economic domain. Key Experience: Extensive Data Architecture Knowledge: They possess a deep understanding of data architecture principles, including data modeling, data warehousing, data integration, and data governance. Databricks Expertise: They have hands-on experience with the Databricks platform, including its various components such as Spark, Delta Lake, MLflow, and Databricks SQL. They are proficient in using Databricks for … various data engineering and data science tasks. Cloud Platform Proficiency: They are familiar with cloud platforms like AWS, Azure, or GCP, as Databricks operates within these environments. They understand cloud data architectures and best practices. Leadership and Communication Skills: They can lead technical teams, mentor junior architects, and effectively communicate complex technical concepts to both technical and non-technical stakeholders. …
Data Engineer (Databricks) - Leeds (Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, Github, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Data Engineer) Our client is a global innovator and world leader with one of the most recognisable names within technology. They are looking for Data … Engineers with significant Databricks experience to join an exceptional Agile engineering team. We are seeking a Data Engineer with strong Python, PySpark and SQL experience, a clear understanding of Databricks, and a passion for Data Science (R, Machine Learning and AI). Database experience with SQL and NoSQL - Aurora, MS SQL Server, MySQL - is expected, as … Salary: £40k - £50k + Pension + Benefits. To apply for this position please send your CV to Nathan Warner at Noir Consulting. (Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, Github, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Data Engineer) NOIRUKTECHREC NOIRUKREC …
Lead Data Engineer (Databricks) - Leeds (Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, Github, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer) Our client … is a global innovator and world leader with one of the most recognisable names within technology. They are looking for a Lead Data Engineer with significant Databricks experience as well as leadership responsibility to run an exceptional Agile engineering team and provide technical leadership through coaching and mentorship. We are seeking a Lead Data Engineer capable of leading client delivery … data solutions and developing complex enterprise data ETL and ML pipelines and projections. The successful candidate will have strong Python, PySpark and SQL experience, a clear understanding of Databricks, and a passion for Data Science (R, Machine Learning and AI). Database experience with SQL and NoSQL - Aurora, MS SQL Server, MySQL - is expected, as well …
Data Engineer - Outside IR35 - Fully UK Remote. An exciting new opportunity has arisen for a Data Engineer to join a forward-thinking organisation on a contract basis. Key Skills: Databricks, Azure Data Factory, DevOps, CI/CD, Azure Event Hubs. This role will be supporting a key initiative within the organisation and is initially 1 month with a good chance …
data analysis, the ideal candidate should have experience in cleaning and transforming datasets for analysis. They should also be familiar with a range of tools including: Alteryx, DataRobot, SAS, Databricks, SPSS, R, Python, Scala, Java, Spark. In addition, exposure to data visualization platforms such as Tableau or Power BI would be beneficial. A solid understanding of machine learning concepts is …
with Celery for distributed task execution and background job processing, particularly in data pipeline or microservices environments. Hands-on experience with Azure cloud services, especially Azure Data Factory, Azure Databricks, Azure Storage, and Azure Synapse. Proficiency in designing and deploying CI/CD pipelines using Azure DevOps (YAML pipelines, release management, artifact handling). …
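Since the listing above highlights Celery for background processing in data-pipeline environments, here is a minimal, hedged sketch of a retried ingestion task. The broker URL, task name, and downstream loading step are assumptions for illustration, not details from the posting.

```python
# Hypothetical sketch of a Celery task used for background data processing.
# The broker URL and the downstream loader are placeholders, not real project details.
from celery import Celery

app = Celery("pipeline", broker="redis://localhost:6379/0")

@app.task(bind=True, max_retries=3, default_retry_delay=60)
def ingest_csv(self, file_path: str) -> int:
    """Parse a CSV file in the background, retrying on transient I/O errors."""
    try:
        with open(file_path, encoding="utf-8") as fh:
            rows = [line.rstrip("\n").split(",") for line in fh if line.strip()]
        # A real pipeline would hand `rows` to a warehouse loader here;
        # that loader is project-specific and deliberately omitted.
        return len(rows)
    except OSError as exc:
        # Re-queue the task with a delay when the file is temporarily unavailable.
        raise self.retry(exc=exc)

# Example usage (requires a running broker and worker):
# ingest_csv.delay("/data/incoming/orders.csv")
```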
DATABRICKS ENGINEER - 6-MONTH CONTRACT - £550-£600 PER DAY. This role offers a great opportunity for an Azure Databricks Engineer to join a renewable energy firm based in London. You'll play a hands-on role in developing and optimising modern data lakehouse solutions on Azure, while supporting critical analytics and data delivery systems. The environment encourages technical ownership, collaboration … You'll be responsible for data pipeline development, resource deployment, and ongoing optimisation of cloud-native systems. Your responsibilities will include: Designing and implementing scalable data lakehouse architectures using Databricks on Azure. Building efficient ETL/ELT pipelines for structured and unstructured data. Working with stakeholders to ensure high-quality, accessible data delivery. Optimising SQL workloads and data flows for … Automating infrastructure deployment using Terraform and maintaining CI/CD practices. Supporting secure and performant data access via cloud-based networking. KEY SKILLS AND REQUIREMENTS: Strong experience with Azure Databricks in production environments. Background with Azure Data Factory, Azure Functions, and Synapse Analytics. Proficient in Python and advanced SQL, including query tuning and optimisation. Hands-on experience with big data …
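As an illustration of the incremental lakehouse pipelines this contract describes, here is a hedged sketch of an upsert (MERGE) into a Delta table. It assumes a Databricks runtime or the delta-spark package; the table name and join key are hypothetical and not taken from the posting.

```python
# Hypothetical sketch of an incremental upsert into a Delta lakehouse table.
# Assumes Delta Lake is available; the table name and customer_id key are illustrative only.
from delta.tables import DeltaTable
from pyspark.sql import DataFrame, SparkSession

def upsert_customers(spark: SparkSession, updates: DataFrame, target_table: str) -> None:
    """Merge new or changed customer rows into an existing Delta table."""
    target = DeltaTable.forName(spark, target_table)

    (
        target.alias("t")
        .merge(updates.alias("u"), "t.customer_id = u.customer_id")
        .whenMatchedUpdateAll()      # update rows that already exist
        .whenNotMatchedInsertAll()   # insert rows that are new
        .execute()
    )
```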
Birmingham Business Park, Birmingham, West Midlands, England, United Kingdom
MYO Talent
and stakeholder management skills. Desirable: Any exposure to working in a software house, consultancy, retail or retail automotive sector would be beneficial but not essential. Exposure to Azure/Azure Databricks …
National Society for the Prevention of Cruelty to Children
and supporting more junior data engineers. Experience in building cloud-native data products and platforms. A strong grounding in DevOps best practices, including IaC tools. Experience with Azure and Databricks is an advantage. Experience in data automation and integration tools. Proven experience in data architecture, including designing data models and platforms. Join us and you'll become part of a team …
You'll also play a key part in mentoring junior engineers and shaping the long-term DevOps roadmap. Architect and build scalable Azure infrastructure and deployment pipelines. Lead integration of Databricks, Azure services, and CI/CD workflows. Automate infrastructure using Terraform (IaC). Mentor junior DevOps engineer. Establish and govern DevOps best practices, automation, and security standards. Guide adoption of Microsoft …
Greater Manchester, Lancashire, England, United Kingdom
Interquest
core banking systems — enabling smarter decisions, customer innovation, and regulatory excellence. Key Responsibilities: Lead and develop high-performing data engineering teams. Architect scalable, secure data pipelines and platforms (Azure, Databricks, etc.). Partner with product, risk, and tech leaders to deliver data-driven outcomes. Champion best practices in data governance, quality, and DevOps. What You'll Bring: Proven leadership in large …
and manage data pipelines. Our no-code/low-code ETL platform allows seamless integration of data from any source - whether databases, applications, or files - into lakehouses like Snowflake, Databricks, and Redshift. With pipelines that just work and features like advanced data transformation using dbt Core and end-to-end pipeline observability, we're focused on making robust data pipelines accessible …
such as OSCP, CEH, or GIAC are a plus. Nice to Have: Experience with Kubernetes and container security. Familiarity with CI/CD security integration. Familiarity with Snowflake and Databricks. Red Team experience. As well as working as part of an amazing, engaging and collaborative team, we offer our staff a wide range of benefits to motivate them to be …
managing multiple teams and offshore resources. Cloud Migration Expertise: Demonstrated success in leading end-to-end data platform migrations to the cloud, with a strong preference for Azure (Synapse, Databricks, Azure SQL, Power BI). Legacy System Knowledge: Experience with traditional SQL Server technologies and an understanding of maintaining and decommissioning legacy data warehouses. Technical Acumen: Capability to be hands-on …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and session persistence to deliver consistent user experiences. Design event-driven architectures with pub/sub patterns for asynchronous processing. Integrate with third-party data sources like Snowflake and Databricks. Create robust error handling, monitoring, and retry systems for high availability. Collaborate with MLOps and DevOps teams to optimise architecture and deployment pipelines. What You'll Bring: 5+ years in …
Milton Keynes, Buckinghamshire, United Kingdom Hybrid / WFH Options
Solo Search Services Ltd
of Azure components including Virtual Machines/Scale Sets, Azure Kubernetes Service, API Management, App Services/Service Environments, Azure SQL and MI, Functions, Networking, App Insights, Data Factory, Databricks, Identity Management (Entra). Azure DevOps/ADO. Solid understanding of Terraform, Helm, Flux, PowerShell/Azure CLI. Attributes: Able to work well in a team, as part of a department …
and engineering organization. Demonstrated experience managing cloud and technology costs. Familiarity and understanding of AWS, Azure, and GCP service offerings. Proficiency with data analysis tools and techniques (Excel, SQL, Databricks). Ability to communicate articulately and effectively across a diverse stakeholder population. Excellent interpersonal and organizational skills. Preferred Qualifications: Certifications with AWS, Azure, or GCP. FinOps Certified Practitioner Certification. FinOps Certified …
work collaboratively in a team environment. Experience in developing and implementing automated trading or decision-making systems is highly desirable. Experience with Kubernetes and Kafka is desirable. Experience with Databricks is desirable. Experience with experimentation is desirable. A Bachelor's degree in a relevant field such as Computer Science, Statistics, Mathematics, or a related discipline. Join Our Team: We're …