segmentation, forecasting, and marketing spend optimisation. Proficiency in Python, SQL, and Git, with hands-on experience in tools like Jupyter notebooks, Pandas, and PyTorch. Expertise in cloud platforms (AWS, Databricks, Snowflake) and containerisation tools (Docker, Kubernetes). Strong leadership skills with experience mentoring and managing data science teams. Deep knowledge of media measurement techniques, such as media mix modelling. Experience …
intelligence tools and visualisation platforms. Promote technology best practices and scalable analytics processes. Key Skills & Experience: Strong academic background in a relevant field. Proficiency in ETL tools (e.g. Alteryx, Databricks); experience with Retool is a plus. Advanced SQL skills for querying and modelling relational databases. Solid understanding of financial concepts; tax reporting experience is beneficial. Prior experience in KPI modelling …
St Ives, Cambridgeshire, United Kingdom Hybrid / WFH Options
Interaction Recruitment
and maintaining reports and dashboards to monitor performance and identify trends; contributing to experimentation and testing around customer experience and pricing; learning modern tools including SQL, Python, Tableau, and Databricks (training provided); collaborating with stakeholders across the business and within the wider analytics team. What we’re looking for: A degree (or equivalent experience) in Maths, Economics, Statistics or similar …
wide range of stakeholders; hands-on experience with LLMs, vector databases, RAG pipelines, prompt engineering, and model fine-tuning; familiarity with cloud platforms (especially Azure) and tools such as Databricks, Hugging Face, LangChain, and open-source GenAI frameworks; deep curiosity and a passion for staying ahead of AI and data science developments; commercial acumen and a focus on delivering value …
Establishing thought leadership, writing blogs, publishing articles, and presenting at external events. What you'll need to succeed: Expertise in data warehousing, cloud analytics, and modern data architectures (Snowflake, Databricks, Matillion, Power BI, or similar). Proven ability to engage and influence senior stakeholders, providing strategic guidance and technical leadership. Strong consulting and client management experience, with a track record …
and building robust processes for scale. Nice-to-have: Experience shipping at pace in AWS S3 environments. Experience working with the Dagster orchestrator or similar data orchestration systems (e.g. Airflow, Databricks). Software engineering background. Benefits: Competitive salary & equity options; unlimited holiday; benefits package; career development opportunities as the company scales; ownership of ambitious, mission-driven work with real-world impact …
basis. Your Role: As a Data Engineering Specialist, you will have a considerable understanding of data engineering principles, including ETL processes. You will have hands-on experience working with Databricks and PySpark for data transformation and be familiar with cloud computing platforms such as Microsoft Azure services. In your role you will have an understanding of data warehouse architectures as well as …
and integrating data across systems to support business performance and decision-making. The team works closely with Finance, Technology, and Product to build scalable, cloud-native analytics solutions on Databricks, Azure Synapse, and Power BI. About the Role: We are seeking a dedicated Technical Analyst based in the UK to support our ongoing Crystal Reports migration to the BDAP platform.
industry, with an understanding of how data products can drive business value in these sectors. You will have experience with cloud platforms such as Azure and data tools like Databricks, with an understanding of how to leverage the platform for product development, data integration, and advanced analytics. You will have experience working within Agile or continuous delivery frameworks, with a …
solutions on cloud platforms. Python: Proficient in writing clean, production-quality code. AI Model Management: Familiarity with platforms such as MLflow, Hugging Face, or LangChain. Data Processing: Experience with Databricks/Spark. SQL: Solid querying and data preparation skills. Data Architectures: Understanding of modern data systems (lakehouses, data lakes). Additional (nice-to-have) skills: Infrastructure as Code: Terraform or …
Expertise in causal inference methods and forecasting. Expertise in data querying languages (e.g. SQL) and scripting languages (e.g. Python, R). Experience with data architecture technologies such as Airflow, Databricks, and dbt. Preferred qualifications: Experience in technology, financial services and/or a high-growth environment. Experience with Excel and Finance systems (e.g. Oracle). Equal opportunity: Airwallex is proud …
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Arm Limited
cloud and running hybrids. Container orchestration deployment, configuration, and administration (e.g. Docker Swarm, Kubernetes). Automation and configuration management (e.g. Ansible, Puppet, Chef). Analytics infrastructure and tooling (e.g. Databricks, Apache Spark). Technical leadership, championing standard methodologies, and leading technical product transformations. In Return: We are proud to have a set of behaviours that reflect our culture and guide …
problems, take the initiative, and identify creative solutions to deliver outcomes in the face of obstacles. Knowledge of common data science tools around SQL-based data warehousing (e.g. Snowflake, Databricks, dbt), BI tools (e.g. Tableau, Looker), workflow orchestration, and MLOps. Excellent spoken and written English skills. Fluency with scripting in Python. Ability to work effectively across time zones. Teammates …
Stroud, England, United Kingdom Hybrid / WFH Options
Ecotricity
reporting. What will you bring: Strong technical analytical skills as a Data Analyst; experience managing and manipulating data; strong working knowledge of SQL for data analysis (MS-SQL/Databricks: T-SQL, view and table design, experience of ETL); expertise with Power BI; advanced MS Excel (Power Query, Power Pivot); stakeholder management and facilitation of decisions of all sizes; strong presentation …
and ingest large-scale data using Apache Spark; Top Secret clearance; Bachelor's degree. Nice If You Have: Experience developing ETL and ELT pipelines with Apache NiFi and Databricks; ability to perform message queuing and real-time streaming with Apache Kafka; ability to perform Scala programming. Clearance: Applicants selected will be subject to a security investigation and may need …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
proficiency developing applications using most of the following: Strong knowledge of SQL and Python programming. Extensive experience working within a cloud environment. Experience with big data technologies (e.g. Spark, Databricks, Delta Lake, BigQuery). Experience with alternative data technologies (e.g. duckdb, polars, daft). Familiarity with eventing technologies (Event Hubs, Kafka, etc.). Deep understanding of file formats and their …
to expand your cloud/platform engineering capabilities; experience working with Big Data; experience of data storage technologies: Delta Lake, Iceberg, Hudi; proven knowledge and understanding of Apache Spark, Databricks or Hadoop; ability to take business requirements and translate these into tech specifications; competence in evaluating and selecting development tools and technologies. Sound like the role you have been looking …
and control analysis, solution design, AI governance, and end-to-end implementation management. Assist clients in executing their AI & Data strategy through deployment of modern data platforms like Snowflake, Databricks, and Microsoft Fabric. Develop and implement data management artifacts by establishing data governance requirements, including data lineage discovery, data quality control design and measurement solutions, and data privacy-related activities.
are willing to teach if you're willing to learn! Required experience: Python, Git. Nice to have: SQL, dbt, GitHub, CircleCI, Airflow, Kubernetes, Terraform, a cloud warehouse provider (e.g. Databricks, GCP, Snowflake, AWS). We aren't necessarily looking for someone who is "10-out-of-10" in all these areas, but rather someone who has good experience in most of …
Experience designing and maintaining data warehouses in big data solutions, in the cloud and on premises. Experience and knowledge of cloud platforms such as Snowflake, AWS (S3, Glue, EC2, Athena), Databricks and dbt. This includes hands-on knowledge of cloud-based data storage, processing, and analytics solutions. Good understanding of business processes. Strong analytical skills and techniques. Expertise in Batch …
product engineering, data marketplace architecture, data developer portals, and understanding of AI/ML solutions. Experience co-selling partner solutions with hyperscalers or platforms (e.g. AWS, Azure, GCP, Snowflake, Databricks). Outstanding communication skills - able to translate complex ideas for both technical and business audiences. Demonstrated thought leadership in data & AI, such as speaking at industry events, contributing to whitepapers …