logistics, utilities, airlines, etc.). Hands-on experience with Azure Databricks, Delta Lake, Data Factory, and Synapse. Strong understanding of Lakehouse architecture and medallion design patterns. Proficient in Python, PySpark, and SQL (advanced query optimization). Experience building scalable ETL pipelines and data transformations. Knowledge of data quality frameworks and monitoring. Experience with Git, CI/CD pipelines, and …
salary, equity, and benefits. Join a supportive, high-calibre engineering team committed to quality and innovation. Location: Paddington, London. Tech Stack: Python, SQL, dbt, Databricks, Azure, GCP, Airflow, Terraform, PySpark …
/week On-Site) Job Type: Contract Skills/Qualifications: 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. 3+ years hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines 3+ years of proficiency in working with Snowflake or similar cloud …
months with possible extensions (No Sponsorship Available) Skills/Qualifications: 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. 3+ years hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines 3+ years of proficiency in working with Snowflake or similar cloud …
and deliver end-to-end analytics and ML/AI capabilities. This is a long-term contract role. The following skills/experience are essential: Strong expertise in Python, PySpark, and SQL Extensive hands-on experience with Databricks Familiarity with Data Science and Machine Learning frameworks Previously worked in a Financial Services Trading environment Background in DevOps and Infrastructure as Code …
City of London, Greater London, UK Hybrid / WFH Options
Hunter Bond
in Power BI Service Strong understanding of REST API principles Advanced DAX skills Advanced Power Query skills Nice-to-Have: Microsoft DP-700 Certification Python for data analysis Spark & PySpark experience Evidence of dashboards or system integration projects Knowledge of CI/CD in Microsoft Fabric/Azure DevOps Familiarity with latest Power BI features such as field parameters …
week Contract role (6 to 12 Months) Skills/Qualifications: 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. 3+ years hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines 3+ years of proficiency in working with Snowflake or similar cloud …
Binley Woods, Warwickshire, UK Hybrid / WFH Options
Lorien
/executing tests Requirements Strong experience as a Data Engineer (migrating legacy systems onto AWS, building data pipelines) Strong Python experience Tech stack experience required: AWS Glue, Redshift, Lambda, PySpark, Airflow SSIS or SAS experience (Desirable) Benefits Salary up to £57,500 + up to 20% bonus Hybrid working: once a fortnight in the office 28 days holiday plus …
drive results. What we're looking for: Proven experience in Data Architecture and data modelling. Strong skills in Microsoft Azure tools (Fabric, OneLake, Data Factory). Confident with Python/PySpark and relational databases. Hands-on ETL/ELT experience. A problem-solver with a positive, can-do attitude. Bonus points if you bring: Tableau, Power BI, SSAS, SSIS or …
AIP adoption and improve automation What You'll Bring Experience as a Data & AI Engineer Hands-on experience with Palantir Foundry (data integration, ontology, pipelines, applications) Strong skills in Python, PySpark, SQL, and data modelling Practical experience with AIP features (RAG workflows, copilots, agent-based apps) Ability to work independently and engage with non-technical stakeholders Strong problem-solving mindset …
ll Actually Do: Design & Build Data Pipelines: Take full ownership of designing, building, and managing the full lifecycle of complex data pipelines using Azure Data Factory, Databricks (Python/PySpark), and advanced SQL. Productionise Databricks: Lead the development of robust, scalable solutions on Databricks. This is a role focused on production code, Delta Lake, Structured Streaming, and Spark performance tuning, not … in implementing and managing CI/CD pipelines for data solutions, specifically using Azure DevOps. Expert Programming Skills: Expert-level skills for data transformation and automation, especially in Python (PySpark) and advanced SQL. Data Warehousing & Modelling: Proven experience in data warehousing principles and designing data models (e.g., dimensional, medallion) to support analytics. Exceptional Communication: The ability to translate complex …
Your new company This is a pivotal opportunity to join the Data and Innovation division of a large, complex organisation leading the delivery of SAM (Supervisory Analytics and Metrics), a transformative programme enhancing supervisory decision-making through advanced data and …
Job title: Data Analyst Client: Elite FinTech Salary: £65,000-£100,000 + Bonus Location: London Skills: SQL, Python, PySpark, Airflow, Linux The role: My client are looking for a Data Analyst to join their team. Responsibilities: Playing a key role in all Data-related activities for a wide range of datasets, used by Quants and Traders Working closely … Core skills required: 3+ years working as a Data Analyst, ideally within FinTech or Financial Services Exposure to Derivatives or other Financial instruments Strong SQL experience Strong Python experience (PySpark, Pandas, Jupyter Notebooks, etc.) Airflow/Algo for workflow management Git Any exposure to compressed file formats such as Parquet or HDF5 highly advantageous Linux/Bash skills highly …