as Flask, Django, or FastAPI. Proficiency in Python 3.x and libraries like Pandas, NumPy, and Dask. Experience with data manipulation and processing frameworks (e.g., PySpark, Apache Beam). Strong knowledge of databases, including SQL and NoSQL (e.g., PostgreSQL, MongoDB). Familiarity with ETL processes and tools such as Airflow.
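Several listings above ask for hands-on ETL experience in Python. The extract–transform–load shape they refer to can be sketched with the standard library alone; the schema, field names, and business rule below are invented for illustration, not taken from any listing:

```python
import csv
import io
import sqlite3

# Extract: parse CSV input (an in-memory string stands in for a real file).
raw = "name,amount\nalice,10\nbob,-3\ncarol,7\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop invalid records and normalise types.
clean = [
    {"name": r["name"].title(), "amount": int(r["amount"])}
    for r in rows
    if int(r["amount"]) >= 0  # illustrative rule: reject negative amounts
]

# Load: write the cleaned records into a SQL table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (name TEXT, amount INTEGER)")
conn.executemany("INSERT INTO payments VALUES (:name, :amount)", clean)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 17
```

In production the same three stages would typically run on PySpark or Apache Beam with Airflow scheduling them, but the stage boundaries stay the same.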
Hands-on experience with Azure Databricks, Delta Lake, Data Factory, and Synapse. Strong understanding of Lakehouse architecture and medallion design patterns. Proficient in Python, PySpark, and SQL (advanced query optimization). Experience building scalable ETL pipelines and data transformations. Knowledge of data quality frameworks and monitoring. Experience with Git.
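The medallion design pattern mentioned above layers data as bronze (raw), silver (cleaned), and gold (business-ready), normally on Delta Lake tables. The layering idea itself can be sketched in plain Python; the records and cleaning rules here are illustrative assumptions:

```python
# Bronze: raw records land as-is, including duplicates and bad values.
bronze = [
    {"id": 1, "temp_c": "21.5"},
    {"id": 1, "temp_c": "21.5"},   # duplicate ingestion
    {"id": 2, "temp_c": "bad"},    # malformed reading
    {"id": 3, "temp_c": "19.0"},
]

# Silver: deduplicate on id and enforce types; unparseable rows are dropped.
seen, silver = set(), []
for rec in bronze:
    try:
        val = float(rec["temp_c"])
    except ValueError:
        continue
    if rec["id"] not in seen:
        seen.add(rec["id"])
        silver.append({"id": rec["id"], "temp_c": val})

# Gold: a business-level aggregate ready for reporting.
gold = {"avg_temp_c": sum(r["temp_c"] for r in silver) / len(silver)}
print(gold)  # {'avg_temp_c': 20.25}
```

On Databricks each layer would be a separate Delta table written by a PySpark job, which is what gives the pattern its auditability: bronze is never mutated, so silver and gold can always be rebuilt.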
City of London, London, United Kingdom (Hybrid / WFH options)
Noir
Data Engineer - Leading Energy Company - London (Tech Stack: Data Engineer, Databricks, Python, PySpark, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) Company Overview: Join a dynamic team at a leading player in the energy sector, committed to innovation and sustainable solutions. Our client is seeking a talented Data Engineer…
and performance. Work within Azure Databricks and follow code-based deployment practices. Must-Have Skills: 3 years of experience with Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL); strong SQL skills (5 years); experience with Azure, focusing on Databricks; excellent client-facing communication skills; experience deploying Databricks pipelines and jobs.
System Integration, Application Development or Data-Warehouse projects, across technologies used in the enterprise space. Software development experience using object-oriented languages (e.g., Python, PySpark) and frameworks. Stakeholder management. Expertise in relational and dimensional modelling, including big data technologies. Exposure across the full SDLC, including testing and deployment.
machine learning, with a strong portfolio of relevant projects. Proficiency in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL…
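Orchestrators such as Apache Airflow model a pipeline as a directed acyclic graph of dependent tasks and derive a valid execution order from it. That core idea can be illustrated with the standard library's graphlib; the task names are invented for the example:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on,
# mirroring how an Airflow DAG wires upstream dependencies.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "train_model": {"transform"},
    "publish_report": {"transform"},
}

# static_order() yields the tasks in a dependency-respecting order:
# extract -> validate -> transform, then the two downstream tasks
# (which are independent and may appear in either order).
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Airflow adds scheduling, retries, and parallel execution of independent branches on top, but the dependency resolution is exactly this topological sort.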
DevOps and Infrastructure-as-Code (ideally using Terraform). Take ownership of system observability, stability, and documentation. Requirements: Strong experience in Python (especially Pandas and PySpark) and SQL; proven expertise in building data pipelines and working with Databricks and Lakehouse environments; deep understanding of Azure (or similar cloud platforms), including…
City of London, England, United Kingdom (Hybrid / WFH options)
Anson McCade
data platforms using Google Cloud Platform. Hands-on experience with GCP tools: BigQuery, Dataform, Dataproc, Composer, Pub/Sub. Strong programming skills in Python, PySpark, and SQL. Deep understanding of data engineering concepts, including ETL, data warehousing, and cloud storage. Strong communication skills with the ability to collaborate across…
City of London, England, United Kingdom (Hybrid / WFH options)
Anson McCade
data warehouses and data lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, DataProc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and data quality frameworks. Proficiency with BI/reporting tools such as Looker or…
team solving real-world trading challenges. What We’re Looking For: 8+ years of professional experience in Python application development; solid knowledge of Pandas, PySpark, and modern testing (PyTest); strong background in Azure cloud services (Databricks, ADF, Key Vaults, etc.); experience developing web UIs using ReactJS and TypeScript; familiarity with…
City of London, London, United Kingdom (Hybrid / WFH options)
Mars
on a multi-year digital transformation journey where your work will unlock real impact. 🌟 What you'll do: Build robust data pipelines using Python, PySpark, and cloud-native tools; engineer scalable data models with Databricks, Delta Lake, and Azure tech; collaborate with analysts, scientists, and fellow engineers to deliver…
complex data challenges that have wide-reaching impact across multiple business domains. Key Requirements: Strong experience with AWS data engineering tools (e.g., Glue, Athena, PySpark, Lake Formation); solid skills in Python and SQL for data processing and analysis; deep understanding of data governance, quality, and security; a passion for…
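"Data quality" requirements like the one above usually reduce to rule-based checks applied to every incoming record, with failures routed to monitoring. A minimal sketch of the pattern, where the rules, field names, and records are assumptions made up for the example:

```python
# Each rule is a (description, predicate) pair; a record fails a rule
# when the predicate returns False.
rules = [
    ("id present", lambda r: r.get("id") is not None),
    ("amount non-negative",
     lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0),
    ("currency known", lambda r: r.get("currency") in {"GBP", "USD", "EUR"}),
]

records = [
    {"id": 1, "amount": 120, "currency": "GBP"},
    {"id": None, "amount": 5, "currency": "USD"},
    {"id": 3, "amount": -2, "currency": "EUR"},
]

# Collect the broken rules per record id, ready to feed a monitoring alert.
failures = {}
for rec in records:
    broken = [name for name, check in rules if not check(rec)]
    if broken:
        failures[rec["id"]] = broken

print(failures)  # {None: ['id present'], 3: ['amount non-negative']}
```

Frameworks such as Great Expectations or AWS Glue Data Quality package the same idea with declarative rule definitions and reporting.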
at scale. Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming skills in PySpark, Python, and SQL. Proficiency in ETL processes, data mining, and data storage principles. Experience with BI and data visualisation tools, such as Looker or…
City of London, England, United Kingdom (Hybrid / WFH options)
Anson McCade
company Data & AI focus. Hands-on experience with GCP services such as BigQuery, Dataproc, Dataform, Composer, and Pub/Sub; strong programming skills in PySpark, Python, and SQL; solid understanding of ETL/ELT processes and data quality at scale; excellent communication and documentation skills. Desirable Experience: Exposure to…
City of London, England, United Kingdom (Hybrid / WFH options)
Anson McCade
and data lake patterns, including ingestion, governance, and quality. Strong technical skills in BigQuery, DataProc, Dataform, Composer, Pub/Sub. Fluent in Python, PySpark, and SQL. Experience with BI tools like Looker or Power BI. Strong client-facing and communication skills, able to lead conversations with both…
coding practices. Required Technical Skills: Proven experience in data warehouse architecture and implementation; expertise in designing and configuring Azure-based deployment pipelines; SQL, Python, PySpark; Azure Data Lake + Databricks; a traditional ETL tool. This is an excellent opportunity for a talented Senior Data Engineer to join a business who are…
City of London, London, United Kingdom (Hybrid / WFH options)
La Fosse
Collaborate across technical and non-technical teams. Troubleshoot issues and support wider team adoption of the platform. What You’ll Bring: Proficiency in Python, PySpark, Spark SQL or Java; experience with cloud tools (Lambda, S3, EKS, IAM); knowledge of Docker, Terraform, GitHub Actions; understanding of data quality frameworks; strong…
and reliable operations. Required: 6-10 years of experience in software development with a focus on production-grade code. Proficiency in Java, Python, and PySpark; experience with C++ is a plus. Deep expertise in Azure services, including Azure Storage, and familiarity with AWS S3. Strong understanding of data security…
City of London, London, United Kingdom (Hybrid / WFH options)
Hexegic
to code, and modify monitoring set-ups. Communicating outages to end users. What we are looking for: Comfortable reading and writing code in Python, PySpark and Java; basic understanding of Spark; ability to navigate pipeline development tools. What’s in it for you? Base salary of…
City of London, England, United Kingdom (Hybrid / WFH options)
McGregor Boyall
Science, Data Science, Mathematics, or related field. 5+ years of experience in ML modeling, ranking, or recommendation systems. Proficiency in Python, SQL, Spark, PySpark, TensorFlow. Strong knowledge of LLM algorithms and training techniques. Experience deploying models in production environments. Nice to Have: Experience in GenAI/…
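Ranking and recommendation work like the role above is commonly evaluated with top-k metrics such as precision@k. A minimal pure-Python version, with invented item IDs standing in for real recommendations:

```python
def precision_at_k(ranked, relevant, k):
    """Fraction of the top-k ranked items that are actually relevant."""
    top_k = ranked[:k]
    return sum(1 for item in top_k if item in relevant) / k

# The model's ranked recommendations vs. items the user actually engaged with.
ranked = ["a", "b", "c", "d", "e"]
relevant = {"a", "c", "e", "g"}

print(precision_at_k(ranked, relevant, 3))  # 2/3: 'a' and 'c' are in the top 3
```

In practice the same calculation runs per user over a holdout set (often as a PySpark aggregation) and is averaged, alongside order-sensitive metrics such as NDCG.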
Future Talent Pool - GCP Data Engineer, London, hybrid role - digital Google Cloud transformation programme. Proficiency in programming languages such as Python, PySpark and Java to develop ETL processes for data ingestion and preparation. SparkSQL; Cloud Run, Dataflow, Cloud Storage; GCP BigQuery; Google Cloud Platform; Data Studio; Unix/Linux platform; version control tools…
City of London, London, United Kingdom (Hybrid / WFH options)
Syntax Consultancy Limited
with proficiency in designing and implementing CI/CD pipelines in cloud environments. Excellent practical expertise in performance tuning and system optimisation. Experience with PySpark and Azure Databricks for distributed data processing and large-scale data analysis. Proven experience with web frameworks, including knowledge of Django and experience with…
City of London, London, United Kingdom (Hybrid / WFH options)
eTeam
office. Contract duration: 6 months. Location: London. Rate: £300 to £340 per day (inside IR35). Required Technical Skills: Palantir Foundry, ETL, Spark, Python, PySpark, Informatica, AWS, SQL, PL/SQL, shell scripting, data lake, data warehousing, Scala, Oracle, MS SQL Server, Power BI.