Wakefield, Yorkshire, United Kingdom Hybrid / WFH Options
Flippa.com
/CD) automation, rigorous code reviews, documentation as communication. Preferred Qualifications Familiar with data manipulation and experience with Python libraries like Flask, FastAPI, Pandas, PySpark, PyTorch, to name a few. Proficiency in statistics and/or machine learning libraries like NumPy, matplotlib, seaborn, scikit-learn, etc. Experience in building More ❯
storage, data pipelines to ingest and transform data, and querying & reporting of analytical data. You've worked with technologies such as Python, Spark, SQL, Pyspark, PowerBI etc. You're a problem-solver, pragmatically exploring options and finding effective solutions. An understanding of how to design and build well-structured More ❯
Proven experience of ETL/ELT, including Lakehouse, Pipeline Design, Batch/Stream processing. Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark, Spark SQL. Good working knowledge of data warehouse and data mart architectures. Good experience in Data Governance, including Unity Catalog, Metadata Management, Data Lineage More ❯
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
various departments to gather requirements and ensure data solutions reflect real business needs. Key Experience Required: Deep expertise in SQL, Python, and Spark (particularly PySpark) for building and testing end-to-end pipelines that process both structured and semi-structured datasets. Experience mentoring peers and supporting team growth More ❯
and support architectural decisions as a recognised Databricks expert. Essential Skills & Experience: Demonstrable expertise with Databricks and Apache Spark in production environments. Proficiency in PySpark, SQL, and working within one or more cloud platforms (Azure, AWS, or GCP). In-depth understanding of Lakehouse concepts, medallion architecture, and modern More ❯
Cambridge, England, United Kingdom Hybrid / WFH Options
Intellect Group
working days in Cambridge Nice to Have Experience working within a consultancy or client-facing environment Familiarity with tools and frameworks such as: Databricks PySpark Pandas Airflow or dbt Experience deploying solutions using cloud-native services (e.g., BigQuery, AWS Glue, S3, Lambda) What’s On Offer Fully remote working More ❯
of experience in Data Engineering, with a focus on cloud platforms (Azure, AWS, GCP). You have a proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog). You have extensive experience in ETL/ELT development and data pipeline orchestration (Databricks Workflows, DLT, Airflow, ADF More ❯
to store and process data. • Document workflows, pipelines, and transformation logic for transparency. Key Skills & Experience: • Strong hands-on experience in Python (Pandas, NumPy, PySpark). • Experience building ETL/ELT processes. • Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies (e.g., Snowflake, Databricks). • Understanding of More ❯
experience in a Data Engineering role: Passion for data and industry best practices in a dynamic environment. Proficiency in technologies such as Spark/PySpark, Azure Data services, Python or Scala, SQL, testing frameworks, open table formats, CI/CD workflows, and cloud infrastructure management. Excellent communication, analytical, and More ❯
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Low Carbon Contracts Company
in a highly numerate subject is essential Minimum 2 years' experience in Python development, including scientific computing and data science libraries (NumPy, pandas, SciPy, PySpark) Solid understanding of object-oriented software engineering design principles for usability, maintainability and extensibility Experience working with Git in a version-controlled environment Good More ❯
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
ADLIB Recruitment
experience as a Senior Data Engineer, with some experience mentoring others Excellent Python and SQL skills, with hands-on experience building pipelines in Spark (PySpark preferred) Experience with cloud platforms (AWS/Azure) Solid understanding of data architecture, modelling, and ETL/ELT pipelines Experience using tools like Databricks More ❯
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
Knowledge, Skills and Experience: Essential Strong expertise in Databricks and Apache Spark for data engineering and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience with ETL/ELT processes for integrating diverse data More ❯
Salisbury, South West England, United Kingdom Hybrid / WFH Options
Ascentia Partners
encryption best practices. Nice-to-Have Skills Exposure to AWS Redshift, Glue, or Snowflake. Familiarity with BigQuery and Google Analytics APIs. Proficiency in Python, PySpark, or dbt for data transformations. Background in insurance, especially in pricing analytics or actuarial data. Click Apply More ❯
Cambridge, England, United Kingdom Hybrid / WFH Options
Intellect Group
Azure) Strong communication skills and a proactive attitude UK-based with the option to attend weekly meet-ups in Cambridge Nice to Have Databricks PySpark Pandas These are not essential but will be highly beneficial in this role. What’s On Offer Fully remote working with flexible hours Weekly More ❯
to Octopus offices across Europe and the US. Our Data Stack: SQL-based pipelines built with dbt on Databricks Analysis via Python Jupyter notebooks PySpark in Databricks workflows for heavy lifting Streamlit and Python for dashboarding Airflow DAGs with Python for ETL running on Kubernetes and Docker Django for More ❯
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Low Carbon Contracts Company
a highly numerate subject is essential At least 2 years of Python development experience, including scientific computing and data science libraries (NumPy, pandas, SciPy, PySpark) Strong understanding of object-oriented design principles for usability and maintainability Experience with Git in a version-controlled environment Knowledge of parallel computing techniques More ❯
machine learning, with a strong portfolio of relevant projects. Proficiency in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL More ❯