Proven experience with ETL/ELT, including Lakehouse, Pipeline Design, and Batch/Stream processing. Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark, and Spark SQL. Good working knowledge of data warehouse and data mart architectures. Good experience in Data Governance, including Unity Catalog, Metadata Management, and Data Lineage…
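Since several of these roles centre on PySpark and Spark SQL for lakehouse-style batch pipelines, here is a minimal, purely illustrative sketch of that pattern. It assumes a Spark environment with Delta Lake available (for example Databricks); all paths and table names are hypothetical placeholders, not anything specified by the listings.

```python
# Illustrative sketch only: a simple batch ETL step into a lakehouse table.
# Assumes a Spark environment with Delta Lake available (e.g. Databricks);
# paths and table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_batch_etl").getOrCreate()

# Extract: raw CSV files landed in the data lake
raw = spark.read.option("header", True).csv("/mnt/landing/orders/")

# Transform: basic typing, cleansing and de-duplication
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropDuplicates(["order_id"])
)

# Load: append into a Delta table (which could be governed via Unity Catalog)
orders.write.format("delta").mode("append").saveAsTable("sales.orders_bronze")

# Spark SQL over the same managed table
spark.sql("SELECT COUNT(*) AS row_count FROM sales.orders_bronze").show()
```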
various departments to gather requirements and ensure data solutions reflect real business needs. Key Experience Required: Deep expertise in SQL, Python, and Spark (particularly PySpark) for building and testing end-to-end pipelines that process both structured and semi-structured datasets. Experience mentoring peers and supporting team growth…
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
various departments to gather requirements and ensure data solutions reflect real business needs. Key Experience Required: Deep expertise in SQL, Python, and Spark (particularly PySpark) for building and testing end-to-end pipelines that process both structured and semi-structured datasets. Experience mentoring peers and supporting team growth…
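As a rough illustration of the "end-to-end pipelines that process both structured and semi-structured datasets" requirement above, the following sketch flattens nested JSON events, joins them to a structured reference table, and includes a minimal assertion as a lightweight test. All schemas, column names, and paths are hypothetical.

```python
# Illustrative sketch only: combine semi-structured JSON with structured
# Parquet data and run a lightweight end-to-end check.
# All column names and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_pipeline").getOrCreate()

# Semi-structured input: JSON events with a nested payload struct
events = spark.read.json("/mnt/raw/events/")

# Structured input: customer reference data
customers = spark.read.parquet("/mnt/curated/customers/")

# Flatten the nested fields we care about
flattened = events.select(
    "event_id",
    F.col("payload.customer_id").alias("customer_id"),
    F.col("payload.value").cast("double").alias("value"),
)

# Enrich events with customer attributes
enriched = flattened.join(customers, on="customer_id", how="left")

# Minimal pipeline test: the join key should be unique on the customer side,
# so a left join must leave the event count unchanged
assert enriched.count() == flattened.count(), "left join changed the row count"

enriched.write.mode("overwrite").parquet("/mnt/curated/enriched_events/")
```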
Newport, Wales, United Kingdom Hybrid / WFH Options
Peaple Talent
various departments to gather requirements and ensure data solutions reflect real business needs. Key Experience Required: Deep expertise in SQL, Python, and Spark (particularly PySpark) for building and testing end-to-end pipelines that process both structured and semi-structured datasets. Experience mentoring peers and supporting team growth…
Bath, South West England, United Kingdom Hybrid / WFH Options
Peaple Talent
various departments to gather requirements and ensure data solutions reflect real business needs. Key Experience Required: Deep expertise in SQL, Python, and Spark (particularly PySpark) for building and testing end-to-end pipelines that process both structured and semi-structured datasets. Experience mentoring peers and supporting team growth…
Bradley Stoke, South West England, United Kingdom Hybrid / WFH Options
Peaple Talent
various departments to gather requirements and ensure data solutions reflect real business needs. Key Experience Required: Deep expertise in SQL, Python, and Spark (particularly PySpark) for building and testing end-to-end pipelines that process both structured and semi-structured datasets. Experience mentoring peers and supporting team growth…
in Data Management, Data Integration, Data Quality, Data Monitoring, and Analytics. Experience leading teams of technologists and managing global stakeholders. Proficiency in Python and PySpark for data engineering tasks. Experience building cloud-native applications using platforms such as AWS, Azure, or GCP, and leveraging cloud services for data storage…
Wakefield, Yorkshire, United Kingdom Hybrid / WFH Options
Flippa.com
/CD) automation, rigorous code reviews, documentation as communication. Preferred Qualifications: Familiarity with data manipulation and experience with Python libraries such as Flask, FastAPI, Pandas, PySpark, and PyTorch, to name a few. Proficiency in statistics and/or machine learning libraries such as NumPy, matplotlib, seaborn, and scikit-learn. Experience in building…
to store and process data. Document workflows, pipelines, and transformation logic for transparency. Key Skills & Experience: Strong hands-on experience in Python (Pandas, NumPy, PySpark). Experience building ETL/ELT processes. Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies (e.g., Snowflake, Databricks). Understanding of…
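To illustrate the "ETL/ELT processes with Python (Pandas, NumPy)" and "document transformation logic" points above, here is a small, hedged sketch. File and column names are hypothetical, and writing Parquet assumes pyarrow (or fastparquet) is installed.

```python
# Illustrative sketch only: a small Pandas-based ETL step with the
# transformation logic documented inline. File and column names are hypothetical.
import pandas as pd
import numpy as np

# Extract: load a raw CSV export
raw = pd.read_csv("raw_sales.csv")

# Transform: parse dates, floor negative amounts at zero, derive a margin column
raw["order_date"] = pd.to_datetime(raw["order_date"], errors="coerce")
raw["amount"] = raw["amount"].clip(lower=0)
raw["margin"] = np.where(raw["amount"] > 0, raw["profit"] / raw["amount"], 0.0)

# Load: drop rows with unparseable dates and write Parquet for downstream use
clean = raw.dropna(subset=["order_date"])
clean.to_parquet("clean_sales.parquet", index=False)
```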
initiatives. Key Requirements of the Database Engineer: Proven experience with Databricks, Azure Data Lake, and Delta Live Tables; strong programming in Python and Spark (PySpark or Scala); solid knowledge of data modelling, warehousing, and integration concepts; comfortable working in Agile teams, with CI/CD and Azure DevOps experience…
Somerset, England, United Kingdom Hybrid / WFH Options
CA Tech Talent
initiatives. Key Requirements of the Database Engineer: Proven experience with Databricks, Azure Data Lake, and Delta Live Tables; strong programming in Python and Spark (PySpark or Scala); solid knowledge of data modelling, warehousing, and integration concepts; comfortable working in Agile teams, with CI/CD and Azure DevOps experience…
Bath, South West England, United Kingdom Hybrid / WFH Options
CA Tech Talent
initiatives. Key Requirements of the Database Engineer: Proven experience with Databricks, Azure Data Lake, and Delta Live Tables; strong programming in Python and Spark (PySpark or Scala); solid knowledge of data modelling, warehousing, and integration concepts; comfortable working in Agile teams, with CI/CD and Azure DevOps experience…
machine learning, with a strong portfolio of relevant projects. Proficiency in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL…
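For the "designing and orchestrating data pipelines with tools like Apache Airflow" point, here is a minimal sketch written against the Airflow 2.x API. The DAG, task names, and schedule are hypothetical and the callables are deliberately stubbed; this is not any listing's actual pipeline.

```python
# Illustrative sketch only: a minimal Apache Airflow DAG that orchestrates an
# extract step followed by a model-scoring step. Task logic is stubbed out
# and all names are hypothetical. Targets the Airflow 2.x API.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_features(**context):
    # Placeholder: pull data from a warehouse or object store
    print("extracting features")


def score_model(**context):
    # Placeholder: load a trained model and write predictions
    print("scoring model")


with DAG(
    dag_id="daily_scoring_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_features", python_callable=extract_features)
    score = PythonOperator(task_id="score_model", python_callable=score_model)

    extract >> score
```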
Delta Lake, Azure SQL, Purview and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python, SQL, Bash, and PySpark for automation. Strong aptitude for data pipeline monitoring and an understanding of data security practices such as RBAC and encryption. Implemented data and pipeline…
Bradford, Yorkshire and the Humber, United Kingdom
Peregrine
Delta Lake, Azure SQL, Purview and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python, SQL, Bash, and PySpark for automation. Strong aptitude for data pipeline monitoring and an understanding of data security practices such as RBAC and encryption. Implemented data and pipeline…
London, South East England, United Kingdom Hybrid / WFH Options
Bounce Digital
and external (eBay APIs) sources. Define data quality rules, set up monitoring/logging, and support architecture decisions. What You Bring: Strong SQL & Python (PySpark); hands-on with GCP or AWS; experience with modern ETL tools (dbt, Airflow, Fivetran); BI experience (Looker, Power BI, Metabase); Git and basic CI…
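As a toy illustration of "define data quality rules, set up monitoring/logging", the sketch below expresses rules as named checks over a DataFrame and logs the results. The rules, columns, and sample data are hypothetical; in practice these checks would often live in dbt tests or a dedicated data-quality framework.

```python
# Illustrative sketch only: express data quality rules as named checks over a
# DataFrame and log the outcome. Rules, columns and sample data are hypothetical.
import logging

import pandas as pd

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("data_quality")

orders = pd.DataFrame(
    {"order_id": [1, 2, 3], "amount": [10.0, 0.0, 25.5], "country": ["GB", "GB", None]}
)

rules = {
    "order_id is unique": lambda df: df["order_id"].is_unique,
    "amount is non-negative": lambda df: (df["amount"] >= 0).all(),
    "country is populated": lambda df: df["country"].notna().all(),
}

failures = 0
for name, check in rules.items():
    if bool(check(orders)):
        logger.info("PASS: %s", name)
    else:
        failures += 1
        logger.warning("FAIL: %s", name)

# A monitoring hook could raise or page here instead of only logging
if failures:
    logger.error("%d data quality rule(s) failed", failures)
```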
Slough, South East England, United Kingdom Hybrid / WFH Options
Bounce Digital
and external (eBay APIs) sources. Define data quality rules, set up monitoring/logging, and support architecture decisions. What You Bring: Strong SQL & Python (PySpark); hands-on with GCP or AWS; experience with modern ETL tools (dbt, Airflow, Fivetran); BI experience (Looker, Power BI, Metabase); Git and basic CI…
London, South East England, United Kingdom Hybrid / WFH Options
Anson McCade
data platforms using Google Cloud Platform. Hands-on experience with GCP tools: BigQuery, Dataform, Dataproc, Composer, Pub/Sub. Strong programming skills in Python, PySpark, and SQL. Deep understanding of data engineering concepts, including ETL, data warehousing, and cloud storage. Strong communication skills with the ability to collaborate across…
Slough, South East England, United Kingdom Hybrid / WFH Options
Anson McCade
data platforms using Google Cloud Platform. Hands-on experience with GCP tools: BigQuery, Dataform, Dataproc, Composer, Pub/Sub. Strong programming skills in Python, PySpark, and SQL. Deep understanding of data engineering concepts, including ETL, data warehousing, and cloud storage. Strong communication skills with the ability to collaborate across…
London, South East England, United Kingdom Hybrid / WFH Options
Anson McCade
data warehouses and data lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, DataProc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and data quality frameworks. Proficiency with BI/reporting tools such as Looker or…
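To give a concrete flavour of the BigQuery work these GCP roles describe, here is a hedged sketch of a parameterised query run from Python. It requires the google-cloud-bigquery package and authenticated GCP credentials; the project, dataset, and table names are hypothetical.

```python
# Illustrative sketch only: run a parameterised query against BigQuery.
# Requires the google-cloud-bigquery package and authenticated GCP credentials;
# project, dataset and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")

query = """
    SELECT country, COUNT(*) AS orders
    FROM `example-analytics-project.sales.orders`
    WHERE order_date >= @start_date
    GROUP BY country
    ORDER BY orders DESC
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01")]
)

for row in client.query(query, job_config=job_config).result():
    print(row["country"], row["orders"])
```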
Slough, South East England, United Kingdom Hybrid / WFH Options
Anson McCade
data warehouses and data lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, DataProc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and data quality frameworks. Proficiency with BI/reporting tools such as Looker or…
added flexibility for diverse migration and integration projects. Prior experience with tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with Data warehousing and Data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks…
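For the star-schema data modelling mentioned above, here is a compact, purely illustrative sketch expressed as Spark SQL DDL issued from Python: one fact table referencing two dimensions. All table and column names are hypothetical.

```python
# Illustrative sketch only: a tiny star schema (one fact table, two dimensions)
# defined through Spark SQL. Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_demo").getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_key BIGINT,
        customer_name STRING,
        country STRING
    ) USING parquet
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_date (
        date_key INT,
        calendar_date DATE,
        calendar_year INT,
        calendar_month INT
    ) USING parquet
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        customer_key BIGINT,   -- references dim_customer
        date_key INT,          -- references dim_date
        quantity INT,
        net_amount DECIMAL(18,2)
    ) USING parquet
""")
```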