Wakefield, Yorkshire, United Kingdom Hybrid / WFH Options
Flippa.com
CI/CD automation, rigorous code reviews, documentation as communication. Preferred Qualifications: Familiarity with data manipulation and experience with Python libraries such as Flask, FastAPI, Pandas, PySpark and PyTorch, to name a few. Proficiency in statistics and/or machine learning libraries like NumPy, matplotlib, seaborn, scikit-learn, etc. Experience in building More ❯
Proven experience of ETL/ELT, including Lakehouse, Pipeline Design, Batch/Stream processing. Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark, Spark SQL. Good working knowledge of data warehouse and data mart architectures. Good experience in Data Governance, including Unity Catalog, Metadata Management, Data Lineage More ❯
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
various departments to gather requirements and ensure data solutions reflect real business needs. Key Experience Required: Deep expertise in SQL, Python, and Spark (particularly PySpark) for building and testing end-to-end pipelines that process both structured and semi-structured datasets. Experience mentoring peers and supporting team growth More ❯
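As an illustration of the kind of pipeline this listing describes, below is a minimal PySpark sketch that combines a structured CSV extract with semi-structured JSON events; the file paths, column names (including `region` on the customer extract) and output location are all hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Structured source: a CSV extract with a declared header row (placeholder path).
customers = spark.read.option("header", True).csv("/data/raw/customers.csv")

# Semi-structured source: newline-delimited JSON events with a nested payload.
events = spark.read.json("/data/raw/events/*.json")

# Flatten the nested payload and keep only the fields the downstream model needs.
orders = events.select(
    F.col("customer_id"),
    F.col("payload.order_id").alias("order_id"),
    F.col("payload.amount").cast("double").alias("amount"),
)

# Join, aggregate and write a partitioned Parquet output for the warehouse load.
spend_by_region = (
    orders.join(customers, "customer_id", "left")
          .groupBy("customer_id", "region")
          .agg(F.sum("amount").alias("total_spend"))
)
spend_by_region.write.mode("overwrite").partitionBy("region").parquet("/data/curated/spend_by_region")
```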
Cambridge, England, United Kingdom Hybrid / WFH Options
Intellect Group
working days in Cambridge. Nice to Have: Experience working within a consultancy or client-facing environment. Familiarity with tools and frameworks such as Databricks, PySpark, Pandas, and Airflow or dbt. Experience deploying solutions using cloud-native services (e.g., BigQuery, AWS Glue, S3, Lambda). What’s On Offer: Fully remote working More ❯
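Since this listing names Airflow or dbt for orchestration, here is a minimal Airflow sketch of a daily ingest-then-transform dependency; the DAG id and task bodies are placeholders and the transform step is assumed to hand off to a PySpark job elsewhere.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    # Placeholder: pull raw files from the source system into cloud storage.
    pass


def transform():
    # Placeholder: trigger the downstream Databricks/PySpark transformation job.
    pass


with DAG(
    dag_id="daily_ingest",            # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Transform only runs once the day's ingest has completed.
    ingest_task >> transform_task
```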
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Low Carbon Contracts Company
in a highly numerate subject is essential. Minimum 2 years' experience in Python development, including scientific computing and data science libraries (NumPy, pandas, SciPy, PySpark). Solid understanding of object-oriented software engineering design principles for usability, maintainability and extensibility. Experience working with Git in a version-controlled environment. Good More ❯
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
ADLIB Recruitment
experience as a Senior Data Engineer, with some experience mentoring others. Excellent Python and SQL skills, with hands-on experience building pipelines in Spark (PySpark preferred). Experience with cloud platforms (AWS/Azure). Solid understanding of data architecture, modelling, and ETL/ELT pipelines. Experience using tools like Databricks More ❯
Azure Databricks; Azure Function Apps & Logic Apps; Azure Stream Analytics; Azure Resource Manager tools: Terraform, Azure Portal, Azure CLI, and Azure PowerShell. Proficient in PySpark, Delta Lake, Unity Catalog, and Python. Ability to write unit and integration tests using unittest, pytest, etc. Solid understanding of software engineering principles, including More ❯
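The unit- and integration-testing requirement can be illustrated with a small pytest sketch: a pure PySpark transformation tested against a local SparkSession. The function, column names and VAT rate are invented for the example.

```python
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def add_vat(df, rate=0.2):
    """Hypothetical transformation under test: derive a gross_amount column."""
    return df.withColumn("gross_amount", F.col("net_amount") * (1 + rate))


@pytest.fixture(scope="session")
def spark():
    # A single-threaded local session is enough for fast unit tests.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_add_vat(spark):
    df = spark.createDataFrame([(100.0,)], ["net_amount"])
    result = add_vat(df).collect()[0]
    assert result["gross_amount"] == pytest.approx(120.0)
```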
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
Knowledge, Skills and Experience: Essential: Strong expertise in Databricks and Apache Spark for data engineering and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience with ETL/ELT processes for integrating diverse data More ❯
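For the Delta Lake optimisation point, a common Databricks approach is file compaction plus Z-ordering on a frequently filtered column, followed by a vacuum. This is a minimal sketch assuming a Delta-enabled runtime; the table and column names are placeholders.

```python
from pyspark.sql import SparkSession

# On Databricks/Synapse a session is already provided; this also works standalone.
spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows on a commonly filtered column.
spark.sql("OPTIMIZE sales.orders ZORDER BY (order_date)")

# Remove data files no longer referenced by the table (default retention applies).
spark.sql("VACUUM sales.orders")
```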
Salisbury, England, United Kingdom Hybrid / WFH Options
Ascentia Partners
encryption best practices. Nice-to-Have Skills: Exposure to AWS Redshift, Glue, or Snowflake. Familiarity with BigQuery and Google Analytics APIs. Proficiency in Python, PySpark, or dbt for data transformations. Background in insurance, especially in pricing analytics or actuarial data. Click Apply More ❯
Cambridge, England, United Kingdom Hybrid / WFH Options
Intellect Group
Azure) Strong communication skills and a proactive attitude. UK-based with the option to attend weekly meet-ups in Cambridge. Nice to Have: Databricks, PySpark, Pandas. These are not essential but will be highly beneficial in this role. What’s On Offer: Fully remote working with flexible hours, Weekly More ❯
The ability to problem-solve. Knowledge of AWS or equivalent cloud technologies. Knowledge of serverless technologies, frameworks and best practices. Apache Spark (Scala or PySpark). Experience using AWS CloudFormation or Terraform for infrastructure automation. Knowledge of Scala or an OO language such as Java or C#. SQL or Python development More ❯
a highly numerate subject is essential. At least 2 years of Python development experience, including scientific computing and data science libraries (NumPy, pandas, SciPy, PySpark). Strong understanding of object-oriented design principles for usability and maintainability. Experience with Git in a version-controlled environment. Knowledge of parallel computing techniques More ❯
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Low Carbon Contracts Company
a highly numerate subject is essential. At least 2 years of Python development experience, including scientific computing and data science libraries (NumPy, pandas, SciPy, PySpark). Strong understanding of object-oriented design principles for usability and maintainability. Experience with Git in a version-controlled environment. Knowledge of parallel computing techniques More ❯
Newcastle upon Tyne, Tyne & Wear Hybrid / WFH Options
Client Server
AAB grades at A-level. You have commercial Data Engineering experience working with technologies such as SQL, Apache Spark and Python, including PySpark and Pandas. You have a good understanding of modern data engineering best practices. Ideally you will also have experience with Azure and Databricks More ❯
performance. Utilise Azure Databricks and adhere to code-based deployment practices. Essential Skills: Over 3 years of experience with Databricks (including Lakehouse, Delta Lake, PySpark, Spark SQL). Strong proficiency in SQL with 5+ years of experience. Extensive experience with Azure Data Factory. Proficiency in Python programming. Excellent stakeholder/ More ❯
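To make the Lakehouse/Delta Lake requirement concrete, here is a hedged Spark SQL sketch of an incremental upsert from a staged increment into a curated Delta table; the mount path, table names and key column are illustrative only.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # assumes a Delta-enabled Databricks runtime

# Stage today's increment (hypothetical path and schema).
spark.read.parquet("/mnt/landing/customers/latest").createOrReplaceTempView("staged_customers")

# Upsert into the curated Delta table keyed on customer_id.
spark.sql("""
    MERGE INTO curated.customers AS target
    USING staged_customers AS source
    ON target.customer_id = source.customer_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```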
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Peaple Talent
Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using Big Data technologies such as Apache Spark or PySpark. Great communication skills, effectively participating with Senior Stakeholders. Nice to haves: Azure/AWS Data Engineering certifications, Databricks certifications. What's in it for More ❯
London, South East England, United Kingdom Hybrid / WFH Options
Peaple Talent
Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using Big Data technologies such as Apache Spark or PySpark. Great communication skills, effectively participating with Senior Stakeholders. Nice to haves: Azure/AWS Data Engineering certifications, Databricks certifications. What's in it for More ❯
our ambitious data initiatives and future projects. IT Manager - Microsoft Azure - The Skills You'll Need to Succeed: Mastery of Databricks, Python/PySpark and SQL/SparkSQL. Experience in Big Data/ETL (Spark and Databricks preferred). Expertise in Azure. Proficiency with version control (Git More ❯
environment with one or more modern programming languages and database querying languages. Proficiency in coding one or more languages such as Java, Python or PySpark. Experience in Cloud implementation with AWS Data Services, Glue ETL (or) EMR, S3, Glue Catalog, Athena, Lambda, Step Functions, EventBridge, ECS, Data De More ❯
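As a sketch of the Glue ETL pattern listed here: a minimal job that reads a catalogued table, drops rows with a missing key, and writes Parquet back to S3 for Athena to query. The database, table and bucket names are placeholders, and the job assumes the standard Glue job arguments.

```python
import sys

from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (names are placeholders).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw", table_name="orders"
)

# Drop rows missing the business key using the plain Spark DataFrame API.
cleaned = DynamicFrame.fromDF(
    orders.toDF().dropna(subset=["order_id"]), glue_context, "cleaned"
)

# Write curated Parquet back to S3.
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)

job.commit()
```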
Lake, Synapse, Power BI) Required Skills & Experience: Proven experience as a Data Architect in enterprise environments. Extensive hands-on experience with Databricks (including SQL, PySpark, Delta Lake). Solid background in data warehousing, data lakes, and big data frameworks. Strong knowledge of Azure cloud services, especially in data integration. Experience More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
cloud-based architectures on AWS, Azure, or GCP. A strong background in Databricks for building enterprise data warehouses, with solid knowledge of SQL and PySpark. A deep understanding of data modelling and advanced pipeline development. Knowledge and understanding of best-practice CI/CD approaches utilising Git. Strategic Vision More ❯
languages e.g. Python, R, Scala, etc. (Python preferred). Proficiency in database technologies e.g. SQL, ETL, NoSQL, DW, and Big Data technologies e.g. PySpark, Hive, etc. Experience working with both structured and unstructured data, e.g. text, PDFs, JPGs, call recordings, video, etc. Knowledge of machine learning modelling techniques More ❯
experience with data warehousing, ETL/ELT, integration tools, and BI solutions. Expertise with Synapse (notebooks and data flows) is essential. Working knowledge of PySpark is essential. Experience with Azure Data Lake Storage is essential. Understanding of medallion data lakehouse architecture is essential. Strong problem-solving skills for complex More ❯
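The medallion (bronze/silver/gold) lakehouse pattern mentioned in this listing can be sketched roughly as below; the ADLS paths, columns and quality rules are invented, and the code assumes a Synapse or Databricks Spark pool with Delta support.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided by the Synapse/Databricks runtime

base = "abfss://lake@examplestorage.dfs.core.windows.net"  # placeholder container

# Bronze: land the raw files as-is, adding only an ingestion timestamp.
raw = spark.read.json(f"{base}/landing/transactions/")
(raw.withColumn("ingested_at", F.current_timestamp())
    .write.format("delta").mode("append").save(f"{base}/bronze/transactions"))

# Silver: de-duplicate and apply basic quality rules.
bronze = spark.read.format("delta").load(f"{base}/bronze/transactions")
silver = (bronze.dropDuplicates(["transaction_id"])
                .filter(F.col("amount").isNotNull()))
silver.write.format("delta").mode("overwrite").save(f"{base}/silver/transactions")

# Gold: business-level aggregates ready for Power BI.
gold = silver.groupBy("account_id").agg(F.sum("amount").alias("total_amount"))
gold.write.format("delta").mode("overwrite").save(f"{base}/gold/account_totals")
```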
London, South East England, United Kingdom Hybrid / WFH Options
La Fosse
Collaborate across technical and non-technical teams. Troubleshoot issues and support wider team adoption of the platform. What You’ll Bring: Proficiency in Python, PySpark, Spark SQL or Java. Experience with cloud tools (Lambda, S3, EKS, IAM). Knowledge of Docker, Terraform, GitHub Actions. Understanding of data quality frameworks. Strong More ❯
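A lightweight illustration of the "data quality frameworks" point: completeness and uniqueness checks expressed directly in PySpark. The dataset path and thresholds are hypothetical, and in practice a library such as Great Expectations or Deequ might sit behind checks like these.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/trades/")  # placeholder path

total_rows = df.count()
checks = {
    # Completeness: no more than 1% of rows may be missing a trade_id.
    "trade_id_completeness":
        df.filter(F.col("trade_id").isNull()).count() <= 0.01 * total_rows,
    # Uniqueness: trade_id must be unique across the dataset.
    "trade_id_uniqueness":
        total_rows == df.select("trade_id").distinct().count(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Failing fast keeps bad data out of downstream tables and dashboards.
    raise ValueError(f"Data quality checks failed: {failed}")
```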
London, South East England, United Kingdom Hybrid / WFH Options
Insight Global
transformation logic, and build of all Power BI dashboards - including testing, optimization & integration with data sources. Good exposure to all elements of data engineering (PySpark, Lakehouses, Kafka, etc.). Experience building reports from streaming data. Strong understanding of CI/CD pipelines. Financial and/or trading exposure, particularly energy More ❯
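For the streaming-data aspect of this listing, here is a minimal Spark Structured Streaming sketch that reads trade events from Kafka and maintains a windowed aggregate a dashboard could sit on. The broker address, topic name and event schema are placeholders, and the cluster is assumed to have the spark-sql-kafka package available.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("trades-stream").getOrCreate()

schema = StructType([
    StructField("trade_id", StringType()),
    StructField("commodity", StringType()),
    StructField("price", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from Kafka and parse the JSON payload.
trades = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
          .option("subscribe", "trades")                      # placeholder topic
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("t"))
          .select("t.*"))

# Five-minute average price per commodity, tolerating ten minutes of late data.
avg_prices = (trades.withWatermark("event_time", "10 minutes")
              .groupBy(F.window("event_time", "5 minutes"), "commodity")
              .agg(F.avg("price").alias("avg_price")))

query = (avg_prices.writeStream.outputMode("update")
         .format("console")  # a Lakehouse or Power BI-facing sink would be used in practice
         .start())
query.awaitTermination()
```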