and data stores to support organizational requirements. Skills/Qualifications: 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.; 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines; 3+ years of proficiency in working with Snowflake or similar cloud …
on knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar. Experience building, defining, and owning data models, data lakes, and data warehouses. Programming proficiency in Python, PySpark, Scala or Java. Experience operating in a cloud-native environment (e.g. Fabric, AWS, GCP, or Azure). Excellent stakeholder management and communication skills. A strategic mindset, with a practical …
Employment Type: Permanent
Salary: £80000 - £95000/annum Attractive Bonus and Benefits
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
delivering enterprise-grade data platforms on GCP, AWS, or Azure. Deep expertise in data modeling, data warehousing, distributed systems, and modern data lake architectures. Advanced proficiency in Python (including PySpark) and SQL, with experience building scalable data pipelines and analytics workflows. Strong background in cloud-native data infrastructure (e.g., BigQuery, Redshift, Snowflake, Databricks). Demonstrated ability to lead teams, set …
data engineering and reporting, including storage, data pipelines to ingest and transform data, and querying and reporting of analytical data. You've worked with technologies such as Python, Spark, SQL, PySpark, Power BI etc. You're a problem-solver, pragmatically exploring options and finding effective solutions. An understanding of how to design and build well-structured, maintainable systems. Strong communication skills …
City of London, London, United Kingdom Hybrid / WFH Options
Nigel Frank International
an impact, get in touch ASAP as interviews are already taking place. Don't miss out! Key Skills: AWS, Data, Architecture, Data Engineering, Data Warehousing, Data Lakes, Databricks, Glue, PySpark, Athena, Python, SQL, Machine Learning, London
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
development). Strong experience with CI/CD tools and pipelines for data science. Solid understanding of AWS services (e.g. EC2, S3, Lambda, Glue) and CDK. Proficient in Python and PySpark; SQL fluency. Experience with MLflow or other model lifecycle tools. Effective communicator and trainer - able to help others upskill. Comfortable building internal tools and documentation. Nice to Have: Experience …
Rochdale, Greater Manchester, North West, United Kingdom Hybrid / WFH Options
Footasylum Ltd
knowledge sharing sessions and self-development. About You: Experience with finance/financial systems and concepts. Azure Databricks. Azure Data Factory. Excellent SQL skills. Good Python/Spark/PySpark skills. Experience of Kimball methodology and star schemas (dimensional model). Experience of working with enterprise data warehouse solutions. Experience of working with structured and unstructured data. Experience of …
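The Kimball-style star schema asked for above pairs a central fact table with denormalised dimension tables, so measures can be rolled up along any dimension attribute. A minimal sketch in plain Python (table contents and column names are invented for illustration, not taken from the listing):

```python
# Illustrative star schema: one fact table keyed into two dimensions.
dim_product = {
    1: {"product_name": "Trainers", "category": "Footwear"},
    2: {"product_name": "Hoodie", "category": "Apparel"},
}
dim_date = {
    20240101: {"year": 2024, "month": 1},
}
fact_sales = [
    {"date_key": 20240101, "product_key": 1, "units": 3, "revenue": 180.0},
    {"date_key": 20240101, "product_key": 2, "units": 1, "revenue": 45.0},
]

def revenue_by_category(facts, products):
    """Aggregate the fact table along one dimension attribute
    (the dimensional-model equivalent of GROUP BY category)."""
    totals = {}
    for row in facts:
        category = products[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["revenue"]
    return totals

print(revenue_by_category(fact_sales, dim_product))
# {'Footwear': 180.0, 'Apparel': 45.0}
```

In a real warehouse the same join/aggregate would be expressed in SQL or Spark against conformed dimensions; the point is only the fact/dimension split.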
Medallion Architecture to an Azure data warehouse. Advanced knowledge of CI/CD pipelines, deployment automation, Infrastructure as Code and work management within Azure DevOps. Knowledge of development within Databricks, PySpark, Delta Lake, Unity Catalog and notebook development. Demonstrable experience in SQL, T-SQL, JSON, Python and data consumption via APIs, with some understanding of DAX and PowerShell. Understanding and …
Reading, Berkshire, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
tools to build robust data pipelines and applications that process complex datasets from multiple operational systems. Key Responsibilities: Build and maintain AWS-based ETL/ELT pipelines using S3, Glue (PySpark/Python), Lambda, Athena, Redshift, and Step Functions. Develop backend applications to automate and support compliance reporting. Process and validate complex data formats including nested JSON, XML, and CSV …
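Processing the nested JSON mentioned above usually means flattening it into columnar-friendly records before loading into Athena or Redshift. The core idea can be sketched with the standard library alone (the field names and sample payload are hypothetical, not from the listing):

```python
import json

def flatten(record, parent_key="", sep="."):
    """Recursively flatten nested dicts into dotted column names,
    as you would before writing to a columnar store."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

raw = json.loads('{"id": 7, "meter": {"site": "RDG-01", "reading": {"kwh": 42.5}}}')
print(flatten(raw))
# {'id': 7, 'meter.site': 'RDG-01', 'meter.reading.kwh': 42.5}
```

In Glue itself the equivalent step would typically use Spark's schema handling rather than hand-rolled recursion; this sketch only shows the transformation being asked for.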
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Forward Role
financial datasets. Python experience, particularly for data processing and ETL workflows. Hands-on experience with cloud platforms (Azure). Experience designing and maintaining data pipelines using tools like Databricks and PySpark. Knowledge of data warehousing solutions - Snowflake experience would be brilliant. Understanding of CI/CD processes for deploying data solutions. Some exposure to big data technologies and distributed processing …
Bachelor's degree preferred. Minimum one (1) year of commercial experience; SQL DBA/Data Engineer preferred. Familiar with data engineering applications such as Databricks and Azure Data Factory. Basic PySpark and/or Python programming knowledge. Familiar with ETL processes and techniques. Administration of Microsoft SQL Server 2017/2019/2022 preferred. Basic T-SQL programming preferred. Database …
Azure, especially Synapse, ADF and Power BI (Datasets and Reports). Ideally SSIS, SSRS, SSAS, with some understanding of Power App design and delivery. Expert in SQL and Python (PySpark); any other object-oriented language skills would be a benefit. Expert in data modelling and data architecture concepts. Experience of setup and management of code management & deployment tools …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
MYO Talent
Engineer/Data Engineering role Large and complex datasets Azure, Azure Databricks Microsoft SQL Server Lakehouse, Delta Lake Data Warehousing ETL CDC Stream Processing Database Design ML Python/PySpark Azure Blob Storage Parquet Azure Data Factory Desirable: Any exposure working in a software house, consultancy, retail or retail automotive sector would be beneficial but not essential.
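The change data capture (CDC) named in these requirements is typically applied as an upsert/delete merge of change events into a keyed target table (in Delta Lake, via MERGE INTO). The merge logic itself, stripped of any engine, can be sketched as follows (the operation codes and record shapes are invented for the example):

```python
def apply_cdc(target, changes):
    """Apply a batch of CDC events to a key-indexed target table.
    'op' codes are illustrative: I = insert, U = update, D = delete."""
    for event in changes:
        key = event["id"]
        if event["op"] == "D":
            target.pop(key, None)   # delete, tolerating already-absent keys
        else:
            target[key] = event["row"]  # insert or update is a plain upsert
    return target

table = {1: {"name": "old"}, 2: {"name": "keep"}}
batch = [
    {"op": "U", "id": 1, "row": {"name": "new"}},
    {"op": "D", "id": 2},
    {"op": "I", "id": 3, "row": {"name": "added"}},
]
print(apply_cdc(table, batch))
# {1: {'name': 'new'}, 3: {'name': 'added'}}
```

A production pipeline would also handle event ordering and late-arriving changes; this only shows the merge semantics the keyword refers to.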
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Amtis Professional Ltd
Troubleshoot issues and continuously improve data infrastructure Explore AI-driven enhancements to boost data accuracy and productivity Requirements: Strong experience with: Azure Databricks, Data Factory, Blob Storage Python/PySpark SQL Server, Parquet, Delta Lake Deep understanding of: ETL/ELT, CDC, stream processing Lakehouse architecture and data warehousing Scalable pipeline design and database optimisation A proactive mindset, strong …
Atherstone, Warwickshire, West Midlands, United Kingdom Hybrid / WFH Options
Aldi Stores
end-to-end ownership of demand delivery. Provide technical guidance for team members. Providing 2nd or 3rd level technical support. About You: Experience using SQL, SQL Server DB, Python & PySpark. Experience using Azure Data Factory. Experience using Databricks and Cloudsmith. Data warehousing experience. Project management experience. The ability to interact with the operational business and other departments, translating …
mitigating any risks, issues or control weaknesses that arise in your day-to-day. What we're looking for: Strong experience in dbt. Proficiency in SQL/Python/PySpark and/or other languages relevant to data processing. Familiarity with cloud platforms like AWS, GCP or Azure (desirable). Excellent problem-solving skills and attention to detail. A passion …
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
architecture. Ensuring best practices in data governance, security, and performance tuning. Requirements: Proven experience with Azure Data Services (ADF, Synapse, Data Lake). Strong hands-on experience with Databricks (including PySpark or SQL). Solid SQL skills and understanding of data modelling and ETL/ELT processes. Familiarity with Delta Lake and lakehouse architecture. A proactive, collaborative approach to problem-solving …
City of London, London, United Kingdom Hybrid / WFH Options
Hexegic
and validate data models and outputs. Set up monitoring and ensure data health for outputs. What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark. Previous experience with data analytics software. Ability to scope new integrations and translate user requirements into technical specifications. What’s in it for you? Base salary of …