cloud data platforms, Lakehouse architecture, and data engineering frameworks.

Required Qualifications
- 6+ years of experience in data engineering
- 3+ years of hands-on experience with Databricks, Delta Lake, and Spark (PySpark preferred)
- Proven track record implementing Medallion Architecture (Bronze, Silver, Gold layers) in production environments
- Strong knowledge of data modeling, ETL/ELT design, and data lakehouse concepts
…
- Proficiency in Python, SQL, and Spark optimization techniques
- Experience working with cloud data platforms such as Azure Data Lake, AWS S3, or GCP BigQuery
- Strong understanding of data quality frameworks, testing, and CI/CD pipelines for data workflows
- Excellent communication skills and ability to collaborate across teams

Preferred Qualifications
- Experience with Databricks Unity Catalog and Delta Live Tables
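For context on the Medallion Architecture named above, here is a minimal sketch of the Bronze → Silver → Gold layering. It uses plain Python dicts in place of Delta tables purely for illustration; a production implementation would use PySpark DataFrames with Delta Lake writes, and the field names are hypothetical.

```python
# Hypothetical Medallion Architecture sketch: Bronze (raw), Silver (validated),
# Gold (business aggregate). Dicts stand in for Delta tables.

raw_events = [  # Bronze: records kept exactly as ingested, including bad rows
    {"id": "1", "amount": "10.50", "country": "UK"},
    {"id": "2", "amount": "bad", "country": "UK"},
    {"id": "3", "amount": "4.00", "country": "DE"},
]

def to_silver(bronze):
    """Silver: typed, validated records; unparseable rows are dropped."""
    silver = []
    for row in bronze:
        try:
            silver.append({"id": row["id"],
                           "amount": float(row["amount"]),
                           "country": row["country"]})
        except ValueError:
            continue  # in practice these would be quarantined, not discarded
    return silver

def to_gold(silver):
    """Gold: business-level aggregate, e.g. revenue per country."""
    gold = {}
    for row in silver:
        gold[row["country"]] = gold.get(row["country"], 0.0) + row["amount"]
    return gold

print(to_gold(to_silver(raw_events)))  # {'UK': 10.5, 'DE': 4.0}
```

The point of the layering is that each stage is reproducible from the one below it, so quality rules can be tightened without re-ingesting raw data.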
City of London, London, United Kingdom Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs Set up monitoring and ensure data health for outputs What we are looking for Proficiency in Python, with experience in Apache Spark and PySpark Previous experience with data analytics software Ability to scope new integrations and translate user requirements into technical specifications What’s in it for you? Base
Winchester, Hampshire, England, United Kingdom Hybrid / WFH Options
Ada Meher
days a week – based business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects Strong experience in Python/PySpark, Databricks & Apache Spark Hands-on experience with both batch & streaming pipelines Strong experience in AWS and associated tooling (e.g., S3, Glue, Redshift, Lambda, Terraform etc.) Experience designing Data Engineering platforms from scratch
within an Agile framework Building knowledge of all data resources within ND and prototype new data sources internally and externally Skills and Experience Essential Proficiency in technologies such as Spark (SQL and/or Scala), Kafka. Analytical and problem-solving skills, applied to data solutions Proficiency with traditional database SQL technologies Experience with integration of data from multiple data
Experience domains. Strong SQL skills for data extraction, transformation, and pipeline development. Proficiency with data visualization tools (Tableau, Qlik, or similar). Experience with big data platforms (Snowflake, Databricks, Spark) and ETL processes. Working knowledge of Python or R for analytics or automation (preferred). Understanding of statistical methods and A/B testing. Excellent storytelling and
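The "statistical methods and A/B testing" requirement usually means something like comparing conversion rates between two variants. A minimal stdlib-only sketch of a two-proportion z-test, with made-up conversion counts:

```python
# Two-proportion z-test for an A/B experiment, stdlib only.
# The counts below are hypothetical illustration values.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) comparing conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2)))/2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant A: 120/2400 conversions (5.0%); variant B: 165/2400 (6.875%)
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=165, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

In practice teams lean on a stats library or an experimentation platform rather than hand-rolled tests, but being able to derive the statistic is exactly the kind of understanding these roles ask for.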
Relevant experience in delivery of AI design, build, deployment or management Proficiency or certification in Microsoft Office tools, as well as relevant technologies such as Python, TensorFlow, Jupyter Notebook, Spark, Azure Cloud, Git, Docker and/or any other relevant technologies Strong analytical and problem-solving skills, with the ability to work on complex projects and deliver actionable insights
You are proficient in Python, with experience using PySpark and ML libraries such as scikit-learn, TensorFlow, or Keras. You are familiar with big data technologies (e.g., Hadoop, Spark), cloud platforms (AWS, GCP), and can effectively communicate technical concepts to non-technical stakeholders. Accommodation requests If you need assistance with any part of the application or recruiting process
talented Senior Software Engineer to join our mission-driven Sideways6 squad. If you're passionate about building scalable, secure, and intuitive software that empowers employees to share ideas and spark change, this is your chance to make a real impact. You'll lead technical initiatives, collaborate across teams, and help shape the future of our platform - all while working
best practices. Ability to communicate technical concepts clearly to both technical and non-technical stakeholders. Experience working with large datasets and distributed computing tools such as Python, SQL, Hadoop, Spark, and optimisation software. As a precondition of employment for this role, you must be eligible and authorised to work in the United Kingdom. What we offer: At AXA UK
and maintaining data pipelines. Proficiency in JVM-based languages (Java, Kotlin), ideally combined with Python and experience in Spring Boot. Solid understanding of data engineering tools and frameworks, like Spark, Flink, Kafka, dbt, Trino, and Airflow. Hands-on experience with cloud environments (AWS, GCP, or Azure), infrastructure-as-code practices, and ideally container orchestration with Kubernetes. Familiarity with SQL … Snowflake, Databricks). Strong DevOps mindset with experience in CI/CD pipelines, monitoring, and observability tools (Grafana or equivalent). Exposure to analytics, reporting, and BI tools such as Apache Superset, Lightdash, or OpenSearch. Willingness to work across the stack by contributing to API development and, at times, UI components (Vue.js, Zoho, or similar). Excellent communication and collaboration
Data Science. Proven experience managing large-scale data platforms or complex data pipelines. Deep understanding of analytics, ML workflows, and data product development. Strong grasp of tools such as Spark, Cassandra, and Redshift. Comfortable discussing technical architecture and data systems with engineers. Strong communicator with the ability to explain complex concepts to non-technical stakeholders. Experience working within SaaS
and the ability to implement them through libraries. Experience with programming, ideally Python, and the ability to handle large data volumes with modern data processing tools (e.g., Hadoop/Spark/SQL). Experience with or ability to learn open-source software including machine learning packages (e.g., Pandas, scikit-learn) and data visualization technologies. Experience in the retail sector
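"Handling large data volumes" with tools like Hadoop or Spark comes down to processing data in partitions rather than loading it all into memory. A stdlib-only sketch of that idea, using a generator as a hypothetical stand-in for a large file or table:

```python
# Chunked processing with generators: only one batch is in memory at a time.
# records() is a synthetic stand-in for a large data source (e.g. CSV lines);
# Spark applies the same per-partition pattern across a cluster.
from itertools import islice

def records():
    """Hypothetical large source yielding one record at a time."""
    for i in range(10_000):
        yield {"user": i % 100, "value": i}

def chunks(iterable, size):
    """Yield successive lists of up to `size` items from any iterable."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

total = 0
for batch in chunks(records(), 1_000):
    total += sum(r["value"] for r in batch)  # per-chunk partial aggregate

print(total)  # sum of 0..9999 -> 49995000
```

The per-chunk partial aggregates combine into the final result, which is the same map-then-reduce shape that distributed engines parallelise.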
AWS) to join a contract till April 2026. Inside IR35. SC cleared. Weekly travel to Newcastle. Around £400 per day. Skills:
- Python
- AWS Services
- Terraform
- Apache Spark
- Airflow
- Docker