and DevOps principles, with practical experience supporting data-centric workloads. Experienced working across both bare-metal and cloud environments. Skilled in designing and operating distributed data platforms such as Spark, Kafka, Cassandra, or Elasticsearch. Familiarity with modern data practices such as data lakes, data warehousing, and large-scale batch/streaming pipelines. Comfortable writing automation and platform tooling (Python …
LHH. Growth opportunities within a global leader in human resources. We prioritize learning to stay agile in an increasingly competitive business environment. We foster an open-minded environment where people spark new ideas and explore alternatives. Our benefits include: a flexible working model, private medical insurance (PMI), a group personal pension plan, career support for family and friends, and 25 working days paid …
Manchester, North West England, United Kingdom Hybrid / WFH Options
Autotrader
Supports industry-leading data science. These are some technologies that our data scientists use (we don't expect you to have experience with all of these): Python and Databricks; Spark, MLflow, and Airflow for ML workflows; Google Cloud Platform for our analytics infrastructure; dbt and BigQuery for data modelling and warehousing. Some examples of our data science work can …
City of London, London, United Kingdom Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs Set up monitoring and ensure data health for outputs What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark Previous experience with data analytics software Ability to scope new integrations and translate user requirements into technical specifications What’s in it for you? Base …
London, South East England, United Kingdom Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs Set up monitoring and ensure data health for outputs What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark Previous experience with data analytics software Ability to scope new integrations and translate user requirements into technical specifications What’s in it for you? Base …
Slough, South East England, United Kingdom Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs Set up monitoring and ensure data health for outputs What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark Previous experience with data analytics software Ability to scope new integrations and translate user requirements into technical specifications What’s in it for you? Base …
London (City of London), South East England, United Kingdom Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs Set up monitoring and ensure data health for outputs What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark Previous experience with data analytics software Ability to scope new integrations and translate user requirements into technical specifications What’s in it for you? Base …
Winchester, Hampshire, England, United Kingdom Hybrid / WFH Options
Ada Meher
days a week, depending on business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects Strong experience in Python/PySpark, Databricks & Apache Spark Hands-on experience with both batch & streaming pipelines Strong experience in AWS and associated tooling (e.g. S3, Glue, Redshift, Lambda, Terraform) Experience designing Data Engineering platforms from scratch …
City of London, London, England, United Kingdom Hybrid / WFH Options
Ada Meher
days a week, depending on business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects Strong experience in Python/PySpark, Databricks & Apache Spark Hands-on experience with both batch & streaming pipelines Strong experience in AWS and associated tooling (e.g. S3, Glue, Redshift, Lambda, Terraform) Experience designing Data Engineering platforms from scratch …
within an Agile framework Building knowledge of all data resources within ND and prototyping new data sources internally and externally Skills and Experience Essential Proficiency in technologies such as Spark (SQL and/or Scala) and Kafka Analytical and problem-solving skills applied to data solutions Proficiency with traditional SQL database technologies Experience with integration of data from multiple data …
Experience domains. Strong SQL skills for data extraction, transformation, and pipeline development. Proficiency with data visualization tools (Tableau, Qlik, or similar). Experience with big data platforms (Snowflake, Databricks, Spark) and ETL processes. Working knowledge of Python or R for analytics or automation (preferred). Understanding of statistical methods and A/B testing. Excellent storytelling and …
London (City of London), South East England, United Kingdom
RedCat Digital
Experience domains. Strong SQL skills for data extraction, transformation, and pipeline development. Proficiency with data visualization tools (Tableau, Qlik, or similar). Experience with big data platforms (Snowflake, Databricks, Spark) and ETL processes. Working knowledge of Python or R for analytics or automation (preferred). Understanding of statistical methods and A/B testing. Excellent storytelling and …
Relevant experience in delivery of AI design, build, deployment or management Proficiency or certification in Microsoft Office tools, as well as relevant technologies such as Python, TensorFlow, Jupyter Notebook, Spark, Azure Cloud, Git, Docker and/or any other relevant technologies Strong analytical and problem-solving skills, with the ability to work on complex projects and deliver actionable insights …
You are proficient in Python, with experience using PySpark and ML libraries such as scikit-learn, TensorFlow, or Keras. You are familiar with big data technologies (e.g., Hadoop, Spark), cloud platforms (AWS, GCP), and can effectively communicate technical concepts to non-technical stakeholders. Accommodation requests If you need assistance with any part of the application or recruiting process …
talented Senior Software Engineer to join our mission-driven Sideways6 squad. If you're passionate about building scalable, secure, and intuitive software that empowers employees to share ideas and spark change, this is your chance to make a real impact. You'll lead technical initiatives, collaborate across teams, and help shape the future of our platform - all while working …
best practices. Ability to communicate technical concepts clearly to both technical and non-technical stakeholders. Experience working with large datasets and distributed computing tools such as Python, SQL, Hadoop, Spark, and optimisation software. As a precondition of employment for this role, you must be eligible and authorised to work in the United Kingdom. What we offer: At AXA UK …
and maintaining data pipelines. Proficiency in JVM-based languages (Java, Kotlin), ideally combined with Python and experience in Spring Boot Solid understanding of data engineering tools and frameworks such as Spark, Flink, Kafka, dbt, Trino, and Airflow. Hands-on experience with cloud environments (AWS, GCP, or Azure), infrastructure-as-code practices, and ideally container orchestration with Kubernetes. Familiarity with SQL … Snowflake, Databricks) Strong DevOps mindset with experience in CI/CD pipelines, monitoring, and observability tools (Grafana or equivalent). Exposure to analytics, reporting, and BI tools such as Apache Superset, Lightdash, or OpenSearch Willingness to work across the stack by contributing to API development and, at times, UI components (Vue.js, Zoho, or similar). Excellent communication and collaboration …
Data Science. Proven experience managing large-scale data platforms or complex data pipelines. Deep understanding of analytics, ML workflows, and data product development. Strong grasp of tools such as Spark, Cassandra, and Redshift. Comfortable discussing technical architecture and data systems with engineers. Strong communicator with the ability to explain complex concepts to non-technical stakeholders. Experience working within SaaS …
and the ability to implement them through libraries. Experience with programming, ideally Python, and the ability to handle large data volumes with modern data processing tools (e.g., Hadoop/Spark/SQL). Experience with or ability to learn open-source software including machine learning packages (e.g., pandas, scikit-learn) and data visualization technologies. Experience in the retail sector …