and statistical packages. Strong analytical, problem-solving, and critical thinking skills. 8. Experience with social media analytics and understanding of user behaviour. 9. Familiarity with big data technologies, such as Apache Hadoop, Apache Spark, or Apache Kafka. 10. Knowledge of AWS machine learning services, such as Amazon SageMaker and Amazon Comprehend. 11. Experience with data governance and security best practices in AWS. 12. Excellent More ❯
London, England, United Kingdom Hybrid / WFH Options
Trudenty
time data pipelines for processing large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale ML inference pipelines into production More ❯
Redshift, EMR, Glue). Familiarity with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Benefits Enhanced leave - 38 days inclusive of 8 UK Public Holidays Private Health Care More ❯
deploy your pipelines and proven experience in their technologies You have experience in database technologies including writing complex queries against their (relational and non-relational) data stores (e.g. Postgres, Hadoop, Elasticsearch, Graph databases), and designing the database schemas to support those queries You have a good understanding of coding best practices and design patterns and experience with code and More ❯
London, England, United Kingdom Hybrid / WFH Options
NTT DATA
Azure, GCP, and Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes, SQL, NoSQL databases is a nice-to-have, providing a More ❯
team and drive the architecture, design, and implementation of customers' modern data platforms. The ideal candidate will have extensive experience migrating traditional data warehouse technologies, including Teradata, Oracle, BW, Hadoop to modern cloud data platforms like Databricks, Snowflake, Redshift, BigQuery, or Microsoft Fabric. You will be responsible for leading data platform migrations and the design and development of scalable … platforms such as Teradata, Oracle, SAP BW and migration of these data warehouses to modern cloud data platforms. Deep understanding and hands-on experience with big data technologies like Hadoop, HDFS, Hive, Spark and cloud data platform services. Proven track record of designing and implementing large-scale data architectures in complex environments. CI/CD/DevOps experience is a plus. … governance, security, and compliance best practices. Excellent problem-solving, analytical, and critical-thinking skills. Strong leadership, communication, and collaboration abilities. Preferred Qualifications: Experience in data warehouse (SAP BW, Teradata, Hadoop, Oracle etc) migration to cloud data platforms. Familiarity with data visualization and BI tools (e.g., Tableau, Power BI). Experience with cloud-based data architectures and hybrid data environments. More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo
Social network you want to login/join with: The Leonardo Cyber & Security Division, one of the three divisions in Leonardo UK, is a pivotal innovator, helping customers deliver and secure their digital transformation. The Cyber & Security Division is at More ❯
West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
Job Description: The Opportunity: The Leonardo Cyber & Security Division, one of the three divisions in Leonardo UK, is a pivotal innovator, helping customers deliver and secure their digital transformation. The Cyber & Security Division is at the forefront of supplying technology More ❯
City Of Bristol, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
Clear communicator, able to translate complex data concepts to cross-functional teams Bonus points for experience with: DevOps tools like Docker, Kubernetes, CI/CD Big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams Genomic data formats and tools Cold and hot storage management, ZFS/RAID systems, or tape storage AI/LLM tools to More ❯
London, England, United Kingdom Hybrid / WFH Options
Rein-Ton
to See From You Qualifications Proven experience as a Data Engineer with a strong background in data pipelines. Proficiency in Python, Java, or Scala, and big data technologies (e.g., Hadoop, Spark, Kafka). Experience with Databricks, Azure AI Services, and cloud platforms (AWS, Google Cloud, Azure). Solid understanding of SQL and NoSQL databases. Strong problem-solving skills and More ❯
London, England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
exciting new technologies to design and build scalable real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g., Hadoop, Kafka, GCP, Azure, Teradata, SQL Server) you’ll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in … and non-relational databases to build data solutions, such as SQL Server/Oracle, experience with relational and dimensional data structures. Experience in using distributed frameworks (Spark, Flink, Beam, Hadoop). Proficiency in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers (Docker, Kubernetes, etc. More ❯
Docker Experience with NLP and/or computer vision Exposure to cloud technologies (e.g. AWS and Azure) Exposure to Big data technologies Exposure to Apache products e.g. Hive, Spark, Hadoop, NiFi Programming experience in other languages This is not an exhaustive list, and we are keen to hear from you even if you don't tick every box. The More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo SpA
Docker Experience with NLP and/or computer vision Exposure to cloud technologies (e.g. AWS and Azure) Exposure to Big data technologies Exposure to Apache products e.g. Hive, Spark, Hadoop, NiFi Programming experience in other languages This is not an exhaustive list, and we are keen to hear from you even if you don't tick every box. The More ❯
West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
Docker Experience with NLP and/or computer vision Exposure to cloud technologies (e.g. AWS and Azure) Exposure to Big data technologies Exposure to Apache products e.g. Hive, Spark, Hadoop, NiFi Programming experience in other languages This is not an exhaustive list, and we are keen to hear from you even if you don’t tick every box. The More ❯
London, England, United Kingdom Hybrid / WFH Options
Foundever
controls, and monitoring systems. Skills/Abilities/Knowledge Proficiency in data modeling and database management. Strong programming skills in Python and SQL. Knowledge of big data technologies like Hadoop, Spark, and NoSQL databases. Deep experience with ETL processes and data pipeline development. Strong understanding of data warehousing concepts and best practices. Experience with cloud platforms such as AWS … Computer Science or Engineering Languages Excellent command of English. French and Spanish language skills are a bonus. Tools and Applications Programming languages and tools: Python, SQL. Big data technologies: Hadoop, Spark, NoSQL databases. ETL and data pipeline tools: AWS Glue, Airflow. Cloud platforms: AWS, Azure. Data visualization tools and data modeling software. Version control systems and collaborative development platforms. More ❯
London, England, United Kingdom Hybrid / WFH Options
HipHopTune Media
learning. Experience in using advanced features of cloud platforms (AWS, Azure, Google Cloud) such as machine learning services and automated data pipeline tools. Familiarity with big data frameworks like Hadoop or Spark is beneficial. Skills in advanced data visualization tools and software beyond basic reporting, such as Tableau, Power BI, or even more sophisticated interactive web visualization frameworks like More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
exciting new technologies to design and build scalable real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server) you’ll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in … and non-relational databases to build data solutions, such as SQL Server/Oracle, experience with relational and dimensional data structures. Experience in using distributed frameworks (Spark, Flink, Beam, Hadoop). Proficiency in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers (Docker, Kubernetes, etc. More ❯
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
1 month ago Be among the first 25 applicants Description As a Lead Data Engineer or architect at Made Tech, you'll play a pivotal role in helping public sector organisations become truly data-led by equipping them with robust More ❯
London, England, United Kingdom Hybrid / WFH Options
Aecom
languages such as Python, R, and SQL. In-depth experience with data manipulation and visualization libraries (e.g., Pandas, NumPy, Matplotlib, etc.). Solid understanding of big data technologies (e.g., Hadoop, Spark) and cloud platforms (AWS, Azure, Google Cloud). Strong expertise in the full data science lifecycle: data collection, preprocessing, model development, deployment, and monitoring. Experience in leading teams More ❯
Cambourne, England, United Kingdom Hybrid / WFH Options
Remotestar
of Scala. Familiarity with distributed computing frameworks such as Spark, KStreams, Kafka. Experience with Kafka and streaming frameworks. Understanding of monolithic vs. microservice architectures. Familiarity with Apache ecosystem including Hadoop modules (HDFS, YARN, HBase, Hive, Spark) and Apache NiFi. Experience with containerization and orchestration tools like Docker and Kubernetes. Knowledge of time-series or analytics databases such as Elasticsearch. More ❯
Cambridge, England, United Kingdom Hybrid / WFH Options
RemoteStar
/KStreams/Kafka Connect and streaming frameworks such as Kafka. Knowledge on Monolithic versus Microservice Architecture concepts for building large-scale applications. Familiar with the Apache suite including Hadoop modules such as HDFS, YARN, HBase, Hive, Spark as well as Apache NiFi. Familiar with containerization and orchestration technologies such as Docker, Kubernetes. Familiar with Time-series or Analytics More ❯
London, England, United Kingdom Hybrid / WFH Options
HM Revenue and Customs
and managing end-to-end projects with diverse cross-functional teams Proven skills in translating analytics output to actionable recommendations Hands-on experience with modern distributed systems, including Hadoop, Hive/SQL and Apache Spark Proficiency in creating reproducible analytic pipelines One or more data analytics/programming tools such as Python and R Proficiency with version control More ❯
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms e.g. relevant AWS and Azure platform services; Data tools: hands-on experience with Palantir (essential); Data science approaches and tooling e.g. Hadoop, Spark; Data engineering approaches; Database management, e.g. MySQL, Postgres; Software development methods and techniques e.g. Agile methods such as SCRUM; Software change management, notably familiarity with git; Public sector More ❯