… Redshift, EMR, Glue). Familiarity with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Benefits: Enhanced leave (38 days, inclusive of 8 UK Public Holidays); Private Health Care …
City of Bristol, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
Clear communicator, able to translate complex data concepts to cross-functional teams. Bonus points for experience with: DevOps tools like Docker, Kubernetes, CI/CD; big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams; genomic data formats and tools; cold and hot storage management, ZFS/RAID systems, or tape storage; AI/LLM tools to …
Docker. Experience with NLP and/or computer vision. Exposure to cloud technologies (e.g. AWS and Azure). Exposure to big data technologies. Exposure to Apache products, e.g. Hive, Spark, Hadoop, NiFi. Programming experience in other languages. This is not an exhaustive list, and we are keen to hear from you even if you don't tick every box. The …
Guildford, Surrey, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; collaborative, team-based development; cloud analytics platforms, e.g. relevant AWS and Azure platform services; data tools, hands-on experience with Palantir (essential); data science approaches and tooling, e.g. Hadoop, Spark; data engineering approaches; database management, e.g. MySQL, PostgreSQL; software development methods and techniques, e.g. Agile methods such as Scrum; software change management, notably familiarity with git; public sector …
Telford, Shropshire, West Midlands, United Kingdom Hybrid / WFH Options
JLA Resourcing Ltd
…solving and communication skills, including the ability to convey complex concepts to non-technical stakeholders. Desirable (but not essential): Experience with SSIS, AWS or Azure Data Factory. Familiarity with Hadoop, Jenkins, or DevOps practices including CI/CD. Cloud certifications (Azure or AWS). Knowledge of additional programming languages or ETL tools. This is a fantastic opportunity to take …
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Henderson Drake
… Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric); data warehousing and lakehouse design; ETL/ELT pipelines; SQL and Python for data manipulation and machine learning; big data frameworks (e.g., Hadoop, Spark); data visualisation (e.g., Power BI); understanding of statistical analysis and predictive modelling. Experience: 5+ years working with Microsoft data platforms; 5+ years in a customer-facing consulting or …
… environments (e.g. Snowflake, AWS). 6+ years of hands-on technical leadership in building large-scale, distributed data pipelines and reporting tools using big data technologies (e.g. Spark, Kafka, Hadoop), ensuring quality, scalability, and governance. Strong expertise in balancing trade-offs within complex distributed systems, focusing on data quality, performance, reliability, availability, and security. Proficient in software engineering with …
… US or UK. Preferred Experience: data orchestration tools (e.g., Airflow, Prefect); experience deploying, monitoring, and maintaining ML models in production environments (MLOps); familiarity with big data technologies (e.g., Spark, Hadoop); background in time-series analysis and forecasting; experience with data governance and security best practices; real-time data streaming is a plus (Kafka, Beam, Flink); experience with Kubernetes is a plus. Energy …
East London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
… Strong knowledge of LLM algorithms and training techniques. Experience deploying models in production environments. Nice to Have: Experience in GenAI/LLMs. Familiarity with distributed computing tools (Hadoop, Hive, Spark). Background in banking, risk management, or capital markets. Why Join? This is a unique opportunity to work at the forefront of AI innovation in financial …
City of London, London, United Kingdom Hybrid / WFH Options
Hanson Lee
… and disseminate significant amounts of information with attention to detail and accuracy. Adept at queries, report writing, and presenting findings. Experience working with large datasets and distributed computing tools (Hadoop, Spark, etc.). Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests, etc.). Experience with data profiling tools and processes. Knowledge of Microsoft Fabric is …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
… architectures. Experienced with Matillion and modern data visualisation tools (QuickSight, Tableau, Looker, etc.). Strong scripting and Linux/cloud environment familiarity. Desirable: Exposure to big data tools (Spark, Hadoop, MapReduce). Experience with microservice-based data APIs. AWS certifications (Solutions Architect or Big Data Specialty). Knowledge of machine learning or advanced analytics. Interested? This is a great …
Stevenage, Hertfordshire, United Kingdom Hybrid / WFH Options
Capgemini
Experience with data analysis and visualization tools (e.g., Matplotlib, Seaborn, Tableau). • Ability to work independently and lead projects from inception to deployment. • Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is desirable. • MSc or PhD in Computer Science, Data Science, or related field is preferred. Don't meet every single requirement …
… Go). University degree (IT/math) or equivalent experience. The following additional qualifications are a significant plus: Kubernetes knowledge and operating experience; experience with big data stack components like Hadoop, Spark, Kafka, NiFi; experience with data science/data analysis; knowledge of SRE/DevOps stacks - monitoring/system management tools (Prometheus, Ansible, ELK, …); version control using git. A …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
Knowledge of CI/CD processes and infrastructure-as-code. • Eligible for SC clearance (active clearance highly advantageous). Desirable Skills • Exposure to large data processing frameworks (e.g., Spark, Hadoop). • Experience deploying data via APIs and microservices. • AWS certifications (Solution Architect Associate, Data Analytics Speciality, etc.). • Experience in public sector programmes or government frameworks. Package & Benefits …
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Gemba Advantage
… as Java, TypeScript, Python, and Go; web libraries and frameworks such as React and Angular; designing, building, and maintaining CI/CD pipelines; big data technologies, such as NiFi, Hadoop, Spark; cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker; DevOps methodologies, such as infrastructure as code and GitOps; database technologies, e.g. relational databases, Elasticsearch, Mongo. Why join …
… to implement them through libraries. Experience with programming, ideally Python, and the ability to quickly pick up handling large data volumes with modern data processing tools, e.g. by using Hadoop/Spark/SQL. Experience with or ability to quickly learn open-source software, including machine learning packages such as Pandas and scikit-learn, along with data visualisation technologies. Experience …