London, England, United Kingdom Hybrid / WFH Options
Luupli
and statistical packages. Strong analytical, problem-solving, and critical thinking skills. 8. Experience with social media analytics and understanding of user behaviour. 9. Familiarity with big data technologies, such as Apache Hadoop, Apache Spark, or Apache Kafka. 10. Knowledge of AWS machine learning services, such as Amazon SageMaker and Amazon Comprehend. 11. Experience with data governance and security best practices in AWS. 12. Excellent …
time data pipelines for processing large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale ML inference pipelines into production …
Redshift, EMR, Glue). Familiarity with programming languages such as Python or Java. Understanding of data warehousing concepts and data modeling techniques. Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Benefits Enhanced leave - 38 days inclusive of 8 UK Public Holidays Private Health Care …
deploy your pipelines and proven experience in their technologies You have experience in database technologies including writing complex queries against their (relational and non-relational) data stores (e.g. Postgres, Hadoop, Elasticsearch, Graph databases), and designing the database schemas to support those queries You have a good understanding of coding best practices and design patterns and experience with code and …
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms e.g. relevant AWS and Azure platform services; Data tools hands-on experience with Palantir ESSENTIAL; Data science approaches and tooling e.g. Hadoop, Spark; Data engineering approaches; Database management, e.g. MySQL, Postgres; Software development methods and techniques e.g. Agile methods such as SCRUM; Software change management, notably familiarity with git; Public sector …
The job you're considering The Cloud Data Platforms team is part of the Insights and Data Global Practice and has seen …
design of data architectures that will be deployed You have experience in database technologies including writing complex queries against their (relational and non-relational) data stores (e.g. Postgres, Apache Hadoop, Elasticsearch, Graph databases), and designing the database schemas to support those queries You have a good understanding of coding best practices & design patterns and experience with code & data versioning …
really make your application stand out: Implementation experience with Machine Learning models and applications Knowledge of cloud-based Machine Learning engines (AWS, Azure, Google, etc.) Experience with large-scale data processing tools (Spark, Hadoop, etc.) Ability to query and program databases (SQL, NoSQL) Experience with distributed ML frameworks (TensorFlow, PyTorch, etc.) Familiarity with collaborative software tools (Git, Jira, etc.) Experience with user interface libraries/applications …
task effort and identify dependencies. Excellent communication skills. Familiarity with Python and its data, numerical, and machine learning libraries. It would be great if you also had: Experience with Hadoop and Jenkins. Azure and AWS certifications. Familiarity with Java. What we do for you: At Leidos, we are passionate about customer success, united as a team, and inspired to …
City of London, London, United Kingdom Hybrid / WFH Options
Widen the Net Limited
team! You will develop scalable data pipelines, ensure data quality, and support business decision-making with high-quality datasets. -Work across technology stack: SQL, Python, ETL, Big Query, Spark, Hadoop, Git, Apache Airflow, Data Architecture, Data Warehousing -Design and develop scalable ETL pipelines to automate data processes and optimize delivery -Implement and manage data warehousing solutions, ensuring data integrity …
East London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
. Strong knowledge of LLM algorithms and training techniques . Experience deploying models in production environments. Nice to Have: Experience in GenAI/LLMs Familiarity with distributed computing tools (Hadoop, Hive, Spark). Background in banking, risk management, or capital markets . Why Join? This is a unique opportunity to work at the forefront of AI innovation in financial …
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
architectures. Experienced with Matillion and modern data visualisation tools (QuickSight, Tableau, Looker, etc.). Strong scripting and Linux/cloud environment familiarity. Desirable: Exposure to big data tools (Spark, Hadoop, MapReduce). Experience with microservice-based data APIs. AWS certifications (Solutions Architect or Big Data Specialty). Knowledge of machine learning or advanced analytics. Interested? This is a great …
like TensorFlow, PyTorch, or scikit-learn. Familiarity with cloud platforms like AWS, GCP, or Azure. Strong written and spoken English skills. Bonus Experience: Experience with big data tools (e.g., Hadoop, Spark) and distributed computing. Knowledge of NLP techniques and libraries. Familiarity with Docker, Kubernetes, and deploying machine learning models in production. Experience with visualization tools like Tableau, Power BI …
environment. Relevant domains: FinTech, analytics, data science & engineering, software development, ML, LLMs, SaaS, etc. Very strong understanding of computer science Knowledge of diverse data-related solutions and topics: Spark, Hadoop, Elasticsearch, ETL, ML, LLM, etc. Programming skills: Scala or Java (preferred), or Python (or any other object-oriented language) Strong understanding of learning theories and instructional design principles Experience …
a degree or interest in the legal domain. Ability to communicate with multiple stakeholders, including non-technical legal subject matter experts. Experience with big data technologies such as Spark, Hadoop, or similar. Experience conducting world-leading research, e.g. by contributions to publications at leading ML venues. Previous experience working on large-scale data processing systems. Strong software and/…
City of London, London, United Kingdom Hybrid / WFH Options
Mars
help shape our digital platforms 🧠 What we’re looking for Proven experience as a Data Engineer in cloud environments (Azure ideal) Proficiency in Python, SQL, Spark, Databricks Familiarity with Hadoop, NoSQL, Delta Lake Bonus: Azure Functions, Logic Apps, Django, CI/CD tools 💼 What you’ll get from Mars A competitive salary & bonus Hybrid working with flexibility built in …
if you Have experience with Cloud-based or SaaS products and a good understanding of Digital Marketing and Marketing Technologies. Have experience working with Big Data technologies (such as Hadoop, MapReduce, Hive/Pig, Cassandra, MongoDB, etc.) An understanding of web technologies such as JavaScript, Node.js and HTML. Some level of understanding or experience in AI/ML. Physical …
bring to the role? As you take on the Data Solution Architect role, you will need previous experience of working with big data platforms such as Hadoop and Spark. Other key skills required for the role include: Extensive experience in data architecture and analytics platform solutions. Expertise in all things data including lifecycle, technologies, data-patterns …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
Knowledge of CI/CD processes and infrastructure-as-code. • Eligible for SC clearance (active clearance highly advantageous). Desirable Skills • Exposure to large data processing frameworks (e.g., Spark, Hadoop). • Experience deploying data via APIs and microservices. • AWS certifications (Solution Architect Associate, Data Analytics Speciality, etc.). • Experience in public sector programmes or government frameworks. Package & Benefits …
South East London, England, United Kingdom Hybrid / WFH Options
Anson McCade
Knowledge of CI/CD processes and infrastructure-as-code. • Eligible for SC clearance (active clearance highly advantageous). Desirable Skills • Exposure to large data processing frameworks (e.g., Spark, Hadoop). • Experience deploying data via APIs and microservices. • AWS certifications (Solution Architect Associate, Data Analytics Speciality, etc.). • Experience in public sector programmes or government frameworks. Package & Benefits …
such as Java, TypeScript, Python, and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies like NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, including infrastructure as code and GitOps Database technologies, e.g., relational databases, Elasticsearch, MongoDB Why join Gemba …
to implement them through libraries. Experience with programming, ideally Python, and the ability to quickly pick up handling large data volumes with modern data processing tools, e.g. by using Hadoop/Spark/SQL Experience with or ability to quickly learn open-source software including machine learning packages, such as Pandas, scikit-learn, along with data visualisation technologies. Experience …