Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting
build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; Collaborative, team-based development; Cloud analytics platforms, e.g. relevant AWS and Azure platform services; Data tools, with hands-on experience with Palantir ESSENTIAL; Data science approaches and tooling, e.g. Hadoop, Spark; Software development methods and techniques, e.g. Agile methods such as Scrum; Software change management, notably familiarity with Git; Public sector best practice guidance, e.g. ITIL, OGC toolkit. Additional …
experience across AWS Glue, Lambda, Step Functions, RDS, Redshift, and Boto3. Proficient in one of Python, Scala or Java, with strong experience in big data technologies such as Spark and Hadoop. Practical knowledge of building real-time event streaming pipelines (e.g. Kafka, Spark Streaming, Kinesis). Proven experience developing modern data architectures, including data lakehouse and data warehousing. A … tooling, and data governance including GDPR. Bonus Points For: Expertise in data modelling, schema design, and handling both structured and semi-structured data. Familiarity with distributed systems such as Hadoop, Spark, HDFS, Hive, Databricks. Exposure to AWS Lake Formation and automation of ingestion and transformation layers. Background in delivering solutions for highly regulated industries. Passion for mentoring and enabling …
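To make the real-time streaming requirement above concrete, here is a minimal, illustrative sketch of a streaming pipeline using PySpark Structured Streaming with Kafka. The broker address, topic name, event schema and output paths are placeholders invented for illustration, not details taken from the advert.

```python
# Minimal sketch of a real-time event streaming pipeline with PySpark
# Structured Streaming reading from Kafka. Broker, topic, schema and paths
# below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("event-stream-sketch").getOrCreate()

# Assumed event payload: a JSON document with an id, type and timestamp.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a Kafka topic as a streaming DataFrame.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Parse the JSON value column into typed fields.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Write the parsed stream to a data lake location in Parquet, with checkpointing.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/events/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```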
in applied research PREFERRED QUALIFICATIONS Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. Experience with large-scale distributed systems such as Hadoop, Spark, etc. PhD Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your …
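As a small illustration of the modeling tools named in the preferred qualifications, the following is a hedged, self-contained scikit-learn sketch trained on synthetic data; the model choice, features and labels are invented for illustration only.

```python
# Minimal sketch of a scikit-learn modeling workflow on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 10))           # 1,000 synthetic examples, 10 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # simple synthetic label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```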
experience Experience working effectively across cross-functional teams and partnering well with people at all levels within an organization Preferred Qualifications Experience with large-scale distributed systems such as Hadoop, Spark, etc. Experience in one of the following areas: machine learning technologies, Reinforcement Learning, Deep Learning, Computer Vision, Natural Language Processing (NLP) or related applications Amazon is an equal …
Milton Keynes, Buckinghamshire, South East, United Kingdom
InfinityQuest Ltd
and business stakeholders to understand data requirements. Optimize data workflows for performance and reliability. Ensure data quality, integrity, and security across systems. Work with large datasets using tools like Hadoop, Spark, and SAS. Integrate data from various sources including IBM Mainframe systems. Troubleshoot and resolve data-related issues efficiently. Required Skills & Experience: Proven experience as a Data Engineer with … a strong foundation in data analysis. Expert-level proficiency in SAS for data manipulation and reporting. Working knowledge of IBM Mainframe systems and data structures. Advanced programming skills in Hadoop, SQL, Spark, and Python. Strong problem-solving and analytical skills. Experience with data modeling, warehousing, and performance tuning. Familiarity with Santander UK systems and processes is a strong advantage. … Preferred Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, or related field. Certifications in SAS, Hadoop, or related technologies. Experience working in the financial services or banking domain. Soft Skills: Excellent communication and collaboration abilities. Ability to work independently and in a team-oriented environment. Strong attention to detail and commitment to quality.
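As a concrete illustration of the Spark, SQL and Python skills this role lists, here is a minimal, assumed sketch of a data quality check over a large dataset; the HDFS paths, table and column names are invented placeholders and do not describe any Santander system.

```python
# Minimal sketch of a Spark-based data quality check: load a large extract,
# profile it with SQL, and quarantine records failing basic integrity rules.
# Paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-check-sketch").getOrCreate()

# Load a large account extract (for example, landed from an upstream feed).
accounts = spark.read.parquet("hdfs:///data/landing/accounts/")
accounts.createOrReplaceTempView("accounts")

# SQL-based profiling: row counts, null rates and rule violations.
profile = spark.sql("""
    SELECT COUNT(*)                                              AS row_count,
           SUM(CASE WHEN account_id IS NULL THEN 1 ELSE 0 END)   AS null_account_ids,
           SUM(CASE WHEN balance < 0 THEN 1 ELSE 0 END)          AS negative_balances
    FROM accounts
""")
profile.show()

# Quarantine records that fail the integrity rules for later investigation.
bad_records = accounts.filter(F.col("account_id").isNull() | (F.col("balance") < 0))
bad_records.write.mode("overwrite").parquet("hdfs:///data/quarantine/accounts/")
```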
Milton Keynes, Buckinghamshire, England, United Kingdom Hybrid / WFH Options
Lorien
banking-focused data model. Liaise with IT teams to transition data models into production environments. Conduct data mining and exploratory data analysis to support model development. Apply strong SQL, Hadoop, and cloud-based data processing skills to manage and analyse large datasets. Support the design and structure of data models, with a working understanding of data modelling principles. Present … scalable data solutions within a cloud architecture. Key Skills & Experience: Proven experience as a technical data analyst or data engineer in a project-focused environment. Strong proficiency in SQL, Hadoop, and cloud platforms (preferably AWS). Experience with data mining, data modelling, and large-scale data processing. Familiarity with tools such as Python, R, and Power BI. Understanding of …
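To illustrate the exploratory data analysis this role describes, here is a brief, assumed sketch combining SQL extraction with pandas profiling; the connection string, table and column names are illustrative placeholders rather than anything from the advert.

```python
# Minimal sketch of exploratory analysis feeding a data model: pull a sample
# with SQL, then profile it in pandas. Connection details, table and column
# names are illustrative placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@analytics-db:5432/banking")

# Sample recent transactions for exploration rather than pulling the full table.
sample = pd.read_sql(
    "SELECT customer_id, product_code, amount, txn_date "
    "FROM transactions WHERE txn_date >= '2024-01-01' LIMIT 100000",
    engine,
)

# Basic profiling: distributions, missing values and activity per product.
print(sample.describe(include="all"))
print(sample.isna().mean())  # null rate per column
print(sample.groupby("product_code")["amount"].agg(["count", "mean", "sum"]))
```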
London, South East, England, United Kingdom Hybrid / WFH Options
Mexa Solutions LTD
data sources, including SQL and NoSQL databases. Implementing and optimizing data warehouse solutions and ETL/ELT pipelines for analytics and reporting. Working with big data ecosystems such as Hadoop, Spark, and Kafka to build scalable solutions. What you'll bring... Strong expertise in SQL and NoSQL technologies, such as Oracle, PostgreSQL, MongoDB, or similar. Proven experience with data … warehousing concepts and ETL/ELT tools. Knowledge of big data platforms and streaming tools like Hadoop, Spark, and Kafka. A deep understanding of scalable data architectures, including high availability and fault tolerance. Experience working across hybrid or cloud environments. Excellent communication skills to engage both technical teams and senior stakeholders. What's in it for you... This is …
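To ground the ETL/ELT work this role describes, here is a minimal, assumed sketch of a batch load from an operational PostgreSQL database into a warehouse schema using Spark. The connection details, table names and credentials are placeholders, and the PostgreSQL JDBC driver is assumed to be available on the Spark classpath.

```python
# Minimal sketch of a batch ELT step: extract over JDBC, apply a light
# transformation in Spark, and load into a warehouse reporting table.
# Connection details, table names and credentials are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("elt-sketch").getOrCreate()

# Extract: read the source table over JDBC.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://source-db:5432/shop")
    .option("dbtable", "public.orders")
    .option("user", "etl_user")
    .option("password", "etl_password")
    .load()
)

# Transform: derive a daily revenue summary for reporting.
daily_revenue = (
    orders.withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("total_amount").alias("revenue"), F.count("*").alias("orders"))
)

# Load: append the summary into the warehouse reporting table.
(
    daily_revenue.write.format("jdbc")
    .option("url", "jdbc:postgresql://warehouse-db:5432/analytics")
    .option("dbtable", "reporting.daily_revenue")
    .option("user", "etl_user")
    .option("password", "etl_password")
    .mode("append")
    .save()
)
```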
About Agoda Agoda is an online travel booking platform for accommodations, flights, and more. We build and deploy cutting-edge technology that connects travelers with a global network of 4.7M hotels and holiday properties worldwide, plus flights, activities, and more.
using Bash) Collaborate with DevOps and cloud teams to enhance infrastructure reliability Participate in agile development practices (Scrum or Kanban) Learn and apply concepts from big data ecosystems (e.g. Hadoop, Hive) What You'll Bring: Required Skills: Entry-level experience in a Platform Engineer or similar technical operations role Familiarity with container technologies (Docker) and scripting (Bash) Basic understanding … solving mindset and eagerness to learn Team-oriented with excellent communication skills Ability to clearly explain technical issues to non-technical colleagues Interest or exposure to big data technologies (Hadoop, Hive) Why Join Us? You'll work with some of the brightest minds in tech, solving complex challenges that impact millions of users. We offer mentorship, training, and a …
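For the Hadoop/Hive exposure mentioned above, the following is a brief, assumed sketch of querying a Hive-managed table from PySpark; the cluster is assumed to be configured against a Hive metastore, and the database and table names are illustrative placeholders.

```python
# Minimal sketch of querying a Hive table from PySpark. Assumes Spark is
# configured with a Hive metastore; database and table names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("hive-query-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# List available databases, then summarise a platform log table by severity.
spark.sql("SHOW DATABASES").show()
events = spark.table("platform_logs.events")
events.groupBy("severity").count().orderBy("count", ascending=False).show()
```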
nature of the work, you must hold enhanced DV Clearance. WE NEED THE DATA ENGINEER TO HAVE.... Current enhanced DV Security Clearance Experience with big data tools such as Hadoop, Cloudera or Elasticsearch Experience with Palantir Foundry Experience working in an Agile Scrum environment with tools such as Confluence/Jira Experience in design, development, test and integration of …/DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/SC CLEARED/SC CLEARANCE/SECURITY CLEARED/SECURITY CLEARANCE/NIFI/CLOUDERA/HADOOP/KAFKA/ELASTIC SEARCH/LEAD BIG DATA ENGINEER/LEAD BIG DATA DEVELOPER
Data Solutions in Mission-Critical areas. WE NEED THE BIG DATA ENGINEER TO HAVE.... Current DV clearance - Standard or Enhanced Must have experience with big data tools such as Hadoop, Cloudera or Elasticsearch Experience with Palantir Foundry is preferred but not essential Experience working in an Agile Scrum environment Experience in design, development, test and integration of software IT …/DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/SC CLEARED/SC CLEARANCE/SECURITY CLEARED/SECURITY CLEARANCE/NIFI/CLOUDERA/HADOOP/KAFKA/ELASTIC SEARCH