of the following: Python, SQL, Java. Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams. Deep knowledge of database technologies: distributed systems (e.g., Spark, Hadoop, EMR); RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL); NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j). Solid understanding of software engineering best practices: code reviews, testing frameworks, CI/CD, and …
work experience). Proven experience with Trino/Starburst Enterprise/Galaxy administration/CLI. Implementation experience with container orchestration solutions (Kubernetes/OpenShift). Knowledge of Big Data (Hadoop/Hive/Spark) and Cloud technologies (AWS, Azure, GCP). Understanding of distributed system architecture, high availability, scalability, and fault tolerance. Familiarity with security authentication systems such as …
London, South East, England, United Kingdom Hybrid / WFH Options
Advanced Resource Managers Limited
as well as programming languages such as Python, R, or similar. Strong experience with machine learning frameworks (e.g., TensorFlow, Scikit-learn) as well as familiarity with data technologies (e.g., Hadoop, Spark). About Vixio: Our mission is to empower businesses to efficiently manage and meet their regulatory obligations with our unique combination of human expertise and Regulatory Technology (RegTech …
with some of the brightest technical minds in the industry today. BASIC QUALIFICATIONS - 10+ years of technical specialist, design and architecture experience - 10+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 10+ years of consulting, design and implementation of serverless distributed solutions experience - Australian citizen with ability to obtain security clearance. PREFERRED QUALIFICATIONS - AWS Professional level …
and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes Strong knowledge of DevOps tools (Terraform, Ansible, ArgoCD, GitOps, etc.) Proficiency in …
Basic qualifications 3+ years of experience in cloud architecture and implementation Bachelor's degree in Computer Science, Engineering, related field, or equivalent experience Experience in database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) Experience in consulting, design and implementation of serverless distributed solutions Experience in software development with an object-oriented language Preferred qualifications AWS experience preferred, with proficiency …
home, there's nothing we can't achieve in the cloud. BASIC QUALIFICATIONS 5+ years of experience in cloud architecture and implementation 5+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience 5+ years of experience delivering cloud projects or cloud-based solutions Ability to communicate effectively in English, in technical and business settings Bachelor's degree …
workplace and at home, there's nothing we can't achieve. BASIC QUALIFICATIONS - 10+ years of technical specialist, design and architecture experience - 10+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 10+ years of consulting, design and implementation of serverless distributed solutions experience - Australian citizen with ability to obtain security clearance. PREFERRED QUALIFICATIONS - AWS Professional level …
Scikit-learn, PyTorch). Familiarity with data visualization tools (e.g., Tableau, Power BI, Matplotlib). Knowledge of statistical analysis and predictive modelling techniques. Understanding of big data technologies (e.g., Hadoop, Spark) is a plus. Experience 2+ years of experience in a Data Science or Analytical role. Proven track record in developing and deploying predictive models and algorithms. Strong ability …
Familiarity with and experience of using UNIX. Knowledge of CI toolsets. Good client-facing skills and problem-solving aptitude. DevOps knowledge of SQL, Oracle DB, Postgres, ActiveMQ, Zabbix, Ambari, Hadoop, Jira, Confluence, BitBucket, ActiviBPM, Oracle SOA, Azure, SQLServer, IIS, AWS, Grafana, Oracle BPM, Jenkins, Puppet, CI and other cloud technologies. All profiles will be reviewed against the required skills …
Agile working practices CI/CD tooling Scripting experience (Python, Perl, Bash, etc.) ELK (Elastic stack) JavaScript Cypress Linux experience Search engine technology (e.g., Elasticsearch) Big Data Technology experience (Hadoop, Spark, Kafka, etc.) Microservice and cloud native architecture Desirable: Able to demonstrate experience of troubleshooting and diagnosis of technical issues. Able to demonstrate excellent team-working skills. Strong …
month contract - Inside IR35 - Financial Services. This role will require travel to the Milton Keynes and London offices. Experience: Advanced knowledge and 3+ years' experience of programming languages: Hadoop, SQL, Spark, SAS, etc. Exceptional problem-solving skills. Track record of developing high-quality analysis that has resulted in customer-centric decisions that achieve business objectives. Understanding of the customer and …
display, video, mobile, programmatic, social, native), considering viewability, interaction, and engagement metrics. Create dashboards and deliver usable insights to help steer product roadmaps. Utilize tools such as SQL, R, Hadoop, Excel to hypothesize and perform statistical analysis, AB tests, and experiments to measure the impact of product initiatives on revenue, technical performance, advertiser & reader engagement. Candidates should have analysis …
London, South East, England, United Kingdom Hybrid / WFH Options
Mexa Solutions LTD
data sources, including SQL and NoSQL databases. Implementing and optimizing data warehouse solutions and ETL/ELT pipelines for analytics and reporting. Working with big data ecosystems such as Hadoop, Spark, and Kafka to build scalable solutions. What you’ll bring... Strong expertise in SQL and NoSQL technologies, such as Oracle, PostgreSQL, MongoDB, or similar. Proven experience with data warehousing concepts and ETL/ELT tools. Knowledge of big data platforms and streaming tools like Hadoop, Spark, and Kafka. A deep understanding of scalable data architectures, including high availability and fault tolerance. Experience working across hybrid or cloud environments. Excellent communication skills to engage both technical teams and senior stakeholders. What’s in it for you... This is …
nature of the work, you must hold enhanced DV Clearance. WE NEED THE DATA ENGINEER TO HAVE: Current enhanced DV Security Clearance. Experience with big data tools such as Hadoop, Cloudera or Elasticsearch. Experience with Palantir Foundry. Experience working in an Agile Scrum environment with tools such as Confluence/Jira. Experience in design, development, test and integration of …/DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/SC CLEARED/SC CLEARANCE/SECURITY CLEARED/SECURITY CLEARANCE/NIFI/CLOUDERA/HADOOP/KAFKA/ELASTIC SEARCH/LEAD BIG DATA ENGINEER/LEAD BIG DATA DEVELOPER …
/MOD or Enhanced DV Clearance. WE NEED THE PYTHON/DATA ENGINEER TO HAVE: Current DV Security Clearance (Standard or Enhanced). Experience with big data tools such as Hadoop, Cloudera or Elasticsearch. Python/PySpark experience. Experience with Palantir Foundry is nice to have. Experience working in an Agile Scrum environment with tools such as Confluence/Jira …/DEVELOPED VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/SC CLEARED/SC CLEARANCE/SECURITY CLEARED/SECURITY CLEARANCE/NIFI/CLOUDERA/HADOOP/KAFKA/ELASTIC SEARCH/LEAD BIG DATA ENGINEER/LEAD BIG DATA DEVELOPER …