SDLC automation tools like JIRA, Bamboo, and Ansible are a plus. Qualifications/Experience: Strong programming skills in Java; experience with Spring, Hibernate, and Apache Ignite is a plus. Ability to write complex SQL queries. Experience with the Fidessa Equities platform (ETP/CTAC) is desirable. Unix/Linux command…
City of Westminster, England, United Kingdom Hybrid / WFH Options
VIOOH
Kibana/Grafana/Prometheus). Write software using Java, Scala, or Python. The following are nice to have, but not required: Apache Spark jobs and pipelines; experience with any functional programming language; writing and analysing SQL queries. VIOOH: Our recruitment team will work hard to…
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
experience. Snowflake experience. Proficiency across an AWS tech stack. DevOps experience building and deploying using Terraform. Nice to Have: DBT, Data Modelling, Data Vault, Apache Airflow. Benefits: Up to 10% Bonus, Up to 14% Pensions Contribution, 29 Days Annual Leave + Bank Holidays, Free Company Shares. Interviews are ongoing…
City of London, London, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
financial data, risk modelling, or algorithmic trading is a plus. Familiarity with cloud platforms (AWS, GCP, or Azure) and modern data stack tools (e.g., Apache Airflow, dbt, Snowflake). Excellent communication and stakeholder management skills. Must be available to work onsite in London 3 days per week.
develop proficiency in backend technologies (e.g., Python with Django) to support data pipeline integrations. Cloud Platforms: Familiarity with AWS or Azure, including services like Apache Airflow, Terraform, or SageMaker. Data Quality Management: Experience with data versioning and quality assurance practices. Automation and CI/CD: Knowledge of build and…
City Of London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
focus on public cloud onboarding. The platform is a greenfield build using modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, and Apache Ignite. The platform runs in a hybrid mode, both on-premises and in AWS, utilising technologies such as EKS, S3, and FSx. Objectives: Steering platform…
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Years' data engineering experience. Snowflake experience. Proficiency across an AWS tech stack. DBT expertise. Terraform experience. Expert SQL and Python. Data Modelling. Data Vault. Apache Airflow. My client has very limited interview slots and is looking to fill this vacancy ASAP. I have limited slots for 1st stage…
City of London, England, United Kingdom Hybrid / WFH Options
uk.tiptopjob.com - Jobboard
Scikit-learn, Pandas, PyTorch, Jupyter, pipelines), and practical knowledge of data tools like Databricks, Ray, vector databases, Kubernetes, and workflow scheduling tools such as Apache Airflow, Dagster, and Astronomer. GPU Computing: Familiarity with GPU computing, both on-premises and on cloud platforms, and experience in building end-to-end…
City of London, London, United Kingdom Hybrid / WFH Options
Kolayo
to Have: Familiarity with cloud platforms (AWS, GCP, Azure) and cloud-based database services (Snowflake). Knowledge of data warehousing, orchestration and pipeline technologies (Apache Airflow, Kafka, Azure Data Factory, etc.). Experience with DBT for modelling. Server administration and networking fundamentals…
British-born sole UK National with active SC or DV Clearance • Strong Java skills, familiarity with Python • Experience in Linux, Git, CI/CD, Apache NiFi • Knowledge of Oracle, MongoDB, React, Elasticsearch • Familiarity with AWS (EC2, EKS, Fargate, S3, Lambda) • Active DV Clearance. If you do not meet all…
City of London, England, United Kingdom Hybrid / WFH Options
Jefferson Frank
3+ years' data engineering experience. Snowflake experience. Proficiency across an AWS tech stack. DBT expertise. Terraform experience. Nice to Have: Data Modelling, Data Vault, Apache Airflow. Benefits: Up to 10% Bonus, Up to 14% Pensions Contribution, 29 Days Annual Leave + Bank Holidays, Free Company Shares. Interviews are ongoing…
systems. Strong knowledge of Kubernetes and Kafka. Experience with Git and deployment pipelines. Having worked with at least one of the following stacks: Hadoop, Apache Spark, Presto; AWS Redshift, Azure Synapse, or Google BigQuery. Experience profiling performance issues in database systems. Ability to learn and/or adapt quickly…
Skills: 5+ years' experience with Python programming for data engineering tasks. Strong proficiency in SQL and database management. Hands-on experience with Databricks and Apache Spark. Familiarity with the Azure cloud platform and related services. Knowledge of data security best practices and compliance standards. Excellent problem-solving and communication skills.
City of London, London, United Kingdom Hybrid / WFH Options
Undisclosed
concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of regulatory data requirements…
City of London, England, United Kingdom Hybrid / WFH Options
ACLED
to detail, ability to work remotely. Desirable: Cloud architecture certification (e.g., AWS Certified Solutions Architect). Experience with Drupal CMS, geospatial/mapping tools, Apache Airflow, serverless architectures, API gateways. Interest in conflict data, humanitarian tech, open data platforms; desire to grow into a solution architect or technical lead…
TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data…
their team in London on a full-time basis. What You’ll Do: Architect and implement high-performance data processing systems in Rust. Leverage Apache Arrow and Parquet for in-memory and on-disk data efficiency. Integrate and extend systems like DataFusion, ClickHouse, and DuckDB. Design low-latency pipelines…
Proven experience in recommender systems, behavioural AI, and/or reinforcement learning. Building data pipelines (realtime or batch) & data quality using a modern toolchain (e.g., Apache Spark, Kafka, Airflow, dbt). PhD in Computer Science, Machine Learning, or a closely related field. What We Offer: Opportunity to build technology that…
systems. Strong knowledge of Kubernetes and Kafka. Experience with Git and deployment pipelines. Having worked with at least one of the following stacks: Hadoop, Apache Spark, Presto. Experience profiling performance issues in database systems. Ability to learn and/or adapt quickly to complex issues. Happy to collaborate with…
setting. MSc in Computer Science, Machine Learning, or a related field. Experience building data pipelines (realtime or batch) & data quality using a modern toolchain (e.g., Apache Spark, Kafka, Airflow, dbt). Strong foundational knowledge of machine learning and deep learning algorithms, including deep neural networks, supervised/unsupervised learning, predictive…
frameworks, and performance monitoring. Desirable Skills: Background in serverless technologies (e.g. Lambda, Step Functions, API Gateway). Experience with data tools like EMR, Glue, or Apache Spark. Understanding of event-driven architecture (EventBridge, SNS, SQS). Knowledge of AWS database offerings including DynamoDB and RDS. Familiarity with multi-region deployments and…
d like to see from you: 3–5 years of experience in data integration, orchestration, or automation roles. Solid experience with orchestration tools (e.g., Apache Airflow, MuleSoft, Dell Boomi, Informatica Cloud). Familiarity with cloud data platforms (e.g., AWS, Microsoft Azure, Google Cloud Platform) and related data movement technologies…
Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., Apache NiFi, Talend, Informatica). • Proficiency in data integration tools and technologies. • Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI) is a plus.
Rust, C, or C++. Solid understanding of operating systems, file systems, and storage internals. Experience working with modern data formats or infrastructure tools (e.g., Apache Arrow, Parquet, DuckDB, ClickHouse). A passion for infrastructure and performance problems. Willingness and ability to work on-site 5 days/week in London.