Slough, South East England, United Kingdom | Hybrid / WFH Options
Tata Consultancy Services
with AWS Cloud-native data platforms, including: AWS Glue, Lambda, Step Functions, Athena, Redshift, S3, CloudWatch, AWS SDKs, Boto3, and serverless architecture patterns Strong programming skills in Python and Apache Spark Proven experience in Snowflake data engineering, including: Snowflake SQL, Snowpipe, Streams & Tasks, and performance optimization Integration with AWS services and orchestration tools Expertise in data integration patterns, ETL …
Science, Computer Science, or a related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and ability to work in a …
scalable pipelines, data platforms, and integrations, while ensuring solutions meet regulatory standards and align with architectural best practices. Key Responsibilities: Build and optimise scalable data pipelines using Databricks and Apache Spark (PySpark). Ensure performance, scalability, and compliance (GxP and other standards). Collaborate on requirements, design, and backlog refinement. Promote engineering best practices including CI/CD, code …
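The extract-validate-transform shape this listing describes can be sketched in miniature. A production version would use Databricks and PySpark; plain Python stands in below so the shape is clear, and the record fields and rules are invented for illustration:

```python
# Illustrative ETL sketch only: field names ("batch_id", "reading") and the
# validation rule are invented; real GxP pipelines use audited frameworks.

def extract(raw_rows):
    """Parse raw CSV-like rows into dicts (the 'E' in ETL)."""
    return [dict(zip(("batch_id", "reading"), row.split(","))) for row in raw_rows]

def validate(records):
    """Split records into valid and rejected, keeping rejects for an audit trail."""
    valid, rejected = [], []
    for rec in records:
        if rec["batch_id"] and rec["reading"].replace(".", "", 1).isdigit():
            valid.append(rec)
        else:
            rejected.append(rec)
    return valid, rejected

def transform(records):
    """Convert readings to floats ready for loading."""
    return [{**rec, "reading": float(rec["reading"])} for rec in records]

raw = ["B001,4.2", "B002,notanumber", "B003,7.0"]
valid, rejected = validate(extract(raw))
clean = transform(valid)
```

Keeping rejected records rather than silently dropping them is what makes a flow like this auditable, which is the point of the compliance requirement in the ad.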
and architecture. Skills & Experience Required: 2-5 years of software development experience. Strong hands-on expertise in Scala (mandatory), plus Python and Java. Experience with Big Data frameworks; Apache Spark experience is an advantage. Solid understanding of software engineering principles, data structures, and algorithms. Strong problem-solving skills and ability to work in an Agile environment. Educational Criteria …
Slough, South East England, United Kingdom | Hybrid / WFH Options
Advanced Resource Managers
warehouse knowledge, Redshift and Snowflake preferred Working with IaC – Terraform and CloudFormation Working understanding of scripting languages including Python and Shell Experience working with streaming technologies including Kafka and Apache Flink Experience working with ETL environments Experience working with the Confluent Cloud platform …
Slough, South East England, United Kingdom | Hybrid / WFH Options
LevelUP HCS
Dev, QA, Production, and DR environments Contribute to SDLC automation using tools such as JIRA, Bamboo, and Ansible Qualifications & Experience Strong proficiency in Java; experience with Spring, Hibernate, and Apache Ignite is a plus Skilled in writing complex SQL queries Familiarity with Fidessa Equities platforms (ETP/CTAC) is advantageous Experience with Unix/Linux command-line and basic …
and a solid understanding of CI/CD pipelines, DevSecOps workflows, and automated policy enforcement tools (e.g., Snyk, GitHub Actions, Jenkins, Sonatype, etc.). Knowledge of software licensing (MIT, Apache, GPL, etc.) and IP risk management. Open Source license-risk engineering and experience building & enforcing technology standards, risk frameworks, & software asset policies. Control the adoption, contribution, and distribution of …
Slough, South East England, United Kingdom | Hybrid / WFH Options
FENESTRA
production-quality code Very deep proficiency in Python and modern software engineering practices Advanced SQL knowledge with experience in data modelling and optimization Extensive experience with ETL frameworks, particularly Apache Airflow (3+ years) Strong understanding of CI/CD, Infrastructure as Code (Terraform), and containerization Experience designing and optimizing big data processing systems, particularly using BigQuery/GCP Leadership …
Slough, South East England, United Kingdom | Hybrid / WFH Options
Peaple Talent
delivered solutions in Google Cloud Platform (GCP) Strong experience designing and delivering data solutions using BigQuery Proficient in SQL and Python Experience working with Big Data technologies such as Apache Spark or PySpark Excellent communication skills, with the ability to engage effectively with senior stakeholders Nice to haves: GCP Data Engineering certifications BigQuery or other GCP tool certifications What …
Slough, South East England, United Kingdom | Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs Set up monitoring and ensure data health for outputs What we are looking for Proficiency in Python, with experience in Apache Spark and PySpark Previous experience with data analytics software Ability to scope new integrations and translate user requirements into technical specifications What’s in it for you? Base salary …
consulting 8+ years leading technical teams in data engineering or analytics Expertise in modern data platforms such as Databricks, Snowflake, GCP, AWS, or Azure Strong understanding of tools like Apache Spark, Kafka, and Kubernetes Deep knowledge of data governance, strategy, and privacy regulations (GDPR, etc.) Strategic mindset with the ability to balance technical depth and business insight Passion for …
and Responsibilities: Develop and maintain high-performance, low-latency Java-based systems for front office trading or pricing platforms. Build reactive systems using Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Utilize multithreading, concurrency models, and Executor Services to optimize system performance and throughput. Write clean, efficient, and maintainable code using functional programming paradigms in Java. Follow and …
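The executor pattern this listing names (Java's ExecutorService: submit work to a pool, collect results) can be sketched with Python's stdlib equivalent. The `price` function and its formula are invented stand-ins:

```python
# Executor-pool sketch: Python's ThreadPoolExecutor plays the role of Java's
# ExecutorService from the listing. The pricing function is a placeholder.
from concurrent.futures import ThreadPoolExecutor

def price(order_id):
    """Stand-in for a per-order pricing computation (formula is invented)."""
    return order_id * 1.01

# map() fans the work out across the pool and preserves input order,
# which matters when downstream consumers expect deterministic output.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(price, range(5)))
```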
Slough, South East England, United Kingdom | Hybrid / WFH Options
Synechron
internet applications Experience working in financial services Must have strong hands-on experience of implementing web applications using React.js, AngularJS, Node.js, TypeScript, HTML5, CSS etc. Experience with NGINX or Apache web server, good practice with URL mapping & URL rewrite Experience of deploying applications on Kubernetes/OpenShift Experience in SCMs like Git and tools like JIRA Agile/Scrum …
Slough, South East England, United Kingdom | Hybrid / WFH Options
Areti Group | B Corp™
and government organisations, delivering real-world innovation powered by data and technology. 🔧 Tech Stack & Skills We're Looking For: Palantir Azure Databricks Microsoft Azure Python Docker & Kubernetes Linux Apache Tools Data Pipelines IoT (Internet of Things) Scrum/Agile Methodologies ✅ Ideal Candidate: Already DV Cleared or at least SC Strong communication skills – comfortable working directly with clients and …
Slough, South East England, United Kingdom | Hybrid / WFH Options
Fruition Group
experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able to translate technical …
assessments and predictive models. Optimize models for performance, scalability, and accuracy. Qualifications: Deep knowledge of neural networks (CNNs, RNNs, LSTMs, Transformers). Strong experience with data tools (Pandas, NumPy, Apache Spark). Solid understanding of NLP algorithms. Experience integrating ML models via RESTful APIs. Familiarity with CI/CD pipelines and deployment automation. Strategic thinking around architecture and trade …
someone who wants to get more hands-on and build up their software experience. Key Skills: Windows support experience; databases/basic SQL; web servers (Apache experience is helpful); remote access tools. Starting salary in the region of £30-45K + benefits. Apply now for immediate consideration and interview this week! Technical Support Analyst …
quickly, and delays of even milliseconds can have big consequences. Essential skills: 3+ years of experience in Python development. 3+ years with open-source real-time data feeds (Amazon Kinesis, Apache Kafka, Apache Pulsar or Redpanda) Exposure to building and managing data pipelines in production. Experience integrating serverless functions (AWS, Azure or GCP). Passion for fintech and building products …
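The real-time feeds this listing names all share the same consume-process-commit loop. The sketch below illustrates that loop with an in-memory list standing in for a real broker; the names are illustrative and not any library's actual API:

```python
# Minimal consume-process-commit sketch of the pattern behind Kafka/Kinesis
# consumers. The 'feed' list stands in for a broker partition; offsets are
# committed only after processing, giving at-least-once semantics.

feed = [{"offset": i, "price": p} for i, p in enumerate([101.5, 101.7, 101.4])]
committed_offset = -1          # last offset we have fully processed
processed = []

for message in feed:
    if message["offset"] <= committed_offset:
        continue               # already handled: safe to skip on replay
    processed.append(message["price"])
    committed_offset = message["offset"]   # commit only after processing
```

Committing before processing would instead give at-most-once delivery, where a crash mid-message loses data; the ordering of those two steps is the whole trade-off.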
Role – Technology Lead/Confluent Consulting Engineer Technology – Apache Kafka, Confluent Platform, Stream Processing Location – UK, Germany, Netherlands, France & Spain Job Description Today, the corporate landscape is dynamic and the world ahead is full of possibilities! None of the amazing things we do at Infosys would be possible without an equally amazing culture, the environment where ideas can flourish … data pipelines and integrations using Kafka and Confluent components. You will collaborate with data engineers, architects, and DevOps teams to deliver robust streaming solutions. Required: • Hands-on experience with Apache Kafka (any distribution: open-source, Confluent, Cloudera, AWS MSK, etc.) • Strong proficiency in Java, Python, or Scala • Solid understanding of event-driven architecture and data streaming patterns • Experience deploying … ecosystem will be given preference: • Experience with Kafka Connect, Kafka Streams, KSQL, Schema Registry, REST Proxy, Confluent Control Center • Hands-on with Confluent Cloud services, including ksqlDB Cloud and Apache Flink • Familiarity with Stream Governance, Data Lineage, Stream Catalog, Audit Logs, RBAC • Confluent certifications (Developer, Administrator, or Flink Developer) • Experience with Confluent Platform, Confluent Cloud managed services, multi-cloud …
a key role in ensuring that all data systems comply with industry regulations and security standards while enabling efficient access for analytics and operational teams. A strong command of Apache NiFi is essential for this role. You will be expected to design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data ingestion, transformation, and delivery. … business needs and compliance requirements. Maintain documentation of data flows and processes, ensuring knowledge sharing and operational transparency. Skills & Experience: You will have the following skills or proven experience: Apache NiFi Expertise: Deep understanding of core NiFi concepts: FlowFiles, Processors, Controller Services, Schedulers, Web UI. Experience designing and optimizing data flows for batch, real-time streaming, and event-driven …
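The NiFi concepts the listing names (FlowFiles carrying content and attributes through a chain of Processors) can be modelled in a few lines of plain Python. This is a mental model of the flow-based idea, not NiFi's actual API:

```python
# Toy flow-based model of NiFi's core ideas: a FlowFile is content plus
# attributes, and Processors transform or route FlowFiles in a chain.
# Processor names loosely echo NiFi's, but this is not NiFi code.

class FlowFile:
    def __init__(self, content, attributes=None):
        self.content = content
        self.attributes = attributes or {}

def route_on_attribute(flowfile):
    """Like NiFi's RouteOnAttribute: tag the FlowFile based on a check."""
    flowfile.attributes["route"] = "valid" if flowfile.content.strip() else "empty"
    return flowfile

def update_content(flowfile):
    """Like a transform processor: normalise the content in place."""
    flowfile.content = flowfile.content.strip().upper()
    return flowfile

def run_flow(flowfiles, processors):
    """Push every FlowFile through each processor in order."""
    for proc in processors:
        flowfiles = [proc(ff) for ff in flowfiles]
    return flowfiles

result = run_flow([FlowFile(" order-1 "), FlowFile("  ")],
                  [route_on_attribute, update_content])
```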
Role – Technology Architect/Confluent Solution Architect Technology – Apache Kafka, Confluent Platform, Stream Processing Location – UK, Germany, Netherlands, France & Spain Job Description Today, the corporate landscape is dynamic and the world ahead is full of possibilities! None of the amazing things we do at Infosys would be possible without an equally amazing culture, the environment where ideas can flourish … these values are upheld only because of our people. Your role As a Confluent Solution Architect, you will lead the design and architecture of enterprise-grade streaming solutions using Apache Kafka and the Confluent Platform. You will work closely with clients to understand business requirements, define integration strategies, and guide implementation teams in building scalable, secure, and resilient data … streaming ecosystems. Strongly Preferred: • Experience in designing and architecting solutions using Apache Kafka, with hands-on experience in Confluent Kafka • Ability to lead client engagements, translate business requirements into technical solutions, and guide implementation teams • Deep understanding of Kafka internals, KRaft architecture, and Confluent components • Experience with Confluent Cloud, Stream Governance, Data Lineage, and RBAC • Expertise in stream processing …
data modeling (star schema, snowflake schema). Version Control Practical experience with Git (branching, merging, pull requests). Preferred Qualifications (A Plus) Experience with a distributed computing framework like Apache Spark (using PySpark). Familiarity with cloud data services (AWS S3/Redshift, Azure Data Lake/Synapse, or Google BigQuery/Cloud Storage). Exposure to workflow orchestration tools (Apache Airflow, Prefect, or Dagster). Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. …
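The star schema the listing mentions (a central fact table joined to dimension tables) is easy to demonstrate with stdlib sqlite3. All table and column names below are invented for illustration:

```python
# Tiny star-schema sketch: one fact table (sales) joined to one dimension
# (products). Schema and data are invented; a warehouse would have several
# dimensions radiating from the fact table, hence the "star".
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales VALUES (1, 1, 9.99), (2, 1, 5.00), (3, 2, 20.00);
""")

# The canonical star-schema query: join fact to dimension, then aggregate.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
```

The fact table stays narrow (keys and measures) while descriptive text lives in the dimensions, which is what keeps these joins cheap at warehouse scale.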
Slough, South East England, United Kingdom | Hybrid / WFH Options
Acquired Talent Ltd
Data Engineer/PostgreSQL/SQL/Data Pipelines/Apache Superset/PowerBI/Tableau/Terraform Data Engineer (Outside IR35 Contract role) Determination: Outside IR35 Day Rate: Up to £575 per day Location: Hybrid Zone 1 Duration: 3 months (initial) Job Title: Data Engineer About the role: We're on the lookout for an experienced Data Engineer … for good space. You'll be involved in the full end-to-end process, building data pipelines and dashboards. Data Engineer/PostgreSQL/SQL/Data Pipelines/Apache Superset/PowerBI/Tableau/Terraform Requirements: 5+ years' experience with PostgreSQL, SQL & Terraform Demonstrable experience with building data pipelines from scratch 3+ years' Dashboarding/Building Dashboards, (Apache … an experienced data engineer with experience building data pipelines please apply, or send your CV directly to callum@acquiredtalent.co.uk Data Engineer/PostgreSQL/SQL/Data Pipelines/Apache Superset/PowerBI/Tableau/Terraform …