City of London, London, United Kingdom Hybrid / WFH Options
Rise Technical Recruitment Limited
a trusted partner across a wide range of businesses. In this role you'll take ownership of the reliability and performance of large-scale data pipelines built on AWS, Apache Flink, Kafka, and Python. You'll play a key role in diagnosing incidents, optimising system behaviour, and ensuring reporting data is delivered on time and without failure. The ideal … candidate will have strong experience working with streaming and batch data systems, a solid understanding of monitoring and observability, and hands-on experience working with AWS, Apache Flink, Kafka, and Python. This is a fantastic opportunity to step into an SRE role focused on data reliability in a modern cloud-native environment, with full ownership of incident management … with various other departments and teams to architect scalable, fault-tolerant data solutions. The Person: *Experience in a data-focused SRE, Data Platform, or DevOps role *Strong knowledge of Apache Flink, Kafka, and Python in production environments *Hands-on experience with AWS (Lambda, EMR, Step Functions, Redshift, etc.) *Comfortable with monitoring tools, distributed systems debugging, and incident response …
Huntingdon, Cambridgeshire, East Anglia, United Kingdom Hybrid / WFH Options
Leidos Innovations UK Limited
built features Integrate and extend Codice Alliance tools with the existing catalogue Build secure and modular services for ingesting, indexing, and querying geospatial and imagery data Work with OSGi, Apache Karaf, and other modular Java platforms Ensure compliance with data security, access control, and audit requirements Create Design and Build documentation derived from customer requirements. Required Experience: Strong Java … development experience, especially in modular or distributed systems Familiarity with OSGi, Apache Karaf, and the DDF architecture Experience with REST APIs, secure data handling, and geospatial data formats Experience with build tools (Maven), version control (Git), and CI/CD pipelines Experience with geospatial standards (OGC, GeoJSON, WKT, etc.) Knowledge of Elasticsearch, Solr, or other search indexing tools Familiarity …
and Data Pipeline Integration Collaborate with data scientists and engineers to support the operationalization of machine learning models. Possess a strong understanding of data pipeline orchestration tools such as Apache Airflow and their integration with cloud infrastructure. Assist in the deployment and monitoring of self-hosted LLMs. Cross-Functional Collaboration: Work closely with stakeholders, including data scientists, software engineers … container orchestration in a production environment. Experience with building and managing CI/CD pipelines (e.g., Jenkins, GitHub Actions, Azure DevOps). Familiarity with MLOps practices and tools including Apache Airflow is a significant plus. Technical Skills: Proficiency in programming languages such as Bash and Python Strong understanding of version control systems (e.g., Git). Solid knowledge of networking …
Guildford, Surrey, United Kingdom Hybrid / WFH Options
BAE Systems (New)
to incrementally deliver working software. You will work well within small teams, taking ownership of and delivering high-quality software - you should take pride in the software you produce. Apache Maven Desirable Skills and Experience Industry experience of any of the following technologies/skills would be beneficial: Elasticsearch Docker Apache Hadoop, Kafka or Camel. JavaScript Knowledge of both … Windows and Linux operating systems. Kubernetes NiFi NSQ Apache Ignite Arango Why BAE Systems? This is a place where you'll be able to make a real difference. You'll be part of an inclusive culture that values diversity of thought, rewards integrity and merit, and where you'll be empowered to fulfil your potential. We welcome people from …
Frimley, Surrey, United Kingdom Hybrid / WFH Options
BAE Systems (New)
to incrementally deliver working software. You will work well within small teams, taking ownership of and delivering high-quality software - you should take pride in the software you produce. Apache Maven Desirable Skills and Experience Industry experience of any of the following technologies/skills would be beneficial: Elasticsearch Docker Apache Hadoop, Kafka or Camel. JavaScript Knowledge of both … Windows and Linux operating systems. Kubernetes NiFi NSQ Apache Ignite Arango Why BAE Systems? This is a place where you'll be able to make a real difference. You'll be part of an inclusive culture that values diversity of thought, rewards integrity and merit, and where you'll be empowered to fulfil your potential. We welcome people from …
Astronomer empowers data teams to bring mission-critical software, analytics, and AI to life and is the company behind Astro, the industry-leading unified DataOps platform powered by Apache Airflow. Astro accelerates building reliable data products that unlock insights, unleash AI value, and power data-driven applications. Trusted by more than 700 of the world's leading enterprises, Astronomer … our product's evolution through client feedback. This role is ideal for someone who wants to make a visible impact while growing into an expert in workflow orchestration and Apache Airflow. This is a hybrid role requiring a minimum of 3 days per week onsite, and includes up to 40% travel for business and customer needs. What you get … production. Be a Trusted Advisor: Conduct demos and provide technical guidance to engineering teams, showing them how our platform can transform their workflows. Drive Community Impact: Contribute to the Apache Airflow community by creating technical content and best practices, positioning Astronomer as a thought leader in workflow orchestration. Influence Product Direction: Act as a liaison by gathering field insights …
West Midlands, United Kingdom Hybrid / WFH Options
Experis
data pipelines within enterprise-grade on-prem systems. Key Responsibilities: Design, develop, and maintain data pipelines using Hadoop technologies in an on-premises infrastructure. Build and optimise workflows using Apache Airflow and Spark Streaming for real-time data processing. Develop robust data engineering solutions using Python for automation and transformation. Collaborate with infrastructure and analytics teams to support operational … platform. Ensure compliance with enterprise security and data governance standards. Required Skills & Experience: Minimum 5 years of experience in Hadoop and data engineering. Strong hands-on experience with Python, Apache Airflow, and Spark Streaming. Deep understanding of Hadoop components (HDFS, Hive, HBase, YARN) in on-prem environments. Exposure to data analytics, preferably involving infrastructure or operational data. Experience working …
City of London, London, United Kingdom Hybrid / WFH Options
QiH Group
unsupervised, and reinforcement learning methods. Experience with GCP services such as Vertex AI, BigQuery ML, Dataflow, AI Platform Pipelines, and Dataproc. Solid knowledge of distributed systems, data streaming (e.g., Apache Beam, Kafka), and large-scale data processing. ML Ops: Hands-on experience with continuous integration/deployment (CI/CD) for ML, model versioning, and monitoring. Business Acumen: Ability … to understand marketing and advertising concepts like customer lifetime value (CLV), attribution modeling, real-time bidding (RTB), and audience targeting. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English skills. Last but not least, you …
/or teaching technical concepts to non-technical and technical audiences alike Passion for collaboration, life-long learning, and driving business value through ML [Preferred] Experience working with Databricks & Apache Spark to process large-scale distributed datasets About Databricks Databricks is the data and AI company. More than 10,000 organizations worldwide including Comcast, Condé Nast, Grammarly, and over … Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet …
/or teaching technical concepts to non-technical and technical audiences alike Passion for collaboration, life-long learning, and driving business value through ML Preferred Experience working with Databricks & Apache Spark to process large-scale distributed datasets As this is a client-facing role, travel may be necessary to support meetings and engagements. About Databricks Databricks is the data and AI … Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet …
in either Python or Scala Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one Deep experience with distributed computing using Apache Spark and knowledge of Spark runtime internals Familiarity with CI/CD for production deployments Working knowledge of MLOps Design and deployment of performant end-to-end data architectures … Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet …
company covering the entire data transformation from architecture to implementation. Beyond delivering solutions, we also provide data & AI training and enablement. We are backed by Databricks - the creators of Apache Spark, and act as a delivery partner and training provider for them in Europe. Additionally, we are Microsoft Gold Partners in delivering cloud migration and data architecture on Azure. …
mission-driven company dedicated to freeing data from data platform lock-in. We deliver the industry's most interoperable data lakehouse through a cloud-native managed service built on Apache Hudi. Onehouse enables organizations to ingest data at scale with minute-level freshness, centrally store it, and make it available to any downstream query engine and use case (from traditional … technology and product landscape to building the sales strategy around a large, successful open source project. You will be challenged to deeply understand data architecture and the key role Apache Hudi plays in some of the biggest enterprises in the world, then articulate the value proposition of the Onehouse managed service to potential customers. You will set up best …
guidance to cross-functional teams, ensuring best practices in data architecture, security and cloud computing Proficiency in data modelling, ETL processes, data warehousing, distributed systems and metadata systems Utilise Apache Flink and other streaming technologies to build real-time data processing systems that handle large-scale, high-throughput data Ensure all data solutions comply with industry standards and government … but not limited to EC2, S3, RDS, Lambda and Redshift. Experience with other cloud providers (e.g., Azure, GCP) is a plus In-depth knowledge and hands-on experience with Apache Flink for real-time data processing Proven experience in mentoring and managing teams, with a focus on developing talent and fostering a collaborative work environment Strong ability to engage …