impact at Tenzo. This role is pivotal in shaping how our product integrates and interacts with external systems, partners, and platforms. Our tech stack: Apache Airflow, Python, Django, AWS (S3, RDS with PostgreSQL, ElastiCache, MSK, EC2, ECS, Fargate, Lambda, etc.), Snowflake, Terraform, CircleCI. Your mission: design and develop data … relational databases such as PostgreSQL, including database administration, tuning, and optimisation (highly desirable). Experience with data pipeline and workflow management tools such as Apache Airflow (nice to have). Proficiency in Git (important). Ability and eagerness to write high-quality code, technical documentation, architecture diagrams, and …
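The Airflow work described in listings like the one above centres on expressing pipelines as DAGs of dependent tasks. As a self-contained sketch of that idea — using the standard library's `graphlib` rather than Airflow itself, with hypothetical task names and data — a dependency-ordered extract/transform/load run might look like:

```python
from graphlib import TopologicalSorter

# Hypothetical three-step pipeline; the steps and data are illustrative,
# not taken from any listing above.
def extract():
    return [{"store": "A", "sales": 120}, {"store": "B", "sales": 95}]

def transform(rows):
    return {r["store"]: r["sales"] for r in rows}

def load(totals):
    return f"loaded {len(totals)} rows"

# Edges mirror an Airflow DAG: load depends on transform, transform on extract.
graph = {"transform": {"extract"}, "load": {"transform"}}

def run_pipeline():
    results = {}
    # static_order() yields tasks so every dependency runs before its dependents.
    for task in TopologicalSorter(graph).static_order():
        if task == "extract":
            results["extract"] = extract()
        elif task == "transform":
            results["transform"] = transform(results["extract"])
        elif task == "load":
            results["load"] = load(results["transform"])
    return results

print(run_pipeline()["load"])  # prints "loaded 2 rows"
```

In Airflow proper the same shape would be a `DAG` with three tasks wired via `extract >> transform >> load`; the topological ordering computed here is what the Airflow scheduler does for you.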
with data modelling. General understanding of data architectures and event-driven architectures. Proficient in SQL. Familiarity with one scripting language, preferably Python. Experience with Apache Airflow & Apache Spark. Solid understanding of cloud data services: AWS services such as S3, Athena, EC2, Redshift, EMR (Elastic MapReduce), EKS, RDS …
automate data ingestion, transformation, and orchestration across systems. Own data operations infrastructure: manage and optimise key data infrastructure components within AWS, including Amazon Redshift, Apache Airflow for workflow orchestration, and other analytical tools. You will be responsible for ensuring the performance, reliability, and scalability of these systems to … AWS data services. Strong proficiency in DataOps methodologies and tools, including experience with CI/CD pipelines, containerized applications, and workflow orchestration using Apache Airflow. Familiarity with ETL frameworks, and bonus experience with Big Data processing (Spark, Hive, Trino) and data streaming. Proven track record - you …
London, England, United Kingdom Hybrid / WFH Options
Biprocsi Ltd
automation to ensure successful project delivery, adhering to client timelines and quality standards. Implement and manage real-time and batch data processing frameworks (e.g., Apache Kafka, Apache Spark, Google Cloud Dataproc) in line with project needs. Build and maintain robust monitoring, logging, and alerting systems for client projects … in languages like Python, Bash, or Go to automate tasks and build necessary tools. Expertise in designing and optimising data pipelines using frameworks like Apache Airflow or equivalent. Demonstrated experience with real-time and batch data processing frameworks, including Apache Kafka, Apache Spark, or Google Cloud …
Worcester, England, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka to … Expertise: Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka. Data Security Knowledge …
Brynmawr, Wales, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka to … Expertise: Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka. Data Security Knowledge …
Great Malvern, England, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka to … Expertise: Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka. Data Security Knowledge …
Gloucester, England, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka to … Expertise: Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka. Data Security Knowledge …
Ebbw Vale, Wales, United Kingdom Hybrid / WFH Options
Methods
Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments. Workflow Automation: Employ tools such as Apache Airflow to automate data flows and manage complex workflows within hybrid environments. Event Streaming Experience: Utilise event-driven technologies such as Kafka to … Expertise: Solid experience with Docker and Kubernetes in managing applications across both on-premises and cloud platforms. Proficiency in Workflow Automation Tools: Practical experience with Apache Airflow in hybrid data environments. Experience in Event Streaming: Proven ability in managing and deploying event streaming platforms like Kafka. Data Security Knowledge …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
early and providing strategic guidance. Support the ongoing development of integration strategies involving Managed File Transfer solutions (e.g., GoAnywhere) and data orchestration platforms (e.g., Apache Airflow). Provide hands-on support and detailed guidance on particularly complex integration designs where necessary. Maintain current knowledge of industry trends, technology … and migrations. Familiarity with the IBM Maximo asset management platform. Knowledge and experience with Managed File Transfer solutions (e.g., GoAnywhere). Understanding and experience with the Apache Airflow orchestration platform. Strong grasp of integration best practices, security considerations, and data flow management. Ability to work collaboratively across distributed teams and …
London, England, United Kingdom Hybrid / WFH Options
Apollo Solutions
manipulation and analysis, with the ability to build, maintain, and deploy sequences of automated processes. Bonus Experience (Nice to Have): Familiarity with dbt, Fivetran, Apache Airflow, Data Mesh, Data Vault 2.0, Fabric, and Apache Spark. Experience working with streaming technologies such as Apache Kafka, Apache …
be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a … impact, we encourage you to apply! Job Responsibilities: Data Engineering & Data Pipeline Development: Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow. Implement real-time and batch data processing using Spark. Enforce best practices for data quality, governance, and security throughout the data lifecycle. Ensure data …/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support …
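The batch-processing and data-quality responsibilities in the listing above are the kind of logic a PySpark job would express with `DataFrame.filter` and `groupBy().agg()`. As a hedged, pure-Python stand-in (no Spark cluster required; the column names and the 50% quality threshold are hypothetical), the same shape looks like:

```python
# Pure-Python stand-in for a PySpark batch job: drop invalid rows, aggregate
# by key, and enforce a simple data-quality gate before "publishing" results.
def clean_and_aggregate(rows):
    valid = [r for r in rows if r.get("amount") is not None and r["amount"] >= 0]
    dropped = len(rows) - len(valid)
    # Data-quality gate: fail the batch if more than half the rows are bad,
    # rather than silently loading a mostly-empty result downstream.
    if rows and dropped / len(rows) > 0.5:
        raise ValueError(f"data-quality check failed: {dropped}/{len(rows)} rows dropped")
    totals = {}
    for r in valid:
        totals[r["region"]] = totals.get(r["region"], 0) + r["amount"]
    return totals

batch = [
    {"region": "uk", "amount": 10},
    {"region": "uk", "amount": 5},
    {"region": "de", "amount": None},  # dropped by the quality filter
    {"region": "de", "amount": 7},
]
print(clean_and_aggregate(batch))  # {'uk': 15, 'de': 7}
```

Failing the whole batch on a quality threshold, instead of dropping rows silently, is one common way such pipelines "enforce best practices for data quality" in practice.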
London, England, United Kingdom Hybrid / WFH Options
Intercom
by designing and building the next generation of the stack. Develop, run and support our batch and real-time data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake, Tableau, all in AWS. Collaborate with product managers, data engineers, analysts and data scientists to develop tooling and infrastructure to support … team has delivered: refactoring of our MySQL ingestion pipeline for reduced latency and 10x scalability; Redshift -> Snowflake migration; a unified local analytics development environment for Airflow and dbt; building our next-generation company metrics framework, adding anomaly detection and alerting, and enabling easier discovery and consumption. About you: You have … your direct technical contributions on a project. You care about your craft. In addition, it would be a bonus if you have worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows. A good understanding of the quirks of operating …
London, England, United Kingdom Hybrid / WFH Options
The Adaptavist Group
to communicate technical solutions/issues to technical and non-technical staff and stakeholders. Experience with streaming technologies. Kubernetes experience. Kafka experience. Experience with Apache Airflow and Apache Flink. Not ticking every box? That’s totally okay! Studies show that women and people of colour might hesitate …
London, England, United Kingdom Hybrid / WFH Options
SBS
modelling, design, and integration expertise. Data Mesh Architectures: In-depth understanding of data mesh architectures. Technical Proficiency: Proficient in dbt, SQL, Python/Java, Apache Spark, Trino, Apache Airflow, and Astro. Cloud Technologies: Awareness and experience with cloud technologies, particularly AWS. Analytical Skills: Excellent problem-solving and …
or Angular good but not necessary) Agile. The following is DESIRABLE, not essential: AWS or GCP; buy-side data tools such as Glue, Athena, Airflow, Ignite, dbt, Arrow, Iceberg, Dremio; Fixed Income performance, risk or attribution; TypeScript and Node. Role: Python Developer (Fixed Income) … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, dbt, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech …
architectures, such as lakehouse. Experience with CI/CD pipelines and version control systems like Git. Knowledge of ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Knowledge of data governance and best practices in data management. Familiarity with cloud platforms and services such as AWS … solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying). Apache Spark (for distributed data processing). Apache Spark Streaming, Kafka or similar (for real-time data streaming). Experience using data tools in at least …
Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with ETL tools and technologies such as Apache Airflow, Informatica, or Talend. Strong understanding of data governance and best practices in data management. Experience with cloud platforms and services such as … solutions. Strong problem-solving and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying). Apache Spark (for distributed data processing). Apache Spark Streaming, Kafka or similar (for real-time data streaming). Experience using data tools in at least …
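Several listings above ask for real-time streaming experience (Spark Streaming, Kafka). The core idea behind windowed stream aggregation can be sketched without either engine; the events and the 10-second tumbling window below are hypothetical, and real engines additionally handle late data and watermarks:

```python
from collections import defaultdict

# Toy tumbling-window count, the building block of Spark Structured Streaming
# and Kafka Streams windowed aggregations. Each event is (epoch seconds, key);
# events are bucketed into fixed, non-overlapping windows.
def tumbling_window_counts(events, window_seconds=10):
    windows = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        windows[(window_start, key)] += 1
    return dict(windows)

events = [(100, "click"), (103, "click"), (109, "view"), (112, "click")]
print(tumbling_window_counts(events))
# {(100, 'click'): 2, (100, 'view'): 1, (110, 'click'): 1}
```

In Spark Structured Streaming the equivalent would be `groupBy(window("ts", "10 seconds"), "key").count()` running continuously over an unbounded stream rather than a finished list.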
London, England, United Kingdom Hybrid / WFH Options
Cboe
Familiarity with PostgreSQL and Snowflake. Preferred: Familiarity with web frameworks such as Django, Flask or FastAPI. Preferred: Familiarity with event streaming platforms such as Apache Kafka. Preferred: Familiarity with data pipeline platforms such as Apache Airflow. Preferred: Familiarity with Java. Preferred: Experience in one or more relevant …
practices include OWASP guidelines/top 10, SOC 2, and NCSC cloud security principles. Experience in data and orchestration tools including some of dbt, Apache Airflow, Azure Data Factory. Experience in programming languages including some of Python, TypeScript, JavaScript, R, Java, C#, producing services, APIs, Function Apps or …
or similar roles. Hands-on expertise with Python (NumPy/Pandas) and SQL. Proven experience designing and building robust ETL/ELT pipelines (dbt, Airflow). Strong knowledge of data pipelining, schema design, and cloud platforms (e.g., Snowflake, AWS). Excellent communication skills and the ability to translate technical …
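The ELT pattern mentioned above (dbt-style) loads raw data into the warehouse first and then transforms it with SQL models inside the warehouse. A minimal, self-contained sketch of that flow using the standard library's `sqlite3` as a stand-in warehouse — table and column names are hypothetical:

```python
import sqlite3

# Stand-in "warehouse": in-memory SQLite instead of Snowflake/Redshift.
conn = sqlite3.connect(":memory:")

# E + L: land the raw rows first, untransformed.
conn.execute("CREATE TABLE raw_orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "acme", 50.0), (2, "acme", 25.0), (3, "globex", 40.0)],
)

# T: the transform is a SQL model materialised inside the warehouse,
# which is exactly what a dbt model compiles down to.
conn.execute("""
    CREATE TABLE customer_totals AS
    SELECT customer, SUM(amount) AS total
    FROM raw_orders
    GROUP BY customer
    ORDER BY customer
""")
print(conn.execute("SELECT * FROM customer_totals").fetchall())
# [('acme', 75.0), ('globex', 40.0)]
```

In a production setup, an orchestrator such as Airflow would schedule the load step and then trigger the dbt run that builds `customer_totals` and its downstream models.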
London, England, United Kingdom Hybrid / WFH Options
Cloudera
Data Engineering product area. This next-generation cloud-native service empowers customers to run large-scale data engineering workflows—using industry-standard tools like Apache Spark and Apache Airflow—with just a few clicks, across both on-premises and public cloud environments. You'll play a critical … lead their own teams across multiple time zones. Oversee a global team, many of whom are active contributors to open-source communities like the Apache Software Foundation. Own both technical direction and people management within the team. Ensure consistent, high-quality software delivery through iterative releases. Hire, manage, coach …
/PyTorch, or similar). Experience validating models with historical data and communicating results to non-specialists. Exposure to real-time data engineering (Kafka, Airflow, dbt). Track record of turning research code into production services (CI/CD, containers, etc.). Strong SQL and data-management skills; experience querying large analytical …