performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and … or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar … big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus.
quickly and apply new skills Desirable Solid understanding of microservices development SQL and NoSQL databases working set Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, KeyCloak, Serverless Computing, GraphQL, APIs, APIM Good skills working with JSON, XML, YAML files Knowledge in …
Chelmsford, East Anglia, United Kingdom Hybrid / WFH Options
EMBL-EBI
services, including microservice deployment Developing and maintaining data pipelines to process and manage large-scale structural biology data Working with workflow orchestration tools like Apache Airflow and Nextflow To liaise with the EMBL-EBI Technical Services Cluster To support the development and deployment of other scientific software The … Proficiency in containerisation Proficiency in relational databases (Oracle, PostgreSQL) Experience in developing and maintaining data pipelines Hands-on experience with workflow orchestration tools (e.g., Apache Airflow, Nextflow) Strong interpersonal and communication skills Proficiency in oral and written English You may also have PhD in computer science, IT or … a related field, or in bioinformatics with a demonstrated IT expertise Experience in using Kubernetes Experience with web servers (Apache/Nginx) Hands-on experience with CI/CD (GitLab CI/GitHub Actions) Familiarity with networking Familiarity with Java Knowledge of, or affinity with, structural biology and bioinformatics …
Science, Math, or Financial Engineering degree Strong knowledge in other programming language(s) - e.g., JavaScript, TypeScript, Kotlin Strong knowledge of data orchestration technologies - e.g., Apache Airflow, Dagster, AWS Step Functions Understanding of ETL/ELT workflows, data modeling, and performance optimization for both batch and real-time processing. …
or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side Data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer/Programmer, Python, Fixed Income) … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio This is an environment that has been described as the only corporate environment with a start-up/fintech …
the big 3 cloud ML stacks (AWS, Azure, GCP). Hands-on experience with open-source ETL and data pipeline orchestration tools such as Apache Airflow and NiFi. Experience with large scale/Big Data technologies, such as Hadoop, Spark, Hive, Impala, PrestoDb, Kafka. Experience with workflow orchestration … tools like Apache Airflow. Experience with containerisation using Docker and deployment on Kubernetes. Experience with NoSQL and graph databases. Unix server administration and shell scripting experience. Experience in building scalable data pipelines for highly unstructured data. Experience in building DWH and data lakes architectures. Experience in working in cross …
AWS preferred) Solid understanding of libraries like Pandas and NumPy Experience in data warehousing tools like Snowflake, Databricks, BigQuery Familiar with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools Commercial experience with performant database programming in SQL Capability to solve complex technical issues, comprehending risks prior to …
of the open-source libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log …
diagram of proposed tables to enable discussion. Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users. Worked on Apache Airflow before to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing, and sprints. More information: Enjoy fantastic perks …
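"Creating DAGs" in an Airflow-style orchestrator boils down to declaring tasks plus the dependency edges between them, then executing tasks in topological order. A minimal stdlib-only sketch of that idea (task names are hypothetical, and this stands in for, rather than uses, the real Airflow API):

```python
# Toy illustration of the DAG model behind orchestrators like Apache Airflow:
# tasks plus upstream dependencies, run in topological order.
# Stdlib sketch only; real Airflow DAGs use its DAG/operator classes.
from graphlib import TopologicalSorter

# task name -> set of upstream tasks it depends on (hypothetical pipeline)
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run(dag: dict[str, set[str]]) -> list[str]:
    """Run tasks in dependency order, returning the execution sequence."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        pass  # a real orchestrator would invoke the task's operator here
    return order

print(run(dag))  # linear chain, so the order is fully determined
```

With this linear chain there is exactly one valid order; a real scheduler would also run independent branches concurrently.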
Proficiency in version control tools like Git ensures effective collaboration and management of code and data models. Experience with workflow automation tools, such as Apache Airflow, is crucial for streamlining and orchestrating complex data processes. Skilled at integrating data from diverse sources, including APIs, databases, and third-party …
Science or equivalent Experience in developing Finance or HR related applications Working experience with Tableau Working experience with Terraform Experience in creating workflows for Apache Airflow and Jenkins Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our …
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience in working with process scheduling platforms like Apache Airflow. Open to working in GS proprietary technology like Slang/SECDB An understanding of compute resources and the ability to interpret performance metrics …
Intelligence, Statistical & Data Analysis, Computational Algorithms, Data Engineering, etc. Experience working with a variety of complex, large datasets. Experience building automated pipelines (e.g., Jenkins, Airflow, etc.). Experience building or understanding end-to-end, distributed, and high-performance software infrastructures. Proven ability to work collaboratively as part of a …
across the company. Role requirements 4+ years of experience You have an understanding of developing ETL pipelines using Python frameworks such as Luigi or Airflow; You have experience with the development of Python-based REST APIs/services and their integration with databases (e.g. Postgres); You are familiar with …
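The ETL pattern that frameworks such as Luigi or Airflow schedule can be sketched in plain Python as three composed steps. Everything here is illustrative: the CSV input, field names, and the in-memory "sink" standing in for a database like Postgres are all assumptions, not a real pipeline.

```python
# Minimal extract -> transform -> load sketch of the kind of pipeline an
# orchestration framework would schedule. All names and data are illustrative;
# a real pipeline would read from and write to real stores.
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalise types and drop rows with a missing amount."""
    out = []
    for r in rows:
        if r["amount"]:  # skip empty-amount rows
            out.append({"id": int(r["id"]), "amount": float(r["amount"])})
    return out

def load(rows: list[dict], sink: list) -> int:
    """'Load' into an in-memory sink standing in for e.g. a Postgres table."""
    sink.extend(rows)
    return len(rows)

sink: list = []
raw = "id,amount\n1,9.5\n2,\n3,4.0\n"
loaded = load(transform(extract(raw)), sink)
print(loaded)  # 2 (the empty-amount row is dropped)
```

A framework adds what this sketch lacks: scheduling, retries, and persistence of intermediate task state between runs.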
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, DBT, AWS) Enterprise-scale tech firm Up to £70,000 plus benefits - FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and …
MPP databases such as Redshift, BigQuery, or Snowflake, and modern transformation/query engines like Spark, Flink, Trino. Familiarity with workflow management tools (e.g., Airflow) and/or dbt for transformations. Comprehensive understanding of modern data platforms, including data governance and observability. Experience with cloud platforms (AWS, GCP, Azure …)
as Redshift, Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding of data modelling, access controls, and infrastructure performance tuning.
navigate client relationships and translate technical insights into business value. Experience with cloud platforms (e.g., Snowflake, AWS) and ETL/ELT pipeline tools like Airflow/dbt. Benefits £6,000 per annum training & conference budget to help you up-skill and elevate your career Pension contribution scheme (up to …)
architectures and CAP theorem. A good understanding of functional paradigms and type theory. Confident JVM knowledge. Modern Java, Ruby, or Clojure knowledge. Experience with Airflow or other Python-based workflow orchestration tools. Exposure to Kubernetes, Docker, Linux, Kafka, RabbitMQ, or git. Knowledge of financial concepts, exchange trading, or physical …