… and optimising data warehouses and ELT pipelines. Solid experience across cloud platforms – ideally AWS, Snowflake, or Databricks. Comfortable working with automation/integration tools (Airflow, Fivetran, Astronomer). Hands-on with Terraform, Docker, Kubernetes, and modern CI/CD tools (GitHub Actions, Jenkins, CircleCI, etc.). Experience with real-time pipelines …
… with tools and packages like Pandas, NumPy, scikit-learn, Plotly/Matplotlib, and Jupyter Notebooks. Knowledge of ML-adjacent technologies, including AWS SageMaker and Apache Airflow. Proficiency in data pre-processing, data wrangling, and augmentation techniques. Experience with cloud platforms (e.g. AWS, Google Cloud, or Azure) for deploying scalable …
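A role like this mostly comes down to pre-processing and wrangling before any modelling. As a minimal sketch of that workflow, assuming a hypothetical CSV and column names, the standard scikit-learn ColumnTransformer/Pipeline pattern looks like this:

```python
# Minimal pre-processing sketch with Pandas and scikit-learn.
# The file name and column names are hypothetical.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.read_csv("events.csv")  # hypothetical input file

numeric = ["age", "spend"]           # assumed numeric columns
categorical = ["country", "plan"]    # assumed categorical columns

preprocess = ColumnTransformer([
    ("num", Pipeline([
        ("impute", SimpleImputer(strategy="median")),  # fill missing values
        ("scale", StandardScaler()),                   # zero mean, unit variance
    ]), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])

X = preprocess.fit_transform(df[numeric + categorical])  # matrix ready for a model
```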
… Python skills (NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms. Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift). Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines. Experience with cloud …
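In practice, the dbt-plus-Airflow combination named above usually means an Airflow DAG that schedules and gates dbt runs. A minimal sketch follows, with a hypothetical DAG id and project directory; the operators themselves are standard Airflow 2:

```python
# Airflow DAG that builds and then tests dbt models on a daily schedule.
# dag_id and the dbt project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    run_models >> test_models  # tests run only after models build
```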
… solutions using API- and microservice-based architecture. Deep understanding of ETL/ELT architecture, streaming, and event-driven processing; familiarity with tools like dbt, Airflow, Kafka, or equivalents. Familiarity with mid-sized firm tech stacks, especially in financial services, including systems such as NetSuite, Salesforce, and Addepar. Experience with Atlassian …
… the systems that require the highest data throughput in Java. We implement most of our long-running services and analytics in C#. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, ELK for logs, Grafana, Prometheus & InfluxDB for metrics, Docker …
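This listing's pipelines are built in Java and C#, but the "Kafka for data pipelines" pattern it describes translates directly. As an illustrative sketch in Python with the kafka-python client, where the broker address, topic, and payload are all hypothetical:

```python
# Minimal Kafka producer sketch: publish JSON events to a topic that
# downstream consumers (analytics, metrics) subscribe to independently.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Hypothetical trade event.
producer.send("trades", {"symbol": "VOD.L", "qty": 100, "px": 72.41})
producer.flush()  # block until the broker acknowledges the batch
```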
… fixes and code refactoring. Leverage the latest data technologies and programming languages, including Python, Scala, and Java, along with systems like Spark, Kafka, and Airflow, within cloud services such as AWS. Ensure the ongoing maintenance, troubleshooting, optimization, and reliability of data systems, including timely resolution of unexpected issues. Stay … NoSQL databases (e.g., PostgreSQL, MongoDB) and data modeling principles. Proven ability to design, build, and maintain scalable data pipelines and workflows using tools like Apache Airflow or similar. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Nice to have: hands-on experience with data warehouse … and lakehouse architectures (e.g., Databricks, Snowflake, or similar). Experience with big data frameworks (e.g., Apache Spark, Hadoop) and cloud platforms (e.g., AWS, Azure, or GCP).
… processes. Technical Skills. Programming: proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: hands-on experience with big data tools (e.g. Databricks, Apache Spark, Hadoop). Cloud Platforms: familiarity with AWS, Azure, GCP, or other cloud ecosystems for data engineering tasks. Expertise in relational databases (e.g. PostgreSQL … SQL Server). Data Integration Tools: knowledge of platforms like Airflow, Apache NiFi, or Talend. Data Storage and Modelling: experience with data warehousing tools (e.g. Snowflake, Redshift, BigQuery) and schema design. Version Control and CI/CD: familiarity with Git, Docker, and CI/CD pipelines for deployment. Experience …
… challenge. Desirable Experience: proficiency in at least one of Python, Go, Java, Ruby. Working knowledge of Kubernetes. Exposure to ETL systems at scale (e.g. Apache Airflow, Argo Workflows). Exposure to streaming data platforms (e.g. Apache Kafka, RabbitMQ). Working knowledge of networking fundamentals. Comfort deploying software to the …
… properties, simulation outputs, or imaging datasets. Proficiency in Python (including Pandas or PySpark) and SQL, with exposure to ETL/orchestration tools such as Airflow or dbt. Strong knowledge of cloud-native services on AWS (e.g., S3, Glue, Lambda, Athena) and Azure (Data Factory, Data Lake). Track record … ontologies, and best practices for metadata capture. Understanding of data science workflows in computational chemistry, bioinformatics, or AI/ML-driven research. Orchestration & ETL: Apache Airflow, Prefect. Scientific Libraries (Preferred): RDKit, Open Babel, CDK. Seniority level: Mid-Senior. Employment type: Full-time.
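For the RDKit work flagged as preferred here, a typical first step is turning chemical structures into tabular descriptors for downstream pipelines. A minimal sketch, with hypothetical example SMILES strings:

```python
# Compute molecular descriptors with RDKit and collect them in a Pandas frame.
import pandas as pd
from rdkit import Chem
from rdkit.Chem import Descriptors

smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]  # ethanol, benzene, aspirin

rows = []
for s in smiles:
    mol = Chem.MolFromSmiles(s)
    if mol is None:          # skip unparseable structures
        continue
    rows.append({
        "smiles": s,
        "mol_wt": Descriptors.MolWt(mol),   # molecular weight
        "logp": Descriptors.MolLogP(mol),   # estimated lipophilicity
    })

df = pd.DataFrame(rows)
print(df)
```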
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… data engineering experience. Snowflake experience. Proficiency across an AWS tech stack. DevOps experience building and deploying using Terraform. Nice to Have: dbt, Data Modelling, Data Vault, Apache Airflow. Benefits: up to 10% bonus, up to 14% pension contribution, 29 days annual leave + bank holidays, free company shares. Interviews ongoing.
City of London, England, United Kingdom Hybrid / WFH Options
Jefferson Frank
3+ years' data engineering experience * Snowflake experience * Proficiency across an AWS tech stack * dbt expertise * Terraform experience. Nice to Have: * Data Modelling * Data Vault * Apache Airflow. Benefits: * Up to 10% bonus * Up to 14% pension contribution * 29 days annual leave + bank holidays * Free company shares. Interviews ongoing.
Brighton, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
… adopt emerging technologies, and enhance analytics capabilities. Requirements: Technical Proficiency: hands-on experience building ETL/ELT pipelines with Python, SQL, or tools like Apache Airflow, and expertise in visualisation tools (Power BI, Tableau, or Looker). Cloud Expertise: familiarity with cloud platforms like Snowflake, Databricks, or AWS …
… to Have: Familiarity with cloud platforms (AWS, GCP, Azure) and cloud-based database services (Snowflake). Knowledge of data warehousing, orchestration, and pipeline technologies (Apache Airflow/Kafka, Azure Data Factory, etc.). Experience with dbt for modelling. Server administration and networking fundamentals …
… scikit-learn, PyTorch, Pandas, NumPy, SciPy. Experience with AWS (EC2, S3, SageMaker) or Azure/GCP equivalents. Experience designing, developing, and deploying scalable infrastructure (e.g., Apache Airflow, Luigi). Object-oriented programming concepts and design. Ability to create well-documented, modular, and unit-tested code. Understanding of Agile development …
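Where the Airflow sketches earlier on this page centre on a scheduler, Luigi (the other orchestrator this listing names) expresses pipelines as task classes with explicit inputs and outputs. A minimal sketch, with hypothetical file targets and a stand-in transform:

```python
# Two-step Luigi pipeline: Extract writes a CSV, Transform depends on it.
import luigi
import pandas as pd


class Extract(luigi.Task):
    def output(self):
        return luigi.LocalTarget("raw.csv")  # hypothetical target path

    def run(self):
        df = pd.DataFrame({"x": [1, 2, 3]})  # stand-in for a real source
        df.to_csv(self.output().path, index=False)


class Transform(luigi.Task):
    def requires(self):
        return Extract()  # Luigi resolves the dependency graph from this

    def output(self):
        return luigi.LocalTarget("clean.csv")

    def run(self):
        df = pd.read_csv(self.input().path)
        df["x2"] = df["x"] ** 2  # stand-in transform
        df.to_csv(self.output().path, index=False)


if __name__ == "__main__":
    luigi.build([Transform()], local_scheduler=True)
```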
… or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
… related field, or equivalent industry experience. Preferred Qualifications: Experience or interest in mentoring junior engineers. Familiarity with data-centric workflows and pipeline orchestration (e.g., Apache Airflow). Proficiency in data validation, anomaly detection, or debugging using tools like Pandas, Polars, or data.table/R. Experience working with AWS or other …
… modeling. Knowledge of CI/CD tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and dbt. Familiarity with Atlan for data cataloging and metadata management. Understanding of Apache Iceberg tables. Who we are: We're a global …
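For the Snowflake-plus-SQL side of a role like this, querying from Python via the official snowflake-connector-python package looks like the sketch below; the account, credentials, warehouse, and table are all hypothetical:

```python
# Minimal Snowflake query sketch. Use key-pair auth or a secrets
# manager in practice rather than an inline password.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",  # hypothetical account identifier
    user="ETL_USER",
    password="...",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute("SELECT order_date, SUM(amount) FROM orders GROUP BY order_date")
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```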
… in SQL, experience with Python, exposure to dbt considered a plus. Experience with AWS cloud computing services (Redshift, S3), GCP, or similar. Experience with Apache Airflow or similar nice to have. Ability to merge the multiple requirements of data projects into robust, future-proof solutions. Excellent written and …
… NumPy, SciPy. Experience with AWS (principally EC2, S3, SageMaker) or Azure/GCP equivalents. Some experience of designing, developing, and deploying scalable infrastructure (e.g. Apache Airflow, Luigi, or other cluster management software). Object-oriented concepts and design. The ability to design and build unit-tested and well-documented …
… with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of …
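As a sketch of the Spark-style batch work this listing references, a small PySpark rollup job might look like the following; the input path, schema, and output location are hypothetical:

```python
# Daily rollup: read raw events, aggregate per day and country, write back out.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_rollup").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")  # hypothetical path

daily = (
    events
    .withColumn("day", F.to_date("event_ts"))               # assumed timestamp column
    .groupBy("day", "country")
    .agg(
        F.count("*").alias("events"),
        F.countDistinct("user_id").alias("users"),          # assumed user id column
    )
)

daily.write.mode("overwrite").partitionBy("day").parquet("s3://example-bucket/rollups/")
```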
London, England, United Kingdom Hybrid / WFH Options
ACORD (Association for Cooperative Operations Research and Development)
… or Java (Required). Bachelor's degree or equivalent in Computer Science, Mathematics, or Finance-related field (Required). Knowledge of workflow management frameworks such as Apache Airflow (Preferred). Knowledge of cloud computing infrastructure, such as AWS (Preferred). Knowledge of BI visualisation tools such as Looker or Power BI (Preferred) …