in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data solutions. Knowledge of MLOps …
MySQL, PostgreSQL, or Oracle Experience with big data technologies such as Hadoop, Spark, or Hive Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow Proficiency in Python and at least one other programming language such as Java or Scala Willingness to mentor more junior members of the team Strong analytical and …
join our Data Platform team, responsible for the storing and processing of most of our data. You will work with other Data Engineers using tools such as Python, SQL, Airflow and Prefect to ensure quality and collaborate with other teams across the business. What you’ll be working on: Implementing and supporting ETLs and Data Quality monitors Conducting research … Analysts and Technical Business Analysts You should apply if you have: 3+ years of data engineering, analytics or machine learning experience Advanced skills in Python and SQL Experience with Airflow, Prefect or other task orchestration tools Familiarity with modern data stack – you know the current trends and what tools to use for the job A proactive approach, with the … house data quality and cataloguing solutions Experience with documentation of system architecture Pandas, Jupyter, Plotly DBT, Kafka BI tools such as Tableau, Metabase and Superset The current tech stack: Airflow Clickhouse DBT Python MongoDB PostgreSQL MariaDB Kafka K8s AWS FXC Intelligence is a leading provider of cross-border payments data and intelligence, providing some of the world's biggest …
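To make the Airflow data-quality monitoring this listing describes concrete, here is a minimal sketch of a row-count check as an Airflow task. The table name, connection id, and failure condition are illustrative assumptions, not details from the posting.

```python
# Minimal Airflow sketch of a data-quality monitor: fail the DAG run when a
# key table arrives empty. Table, schedule, and connection id are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook

def check_row_count() -> None:
    hook = PostgresHook(postgres_conn_id="warehouse")  # hypothetical connection
    count = hook.get_first("SELECT COUNT(*) FROM daily_payments")[0]
    if count == 0:
        raise ValueError("daily_payments is empty - failing the quality check")

with DAG(
    dag_id="daily_payments_quality",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # assumes Airflow 2.4+
    catchup=False,
) as dag:
    PythonOperator(task_id="check_row_count", python_callable=check_row_count)
```

Checks like this typically run downstream of the ETL task itself, so a bad load fails loudly before analysts consume the data.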
London, England, United Kingdom Hybrid / WFH Options
Talent Hero
and secure. Monitor system performance and troubleshoot issues to optimize data flows. Document processes, designs, and systems to facilitate knowledge sharing within the team. Using tools like SQL, Python, Airflow, Spark, AWS, Azure, GCP, DBT, Snowflake and more Requirements Minimum of a Bachelor's degree in a related field Proven experience as a Data Engineer or in a similar role Strong … Python for data manipulation Solid understanding of relational and non-relational databases Experience with data modelling, data warehousing, and ETL/ELT pipelines Familiarity with data orchestration tools (e.g., Airflow) Knowledge of cloud platforms such as AWS, Azure, or Google Cloud Bonus: Experience with data lakes, streaming pipelines, or big data tools Excellent problem-solving skills and keen attention …
a focus on data quality and reliability. Infrastructure & Architecture Design and manage data storage solutions, including databases, warehouses, and lakes. Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and … ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical and non-technical teams. … Additional Strengths Experience with orchestration tools like Apache Airflow. Knowledge of real-time data processing and event-driven architectures. Familiarity with observability tools and anomaly detection for production systems. Exposure to data visualization platforms such as Tableau or Looker. Relevant cloud or data engineering certifications. What we offer: A collaborative and transparent company culture founded on Integrity, Innovation and …
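Since the role above highlights Apache Flink for large-scale stream processing, a minimal PyFlink sketch follows; the in-memory source and the filtering logic are invented for illustration, with a real deployment reading from a stream such as Kafka or Kinesis.

```python
# Minimal PyFlink sketch: filter and enrich a stream of readings.
# The bounded in-memory source stands in for a real event stream.
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()

# Tiny sample data; production jobs would attach a Kafka/Kinesis source.
events = env.from_collection(
    [("sensor-1", 21.5), ("sensor-2", 19.0), ("sensor-1", 22.1)]
)

# Keep warm readings and tag them - a stand-in for real enrichment logic.
warm = events.filter(lambda e: e[1] > 20.0).map(lambda e: (e[0], e[1], "warm"))
warm.print()

env.execute("example-flink-job")
```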
team are working on the core research data platform, as well as data infrastructure, ingestion pipelines and back end services for the aforementioned trading desks. Tech Stack: Python, ETL, Airflow, SQL, AWS Please apply if this is of interest.
London, England, United Kingdom Hybrid / WFH Options
Whitehall Resources Ltd
existing ETL processes from legacy systems into scalable cloud-native solutions • Contribute to the development and optimization of a cloud-based data platform, leveraging tools like Snowflake, AWS and Airflow • Work closely with data architects, analysts and other engineers to deliver high-quality, production-ready code • Participate in code reviews, ensuring adherence to best practices and high engineering standards … working knowledge and hands-on experience working with Teradata and Informatica. • Proficiency in working with legacy systems and traditional ETL workflows • Solid experience building data pipelines using modern tools (Airflow, DBT, Glue, etc.) and working with large volumes of structured and semi-structured data • Demonstrated experience with SQL and Python for data manipulation, pipeline development and workflow orchestration • Strong …
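As a sketch of how Airflow can orchestrate the dbt workloads this role mentions, the DAG below shells out to dbt with the BashOperator; the DAG id, schedule, project path, and target are assumptions for illustration.

```python
# Minimal Airflow sketch: a daily dbt build-and-test sequence.
# DAG id, schedule, project directory, and target are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the warehouse models.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run --target prod",
    )
    # Test them before downstream consumers read the results.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test --target prod",
    )
    dbt_run >> dbt_test
```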
London, England, United Kingdom Hybrid / WFH Options
Ziff Davis
and standards, maintaining our infrastructure independently. The ideal candidate will have strong software development experience with Python and SQL, along with a keen interest in working with Docker, Kubernetes, Airflow, and AWS data technologies such as Athena, Redshift, and EMR. You will join a team of over 25 engineers across mobile, web, data, and platform domains, and should demonstrate … cross-team communication skills Familiarity with CI/CD practices and tools Understanding of Agile Scrum development lifecycle What you’ll be doing: Implementing and maintaining ETL pipelines using Airflow and AWS technologies Contributing to data-driven tools, including content personalization Managing ingestion frameworks and processes Monitoring and maintaining our data infrastructure in AWS Supporting Business Analytics and Marketing teams …
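For the AWS side of the stack above, pipelines over S3 data are often queried through Athena; here is a minimal boto3 sketch, where the database, query, region, and output bucket are illustrative assumptions.

```python
# Minimal boto3 sketch: run an Athena query and poll for the result.
# Database, SQL, region, and output bucket are hypothetical.
import time

import boto3

athena = boto3.client("athena", region_name="eu-west-1")

execution = athena.start_query_execution(
    QueryString="SELECT page, COUNT(*) AS views FROM events GROUP BY page",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes; production code would add timeouts/backoff.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"fetched {len(rows) - 1} data rows")  # first row is the header
```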
years+ experience in a relevant role. This will mean exposure to market data vendors, ability to communicate with traders and mentoring junior engineers. Tech stack: Deep Python, Pandas, AWS, Airflow, Kubernetes, ETL, SQL. Please apply if this is of interest.
3+ years of relevant industry experience in a data engineering capacity Experience translating business needs to scalable data solutions. Experience building highly scalable data pipelines (batch and streaming) using Airflow, Spark, EMR, Kafka, AWS Kinesis. Experience designing and developing solutions on AWS, including Infrastructure as Code (e.g. CloudFormation, Terraform, AWS CDK) Experience with AWS compute resources such as … principles - able to write elegant, scalable and maintainable code. Strong communication skills - you will be able to tailor your communication to technical and less technical audiences alike Experience with Airflow highly advantageous Experience with Snowflake highly advantageous Python expertise with object-oriented programming knowledge. Experience using dashboarding tools such as Tableau, Superset, Domo or similar. Experience preparing large, complex …
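For the streaming side of the requirements above, a producer pushing events into AWS Kinesis is about the smallest useful example; the stream name and payload shape below are assumptions.

```python
# Minimal boto3 sketch: publish events to a Kinesis stream.
# Stream name, region, and event fields are hypothetical.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="eu-west-1")

def publish_event(event: dict, stream: str = "example-clickstream") -> str:
    """Send one event; the partition key controls shard routing."""
    response = kinesis.put_record(
        StreamName=stream,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("user_id", "anonymous")),
    )
    return response["SequenceNumber"]

if __name__ == "__main__":
    seq = publish_event(
        {"user_id": 42, "action": "search", "ts": "2024-01-01T00:00:00Z"}
    )
    print(f"published with sequence number {seq}")
```

Choosing a high-cardinality partition key (here the user id) spreads load across shards, which matters once throughput grows.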
London, England, United Kingdom Hybrid / WFH Options
Merantix
Linux systems and bash terminals Preferred Qualifications Hands-on experience with: Distributed computing frameworks, such as Ray Data and Spark. Databases and/or data warehousing technologies, such as Apache Hive. Data transformation via SQL and DBT. Orchestration platforms, such as Apache Airflow. Data catalogs and metadata management tools. Vector data stores. Familiarity with: Data lake architectures …
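As a small illustration of the distributed-computing experience this listing asks for, here is a minimal PySpark aggregation; the input rows and column names are invented.

```python
# Minimal PySpark sketch: a distributed group-and-sum.
# Input rows and column names are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-aggregation").getOrCreate()

events = spark.createDataFrame(
    [("search", 3), ("click", 1), ("search", 2)],
    schema=["action", "count"],
)

# The aggregation runs in parallel across executors (or locally in tests).
totals = events.groupBy("action").agg(F.sum("count").alias("total"))
totals.show()

spark.stop()
```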
not only push the boundaries of data engineering and analytics but also contribute back to the open-source community through continuous innovation and solution development. Golang, Python, Java, dbt, Airflow, Kafka, Flink, Kubernetes, Terraform, Prometheus, Grafana, and more. What you’ll do: This role will allow you to master the three pillars of every organisation: Software Engineering, Infrastructure, and … deep technical expertise but also a proactive, engaging approach to working with others and building lasting partnerships. Big Data Technologies: Familiarity with tools such as Kafka, Flink, dbt, and Airflow, with a deep understanding of distributed computing and large-scale data processing systems. Nice to Have: Kubernetes Expertise: Experience with Kubernetes, Helm, ArgoCD, and related technologies. Cloud Platform Proficiency …
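Given that the stack above pairs Prometheus and Grafana for observability, a minimal sketch of instrumenting a pipeline with the official Python client follows; metric names and the port are assumptions.

```python
# Minimal sketch: expose pipeline metrics for Prometheus to scrape.
# Metric names and the port are illustrative assumptions.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

RECORDS_PROCESSED = Counter(
    "pipeline_records_processed_total", "Records processed by the pipeline"
)
BATCH_SECONDS = Histogram(
    "pipeline_batch_duration_seconds", "Time spent processing one batch"
)

def process_batch() -> None:
    with BATCH_SECONDS.time():                 # observe batch duration
        time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work
        RECORDS_PROCESSED.inc(100)             # records in this batch

if __name__ == "__main__":
    start_http_server(8000)  # metrics at http://localhost:8000/metrics
    while True:
        process_batch()
```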
City of London, London, United Kingdom Hybrid / WFH Options
Radley James
Familiarity with Git, Docker, CI/CD pipelines, testing and monitoring Clear communicator, comfortable with cross-functional teams Desirable Experience APIs from major financial data providers dbt, Snowflake Kafka, Airflow Java feedhandler support Migration of legacy systems (e.g. MATLAB) This position offers a competitive compensation package and hybrid working model.
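Since dbt and Snowflake appear in the desirable experience above, here is a minimal sketch of querying Snowflake from Python with the official connector; the account, credentials, objects, and query are illustrative assumptions.

```python
# Minimal sketch: query Snowflake with the official Python connector.
# Account, credentials, objects, and SQL are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",   # hypothetical account identifier
    user="pipeline_user",
    password="...",              # in practice, use a secrets manager
    warehouse="ANALYTICS_WH",
    database="MARKET_DATA",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute(
        "SELECT symbol, AVG(price) FROM ticks "
        "WHERE trade_date = CURRENT_DATE GROUP BY symbol"
    )
    for symbol, avg_price in cur.fetchall():
        print(symbol, avg_price)
finally:
    conn.close()
```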
degree in Computer Science. - Python and SQL experience. - 1-5 years' experience in a data or software engineering role. - Familiarity with cloud/data warehousing. - Experience with Snowflake, Kafka, Airflow would be helpful. - Experience with financial data sets/vendors would be helpful. Seniority level: Associate. Employment type: Full-time. Job function: Information …
in a Hedge Fund, Trading Firm, and/or working with Quant Trading Technology Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other Cloud Data Warehouses preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, FactSet, …) SDLC and …
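Since the role above combines Python, time-series databases, and vendor market data, the classic operation is an as-of join of trades onto the latest quote; the pandas sketch below uses invented data to show the shape of it.

```python
# Minimal pandas sketch: as-of join trades onto the most recent quote.
# All timestamps, symbols, and prices are invented for illustration.
import pandas as pd

quotes = pd.DataFrame({
    "ts": pd.to_datetime([
        "2024-01-02 09:30:00.100",
        "2024-01-02 09:30:00.300",
        "2024-01-02 09:30:00.900",
    ]),
    "symbol": ["ABC", "ABC", "ABC"],
    "bid": [99.90, 100.00, 100.10],
})

trades = pd.DataFrame({
    "ts": pd.to_datetime(["2024-01-02 09:30:00.350", "2024-01-02 09:30:00.950"]),
    "symbol": ["ABC", "ABC"],
    "price": [100.02, 100.12],
})

# merge_asof pairs each trade with the latest quote at or before its
# timestamp; both frames must be sorted on the join key.
enriched = pd.merge_asof(
    trades.sort_values("ts"),
    quotes.sort_values("ts"),
    on="ts",
    by="symbol",
)
print(enriched)
```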