designing and building robust, scalable, distributed data systems and pipelines, using open source and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL, …). Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments: e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices: e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, Git/version control, containers, etc. …
Strong understanding of AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with data stack technologies, such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake …
Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
AWS ecosystems like Lambdas, Step Functions and ECS services. Experience of Dremio is a nice-to-have. Experience with data stack technologies, such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake …
Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. You MUST have the … AWS) or Google Cloud Platform (GCP) Experience in a trading environment with a bank, broker, asset manager or hedge fund PostgreSQL Ag-Grid, Redux, Airflow, Dasher Role: Python Developer (Software Engineer Programmer Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side Buy … Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. The application allows the portfolio managers to generate 'what if' scenarios across their portfolios so that they can simulate market conditions.
Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. You MUST have the … AWS) or Google Cloud Platform (GCP) Experience in a trading environment with a bank, broker, asset manager or hedge fund PostgreSQL Ag-Grid, Redux, Airflow, Dasher Role: Full-Stack Developer (Software Engineer Programmer Full Stack Python React Fixed Income Equities Multi-Asset Front Office Brokerage Trading Banking Buy Side … Buy-Side AWS GCP React TypeScript Finance PostgreSQL Apache Ignite MaterialUI Ag-Grid Redux Front Office Trading Investment Management Asset Manager) required by our asset management client in London. The application allows the portfolio managers to generate 'what if' scenarios across their portfolios so that they can simulate market …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Hypercube Consulting
Azure, GCP) Docker, Kubernetes and container services CI/CD, DevOps Additional experience with the following would be beneficial but not essential: Orchestration tools - Apache Airflow/Prefect/Azure Data Factory Containers and related services (AKS, Container Registry) Other desirable skills and experience Ability to get stuck …
City of London, London, United Kingdom Hybrid / WFH Options
GCS Ltd
harnessing diverse AWS services. Key Requirements: High level of experience in both SQL and Python programming (10+ years) Experience managing data engineering pipelines using Apache Airflow Proficiency in CI/CD pipelines and automation Git proficiency for version control (branching strategies and repo management) Competent in monitoring tools …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration - Enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure …
Degree in Computer Science, Engineering, Management Information Systems, Mathematics, a related field, or equivalent work experience (3+ years) Experience in: Database orchestration technologies, specifically Airflow and/or DBT Experience with streaming data architectures, specifically Kafka Knowledge of semi-structured data: Parquet, Avro, JSON. A deep understanding of AWS Cloud …
ends (React, Redux, NodeJS, Webpack) • Strong understanding of AWS ecosystems like Lambdas, Step Functions and ECS services. • Experience with data stack technologies, such as Apache Iceberg & DBT. Preferred Skills • Experience on RDBMS like PostgreSQL would be a plus. Exposure to Apache Airflow, Prefect, Dagster would be beneficial.
pandas, numpy, pyspark Good understanding of OOP, software design patterns, and SOLID principles Good experience in Docker Good experience in Linux Good experience in Airflow Good knowledge of cloud architecture Good experience in Terraform Expert experience with database systems (Snowflake, SQL, Postgres, etc.) Experience of micro-service development and …
problem-solving and communication skills Proficiency in scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Solid understanding of software development best practices, including version control (Git), testing, and code review …
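The DAG model the requirement above refers to can be sketched with nothing but the Python standard library. The task names below are hypothetical; the example only illustrates the dependency-respecting run order that an orchestrator such as Apache Airflow computes for its tasks:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies --
# the same structure an Airflow DAG declares between its tasks.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

# static_order() yields a run order in which every task appears
# only after all of its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Because this particular graph is a straight chain, the only valid order is extract, transform, validate, load; with branching dependencies any topological order would be acceptable, which is why orchestrators can run independent branches in parallel.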
field (STEM) Technical proficiency in cloud-based data solutions (AWS, Azure or GCP), engineering languages including Python, SQL, Java, and pipeline management tools e.g., Apache Airflow. Familiarity with big data technologies such as Hadoop or Spark. If this opportunity is of interest, or you know anyone who would be interested in …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
creating ETL pipelines in Python * Exposure to analytical data warehouses such as Snowflake, Redshift or BigQuery (Redshift preferred) * Experience with data orchestrators such as Airflow, AWS Step Functions, AWS Batch * Knowledge of Agile development methodologies * Knowledge of automated delivery processes * Experience designing and building autonomous data pipelines BENEFITS Competitive …
such as FiveTran or Stitch Experience using Business Intelligence tools such as PowerBI, Looker or Tableau Experience with data pipeline tools such as DBT, Airflow or Luigi are a plus! Experience using cloud environments e.g. Azure or AWS Understanding of the Agile delivery method Working Conditions: · Permanent, London Chiswick …
Manchester, Greater Manchester, United Kingdom Hybrid / WFH Options
AutoTrader UK
robust and scalable web hosting and data platforms. Our platform is a layer on top of core Open Source technologies such as Kubernetes, Istio, Airflow, dbt, running in Public Cloud. It is the glue that allows our teams to deploy into production environments 100s of times per day with …
Tech: - AWS - S3, Glue, EMR, Athena, Lambda - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday …
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google …
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt 🌳Environment: Agile ✍️Process: 3 stages No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call …
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java or C# Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation …
needed: Proficiency in AWS services relevant to data engineering such as S3, Glue, EMR, Athena, and Lambda. Experience with data pipeline orchestration tools like Apache Airflow. Understanding of data modelling principles and best practices. Hands-on experience with Snowflake and Redshift cloud data warehousing solutions. Familiarity with DBT (Data …
data and AI models. Data Engineer Required Experience Data engineering experience (2+ years) Cloud platform proficiency (e.g., AWS, Azure, GCP) Data pipeline development (e.g., Airflow, Apache Spark) SQL proficiency, database design Visualization tools knowledge (e.g., Tableau, PowerBI, Looker) Data Engineer Application Process This is a 1-year contract …
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com …
out on the front end! YOUR EXPERIENCE Python Cloud experience - AWS/GCP/Azure CI/CD Data modeling experience will be useful Airflow & DBT experience will be useful THE BENEFITS An education budget is available to learn and develop with the company Matched pension Travel budget in …