financial services or energy trading industry Expertise in Python and its ecosystem of libraries and frameworks for data processing, data analysis and data visualisation Airflow, detailed understanding of architecture including schedulers, executors, operators Cloud Environments, understanding of principles, technologies and services for AWS/Azure Kubernetes EKS/AKS … including high availability Desired experience: Worked with Python 3.9+ Familiar with Python test automation Experience with SQL and time-series databases Familiar with Parquet, Arrow, Airflow, Databricks Experience with AWS cloud services such as S3, EC2, RDS etc. Quality engineering best practices and tooling including TDD, BDD This is an …
Preston, Lancashire, United Kingdom Hybrid / WFH Options
Uniting Ambition
and deep knowledge in core processing and orchestration products such as BigQuery, Dataflow, Data Fusion, Datastream, Cloud Functions, Dataproc and Airflow/Composer. You will have held a leading role in a Data Engineering function with responsibility for directing the efforts of other data …
AWS ecosystems like Lambda, Step Functions and ECS services. Experience of Dremio is a nice-to-have Experience with data stack technologies, such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake …
in a Data Engineering role Strong SQL and Python development skills Hands-on experience with cloud-based data warehousing technologies (e.g., Snowflake, DBT, Fivetran, Airflow) Effective communication skills for both technical and non-technical audiences Analytical mindset with attention to detail High energy, enthusiasm, and passion for learning in …
Azure SQL Data Warehouse, or Amazon Redshift Support and learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of the timing of data loads …
Mathematics, Finance, Accounting, Economics or a related field or equivalent work experience (3+ years) Experience in: Some knowledge of database orchestration technologies and ETL (Airflow, DBT, Databricks) Working understanding of financial concepts and systems Ability to recognize and diagnose potential errors or data inconsistencies between multiple reports Working knowledge …
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies: AWS, GCP, Azure Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong programming skills in software development. A cloud-agnostic mindset …
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics …
load data to Snowflake. Converting SAS-based modules to Python-based ones Candidates who have data management experience in Snowflake with expertise in Python, DBT, Airflow or similar technologies Must have hands-on experience in DBT & SQL Snowflake (added advantage …
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond
successful Lead Data Engineer will have: Experience leading a Data Engineering team. Extensive working experience with GCP, SQL and DBT. Proficient in: Kafka, Dataform, Airflow, Tableau, Power BI, Redshift, Snowflake, Terraform and BigQuery. What's in it for the successful Lead Data Engineer: Hybrid working for a better work/…
Greater London, England, United Kingdom Hybrid / WFH Options
Cera
an autonomous environment. 5+ years SQL experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch) 3+ years experience delivering data products in a modern BI technology (ideally Looker) or open-source data frameworks Experience managing end-to…
working on very complex systems Strong experience with computer vision Longevity in their previous roles Experience with Remote Sensing highly desirable Stack: Python, PyTorch, Airflow, PySpark (equivalent tools are fine …
City Of London, England, United Kingdom Hybrid / WFH Options
Cititec Talent
a leading commodities trading firm. Outside of IR35 Hybrid working - 2/3 days in London office Experience Required: - Commodities industry experience Weather forecasting Airflow (or equivalent data orchestration platforms) Data Modeling (Star Schema) Data Warehousing Data Pipeline Orchestration (Kafka) On-Prem SQL …
Experienced creating data pipelines on a cloud (preferably AWS) environment CI/CD experience Containerization experience (Docker, Kubernetes, etc.) Experience with SQS/SNS, Apache Kafka, RabbitMQ Other interesting/bonus skills – Airflow, Trino, Apache Iceberg, Postgres, MongoDB You *must* be eligible to work in your chosen …
the above project of redesigning the Creditsafe platform into the cloud space. You will be expected to work with technologies such as Python, Linux, Airflow, AWS DynamoDB, S3, Glue, Athena, Redshift, Lambda, API Gateway, Terraform, CI/CD. KEY DUTIES AND RESPONSIBILITIES You will actively contribute to the codebase … and participate in peer reviews. Design and build a metadata-driven, event-based distributed data processing platform using technologies such as Python, Airflow, Redshift, DynamoDB, AWS Glue, S3. As an experienced Engineer, you will play a critical role in the design, development, and deployment of our business-critical system. You …
processes, effective monitoring, and infrastructure-as-code using Terraform. Collaborate closely with our engineering teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for …
Job Title: Software Engineer (Data/Airflow) Client: Elite FinTech Firm Salary: Up to £130k + Bonus Location: London (Hybrid) Sells: Cutting-edge tech, ownership of multiple greenfield projects, no red tape, a friendly/collaborative environment, beautiful offices, personal projects on Fridays! An Elite FinTech Firm is looking … They are fully open to experience level and will find good fits for the best people Strong experience with Python or Rust Experience with Airflow Exposure to building ETL pipelines is a huge plus A desire to learn Rust Solid SQL knowledge Fantastic education Experience working in mission-critical …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Set2Recruit
proficiency GIT proficiency Linux use and admin Experience deploying cloud services (AWS is a bonus) Experience with Docker and Kubernetes Using frameworks such as Airflow ML background - PyTorch for computer vision This is a fully remote role which comes with: Budget for WFH set-up. Stock options. 25 days …