extensive experience having designed and scaled a Data Platform; has strong Python skills; has great SQL, preferably Snowflake; has previous experience working with dbt & Airflow; is passionate about solving complex data problems & is interested in working with rich & diverse climate datasets; cares deeply about the climate and ecosystems of …
financial services or energy trading industry Expertise in Python and its ecosystem of libraries and frameworks for data processing, data analysis and data visualisation Airflow, detailed understanding of architecture including schedulers, executors, operators Cloud Environments, understanding of principles, technologies and services for AWS/Azure Kubernetes EKS/AKS … including high availability Desired experience: Worked with Python 3.9+ Familiar with Python test automation Experience with SQL and time-series databases Familiar with Parquet, Arrow, Airflow, Databricks Experience with cloud AWS services, such as S3, EC2, RDS etc. Quality engineering best practices and tooling including TDD, BDD This is an …
Azure SQL Data Warehouse, or Amazon Redshift Support and learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of the timing of data loads …
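Several of these listings ask for an understanding of the scheduling logic behind orchestrators like Airflow or Autosys. At its core, that logic is a topological ordering of a task dependency graph: every task runs only after all of its upstream tasks. A minimal, hypothetical sketch (the task names are invented for illustration, not taken from any listing), using Python's standard-library `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline definition: each task maps to the set of upstream
# tasks it depends on -- the same shape of dependency graph that Airflow
# or Autosys schedules.
DEPENDENCIES = {
    "extract_trades": set(),
    "extract_prices": set(),
    "transform": {"extract_trades", "extract_prices"},
    "load_warehouse": {"transform"},
}

def run_order(deps):
    """Return one valid execution order (every task after its upstreams)."""
    return list(TopologicalSorter(deps).static_order())

print(run_order(DEPENDENCIES))
```

Real schedulers add retries, backfills, and parallel execution of independent branches on top of this ordering, but the dependency-resolution step is the same idea.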
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics …
load data to Snowflake. Converting SAS-based modules to Python-based ones. Candidates who have data management experience in Snowflake with expertise in Python, dbt, Airflow or similar technologies Must have hands-on experience in dbt & SQL Snowflake (added advantage …
Greater London, England, United Kingdom Hybrid / WFH Options
Cera
an autonomous environment. 5+ years SQL experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch) 3+ years experience delivering data products in a modern BI technology (ideally Looker) or open-source data frameworks Experience managing end-to-end …
City Of London, England, United Kingdom Hybrid / WFH Options
Cititec Talent
a leading commodities trading firm. Outside of IR35 Hybrid working - 2/3 days in London office Experience Required: Commodities industry experience Weather forecasting Airflow (or equivalent data orchestration platforms) Data Modeling (Star Schema) Data Warehousing Data Pipeline Orchestration (Kafka) On-Prem SQL …
processes, effective monitoring, and infrastructure-as-code using Terraform. Collaborate closely with our engineering teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for …
Machine Learning Engineer, up to £75,000, London NEW Machine Learning Engineer Opportunity Available with a Leading Organisation! The Company: Are you an expert in Machine Learning? We are on the lookout for a Machine Learning Engineer to join an …
Lead Machine Learning Engineer, up to £115,000, London NEW Machine Learning Engineer Opportunity Available with a Leading Organisation! The Company: Are you an expert in Machine Learning? We are on the lookout for a Machine Learning Engineer to join …
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and dbt Must-haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the …
in dbt. Skills in Python/R. Experience with Fivetran, Prefect, Snowflake, and Periscope. Familiarity with writing ETL pipelines using SQL and Python, and orchestration tools like Airflow or Prefect. Background in experimentation. Experience in fast-paced, venture-backed startup environments. At Fresha, we value passion and potential as much as specific skills. If you …
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the …
company growth and profitability. OUR TECH STACK · Python · Scala · Kotlin · Spark · Google PubSub · Elasticsearch, BigQuery, PostgreSQL · Kubernetes, Docker, Airflow KEY RESPONSIBILITIES · Designing and implementing scalable data pipelines using tools such as Apache Spark, Google PubSub etc. · Optimizing data storage and retrieval systems … Data Infrastructure projects, as well as designing and building data intensive applications and services. · Experience with data processing and distributed computing frameworks such as Apache Spark · Expert knowledge in one or more of the following languages - Python, Scala, Java, Kotlin · Deep knowledge of data modelling, data access, and data …
Job Band / Package description Band: D Contract: Permanent Salary: £65,000 to £85,000 depending on level of experience Location: London Our comprehensive benefits package includes: • An employer pension contribution of up to 10% • 26 days' annual leave (based on full …
Job Title: Data Engineer Job Type: Full Time, Permanent Working location: London, Hybrid Role Purpose At Travelex we are developing modern data services, which will be at the heart of our relationship with our customers. Our data architecture is becoming …
knowledge on key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. Strong proven knowledge of Kimball/Dimensional data modelling and/or Data Vault If you are interested in applying for …
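The Kimball/dimensional modelling skills these listings ask for come down to one pattern: a central fact table of measurements joined to descriptive dimension tables in a star schema. A minimal sketch, using SQLite as a stand-in for a real warehouse (BigQuery, Redshift, Snowflake); the table and column names are invented for illustration:

```python
import sqlite3

# Star schema: fact_sales holds the measures, dim_product and dim_date
# hold the descriptive attributes. All names here are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount      REAL
);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.execute("INSERT INTO dim_date VALUES (1, '2024-01-01')")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 1, 10.0), (2, 1, 5.0), (1, 1, 2.5)])

# The typical dimensional query: aggregate the fact table, sliced by
# an attribute of a joined dimension.
rows = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)
```

Data Vault, also mentioned above, solves the same problem with a different decomposition (hubs, links, satellites) optimised for auditability rather than query simplicity.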
Greater London, England, United Kingdom Hybrid / WFH Options
Validis
Proven ability to leverage CI/CD tools to streamline data pipeline development and deployment. Experience designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks (familiarity is a plus). Understanding of data warehousing concepts and data modelling techniques. Experience with SQL …
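The ETL pipelines requested throughout these listings typically decompose into three steps that an orchestrator such as Airflow or Luigi schedules independently: extract raw data, transform it (filter and type-cast), and load it into a warehouse table. A self-contained sketch with hypothetical data, using SQLite in place of a real warehouse:

```python
import csv
import io
import sqlite3

# Hypothetical raw input: one malformed record to show the transform step
# doing cleaning work.
RAW = "user_id,amount\n1,10.5\n2,bad\n3,4.0\n"

def extract(text):
    """Extract: parse raw CSV into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: type-cast fields, dropping malformed records."""
    out = []
    for r in rows:
        try:
            out.append((int(r["user_id"]), float(r["amount"])))
        except ValueError:
            continue  # real pipelines would quarantine rejects, not discard them
    return out

def load(rows, conn):
    """Load: bulk-insert the cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())
```

Keeping each step a pure function of its input is what makes the pipeline easy to hand to an orchestrator: each step becomes one task, retryable in isolation.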