data processing, analysis, and visualization libraries. Experienced with SQL and time-series databases. Skilled in AWS services: S3, EC2, RDS. Knowledgeable in ETL tools like Airflow. Proficient in Git, CI/CD, testing tools, and documentation best practices. Adheres to quality engineering practices including TDD and BDD. Nice to have more »
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics more »
and IAM. Experience with containerization and orchestration tools, particularly Kubernetes. Proficiency in infrastructure-as-code tools such as Terraform, Ansible, or CloudFormation. Experience in Apache Airflow, AWS Backup & S3 versioning. Solid understanding of CI/CD concepts and experience implementing CI/CD pipelines using tools like Jenkins more »
load data to Snowflake. Converting SAS-based modules to Python-based ones. Candidates should have data management experience in Snowflake with expertise in Python, DBT, Airflow, or similar technologies. Must have hands-on experience in DBT & SQL. Snowflake (added advantage more »
hands on and will require exposure to some essential tools and a high level of financial knowledge. Good exposure to AWS technologies is essential; Airflow would be desirable. Experience with batch processing is required. Experience working in a Linux environment. Scripting exposure in Python, Bash, or shell. more »
as thinking strategically and proactively identifying opportunities. You have experience collaborating with senior business stakeholders and finance teams. You have working knowledge of Python, Airflow, dbt, BigQuery, and Looker. Additional desirables: Experience in one or more finance domains, such as Financial Reporting, Treasury, Regulatory Reporting, Financial Planning & Analysis, Financial more »
Greater London, England, United Kingdom Hybrid / WFH Options
Cera
an autonomous environment. 5+ years SQL experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch). 3+ years experience delivering data products in a modern BI technology (ideally Looker) or open-source data frameworks. Experience managing end-to more »
City Of London, England, United Kingdom Hybrid / WFH Options
Cititec Talent
a leading commodities trading firm. Outside of IR35. Hybrid working - 2/3 days in London office. Experience Required: Commodities industry experience; weather forecasting; Airflow (or equivalent data orchestration platforms); Data Modeling (Star Schema); Data Warehousing; Data Pipeline Orchestration (Kafka); On-Prem SQL more »
and maintain scalable data architectures, including pipelines and cloud-based data warehouses. Tech: Python (NumPy, Pandas), SQL, ETL, Cloud (AWS, Azure or GCP), Snowflake, Airflow, BigQuery, PowerBI/Tableau. Industry: Fintech, Maritime trading. Immersum are supporting the growth of a specialist consultancy who specialise solely in the Maritime trading … developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or PowerBI. Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds more »
a production setting. Knowledge of developing real-time data stream systems (ideally Kafka). Proven track record in developing data systems using PySpark and Apache Spark for batch processing. Capable of managing data intake from various sources, including data streams, unstructured data, relational databases, and NoSQL databases. Extensive knowledge … data pipelines. Proficiency in Python programming (knowledge of Scala or Rust is a plus). Familiarity with ETL principles in contemporary data applications (Dagster, Airflow, Prefect). Familiarity with AWS services such as Glue, Redshift, Athena, and S3. Proficiency with Terraform, Kubernetes, and ArgoCD (expertise not required; cloud team more »
. Experience with SQL and query design on large, complex datasets. Experience with cloud and big-data tools and frameworks like Databricks/Spark, Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as DBT … FiveTran, etc. Understanding of Agile Delivery best practices. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong problem-solving and analytical abilities. Ability to present solutions and limitations to non-IT business experts. ABOUT YOU: Integrity, respect, intellectually curious and an more »
Databricks • Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). • In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/… MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. • Excellent consulting experience and ability to design and build solutions, actively contribute to RFP response. • Ability to be a SPOC for all technical discussions across industry groups. • Excellent design experience, with entrepreneurship skills to own and lead solutions more »
processes, effective monitoring, and infrastructure-as-code using Terraform. Collaborate closely with our engineering teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for more »
Job Title: Software Engineer (Data/Airflow) Client: Elite FinTech Firm Salary: Up to £130k + Bonus Location: London (Hybrid) Sells: Cutting-edge tech, ownership of multiple greenfield projects, no red tape, a friendly/collaborative environment, beautiful offices, personal projects on Fridays! An Elite FinTech Firm is looking … They are fully open to experience level and will find good fits for the best people Strong experience with Python or Rust Experience with Airflow Exposure to building ETL pipelines is a huge plus A desire to learn Rust Solid SQL knowledge Fantastic education Experience working in mission critical more »
MACHINE LEARNING ENGINEER £75,000 HYBRID - London COMPANY: We are looking for a Machine Learning Engineer to join a leading Marketing Consultancy. They are looking to grow to be an even bigger player within the industry through Data scientists/ more »
Machine Learning Engineer up to £75,000 London NEW Machine Learning Engineer Opportunity Available with a Leading Organisation! The Company: Are you an expert in Machine Learning? We are on the lookout for a Machine Learning Engineer to join an more »
Lead Machine Learning Engineer up to £115,000 London NEW Machine Learning Engineer Opportunity Available with a Leading Organisation! The Company: Are you an expert in Machine Learning? We are on the lookout for a Machine Learning Engineer to join more »
implement the systems that require the highest data throughput in Java. We implement most of our long-running services and analytics in C#. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, ELK for logs, Grafana, Prometheus & InfluxDB for metrics, Docker more »
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT. Must-haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the more »
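Several of the listings above describe the same underlying job: building extract-transform-load pipelines that are then orchestrated by a tool such as Airflow or dbt. A minimal, illustrative Python sketch of that pattern follows; every function name, field name, and value here is hypothetical and stands in for real source systems and warehouse tables, not for any specific employer's stack.

```python
# Illustrative extract -> transform -> load pattern. In production, each step
# would typically be an Airflow task or a dbt model rather than a plain function.
from typing import Any, Dict, List

def extract() -> List[Dict[str, Any]]:
    # Stand-in for reading from an API, an S3 object, or a source database.
    return [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "3.20"}]

def transform(rows: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    # Cast string amounts to floats and derive an integer pence column.
    return [
        {**r, "amount": float(r["amount"]),
         "amount_pence": int(float(r["amount"]) * 100)}
        for r in rows
    ]

def load(rows: List[Dict[str, Any]], sink: List[Dict[str, Any]]) -> int:
    # Stand-in for writing to a warehouse table (e.g. Snowflake or BigQuery).
    sink.extend(rows)
    return len(rows)

warehouse: List[Dict[str, Any]] = []
loaded = load(transform(extract()), warehouse)
```

An orchestrator adds what this sketch lacks: scheduling, retries, dependency ordering between steps, and monitoring, which is why tools like Airflow recur in almost every listing here.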
Data Engineer Experience working with AI frameworks and libraries (PyTorch). Confidence collaborating in complex, cross-functional teams. Strong skills with: AWS, Python, Airflow, Snowflake. Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat more »
Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, FiveTran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering more »
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the more »
Senior AWS Data Engineer - London (Hybrid) - Permanent - £65,000-£70,000 Media Agency, Snowflake, SQL, Python, dbt, Airflow, Marketing *You must be based in London, and have full permanent right to work in the UK to apply for this role* I'm currently working with a leading media agency … ASAP and I will be in touch. more »