pipeline solutions for the ingestion, transformation, and serving of data, as well as solutions for the orchestration of pipeline components (e.g. AWS Step Functions, Apache Airflow). Good understanding of data modelling, algorithms, and data transformation techniques to work with data platforms. Working knowledge of cloud development practices …
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java, or C#. Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive. Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation …
Azure SQL Data Warehouse, or Amazon Redshift. Support and learn the associated scheduling logic for data pipelines using a scheduling tool such as Autosys or Airflow. Read and translate logical data models and use ETL skills to load the physical layer, based on an understanding of the timing of data loads …
data processing, analysis, and visualization libraries. Experienced with SQL and time-series databases. Skilled in AWS services: S3, EC2, RDS. Knowledgeable in ETL tools like Airflow. Proficient in Git, CI/CD, testing tools, and documentation best practices. Adheres to quality engineering practices including TDD and BDD. Nice to have …
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics …
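The GCP stack this listing describes (Airflow orchestrating Python and SQL transforms into BigQuery) commonly takes a shape like the sketch below. This is a minimal illustration only, not any employer's actual pipeline: the DAG, the file paths, and the analytics.daily_sales table are hypothetical, and it assumes the apache-airflow and google-cloud-bigquery packages (plus pyarrow) with default GCP credentials configured.

```python
# Minimal sketch of an Airflow DAG that transforms a CSV and loads it into
# BigQuery. Hypothetical names throughout; assumes apache-airflow,
# google-cloud-bigquery, and pyarrow are installed.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_sales_load():
    @task
    def extract() -> str:
        # In practice this might pull from GCS or an API; here, a local file.
        return "/tmp/sales.csv"

    @task
    def transform(path: str) -> str:
        df = pd.read_csv(path)
        df["amount"] = df["amount"].fillna(0)  # basic data-quality fix
        out = "/tmp/sales_clean.parquet"
        df.to_parquet(out)
        return out

    @task
    def load(path: str) -> None:
        # Imported inside the task to keep DAG-file parsing lightweight.
        from google.cloud import bigquery

        client = bigquery.Client()
        job = client.load_table_from_dataframe(
            pd.read_parquet(path), "analytics.daily_sales"  # hypothetical table
        )
        job.result()  # wait for the load job to finish


    load(transform(extract()))


daily_sales_load()
```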
and IAM. Experience with containerization and orchestration tools, particularly Kubernetes. Proficiency in infrastructure-as-code tools such as Terraform, Ansible, or CloudFormation. Experience in Apache Airflow, AWS Backup & S3 versioning. Solid understanding of CI/CD concepts and experience implementing CI/CD pipelines using tools like Jenkins …
load data to Snowflake. Converting SAS-based modules to Python-based ones. Candidates who have data management experience in Snowflake with expertise in Python, dbt, Airflow, or similar technologies. Must have hands-on experience in dbt & SQL; Snowflake (added advantage …
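As a rough illustration of the SAS-to-Python conversion plus Snowflake loading this role mentions, the sketch below rewrites a simple SAS-style aggregation in pandas and pushes the result to Snowflake. All credentials, file names, and the REGION_PAID_SUMMARY table are placeholders, and it assumes snowflake-connector-python installed with the pandas extra.

```python
# Sketch: a SAS-style aggregation rewritten in pandas, then loaded to
# Snowflake. Account, user, and table names are placeholders; assumes
# snowflake-connector-python[pandas] is installed.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.read_csv("claims.csv")  # stand-in for the SAS input dataset
summary = (
    df.groupby("region", as_index=False)["paid_amount"].sum()  # PROC SUMMARY-style rollup
)

conn = snowflake.connector.connect(
    account="my_account",   # placeholder credentials
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    # Creates the target table if absent, then bulk-loads the DataFrame.
    write_pandas(conn, summary, "REGION_PAID_SUMMARY", auto_create_table=True)
finally:
    conn.close()
```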
hands-on and will require exposure to some essential tools and a high level of financial knowledge. Good exposure to AWS technologies is essential; Airflow would be desirable. Experience with batch processing is required. Experience working in a Linux environment. Scripting exposure in Python, Bash, or shell. …
as thinking strategically and proactively identifying opportunities. You have experience collaborating with senior business stakeholders and finance teams. You have working knowledge of Python, Airflow, dbt, BigQuery, and Looker. Additional desirables: Experience in one or more finance domains, such as Financial Reporting, Treasury, Regulatory Reporting, Financial Planning & Analysis, Financial …
City of London, England, United Kingdom (Hybrid / WFH options)
Cititec Talent
a leading commodities trading firm. Outside of IR35. Hybrid working: 2-3 days in the London office. Experience required: commodities industry experience; weather forecasting; Airflow (or equivalent data orchestration platforms); data modelling (star schema); data warehousing; data pipeline orchestration (Kafka); on-prem SQL …
a production setting. Knowledge of developing real-time data stream systems (ideally Kafka). Proven track record in developing data systems using PySpark and Apache Spark for batch processing. Capable of managing data intake from various sources, including data streams, unstructured data, relational databases, and NoSQL databases. Extensive knowledge … data pipelines. Proficiency in Python programming (knowledge of Scala or Rust is a plus). Familiarity with ETL principles in contemporary data applications (Dagster, Airflow, Prefect). Familiarity with AWS services such as Glue, Redshift, Athena, and S3. Proficiency with Terraform, Kubernetes, and ArgoCD (expertise not required; cloud team …
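For the PySpark batch-processing experience called out above, a representative job might look like the following minimal sketch. The bucket paths and column names are invented for illustration; it assumes a Spark runtime with S3 connectors configured.

```python
# Minimal PySpark batch job: read raw JSON events, clean them, and write
# partitioned Parquet. Paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events_batch").getOrCreate()

events = spark.read.json("s3a://raw-bucket/events/")  # semi-structured input
clean = (
    events
    .filter(F.col("event_type").isNotNull())          # drop malformed rows
    .withColumn("event_date", F.to_date("event_ts"))  # derive partition key
)
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://curated-bucket/events/"
)
spark.stop()
```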
. Experience with SQL and query design on large, complex datasets. Experience with cloud and big-data tools and frameworks like Databricks/Spark, Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as DBT … FiveTran, etc. Understanding of Agile delivery best practices. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong problem-solving and analytical abilities. Ability to present solutions and limitations to non-IT business experts. ABOUT YOU: Integrity, respect, intellectual curiosity and an …
processes, effective monitoring, and infrastructure-as-code using Terraform. Collaborate closely with our engineering teams to support the orchestration of our ETL pipelines using Airflow, and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for …
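One plausible slice of the stack described here, a Kafka consumer landing events in PostgreSQL for downstream Airflow/ETL processing, is sketched below. The topic, connection string, and orders_raw table are hypothetical, and it assumes the kafka-python and psycopg2 packages.

```python
# Sketch of a Kafka-to-PostgreSQL landing step. Topic, DSN, and table are
# placeholders; assumes kafka-python and psycopg2 are installed.
import json

import psycopg2
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                          # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
conn = psycopg2.connect("dbname=app user=etl")  # placeholder DSN

with conn, conn.cursor() as cur:
    for msg in consumer:               # blocks, consuming indefinitely
        cur.execute(
            "INSERT INTO orders_raw (payload) VALUES (%s)",
            (json.dumps(msg.value),),
        )
        conn.commit()                  # commit per message; batch in real workloads
```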
London, England, United Kingdom (Hybrid / WFH options)
Harnham
Machine Learning Engineer, up to £75,000, London. NEW Machine Learning Engineer opportunity available with a leading organisation! The Company: Are you an expert in machine learning? We are on the lookout for a Machine Learning Engineer to join an …
MACHINE LEARNING ENGINEER, £75,000, HYBRID - London. COMPANY: We are looking for a Machine Learning Engineer to join a leading marketing consultancy. They are looking to grow to be an even bigger player within the industry through data scientists/ …
or warehouse. Data Pipeline Development: design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow, and dbt. Must-haves: a team player, happy to work with several teams; this is key, as you will be reporting directly to the …
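A common way to wire the Python/AWS/Airflow/dbt stack mentioned above is an Airflow DAG in which an ingest task feeds a dbt run. The sketch below shows that pattern under assumed paths, project locations, and task names, none of which come from the listing.

```python
# Common orchestration pattern: Airflow triggers dbt after an ingest step.
# DAG id, script path, and dbt project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_to_warehouse",
        bash_command="python /opt/pipelines/ingest.py",  # placeholder script
    )
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",   # runs the dbt models
    )
    ingest >> dbt_run  # dbt transforms only after ingestion succeeds
```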
expertise in tools spanning data warehousing, ETL, internal visualisation, and analytics. Good examples are Snowflake, GCP, Azure Analytics, SageMaker, Databricks, Tableau, Power BI, Looker, QuickSight, Airflow, astronomer.io, Alteryx, Collibra. Hands-on experience and/or a detailed and deep understanding of the workflow and approaches of: BI analyst for visualisation …
Greater London, England, United Kingdom (Hybrid / WFH options)
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the …
non-routine issues and identify improvements in the testing and validation of data accuracy. Extensive experience with Snowflake is essential, and working knowledge of dbt, Airflow, and AWS is highly desirable. Strong background developing, constructing, testing, and maintaining practical data architectures, and driving improvements in data reliability, efficiency, and quality. A proven …
Data Engineer. Experience working with AI frameworks and libraries (PyTorch). Confidence collaborating in complex, cross-functional teams. Strong skills with AWS, Python, Airflow, Snowflake. Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat.
Job Band/Package description. Band: D. Contract: Permanent. Salary: 65,000 to 85,000 depending on level of experience. Location: London. Our comprehensive benefits package includes: • An employer pension contribution of up to 10% • 26 days’ annual leave (based on full …