Required: Python, SQL, Kubernetes, CI/CD experience with relevant tooling such as Jenkins, Docker or Terraform, cloud services experience with AWS/Azure. Ideally: Airflow, Java, experience working with front office trading systems and financial market data. For more information on this role or any other contract/permanent more »
processing and analytics. Desired experience: worked with Python 3.9+; familiar with Python test automation; experience with SQL and timeseries databases; familiar with Parquet, Arrow, Airflow, Databricks; experience with cloud AWS services such as S3, EC2, RDS etc.; quality engineering best practice and tooling including TDD, BDD. This is an more »
or warehouse. Data Pipeline Development: design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT. Must-haves: a team player, happy to work with several teams; this is key as you will be reporting directly to the more »
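To illustrate the Python/AWS/Airflow/DBT stack mentioned above, here is a minimal sketch (not taken from any of these roles) of an Airflow DAG that chains a Python extract step into a dbt run. It assumes Airflow 2.4+ with dbt available on the worker; the DAG id, task names and dbt project path are hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders():
    # Hypothetical extract step; a real pipeline might pull from an API or S3
    # and land raw files for dbt to model.
    print("extracting raw orders...")


with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)

    # Run the dbt project once the raw data has landed (path is a placeholder).
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/orders",
    )

    extract >> transform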
building APIs & SQLAlchemy for database interactions. Strong experience in cloud-based development (AWS). Proficiency with both Docker & Kubernetes for containerisation & orchestration. Understanding of Airflow & DAGs. Experience in building applications using Kafka. Solid OOP principles & design patterns. Permanent/Full-Time Employment. Hybrid working environment (2/3 days more »
existing systems and ingestion pipelines. Requirements: proven experience working with Python, Java or C#; experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive; strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies; a passion for automation more »
quality and identify areas for improvement to implement practical solutions. Key requirements: background in Python development from an engineering or development environment; experience with Airflow, Cloud (AWS) and Pandas more »
are travelling and how to get there, they just need your help to do so. This will suit someone with experience of Python, DBT, Airflow, AWS, Snowflake, Terraform and Kubernetes. Both engineering and DevOps knowledge and/or experience would be helpful. Expect in the next 12-24 months this more »
data processing, analysis, and visualization libraries; experienced with SQL and timeseries databases; skilled in AWS services: S3, EC2, RDS; knowledgeable in ETL tools like Airflow; proficient in Git, CI/CD, testing tools, and documentation best practices; adheres to quality engineering practices including TDD and BDD. Nice to have more »
converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional more »
an exciting opportunity with one of our multinational retail clients. Skills & Experience: - Experience working within a global organisation - Google Cloud Platform (GCP) - Background using Airflow/Cloud Composer with Python - Cloud-based data platforms, Snowflake or BigQuery - Advanced SQL - Data transformation tools, DBT - CI/CD - TDD. If you more »
range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark, Hive and a lot more. You’ll need to come from a strong academic background with some commercial experience in a data heavy more »
of SQL with extensive experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch). Demonstrated experience delivering data products in a modern BI technology (ideally Looker) or open-source data frameworks. Experience using programming languages (e.g. more »
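As a concrete illustration of "scheduling your own queries" with Airflow against a cloud warehouse such as BigQuery, the sketch below assumes Airflow 2.4+ with the Google provider installed; the dataset, table names and SQL are placeholders, not details from any listing.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Each run rebuilds a daily aggregate in a reporting dataset.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE reporting.daily_sales AS "
                    "SELECT order_date, SUM(amount) AS total "
                    "FROM raw.orders GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )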
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics more »
Agile software development and system architecture within the Telco OSS domain, with preferred experience in Network GIS (Hexagon, IQ Geo) and workflow tooling (Appian, Apache Airflow). Strong understanding of platform and product dynamics, including Platform Engineering and its relevance to OSS. Extensive background in DevOps practices, encompassing more »
and IAM. Experience with containerization and orchestration tools, particularly Kubernetes. Proficiency in infrastructure as code tools such as Terraform, Ansible, or CloudFormation. Experience in Apache Airflow, AWS Backup & S3 versioning. Solid understanding of CI/CD concepts and experience implementing CI/CD pipelines using tools like Jenkins more »
has multiple years of experience using Snowflake as a data tool and can hit the ground running. Experience: Snowflake; financial services; cloud (ideally Azure). Airflow would be an advantage more »
team ASAP. Capital Markets experience is a must for this. This is a hybrid role, London-based and 2 days per week onsite. Python, Airflow, PySpark, SQL, Jenkins/Docker, Kubernetes, AWS. Please apply via the link below or email me on more »
have Terraform experience; SQL & NoSQL experience; have built out data warehouses & built data pipelines; strong Databricks & Snowflake experience; Docker, ECS, Kubernetes & orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to £450 p/day (Outside IR35) and will more »
load data to Snowflake. Converting SAS-based modules to Python-based solutions. Candidates who have data management experience in Snowflake with expertise in Python, DBT, Airflow or similar technologies. Must have hands-on experience in DBT & SQL. Snowflake (added advantage more »
Data Engineer. Experience working with AI frameworks and libraries (PyTorch); confidence collaborating in complex, cross-functional teams; strong skills with AWS, Python, Airflow, Snowflake. Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat more »
hands on and will require exposure to some essential tools and a high level of financial knowledge. Good exposure to AWS technologies is essential; Airflow would be desirable. Experience with batch processing is required. Experience working in a Linux environment. Scripting exposure in either Python, Bash or Shell. more »
Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, FiveTran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering more »