South East London, England, United Kingdom Hybrid / WFH Options
NatPower Marine
tools: Hadoop, Spark, Kafka, etc. · Experience with relational SQL and NoSQL databases, particularly Postgres. · Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. · Experience with AWS cloud services: EC2, EMR, RDS, Redshift. · Experience with multiple data architecture paradigms (relational, non-structured, streaming) · Knowledge of various data more »
hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering including: End to End ETL pipeline development … Implementing Data Curation, metadata management and data quality tooling. Requirements: Strong Python/Java Software Engineering skills. Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake. Previous experience with Dremio, dbt, EMR or Dagster. Good Computer Science fundamentals knowledge with strong knowledge of software and data more »
Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
AWS ecosystems like Lambdas, Step Functions and ECS services. Experience of Dremio is a nice-to-have. Experience with data stack technologies, such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake more »
tooling that will empower the wider function. The core skillset: Object-Oriented Python for building software & APIs (they also use Trino & Kong). Experience with Airflow or other orchestration tools (Hadoop, Cloud Composer, Dagster, etc.). Demonstrable cloud experience developing in AWS (preferable), GCP, or Azure. Experience of developing real-time streaming more »
Key technical experience: Ability to operate in a fast-changing environment. Fluent in English. Previous cloud-based infrastructure experience, particularly with AWS. Experience using Airflow and dbt. Expert SQL knowledge. Solid understanding of Dimensional Data Modelling. Experience with at least one or more of these programming languages: Python, Scala …/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with ambiguity and change. A self-starter who's able to work independently where necessary more »
schemas to support business requirements Develop and maintain data ingestion and processing systems using various tools and technologies, such as SQL, NoSQL, ETL, Luigi, Airflow, Argo, etc. Implement data storage solutions using different types of databases, such as relational, non-relational, or cloud-based. Working collaboratively with the client …/Azure SQL, PostgreSQL) You have framework experience within either Flask, Tornado or Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving more »
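The ETL tools named above (Luigi, Airflow, Argo) orchestrate exactly the extract-transform-load stages the listing describes. A minimal pure-Python sketch of those stages, with a list standing in for the warehouse target and invented column names, might look like:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast amounts to float and drop malformed rows."""
    cleaned = []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])
        except (KeyError, ValueError):
            continue  # drop rows that fail validation
        cleaned.append(row)
    return cleaned

def load(rows: list[dict], target: list) -> int:
    """Load: append to the target store; a list stands in for a DB table."""
    target.extend(rows)
    return len(rows)

# Wire the stages into a pipeline, as an orchestrator would schedule them.
raw = "id,amount\n1,10.5\n2,oops\n3,7.0\n"
warehouse: list[dict] = []
loaded = load(transform(extract(raw)), warehouse)
```

An orchestrator such as Airflow or Luigi adds scheduling, retries, and dependency tracking around stages like these rather than changing their shape.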
converting SAS-based modules to Python-based solutions. Strong understanding of data management principles and experience working with Snowflake. Proficiency in Python, DBT, and Airflow or similar technologies. Excellent problem-solving skills and ability to troubleshoot complex issues. Experience working in an Agile environment and collaborating with cross-functional more »
Machine Learning Engineer, up to £95,000, London. NEW Machine Learning Engineer Opportunity Available with a Leading Organisation! The Company: Are you an expert in Machine Learning? We are on the lookout for a Machine Learning Engineer to join an more »
have Terraform experience. SQL & NoSQL experience. Have built out Data Warehouses & Data Pipelines. Strong Databricks & Snowflake experience. Docker, ECS, Kubernetes & orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to £450 p/day (Outside IR35) and will more »
delivering key projects for clients across the UK, Northern Europe, and North America. Expertise in Google Cloud and modern data stack technologies (Fivetran, DBT, Airflow, BigQuery, etc.). Lead and sometimes work within project teams. Manage multiple concurrent projects and meet client deadlines. What you need: Have hands-on … experience with Google Cloud. Ideally Google Cloud Certified. Have experience working with a mixture of Fivetran, DBT, Airflow, Cube, Rudderstack, Snowflake, and Looker. Possess outstanding analytical and problem-solving skills. Demonstrate proficiency in SQL/Python. Excel in consulting roles with the ability to manage multiple engagements – client-facing more »
Lead Machine Learning Engineer, up to £105,000, London. NEW Machine Learning Engineer Opportunity Available with a Leading Organisation! The Company: Are you an expert in Machine Learning? We are on the lookout for a Machine Learning Engineer to join more »
pandas, numpy, pyspark. Good understanding of OOP, software design patterns, and SOLID principles. Good experience in Docker. Good experience in Linux. Good experience in Airflow. Good knowledge of cloud architecture. Good experience in Terraform. Expert experience with database systems (Snowflake, SQL, Postgres, etc.). Experience of micro-service development and more »
building APIs & SQLAlchemy for database interactions. Strong experience in cloud-based development (AWS). Proficiency with both Docker & Kubernetes for containerisation & orchestration. Understanding of Airflow & DAGs. Experience in building applications using Kafka. Solid OOP principles & design patterns. Permanent/Full-Time Employment. Hybrid working environment (2/3 days more »
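The "Airflow & DAGs" requirement above is about resolving task dependencies before execution. Airflow itself is not assumed to be installed here, so Python's stdlib `graphlib` (available since 3.9) can sketch the same DAG-resolution idea; the task names are invented for illustration:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Each key lists the tasks it depends on, the way an Airflow DAG wires
# operators together. Task names are hypothetical.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}

def run_dag(graph: dict[str, set[str]]) -> list[str]:
    """Resolve the DAG and 'run' tasks in dependency order."""
    order = []
    for task in TopologicalSorter(graph).static_order():
        order.append(task)  # a real orchestrator would invoke the operator here
    return order

run_order = run_dag(dag)
```

Airflow layers scheduling, retries, and distributed executors on top of this ordering step, but the topological sort is the core of how a DAG run is planned.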
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond®
ensuring best practices, quality in data transformation and modelling. Essential experience with tech including: GCP, SQL and DBT. Preferably working experience with: Kafka, Dataform, Airflow, Tableau, PowerBI, Redshift, Snowflake, Terraform and BigQuery. This position does not offer Visa Sponsorship; please refrain from applying if you require sponsorship at any more »
to develop unit test cases. Help in backlog grooming. Key skills: Extensive experience in developing Big Data pipelines in cloud using Big Data technologies such as Apache Spark. Expertise in performing complex data transformation using Spark SQL queries. Experience in orchestrating data pipelines using Apache Airflow. Proficiency in Git more »
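The Spark SQL pattern mentioned above, staging records as a table and expressing the transformation declaratively in SQL, can be sketched with stdlib sqlite3 (Spark itself is assumed unavailable here; the table and column names are invented):

```python
import sqlite3

# Stage raw records as a table, then transform with a declarative query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("ABC", 10, 2.0), ("ABC", 5, 3.0), ("XYZ", 7, 1.5)],
)

# In PySpark, an equivalent query would run via spark.sql(...) over a temp
# view; the GROUP BY aggregation pattern is the same.
rows = conn.execute(
    "SELECT symbol, SUM(qty * price) AS notional "
    "FROM trades GROUP BY symbol ORDER BY symbol"
).fetchall()
```

The substance of the skill is writing the transformation as SQL; the engine underneath (SQLite here, Spark's distributed executor in the listing) changes the scale, not the query shape.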
Westminster, Colorado, United States Hybrid / WFH Options
Maxar Technologies
Prior experience with CI/CD technologies such as Jenkins. Prior experience with any of the following: Trino/Starburst, dbt (core or cloud), Apache Superset, OpenMetadata, Apache Airflow, Tableau. Prior experience with RDS databases, or Postgres. Agile software development lifecycle experience. These skills would be amazing more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of more »
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
excellence in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration - enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure as more »
processing and analytics. Desired experience: Worked with Python 3.9+. Familiar with Python test automation. Experience with SQL and Timeseries databases. Familiar with Parquet, Arrow, Airflow, Databricks. Experience with cloud AWS services, such as S3, EC2, RDS, etc. Quality engineering best practice and tooling including TDD, BDD. This is an more »
South East London, England, United Kingdom Hybrid / WFH Options
Burns Sheehan
data lake or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack - Python, AWS, Airflow and DBT. Must-haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the more »
South East London, England, United Kingdom Hybrid / WFH Options
Hunter Bond
degree in Computer Science, or similar, and have 4-5 years minimum exposure to back end development in Python, and Kubernetes automation. Skills in Airflow are of extra interest. In this role, you will: Execute elite software solutions; help with design, development, and technical troubleshooting. Create secure and high-quality software code more »
South East London, England, United Kingdom Hybrid / WFH Options
Durlston Partners
cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering including: End to End ETL pipeline development and deployment … processes and tools. Implementing Data Curation, metadata management and data quality tooling. Requirements: Strong Python/Java Software Engineering skills. Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake. Previous experience with Dremio, dbt, EMR or Dagster. Good Computer Science fundamentals knowledge with strong knowledge of software and data architecture. If you more »
financial services or energy trading industry. Expertise in Python and its ecosystem of libraries and frameworks for data processing, data analysis and data visualisation. Airflow: detailed understanding of architecture including schedulers, executors, operators. Cloud Environments: understanding of principles, technologies and services for AWS/Azure. Kubernetes EKS/AKS … including high availability. Desired experience: Worked with Python 3.9+. Familiar with Python test automation. Experience with SQL and Timeseries databases. Familiar with Parquet, Arrow, Airflow, Databricks. Experience with cloud AWS services, such as S3, EC2, RDS, etc. Quality engineering best practice and tooling including TDD, BDD. This is an more »
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
the above project of redesigning the Creditsafe platform into the cloud space. You will be expected to work with technologies such as Python, Linux, Airflow, AWS DynamoDB, S3, Glue, Athena, Redshift, Lambda, API Gateway, Terraform, CI/CD. KEY DUTIES AND RESPONSIBILITIES: You will actively contribute to the codebase and participate … in peer reviews. Design and build a metadata-driven, event-based distributed data processing platform using technologies such as Python, Airflow, Redshift, DynamoDB, AWS Glue, S3. As an experienced Engineer, you will play a critical role in the design, development, and deployment of our business-critical system. You will be building and scaling more »
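The "metadata-driven, event-based" platform described above typically routes each event to a handler looked up from a registry rather than hard-coded branching. A hypothetical stdlib-only sketch of that dispatch pattern (all event types, handlers, and keys are invented for illustration):

```python
from typing import Callable

# Registry mapping event types to handlers: the 'metadata' that drives routing.
HANDLERS: dict[str, Callable[[dict], dict]] = {}

def handles(event_type: str):
    """Register a handler for an event type in the metadata registry."""
    def register(fn):
        HANDLERS[event_type] = fn
        return fn
    return register

@handles("file_landed")
def ingest(event: dict) -> dict:
    # Hypothetical ingest step for a newly landed file.
    return {"status": "ingested", "key": event["key"]}

@handles("schema_changed")
def refresh(event: dict) -> dict:
    # Hypothetical schema-refresh step.
    return {"status": "schema_refreshed", "table": event["table"]}

def dispatch(event: dict) -> dict:
    """Route an event to its handler based on its type metadata."""
    return HANDLERS[event["type"]](event)

result = dispatch({"type": "file_landed", "key": "raw/file.csv"})
```

In a production setting the events would arrive from a queue or S3 notification and the registry would live in a config store, but the lookup-then-dispatch shape is the same.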