Greater London, England, United Kingdom Hybrid / WFH Options
Validis
CI/CD tools to streamline data pipeline development and deployment. Proven expertise in designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks. Strong understanding of data warehousing concepts and data modelling techniques. Experience with SQL and proficiency in writing complex queries (e.g., T…
Greater London, England, United Kingdom Hybrid / WFH Options
Validis
to leverage CI/CD tools to streamline data pipeline development and deployment. Experience designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks (familiarity is a plus). Understanding of data warehousing concepts and data modelling techniques. Experience with SQL and proficiency in writing…
Qualifications * Designing and implementing real-time pipelines. * Designing and implementing data pipelines for CV/ML systems. * Experience with workflow management engines (e.g. Airflow, Luigi, Prefect, Dagster, Google Cloud Composer, AWS Step Functions, Azure Data Factory, UC4, Control-M). * Experience with data quality and validation. * Experience querying massive datasets…
via machine learning engineering. Hands-on knowledge of NoSQL and relational databases, alongside relevant data modelling techniques. ETL/ELT tooling; knowledge of DBT, Luigi or similar orchestration tooling. Experience of managing and developing a team in an agile environment. Preferred: software development of a product/service provided as SaaS…
London, South East England, United Kingdom Hybrid / WFH Options
Solirius Consulting
models, schemas to support business requirements. Develop and maintain data ingestion and processing systems using various tools and technologies, such as SQL, NoSQL, ETL, Luigi, Airflow, Argo, etc. Implement data storage solutions using different types of databases: relational, non-relational, or cloud-based. Working collaboratively with the client … SQL/Azure SQL, PostgreSQL). You have framework experience with Flask, Tornado or Django, plus Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data…