London, South East England, United Kingdom Hybrid / WFH Options
Saragossa
SQL and experienced with various database technologies. Knowledge of Python or Java, with the ability to leverage either in building scalable solutions. Experience with Airflow & other big data technologies is useful. Familiarity with DevOps practices and tools, including CI/CD pipelines. Previous experience with reporting tools is helpful.
London, South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
transformation, cleaning, and loading. Strong coding experience with Python and Pandas. Experience with any flavour of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Experience supporting and working with cross-functional teams in a dynamic
Tech: SQL & Python are your native languages, with a dash of Scala when needed. DBT, data modeling, and analytics are your go-to tools; Airflow is your daily companion. BigQuery/GCP hold no secrets for you, and AWS is a trusted friend. You know when to build real
Altrincham, Cheshire, North West, United Kingdom Hybrid / WFH Options
Chroma Recruitment Ltd
Strong proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, MongoDB). Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Luigi). Experience working with Google Cloud Platform tools. Proficiency in programming languages such as Python, Java, or Scala. Desirable skills: Experience
Chicago, Illinois, United States Hybrid / WFH Options
Jobot
engineering leadership role with Ruby. Experience designing, building, and maintaining large-scale, complex software systems. Strong experience with several of Ruby, Python, Airflow, SQL, JS, and web frameworks. Experience in multiple areas of software development such as backend, frontend, infrastructure, database design, etc. Mastery of software architecture
City of London, London, United Kingdom Hybrid / WFH Options
Square One Resources
Job Type: Permanent. Job Responsibilities/Objectives: Design and manage data warehouses using SQL, NoSQL, and cloud platforms. Develop ETL/ELT pipelines using Airflow and dbt. Collaborate with stakeholders to deliver impactful solutions. Ensure data quality, security, and governance. Required Skills/Experience: The ideal candidate will have
London, South East England, United Kingdom Hybrid / WFH Options
ENI – Elizabeth Norman International
business needs into technical solutions using SQL and cloud tools. 🧠 What you’ll bring: Strong SQL skills. Proven experience building and maintaining data pipelines (Airflow, DBT, etc.). Familiarity with cloud data platforms like Snowflake, BigQuery, or Redshift. Solid experience with BI tools like Looker, Tableau, or similar. Understanding
understanding of data architecture, data modelling, and best practices in data engineering. Proficient in Python and SQL; experience with data processing frameworks such as Airflow, TensorFlow, or Spark is advantageous. Willingness to gain working knowledge of backend development (e.g., Python with Django) for pipeline integration. Familiarity with data versioning
Databricks (PySpark, SQL, Delta Lake, Unity Catalog). You have extensive experience in ETL/ELT development and data pipeline orchestration (Databricks Workflows, DLT, Airflow, ADF, Glue, Step Functions). You're proficient in SQL and Python, using them to transform and optimize data. You know your way around
Oxford, Oxfordshire, United Kingdom Hybrid / WFH Options
Connect Centric LLC
to enhance scalability and efficiency. Collaboration & Leadership: Work closely with software and AI engineering teams while mentoring junior engineers. Legacy Workflow Integration: Manage ArgoCD, Airflow, Jenkins, Bitbucket, and Bamboo pipelines. Technical Ownership: Act as a tech owner for software products, liaising with stakeholders and presenting cloud solutions. Continuous Learning
Solid understanding of ETL/ELT processes, along with hands-on experience building and maintaining data pipelines using DBT, Snowflake, Python, SQL, Terraform and Airflow. Experience in designing and implementing data products and solutions on cloud-based architectures. Cloud Platforms: Experience working with cloud data warehouses and analytics platforms
analysis, extraction, transformation, and loading, data intelligence, data security and proven experience in their technologies (e.g. Spark, cloud-based ETL services, Python, Kafka, SQL, Airflow). You have experience in assessing the relevant data quality issues based on data sources & use cases, and can integrate the relevant data quality checks
development lifecycle and agile methodologies. Proven experience designing, developing, and deploying machine learning models. Experience with debugging ML models. Experience with orchestration frameworks (e.g. Airflow, MLflow, etc.). Experience deploying machine learning models to production environments. Knowledge of MLOps practices and tools for model monitoring and maintenance. Familiarity with
London, South East England, United Kingdom Hybrid / WFH Options
Parser
data platform (PaaS). Ensure scalable storage solutions (data lakes, data warehouses) to handle structured and unstructured data. Implement ETL/ELT pipelines using Dagster, Airflow, or similar tools. Optimize performance and scalability for large data volumes. Govern data security, compliance, and access controls. Development & DevOps: Strong programming and scripting
High attention to detail. Strong project management skills, capable of independently delivering projects. Excellent client-facing communication skills. Nice to Have: Knowledge of dbt, Airflow, and CI/CD pipelines. Experience with dashboard solutions such as Looker, PowerBI, etc. Experience with marketing platforms and understanding of marketing data.
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
Cloud products (BigQuery, GCS, Cloud Composer, Dataflow etc.) or similar products from other cloud platforms. Experience with open-source data-stack tools such as Airflow, Airbyte, DBT, Kafka etc. Experience with data visualisation tools such as PowerBI, Tableau and/or Looker. Knowledge of Teradata, Mainframe and/or
experience analysis. Knowledge of dashboard design and data visualization best practices. Experience with cloud-based data infrastructure (AWS). Familiarity with modern data stack tools (Airflow, dbt, etc.). Why This Role Matters: Judge.me is at an inflection point. As the market leader in Shopify reviews, we've chosen to build
Staines, Middlesex, United Kingdom Hybrid / WFH Options
Industrial and Financial Systems
in data pipelines across cloud/on-premises, using Azure and other technologies. Experienced in orchestrating data workflows and Kubernetes clusters on AKS using Airflow, Kubeflow, Argo, Dagster or similar. Skilled with data ingestion tools like Airbyte, Fivetran, etc. for diverse data sources. Expert in large-scale data processing
Arlington, Virginia, United States Hybrid / WFH Options
Full Visibility LLC
or similar). Proficiency in data parsing and transformation, handling structured and unstructured data. Hands-on experience with ETL tools and data workflow orchestration (e.g., Apache Airflow, Luigi, Prefect). Strong programming skills in Python, SQL, or Scala. Experience with open-source data processing tools (e.g., Kafka, Spark, Flink, Hadoop