London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … a strong financial backing, and the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake Extensive experience with dbt, Airflow, AWS and Terraform Excellent scripting skills in SQL Experience developing solutions entirely from scratch Great communication skills, with the ability to understand and translate complex requirements into technical solutions More ❯
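Roles like this lean heavily on dbt models running against Snowflake. As a minimal sketch of what that day-to-day SQL looks like (the model name, column names, and upstream `stg_orders` staging model are all hypothetical):

```sql
-- models/marts/fct_daily_orders.sql — hypothetical dbt model
-- materialised as a table in Snowflake on each `dbt run`
{{ config(materialized='table') }}

select
    order_date,
    count(*)         as order_count,
    sum(order_total) as revenue
from {{ ref('stg_orders') }}  -- upstream staging model, assumed to exist
group by order_date
```

The `ref()` macro is what lets dbt build the dependency graph between models, so Airflow (or dbt itself) can run them in the right order.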
London, England, United Kingdom Hybrid / WFH Options
Harnham
and analysing A/B tests Strong storytelling and stakeholder-management skills Full UK work authorization Desirable Familiarity with cloud data platforms (Snowflake, Redshift, BigQuery) and ETL tools (dbt, Airflow) Experience with loyalty-programme analytics or CRM platforms Knowledge of machine-learning frameworks (scikit-learn, TensorFlow) for customer scoring Technical Toolbox Data & modeling: SQL, Python/R, pandas, scikit-learn … Dashboarding: Tableau or Power BI ETL & warehousing: dbt, Airflow, Snowflake/Redshift/BigQuery Experimentation: A/B testing platforms (Optimizely, VWO) Desired Skills and Experience 8+ years in retail/FMCG customer insights and analytics Built customer segmentation, CLV, and propensity models in Python/R Designed and analysed A/B and multivariate tests for pricing More ❯
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
and learning new technologies quickly. Preferred Qualifications: Experience with software development. Experience with geospatial data. Experience building data-streaming processes. Experience using PostGIS. Experience with any of the following: Apache Hive, Trino, Presto, Starburst, OpenMetadata, Apache Superset, Terraform, dbt, Tableau, Fivetran, Airflow. Experience implementing resilient, scalable, and supportable systems in AWS. Experience using a wide variety of open More ❯
bash. Ability to document code, architectures, and experiments. Preferred Qualifications Experience with databases and data warehousing (Hive, Iceberg). Data transformation skills (SQL, DBT). Experience with orchestration platforms (Airflow, Argo). Knowledge of data catalogs, metadata management, vector databases, relational/object databases. Experience with Kubernetes. Understanding of computational geometry (meshes, boundary representations). Ability to analyze data More ❯
by a proactive and self-motivated passion for data and technology. A nice, not-mandatory plus would be: Good understanding of building and maintaining data pipelines using tools such as Airflow or Docker. Knowledge of data governance and data quality frameworks. Experience working in an Agile development environment. At JET, this is on the menu: Our teams forge connections internally and More ❯
Easter Howgate, Midlothian, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
methodologies and tools, including experience with CI/CD pipelines, containerisation, and workflow orchestration. Familiar with ETL/ELT frameworks, and experienced with Big Data Processing Tools (e.g. Spark, Airflow, Hive, etc.) Knowledge of programming languages (e.g. Java, Python, SQL) Hands-on experience with SQL/NoSQL database design Degree in STEM, or similar field; a Master's is More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as More ❯
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as More ❯
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as More ❯
Employment Type: Permanent, Part Time, Work From Home
in trust metrics or customer experience analysis Knowledge of dashboard design and data visualization best practices Experience with cloud-based data infrastructure (AWS) Familiarity with modern data stack tools (Airflow, dbt, etc.) Why This Role Matters Judge.me is at an inflection point. As the market leader in Shopify reviews, we've chosen to build our future with Shopify because More ❯
leading and managing technical teams, with excellent people development skills. Strong project management skills, with experience running complex data initiatives. Strong knowledge of modern data engineering, including SQL, Python, Airflow, Dataform/DBT, Terraform, or similar tools. Understanding of data architecture patterns (e.g., lakehouse, event-driven pipelines, star/snowflake schemas). Excellent communication and stakeholder management skills. Experience More ❯
Fairfax, Virginia, United States Hybrid / WFH Options
CGI
skills and ability to collaborate effectively with team members and stakeholders. Required qualifications to be successful in this role: Certifications in Google Cloud Platform. Experience with orchestration tools like Apache Airflow. Knowledge of machine learning and AI tools on GCP. CGI is required by law in some jurisdictions to include a reasonable estimate of the compensation range for this More ❯
in the face of many nuanced trade-offs and varied opinions. Experience in a range of tool sets comparable with our own: Database technologies: SQL, Redshift, Postgres, DBT, Dask, Airflow, etc. AI Feature Development: LangChain, LangSmith, pandas, numpy, scikit-learn, scipy, Hugging Face, etc. Data visualization tools such as plotly, seaborn, streamlit, etc. You are able to chart More ❯
West Bend, Wisconsin, United States Hybrid / WFH Options
Delta Defense
tools: Python, Snowflake, dbt, Fivetran, Kafka, Tableau, Git, Informatica, Kestra, Excel, and/or related technologies. Experience working with tech stack/tools: JS, PHP, PostgreSQL, Kubernetes, Kafka/Airflow, NeonDB, Cloudflare, Gitlab, Doppler, and/or related technologies. Experience securing applications built on cloud platforms (DigitalOcean, AWS, or GCP). Deep understanding of key industry frameworks and controls More ❯
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Starling Bank Limited
systems (e.g. Git) Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with DBT or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
systems (e.g. Git) Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow pipelines or Airflow Familiarity with DBT or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as SHAP) Desire to quantify the More ❯
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … significant impact, we encourage you to apply! Job Responsibilities ETL/ELT Pipeline Development: Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow. Implement batch and real-time data processing solutions using Apache Spark. Ensure data quality, governance, and security throughout the data lifecycle. Cloud Data Engineering: Manage and optimize … effectiveness. Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark. Implement data partitioning, caching, and performance tuning techniques to enhance Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and More ❯
In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on the Azure cloud platform. You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and … desire to make a significant impact, we encourage you to apply! Job Responsibilities Data Engineering & Data Pipeline Development Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow Implement real-time and batch data processing using Spark Enforce best practices for data quality, governance, and security throughout the data lifecycle Ensure data availability, reliability and performance through … data processing workloads Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments. Big Data & Analytics: Build and optimize large-scale data processing pipelines using Apache Spark and PySpark Implement data partitioning, caching, and performance tuning for Spark-based workloads. Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning More ❯
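The duties above all reduce to running extract, transform, and load tasks in dependency order, which is exactly what Airflow's DAGs provide. Stripped of Airflow itself, the core idea can be sketched in plain Python using the standard library's topological sorter (the task names and toy data are illustrative, not from any real pipeline):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# A toy DAG in the spirit of an Airflow pipeline: each task declares the
# tasks it depends on, and the "scheduler" runs them in a valid order.
results = {}

def extract():
    results["raw"] = [3, 1, 2]          # stand-in for reading source data

def transform():
    results["clean"] = sorted(results["raw"])

def load():
    results["loaded"] = len(results["clean"])  # stand-in for a warehouse write

dag = {
    "extract":   set(),            # no upstream dependencies
    "transform": {"extract"},      # runs only after extract
    "load":      {"transform"},    # runs only after transform
}
tasks = {"extract": extract, "transform": transform, "load": load}

order = list(TopologicalSorter(dag).static_order())
for name in order:
    tasks[name]()

print(order)   # ['extract', 'transform', 'load']
```

Airflow adds scheduling, retries, and observability on top of this ordering, but the dependency graph is the heart of it.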
in New York, NY Pay Range: $95,493 - $190,962 Job Duties: Improve and extend software systems that speed up data product development by providing automation and abstraction. Maintain Apache Airflow and Airbyte for creation of data pipelines and orchestration. Utilize container technologies such as Docker and Kubernetes. Develop and maintain a suite of internal tools to facilitate … equivalent, plus two (2) years of experience in the job offered or a related occupation. Requires 2 years of experience in each: Container technologies, e.g. Helm, Docker or Kubernetes. Apache Airflow. Observability of data platforms (Datadog, Splunk, AWS CloudWatch, or similar). Using AWS to provide cloud solutions for data platforms. Data pipelines and orchestration tools. Working in More ❯