and secure data handling. Requirements: 5+ years in data engineering, or strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake. Proficiency in SQL, Python, and tools like dbt and Airflow. Familiarity with DevOps practices in a data context. Benefits: Work on impactful, enterprise-wide data projects. Collaborate with architects, analysts, and data scientists. Be part of a supportive, innovative …
and data visualisation. BSc in Computer Science or a related discipline. Good working knowledge of SQL. Comfortable using Git for version control. Exposure to workflow orchestration tools (e.g. Prefect, Airflow, Dagster). Experience with cloud data warehouses (Azure SQL, Snowflake) or dbt. Basic familiarity with Docker and BI tools (Power BI, Tableau). Interest in shipping, financial markets, or commodities. Package …
Computer Science or a related discipline. Solid Python programming skills. Good working knowledge of SQL. Comfortable using Git for version control. Desirables: Exposure to workflow orchestration tools (e.g. Prefect, Airflow, Dagster). Experience with cloud data warehouses (Azure SQL, Snowflake) or dbt. Basic familiarity with Docker and BI tools (Power BI, Tableau). Interest in shipping, financial markets, or commodities. Package …
analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong …
code and building robust processes for scale. Nice-to-have: Experience shipping at pace in AWS S3 environments. Experience working with the Dagster orchestrator or similar data orchestration systems (e.g. Airflow, Databricks). Software engineering background. Benefits: Competitive salary & equity options. Unlimited holiday. Benefits package. Career development opportunities as the company scales. Ownership of ambitious, mission-driven work with real …
bash. Ability to document code, architectures, and experiments. Preferred Qualifications: Experience with databases and data warehousing (Hive, Iceberg). Data transformation skills (SQL, dbt). Experience with orchestration platforms (Airflow, Argo). Knowledge of data catalogs, metadata management, vector databases, and relational/object databases. Experience with Kubernetes. Understanding of computational geometry (meshes, boundary representations). Ability to analyze data …
applications to the Cloud (AWS). We'd love to hear from you if you: Have strong experience with Python & SQL. Have experience developing data pipelines using dbt, Spark and Airflow. Have experience with data modelling (building optimised and efficient data marts and warehouses in the cloud). Work with Infrastructure as Code (Terraform) and containerising applications (Docker). Work with AWS, S3 …
concepts to diverse audiences and collaborate effectively across teams. Bonus Points For: Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker). Experience with specific orchestration tools (e.g., Airflow, dbt). Experience working in Agile/Scrum development methodologies. Experience with Big Data technologies & frameworks. Join Us! This role can be based in either of our amazing offices …
by a proactive and self-motivated passion for data and technology. Nice, non-mandatory pluses would be: Good understanding of building and maintaining data pipelines using tools such as Airflow and Docker. Knowledge of data governance and data quality frameworks. Experience working in an Agile development environment. At JET, this is on the menu: Our teams forge connections internally and …
data into a data platform using Fivetran. Experience of developing BI dashboards using Power BI. Knowledge of security concepts relevant to Azure. Experience of workflow management tools such as Apache Airflow. Interested in the role? Complete the online application. We look forward to getting to know you. Discover more about LGT Wealth Management.
Easter Howgate, Midlothian, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
methodologies and tools, including experience with CI/CD pipelines, containerisation, and workflow orchestration. Familiar with ETL/ELT frameworks, and experienced with big data processing tools (e.g. Spark, Airflow, Hive, etc.). Knowledge of programming languages (e.g. Java, Python, SQL). Hands-on experience with SQL/NoSQL database design. Degree in STEM, or similar field; a Master's is …
s Degree (Engineering/Computer Science preferred but not required), or equivalent experience required. Deep proficiency in Python, SQL, and cloud platforms (AWS, GCP, Azure). Data warehousing (Snowflake), orchestration (Airflow, Rundeck), streaming (Kafka). Continuous engagement with Data Science and Analytics colleagues to understand requirements for our data assets and empower them with the best possible data, to create high-value …
through coursework, Kaggle competitions, or personal data projects. You've shown initiative in teaching yourself new technical tools or concepts beyond what was required, such as exploring BigQuery, dbt, Airflow, Docker, or other data engineering technologies on your own time. Progression: This is an initial six-month engagement. If you perform well, the expectation is that you'll move …
Product Owner or Product Owner for data/analytics platforms. Understanding of the software development lifecycle with a data-centric lens. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Proven experience translating data and analytics requirements into actionable backlog items. Knowledge of regulatory and compliance frameworks (e.g., GDPR, CCPA) as …
in trust metrics or customer experience analysis. Knowledge of dashboard design and data visualization best practices. Experience with cloud-based data infrastructure (AWS). Familiarity with modern data stack tools (Airflow, dbt, etc.). Why This Role Matters: Judge.me is at an inflection point. As the market leader in Shopify reviews, we've chosen to build our future with Shopify because …
such as Tableau, with a focus on optimizing underlying data structures for dashboard performance. Ingestion and orchestration tools: Skilled in using pipeline orchestration and data ingestion tools such as Airflow and Stitch, along with Python scripting for integrating diverse data sources. Large-scale data processing: Proficient with distributed query engines like AWS Athena or SparkSQL for working with datasets …
leading and managing technical teams, with excellent people development skills. Strong project management skills, with experience running complex data initiatives. Strong knowledge of modern data engineering, including SQL, Python, Airflow, Dataform/dbt, Terraform, or similar tools. Understanding of data architecture patterns (e.g., lakehouse, event-driven pipelines, star/snowflake schemas). Excellent communication and stakeholder management skills. Experience …
Fairfax, Virginia, United States Hybrid / WFH Options
CGI
skills and ability to collaborate effectively with team members and stakeholders. Required qualifications to be successful in this role: Certifications in Google Cloud Platform. Experience with orchestration tools like Apache Airflow. Knowledge of machine learning and AI tools on GCP. CGI is required by law in some jurisdictions to include a reasonable estimate of the compensation range for this …
in the face of many nuanced trade-offs and varied opinions. Experience in a range of toolsets comparable with our own: Database technologies: SQL, Redshift, Postgres, dbt, Dask, Airflow, etc. AI feature development: LangChain, LangSmith, pandas, NumPy, scikit-learn, SciPy, Hugging Face, etc. Data visualization tools such as Plotly, Seaborn, Streamlit, etc. You are: Able to chart …