data modeling (star schema, snowflake schema). Version Control: Practical experience with Git (branching, merging, pull requests). Preferred Qualifications (A Plus): Experience with a distributed computing framework like Apache Spark (using PySpark). Familiarity with cloud data services (AWS S3/Redshift, Azure Data Lake/Synapse, or Google BigQuery/Cloud Storage). Exposure to workflow orchestration … tools (Apache Airflow, Prefect, or Dagster). Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
Role Within the Kingdom: Work closely with stakeholders to understand their data needs and design scalable solutions. Build, maintain and optimise data pipelines and models using SQL, Python and Airflow. Design and develop BI and reporting products such as Looker models, dashboards and data visualisations. Contribute to our data modelling standards and best practices to ensure quality, reliability and … Thrills: Strong SQL skills, able to write complex and performant queries with ease. Solid experience in Python development for data workflows. Experience building and maintaining ETL pipelines, ideally with Apache Airflow or a similar orchestration tool. Hands-on experience with Google Cloud Platform (BigQuery, GCS, etc.) or another major cloud provider. Good understanding of data modelling principles and …
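Several of these roles centre on building pipelines with Python and Airflow (often against BigQuery). As a purely illustrative sketch, not code from any listing above, a minimal Airflow 2.x DAG using the TaskFlow API might look like this; the DAG name, schedule and task bodies are hypothetical placeholders:

```python
# Minimal sketch of an Airflow 2.x DAG using the TaskFlow API.
# DAG name, schedule and task logic are hypothetical placeholders, not
# details taken from any of the listings above.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="daily_sales_model_refresh",  # hypothetical name
    schedule="0 6 * * *",                # daily at 06:00 (schedule_interval on older 2.x)
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["example"],
)
def daily_sales_model_refresh():
    @task
    def extract():
        # Placeholder: pull raw rows from a source system or a landing bucket.
        return [{"order_id": 1, "amount": 42.0}]

    @task
    def transform(rows):
        # Placeholder: apply cleaning / business rules in Python or SQL.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows):
        # Placeholder: write to a warehouse table.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


dag_instance = daily_sales_model_refresh()
```

In practice, a GCP-focused role like the one above would likely replace the plain Python tasks with the Google provider's BigQuery operators or hooks rather than hand-rolled load logic.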
platforms. Proven experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able …
and cloud data platforms (AWS, Azure, or GCP). Proficiency in Python and SQL for data manipulation and transformation. Experience with ETL/ELT development and orchestration tools (e.g., Apache Airflow, dbt, Prefect). Knowledge of data modelling, data warehousing, and lakehouse architectures. Familiarity with DevOps practices, CI/CD pipelines, and infrastructure-as-code. Strong problem-solving …
deploy solutions in cloud and on-prem environments. Required Skills & Experience: Strong proficiency in Python, including libraries such as Pandas, NumPy, and PySpark. Experience with data engineering tools (e.g., Airflow, Kafka, SQL, Parquet). Solid understanding of commodities markets, trading workflows, and financial instruments. Familiarity with cloud platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes). Proven ability …
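The listing above pairs Pandas/NumPy/PySpark with commodities trading data. As a rough illustration only, a small PySpark aggregation over a hypothetical trades extract could look like the following; the file path, column names and session settings are all assumptions:

```python
# Illustrative only: aggregate a hypothetical commodities trades dataset with
# PySpark. The Parquet path, column names and session settings are assumptions,
# not details from the job listing above.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("trades-daily-volume")  # hypothetical app name
    .master("local[*]")              # local mode for the sketch
    .getOrCreate()
)

# Assume a Parquet extract with one row per trade.
trades = spark.read.parquet("data/trades.parquet")  # hypothetical path

daily_volume = (
    trades
    .withColumn("trade_date", F.to_date("executed_at"))
    .groupBy("trade_date", "commodity")  # e.g. crude, gas, power
    .agg(
        F.sum("quantity").alias("total_quantity"),
        F.avg("price").alias("avg_price"),
    )
    .orderBy("trade_date", "commodity")
)

daily_volume.show(truncate=False)
spark.stop()
```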
Software Python Development. 3-5 years' experience in software engineering. Experience with libraries/frameworks such as Pandas, NumPy, SciPy, etc. Skilled in data pipeline orchestration/management libraries (e.g., Airflow, Prefect). Experience with cloud infrastructure (AWS, GCP, Azure). DevOps skills (CI/CD, containerisation). Familiarity with Plotly Dash for interactive data visualisations. You will play a …
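The role above also asks for Plotly Dash for interactive visualisations. A minimal Dash app is only a few lines; the sketch below uses fabricated data purely for illustration:

```python
# Minimal Plotly Dash app sketch. The data is fabricated for illustration;
# a real dashboard would read from the pipeline's output tables instead.
import plotly.express as px
from dash import Dash, dcc, html

# Toy series standing in for a pipeline output.
fig = px.line(
    x=["2024-01", "2024-02", "2024-03"],
    y=[120, 135, 128],
    labels={"x": "month", "y": "rows processed (thousands)"},
    title="Example pipeline throughput",
)

app = Dash(__name__)
app.layout = html.Div(
    [
        html.H2("Pipeline monitoring (illustrative)"),
        dcc.Graph(figure=fig),
    ]
)

if __name__ == "__main__":
    app.run(debug=True)  # app.run_server(debug=True) on older Dash versions
```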
markets data (e.g., equities, fixed income, derivatives). Hands-on experience with cloud platforms (AWS, Azure, or GCP). Knowledge of data governance frameworks and metadata management. Experience with Kafka, Spark, Airflow, or similar tools for data orchestration. Qualifications: Bachelor's or Master's degree in Computer Science, Finance, or related field. 5 to 7 years of experience in data engineering, preferably …
or equivalent. Strong understanding of data modeling concepts (star/snowflake schema, normalization). Preferred Qualifications: Experience with cloud platforms (AWS, Azure, or GCP). Knowledge of Python or Airflow for orchestration and automation. Exposure to CI/CD pipelines and version control (Git). Experience with data governance and data quality frameworks. Soft Skills: Strong analytical and problem …
or internship experience within financial services or technology. Exposure to Java. Experience managing on-premise or hybrid data infrastructure (e.g. Hadoop). Knowledge of workflow orchestration tools such as Apache Airflow. Postgraduate degree in Computer Science, Data Science, or related field. Benefits: Comprehensive health, dental, and vision coverage. Flexible approach to time off and sick leave. Discretionary bonus …
tuning. Experience with designing and programming relational databases such as MySQL, Redshift, Oracle, SQL Server, or Postgres. Experience with AWS-based system architecture covering S3, EKS, EC2, Batch, or Airflow, etc. Experience with caching and messaging technologies such as Redis, Hazelcast, MQ, or Kafka, etc. Experience with programming within a CI/CD pipeline such as Git, Jenkins, etc. Strong problem …
a distinct advantage. 5+ years of intensive experience as a Data Engineer or in a similar role, with a demonstrable track record of leading large-scale projects. Familiarity with Airflow, Dagster or similar data orchestration frameworks. Strong understanding of RESTful APIs as well as experience working with both synchronous and asynchronous endpoints. Experience with Snowflake or Redshift with a …
in Snowflake, SQL, and cloud data platforms (AWS, Azure, or GCP). Proficiency in Python for data transformation and automation. Experience with ELT development and orchestration tools (e.g. dbt, Airflow, Prefect). Knowledge of data modelling, data warehousing, and modern analytics architectures. Familiarity with DevOps practices, CI/CD pipelines, and infrastructure-as-code. Strong problem-solving skills …
Manchester Area, United Kingdom (Hybrid/WFH options)
POWWR
or Tableau, including data modeling and performance optimization. Advanced SQL skills, data modeling, and proficiency with dbt Core for modular SQL transformations and testing. Experience orchestrating pipelines with Dagster, Airflow, or similar tools. Familiarity with Python for data manipulation, orchestration, and automation. Experience deploying and managing data workloads on Kubernetes (AKS preferred). Experience working within a DevOps or …
Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling. Proficiency in advanced programming languages (Python/PySpark, SQL). Experience in data pipeline orchestration (e.g. Airflow, Data Factory). Familiarity with DevOps and CI/CD practices (Git, Azure DevOps, etc.). Ability to communicate technical concepts to both technical and non-technical audiences. Proven experience in …