object-oriented design principles, and data structures Extensive experience in developing microservices using Java, Python Experience in distributed computing frameworks such as Hive/Hadoop, Apache Spark. Good experience in test-driven development and automating test cases using Java/Python Experience in SQL/NoSQL (Oracle, Cassandra) database design … following cloud services: AWS Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda Working experience with Terraform Experience in creating workflows for Apache Airflow About Roku Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build …
with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure) Exposure to orchestration tools such as Kubeflow Pipelines or Airflow Familiarity with dbt or similar tools for modelling data in data warehouses Desire to build interpretable and explainable ML models (using techniques such as …
Manchester Area, United Kingdom Hybrid / WFH Options
Searchability®
data modeling Proficient in data warehouse technologies (such as Amazon Redshift, Google BigQuery, or Snowflake) Hands-on experience with ETL tools and frameworks, including Apache Airflow, Talend, or dbt Strong programming ability in Python or another data-focused language Knowledgeable about data management best practices, including governance, security …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Qodea
with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in at least one programming language such as Python, Java, or Scala. Strong analytical and problem-solving skills with the ability to …
communication and stakeholder management skills. Desirable: Experience working with large-scale retail datasets (e.g., POS, CRM, supply chain). Familiarity with tools like dbt, Airflow, or MLflow. Master's or PhD in Data Science, Statistics, Computer Science, or related field. Benefits: Competitive salary and performance bonuses Flexible working options …
Manchester Area, United Kingdom Hybrid / WFH Options
Harnham
engagement and performance Techstack: Python (for scripting and data wrangling) SQL (strong experience in querying and transforming data) Experience with building analytics data layers Airflow, dbt (for orchestration and transformation) Cloud data warehouses like BigQuery or Redshift Terraform, Kubernetes (for infrastructure as code and deployment) …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
NLP PEOPLE
integration of ML solutions into existing workflows. Develop robust, production-quality software artifacts using Python along with large-scale data workflow orchestration platforms (e.g., Airflow). Leverage expertise in cloud computing platforms (AWS and Azure) to build and optimize AI infrastructure, using services like AWS Bedrock, S3, SageMaker, Azure …
Altrincham, Greater Manchester, United Kingdom Hybrid / WFH Options
RP International
the infrastructure provisioning • Set up infrastructure security • Implement IAM policies and permissions for applications and GitHub • Change environment variables for applications and Airflow DAGs If you are interested, please hit apply with your updated CV and we will arrange a call to further your application.
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
iO Associates
to architectural discussions and promote engineering best practices Technical Environment: Python, PyTorch or TensorFlow AWS (including SageMaker, S3, Lambda) or Azure ML Docker, Kubernetes, Airflow CI/CD tools (e.g. GitHub Actions, Jenkins) MLflow or similar frameworks Required Experience: Proven track record of deploying machine learning models to production …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for scalable performance. Infrastructure Automation: Implement Infrastructure … Integrate cloud-based data services with data lakes and warehouses. Build and automate CI/CD pipelines with Jenkins, GitLab CI/CD, or Apache Airflow. Develop automated test suites for data pipelines, ensuring data quality and transformation integrity. Monitoring & Performance Optimization: Monitor data pipelines with tools like Prometheus …
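The ETL responsibilities above follow the standard extract-transform-load pattern. A minimal sketch in plain Python of that shape (the data, function names, and in-memory "warehouse" are illustrative stand-ins, not from the listing; in practice each step would run as a task inside an orchestrator such as an Airflow DAG):

```python
# Minimal batch ETL sketch: extract raw records, cast and aggregate
# them, then load the result into a sink. All data here is illustrative.

def extract():
    # Stand-in for reading from a source system (database, S3, Kafka topic).
    return [
        {"user_id": 1, "amount": "12.50"},
        {"user_id": 2, "amount": "7.25"},
        {"user_id": 1, "amount": "3.00"},
    ]

def transform(rows):
    # Cast string amounts to floats and aggregate total spend per user.
    totals = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def load(totals, sink):
    # Stand-in for writing to a warehouse table (here, a dict).
    sink.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)  # {1: 15.5, 2: 7.25}
```

Keeping each stage a pure, separately testable function is what makes the "automated test suites for data pipelines" requirement above tractable: each step can be asserted against fixture data without touching live systems.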
deployment. Key Responsibilities Design, build and maintain cloud-based data platform infrastructure (AWS, Azure, or GCP) Deploy and manage modern data tools (e.g. dbt, Airflow, Snowflake) Implement Infrastructure as Code using Terraform Automate deployment pipelines using CI/CD tools, preferably Azure Pipelines Ensure platform stability, scalability, and performance … Infrastructure as Code expertise using Terraform CI/CD experience (Azure Pipelines preferred) Docker and Linux tooling Exposure to modern data tools (e.g. dbt, Airflow, Snowflake, Redshift) Agile delivery environment experience