Deep understanding of software architecture, object-oriented design principles, and data structures. Extensive experience developing microservices using Java and Python. Experience with distributed computing frameworks such as Hive/Hadoop and Apache Spark. Good experience in test-driven development and automating test cases using Java/Python. Experience in SQL/NoSQL (Oracle, Cassandra) database design. Demonstrated ability to be proactive … HR related applications. Experience with the following cloud services: AWS Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda. Working experience with Terraform. Experience creating workflows for Apache Airflow. About Roku: Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build and monetize large audiences, and …
Glue Catalog, and AWS Glue DataBrew. They are experienced in developing batch and real-time data pipelines for data warehouse and data lake workloads, utilizing AWS Kinesis and Managed Streaming for Apache Kafka. They are also proficient in using open-source technologies like Apache Airflow and dbt, and Spark/Python or Spark/Scala on the AWS platform. The data …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
systems (e.g. Git). Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure). Exposure to orchestration tools such as Kubeflow Pipelines or Airflow. Familiarity with dbt or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify the …
best practices. Automation & Monitoring: Implement and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: Lead post-incident reviews, perform root cause analysis for data disruptions, and implement corrective actions …
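The anomaly-detection duty described in this listing can be illustrated with a minimal, library-free sketch of the kind of check such a monitoring task might schedule (the function name, threshold, and z-score approach are illustrative assumptions, not this employer's actual implementation):

```python
def detect_anomalies(values, threshold=3.0):
    """Return indices of values whose z-score exceeds the threshold.

    A deliberately simple stand-in for the statistical checks a
    monitoring pipeline might run over a metric series each cycle.
    """
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    std = variance ** 0.5
    if std == 0:
        return []  # constant series: nothing can be anomalous
    return [i for i, v in enumerate(values)
            if abs(v - mean) / std > threshold]


# Example: a row-count metric with one obvious spike at the end.
daily_row_counts = [10, 11, 9, 10, 12, 10, 11, 100]
flagged = detect_anomalies(daily_row_counts, threshold=2.5)
print(flagged)  # the spike's index is flagged
```

In a real deployment this check would typically run as an Airflow task on a schedule, raising an alert rather than printing.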
MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of the team. Strong analytical and …
Manchester, England, United Kingdom Hybrid / WFH Options
PHMG
environment using agile methodologies with operational targets crossing multiple departments. Nice-to-have experience: Infrastructure as code (Terraform, CloudFormation, etc.); Cloud platforms (AWS, Azure, GCP, etc.); Data orchestration (Airflow, Dagster, etc.). The Team: The Data & Analytics department serves as the central hub for data engineering and insights across our organization, comprising four essential teams: Data Engineering, Reporting, Commercial …
Manchester, England, United Kingdom Hybrid / WFH Options
Dept Agency
architectures, data pipelines, and ETL processes. Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services. Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt). Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.). Strong understanding of data modeling, data governance, and data quality principles. Excellent communication skills with the ability to …
AND EXPERIENCE: A successful Data Engineer will have the following skills and experience: Ability and experience interacting with key stakeholders. Strong experience in SQL/Python. Good understanding of Airflow/dbt. Experience with GCP/AWS. Background in CI/CD. THE BENEFITS: You will receive a salary, dependent on experience, of up to £60,000. On …
will need experience in the following: A track record of consolidating data sources, creating pipelines, and ideally a software engineering mindset. Excellent experience with containers. Strong knowledge of Python, Airflow, SQL, and Linux. Proficient in AWS Lambda, ECS, Athena, S3, Kinesis, Glue, CloudWatch, Terraform/CDK, and Tableau Online. They are particularly keen on Senior Data Engineers who utilize …
of machine learning algorithms, time series forecasting, and model evaluation. Familiarity with cloud platforms (AWS and/or GCP). Experience with productionising models using tools like Kubernetes, Docker, and Airflow. Good knowledge of NLP techniques for text analysis. Ability to design and run effective experiments (A/B, multivariate). Excellent communicator – able to explain complex ideas to both technical …
Manchester, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
modeling, and predictive analytics. Excellent communication and stakeholder management skills. Desirable: Experience working with large-scale retail datasets (e.g., POS, CRM, supply chain). Familiarity with tools like dbt, Airflow, or MLflow. Master's or PhD in Data Science, Statistics, Computer Science, or related fields. Benefits: Competitive salary and performance bonuses. Flexible working options, including hybrid remote work …
Manchester, England, United Kingdom Hybrid / WFH Options
CenterXchange Inc
Cloud Platform stack (BigQuery, Composer, Dataplex, Dataflow, Cloud Functions, Cloud Run, Pub/Sub, GCS, Vertex AI, GKE) or similar cloud platforms. Familiarity with open-source data-stack tools (Airflow, dbt, Kafka, Great Expectations, etc.). Appreciation of the modern cloud data stack, headless BI, analytics engineering, Data Mesh, and Lakehouse. Although not essential, it would be great if you …
Manchester, England, United Kingdom Hybrid / WFH Options
Fitch Group
AWS and Azure cloud services to provide the necessary infrastructure, resources, and interfaces for data loading and LLM workflows. Use Python and large-scale data workflow orchestration platforms (e.g. Airflow) to build software artifacts for ETL, integrating diverse data formats and storage technologies, and incorporate them into robust data workflows and dynamic systems. Design and develop APIs (e.g. using …
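The ETL responsibility described in this listing can be sketched as a minimal extract/transform/load pipeline in plain Python (the function names, the JSON-lines input format, and the in-memory "store" are illustrative assumptions, not this employer's actual stack):

```python
import json


def extract(lines):
    """Parse a sequence of JSON-lines records, skipping blanks."""
    return [json.loads(line) for line in lines if line.strip()]


def transform(records):
    """Normalize field names to lowercase, a typical cleanup step."""
    return [{k.lower(): v for k, v in record.items()} for record in records]


def load(records, store):
    """Append records to a destination and report how many were loaded."""
    store.extend(records)
    return len(records)


# Usage: run the three stages as a pipeline over raw input.
raw = ['{"Name": "alpha", "Value": 1}', '{"Name": "beta", "Value": 2}']
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

In an orchestrated setting each stage would usually become its own task (e.g. an Airflow task per stage) so failures can be retried independently.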
Manchester, England, United Kingdom Hybrid / WFH Options
Auto Trader UK
are some technologies that our Data Scientists use (we don't expect applicants to have experience with all of these): Python and Databricks for Data Science; Spark, MLFlow and Airflow for ML workflows; Google Cloud Platform for our analytics infrastructure; dbt and BigQuery for data modelling and warehousing. We are looking to grow our diverse team of curious data …
Manchester, England, United Kingdom Hybrid / WFH Options
Ripjar
using Python (specifically PySpark) and Node.js for processing data, backed by various Hadoop stack technologies such as HDFS and HBase. MongoDB and Elasticsearch are used for indexing smaller datasets. Airflow & NiFi are used to co-ordinate the processing of data, while Jenkins, Jira, Confluence and GitHub are used as support tools. We use Ansible to manage configuration and deployments. … for processing data. You will be using Hadoop stack technologies such as HDFS and HBase. Experience using MongoDB and Elasticsearch for indexing smaller datasets would be beneficial. Experience using Airflow & NiFi to co-ordinate the processing of data would be beneficial. You will be using Ansible to manage configuration and deployments. Salary and Benefits: Salary DOE; 25 days annual …
Manchester, England, United Kingdom Hybrid / WFH Options
Top Remote Talent
Bonus points for experience with: State-of-the-art NLP models, Transformers, and agentic approaches for mixed (temporal and text) data analysis and summarization; Experience with pipeline orchestration tools like Airflow, Argo, etc.; Proven experience with anomaly detection and forecasting with explainability for temporal and mixed data; Intermediate+ English: ability to participate in written discussions with international teams and clients. …
. Technologies Used: Our data platform supports industry-leading Data Science tools. While experience with all these technologies is not expected, familiarity is beneficial: Python and Databricks; Spark, MLFlow, Airflow; Google Cloud Platform; dbt and BigQuery. Examples of our data science work can be found on our engineering blog. What We're Looking For: Experience delivering data science …
Manchester, England, United Kingdom Hybrid / WFH Options
JIM - Jobs In Manchester
Data Science. These are some technologies that our Data Scientists use (we don't expect you to have experience with all of these): Python and Databricks; Spark, MLFlow, and Airflow for ML workflows; Google Cloud Platform for our analytics infrastructure; dbt and BigQuery for data modelling and warehousing. Some examples of our data science work can be found in …
common life sciences data acquisition software, such as Scientific Data Management Systems (SDMS) or Laboratory Information Management Systems (LIMS). Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow) and data parsing. Familiarity with cloud service models, SaaS infrastructure, and the related SDLC. Familiarity with containerization and container orchestration tools (e.g., Docker, Kubernetes). ZONTAL is an …