London, South East England, United Kingdom Hybrid / WFH Options
Noir
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices
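The listing above centres on Python and SQL for data transformation. As a minimal, self-contained sketch of that kind of ETL step, the snippet below uses the stdlib `sqlite3` module as a stand-in for BigQuery; the table and column names are hypothetical, not from the listing.

```python
import sqlite3

# Hypothetical ETL sketch: sqlite3 stands in for a warehouse like BigQuery
# so the example runs anywhere without cloud credentials.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("u1", 10.0), ("u1", 5.5), ("u2", 2.0)],
)

# A typical transformation: aggregate raw events into a per-user summary table.
conn.execute(
    """
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM raw_events
    GROUP BY user_id
    """
)
totals = dict(conn.execute("SELECT user_id, total FROM user_totals"))
```

In a Cloud Composer pipeline the same SQL would typically run as a scheduled task against the warehouse rather than in-process.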
City of London, London, United Kingdom Hybrid / WFH Options
Cathcart Technology
Data Scientist with Machine Learning experience ** Strong understanding and experience with ML models and ML observability tools ** Strong Python and SQL experience ** Spark/Apache Airflow ** ML framework experience (PyTorch/TensorFlow/Scikit-Learn) ** Experience with cloud platforms (preferably AWS) ** Experience with containerisation technologies Useful information: Their
data flow, storage, and processing • Good logical thinking and attention to detail ⸻ 🌟 Nice-to-Have (But Not Required): • Experience with data pipeline tools like Apache Airflow, DBT, or Kafka • Knowledge of cloud data services (AWS S3/Glue/Redshift, GCP BigQuery, Azure Data Factory) • Exposure to Spark, Hadoop
to adopt in order to enhance our platform. What you'll do: Develop across our evolving technology stack - we're using Python, Java, Kubernetes, Apache Spark, Postgres, ArgoCD, Argo Workflow, Seldon, MLFlow and more. We are migrating into AWS cloud and adopting many services that are available in that
related field. 5+ years of experience in data engineering and data quality. Strong proficiency in Python/Java, SQL, and data processing frameworks including Apache Spark. Knowledge of machine learning and its data requirements. Attention to detail and a strong commitment to data integrity. Excellent problem-solving skills and
Technical requirements: Highly proficient in Python. Experience working with data lakes; experience with Spark, Databricks. Understanding of common data transformation and storage formats, e.g. Apache Parquet. Good understanding of cloud environments (ideally Azure), and workflow management systems (e.g. Dagster, Airflow, Prefect). Follow best practices like code review, clean
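The listing mentions Apache Parquet, a columnar storage format. A dependency-free toy sketch of the row-to-columnar pivot that underlies such formats (record fields here are hypothetical) shows why analytic scans over one column are cheap:

```python
# Toy sketch, stdlib only: columnar formats like Parquet store each column
# contiguously, so a scan that touches one field can skip the rest.
rows = [
    {"symbol": "ABC", "price": 101.0, "volume": 300},
    {"symbol": "XYZ", "price": 55.0, "volume": 120},
]

def to_columnar(records):
    """Pivot a list of row dicts into a dict of column lists."""
    columns = {key: [] for key in records[0]}
    for record in records:
        for key, value in record.items():
            columns[key].append(value)
    return columns

cols = to_columnar(rows)
# Averaging one column now reads a single contiguous list:
avg_price = sum(cols["price"]) / len(cols["price"])
```

Real Parquet adds per-column compression, encodings, and statistics on top of this layout, which is why it suits the Spark/Databricks lake workloads the listing describes.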
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English
Design and Maintenance, Apps, Hive Metastore Management, Network Management, Delta Sharing, Dashboards, and Alerts. Proven experience working with big data technologies, e.g., Databricks and Apache Spark. Proven experience working with Azure data platform services, including Storage, ADLS Gen2, Azure Functions, Kubernetes. Background in cloud platforms and data architectures, such
Dagster Good understanding of cloud environments (ideally Azure), distributed computing and scaling workflows and pipelines Understanding of common data transformation and storage formats, e.g. Apache Parquet Awareness of data standards such as GA4GH and FAIR. Exposure to genotyping and imputation is highly advantageous Benefits: Competitive base salary Generous
and support automated monitoring systems to detect data anomalies, system failures, and performance issues and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management : Lead post-incident reviews, perform root cause analysis for data disruptions
implement elegant solutions for them. Are a data enthusiast who wants to be surrounded by brilliant teammates and huge challenges. Bonus Points: Experience with Apache Airflow, including designing, managing, and troubleshooting DAGs and data pipelines. Experience with CI/CD pipelines and tools like Jenkins, including automating the process
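Several listings here ask for experience designing Airflow DAGs. Airflow's core idea is running tasks in dependency order; a minimal stdlib sketch of that ordering using `graphlib` (the task names are hypothetical and this is not the Airflow API itself):

```python
from graphlib import TopologicalSorter

# Sketch of DAG scheduling: each task maps to the set of tasks it depends on,
# mirroring an extract -> transform -> load pipeline.
deps = {
    "load": {"transform"},     # "load" runs after "transform"
    "transform": {"extract"},  # "transform" runs after "extract"
    "extract": set(),          # no upstream dependencies
}

# static_order() yields tasks so every task appears after its dependencies.
order = list(TopologicalSorter(deps).static_order())
```

An actual Airflow DAG expresses the same graph with operators and `>>` dependency arrows, and the scheduler handles retries, backfills, and parallelism on top of this ordering.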
field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There's no place quite like BFS and we're proud
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Experian Group
Significant experience of programming using Scala and Python Experience of using Terraform to provision and deploy cloud services and components Experience of developing on Apache Spark Experience of developing with AWS cloud services including (but not limited to) AWS Glue, S3, Step Functions, Lambdas, EventBridge and SQS BDD/
Learning (ML): Deep understanding of machine learning principles, algorithms, and techniques. Experience with popular ML frameworks and libraries like TensorFlow, PyTorch, scikit-learn, or Apache Spark. Proficiency in data preprocessing, feature engineering, and model evaluation. Knowledge of ML model deployment and serving strategies, including containerization and microservices. Familiarity with
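Two of the skills this listing names, data preprocessing and model evaluation, can be sketched in a few lines of stdlib Python; the data is hypothetical, and in practice scikit-learn's `StandardScaler` and `accuracy_score` cover the same ground:

```python
import statistics

def standardize(values):
    """Preprocessing: scale a feature to zero mean, unit variance (z-scores)."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    return [(v - mean) / stdev for v in values]

def accuracy(y_true, y_pred):
    """Evaluation: fraction of predictions matching the labels."""
    hits = sum(t == p for t, p in zip(y_true, y_pred))
    return hits / len(y_true)

scaled = standardize([2.0, 4.0, 6.0])     # symmetric about the mean of 4.0
acc = accuracy([1, 0, 1, 1], [1, 0, 0, 1])  # 3 of 4 correct
```

Standardization matters because many models (gradient-based learners, distance-based methods) are sensitive to feature scale.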
CD pipelines and integrating automated tests within them - Jenkins, Bitbucket required. Familiarity with performance testing, security testing, and other non-functional testing approaches - JMeter, ApacheBench preferred. Good experience of working on cloud technologies and services on AWS. Strong practical experience in Flyway or Liquibase. Strong understanding of modern
with NoSQL databases (e.g., MongoDB) and relational databases (e.g., PostgreSQL, MySQL). 5+ years in Python and SQL work. Knowledge of ETL tools (e.g., Apache Airflow) and cloud platforms (e.g., AWS, Azure, GCP). Understand data modelling concepts and best practices. Experience with healthcare data standards (e.g., HL7, FHIR
X.509 certificate infrastructure. Extensive experience supporting and implementing TLS/SSL certificate management systems Proficient with Token-based authentication services, Perfect Forward Secrecy (PFS), Apache, Nginx, HAProxy Solid knowledge of Linux security and system operations. Benefits Roku is committed to offering a diverse range of benefits as part of
AWS, or Azure. Experience with CI/CD pipelines for machine learning (e.g., Vertex AI). Experience with data processing frameworks and tools, particularly Apache Beam/Dataflow is highly desirable. Knowledge of monitoring and maintaining models in production. Proficiency in employing containerization tools, including Docker, to streamline the
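Apache Beam, named in the listing above, models a pipeline as chained transforms (Map, GroupByKey, Combine). The sketch below mimics that shape in dependency-free Python; it is not the Beam API, just the programming model behind a classic word count:

```python
from collections import defaultdict

def word_count(lines):
    """Plain-Python analogue of a Beam word-count pipeline."""
    # Map/FlatMap stage: emit a (word, 1) pair per token.
    pairs = [(word, 1) for line in lines for word in line.split()]
    # GroupByKey + Combine stage: sum the counts per word.
    grouped = defaultdict(int)
    for word, n in pairs:
        grouped[word] += n
    return dict(grouped)

counts = word_count(["a b a", "b c"])
```

In Beam the same stages would be `beam.FlatMap` and `beam.CombinePerKey(sum)`, and a runner such as Dataflow would distribute them across workers.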
Warwick, Warwickshire, United Kingdom Hybrid / WFH Options
ICEO
blockchain, DeFi, and decentralized technologies. English proficiency at B2 level or higher Nice to have: Knowledge of Argo CD and Argo Rollouts Experience with Apache HTTP Server, OpenVPN, or advanced networking (LAN/WAN, Firewall, Load Balancers) PostgreSQL administration and optimization skills Exposure to SSO and Okta for secure
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
Client Server
scientific discipline, backed by minimum AAB grades at A-level You have commercial Data Engineering experience working with technologies such as SQL, Apache Spark and Python including PySpark and Pandas You have a good understanding of modern data engineering best practices Ideally you will also have experience
Management/Automation Red Hat Satellite server/Uyuni Python, Java, shell scripting Ansible Gitlab Terraform Containers Docker Kubernetes OpenShift Middleware/Web servers Apache/Nginx MySQL/MariaDB RabbitMQ Memcache HAProxy Ability to troubleshoot, research and diagnose root cause for an incident or problem. Excellent communication skills