VMware General/Usage Technical Leadership & Design DevSecOps tooling and practices Application Security Testing SAFe (scaled agile) Processes Data Integration Focused Data Pipeline Orchestration and ELT tooling such as Apache Airflow, Spark, NiFi, Airbyte and Singer. Message brokers and streaming data processors, such as Apache Kafka. Object storage, such as S3, MinIO, LakeFS. CI/CD Pipeline Integration More ❯
with new methodologies to enhance the user experience. Key skills: Senior Data Scientist experience Commercial experience in Generative AI and recommender systems Strong Python and SQL experience Spark/Apache Airflow LLM experience MLOps experience AWS Additional information: This role offers a strong salary of up to £95,000 (depending on experience/skill) with hybrid working More ❯
building and scaling Extract, Load, Transform (ELT) pipelines. Experience overseeing or managing projects involving data pipeline orchestration, data quality management, and performance optimization. Knowledge of common ELT tools like Apache Airflow, Informatica PowerCenter, or cloud-native data integration services is a plus. • Data Science/AI & Data Standards: Understanding of how data standards impact data science and AI More ❯
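The ELT pattern these postings keep asking for — land raw data first, then transform inside the warehouse with SQL — can be sketched with stdlib SQLite standing in for a warehouse such as Snowflake (table and column names are illustrative only, not from any posting):

```python
import sqlite3

# In-memory SQLite stands in for the warehouse.
conn = sqlite3.connect(":memory:")

# Extract + Load: land raw records as-is, no cleaning yet (the "EL" of ELT).
raw_orders = [("A-1", "  Alice ", 120.0), ("A-2", "BOB", 80.0), ("A-3", None, 50.0)]
conn.execute("CREATE TABLE raw_orders (order_id TEXT, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_orders)

# Transform: clean and aggregate inside the warehouse with SQL (the "T") --
# the same division of labour a dbt model expresses.
conn.execute("""
    CREATE TABLE orders_clean AS
    SELECT order_id,
           LOWER(TRIM(customer)) AS customer,
           amount
    FROM raw_orders
    WHERE customer IS NOT NULL
""")

rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders_clean "
    "GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 120.0), ('bob', 80.0)]
```

The point of ELT over ETL is that the transform runs where the data already sits, so it scales with the warehouse rather than with an external processing box.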
position? We are seeking a passionate DataOps Engineer who loves optimizing pipelines, automating workflows, and scaling cloud-based data infrastructure. Key Responsibilities: Design, build, and optimize data pipelines using Airflow, DBT, and Databricks. Monitor and improve pipeline performance to support real-time and batch processing. Manage and optimize AWS-based data infrastructure, including S3 and Lambda, as well as … experience supporting high-velocity data/development teams and designing and maintaining data infrastructure, pipelines, and automation frameworks. You should also have experience streamlining data workflows using tools like Airflow, DBT, Databricks, and Snowflake while maintaining data integrity, security, and performance. Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience. Minimum of … years of experience in DataOps or similar. Proficiency in key technologies, including Airflow, Snowflake, and SageMaker. Certifications in AWS/Snowflake/other technologies a plus. Excellent communication and interpersonal skills. Ability to work in a fast-paced environment and manage multiple priorities effectively. What's in it for you? We offer our employees more than just competitive compensation. More ❯
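At its core, the orchestration work described above (Airflow scheduling DBT and Databricks jobs) means running tasks in dependency order. A minimal stdlib sketch of that scheduling idea, with hypothetical task names rather than real Airflow API calls:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract feeds a transform, which feeds checks and a
# load step -- the same dependency graph an Airflow DAG declares with >>.
dag = {
    "extract_s3": set(),
    "transform_dbt": {"extract_s3"},
    "quality_checks": {"transform_dbt"},
    "load_snowflake": {"quality_checks"},
}

def run_pipeline(dag):
    """Execute tasks in topological (dependency-respecting) order."""
    executed = []
    for task in TopologicalSorter(dag).static_order():
        executed.append(task)  # a real orchestrator would invoke the task here
    return executed

order = run_pipeline(dag)
print(order)  # ['extract_s3', 'transform_dbt', 'quality_checks', 'load_snowflake']
```

Airflow adds scheduling, retries, and monitoring on top, but the dependency-resolution step is exactly this.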
and maintaining critical data pipelines that support core trading, forecasting, risk, and PPA processes across all Octopus international regions. The role leverages a modern tech stack including SQL, Python, Airflow, Kubernetes, and various other cutting-edge technologies. You'll work with tools like dbt on Databricks, PySpark, Streamlit, and Django, ensuring robust data infrastructure that powers business-critical operations. … to ensure best practices and code standardization Take ownership of data platform improvements Share knowledge and upskill team members Requirements For Data Engineer Strong aptitude with SQL, Python and Airflow Experience in Kubernetes, Docker, Django, Spark and related monitoring tools Experience with dbt for pipeline modelling Ability to shape needs into requirements and design scalable solutions Quick understanding of More ❯
of data warehousing concepts and SQL (Snowflake experience a plus) Experience with or willingness to learn CI/CD tooling (e.g. GitHub Actions), containerization (Docker), and workflow orchestration tools (Airflow/AstroCloud) Strong debugging and troubleshooting skills for data pipelines and ML systems Experience writing tests (unit, integration) and implementing monitoring/alerting for production systems Strong data skills More ❯
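The unit testing of pipelines asked for above can be as simple as asserting properties of each transform step; a sketch using stdlib unittest (the transform function and its expectations are illustrative, not from the posting):

```python
import unittest

def normalise_emails(records):
    """Hypothetical pipeline transform: lowercase and strip emails, drop blanks."""
    return [r.strip().lower() for r in records if r and r.strip()]

class TestNormaliseEmails(unittest.TestCase):
    def test_lowercases_and_strips(self):
        self.assertEqual(normalise_emails(["  Alice@X.COM "]), ["alice@x.com"])

    def test_drops_empty_values(self):
        # Data-quality guard: blanks and Nones must not reach the warehouse.
        self.assertEqual(normalise_emails(["", None, "bob@x.com"]), ["bob@x.com"])

if __name__ == "__main__":
    unittest.main()
```

The same assertions, run against live tables on a schedule, become the monitoring/alerting checks the posting mentions.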
communication skills. Bonus Points For: Experience leading projects in SCIF environments. Expertise in Cyber Analytics, PCAP, or network monitoring. Familiarity with Spark, Dask, Snowpark, Kafka, or task schedulers like Airflow and Celery. More ❯
/CD tools like Jenkins and GitHub. Understanding of how to build and run containerized applications (Docker, Helm) Familiarity with, or a working understanding of, big data search tools (Airflow, PySpark, Trino, OpenSearch, Elastic, etc.) More ❯
with cloud services (Azure ML, AWS SageMaker, GCP Vertex AI) Proficient in Python and common ML libraries (TensorFlow, PyTorch, Scikit-learn) Familiarity with data engineering workflows and tools (Spark, Airflow, Databricks, etc.) Experience with GenAI, LLMs, or NLP is a strong plus Solid understanding of model governance, compliance, and responsible AI principles Excellent communication skills and stakeholder management abilities More ❯
SRE, and product teams. We'd Love to See Experience with semantic technologies: ontologies, RDF, or graph databases (e.g., Neo4j, RDF4J). Familiarity with ETL or EIS platforms like Apache Camel or Airflow. Knowledge of financial market data, especially around latency, availability, and correctness. Experience building or contributing to observability platforms or knowledge graph tooling. More ❯
including PCAP, CVEs, and network monitoring. Experience integrating with technologies such as Spark, Dask, Snowpark, or Kafka. Background in web application stacks (e.g., Flask, Django) or task schedulers (e.g., Airflow, Celery, Prefect). Compensation & Benefits: Competitive salary, equity, and performance-based bonus. Full benefits package including medical, dental, and vision. Unlimited paid time off. If you want to know More ❯
experience working effectively with cross-functional teams across multiple time zones and with remote stakeholders BS degree in Computer Science or related engineering field Nice to Have Experience with Airflow, Celery, AWS and/or Azure, Postgres Experience with API platform development Experience with Go More ❯
pipelines. 1-2 years of hands-on experience with Azure services such as Data Factory, Databricks, Synapse (DWH), Azure Functions, and other data analytics tools, including streaming. Experience with Airflow and Kubernetes. Programming skills in Python (PySpark) and scripting languages like Bash. Knowledge of Git, CI/CD operations, and Docker. Basic Power BI knowledge is a plus. Experience deploying More ❯
with an ability to work on multiple projects simultaneously Strong interpersonal and communication skills Quick, self-learning capabilities and creativity in problem-solving Preferred: Familiarity with Python Familiarity with Airflow, ETL tools, Snowflake and MSSQL Hands-on with VCS (Git) More ❯