Gloucester, Gloucestershire, United Kingdom Hybrid / WFH Options
Navtech, Inc
of Science Degree in software engineering or a related field. Proficiency in English, spoken and written. Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM frameworks …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Omega Resource Group
GitLab) Contributing across the software development lifecycle, from requirements to deployment. Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite. What's on Offer: Hybrid working and flexible schedules (4xFlex). Ongoing training and career development. Exciting projects within the UK's secure …
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
knowledge of Kafka, Confluent, and event-driven architecture. Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures. Strong architectural understanding across AWS, Azure, GCP, and Snowflake. Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java). Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures. Strong grasp of data governance …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and DBT is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience with …
MySQL Exposure to Docker, Kubernetes, AWS, Helm, Terraform, Vault, Grafana, ELK Stack, New Relic. Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, BigQuery. Master's degree in Computer Science or Engineering-related field. Get to know us better: YouGov is a global online research company …
might be more valuable than your direct technical contributions on a project. You care about your craft. In addition, it would be a bonus if you have: Worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows, and a good understanding of the quirks of operating Airflow at scale would be helpful. Experience …
technical direction to a growing team of developers globally. The platform is a greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, and Apache Ignite. The platform runs in a hybrid mode, both on-premises and in AWS, utilising technologies such as EKS, S3, and FSx. The main purpose of this role is to …
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan). Strong communication, stakeholder management, and documentation skills. Preferred (but not essential): AWS or Snowflake certifications. Knowledge of Apache Airflow, DBT, GitHub Actions. Experience with Iceberg tables and data product thinking. Why Apply? Work on high-impact, high-scale client projects. Join a technically elite team with a …
governance SME; support teams with tooling, guidance, and best practices. About You: Strong technical foundation in data governance architecture and tooling. You've worked with tools such as DataHub, Apache Airflow, AWS, dbt, Snowflake, BigQuery, or similar. Hands-on experience building and maintaining centralized data inventories, business glossaries, and data mapping frameworks. Proficient in automating data classification and lineage …
methodologies. Collaborating with stakeholders to define data strategies, implement data governance policies, and ensure data security and compliance. About you: Strong technical proficiency in data engineering technologies, such as Apache Airflow, ClickHouse, ETL tools, and SQL databases. Deep understanding of data modeling, ETL processes, data integration, and data warehousing concepts. Proficiency in programming languages commonly used in data engineering …
processes. Knowledge of MiFID II, Dodd-Frank regulations and controls. Knowledge/experience of FIX flows - TradeWeb, RFQ Hub, BlackRock and Bloomberg, RFQ workflows. Additional technology experience: React JS, Apache NiFi, Mongo, DBaaS, SaaS, Tibco/Solace or similar messaging middleware. Education: Bachelor's degree or equivalent experience operating in a similar role. This job description provides a high …
networks into production. Experience with Docker. Experience with NLP and/or computer vision. Exposure to cloud technologies (e.g. AWS and Azure). Exposure to big data technologies. Exposure to Apache products, e.g. Hive, Spark, Hadoop, NiFi. Programming experience in other languages. This is not an exhaustive list, and we are keen to hear from you even if you don't …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fruition Group
best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and ETL/ELT development in AWS …
Catalog). Familiarity with Data Mesh, Data Fabric, and product-led data strategies. Expertise in cloud platforms (AWS, Azure, GCP, Snowflake). Technical Skills: Proficiency in big data tools (Apache Spark, Hadoop). Programming knowledge (Python, R, Java) is a plus. Understanding of ETL/ELT, SQL, NoSQL, and data visualisation tools. Awareness of ML/AI integration into …
class-leading data and ML platform infrastructure, balancing maintenance with exciting greenfield projects. Develop and maintain our real-time model serving infrastructure, utilising technologies such as Kafka, Python, Docker, Apache Flink, Airflow, and Databricks. Actively assist in model development and debugging using tools like PyTorch, Scikit-learn, MLflow, and Pandas, working with models from gradient boosting classifiers to custom …
flat structure, you'll work autonomously on business critical projects, and collaborate with others throughout our team and user base. Our tech stack includes TypeScript, Python, Node, WebAssembly, WebGL, Apache Arrow, DuckDB, Kubernetes and React. For the best possible user experience, we have developed various technologies in-house, including a custom WebGL rendering engine, data visualization library, reactive SQL …
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
Jenkins, TeamCity. Scripting languages such as PowerShell, bash. Observability/Monitoring: Prometheus, Grafana, Splunk. Containerisation tools such as Docker, K8s, OpenShift, EC, containers. Hosting technologies such as IIS, nginx, Apache, App Service, LightSail. Analytical and creative approach to problem solving. We encourage you to apply, even if you don't meet all of the requirements. We value your growth …
Islington, London, United Kingdom Hybrid / WFH Options
National Centre for Social Research
to design and deliver enterprise-scale data warehouses in regulated or complex environments. Expertise in ETL/ELT, and reporting system architectures. Strong technical skills in SQL, Python, PySpark, Apache Spark. Hands-on background as a data engineer or platform engineer - you can design and build. Excellent communication and relationship-building skills across technical and non-technical audiences. Demonstrated …
City of London, London, United Kingdom Hybrid / WFH Options
Medialab Group
data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience with advertising or media data is a plus.