Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet …
of CI/CD tools and technologies (e.g., Git, GitLab, Jenkins, GCP, AWS) Knowledge of containerisation and microservice architecture Ability to develop dashboard UIs for publishing performance (e.g., Grafana, Apache Superset) Exposure to safety certification standards and processes We provide: Competitive salary, benchmarked against the market and reviewed annually Company share programme Hybrid and/or flexible work …
systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc. Monitoring utilising products such as: Prometheus, Grafana, ELK, Filebeat etc. Observability - SRE Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem Edge technologies e.g. NGINX, HAProxy etc. Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would be desirable for Data DevOps …
development experience using HTML, Vue.js • Strong hands-on experience in performant and scalable database design in SQL, NoSQL and graph databases such as SQL Server/PgSQL, MongoDB, Cassandra, Redis, Apache Druid • Solid experience in REST APIs, GraphQL & gRPC • Strong hands-on experience in GitHub/GitLab and testing tools/frameworks such as SonarQube, xUnit, Postman, Cucumber, Polaris, Black Duck. …
Team Valley Trading Estate, Gateshead, Tyne and Wear, England, United Kingdom
Nigel Wright Group
include: 3+ years experience in data engineering roles, delivering integrated data-driven applications Hands-on experience with Microsoft Fabric components (Pipelines, Lakehouse, Warehouses) Proficient in T-SQL and either Apache Spark or Python for data engineering Comfortable working across cloud platforms, with emphasis on Microsoft Azure Familiarity with REST APIs and integrating external data sources into applications …
Data Modeler. Collaborate with business stakeholders and data analysts to understand data requirements and translate them into efficient engineering solutions. Optimize data flows and manage orchestration and scheduling using Apache Airflow. Ensure data integrity, accuracy, and consistency across systems. Implement CI/CD pipelines and monitor job performance and failures. Support data governance, data quality, and compliance initiatives. Required …
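The orchestration work described above centres on Airflow's core idea: a pipeline is a directed acyclic graph of tasks, and the scheduler only runs a task once its upstream dependencies have succeeded. That ordering guarantee can be sketched with the standard library alone — the task names below are invented for illustration, not taken from any real pipeline:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each task maps to the set of tasks it
# depends on, mirroring how an Airflow DAG wires its operators together.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

# static_order() yields tasks in an order that respects every dependency,
# which is the same guarantee a scheduler such as Airflow enforces per run.
run_order = list(TopologicalSorter(pipeline).static_order())
print(run_order)  # → ['extract', 'transform', 'validate', 'load']
```

In a real Airflow deployment the same graph would be declared with operators and `>>` dependencies inside a DAG file; the point here is only the scheduling semantics.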
in Computer Science, Data Science, Engineering, or a related field. Strong programming skills in languages such as Python, SQL, or Java. Familiarity with data processing frameworks and tools (e.g., Apache Spark, Hadoop, Kafka) is a plus. Basic understanding of cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services. Knowledge of database systems (e.g., MySQL, PostgreSQL, MongoDB) and …
Jenkins, TeamCity Scripting languages such as PowerShell, bash Observability/Monitoring: Prometheus, Grafana, Splunk Containerisation tools such as Docker, K8S, OpenShift, EC, containers Hosting technologies such as IIS, nginx, Apache, App Service, LightSail Analytical and creative approach to problem solving We encourage you to apply, even if you don't meet all of the requirements. We value your growth …
both strategic and hands-on levels. Prior experience contributing to open-source projects or standards bodies (e.g., JCP). Some familiarity with the Hazelcast platform or similar technologies (e.g., Apache Ignite, Redis, AWS ElastiCache, Oracle Coherence, Kafka, etc.). Experience writing technical whitepapers or benchmark reports. BENEFITS 25 days annual leave + Bank holidays Group Company Pension Plan Private …
Islington, London, United Kingdom Hybrid / WFH Options
National Centre for Social Research
to design and deliver enterprise-scale data warehouses in regulated or complex environments. Expertise in ETL/ELT, and reporting system architectures. Strong technical skills in SQL, Python, PySpark, Apache Spark. Hands-on background as a data engineer or platform engineer - you can design and build. Excellent communication and relationship-building skills across technical and non-technical audiences. Demonstrated …
to ensure code is fit for purpose Experience that will put you ahead of the curve Experience using Python on Google Cloud Platform for Big Data projects, BigQuery, Dataflow (Apache Beam), Cloud Run Functions, Cloud Run, Cloud Workflows, Cloud Composer SQL development skills Experience using Dataform or dbt Demonstrated strength in data modelling, ETL development, and data warehousing Knowledge …
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
City of London, London, United Kingdom Hybrid / WFH Options
Medialab Group
data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience with advertising or media data is a plus.
are constantly looking for components to adopt in order to enhance our platform. What you'll do: Develop across our evolving technology stack - we're using Python, Java, Kubernetes, Apache Spark, Postgres, ArgoCD, Argo Workflows, Seldon, MLflow and more. We are migrating into AWS cloud and adopting many services that are available in that environment. You will have the …
Programming Mastery: Advanced skills in Python or another major language; writing clean, testable, production-grade ETL code at scale. Modern Data Pipelines: Experience with batch and streaming frameworks (e.g., Apache Spark, Flink, Kafka Streams, Beam), including orchestration via Airflow, Prefect or Dagster. Data Modeling & Schema Management: Demonstrated expertise in designing, evolving, and documenting schemas (OLAP/OLTP, dimensional, star …
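The "clean, testable, production-grade ETL code" asked for above is less about any one framework than about transforms that are pure, typed, and easy to assert against. A minimal, framework-free sketch of such a transform step — the `Event` schema and field names are invented for the example, standing in for the per-record logic a Spark or Beam job would apply:

```python
from dataclasses import dataclass

@dataclass
class Event:
    user_id: str
    amount_pence: int

def transform(rows: list[dict]) -> list[Event]:
    """Clean a raw batch: drop malformed rows and normalise types.

    Pure function of its input, so it can be unit-tested without any
    pipeline infrastructure — the property job specs like this one
    mean by 'testable ETL code'.
    """
    out = []
    for row in rows:
        user_id = str(row.get("user_id", "")).strip()
        amount = row.get("amount_pence")
        if not user_id or not isinstance(amount, int) or amount < 0:
            continue  # skip malformed records rather than failing the batch
        out.append(Event(user_id=user_id, amount_pence=amount))
    return out

raw = [
    {"user_id": " alice ", "amount_pence": 250},
    {"user_id": "", "amount_pence": 100},    # dropped: empty id
    {"user_id": "bob", "amount_pence": -5},  # dropped: negative amount
]
cleaned = transform(raw)
```

The same function body could be handed to `rdd.mapPartitions` in Spark or a `DoFn` in Beam; keeping it separate from the framework is what makes it portable and testable.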
have the chance to work with a talented and engaged team on an innovative product that connects with external systems, partners, and platforms across the industry. Our Tech Stack: Apache Airflow Python Django React/TypeScript AWS (S3, RDS with PostgreSQL, ElastiCache, MSK, EC2, ECS, Fargate, Lambda etc.) Snowflake Terraform CircleCI Bitbucket Your mission Lead and scale multiple engineering teams …
Gloucester, Gloucestershire, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
attitude, capable of acquiring new skills. Objective and logical with an enquiring and creative mind. It would be nice if you had: Data Engineering - experience of one or more: Apache ecosystem, SQL, Python. Web - HTML, CSS, JavaScript, XML, SOAP. Experience with Secure DevSecOps within an Agile/SAFe environment. Containerisation & Orchestration - Docker, Podman, Kubernetes, Rancher etc. Software development capability. …
and occasional front-end development (Bootstrap, Angular, WordPress) Collaborate with stakeholders to scope, design, and deliver new features Support the deployment and optimisation of applications on both Linux (Ubuntu, Apache, Nginx) and Windows (IIS) servers Ensure best practices around security, performance, and code quality Work alongside a small IT/dev team with opportunities to contribute to architecture decisions …
Lincoln, Lincolnshire, East Midlands, United Kingdom Hybrid / WFH Options
Oscar Associates (UK) Limited
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Forward Role
live, mission-critical environments Deep knowledge of Linux server administration Skilled in log analysis using tools like Splunk or the ELK stack Hands-on with tools and platforms such as: Apache NiFi, MinIO, AWS S3 Java & Python applications (deployment, patching, support) Containerisation and deployment technologies such as Docker, Podman, Kubernetes, OpenShift Excellent analytical, troubleshooting, and prioritisation skills Security Clearance You …