Desirable: Extensive experience in developing architectural strategies and blueprints for hybrid and cloud-native solutions. ELT/ETL Frameworks & Pipelines. Essential: Develop robust ELT/ETL pipelines using tools like Apache Airflow, DBT, AWS Glue, Azure Data Factory, or Kafka Connect. Desirable: Optimize data transformations for performance, reusability, and modular design (e.g., using SQL/Scala/Python).
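For illustration only (not part of the advert): a minimal sketch of the kind of Airflow pipeline this listing describes. The DAG id, schedule, and task bodies are all invented for the example.

```python
# Hypothetical two-step ELT DAG; every name here is illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Stand-in extract step: a real pipeline would pull from a source
    # system (API, database, or object store).
    return [{"id": 1, "amount": 42}]


def transform(**context):
    # Pull the upstream task's return value from XCom and apply a
    # small, reusable transformation.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "amount_gbp": row["amount"] * 0.79} for row in rows]


with DAG(
    dag_id="example_elt_pipeline",      # invented name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # "schedule_interval" on Airflow < 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```

In practice the transform step would more often be a dbt model or Glue job; the point of the sketch is the dependency graph and scheduled execution.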
London, Oxford Circus, United Kingdom Hybrid/Remote Options
Datatech
GCP/BigQuery or other cloud data warehouses (e.g., Snowflake, Redshift). Familiarity with data orchestration tools (e.g., Airflow). Experience with data visualisation platforms (such as Preset.io/Apache Superset or other). Exposure to CI/CD pipelines, ideally using GitLab CI. Background working with media, marketing, or advertising data. The Opportunity: Work alongside smart, supportive teammates
MySQL) to modern NoSQL solutions (e.g., MongoDB, Cassandra). Focus on strategies that enhance data accessibility, integrity, and performance. Big Data Processing & Analytics: Utilise big data frameworks such as Apache Spark and Apache Flink to address challenges associated with large-scale data processing and analysis. These technologies are crucial for managing vast datasets and performing complex data transformations … as Databricks and Snowflake. Well-versed in various storage technologies including AWS S3, Google Cloud BigQuery, Cassandra, MongoDB, Neo4J, and HDFS. Adept in pipeline orchestration tools like AWS Glue, Apache Airflow, and dbt, as well as streaming technologies like Kafka, AWS Kinesis, Google Cloud Pub/Sub, and Azure Event Hubs. Data Storage Expertise: Knowledgeable in data warehousing technologies
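As a hedged illustration of the large-scale transformation work described above, here is a small PySpark job; the bucket paths and column names are invented.

```python
# Illustrative PySpark aggregation; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-transform").getOrCreate()

# Read a partitioned dataset from object storage (invented path).
events = spark.read.parquet("s3a://example-bucket/events/")

# Filter, derive a date column, and aggregate per day and customer.
daily_totals = (
    events.filter(F.col("status") == "complete")
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "customer_id")
    .agg(F.sum("amount").alias("total_amount"))
)

# Write back partitioned by date for efficient downstream reads.
daily_totals.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/daily_totals/"
)
```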
or another language such as Python Good knowledge of developing in a Linux environment Working knowledge of Git version control and GitLab CI/CD pipelines Experience working with Apache NiFi Some exposure to front-end elements like JavaScript, TypeScript or React Some data interrogation with ElasticSearch and Kibana Exposure to working with Atlassian products Looking for a role
. Solid understanding of DevOps principles and agile delivery. Excellent problem-solving skills and a proactive, team-oriented approach. Confident client-facing communication skills. Desirable Skills & Experience: Experience with Apache NiFi and Node.js. Familiarity with JSON, XML, XSD, and XSLT. Knowledge of Jenkins, Maven, BitBucket, and Jira. Exposure to AWS and cloud technologies. Experience working within
teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments. Key Responsibilities Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark Collaborate with frontend/backend developers using Node.js or React Implement best practices in data modelling, ETL processes and performance optimisation Contribute to containerised deployments (Docker/Kubernetes
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
Proven experience designing and implementing end-to-end MLOps processes in a production environment. Cloud ML Stack: Expert proficiency with Databricks and MLflow. Big Data/Coding: Expert Apache Spark and Python engineering experience on large datasets. Core Engineering: Strong experience with Git for version control and building CI/CD/release pipelines. Data Fundamentals: Excellent SQL
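By way of illustration (the experiment name, model, and metric are invented), a minimal MLflow tracking run of the sort such MLOps work builds on:

```python
# Minimal MLflow tracking sketch; experiment and parameters are invented.
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("example-mlops-experiment")
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))

    # Log parameters, metrics, and the model artefact so a release
    # pipeline can promote the run later.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, "model")
```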
cases, results, and automation frameworks. Required Education & Experience: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience. Familiarity with performance testing tools such as Apache Bench, JMeter, LoadRunner, or modern alternatives like K6, Gatling. Experience working with Java 11, Spring Boot 2.7, and Oracle 19 is highly desirable. What we're looking for
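For a flavour of this kind of load testing, here is a minimal script for Locust, a Python-based tool in the same space as JMeter and K6 (not one the advert names); the target endpoint is hypothetical.

```python
# Minimal Locust load test; the target endpoint is hypothetical.
from locust import HttpUser, between, task


class ApiUser(HttpUser):
    # Simulated think time between requests from each virtual user.
    wait_time = between(1, 3)

    @task
    def get_health(self):
        # Locust records latency and failure statistics per endpoint.
        self.client.get("/health")
```

Run with something like `locust -f loadtest.py --host https://example.test` and ramp up virtual users from the web UI.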
United Kingdom, Wolstanton, Staffordshire Hybrid/Remote Options
Uniting Ambition
experience in a commercial environment, working on AI/ML applications Multi-cloud exposure (Azure/AWS/GCP). Some of the following - PyTorch, GPT/BERT, RAG, Apache Airflow, Power Automate, Azure Logic Apps, RPA/Zapier, HuggingFace, LangChain... Background in Data Science or Software Engineering The values and ethos of this business Innovation with real purpose
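Purely as a sketch of the BERT-style tooling this listing names (the model choice and input text are invented), a HuggingFace pipeline call looks like this:

```python
# Minimal HuggingFace Transformers usage; the input text is invented.
from transformers import pipeline

# Downloads a small pretrained sentiment model on first run.
classifier = pipeline("sentiment-analysis")
print(classifier("This release went remarkably smoothly."))
```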
implementing and consuming DTOs, Service/Repository patterns, etc. - Performance: Proven track record of identifying and fixing performance bottlenecks - Linux & DevOps: Comfortable with Ubuntu; experience with Docker, Nginx/Apache, CI/CD tools - Version Control: Git fluency; experience collaborating via Bitbucket workflows - Communication: Strong written and verbal English; ability to explain technical concepts clearly Nice-to-Have: - Experience
of containerisation and orchestration (e.g., Docker, Kubernetes, OpenShift). Experience with CI/CD pipelines (e.g., Jenkins, TeamCity, Concourse). Familiarity with web/application servers such as NGINX, Apache, or JBoss. Exposure to monitoring and logging tools (ELK, Nagios, Splunk, DataDog, New Relic, etc.). Understanding of security and identity management (OAuth2, SSO, ADFS, Keycloak, etc.). Experience
to learn new technologies IT WOULD BE NICE FOR THE DATA ENGINEER TO HAVE.... Cloud based architectures Microservice architecture or server-less architecture Messaging/routing technologies such as Apache NiFi/RabbitMQ TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to For further information please call me on . I can make myself
integration of software IT WOULD BE NICE FOR THE BIG DATA ENGINEER TO HAVE.... Cloud based architectures Microservice architecture or server-less architecture Messaging/routing technologies such as Apache NiFi/RabbitMQ Experience of DevSecOps automated deployment tools such as Jenkins, Ansible, Docker TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to For
cloud platforms (AWS, Azure, GCP) and server provisioning. Knowledge of CI/CD pipelines, Docker, and Git is advantageous but not essential. Experience with Linux-based environments, Nginx/Apache, and monitoring tools is also advantageous but not essential. In return, you will be rewarded with 25-30 days holiday, flexible working, pension and ongoing career development and training.
Employment Type: Permanent
Salary: £45,000 - £50,000 per annum + benefits
business and technical stakeholders to improve data reliability and transparency. Identify opportunities for automation and process optimisation once BAU stability is achieved. Technical Environment AWS (data storage and processing) Apache Airflow (workflow orchestration) Power BI (reporting and analytics) What We're Looking For Strong background in data engineering or data operations. Experience managing or mentoring offshore technical teams. Solid
monitoring processes to maintain data integrity and reliability. * Optimise data workflows for performance, cost-efficiency, and maintainability using tools such as Azure Data Factory, AWS Data Pipeline, Databricks, or Apache Spark. * Integrate and prepare data for Tableau dashboards and reports, ensuring optimal performance and alignment with business needs. * Collaborate with visualisation teams to develop, maintain, and enhance Tableau
Crewe, Cheshire, England, United Kingdom Hybrid/Remote Options
DCS Recruitment
change management processes. * Ensure all systems and environments comply with internal policies, controls, and security standards. Key Skills & Experience: Essential: * Strong hands-on experience with Linux technologies including Ubuntu, Apache, PHP, MySQL, PostgreSQL, Nginx, Postfix, and Git * Experience with configuration and automation tools such as Puppet, Ansible, Terraform, and Ninja * Solid understanding of AWS services, ideally including EC2, ECS
Crewe, Cheshire, England, United Kingdom Hybrid/Remote Options
Radius
levels and efficiency. Documenting systems and following change management protocols. Ensuring compliance with internal controls and security standards. What we’re looking for Proven experience with Linux technologies (Ubuntu, Apache, PHP, MySQL, PostgreSQL, Nginx, Postfix, Git) Familiarity with configuration tools (Puppet, Ansible, Terraform, Ninja) Experience with AWS services (EC2, ECS, Lambda, VPC, Route53, S3, RDS, CloudWatch, CloudFormation) Knowledge of
Gloucester, Gloucestershire, United Kingdom Hybrid/Remote Options
NSD
hybrid working when possible Must hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that help keep the nation safe
for hardware and Linux based systems Strong AWS experience – management and deployment Experience working in a SaaS environment, ideally CRM Full Software Development Lifecycle experience DevOps skills with Linux, Apache and MySQL Agile project methodologies The role can pay up to c.£85k basic plus a bonus of c.£5k. Max package will be £90k. There is parking onsite. 25 days
Crewe, Cheshire, England, United Kingdom Hybrid/Remote Options
Radius
Kubernetes, Docker) Strong knowledge of AWS services (EC2, ECS, EKS, Lambda, VPC, Route53, S3, RDS, CloudWatch) Scripting skills in Python and Bash Familiarity with Linux and database technologies (Ubuntu, Apache, PHP, MySQL, PostgreSQL, Nginx) Experience with Git/Mercurial and system integrations/migrations Understanding of IT security fundamentals and ITIL service management A collaborative mindset and enthusiasm for
able to work across full data cycle. - Proven Experience working with AWS data technologies (S3, Redshift, Glue, Lambda, Lake Formation, CloudFormation), GitHub, CI/CD - Coding experience in Apache Spark, Iceberg or Python (Pandas) - Experience in change and release management. - Experience in Database Warehouse design and data modelling - Experience managing Data Migration projects. - Cloud data platform development and … the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB - Good experience with DBT, Apache Iceberg, Docker, Microsoft BI stack (nice to have) - Experience in data warehouse design (Kimball and lake house, medallion and data vault) is a definite preference as is knowledge of
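As a hedged sketch of the S3-plus-Pandas end of this stack (bucket names, keys, and columns are all invented):

```python
# Illustrative S3 + Pandas step; every name here is hypothetical.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Pull a raw CSV extract from the landing zone.
obj = s3.get_object(Bucket="example-landing-bucket", Key="raw/orders.csv")
orders = pd.read_csv(io.BytesIO(obj["Body"].read()))

# A small Pandas transformation before the warehouse load.
orders["order_date"] = pd.to_datetime(orders["order_date"])
summary = orders.groupby("order_date", as_index=False)["amount"].sum()

# Write the curated output back to S3 for a downstream Redshift COPY.
s3.put_object(
    Bucket="example-curated-bucket",
    Key="curated/daily_orders.csv",
    Body=summary.to_csv(index=False).encode("utf-8"),
)
```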