Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Forward Role
live, mission-critical environments
- Deep knowledge of Linux server administration
- Skilled in log analysis using tools like Splunk or the ELK stack
- Hands-on with tools and platforms such as: Apache NiFi, MinIO, AWS S3
- Java & Python applications (deployment, patching, support)
- Containerisation and deployment technologies such as Docker, Podman, Kubernetes, OpenShift
- Excellent analytical, troubleshooting, and prioritisation skills
Security Clearance You
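For context on the MinIO/S3 skills named above: MinIO exposes an S3-compatible API, so the same boto3 client used for AWS S3 can target it. A minimal sketch, assuming a local MinIO server; the endpoint, credentials, bucket, and file names are placeholders, not details from the listing.

import boto3

# MinIO speaks the S3 API, so boto3 just needs an explicit endpoint_url.
# Endpoint, credentials, and names below are illustrative placeholders.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",
    aws_access_key_id="minioadmin",
    aws_secret_access_key="minioadmin",
)

s3.create_bucket(Bucket="app-logs")
s3.upload_file("server.log", "app-logs", "logs/server.log")

# List what landed in the bucket.
for obj in s3.list_objects_v2(Bucket="app-logs").get("Contents", []):
    print(obj["Key"], obj["Size"])

Pointing the same code at AWS itself is just a matter of dropping endpoint_url and using real credentials.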
Expertise in data warehousing, data modelling, and data integration. Experience in MLOps and machine learning pipelines. Proficiency in SQL and data manipulation languages. Experience with big data platforms (including Apache Arrow, Apache Spark, Apache Iceberg, and ClickHouse) and cloud-based infrastructure on AWS.
Education & Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
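As a small illustration of the Apache Arrow piece of that stack: Arrow is an in-memory columnar format, and Parquet is the on-disk format Iceberg tables commonly sit on. A minimal round-trip sketch with pyarrow; the schema and values are invented for the example.

import pyarrow as pa
import pyarrow.parquet as pq

# Build a columnar, in-memory Arrow table (schema and values are made up).
table = pa.table({
    "trade_id": [1, 2, 3],
    "symbol": ["ABC", "DEF", "ABC"],
    "price": [101.5, 99.2, 103.1],
})

# Persist it as Parquet, then read it back to confirm the round trip.
pq.write_table(table, "trades.parquet")
print(pq.read_table("trades.parquet").to_pandas())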
Basic understanding of data structures and data modeling. Good problem-solving and logical thinking skills. Ability to work independently in a remote environment.
Preferred Qualifications: Exposure to tools like Apache Airflow, Spark, or Kafka. Basic experience with Python or Scala for scripting and automation. Knowledge of cloud platforms like AWS, Azure, or GCP. Previous academic or personal projects related
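Exposure to Apache Airflow usually starts with defining a DAG. A minimal sketch using the TaskFlow API, assuming Airflow 2.x; the schedule, DAG name, and task bodies are invented for illustration.

from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract():
        # Stand-in for pulling rows from a source system.
        return [1, 2, 3]

    @task
    def load(rows):
        print(f"loaded {len(rows)} rows")

    # Passing extract()'s output to load() defines the task dependency.
    load(extract())

example_etl()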
audiences. Self-motivated and able to work independently.
Preferred Qualifications: Background in investment banking or financial services. Hands-on experience with Hive, Impala, and the Spark ecosystem (e.g., HDFS, Apache Spark, Spark-SQL, UDFs, Sqoop). Proven experience building and optimizing big data pipelines, architectures, and data sets.
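As a toy example of the Spark-SQL and UDF experience mentioned, the sketch below registers a Python function as a UDF and calls it from SQL; the data and the masking function are invented.

from pyspark.sql import SparkSession
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf-demo").getOrCreate()

# Register a Python function so it can be called from Spark SQL.
spark.udf.register("mask_account", lambda s: "XXXX" + s[-4:], StringType())

df = spark.createDataFrame([("1234567890",), ("9876543210",)], ["account"])
df.createOrReplaceTempView("accounts")

spark.sql("SELECT account, mask_account(account) AS masked FROM accounts").show()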
and user needs
Qualifications
- 5+ years of hands-on experience with Python, Java and/or C++
- Development of distributed systems
- Kubernetes (K8s)
- AWS (SQS, DynamoDB, EC2, S3, Lambda)
- Apache Spark
- Performance testing
Bonus
- Search system development (indexing/runtime/crawling)
- MLOps development and/or operations
The cash compensation range for this role is
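To make the AWS services list above concrete, here is a minimal SQS round trip with boto3; the queue name and region are placeholders, and the snippet assumes AWS credentials are already configured in the environment.

import boto3

sqs = boto3.client("sqs", region_name="eu-west-2")

# Queue name is a placeholder; credentials come from the environment.
queue_url = sqs.create_queue(QueueName="demo-queue")["QueueUrl"]

sqs.send_message(QueueUrl=queue_url, MessageBody="hello from the pipeline")

# Long-poll for up to 5 seconds, then delete what we received.
resp = sqs.receive_message(QueueUrl=queue_url, WaitTimeSeconds=5)
for msg in resp.get("Messages", []):
    print(msg["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])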
Central London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
applications and high-proficiency SQL for complex querying and performance tuning.
ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies.
GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines.
Data Architecture & Modelling: Ability to
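Since this listing centres on Dataflow, the sketch below shows the basic shape of a Beam pipeline in Python. Run as-is it uses the local DirectRunner; targeting Dataflow is a matter of pipeline options. The data and output path are invented.

import apache_beam as beam

# DirectRunner by default; Dataflow would be selected via PipelineOptions.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["alpha", "beta", "gamma"])
        | "Upper" >> beam.Map(str.upper)
        | "Write" >> beam.io.WriteToText("out/result")
    )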
the office.
What You'll Need to Succeed
You'll bring 5+ years of data engineering experience, with expert-level skills in Python and/or Scala, SQL, and Apache Spark. You're highly proficient with Databricks and Databricks Asset Bundles, and have a strong understanding of data transformation best practices. Experience with GitHub, DBT, and handling large structured
Belfast, County Antrim, Northern Ireland, United Kingdom
Hays
in Technical Data Analysis. Proficiency in SQL, Python, and Spark. Experience within an investment banking or financial services environment. Exposure to Hive, Impala, and Spark ecosystem technologies (e.g. HDFS, Apache Spark, Spark-SQL, UDF, Sqoop). Experience building and optimizing Big Data pipelines, architectures, and data sets. Familiarity with Hadoop and Big Data ecosystems. Strong knowledge of Data Warehouse
time and batch inference
- Monitor and troubleshoot deployed models to ensure reliability and performance
- Stay updated with advancements in machine learning frameworks and distributed computing technologies
Experience:
- Proficiency in Apache Spark and Spark MLlib for machine learning tasks
- Strong understanding of predictive modeling techniques (e.g., regression, classification, clustering)
- Experience with distributed systems like Hadoop for data storage and processing
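As a compact illustration of Spark MLlib for the classification techniques listed, this sketch fits a logistic regression on a tiny invented dataset; feature and label names are placeholders.

from pyspark.ml.classification import LogisticRegression
from pyspark.ml.feature import VectorAssembler
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mllib-demo").getOrCreate()

# Tiny invented training set: two numeric features and a binary label.
df = spark.createDataFrame(
    [(0.0, 1.1, 0), (1.5, 0.3, 1), (0.2, 0.9, 0), (2.1, 0.1, 1)],
    ["f1", "f2", "label"],
)

# MLlib models expect features packed into a single vector column.
features = VectorAssembler(inputCols=["f1", "f2"], outputCol="features")
train = features.transform(df)

model = LogisticRegression(featuresCol="features", labelCol="label").fit(train)
model.transform(train).select("label", "prediction").show()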
of large-scale distributed data processing. Experience with developing extract-transform-load (ETL). Experience with distributed messaging systems like Kafka and RabbitMQ. Experience with distributed computing frameworks like Apache Spark and Flink.
Bonus Points
Experience working with AWS or Google Cloud Platform (GCP). Experience in building a data warehouse and data lake. Knowledge of advertising platforms.
About
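For the Kafka side of that messaging requirement, here is a minimal produce/consume sketch using the kafka-python client; the broker address, topic, and message are placeholders for a local test cluster.

from kafka import KafkaConsumer, KafkaProducer

# Broker and topic are placeholders for a local test cluster.
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("events", b"user_signup")
producer.flush()

consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating after 5s of silence
)
for message in consumer:
    print(message.topic, message.value)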
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
etc.
- Infrastructure as Code and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc.
- Observability - SRE
- Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop ecosystem
- Excellent knowledge of YAML or similar languages
The following Technical Skills & Experience would be desirable:
- JupyterHub awareness
- RabbitMQ or other common queue technology
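Since the role leans on YAML (the configuration language behind Ansible, Kubernetes, and most CI systems), here is a small PyYAML parsing sketch; the pipeline document it reads is invented.

import yaml

# An invented pipeline definition, in the style of CI/CD config files.
raw = """
stages:
  - name: build
    image: python:3.12
  - name: deploy
    image: alpine:3.20
"""

config = yaml.safe_load(raw)  # safe_load avoids executing arbitrary tags
for stage in config["stages"]:
    print(stage["name"], "->", stage["image"])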
Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.
Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet
knowledge and Unix skills. Highly proficient working with cloud environments (ideally Azure), distributed computing and optimising workflows and pipelines. Experience working with common data transformation and storage formats, e.g. Apache Parquet, Delta tables. Strong experience working with containerisation (e.g. Docker) and deployment (e.g. Kubernetes). Experience with Spark, Databricks, data lakes. Highly proficient in working with version control and
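To ground the Parquet/Delta formats mentioned: a Delta table is versioned Parquet plus a transaction log, and the delta-rs Python bindings (the deltalake package) can read and write one without a Spark cluster. A minimal sketch; the table path and data are invented.

import pandas as pd
from deltalake import DeltaTable, write_deltalake

# Invented sample data; the table path is a local placeholder.
df = pd.DataFrame({"sensor": ["a", "b"], "reading": [0.71, 0.42]})
write_deltalake("./delta/readings", df)

# Read back the latest snapshot of the versioned table.
print(DeltaTable("./delta/readings").to_pandas())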
Bedford, Bedfordshire, England, United Kingdom Hybrid / WFH Options
Reed Talent Solutions
source systems into our reporting solutions.
Pipeline Development: Develop and configure metadata-driven data pipelines using data orchestration tools such as Azure Data Factory and engineering tools like Apache Spark to ensure seamless data flow.
Monitoring and Failure Recovery: Implement monitoring procedures to detect failures or unusual data profiles and establish recovery processes to maintain data integrity.
Azure
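The "metadata-driven" pattern referenced above usually means one generic pipeline parameterised by a table of sources rather than one hand-written job per feed. A hedged PySpark sketch of the idea; the feed entries and paths are invented, and in Azure Data Factory the equivalent loop would typically live in a ForEach activity over a lookup.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata-driven").getOrCreate()

# Invented metadata: one entry per source feed the pipeline should copy.
FEEDS = [
    {"name": "customers", "path": "/landing/customers.csv"},
    {"name": "orders", "path": "/landing/orders.csv"},
]

# One generic job, driven entirely by the metadata above.
for feed in FEEDS:
    df = spark.read.option("header", True).csv(feed["path"])
    df.write.mode("overwrite").parquet(f"/curated/{feed['name']}")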
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Apacheix
a year
- Individual healthcare cover
- Genuine flexible working
- Work from home, our Bristol offices, or client sites
- The latest secure tech
- Investment in personal development
- Vibrant social scene
Why Apache iX? Our growing team brings a wealth of experience from across the defence and security sector, and we pride ourselves on delivering the highest quality services to our clients.
data-based insights, collaborating closely with stakeholders. Passionately discover hidden solutions in large datasets to enhance business outcomes. Design, develop, and maintain data processing pipelines using Cloudera technologies, including Apache Hadoop, Apache Spark, Apache Hive, and Python. Collaborate with data engineers and scientists to translate data requirements into technical specifications. Develop and maintain frameworks for efficient data
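On a Cloudera cluster, the usual entry point for the Spark/Hive combination above is a Hive-enabled Spark session. A minimal sketch; the database and table names are invented, and on a real cluster they would come from the Hive metastore.

from pyspark.sql import SparkSession

# enableHiveSupport() lets Spark read tables from the Hive metastore.
spark = (
    SparkSession.builder
    .appName("hive-demo")
    .enableHiveSupport()
    .getOrCreate()
)

# Database and table are placeholders for metastore-managed objects.
daily = spark.sql("""
    SELECT event_date, COUNT(*) AS events
    FROM analytics.web_events
    GROUP BY event_date
""")
daily.show()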
South West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
on experience with cloud platforms like AWS, Azure, GCP, or Snowflake. Strong knowledge of data governance, compliance, and security standards (GDPR, CCPA). Proficiency in big data technologies like Apache Spark and understanding of data product strategies. Strong leadership and stakeholder management skills in Agile delivery environments.
Package: £90,000 - £115,000 base salary, bonus, pension and company benefits
London, South East, England, United Kingdom Hybrid / WFH Options
XPERT-CAREER LTD
AI agents
- Understanding of Large Language Models (LLMs) and intelligent automation workflows
- Experience building high-availability, scalable systems using microservices or event-driven architecture
- Knowledge of orchestration tools like Apache Airflow, Kubernetes, or serverless frameworks
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or related field
- Experience working in Agile/Scrum environments
- Strong problem-solving skills and attention
experience with CI/CD (e.g. GitLab CI, Terraform, Ansible, Helm Charts, Python, PowerShell, REST APIs)
- Secret Management experience: HashiCorp Vault
- Operating Systems (Red Hat & Windows)
- Must Have Application Experience: Apache NiFi, Elastic ECK, Artifactory
- Experience using an array of automation tools
- Familiarity with software security methods
- Demonstrated experience using a wide variety of coding languages
- Curious, discerning envelope pusher
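As a pointer for the HashiCorp Vault requirement, the sketch below reads a secret from the KV v2 engine with the hvac client; the address, token, and secret path are placeholders for a local dev-mode Vault server.

import hvac

# Address, token, and secret path are placeholders for a dev Vault server.
client = hvac.Client(url="http://127.0.0.1:8200", token="dev-only-token")
assert client.is_authenticated()

secret = client.secrets.kv.v2.read_secret_version(path="ci/artifactory")
print(secret["data"]["data"])  # KV v2 nests the payload under data.data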