Gloucester, Gloucestershire, United Kingdom - Hybrid / WFH Options
Navtech, Inc
… of Science Degree in software engineering or a related field. Proficiency in English, spoken and written. Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM frameworks …
Cardiff, South Glamorgan, United Kingdom - Hybrid / WFH Options
Navtech, Inc
… of Science Degree in software engineering or a related field. Proficiency in English, spoken and written. Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM frameworks …
… experience working as a Software Engineer on large software applications. Proficient in many of the following technologies - Python, REST, PyTorch, TensorFlow, Docker, FastAPI, Selenium, React, TypeScript, Redux, GraphQL, Kafka, Apache Spark. Experience working with one or more of the following database systems - DynamoDB, DocumentDB, MongoDB. Demonstrated expertise in unit testing and tools - JUnit, Mockito, PyTest, Selenium. Strong working knowledge …
… or MS degree in Computer Science or equivalent. Experience in developing Finance or HR related applications. Working experience with Tableau. Working experience with Terraform. Experience in creating workflows for Apache Airflow and Jenkins. Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive …
… REST APIs and integration techniques. Familiarity with data visualization tools and libraries (e.g. Power BI). Background in database administration or performance tuning. Familiarity with data orchestration tools, such as Apache Airflow. Previous exposure to big data technologies (e.g. Hadoop, Spark) for large data processing. Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and …
Gloucester, Gloucestershire, South West, United Kingdom - Hybrid / WFH Options
Omega Resource Group
… GitLab). Contributing across the software development lifecycle, from requirements to deployment. Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite. What's on Offer: Hybrid working and flexible schedules (4xFlex); ongoing training and career development; exciting projects within the UK's secure …
Hucclecote, Gloucestershire, United Kingdom - Hybrid / WFH Options
Omega Resource Group
… GitLab). Contributing across the software development lifecycle, from requirements to deployment. Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite. What's on Offer: Hybrid working and flexible schedules (4xFlex); ongoing training and career development; exciting projects within the UK's secure …
Gloucester, Gloucestershire, South West, United Kingdom
Anson Mccade
… Python. Strong experience developing on Linux. Version control using Git. Agile development (SCRUM). Working with both relational databases (Oracle) and NoSQL (MongoDB). Experience with GitLab CI/CD Pipelines, Apache NiFi, and Atlassian tools (Jira, Bitbucket, Confluence). Front-end skills: JavaScript/TypeScript, React. Search and analytics tools: Elasticsearch, Kibana. Nice to Have: Experience developing for AWS Cloud (EC2 …
North West London, London, United Kingdom - Hybrid / WFH Options
Anson Mccade
… knowledge of Kafka, Confluent, and event-driven architecture. Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures. Strong architectural understanding across AWS, Azure, GCP, and Snowflake. Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java). Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures. Strong grasp of data governance …
… collaboratively. Proficiency in multiple programming languages. Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience with process scheduling platforms like Apache Airflow. Open to working with proprietary GS technologies such as Slang/SECDB. Understanding of compute resources and performance metrics. Knowledge of distributed computing, including parallel and cloud processing …
… with multiple languages. Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience in working with process scheduling platforms like Apache Airflow. Open to working in GS proprietary technology like Slang/SECDB. An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory, threads, file …
… Experience with real-time analytics from telemetry and event-based streaming (e.g., Kafka). Experience managing operational data stores with high availability, performance, and scalability. Expertise in data lakes, lakehouses, Apache Iceberg, and data mesh architectures. Proven ability to build, deliver, and support modern data platforms at scale. Strong knowledge of data governance, data quality, and data cataloguing. Experience with …
… and managing cloud infrastructure as code. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. Excellent problem-solving skills and attention to detail. Inclusive and curious, continuously seeks to build knowledge and …
Liverpool, Lancashire, United Kingdom - Hybrid / WFH Options
Intuita - Vacancies
… Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and DBT is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience with …
… requirements into data solutions. Monitor and improve pipeline performance and reliability. Maintain documentation of systems, workflows, and configs. Tech environment: Python, SQL/PLSQL (MS SQL + Oracle), PySpark; Apache Airflow (MWAA), AWS Glue, Athena; AWS services (CDK, S3, data lake architectures); Git, JIRA. You should apply if you have: Strong Python and SQL skills. Proven experience designing data …
… programming. Ability to multitask and manage stakeholders. Proficiency in multiple programming languages. Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD, build tools. Experience with process scheduling platforms like Apache Airflow. Willingness to work with proprietary technologies like Slang/SECDB. Understanding of compute resources and performance metrics. Knowledge of distributed computing and parallel processing. Experience with SDLC and …
… of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan). Strong communication, stakeholder management, and documentation skills. Preferred (but not essential): AWS or Snowflake certifications. Knowledge of Apache Airflow, DBT, GitHub Actions. Experience with Iceberg tables and data product thinking. Why Apply? Work on high-impact, high-scale client projects. Join a technically elite team with a …
… Python, Java. Development experience in Java and/or Python. Experience working with Terraform to provision AWS cloud services. Experience of AWS Glue, AWS Athena & AWS S3. Knowledge of Apache Parquet & open table formats. Extensive knowledge of distributed systems. Experience in developing, debugging, and maintaining code in a large corporate environment. Overall knowledge of the Software Development Life Cycle …
… Vue.js. Scripting & Automation: Bash, Ansible. DevOps & CI/CD: Jenkins, GitLab CI/CD, Terraform. Cloud & Infrastructure: AWS. Testing & Quality: Cucumber, SonarQube. Monitoring & Logging: ELK Stack, Grafana. Dataflow & Integration: Apache NiFi. Willingness to learn and contribute across the stack is important, as experience across multiple areas is desirable but not mandatory. Skills: DevOps, Java, NiFi, Terraform. What you can …
South East London, London, United Kingdom - Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
… and data validation techniques. Experience using test automation frameworks for data pipelines and ETL workflows. Strong communication and stakeholder management skills. Nice-to-Have: Hands-on experience with Databricks, Apache Spark, and Azure Deequ. Familiarity with Big Data tools and distributed data processing. Experience with data observability and data quality monitoring. Proficiency with CI/CD tools like …
… IV, IFRS 9, CRD4). Strong leadership and stakeholder engagement skills. 15+ years in software development and cloud engineering, ideally in financial services. Experience with big data frameworks (e.g., Apache Beam, Spark) and data governance tools. About working for us: Our ambition is to be the leading UK business for diversity, equity and inclusion, supporting our customers, colleagues and …
Bexhill-on-Sea, Sussex, United Kingdom - Hybrid / WFH Options
Hastings Direct
… drive projects from initiation to completion. Keen interest in emerging Machine Learning techniques. Desirable: Experience with Cloud deployments (e.g. Azure/AWS/GCP) and data processing frameworks (e.g. Apache Spark). The interview process: Recruiter screening call; 1st stage interview - intro with Hiring Leader; 2nd interview - case study round with hiring leaders …
… management and associated tools such as Git/Bitbucket. Experience in the use of CI/CD tools such as Jenkins, or an understanding of their role. Experience with Apache Spark or Hadoop. Experience in building data pipelines. Experience of designing warehouses, ETL pipelines and data modelling. Good knowledge in designing, building, using, and maintaining REST APIs. Good SQL …