Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
financial data, risk modelling, or algorithmic trading is a plus. Familiarity with cloud platforms (AWS, GCP, or Azure) and modern data stack tools (e.g., Apache Airflow, dbt, Snowflake). Excellent communication and stakeholder management skills. Must be available to work onsite in London 3 days per week. What's …
front-end technologies like HTML and Vue.js. Proficiency in scalable database design across SQL, NoSQL, and graph databases such as SQL Server, MongoDB, Cassandra, Redis, and Apache Druid. Experience with REST APIs, GraphQL, and gRPC. Hands-on experience with version control (GitHub/GitLab) and testing and code-quality tools like SonarQube, xUnit, Postman …
concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of regulatory data requirements …
Skills: 5+ years' experience with Python programming for data engineering tasks. Strong proficiency in SQL and database management. Hands-on experience with Databricks and Apache Spark. Familiarity with the Azure cloud platform and related services. Knowledge of data security best practices and compliance standards. Excellent problem-solving and communication skills. …
TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data …
Bracknell, Berkshire, United Kingdom Hybrid / WFH Options
Perforce Software, Inc
an on-call rotation. Requirements: Experience using the AWS EC2 web console and APIs. Deep understanding of the HTTP protocol, including web security and troubleshooting. Apache or Nginx web server administration and configuration experience. Linux system administration experience (Red Hat, Rocky, Alma, Debian, Ubuntu, et al.). Experience maintaining production RDBMS …
systems. Strong knowledge of Kubernetes and Kafka. Experience with Git and deployment pipelines. Experience with at least one of the following stacks: Hadoop, Apache Spark, Presto. Experience profiling performance issues in database systems. Ability to learn and adapt quickly to complex issues. Happy to collaborate with …
Reading, England, United Kingdom Hybrid / WFH Options
Areti Group | B Corp™
efficient and performant queries. • Skilled in optimizing data ingestion and query performance for MSSQL or other RDBMS. • Familiar with data processing frameworks such as Apache Spark. • Highly analytical and tenacious in solving complex problems. Seniority level: Not Applicable. Employment type: Full-time. Job function: …
SQL. Performance optimisation of data ingestion and query performance for MSSQL (or transferable skills from another RDBMS). Familiar with data processing frameworks such as Apache Spark. Experience working with terabyte-scale data sets and managing rapid data growth. The benefits at APF: At AllPoints Fibre, we're all about …
data management (MDM). Deep understanding of data integration, transformation, and ingestion techniques using modern tools (e.g., Azure Data Factory, Boomi, Informatica, Talend, dbt, Apache NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud …
Bracknell, England, United Kingdom Hybrid / WFH Options
BlckBx
KNOWLEDGE: Experience with modern front-end frameworks: React, Vue.js. Knowledge of database technologies: MySQL, Airtable. Knowledge of AWS or other cloud services. Exposure to Apache, Nginx, Linux (bonus). Familiarity with Git and version control. Strong problem-solving skills and attention to detail. Excellent communication and teamwork skills. Knowledge …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
in ETL, data modelling, and Azure Data Services. Experience in designing and implementing data pipelines, data lakes, and data warehouses. Hands-on experience with Apache Spark; bonus points for Microsoft Fabric. Any certifications are a bonus. Hybrid working, one day a week in their Central London office. Learning and …
software, such as Scientific Data Management Systems (SDMS) or Laboratory Information Management Systems (LIMS). Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow) and data parsing. Familiarity with cloud service models, SaaS infrastructure, and related SDLC. Familiarity with containerization and container orchestration tools (e.g., Docker, Kubernetes …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
experience with ML frameworks (PyTorch, TensorFlow, Keras). Proficiency in Python and C/C++. Experience with scalable data tools (e.g., PySpark, Kubernetes, Databricks, Apache Arrow). Proven ability to manage GPU-intensive data processing jobs. 4+ years of applied research or industry experience. Creative problem-solver with a …
gain more hands-on experience and build their software skills. Key Skills: Experience with databases and basic SQL. Experience with web servers, such as Apache, is helpful. Remote access tools. Starting salary in the region of £30-35K + benefits. Apply now for immediate consideration and to be …
more hands-on, and build up their software experience. Key Skills: Windows support experience. Databases/basic SQL. Web servers (experience with Apache is helpful). Remote access tools. Starting salary in the region of £30-35K + benefits. Apply now for immediate consideration and interview …
Apache Associates: Senior Product Manager - Mobile Applications - Hungerford - £80k - £90k pa - Hybrid Role. Our client, a fast-growing SaaS business, is seeking an experienced Product Manager to lead the strategy, development, and optimization of their …
Good knowledge of security tools configuration and endpoint deployment. Knowledge and understanding of networking protocols (TCP, NFS, NTP, SMTP, DNS, DHCP, FTP, Telnet, Samba) and Apache web server configuration. Proven customer-facing experience in an IT service role. * Free services are subject to limitations. …
actively contribute throughout the Agile development lifecycle, participating in planning, refinement, and review ceremonies. Key Responsibilities: Develop and maintain ETL pipelines in Databricks, leveraging Apache Spark and Delta Lake. Design, implement, and optimize data transformations and treatments for structured and unstructured data. Work with Hive Metastore and Unity … technical impact assessments and rationales. Work within GitLab repository structures and adhere to project-specific processes. Required Skills and Experience: Strong expertise in Databricks, Apache Spark, and Delta Lake. Experience with Hive Metastore and Unity Catalog for data governance. Proficiency in Python, SQL, Scala, or other relevant languages. …