London, South East England, United Kingdom - Hybrid / WFH Options
Harnham
metrics. Collaborating with business teams to understand reporting needs and deliver intuitive analytics experiences. Creating clear documentation for dashboards, metrics, and models. Ensuring optimal performance and cost efficiency across Looker and BigQuery. Supporting a stakeholder-first approach to data usability and access. KEY SKILLS AND REQUIREMENTS Advanced experience with … clarity. Comfortable working independently in a dynamic, fast-changing environment. DESIRABLE SKILLS Experience rebuilding or refactoring a Looker instance from scratch. Familiarity with BigQuery performance tuning and cost optimization. Experience scaling analytics across a cross-functional organisation. HOW TO APPLY Please register your interest by sending your CV …
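The cost-efficiency point in the Looker/BigQuery role above is often approached with dry-run query estimation. Below is a minimal, illustrative Python sketch using the google-cloud-bigquery client to estimate how many bytes a query would scan before running it; the project, dataset, query text, and price-per-TiB figure are hypothetical placeholders, not details from the listing.

```python
from google.cloud import bigquery

# Hypothetical query against an illustrative dataset; swap in real project/table names.
SQL = """
SELECT user_id, COUNT(*) AS sessions
FROM `my-project.analytics.events`
WHERE event_date >= '2024-01-01'
GROUP BY user_id
"""

def estimate_scan_cost(sql: str, price_per_tib_usd: float = 6.25) -> float:
    """Dry-run a query and return an approximate on-demand cost estimate."""
    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    job = client.query(sql, job_config=job_config)  # no bytes are billed on a dry run
    tib_scanned = job.total_bytes_processed / 2**40
    return tib_scanned * price_per_tib_usd

if __name__ == "__main__":
    print(f"Estimated cost: ${estimate_scan_cost(SQL):.4f}")
```

Running the dry run in CI or a pre-merge check is one common way to keep Looker-generated SQL from silently becoming expensive.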
for scalable and secure data management. Collaborating with cross-functional teams to design and optimize cloud-based data solutions. Ensuring data quality, integration, and performance through best practices and advanced ETL techniques. Implement data quality checks and standardization in the code. Document all mappings, mapplets and rules in detail and … Cloud Services IDMC components - application integration, data integration, Informatica data quality. Strong functional understanding of RDBMS and DWH/BI concepts. Strong SQL skills and performance tuning capabilities. Excellent knowledge of the Informatica platform as a whole and the integration among different Informatica components and services. Excellent data analysis …
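The data quality checks and standardization mentioned above would normally be built inside Informatica IDMC itself; purely as a platform-agnostic illustration of the kind of rules involved, here is a short pandas sketch. The column names, rules, and thresholds are hypothetical, not taken from the listing or from any Informatica mapping.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return simple data-quality metrics for a customer-style dataframe."""
    return {
        "row_count": len(df),
        "null_email_pct": df["email"].isna().mean() * 100,
        "duplicate_ids": int(df["customer_id"].duplicated().sum()),
    }

def standardise(df: pd.DataFrame) -> pd.DataFrame:
    """Apply basic standardisation: trim whitespace and normalise case."""
    out = df.copy()
    out["email"] = out["email"].str.strip().str.lower()
    out["country"] = out["country"].str.strip().str.upper()
    return out

if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "customer_id": [1, 1, 2],
            "email": [" A@x.com", None, "b@y.com "],
            "country": ["gb", "GB ", "de"],
        }
    )
    print(run_quality_checks(sample))
    print(standardise(sample))
```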
scalable technical solutions. Integrate third-party systems and in-house tools with the Endur platform. Support testing, deployment, and post-implementation phases. Contribute to performance tuning, debugging, and general maintenance of the Endur environment. Key Requirements: Strong hands-on experience with Endur, particularly JVS (Java-based scripting). …
Work with business analysts and stakeholders to gather and understand requirements. Conduct code reviews and ensure best practices are followed. Support testing, deployment, and performance tuning. Troubleshoot issues and suggest improvements. Maintain technical documentation. Run workshops and break down requirements into development tasks. Skills Required: Strong hands-on …
London, South East England, United Kingdom - Hybrid / WFH Options
Paritas Recruitment
coordination, and training for end-users. Ensure alignment of Findur functionalities with regulatory requirements (e.g., EMIR, MiFID II, REMIT). Support ongoing enhancements and performance tuning of the Findur platform. Requirements: Strong Business/Technical Analyst experience. Some exposure to Openlink Findur project design and implementation/support. …
in place of this. At least 5 years of in-depth experience working with Windows desktop and server operating systems. Experience with hardware monitoring and performance tuning. Deep experience working with Active Directory/Group Policy. Deep experience working with DHCP and DNS. Advanced PowerShell scripting skills (a huge …
robust data security protocols aligned with regulatory and organizational standards. Stakeholder Collaboration: Work cross-functionally to translate data needs into actionable architecture and solutions. Performance Tuning: Continuously optimize data pipelines and database systems for speed and efficiency. Documentation: Produce and maintain architecture documentation, technical specs, and design artefacts. …
with: SQL and NoSQL databases; containerisation; working in a continuous delivery environment; distributed and horizontally scalable systems; observability and monitoring tools; triaging production issues; performance tuning of JVM apps. Nice to have: not vital, but you'll have the edge if you also have experience with Kotlin, Prometheus, …
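The observability and monitoring experience above relates to JVM services, but the instrumentation pattern itself is language-agnostic. As an illustration only, and keeping to Python as used for the other sketches in this digest, here is a minimal prometheus_client example; the metric names, labels, and port are hypothetical.

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Hypothetical metric names - a real service would follow its own naming scheme.
REQUESTS = Counter("orders_processed_total", "Orders processed", ["status"])
LATENCY = Histogram("order_processing_seconds", "Time spent processing an order")

@LATENCY.time()
def process_order() -> None:
    """Simulate a unit of work and record its outcome."""
    time.sleep(random.uniform(0.01, 0.1))
    REQUESTS.labels(status="ok" if random.random() > 0.05 else "error").inc()

if __name__ == "__main__":
    start_http_server(8000)  # metrics exposed at http://localhost:8000/metrics
    while True:
        process_order()
```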
territory, drive innovation across multiple deployment sites, and collaborate closely with our optimization and engineering teams. You'll develop robust Python code, analyze system performance, and contribute to deployment pipelines that power critical operations. If you're an independent, hands-on engineer who thrives on writing production-grade Python … analytics and machine learning workflows. Support the migration of our codebase toward machine learning capabilities by building scalable, maintainable solutions. Analyze system logs and performance to debug issues and optimize operations using forensic analysis tools. Qualifications: Bachelor's or Master's degree in Computer Science, Mathematics, Data Analytics, or … a related field. 3+ years of experience developing and deploying production-grade Python software. 3+ years of experience with Python and high-performance data libraries such as Polars and Pandas. Proficiency with JavaScript, SQL, and KQL. Experience with Extract, Transform, Load (ETL), Data Streaming, and Reconciliation. Experience building and …
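As a small illustration of the Polars-based ETL work mentioned in the role above, the sketch below lazily reads a CSV, filters and aggregates it, and writes Parquet. The file paths and column names are hypothetical, chosen only to show the shape of a lazy pipeline.

```python
import polars as pl

# Hypothetical input/output paths and schema - purely illustrative.
def build_daily_summary(src: str = "events.csv", dst: str = "daily_summary.parquet") -> None:
    """Lazy extract-transform-load: drop bad rows, aggregate per site per day."""
    (
        pl.scan_csv(src)
        .filter(pl.col("reading").is_not_null() & (pl.col("reading") >= 0))
        .with_columns(pl.col("timestamp").str.to_datetime().dt.date().alias("day"))
        .group_by(["site_id", "day"])
        .agg(
            pl.col("reading").mean().alias("mean_reading"),
            pl.len().alias("n_samples"),
        )
        .collect()
        .write_parquet(dst)
    )

if __name__ == "__main__":
    build_daily_summary()
```

Keeping the pipeline lazy (scan_csv plus collect at the end) lets Polars push filters down and only materialise the aggregated result.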
Role: Lead the design, deployment and tuning of enterprise-grade SIEM platforms (e.g. Splunk, Azure Sentinel etc.). Collaborate with stakeholders to define logging requirements, use cases, detection rules and dashboards. Oversee integration of data sources from cloud, on-prem, endpoint, network and application layers. Create and maintain detection rules … Provide technical leadership and mentorship to team members. Work closely with SOC teams to align SIEM capabilities with business objectives. Conduct SIEM health checks, performance tuning and capacity planning. Skills: Expertise in SIEM design, deployment and optimisation. Hands-on expertise with one or more major SIEM platforms (e.g. …
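Detection rules for a SIEM such as Microsoft Sentinel are usually expressed in KQL against a Log Analytics workspace. As an illustration of testing such a rule from Python, here is a minimal sketch using the azure-monitor-query client; the workspace ID, query, and threshold are placeholders, not values from the listing.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Hypothetical workspace ID and detection query - placeholders only.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
KQL = """
SigninLogs
| where ResultType != 0
| summarize failures = count() by UserPrincipalName, bin(TimeGenerated, 15m)
| where failures > 10
"""

def run_detection() -> None:
    """Run a simple failed-sign-in burst detection against a Log Analytics workspace."""
    client = LogsQueryClient(DefaultAzureCredential())
    response = client.query_workspace(WORKSPACE_ID, KQL, timespan=timedelta(hours=24))
    for table in response.tables:
        for row in table.rows:
            print(dict(zip(table.columns, row)))

if __name__ == "__main__":
    run_detection()
```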
expert-level support and configuration for Cloud Connector, SAP BTP, Datasphere, and SAC. Provide S/4HANA and BTP Basis expertise, including system setup, performance tuning, and troubleshooting. Ensure compliance with best practices for system integration, migration, and security across cloud and on-premise landscapes. Support post-go …
London, South East England, United Kingdom - Hybrid / WFH Options
Hawksworth
needed.... Elastic Stack (Elasticsearch, Logstash, Kibana, Beats). Experience managing and integrating ELK infrastructure. Index lifecycle management (ILM). IP networking and data flow. Data pipeline creation, performance tuning of Logstash and Beats. A nice to have.... Corvil and/or Pico tools. APIs (REST/JSON/XML); Python and …
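Index lifecycle management, mentioned in the Elastic role above, is typically configured as an ILM policy plus an index template that references it. Below is a minimal sketch using the official Python Elasticsearch client; the cluster endpoint, policy name, index pattern, and rollover thresholds are placeholders.

```python
from elasticsearch import Elasticsearch

# Placeholder endpoint, policy name and thresholds - adjust for a real cluster.
es = Elasticsearch("http://localhost:9200")

# Roll over hot indices at ~50 GB or 7 days, delete data after 30 days.
es.ilm.put_lifecycle(
    name="logs-rollover-policy",
    policy={
        "phases": {
            "hot": {
                "actions": {
                    "rollover": {"max_primary_shard_size": "50gb", "max_age": "7d"}
                }
            },
            "delete": {"min_age": "30d", "actions": {"delete": {}}},
        }
    },
)

# Attach the policy to new indices via an index template.
es.indices.put_index_template(
    name="logs-template",
    index_patterns=["logs-*"],
    template={"settings": {"index.lifecycle.name": "logs-rollover-policy"}},
)
```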
up, where you will be involved in the full lifecycle of its creation, ranging from architecture design and system development to continuous enhancement and performance tuning. Requirements: Excellent programming and technology skills, including an in-depth understanding of or strong knowledge in C++. Ability to take ownership of technical …
/deliverables. Champion and drive through alerting and monitoring requirements for the platform. Identify and execute pro-active actions to ensure continued stability and performance of the platform. Our ideal candidate: Strong experience in designing and delivering Azure-based data platform solutions, technologies including: Azure Databricks, Azure Synapse … knowledge of real-time streaming applications, preferably with experience in Kafka real-time messaging or Azure Stream Analytics/Event Hubs. Spark processing and performance tuning. File formats and partitioning, e.g. Parquet, JSON, XML, CSV. Azure DevOps, GitHub Actions. Hands-on experience in at least one of Python with …
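The Spark processing and Parquet partitioning mentioned above can be illustrated with a short PySpark sketch: read semi-structured input, derive a date column, and write date-partitioned Parquet. The paths and column names are hypothetical, and a Databricks or local Spark session is assumed.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical paths and columns - illustrative of partitioned Parquet output only.
spark = SparkSession.builder.appName("events-batch").getOrCreate()

raw = spark.read.json("/mnt/landing/events/")  # semi-structured input
clean = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Partitioning by date keeps downstream reads pruned to only the days they need.
(
    clean.write
         .mode("overwrite")
         .partitionBy("event_date")
         .parquet("/mnt/curated/events/")
)
```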
manage reliable ETL pipelines using SQL and SSIS to support enterprise-wide data integration and transformation. Optimise SQL code and ETL workflows to enhance performance and scalability. Develop and maintain SSIS packages to ensure accurate and consistent data processing. Support operational reporting systems by resolving issues raised by users … as Computer Science, Engineering, Mathematics, or Economics. At least 5 years' experience in SQL development. Advanced skills in writing complex SQL, stored procedures, and performance tuning. Solid hands-on experience with SSIS and ETL development. Knowledge of BI/reporting tools (e.g. Power BI, Tableau, SSRS, SSAS) is advantageous. …
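The role above centres on SSIS, but the performance-minded loading it describes can be sketched from Python as well, using pyodbc's batched, parameterised inserts. The connection string, table, and rows below are hypothetical placeholders.

```python
import pyodbc

# Placeholder connection string and staging table - adjust to the real environment.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql01;DATABASE=Staging;Trusted_Connection=yes;"
)

def load_rows(rows: list[tuple]) -> None:
    """Bulk-insert rows into a staging table using parameterised, batched inserts."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.fast_executemany = True  # send parameter batches in one round trip
        cursor.executemany(
            "INSERT INTO dbo.StagingSales (sale_id, amount, sold_at) VALUES (?, ?, ?)",
            rows,
        )
        conn.commit()

if __name__ == "__main__":
    load_rows([(1, 19.99, "2024-01-01"), (2, 5.50, "2024-01-02")])
```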
practical data architectures and drive improvements in data reliability, efficiency, and quality. A proven track record of recommending changes to enhance database maintenance, monitoring, performance tuning, etc. What you'll get for this role: Salary circa £85,000 (depending on location, skills, experience, and qualifications). Bonus opportunity … of annual salary. Actual amount depends on your performance and Aviva's. Generous pension scheme - Aviva will provide up to 14%, depending on individual contributions. 29 days holiday plus bank holidays, and a choice to buy or sell up to 5 days. Make your money go further - Up to …
the next generation of GPU and AI acceleration solutions. This is a unique opportunity to contribute to cutting-edge technology focused on delivering high-performance and energy-efficient compute platforms for modern AI workloads. You'll be working on a flagship GPU and AI platform supporting PyTorch, OpenCL, and … integrations and other AI tools for a custom AI accelerator platform. You'll work closely with hardware and software teams to ensure tight integration, performance tuning, and a seamless developer experience. Key Responsibilities: Develop and maintain PyTorch integration for a custom AI platform. Build and optimize kernels and … 5+ years of experience in AI/ML software development. Deep understanding of PyTorch internals and other major ML frameworks. Experience optimizing deep learning performance on accelerator hardware. Solid knowledge of deep learning algorithms and compute patterns. Strong programming skills in C++, CUDA, or OpenCL. Background in performance …
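Real accelerator integration of the kind described above happens in C++/OpenCL/CUDA and the vendor's dispatch stack; purely as an op-level illustration of the Python-facing autograd plumbing involved, here is a small custom torch.autograd.Function. The fused scale-and-clip operation is hypothetical and chosen only for brevity.

```python
import torch

# Illustrative only: a custom fused scale-and-clip op exposed to autograd.
# A real backend would dispatch the forward to vendor kernels.
class ScaleClip(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x: torch.Tensor, scale: float, limit: float) -> torch.Tensor:
        ctx.save_for_backward(x)
        ctx.scale, ctx.limit = scale, limit
        return torch.clamp(x * scale, -limit, limit)

    @staticmethod
    def backward(ctx, grad_out: torch.Tensor):
        (x,) = ctx.saved_tensors
        inside = (x * ctx.scale).abs() < ctx.limit  # gradient flows only where not clipped
        return grad_out * ctx.scale * inside, None, None

if __name__ == "__main__":
    x = torch.randn(4, requires_grad=True)
    y = ScaleClip.apply(x, 2.0, 1.0)
    y.sum().backward()
    print(x.grad)
```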
. Collaborate closely with traders, quants, and analysts to identify opportunities and turn ideas into production-ready code. Enhance our existing trading platform, improving performance, scalability, and automation. Work on integration with market data providers and exchange APIs (e.g., EPEX, Nord Pool, or similar). Contribute to backtesting frameworks … energy markets. Deep understanding of intraday power trading, including market dynamics, order books, and constraints. Familiarity with multithreading, real-time data processing, and performance tuning. Bonus: experience with Python, F#, or time-series databases (InfluxDB, kdb+, etc.). This would be a hybrid role, with …
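To illustrate the backtesting frameworks mentioned above in the loosest possible sense, here is a toy vectorised backtest in Python. The synthetic prices, the naive mean-reversion signal, and the look-back window are entirely hypothetical and have nothing to do with the firm's actual strategies or platform.

```python
import numpy as np
import pandas as pd

# Toy backtest: synthetic prices and a naive mean-reversion signal,
# purely to show the shape of a vectorised backtesting loop.
rng = np.random.default_rng(0)
prices = pd.Series(50 + rng.normal(0, 1, 500).cumsum(), name="price")

# Signal: long when price is below its 24-period mean, short when above.
rolling_mean = prices.rolling(24).mean()
position = np.sign(rolling_mean - prices).shift(1).fillna(0)  # trade on the next step

returns = prices.diff().fillna(0)
pnl = (position * returns).cumsum()

print(f"Final P&L (price units): {pnl.iloc[-1]:.2f}")
print(f"Max drawdown: {(pnl - pnl.cummax()).min():.2f}")
```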