that serve, process, and transform large quantities of data in the cloud. Minimum Qualifications: Experience with Python (or another object-oriented language) Experience building reliable, distributed applications for data processing or similar areas Hands-on experience developing cloud applications (e.g. AWS, GCP, Azure) Experience with technologies like BigQuery and Snowflake Preferred Qualifications: Experience writing testable and modular code Experience … working in a fast-paced environment, collaborating across teams and disciplines Experience designing, deploying, and maintaining distributed systems Data pipelines, data platforms, workflow orchestration, batch processing Experience building ML pipelines WHAT WE OFFER We are committed to creating a modern work environment that supports our employees and their loved ones. We offer many options of the best …
and scaling cloud-based data infrastructure. Key Responsibilities: Design, build, and optimize data pipelines using Airflow, DBT, and Databricks. Monitor and improve pipeline performance to support real-time and batch processing. Manage and optimize AWS-based data infrastructure, including S3 and Lambda, as well as Snowflake. Implement best practices for cost-efficient, secure, and scalable data processing. Enable and …
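Whatever the orchestrator, pipelines like those described above reduce to extract, transform, and load steps. A minimal stdlib-only sketch of that pattern, with entirely hypothetical data and function names (a real pipeline would run each step as an Airflow task reading from and writing to systems like S3 or Snowflake):

```python
# Illustrative batch ETL step: extract -> transform -> load.
# All names and data here are hypothetical placeholders.

def extract():
    # Stand-in for pulling raw rows from a source system.
    return [{"user": "a", "amount": "10.5"},
            {"user": "b", "amount": "not-a-number"},
            {"user": "c", "amount": "4.0"}]

def transform(rows):
    # Cast types and drop malformed records.
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"user": row["user"],
                            "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return cleaned

def load(rows, sink):
    # Stand-in for writing to a warehouse table; returns rows loaded.
    sink.extend(rows)
    return len(rows)
```

Keeping the steps as small pure functions like this is what makes pipelines "testable and modular": each stage can be unit-tested in isolation before being wired into a scheduler.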
Nottingham, Nottinghamshire, East Midlands, United Kingdom
COMPUTACENTER (UK) LIMITED
IT infrastructure to work in triage support across multiple customers within multiple technologies. The roles will focus on resolving incidents, increasing the first-time fix rates in addition to Batch Management and Event Monitoring. You'll be encouraged to use your skill and knowledge to proactively improve the services we offer to add quality and value to our customers. You'll … levels within the variety of technologies the Team supports. You will be responsible for the BAU service for multiple technologies within the Command Second Line team, working to manage batch processing, event management, incidents, problems, requests, and changes. We value strong communication skills and the ability to engage with colleagues and stakeholders at all levels, as the account … this role is aligned to is a high-flying and flagship one for us. On a day-to-day basis this looks like: Technical 80% Batch Monitoring and Management. Event/Alert Monitoring and Management. Incident Management - respond to service calls and resolve incidents to ensure SLA targets are achieved. Patch Management. Increase the First Time Fix rate. Adhere …
passion for working in: Java Spring Boot Developing and using enterprise APIs Various testing methodologies System design at high scale and commercial experience with: SQL and NoSQL databases Async processing Cloud native applications Working in a Continuous Delivery environment Modern observability practices Nice to have Not vital, but you'll have the edge if you also have experience with … Grafana Prometheus Kotlin or at least the willingness to learn it Batch-processing data pipelines or have worked in: an eCommerce organisation a shipping/logistics/exports organisation What you bring Agile: Test-Driven Development, collaboration and continuous delivery are your preferred engineering practices? We take the best bits of Lean, Scrum and Kanban too. Architecture: In …
working with modern tools in a fast-moving, high-performance environment. Your responsibilities may include: Build and maintain scalable, efficient ETL/ELT pipelines for both real-time and batch processing. Integrate data from APIs, streaming platforms, and legacy systems, with a focus on data quality and reliability. Design and manage data storage solutions, including databases, warehouses, and lakes. … Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and security measures in line with best practices and regulatory standards. Develop observability and anomaly detection tools to support Tier … maintaining scalable ETL/ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical …
ministers and the nation's digitised history from the 11th to the early 21st century are all to be preserved in the system. We currently operate a series of batch-processing workflows constructed predominantly in Scala, Java, XSLT and XML Schema running in a Linux environment. You will influence the future design of the system and will be …
be doing: Learn how our billing and CRM systems support critical customer and operational processes, including billing cycles, debt management, and financial updates. Support system operations such as overnight batch processing, data reporting, and issue troubleshooting while building your technical knowledge. Develop your understanding of regulatory frameworks like GDPR and industry best practices in IT security and customer …
Integration). Expert-level understanding of API design principles (REST, SOAP), API management platforms, and microservices architecture. Proficient in various integration patterns (e.g., ETL, EAI, B2B, real-time streaming, batch processing). Strong experience with cloud-native integration services and hybrid cloud environments. Expertise in various data formats (JSON, XML, EDI) and messaging protocols (Kafka, RabbitMQ, JMS).
solutions) is highly desirable. Familiarity with insurance products and their configuration within a PAS. Proficiency in troubleshooting and resolving complex technical issues. Experience with system integrations (APIs, web services, batch processes), as PAS often integrates with many other core insurance systems (CRM, Claims, Billing, Data Warehouses). Understanding of SaaS and cloud technologies if the PAS is cloud-based.
moving from a successful proof-of-concept to beta-product phase. You will have the opportunity to learn new techniques and algorithms at the cutting edge of Natural Language Processing (NLP), specifically in compensating for Large Language Model limitations. You will also have the opportunity to innovate and contribute to algorithm development. Dr G.A. McHale, Technical Director, AI & Data Science … About the Team The team is led by someone with significant AI experience in bio-inspired architectures, reinforcement learning, expert systems, scheduling, meta-heuristics, robotics, and natural language processing (including LLMs). We have recruited an experienced scientific computing developer with a strong mathematics background in theoretical physics, responsible for distributed systems and GPU optimisation of AI algorithms. The … hardware-specific optimisation related to memory and CPU utilisation, hybrid-LLM system optimisation, and possible pre-filtering algorithms to reduce computational loads. Success is measured by significant improvements in batch processing and inference costs. About You This is a hands-on programming role. Expertise in LLMs (programming or hybrid systems) is essential. You have a background in Telecommunications …
Eclipse Broking, Applied Epic, or similar platforms. Familiarity with insurance product configuration within PAS. Proficient in troubleshooting and resolving complex technical issues. Experience with system integrations (APIs, web services, batch processes). Knowledge of SaaS/cloud technologies and basic network configuration. Understanding of enterprise security principles and secure development practices. Experience in API development and management (RESTful, SOAP …
Experience working with AI tools for data analysis and process automation is preferred. Demonstrated experience in business process design and requirements analysis. Knowledge of and experience with Web APIs and Batch Processes is preferred. Ability to promote and maintain a positive and inclusive work environment with project team members, co-workers, management, and vendors by behaving, collaborating, and communicating in …
solutions) is highly desirable. Familiarity with insurance products and their configuration within a PAS. Proficiency in troubleshooting and resolving complex technical issues. Experience with system integrations (APIs, web services, batch processes), as PAS often integrates with many other core insurance systems (CRM, Claims, Billing, Data Warehouses). Technical skills: Cloud Platforms and SaaS/PaaS solutions Application configuration PAS …
and integrating enterprise systems. Key areas of expertise include dependency injection, inversion of control, aspect-oriented programming, functional programming, test-driven development, data access frameworks, transaction management frameworks, and batch processing. Proficiency in DevOps methodologies and tools is also essential for this role. Required education: None. Preferred education: Bachelor's Degree. Required technical and professional expertise: No details supplied.
a BA coupled with at least 2 years' experience of Nice Actimize. - Analyzing the current installation of the Actimize WLF solution at the client site - Recommend improvements to data ingestion and batch process performance - Recommend optimal threshold settings and scoring configuration that adhere to the client requirements - Improve detection and alert generation Must Have: - Minimum 2 years' experience supporting Nice Actimize …
to support security operations. Working with development teams to optimize scripts for performance and maintainability. Expert knowledge of JCL, including: The creation, maintenance and troubleshooting of JCL scripts for batch processing in a secure manner. Ensuring that JCL scripts adhere to security best practices and company standards. Collaboration with the operations team to support job execution and resolve …
Safety): DCS software configuration generated from client's design documentation such as user requirement specifications or control philosophies Alarm rationalisation and understanding of EEMUA 191 Continuous Process control and Batch process software programming Functional Safety SIL classifications and SIF calculations Supervise or implement the detailed design, by producing function block programming (using engineering software tools), implement the Safety system …
management UI applications Development experience with databases such as Snowflake, Sybase IQ, and distributed systems like HDFS Interaction with business users to resolve application issues Design and support of batch processes using scheduling tools for data calculation and distribution Leadership in SDLC activities including design, code review, and deployment Skills and Experience Bachelor's degree in Computer Science, Mathematics … Apache Airflow Open to working with proprietary GS technologies such as Slang/SECDB Understanding of compute resources and performance metrics Knowledge of distributed computing, including parallel and cloud processing Experience managing the full project lifecycle About Goldman Sachs Founded in 1869, Goldman Sachs is a leading global investment banking, securities, and investment management firm headquartered in New York …
risk management actions Develop software for calculations using databases like Snowflake, Sybase IQ and distributed HDFS systems. Interact with business users for resolving issues with applications. Design and support batch processes using scheduling infrastructure for calculation and distributing data to other systems. Oversee junior technical team members in all aspects of Software Development Life Cycle (SDLC) including design, code … and the ability to interpret performance metrics (e.g., CPU, memory, threads, file handles). Knowledge and experience in distributed computing - parallel computation on a single machine like DASK, Distributed processing on Public Cloud. Knowledge of SDLC and experience in working through entire life cycle of the project from start to end ABOUT GOLDMAN SACHS At Goldman Sachs, we commit …
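Single-machine parallel computation of the kind mentioned above can be sketched with the standard library alone; DASK offers a similar map-style model, adding lazy task graphs and larger-than-memory collections on top. The `risk_metric` function here is a hypothetical placeholder for a per-position calculation:

```python
# Fan a calculation out across a worker pool and collect results
# in input order. ThreadPoolExecutor is shown for portability; for
# CPU-bound pure-Python work, ProcessPoolExecutor (same interface)
# sidesteps the GIL.
from concurrent.futures import ThreadPoolExecutor

def risk_metric(position):
    # Hypothetical per-position calculation.
    return position * position

def compute_parallel(positions, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves input order regardless of completion order.
        return list(pool.map(risk_metric, positions))
```

The same map-over-a-pool shape is what scales out to distributed processing on public cloud: the pool is replaced by a cluster scheduler, but the user-facing code changes little.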
for risk management UI Experience with databases such as Snowflake, Sybase IQ, and distributed systems like HDFS Effective communication with business users to resolve issues Design and support of batch processes with scheduling tools Leadership experience in SDLC including design, review, and deployment Skills and Experience Bachelor's in Computer Science, Mathematics, Electrical Engineering, or related field 3+ years … process scheduling platforms like Apache Airflow Willingness to work with proprietary technologies like Slang/SECDB Understanding of compute resources and performance metrics Knowledge of distributed computing and parallel processing Experience with SDLC and managing projects end-to-end Goldman Sachs is committed to diversity and inclusion, offering various benefits and support programs. We are an equal opportunity employer …