working with modern tools in a fast-moving, high-performance environment. Your responsibilities may include: building and maintaining scalable, efficient ETL/ELT pipelines for both real-time and batch processing; integrating data from APIs, streaming platforms, and legacy systems, with a focus on data quality and reliability; designing and managing data storage solutions, including databases, warehouses, and lakes; … leveraging cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling: monitoring, troubleshooting, and optimizing data pipelines to ensure performance and cost efficiency; implementing data governance, access controls, and security measures in line with best practices and regulatory standards; developing observability and anomaly detection tools to support Tier … maintaining scalable ETL/ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical …
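As an illustrative sketch only (the endpoint, bucket, and field names below are hypothetical, not taken from the listing), a minimal Python batch ETL step of the kind this role describes might extract from an API, apply a basic data-quality check, and land the result in object storage:

```python
import json
from datetime import date, datetime, timezone

import boto3      # AWS SDK for Python
import requests   # simple HTTP client for the extract step

API_URL = "https://api.example.com/v1/orders"   # hypothetical source API
BUCKET = "example-data-lake"                    # hypothetical S3 bucket


def run_batch_extract() -> str:
    """Extract one batch from an API, validate it, and load it to S3."""
    response = requests.get(API_URL, params={"date": date.today().isoformat()}, timeout=30)
    response.raise_for_status()
    records = response.json()

    # Basic data-quality gate: drop records missing required fields.
    clean = [r for r in records if r.get("order_id") and r.get("amount") is not None]

    key = f"raw/orders/{datetime.now(timezone.utc):%Y/%m/%d}/batch.json"
    boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=json.dumps(clean))
    return key
```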
systems are written in Elixir, but we use small amounts of Python and Java where vendor SDKs require it. Designing, developing, and maintaining real-time data streaming and batch-processing workloads. Providing on-call support as part of our team-wide rotation. Our on-call rotation is split across US and UK time zones, ensuring coverage whilst keeping … parallelism for speed/space performance trade-offs. Bonus experience: exchange-traded financial instruments; statistics, discrete mathematics, linear algebra; problem-solving and proof construction. For more information about DRW's processing activities and our use of job applicants' data, please view our Privacy Notice at . California residents, please review the California Privacy Notice for information about certain legal rights.
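For a flavour of the streaming side in Python — the language the listing says is used where vendor SDKs require it — here is a hedged sketch of a minimal consumer (confluent-kafka is one common client, and the broker, group, and topic names are invented, not DRW's):

```python
from confluent_kafka import Consumer

# Hypothetical configuration; real brokers, group, and topic would differ.
consumer = Consumer({
    "bootstrap.servers": "broker1:9092",
    "group.id": "market-data-consumers",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["market-data"])


def process(payload: bytes) -> None:
    # Placeholder for real downstream handling (parse, enrich, persist).
    print(payload[:80])


try:
    while True:
        msg = consumer.poll(timeout=1.0)   # block up to 1s for the next message
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        process(msg.value())
finally:
    consumer.close()
```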
they operate. Hands-on experience in Spark with Scala (or Java). Production-scale, hands-on experience writing data pipelines using Spark or another distributed real-time/batch processing framework. Strong skill set in SQL/databases. Strong understanding of messaging technologies like Kafka, Solace, MQ, etc. Experience writing production-scale applications that use caching technologies. Understanding of …
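For illustration (the paths and column names are hypothetical), a production-style Spark batch pipeline of the kind asked for here — written with PySpark rather than Scala — follows a familiar read-transform-write shape:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trade-enrichment").getOrCreate()

# Hypothetical input path and schema; a real pipeline would read from the firm's lake.
trades = spark.read.parquet("s3://example-bucket/raw/trades/")

enriched = (
    trades
    .filter(F.col("quantity") > 0)                        # basic data-quality filter
    .withColumn("notional", F.col("quantity") * F.col("price"))
    .groupBy("symbol")
    .agg(F.sum("notional").alias("total_notional"))
)

# Write the aggregate back out for downstream consumers.
enriched.write.mode("overwrite").parquet("s3://example-bucket/curated/notional_by_symbol/")
```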
What You'll Do: Design, implement, and maintain robust ETL pipelines for diverse data sources including Facebook, Google Analytics, and payment providers. Build and optimise data models for both batch processing and real-time streaming using AWS Redshift, S3, and Kafka. Lead data acquisition, processing, and provisioning initiatives aligned with evolving business needs. Conduct customer behaviour and …
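As a sketch of the batch-loading side (the table, bucket, and IAM role are hypothetical), landing staged S3 data into Redshift typically relies on the COPY command, issued here via psycopg2:

```python
import psycopg2  # standard PostgreSQL driver, commonly used for Redshift too

# Hypothetical connection details and object names.
conn = psycopg2.connect(
    host="example-cluster.abc123.eu-west-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="...",
)

COPY_SQL = """
    COPY marketing.facebook_events
    FROM 's3://example-bucket/staging/facebook_events/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
    FORMAT AS PARQUET;
"""

with conn, conn.cursor() as cur:
    cur.execute(COPY_SQL)   # Redshift loads the staged files in parallel
conn.close()
```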
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
proactive mindset to help ensure the stability, efficiency, and resilience of applications across esure. Good working knowledge of Java is needed for this position. What you’ll do: Support batch processing and carry out proactive analysis across squads and services. Review Java code and suggest improvements to enhance quality and performance. Investigate and resolve production incidents, problems, and … change requests. Offer expertise on business processes for projects and production support issues. Maintain core insurance systems and related infrastructure like Kubernetes and SOLR. Improve batch efficiency and reduce costs, including audit-ready data archiving strategies. Provide data analysis to support business decisions across Operations and Claims. Deliver patching, database fixes, and business-critical data correction services. Qualifications: What …
physical server provisioning, especially in strategic data centres. Platform Resilience & Observability: designing for uptime, performance, and root cause analysis. Web Services & APIs: used for integration with 24+ LBGI systems. Batch Processing: understanding of batch suite performance and scheduling constraints. RPA & Automation (Batching): familiarity with robotic process automation. Log Aggregation & Analysis: tooling for log interrogation and root cause …
background - proven track record in enterprise data solutions. Experience with ETL processes and data transformation, preferably using Databricks. Strong foundation in data warehousing architectures and dimensional modeling. Familiarity with batch processing from relational database sources. Communication & Collaboration Skills of the Data Engineer: outstanding stakeholder engagement abilities across technical and business audiences. Strong relationship-building skills with experience managing …
risk management actions • Develop software for calculations using databases like Snowflake, Sybase IQ, and distributed HDFS systems. • Interact with business users to resolve issues with applications. • Design and support batch processes that use scheduling infrastructure to calculate and distribute data to other systems. • Oversee junior technical team members in all aspects of the Software Development Life Cycle (SDLC), including design, code … and the ability to interpret performance metrics (e.g., CPU, memory, threads, file handles). • Knowledge and experience in distributed computing - parallel computation on a single machine (e.g., DASK) and distributed processing on public cloud. • Knowledge of the SDLC and experience working through the entire life cycle of a project from start to end. ABOUT GOLDMAN SACHS At Goldman Sachs, we commit …
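To illustrate the single-machine parallelism the posting mentions (the file pattern and column names are hypothetical), Dask can parallelise a pandas-style computation across local cores:

```python
import dask.dataframe as dd

# Hypothetical input: many CSV part-files read as one logical dataframe.
positions = dd.read_csv("data/positions_*.csv")

# Lazily define an aggregation; Dask builds a task graph rather than computing now.
exposure = positions.groupby("desk")["market_value"].sum()

# .compute() executes the graph in parallel across local cores.
print(exposure.compute())
```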
and review code, paired programming, and debugging code-related performance issues, SQL tuning, etc. Experience with AWS services such as S3, RDS, Aurora, NoSQL, MSK (Kafka). Experience with batch processing/ETL using Glue Jobs, AWS Lambda, and Step Functions. Experience with designing bespoke & tailored front-end solutions (GUI-based) using open-source technology such as React …
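A minimal, hedged sketch of the Lambda piece of such a batch flow (the bucket and key names are invented), written as a handler that a Step Functions state might invoke:

```python
import json

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """One ETL step: read a staged object named in the Step Functions input,
    apply a trivial transform, and write the result onward."""
    bucket = event["bucket"]   # e.g. "example-staging-bucket" (hypothetical)
    key = event["key"]         # e.g. "incoming/batch-0001.json" (hypothetical)

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    records = json.loads(body)

    # Hypothetical transform: keep only active records.
    active = [r for r in records if r.get("status") == "active"]

    out_key = key.replace("incoming/", "processed/")
    s3.put_object(Bucket=bucket, Key=out_key, Body=json.dumps(active))

    # The returned payload becomes the next state's input in Step Functions.
    return {"bucket": bucket, "key": out_key, "count": len(active)}
```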
Intelligence, with 5+ years in a BI leadership role in a global or matrixed organisation. Proven expertise in modern BI architecture (Data Lake, EDW, Streaming, APIs, Real-Time & Batch Processing). Demonstrated experience delivering cloud-based analytics platforms (Azure, AWS, GCP). Strong knowledge of data governance, cataloguing, security, automation, and self-service analytics. Excellent leadership and …
City of London, London, United Kingdom Hybrid / WFH Options
Burns Sheehan
field. Experience: proven experience in ML model lifecycle management. Core Competencies: Model lifecycle: you’ve got hands-on experience with managing the ML model lifecycle, including both online and batch processes. Statistical Methodology: you have worked with GLMs and other machine learning algorithms and have in-depth knowledge of how they work. Python: you have built and deployed production …
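For a flavour of the GLM work mentioned (synthetic data, illustrative only), statsmodels fits a Poisson GLM of the classic frequency-modelling shape in a few lines:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic example: counts vs. a single exposure-like feature.
rng = np.random.default_rng(0)
x = rng.uniform(0, 2, size=500)
y = rng.poisson(lam=np.exp(0.3 + 0.8 * x))           # true log-link relationship

X = sm.add_constant(x)                                # intercept + feature
model = sm.GLM(y, X, family=sm.families.Poisson())    # Poisson GLM with log link
result = model.fit()

print(result.params)   # estimates should sit near the true [0.3, 0.8]
```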
a culture of continuous learning, collaboration, and high performance. Hold team members accountable for their performance and professional growth. Technical Expertise: bring a deep understanding of data technologies, including batch processing, stream processing, storage, analytics, machine learning, and AI. Collaborate with Engineering teams to ensure the data platform meets technical requirements and supports product initiatives. Evaluate and … a product team in a data-focused environment. Strong technical background and a degree in Computer Science, Engineering, or a related field. Hands-on experience with data technologies, including batch processing, stream processing, data warehousing, and analytics platforms. Proven track record of successfully leading product strategies that drive significant business outcomes. Experience in managing complex product portfolios … Strong people management skills with experience in mentoring and developing Product Managers. Ability to inspire and lead cross-functional teams toward common goals. Deep understanding of data architectures, data processing frameworks, and data modelling. Familiarity with technologies such as AWS and other cloud-based data services. Knowledge of data privacy laws and compliance requirements (e.g., GDPR, CCPA). Deep …
mindset with strong organizational skills and attention to detail. Familiarity with tools like JIRA or ServiceNow. Certifications in ITIL, Linux, or cloud platforms (e.g., AWS, Azure). Experience with batch processing, data interfaces, or monitoring tools. Exposure to automation or scripting. Learn more about the LexisNexis Risk team and how we work. We are committed to providing a …
and scaling cloud-based data infrastructure. Key Responsibilities: Design, build, and optimize data pipelines using Airflow, DBT, and Databricks. Monitor and improve pipeline performance to support real-time and batch processing. Manage and optimize AWS-based data infrastructure, including S3 and Lambda, as well as Snowflake. Implement best practices for cost-efficient, secure, and scalable data processing. Enable and …
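As a hedged sketch of the orchestration layer (the DAG id, task names, script paths, and dbt invocation are hypothetical; Airflow 2.4+ style), a DAG coordinating a dbt build after an extract task might look like:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_daily_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_to_s3",
        bash_command="python /opt/pipelines/extract.py",   # hypothetical script
    )
    transform = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt",   # hypothetical project path
    )

    extract >> transform   # run the dbt models only after the extract succeeds
```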
ensure alignment with business goals. Support upgrades, vendor management, and regulatory compliance, including GDPR. Investigate and resolve complex technical issues, ensuring minimal service disruption. Maintain and monitor overnight batch processes and key operational reports using tools like Talend, Skybot, SOA, and Progress. Take ownership of environment refreshes, code deployments, and configuration changes across multiple system instances. Contribute to …
to develop their skills in a secure, enterprise-grade setting. Key Responsibilities: Deliver 2nd line support across a broad range of IT infrastructure technologies. Monitor alerts and events, manage batch processing, and respond proactively to incidents. Work to meet SLAs through timely resolution of technical issues. Support change requests, patching, and technical escalations. Maintain detailed records using call …
Employment Type: Permanent
Salary: £28000 - £31000/annum Pension, Healthcare + more
Willenhall, West Midlands, United Kingdom Hybrid / WFH Options
Parser Limited
Sr QA Manual Engineer We are looking for a detail-oriented Functional QA Engineer to join a data-centric project focused on real-time processing and system reliability. You will be responsible for validating business-critical workflows, ensuring data accuracy in CSV imports/exports, and maintaining traceability for audit and compliance purposes. The ideal candidate is analytical, structured … testing with sprint goals. Perform regression, integration, and exploratory testing. Manage defect reporting and tracking, while ensuring traceability and timely resolution. Contribute to the testing of real-time and batch processes using CSV and other input formats. Ensure proper documentation of test cases, results, and traceability matrices to support audit and compliance needs. What you'll bring to us …
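To make the CSV-validation work concrete (the file layout and required columns are hypothetical), a functional check of an export might be written with pytest and the standard csv module:

```python
import csv
import io

import pytest

REQUIRED_COLUMNS = {"transaction_id", "amount", "currency"}   # hypothetical schema


def validate_export(csv_text: str) -> list[dict]:
    """Parse a CSV export and enforce the basic contract used in testing."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    assert rows, "export must contain at least one data row"
    assert REQUIRED_COLUMNS <= set(rows[0]), "missing required columns"
    for row in rows:
        float(row["amount"])   # amounts must be numeric
    return rows


def test_valid_export_passes():
    sample = "transaction_id,amount,currency\n1,9.99,GBP\n"
    assert validate_export(sample)[0]["currency"] == "GBP"


def test_non_numeric_amount_fails():
    bad = "transaction_id,amount,currency\n1,abc,GBP\n"
    with pytest.raises(ValueError):
        validate_export(bad)
```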