working with modern tools in a fast-moving, high-performance environment. Your responsibilities may include: Build and maintain scalable, efficient ETL/ELT pipelines for both real-time and batch processing. Integrate data from APIs, streaming platforms, and legacy systems, with a focus on data quality and reliability. Design and manage data storage solutions, including databases, warehouses, and lakes. … Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling: Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and security measures in line with best practices and regulatory standards. Develop observability and anomaly detection tools to support Tier … maintaining scalable ETL/ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical …
City of London, London, United Kingdom Hybrid / WFH Options
Intec Select
Responsibilities: Design, implement, and maintain robust ETL pipelines to integrate data from diverse sources, including APIs like Facebook, Google Analytics, and payment providers. Develop and optimize data models for batch processing and real-time streaming using tools like AWS Redshift, S3, and Kafka. Lead efforts in acquiring, storing, processing, and provisioning data to meet evolving business requirements. …
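To make the extract/transform/load split in listings like this concrete, here is a minimal, hedged sketch of a batch ETL step in plain Python. The field names and data-quality rule are illustrative assumptions, not any employer's schema; a production pipeline would fetch from the real APIs and load into Redshift or S3 rather than an in-memory CSV.

```python
import csv
import io
from datetime import datetime, timezone

def extract(raw_records):
    """Extract step: in production this would call an API
    (e.g. Facebook or Google Analytics); here we accept
    already-fetched JSON-like dicts."""
    return list(raw_records)

def transform(records):
    """Transform step: normalise field names, parse timestamps,
    and drop rows missing an amount (a simple data-quality gate).
    All field names here are hypothetical."""
    out = []
    for r in records:
        if r.get("amount") is None:
            continue  # data-quality rule: skip incomplete rows
        out.append({
            "event_time": datetime.fromtimestamp(
                r["ts"], tz=timezone.utc).isoformat(),
            "source": r["source"].lower(),
            "amount_pence": round(float(r["amount"]) * 100),
        })
    return out

def load(rows):
    """Load step: serialise to CSV; a real pipeline would COPY
    this into Redshift or write it to S3 instead."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["event_time", "source", "amount_pence"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

raw = [
    {"ts": 1700000000, "source": "Facebook", "amount": "12.50"},
    {"ts": 1700000060, "source": "GA", "amount": None},  # dropped by gate
]
csv_out = load(transform(extract(raw)))
print(csv_out)
```

The point of the three-function shape is that each stage can be tested, retried, and monitored independently, which is what "robust" tends to mean in these role descriptions.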
scale data migration and modernization initiatives. Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable. Define and implement real-time and batch processing pipelines for complex use cases involving streaming analytics, ML feature engineering, and automation. Act as a trusted advisor to senior technical and business stakeholders across industries (e.g. … BigQuery, Cloud Spanner, and Bigtable. Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer. Hands-on experience with one or more of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data …
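The "real-time and batch processing pipelines" mentioned above typically rest on windowed aggregation. This is not Apache Beam code, just a plain-Python sketch of the fixed-window model that Dataflow/Beam applies to streaming data: each event is assigned to the window containing its timestamp, then aggregated per window.

```python
from collections import defaultdict

def fixed_windows(events, window_size_s):
    """Assign each (timestamp_s, value) event to a fixed window,
    keyed by the window's start time. This mirrors the fixed-window
    model Dataflow/Apache Beam uses, sketched in plain Python."""
    windows = defaultdict(list)
    for ts, value in events:
        window_start = ts - (ts % window_size_s)
        windows[window_start].append(value)
    return dict(windows)

# Hypothetical event stream: (seconds, value)
events = [(0, 1), (30, 2), (65, 3), (119, 4), (120, 5)]
per_window_sums = {start: sum(vals)
                   for start, vals in fixed_windows(events, 60).items()}
print(per_window_sums)  # one sum per 60-second window
```

In a real Beam pipeline the same grouping is expressed declaratively (a `FixedWindows` transform followed by a combiner), and the runner handles late data and watermarks, which this sketch deliberately ignores.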
and review code, paired programming, and debugging code-related performance issues, SQL tuning, etc. Experience with AWS services such as S3, RDS, Aurora, NoSQL, MSK (Kafka). Experience with batch processing/ETL using Glue Jobs, AWS Lambda, and Step Functions. Experience with designing bespoke and tailored front-end solutions (GUI based) using open-source technology such as React …
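The Glue/Lambda/Step Functions combination above is fundamentally about chaining discrete steps while passing state between them. As a hedged illustration (the step names and state keys are invented, and real Step Functions are defined declaratively in Amazon States Language, not in Python), the pattern reduces to:

```python
def extract(state):
    # Hypothetical step: in AWS this might be a Glue job or a Lambda.
    state["raw"] = [1, 2, 3, 4]
    return state

def transform(state):
    # Keep even values and scale them; stands in for a Glue transform.
    state["clean"] = [x * 10 for x in state["raw"] if x % 2 == 0]
    return state

def load(state):
    # Record how many rows were "loaded"; a real step would write out.
    state["loaded_count"] = len(state["clean"])
    return state

def run_pipeline(steps, state=None):
    """Run steps in order, threading one state dict through them:
    the pass-the-state pattern a Step Functions state machine
    expresses declaratively."""
    state = {} if state is None else state
    for step in steps:
        state = step(state)
    return state

result = run_pipeline([extract, transform, load])
print(result)
```

What Step Functions adds over this loop is managed retries, branching, and per-step timeouts, which is why it is preferred over hand-rolled orchestration for production ETL.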
from either a technical or business area. Comfortable making presentations covering business, technical or sales. Detailed knowledge of: project phasing and reporting; on-site delivery; systems support; operational procedures; batch processing; system sizing and capacity planning; system backup, contingency and disaster recovery; regional roll-outs. Education & Preferred Qualifications: Third-level qualification essential, ideally a technical Bachelor's degree. Fluency …
Business Intelligence, with 5+ years in a BI leadership role in a global or matrixed organisation. Proven expertise in modern BI architecture (Data Lake, EDW, Streaming, APIs, Real-Time & Batch Processing). Demonstrated experience delivering cloud-based analytics platforms (Azure, AWS, GCP). Strong knowledge of data governance, cataloguing, security, automation, and self-service analytics. Excellent leadership and …
Conduct descriptive and exploratory analysis to uncover patterns, trends, and potential applications across a wide range of datasets. Maintain robust and scalable data infrastructure to support real-time and batch processing workflows. Provide domain expertise to help shape financial knowledge representation in LLM outputs. Develop and apply custom quality metrics to assess and guide model performance. Actively participate … Computer Science, Finance, Economics, Statistics, Engineering). 3+ years of professional experience in Data Science or Quantitative roles within finance. Proficiency in Python and SQL with experience in data processing libraries such as pandas/numpy. Experience with machine learning frameworks and statistical analysis tools. Familiarity with financial data sources, APIs, and data modelling techniques. Ability to design, implement …
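A "custom quality metric" for model outputs, as described above, can be as simple as the share of predicted figures that land within a relative tolerance of reference values. This is a toy sketch, not any firm's actual metric; the function name, tolerance, and data are all illustrative.

```python
def numeric_accuracy(predicted, reference, rel_tol=0.01):
    """Toy quality metric: fraction of predicted figures within
    rel_tol (relative error) of the non-zero reference values.
    Inputs are parallel lists of floats; names are illustrative."""
    hits = sum(
        1 for p, r in zip(predicted, reference)
        if r != 0 and abs(p - r) / abs(r) <= rel_tol
    )
    return hits / len(reference)

# Hypothetical model outputs vs. ground-truth financial figures
preds = [100.5, 99.0, 250.0]
refs = [100.0, 100.0, 200.0]
score = numeric_accuracy(preds, refs)
print(score)
```

Metrics of this shape are easy to track over time, which is what makes them useful for guiding model performance rather than just reporting it.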
a culture of continuous learning, collaboration, and high performance. Hold team members accountable for their performance and professional growth. Technical Expertise: Bring a deep understanding of data technologies, including batch processing, stream processing, storage, analytics, machine learning, and AI. Collaborate with Engineering teams to ensure the data platform meets technical requirements and supports product initiatives. Evaluate and … a product team in a data-focused environment. Strong technical background and a degree in Computer Science, Engineering, or a related field. Hands-on experience with data technologies, including batch processing, stream processing, data warehousing, and analytics platforms. Proven track record of successfully leading product strategies that drive significant business outcomes. Experience in managing complex product portfolios … Strong people management skills with experience in mentoring and developing Product Managers. Ability to inspire and lead cross-functional teams toward common goals. Deep understanding of data architectures, data processing frameworks, and data modelling. Familiarity with technologies such as AWS and other cloud-based data services. Knowledge of data privacy laws and compliance requirements (e.g., GDPR, CCPA). Deep …
City of London, London, United Kingdom Hybrid / WFH Options
The Curve Group
calls are logged, updated and followed through to resolution. 7. Familiarity with source code and change management. 8. Provide support for, perform, and monitor the daily End-of-Day batch processing operation for Flexcube. Key Skills/Experience: University degree educated; Computer Science or other related discipline is a must. Key Technical Skills: Oracle FLEXCUBE, WSO2 API platform …
City of London, London, United Kingdom Hybrid / WFH Options
FirstBank UK Limited
calls are logged, updated and followed through to resolution. 7. Familiarity with source code and change management. 8. Provide support for, perform, and monitor the daily End-of-Day batch processing operation for Flexcube. Key Skills/Experience: University degree educated; Computer Science or other related discipline is a must. Key Technical Skills: Oracle FLEXCUBE, WSO2 API platform …