London, England, United Kingdom Hybrid / WFH Options
Workato
AI & Intelligent Data Automation: Experience integrating AI/ML-driven insights into data management products to enhance data quality, lineage tracking, and transformation recommendations. Strong understanding of predictive analytics, anomaly detection, and semantic data enrichment for operational intelligence. Security, Governance & Observability: Deep knowledge of data security, compliance, and governance best practices for enterprise data platforms. Experience embedding …
London, England, United Kingdom Hybrid / WFH Options
Howden Group Holdings
effectively with stakeholders across data, governance, and infrastructure. Experience in defining quality strategies, driving process improvements, and implementing automation in large-scale data platforms. Knowledge of data observability and anomaly detection tools (preferred). Industry experience in financial services, insurance, or regulated environments (not essential, but preferred). This role is an excellent opportunity for a Lead …
and maintain scalable, modular pipelines using AWS services (Glue, Lambda, Step Functions, S3), supporting ingestion, transformation, and storage across key business domains. Data Quality & Governance: Implement automated data validation, anomaly detection, lineage, and auditability; enforce consistent naming, access controls, and compliance with GDPR and healthcare standards. Performance & Cost Optimisation: Tune pipelines and query layers (Glue, Athena) for …
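As a rough illustration of the kind of automated validation and volume anomaly check this responsibility describes, the sketch below assumes a pandas DataFrame already loaded from S3; the column names, baseline, and tolerance are hypothetical, and in practice this logic would typically run inside a Glue job or Lambda handler.

```python
import pandas as pd

def validate_batch(df: pd.DataFrame) -> list[str]:
    """Run basic data-quality checks on one ingested batch and return failure messages."""
    failures = []

    # Completeness: required columns must exist and contain no nulls.
    for col in ("record_id", "event_date", "amount"):  # hypothetical schema
        if col not in df.columns:
            failures.append(f"missing column: {col}")
        elif df[col].isna().any():
            failures.append(f"nulls found in column: {col}")

    # Uniqueness: the primary key must not contain duplicates.
    if "record_id" in df.columns and df["record_id"].duplicated().any():
        failures.append("duplicate record_id values")

    # Volume anomaly: compare row count to an assumed historical baseline.
    baseline_rows, tolerance = 10_000, 0.5
    if abs(len(df) - baseline_rows) > baseline_rows * tolerance:
        failures.append(f"row count {len(df)} deviates >50% from baseline {baseline_rows}")

    return failures

if __name__ == "__main__":
    sample = pd.DataFrame({
        "record_id": [1, 2, 2],
        "event_date": ["2024-01-01", "2024-01-01", "2024-01-01"],
        "amount": [10.0, None, 5.0],
    })
    print(validate_batch(sample))
```

Failures returned by a check like this would normally be written to an audit table or surfaced as alerts rather than printed.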
maintain, performance tune, and respond to incidents on our big data pipeline infrastructure. Build out observability and intelligent monitoring of data pipelines and infrastructure to achieve early and automated anomaly detection and alerting. Present your research and insights to all levels of the company, clearly and concisely. Build solutions to continually improve our software release and change …
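A minimal sketch of how automated anomaly detection on pipeline telemetry might work, assuming run durations are already collected as a list of floats; the metric choice and threshold are assumptions, not taken from the listing.

```python
import statistics

def is_anomalous(history: list[float], latest: float, threshold: float = 3.0) -> bool:
    """Flag the latest pipeline run duration if it deviates strongly from recent history."""
    if len(history) < 5:
        return False  # not enough history to judge
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

# Example: durations (seconds) of recent runs, plus today's run.
recent = [310.0, 295.0, 305.0, 320.0, 300.0, 315.0]
print(is_anomalous(recent, 940.0))  # True -> raise an alert / page on-call
```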
and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and security measures in line with best practices and regulatory standards. Develop observability and anomaly detection tools to support Tier 1 systems. Work with engineers and business teams to gather requirements and translate them into technical solutions. Maintain documentation, follow coding standards … technical and non-technical teams. Additional Strengths: Experience with orchestration tools like Apache Airflow. Knowledge of real-time data processing and event-driven architectures. Familiarity with observability tools and anomaly detection for production systems. Exposure to data visualization platforms such as Tableau or Looker. Relevant cloud or data engineering certifications. What we offer: A collaborative and transparent … Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines (Docker, GitHub Actions, Jenkins, etc.)? Which option best describes your experience building observability and automated anomaly detection tooling for data pipelines? What best describes your current location and working rights status? By submitting your application, you confirm that you have read …
clean, reliable, query-ready datasets used by commercial, operational, and marketing teams. Defining and evolving the data architecture to support scale, cost-efficiency, and data quality. Building validation layers, anomaly detection, and alerting to ensure trustworthy, production-grade pipelines. Working with infrastructure-as-code tools (e.g. CDK, Terraform) to manage data infrastructure securely and repeatably. Driving …
Leeds, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
scalable, agnostic testing frameworks for use across agile delivery teams. Promote best practices including Test-Driven Development (TDD), Behaviour-Driven Development (BDD), and AI/ML-based testing for anomaly detection and performance validation. Mentor and upskill test and engineering teams in modern, automation-first testing approaches. Collaborate across teams to ensure quality and consistency throughout the …
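As one hedged example of what data-driven performance validation can look like, the pytest-style test below compares new latency samples against a baseline using a simple standard-deviation threshold; the baseline values and sigma limit are purely illustrative.

```python
import statistics

# Hypothetical baseline latencies (ms) captured from a previous release.
BASELINE_LATENCIES_MS = [120, 118, 125, 130, 122, 119, 127]

def check_performance(sample_ms: list[float], sigma: float = 3.0) -> bool:
    """Return True if the new sample's mean latency is within sigma stdevs of baseline."""
    base_mean = statistics.mean(BASELINE_LATENCIES_MS)
    base_std = statistics.pstdev(BASELINE_LATENCIES_MS)
    return statistics.mean(sample_ms) <= base_mean + sigma * base_std

def test_latency_has_not_regressed():
    new_sample = [121, 126, 124, 119, 123]  # would come from a load-test run
    assert check_performance(new_sample)
```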
models using relevant datasets to achieve optimal performance. Implement strategies for continuous model improvement and optimization. Data Mining & Analysis: Apply data mining techniques such as clustering, classification, regression, and anomaly detection to discover patterns and trends in large datasets. Analyze and preprocess large datasets to extract meaningful insights and features for model training. MLOps - Deployment into production …
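As a small, hedged example of the anomaly detection side of this kind of work, the sketch below uses scikit-learn's IsolationForest on synthetic data; the contamination rate and feature dimensions are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic feature matrix: mostly normal points plus a few injected outliers.
rng = np.random.default_rng(42)
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 3))
outliers = rng.normal(loc=8.0, scale=1.0, size=(5, 3))
X = np.vstack([normal, outliers])

# Fit an unsupervised anomaly detector; contamination is the assumed outlier rate.
model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(X)  # 1 = inlier, -1 = flagged anomaly

print("flagged indices:", np.where(labels == -1)[0])
```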
models using relevant datasets to achieve optimal performance. Implement strategies for continuous model improvement and optimization. Data Mining & Analysis: Apply data mining techniques such as clustering, classification, regression, and anomaly detection to discover patterns and trends in large datasets. Analyze and preprocess large datasets to extract meaningful insights and features for model training. Code Review and Documentation …
annotations. Experience in using Bloomberg Data, Bloomberg Terminal, and/or enterprise financial data products. Interest in solving problems and developing data-driven methodologies for high precision & high recall anomaly detection. Past project experience using the Agile/Scrum project management methodology. Does this sound like you? Apply if you think we're a good match. We'll get …
industry certifications (e.g. DAMA CDMP, DCAM, etc.). Keen interest and familiarity with generative AI frameworks. Interest in solving problems and developing data-driven methodologies for high precision & high recall anomaly detection. Past project experience using the Agile/Scrum project management methodology. Does this sound like you? Apply if you think we're a good match. We'll get …
Route Optimization: Use data analytics to identify the most efficient routes. Optimized routing not only shortens travel time but also makes ETA predictions more reliable. 5. Machine Learning for Anomaly Detection: Implement machine learning algorithms to detect anomalies that could affect delivery times, such as unexpected traffic jams or vehicle breakdowns, and adjust ETAs accordingly. 6. Sensor …
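A minimal sketch of the kind of anomaly check point 5 describes, using a robust rolling-median/MAD score over per-route transit times; the sample values and threshold are invented for illustration.

```python
import pandas as pd

# Hypothetical transit times (minutes) observed on a single delivery route.
transit = pd.Series([42, 45, 44, 43, 46, 44, 95, 45, 43, 44])

# Robust anomaly score: deviation from a rolling median, scaled by the rolling
# median absolute deviation (MAD); a large score suggests a traffic jam or breakdown.
window = 5
rolling_median = transit.rolling(window, min_periods=3).median()
deviation = (transit - rolling_median).abs()
mad = deviation.rolling(window, min_periods=3).median()
score = deviation / (mad + 1e-9)  # epsilon avoids division by zero

print(transit[score > 5])  # flagged observations -> widen the ETA or reroute
```

A median/MAD score is used here instead of mean/standard deviation because it is less distorted by the very outliers it is trying to detect.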
of different data sources into our Lakehouse (Databricks on Azure Data Lake) and its architecture. Be responsible for the reliability and quality of data in the Data Lake (including anomaly detection, data quality checks, reconciliations, access, permission, and retention management, PII treatment, and back-up/restoration plans). Set up and manage platform technologies to support …
extract business insights from technical results and effectively communicate them to a non-technical audience. Job Responsibilities: Design and architect end-to-end solutions in the AI domain, ranging from anomaly detection use cases to chat-with-your-data applications using GenAI. Proactively develop an understanding of key business problems and processes. Execute tasks throughout the model development … NLP: tokenization, embeddings, sentiment analysis, basic transformers for text-heavy datasets. Experience with LLM & Prompt Engineering, including tools like LangChain, LangGraph, and Retrieval-Augmented Generation (RAG). Experience in anomaly detection techniques, algorithms, and applications. Excellent problem-solving, communication (verbal and written), and teamwork skills. Preferred qualifications, capabilities, and skills: Experience with deep learning frameworks such as …
technical and non-technical teams. Bonus Points For: Experience with orchestration tools like Apache Airflow. Familiarity with real-time data processing and event-driven systems. Knowledge of observability and anomaly detection in production environments. Exposure to visualization tools like Tableau or Looker. Relevant cloud or data engineering certifications. What’s Offered: Competitive salary with two annual discretionary …
for problem-solving. Preferred Skills: Interest and familiarity with generative AI frameworks. Knowledge of data governance and management, supported by industry certifications (e.g., DAMA CDMP, DCAM). Experience with anomaly detection methodologies. Experience with Agile/Scrum project management methodologies. If this sounds like you, apply to join our team. We will contact you with the next …
core ledger systems, transaction processors, credit bureaus, open banking APIs, and third-party providers. Implement and maintain robust processes to ensure data accuracy, completeness, and reliability. Implement automated checks, anomaly detection, and data validation routines. Deliver production-ready datasets that power credit decision engines, risk models, and affordability assessments in real time. Work closely with credit risk …
retrieval systems or search engine development. Experience with OSINT or intelligence analysis tools and methodologies. Understanding of data privacy regulations and security considerations. Experience with time series analysis and anomaly detection. Familiarity with A/B testing and experimentation design. Experience working with PAI sources and analytical techniques. Knowledge of graph databases and network analysis for relationship mapping. Open …
/Java/Scala/etc.) Experience working with imbalanced datasets and applying appropriate techniques Experience with time series data, including preprocessing, feature engineering, and forecasting Experience with outlier detection and anomaly detection Experience working with various data types: text, image, and video data Familiarity with AI/ML cloud implementations (AWS, Azure, GCP, NVidia …
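As an illustrative (not prescriptive) sketch of time series preprocessing, feature engineering, and outlier detection in pandas, assuming a daily metric with one injected spike and hypothetical window sizes:

```python
import pandas as pd

# Hypothetical daily metric with one injected spike; real data would come from a query.
idx = pd.date_range("2024-01-01", periods=60, freq="D")
y = pd.Series(range(60), index=idx, dtype="float64") + 10.0
y.iloc[30] = 200.0  # simulated anomaly

frame = pd.DataFrame({"y": y})

# Lag and rolling-window features commonly engineered before fitting a forecaster.
frame["lag_1"] = frame["y"].shift(1)
frame["lag_7"] = frame["y"].shift(7)
frame["rolling_mean_7"] = frame["y"].shift(1).rolling(7).mean()
frame["rolling_std_7"] = frame["y"].shift(1).rolling(7).std()

# Residual-based outlier flag: distance from the trailing mean in units of trailing std.
resid = (frame["y"] - frame["rolling_mean_7"]).abs()
frame["outlier"] = resid > 3 * frame["rolling_std_7"]

print(frame[frame["outlier"]])
```

The features are shifted by one step so each row only uses past information, which keeps the same frame usable for forecasting without leakage.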
the integrity and security of our payment systems. Your role will involve leveraging data analytical skills, Python, SQL, and various analytical tools to identify suspicious activities, enhance fraud detection strategies, and support decision-making processes. Your New Day-to-day Will Involve: Fraud Detection and Analysis: Analyse transaction data to identify patterns and trends indicative of … fraudulent activities. Investigate and monitor real-time transaction alerts to detect potential card fraud. Develop and implement data-driven fraud detection rules and models. Analyse false positives to refine and optimize fraud detection systems. Ensure that the company's financial practices comply with statutory regulations and legislation. Data Management and Analysis: Extract, clean, and manage large … practices for fraud detection. Analytical Mindset: Strong problem-solving skills with a keen eye for detail. Ability to interpret complex data and turn it into actionable insights. Familiarity with anomaly detection and pattern recognition techniques. Communication and Stakeholder Management: Excellent communication skills with the ability to translate complex data findings into clear insights for non-technical stakeholders. …
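As a hedged sketch of what simple data-driven fraud rules might look like in Python/pandas, with entirely hypothetical transactions, column names, and thresholds (a production system would pair such rules with trained models and real-time alerting):

```python
import pandas as pd

# Hypothetical card transactions; column names are illustrative only.
tx = pd.DataFrame({
    "card_id": ["A", "A", "A", "B", "B"],
    "amount": [25.0, 30.0, 950.0, 12.0, 14.0],
    "country": ["GB", "GB", "RO", "GB", "GB"],
    "ts": pd.to_datetime([
        "2024-05-01 10:00", "2024-05-01 10:05", "2024-05-01 10:07",
        "2024-05-01 11:00", "2024-05-01 11:30",
    ]),
})

home_country = "GB"  # assumed home market
tx = tx.sort_values(["card_id", "ts"])

# Rule features: unusually large amount per card, foreign country, rapid repeat use.
tx["amount_zscore"] = tx.groupby("card_id")["amount"].transform(
    lambda s: (s - s.mean()) / (s.std(ddof=0) + 1e-9)
)
tx["foreign"] = tx["country"] != home_country
tx["rapid_repeat"] = tx.groupby("card_id")["ts"].diff().dt.total_seconds() < 300

# Combine rules into a simple additive risk score.
tx["risk_score"] = (
    (tx["amount_zscore"] > 2).astype(int)
    + tx["foreign"].astype(int)
    + tx["rapid_repeat"].astype(int)
)
print(tx[tx["risk_score"] >= 2])  # candidates for manual review / alerting
```

Reviewing which rules fired on false positives is what drives the tuning loop the listing describes.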
and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and security measures in line with best practices and regulatory standards. Develop observability and anomaly detection tools to support Tier 1 systems. Collaboration & Continuous Improvement: Work with engineers and business teams to gather requirements and translate them into technical solutions. Maintain documentation … technical and non-technical teams. Additional Strengths: Experience with orchestration tools like Apache Airflow. Knowledge of real-time data processing and event-driven architectures. Familiarity with observability tools and anomaly detection for production systems. Exposure to data visualization platforms such as Tableau or Looker. Relevant cloud or data engineering certifications. What we offer: A collaborative and transparent …