Good start-up. This is a 3-month contract initially, with scope for extension. In this role, you will design, build, and maintain robust data pipelines using streaming and batch-processing technologies and help lead the modernization of their data lakehouse. You will implement and manage secure cloud infrastructure using AWS, championing a culture of data quality, security, and technical excellence.

Key Responsibilities:
- Architect and maintain data pipelines (streaming & batch)
- Build and manage scalable data storage solutions (data lakehouse)
- Design and maintain secure AWS cloud infrastructure
- Implement and manage CI/CD and DevOps pipelines
- Champion data quality, security, and best practices
- Collaborate with cross-functional teams
- Implement and manage MLOps capabilities

Essential Skills:
- Advanced Python
London, South East, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
per day (Inside IR35) | Remote (must be UK based) | 12 months

We're looking for an experienced D365 Finance & Operations Developer to lead the development of custom journal processing extensions. This role suits someone with strong X++ skills and a deep understanding of D365 Finance, especially in high-volume, cross-functional environments.

Key Requirements:
- 5+ years in D365 … and CI/CD pipelines.
- Proficient in Power BI: report creation, configuration, and export.
- Skilled in workflows, security roles, SSRS, and unit testing (SysTest).
- Familiar with DMF, OData, batch processing, and error handling.
- Ability to optimize performance for high-volume transactions.
- Strong documentation and collaboration skills.

Nice to have:
- Integration experience across systems and platforms.
Full-time, remote UK-based role. ***Unfortunately, we can't consider candidates who require sponsorship.***

Key Responsibilities:
- Research and deploy LLM-based solutions (e.g., LangChain, Mastra.ai, Pydantic) for document processing, summarization, and clinical Q&A systems.
- Develop and optimize predictive models using scikit-learn, PyTorch, TensorFlow, and XGBoost.
- Design robust data pipelines using tools like Spark and Kafka for real-time and batch processing.
- Manage the ML lifecycle with tools such as Databricks, MLflow, and cloud-native platforms (Azure preferred).
- Collaborate with engineering teams to ensure scalable, secure ML infrastructure aligned with compliance standards (e.g., ISO27001).
- Ensure data governance, particularly around sensitive healthcare data.
- Share best practices and stay current with developments in AI, ML, and LLMs.