a dynamic environment. Build, train, and deploy state-of-the-art models (e.g., deep learning, NLP, computer vision, reinforcement learning, or relevant domain-specific architectures). Design infrastructure for data ingestion, experimentation, model versioning, and monitoring. Collaborate with product, design, and DevOps teams to integrate AI features into our platform. Stay current with AI research, open-source tools, and … frameworks to maintain a leading edge. Support and mentor junior engineers as the team grows. Key Responsibilities Implement end-to-end AI pipelines: data collection/cleaning, feature engineering, model training, validation, and inference. Rapidly prototype novel models using PyTorch, TensorFlow, JAX, or equivalent. Productionize models in cloud/on-prem environments (AWS/GCP/Azure) with containerization … Docker/Kubernetes). 2. Data & Infrastructure Build and maintain scalable data pipelines (ETL/ELT) and data lakes/warehouses. Establish best practices for data labeling, versioning, and governance. Implement MLOps processes: CI/CD for model training, automated testing, model drift detection, and continuous monitoring. Evaluate applicability of new research and tools to improve
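The listing above names an end-to-end workflow (data collection/cleaning, feature engineering, training, validation, inference) plus MLOps concerns such as drift detection. As a purely illustrative sketch of that kind of pipeline — synthetic data, scikit-learn, and placeholder choices throughout, not anything specified by this employer — it could look like the following:

```python
# Illustrative only: a minimal train/validate/infer loop of the kind the listing
# describes. The dataset, features, and metric choices are hypothetical placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Stand-in for "data collection/cleaning": synthetic features with some missing values.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X[np.random.default_rng(0).random(X.shape) < 0.05] = np.nan

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

# Feature engineering + model training packaged as one reusable pipeline object.
pipeline = Pipeline([
    ("impute", SimpleImputer(strategy="median")),
    ("scale", StandardScaler()),
    ("model", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X_train, y_train)

# Validation before anything is promoted to inference/serving.
val_auc = roc_auc_score(y_val, pipeline.predict_proba(X_val)[:, 1])
print(f"validation AUC: {val_auc:.3f}")

# Inference on new, unseen rows (here just a slice of the validation set).
print(pipeline.predict(X_val[:5]))
```

In a production setting the fitted pipeline would typically be versioned and monitored against the training distribution for drift, in line with the MLOps responsibilities listed above; those steps are omitted here for brevity.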
respond to RFIs/RFPs, and prepare sales proposals. What We're Seeking MSc in computer science, engineering, physics, statistics, mathematics, operations research, or natural sciences, with hands-on data science experience including AI/Gen AI and machine learning. Experience analyzing large data sets, data cleaning, and statistical analysis. Proven experience with at least three machine … learning algorithms (e.g., neural networks, logistic regression, random forests). Proficiency with Java and Python, understanding of data structures, algorithms, and software design patterns. Experience with AI/Gen AI frameworks like TensorFlow or PyTorch. Experience with cloud platforms such as AWS SageMaker or Azure Machine Learning. Ability to translate business problems into solutions. Strong communication skills; bilingualism … work/life balance, resource groups, and social events. Why Join FICO? At FICO, you will develop your career in one of the fastest-growing fields in tech - Big Data analytics. You'll contribute to our mission to help businesses improve decision-making using AI, machine learning, and optimization. FICO makes a difference worldwide: Credit Scoring - Used by
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Low Carbon Contracts Company
learning, and modern software engineering. They will be naturally curious and self-directed, with the ability to move between exploration and production delivery. A strong advocate for AI and data best practices, they will promote responsible use of emerging technologies and contribute to building digital confidence and security awareness across teams. Key Responsibilities Design, develop, and deploy secure and … scalable AI and ML solutions, ensuring alignment with organisational priorities and standards. Improve and maintain existing tools and models; enhancing accuracy, data ingestion, and user experience. Responsible for contributing to the development of reusable frameworks, engineering standards, and documentation that guide sustainable AI development. Work collaboratively with analysts, subject matter experts, and stakeholders to translate business needs into technically … high-quality technical documentation and user-facing material. Skills, Knowledge and Expertise A good first degree or higher in a highly numerate subject (e.g. computer science, engineering, mathematics, or data science). Minimum 2 years' experience in Python development, including use of scientific and data libraries such as NumPy, pandas, SciPy, or PySpark. Experience working with machine learning
purpose (Java, Cloud computing, HDFS, Spark, S3, ReactJS, Sybase IQ among many others). A glimpse of the interesting problems that we engineer solutions for includes acquiring high quality data, storing it, performing risk computations in a limited amount of time using distributed computing, and making data available to enable actionable risk insights through analytical and response user interfaces. … memory and CPU utilization. • Perform statistical analyses to identify trends and exceptions related to Market Risk metrics. • Build internal and external reporting for the output of risk metric calculation using data extraction tools, such as SQL, and data visualization tools, such as Tableau. • Utilize web development technologies to facilitate application development for front end UI used for risk management … like Snowflake, Sybase IQ and distributed HDFS systems. • Interact with business users for resolving issues with applications. • Design and support batch processes using scheduling infrastructure for calculating and distributing data to other systems. • Oversee junior technical team members in all aspects of the Software Development Life Cycle (SDLC), including design, code review and production migrations. Skills And Experience • Bachelor's
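The listing above describes "risk metric calculation" only at a high level. As a generic, hedged illustration of the kind of computation involved — simulated P&L figures and textbook historical-simulation formulas, not this firm's actual data or methodology — a one-day VaR and expected-shortfall calculation in Python might look like this:

```python
# Illustrative only: generic historical-simulation VaR and expected shortfall.
# The P&L history below is simulated; nothing here reflects the employer's models.
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical daily P&L history for one portfolio, roughly two years of trading days.
daily_pnl = rng.normal(loc=0.0, scale=250_000.0, size=504)

def historical_var(pnl: np.ndarray, confidence: float = 0.99) -> float:
    """Loss threshold exceeded on (1 - confidence) of historical days, as a positive number."""
    return -np.percentile(pnl, 100.0 * (1.0 - confidence))

def expected_shortfall(pnl: np.ndarray, confidence: float = 0.99) -> float:
    """Average loss on the days that breach the VaR threshold."""
    var = historical_var(pnl, confidence)
    tail = pnl[pnl <= -var]
    return -tail.mean() if tail.size else var

print(f"99% 1-day VaR: {historical_var(daily_pnl):,.0f}")
print(f"99% expected shortfall: {expected_shortfall(daily_pnl):,.0f}")
```

In practice these metrics would be computed per desk and aggregated across a far larger, distributed dataset, with the results fed into the SQL/Tableau reporting the listing mentions.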
the best platform, people, and partners together to create limitless opportunities for growth. For more information, visit About Choreograph: A Leading WPP Media Brand Choreograph is WPP's global data products and technology company. We're on a mission to transform marketing by building the fastest, most connected data platform that bridges marketing strategy to scaled activation. We … work with agencies and clients to transform the value of data by bringing together technology, data and analytics capabilities. We deliver this through the Open Media Studio, an AI-enabled media and data platform for the next era of advertising. We're endlessly curious. Our team of thinkers, builders, creators and problem solvers are over … strong, across 20 markets around the world. WHO WE ARE LOOKING FOR Our team consists of 100+ engineers, designers, data, and product people, working in small inter-disciplinary teams closely with creative agencies, media agencies, and with our customers, to develop and scale our DCO platform, a leading digital advertising optimization suite that delivers amazing outcomes for brands and
colleagues, internal partners, and business stakeholders. Ensure solutions align with security and compliance standards. What We're Looking For: Strong programming skills in Python, with solid grounding in OOP, data structures, and algorithms. Experience with DevOps tools and practices, version control (Git), and CI/CD pipelines. Comfortable with Agile methodologies and collaborative team-based development. … interview will be contacted. Proactive Appointments Limited operates as an employment agency and employment business and is an equal opportunities organisation. We take our obligations to protect your personal data very seriously. Any information provided to us will be processed as detailed in our Privacy Notice, a copy of which can be found on our website http://
re excited if you have 10+ years of experience building large scale and low latency distributed systems Command of Java, C++ or Go/Golang Solid understanding of algorithms, data structures, performance optimization techniques, object-oriented programming, multi-threading, and real-time programming Experience with distributed caching, SQL/NoSQL, and other databases is a plus Experience … with Big Data and cloud services such as AWS/GCP is a plus Experience in the advertising domain is a big plus B.S. or M.S. degree in Computer Science, Engineering, or equivalent Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include … global footprint, and how we've grown, visit . By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
Maidenhead, Berkshire, United Kingdom Hybrid / WFH Options
Spectrum IT Recruitment
the use of Azure DevOps or Jira 2+ years of practical experience with Agile development methodologies Experience working on public cloud native applications Computer science fundamentals: OOP, design patterns, data structures & algorithms Ideally you will have studied Computer Science, Software Engineering, Mathematics or a similar STEM degree. Please hit apply and upload your CV or email me at (url
platforms, Docker and Kubernetes. Familiarity with more than one of: C#, Java, Python, and C++ Databases such as: MSSQL, Postgres, Redis Kafka/RabbitMQ or similar event-based platforms Data structures and design/analysis of algorithms Not required, but a bonus Fixed Income products and Interest Rate derivatives (including Risk, PnL attribution, scenario analysis, etc.) Possesses the … Strong attention to detail, with a track record of leading and driving projects to completion. For more information about DRW's processing activities and our use of job applicants' data, please view our Privacy Notice at https://drw.com/privacy-notice.
teams. Our work includes a suite of web-based applications and Enterprise Platform Services (EPS) that facilitate integration between third-party systems. Given that we handle sensitive and confidential data, maintaining the highest standards of quality is essential. About the role As a lead, you consider yourself a platform services automation developer, developing innovative solutions using modern software … Proven track record of designing and implementing successful test automation strategies Experience in using various testing tools and technologies Deep understanding of software architecture, object-oriented design principles, and data structures Experience in JavaScript/TypeScript and the Cypress framework Experience in MySQL-like databases and SQL Demonstrated ability to be proactive, self-driven, and make practical trade-offs … global footprint, and how we've grown, visit . By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.
London, South East England, United Kingdom Hybrid / WFH Options
Citigroup Inc
support and maintain high-quality code using TDD principles. Technical Skills: Strong knowledge of Core Java (JDK 21 and above) and multithreading concepts. Proficiency in designing and implementing efficient data structures and algorithms. Experience with SQL/NoSQL databases (e.g., Oracle, MySQL, Postgres, MongoDB, Cassandra). Familiarity with messaging systems (e.g., Kafka, Tibco, Solace). Solid understanding of … and Unix/Linux environments. Qualifications: A wealth of experience building business-critical applications in a full-stack manner. Strong understanding of computer science fundamentals, including algorithms, complexity, and data structures. Proven track record of managing and implementing successful projects. Ability to work under pressure and meet tight deadlines. Bachelor's degree/University degree or equivalent experience. Additional
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
Client Server
good knowledge of modern software engineering best practices and have experience across the full software development lifecycle You have a thorough understanding of Computer Science fundamentals such as OOP, Data Structures, Design Patterns, Algorithms You're excited to join a start-up in a role that you can shape and influence What's in it for you: As
users and stakeholders and influence product design evolution and delivery strategies Required Qualifications Experience as a Software Engineer/Developer using Java and/or Python Clear understanding of Data Structures and Object-Oriented Principles Experience building horizontally scalable software using Cloud-native development or Container Orchestration tools such as Kubernetes Spring Framework including Core, Integration, Batch, JDBC
Team Overview: The CloudKDB team is revolutionizing the architecture of our existing market and trade capture system to enable it to scale and ingest an ever-increasing volume of data sets. The team's vision extends beyond data capture, encompassing the development of innovative applications and advanced analytics directly on top of this data. Development Value: This is … an exceptional opportunity to tackle big data challenges at a Terabyte scale, working in close collaboration with business stakeholders to deliver impactful, data-driven solutions. You will be part of a team managing 50+ servers hosting 200+ globally distributed datasets, providing critical insights to over 600 users. Comprehensive training will be provided on the KDB+ database, and you … fostering a culture of continuous improvement. System Optimization: Continually improve the software development lifecycle and the overall quality of the product. Provide support to users in accessing and querying data, ensuring a seamless user experience. Qualifications: Proven ability to lead and mentor a team of developers, fostering a collaborative and high-performing environment. Strong understanding of computing fundamentals: concurrency
implementing novel machine learning and deep learning methods applied to somatic genomics. This includes identifying research problems that could be addressed through structured or unstructured, complex omics and imaging data and developing appropriate models and analytical solutions. Responsibilities: Develop and implement novel machine learning and deep learning methods to address key challenges in somatic genomics, ranging from imaging data analysis to variant interpretation. Extract research and business value from complex, unstructured somatic genomics data and metadata. Optimize large-scale data preparation, enhance analytics platforms, and industrialize validated analytical methods in collaboration with the data engineering team. Uncover novel biological insights into disease, spanning complex and rare diseases, and develop methods to validate new drug targets. … technical audiences. Key Qualifications: PhD (or equivalent experience) in Machine Learning, Computational Biology, Bioinformatics, or a related quantitative field. Strong programming skills with a solid understanding of algorithms and data structures. Proficiency in Python with hands-on experience using open-source ML frameworks such as scikit-learn, PyTorch, TensorFlow, or Keras. Extensive experience with Machine Learning and Deep Learning
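The qualifications above name scikit-learn and PyTorch without describing any particular method. As a deliberately toy, hedged sketch — synthetic features, hypothetical labels, no real genomic data or any method attributed to this role — a cross-validated variant classifier in scikit-learn could look like this:

```python
# Illustrative only: a toy variant-classification example using scikit-learn,
# one of the frameworks named in the listing. All features and labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_variants, n_features = 2_000, 12

# Hypothetical per-variant features (e.g. read depth, allele fraction, conservation score).
X = rng.normal(size=(n_variants, n_features))
# Hypothetical labels: 1 = likely somatic driver, 0 = likely passenger or artifact.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_variants) > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```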
5+ years of hands-on experience with Microsoft Dynamics 365, specializing in developing Power Pages, Power Apps, and Dataverse-based solutions, with a strong understanding of Dataverse application and data structures. Confident in creating and managing solution CI/CD pipelines using Azure DevOps. Extensive experience in customizing Dynamics 365, with a particular focus on CRM (Sales and Marketing … of end-to-end connectivity (network/proxy) between devices and cloud services to support seamless integration. Understanding of Azure security principles and best practices to ensure compliance and data protection. Strong attention to detail and a customer-focused approach to solution delivery. Microsoft certifications related to Dynamics 365, Power Platform, or Azure are highly valued. Experience with Azure … Data Factory and MS Fabric is an advantage. Competencies & Aptitude Fast learner with the ability to quickly adapt to new technologies and implement solutions effectively. Analytical and problem-solving mindset, capable of identifying patterns, understanding root causes, and formulating effective solutions. Strong communication skills, both written and verbal, ensuring clear documentation and stakeholder engagement. Detail-oriented with a structured
Nottingham, Nottinghamshire, East Midlands, United Kingdom
In Technology Group Limited
Job Title: Oracle Fusion Data Architect Location: Nottingham (3 Days onsite) Salary: £80,000 - £90,000 DOE The Role Oracle Fusion Data Architect to lead the data strategy, architecture, integration and migration activities for Oracle Fusion Cloud implementation. Should have knowledge of Oracle Fusion data models across multiple modules (Financials, SCM, EPM, etc.), strong expertise in … ETL, data governance and cloud data integration tools. Key responsibilities: Define and lead the data architecture strategy for Oracle Fusion Cloud applications across various business domains Design end-to-end data solutions, including data modeling, integration architecture and migration plans for on-premise to cloud transitions Work closely with functional teams and implementation partners to … map legacy data to Fusion data structures Lead data conversion activities using tools like Oracle FBDI, ADFdi, HDL and REST APIs Ensure data quality, integrity and compliance with governance policies and regulatory standards Collaborate with enterprise architects, DBAs and infrastructure teams to optimize data performance and security Develop and maintain data lineage, metadata and
banking, has recently acquired FullCircl. This position will be within our FullCircl brand, established in 2021 through the merger of Artesian Solutions and DueDil. FullCircl subsequently acquired W2 Global Data Solutions to further enhance its capabilities. FullCircl connects the insight you need when it matters most. We partner with more than 500 of the UK's leading banks, insurers … development of best practices and standards for the team. Challenge the team to always break work down into the smallest possible units. Drive the team to be high-performing. Data Plan, design, and implement scalable data pipelines. Develop and implement data models and algorithms to support data science and machine learning initiatives. Optimize data storage … and retrieval systems for maximum performance. Continuously monitor and improve data solutions to meet client needs. Software Plan, design, and implement secure, scalable APIs and backend services. Optimize APIs and backend services for performance. Monitor and improve the performance of APIs and backend services. Develop enterprise-level software with high performance and availability. Technical Requirements Leadership Previous experience in
people to unlock their digital potential! To get a sneak peek into our culture, find us on In this role, you will be responsible for planning and overseeing the Data Services Data Engineering and Machine Learning teams engaged in enterprise-wide data projects to ensure they are completed in a timely fashion and within budget. You will … stakeholders informed the entire way. This role will be supporting the charge in implementing D&A technologies and principles and will act as a single point of contact for data engineering processes to ensure the team is delivering impactful and useful solutions. Job Responsibilities Leads both operational and directional aspects for the data engineering team Make high-judgement … and enable the team to deliver on their commitments Builds a team with healthy dynamics Upholds department and company policies and reinforces them when necessary Champion clean, simple, methodical, and ethical data engineering practices Create and maintain optimal data pipeline architecture Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater