cutting-edge quantitative tools and infrastructure used by our front-office traders and quants. Responsibilities: Design, develop, and implement quantitative applications using Python, leveraging libraries such as NumPy, Pandas, SciPy, and others. Partner with quants and traders to understand their needs and translate them into efficient and robust software solutions. Develop and maintain test frameworks to ensure the quality …
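For illustration only, here is a minimal sketch of the kind of quantitative helper and pytest-style test such a role might involve; the function, figures, and test are assumptions, not taken from the posting.

```python
# Hypothetical sketch (not from the posting): a small NumPy-based pricing helper
# plus a pytest-style test, illustrating the kind of quantitative application and
# test framework the role describes.
import numpy as np


def present_value(cashflows: np.ndarray, rate: float) -> float:
    """Discount a series of annual cash flows at a flat rate."""
    periods = np.arange(1, len(cashflows) + 1)
    return float(np.sum(cashflows / (1 + rate) ** periods))


def test_present_value_zero_rate_sums_cashflows():
    # With a zero discount rate, the PV is simply the sum of the cash flows.
    cashflows = np.array([100.0, 100.0, 100.0])
    assert present_value(cashflows, rate=0.0) == 300.0


if __name__ == "__main__":
    print(present_value(np.array([50.0, 50.0, 1050.0]), rate=0.03))
```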
a quantitative emphasis in mathematics, engineering, or computer science. Minimum of 5 years of experience in Python development, with a strong understanding of object-oriented programming principles. Experience with pandas, numpy, matplotlib, scipy, and web frameworks like Flask or FastAPI is a plus. Good understanding of relational databases such as Oracle and MS SQL Server. Previous experience in front-office …
engineering, ideally in front-office finance, fintech, high-frequency trading, or technology-driven startups. Expert-level Python skills: designing, debugging, and optimizing complex applications. Very strong data capabilities with Pandas, NumPy, SQL, real-time data processing, and ETL. Familiarity with DevOps practices including Linux, cloud platforms, CI/CD, and containerization (Docker/Kubernetes desirable). Proven ability as a live …
and risk analysis projects. Drive innovation and resource efficiency, contributing to the continuous development of their consulting offering and analytical capabilities as the company grows. Skills & Experience: Python (incl. pandas, numpy, FastAPI, Dash/Plotly); database development, e.g. SQL, PostgreSQL, SQLAlchemy, data warehousing, ETL pipelines; cloud computing & DevOps, e.g. AWS (EC2, Lambda, S3), Docker, CI/CD, serverless architecture; frontend …
writing testable, modular code. Strong understanding of data structures, data modeling, and software architecture. Data Science Library Knowledge: deep understanding of key Data Science and Machine Learning libraries (e.g., pandas, NumPy, scikit-learn, TensorFlow), with a preference for PySpark experience. Model Productionisation: experience in taking Machine Learning models from development to production. CI/CD and MLOps Experience: familiarity with …
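As a rough illustration of the "development to production" step mentioned above, here is a minimal sketch (dataset, model, and file name are assumptions) of training a scikit-learn pipeline and persisting the fitted artefact for serving.

```python
# Minimal, illustrative sketch: train a scikit-learn pipeline, evaluate it, and
# persist the fitted artefact so a separate serving process can load it later.
import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("model", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X_train, y_train)
print(f"held-out accuracy: {pipeline.score(X_test, y_test):.3f}")

# Persist the whole fitted pipeline; the serving side calls joblib.load("model.joblib")
joblib.dump(pipeline, "model.joblib")
```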
with LLM application design and deployment. Strong software engineering skills, including version control (Git), code reviews, and unit testing. Familiarity with common data science libraries and tools (e.g., NumPy, Pandas, Scikit-learn, Jupyter). Experience in setting up and managing continuous integration and continuous deployment pipelines. Proficiency with containerization technologies (e.g., Docker, Kubernetes). Experience with cloud services (e.g., AWS …
Cambridge, Oxford, Bristol, or Milton Keynes, England, United Kingdom Hybrid / WFH Options
Oscar Technology
senior client stakeholders. Tech Stack You'll Work With: BI Tools (Power BI, Tableau, Looker, Google Data Studio); Data Warehousing (Snowflake, BigQuery, Redshift, SQL Server); Languages (SQL, DAX, Python with Pandas, NumPy, Scikit-learn desirable); CRM Systems (Salesforce, HubSpot, Dynamics 365, Zoho); ETL Tools (dbt, Fivetran, Talend, Alteryx); AI/ML Frameworks, desirable (Azure ML, AWS SageMaker, Hugging Face, OpenAI APIs); Version …
appropriate dashboards. Mandatory: high proficiency in ETL, SQL, and database management. Experience with AWS services such as Glue, Athena, Redshift, Lambda, and S3. Python programming experience using data libraries such as pandas and NumPy. Interest in machine learning, logistic regression, and emerging solutions for data analytics. You are comfortable working without direct supervision on outcomes that have a direct impact on …
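As an illustration of the ETL work described above, here is a minimal, hypothetical pandas sketch; file names and columns are assumptions, and the output format is chosen simply because Glue/Athena catalogue Parquet well.

```python
# Hypothetical ETL sketch: extract a CSV, transform with pandas/NumPy, and load the
# result to Parquet for a warehouse/lake layer. File and column names are assumed.
import numpy as np
import pandas as pd

# Extract
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Transform: clean types, derive a revenue column, aggregate per day
orders["quantity"] = orders["quantity"].fillna(0).astype(int)
orders["revenue"] = orders["quantity"] * orders["unit_price"]
daily = (
    orders.groupby(orders["order_date"].dt.date)
    .agg(total_revenue=("revenue", "sum"), orders=("order_id", "count"))
    .reset_index()
)
daily["log_revenue"] = np.log1p(daily["total_revenue"])

# Load: write a columnar file (requires pyarrow or fastparquet installed)
daily.to_parquet("daily_orders.parquet", index=False)
```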
optimisation techniques, including supervised/unsupervised learning and operations research (e.g. linear and mixed-integer programming, heuristics). Proficient in Python (required), with experience using libraries such as scikit-learn, pandas, numpy, and Gurobi. Other programming languages are a plus. Solid experience with SQL, data engineering, and cloud-based tools (AWS preferred), as well as version control (Git) and experiment tracking (e.g. …
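To illustrate the operations-research side, here is a tiny linear program. It is a sketch only: the posting mentions Gurobi, but SciPy's open-source linprog is used here to keep the example self-contained, and the objective and constraints are invented.

```python
# Illustrative linear program: maximise 3x + 2y subject to simple resource limits,
# solved with SciPy's linprog (used in place of Gurobi purely for self-containment).
from scipy.optimize import linprog

# linprog minimises, so negate the objective to maximise 3x + 2y
c = [-3, -2]
A_ub = [[1, 1],   # x + y  <= 4
        [2, 1]]   # 2x + y <= 6
b_ub = [4, 6]

result = linprog(c, A_ub=A_ub, b_ub=b_ub,
                 bounds=[(0, None), (0, None)], method="highs")
print("optimal x, y:", result.x, "objective:", -result.fun)
```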
or mentorship. Have good communication skills. Nice to have: experience deploying LLMs and agent-based systems. Our technology stack: Python and associated ML/DS libraries (scikit-learn, NumPy, pandas, LightGBM, LangChain/LangGraph, TensorFlow, etc.); PySpark; AWS cloud infrastructure: EMR, ECS, ECR, Athena, etc.; MLOps: Terraform, Docker, Spacelift, Airflow, MLflow; Monitoring: New Relic; CI/CD: Jenkins, GitHub Actions …
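Since the stack above names MLflow for experiment tracking, a minimal sketch follows; the model, parameter, and run name are assumptions, and tracking defaults to a local store.

```python
# Illustrative MLOps sketch: log a scikit-learn run's parameter and metric with
# MLflow experiment tracking (local tracking store by default).
import mlflow
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

alpha = 0.5
with mlflow.start_run(run_name="ridge-baseline"):
    model = Ridge(alpha=alpha).fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    mlflow.log_param("alpha", alpha)
    mlflow.log_metric("test_mse", mse)
```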
pragmatic approach to working in cross-functional Agile squads. Your experience: Expert-level fluency in Python (required) with deep experience in ML, OR, and DS libraries (e.g. scikit-learn, pandas, numpy, Gurobi). Strong knowledge of machine learning and optimisation techniques, including supervised and unsupervised learning and operations research methods. Solid background in software engineering for data science products: version control …
data cycle. • Proven experience working with AWS data technologies (S3, Redshift, Glue, Lambda, Lake Formation, CloudFormation), GitHub, CI/CD. • Coding experience in Apache Spark, Iceberg, or Python (Pandas). • Experience in change and release management. • Experience in data warehouse design and data modelling. • Experience managing data migration projects. • Cloud data platform development and deployment. • Experience of performance tuning in …
McLean, Virginia, United States Hybrid / WFH Options
MITRE
D3.js. • Demonstrated ability to manipulate large financial datasets and time series data and perform calculations with at least one modern programming language, such as Python (using packages like scikit-learn, pandas, or dask), R (using packages like caret, dplyr, or data.table), or another modern language. • Ability to apply, modify, and formulate algorithms and processes to solve computational financial problems. • Desire and …
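For illustration of the time-series manipulation mentioned above, a small pandas sketch on synthetic data; the price process, window lengths, and statistics are assumptions.

```python
# Purely illustrative (synthetic data, assumed parameters): typical pandas work on a
# daily price series - simple returns, 21-day rolling volatility, and max drawdown.
import numpy as np
import pandas as pd

dates = pd.bdate_range("2024-01-01", periods=252)   # roughly one year of business days
prices = pd.Series(100 * np.exp(np.cumsum(np.random.normal(0.0002, 0.01, len(dates)))),
                   index=dates, name="close")

returns = prices.pct_change().dropna()                    # daily simple returns
rolling_vol = returns.rolling(21).std() * np.sqrt(252)    # annualised, 21-day window
max_drawdown = ((prices - prices.cummax()) / prices.cummax()).min()

print(pd.DataFrame({"return": returns, "vol_21d": rolling_vol}).tail())
print(f"max drawdown: {max_drawdown:.2%}")
```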
meaningful insights from structured and unstructured datasets of varying sizes. Track record of implementing impactful models that drive sustained business results. Proficiency in the Python data science tech stack (pandas, scikit-learn, NumPy, and visualisation libraries). Experience working in a Linux-based cloud environment (e.g. GCP, Azure, AWS). Experience using git version control. Communication, stakeholder management, and problem-solving …
experience with the ability to write ad-hoc and complex queries to perform data analysis. Experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will be able to develop solutions in a hybrid data environment (on-prem and cloud). Hands-on experience with developing data pipelines for structured, semi-structured …
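A minimal PySpark sketch of the kind of pipeline step described above follows; the paths, columns, and aggregation are assumptions for illustration and would normally run on a configured Spark cluster.

```python
# Hypothetical pipeline step: read raw records with PySpark, clean and aggregate
# them, and write Parquet for a downstream warehouse layer. Paths/columns assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-pipeline").getOrCreate()

raw = spark.read.option("header", True).csv("data/raw/events.csv")

cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)

daily_totals = (
    cleaned.groupBy("event_date")
           .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("events"))
)

daily_totals.write.mode("overwrite").parquet("data/curated/daily_totals/")
spark.stop()
```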
City of London, London, England, United Kingdom Hybrid / WFH Options
Atrium Workforce Solutions Ltd
grow your technical capabilities in a collaborative environment, this is an excellent opportunity to take the next step in your career. Essential: solid programming knowledge with Python; experience using Pandas, NumPy, Matplotlib, and PyTorch; strong understanding of SQL for querying and data manipulation; familiarity with CI/CD workflows and Git version control; detail-oriented and proactive problem solver; enthusiastic …
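For illustration of the PyTorch skills listed above, a minimal, self-contained training loop on synthetic data; the model, data, and hyperparameters are assumptions.

```python
# Minimal PyTorch sketch (synthetic data, assumed hyperparameters): fit a single
# linear layer to a noisy y = 3x + 1 relationship with SGD.
import torch
from torch import nn

torch.manual_seed(0)
X = torch.rand(256, 1)
y = 3 * X + 1 + 0.05 * torch.randn(256, 1)

model = nn.Linear(1, 1)
optimiser = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimiser.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimiser.step()

print("learned weight/bias:", model.weight.item(), model.bias.item())
```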
field. 6+ years of hands-on experience in developing and deploying machine learning or AI systems in production environments in financial services. Proficiency in Python and essential libraries (e.g., Pandas, NumPy, TensorFlow, PyTorch, PySpark). Solid foundation in software engineering best practices, including modular code design, version control, and documentation. Excellent communication and collaboration skills, with the ability to work …
of the art in machine learning algorithms for text analytics. Demonstrated experience with machine learning frameworks such as PyTorch, Keras, and TensorFlow. Demonstrated experience with data visualization tools (e.g., Tableau, Pandas, D3.js, ggplot). Demonstrated experience with NoSQL data stores such as MongoDB or DynamoDB. Demonstrated experience using Natural Language Processing tools such as spaCy, NLTK, Stanford CoreNLP, or Gensim. Natural …
with interfacing with customer stakeholders to align on requirements and technical implementations. Required Qualifications: Active Poly; minimum 3-5 years' experience with data-processing Python libraries such as PySpark, Pandas, and NumPy; experience with API development in Python using libraries such as FastAPI; experience with unit-testing frameworks in PyTest and mocking. Desired Qualifications: experience with Python ORM tools …
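To illustrate the FastAPI and PyTest combination named above, a minimal, hypothetical sketch; the endpoint and test are invented examples, and TestClient relies on httpx being installed.

```python
# Hypothetical sketch: a minimal FastAPI endpoint with a pytest-style unit test
# using FastAPI's TestClient (which uses httpx under the hood).
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()


@app.get("/square/{value}")
def square(value: int) -> dict:
    """Return the square of an integer path parameter."""
    return {"value": value, "squared": value * value}


client = TestClient(app)


def test_square_endpoint():
    response = client.get("/square/4")
    assert response.status_code == 200
    assert response.json() == {"value": 4, "squared": 16}
```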