modern C++20. Deep knowledge of performance optimization techniques, multi-threading, parallel processing, and efficient network utilization. Familiarity with Python and scientific libraries such as NumPy and Pandas (expertise not mandatory). Experience or strong interest in high-performance and big data systems, including data lakes or serverless architectures. A passion …
integration, modelling, optimisation and data quality. Exceptional understanding of Python. Experience developing in the cloud (AWS preferred). Solid understanding of libraries like Pandas and NumPy. Experience with data warehousing tools like Snowflake, Databricks, BigQuery. Familiar with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools. Commercial experience with performant …
architecture, and business intelligence. A good understanding of Python, including fundamental knowledge and a willingness to learn, along with familiarity with packages such as NumPy, pandas, and scikit-learn. Proficiency in SQL is required, along with experience in Unix and familiarity with scheduling tools like Tidal and Airflow. Good understanding …
Message bus/queue. OIDC, OAuth 2.0, JWTs. Preferred Technical Skills: Systems integration experience. Knowledge of Kubernetes, Kafka, Terraform, GitHub Actions, OpenTelemetry (OTEL), NumPy, and Pandas. Required Soft Skills: Experience leading an agile software development team (preferably in a Scrum environment). Comfort in challenging assumptions, asking difficult questions …
Node.js, JavaScript (JS), and TypeScript (TS). Statistical Knowledge: Solid understanding of statistical concepts and methodologies. Data Manipulation & Analysis: Proficiency with tools like Pandas, NumPy, and Jupyter Notebooks. Experience with data visualization tools such as Matplotlib, Seaborn, or Tableau.
of hands-on development experience in Python or another object-oriented programming language. Strong proficiency in Python’s data ecosystem, including libraries such as NumPy, Pandas, and Matplotlib. Experience working with cloud platforms (AWS preferred) and Infrastructure-as-Code tools. Solid understanding of relational databases and SQL. Proven track …
Experience with distributed systems, parallel computing, and high-performance processing of large datasets. Strong experience in data pipelines, working with tools such as Pandas, NumPy, and SQL/NoSQL databases. Proven experience working in fast-paced environments, ideally within trading, financial services, or high-frequency environments. Proficiency in developing RESTful …
skills with experience in backend frameworks like Flask, FastAPI, or Django. Ability to write efficient, reusable, and scalable code. Experience with libraries like NumPy, Pandas, SciPy, or any others commonly used in data processing or modelling. Strong understanding of data structures, algorithms, and complexity analysis. ✅ Mathematical or Computational …
3+ years of Python in production (e.g., Flask, Django, Gunicorn, SQLAlchemy, Psycopg2). Strong background in working with large datasets using tools like Pandas, NumPy, or Spark. At least 2 years' experience with machine learning, deep learning, or AI systems. Familiarity with key libraries and frameworks such as TensorFlow, Scikit …
developer with 5+ years working in Python or another object-oriented environment. Strong familiarity with tools and libraries for data processing and visualization (e.g., NumPy, Pandas, Matplotlib). Comfortable deploying applications in cloud ecosystems, particularly AWS, and using infrastructure automation tools. Experienced with building data workflows and managing databases; Airflow …
Skills & Experience: 5+ years of experience in Python development, ideally within financial services. Strong knowledge of Python frameworks (e.g. Flask, Django) and data libraries (Pandas, NumPy). Cloud-native experience, especially Microsoft Azure (Functions, Data Lake, etc.). Familiarity with DevOps practices (Git, CI/CD, testing frameworks). Experience with SQL/NoSQL …
/Computer Science academic background. 3-5 years of experience with Python (writing and managing code releases). Familiarity with numerical Python libraries, e.g. Pandas, NumPy. Firm understanding of version control, testing, and continuous integration. Understanding of and appreciation for best practices. Experience working with SQL databases and SQLAlchemy desired. We encourage …
internal teams. Tech: While they use a range of technologies internally, familiarity with open-source tools is highly valued. Languages & Libraries: Python, R, Pandas, NumPy, scikit-learn, XGBoost, LightGBM, statsmodels. NLP & ML Frameworks: spaCy, Hugging Face Transformers, TensorFlow or PyTorch (where applicable). Data Engineering & Pipelines: dbt, Airflow, SQL, Spark, Dask …
STEM degree (Maths, Stats, Computer Science, Engineering, etc.) from a top university. Strong foundation in statistics, probability, and applied mathematics. Proficiency in Python (Pandas, NumPy, Scikit-learn). Experience with cloud platforms (AWS, Azure, or GCP) for data processing or model deployment. Familiarity with SQL and relational databases. Exposure to …
GCP) to store and process data. Document workflows, pipelines, and transformation logic for transparency. Key Skills & Experience: Strong hands-on experience in Python (Pandas, NumPy, PySpark). Experience building ETL/ELT processes. Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies (e.g., Snowflake, Databricks). Understanding of …
ideas simply, and work well in cross-functional teams. Tech You’ll Work With. ML & Data Science: Python (primary language); TensorFlow, PyTorch, or Keras; NumPy, pandas; data pipelines (Azure Data Factory, Airflow, etc.); applied ML (NLP, CV, transformers, GANs, time series, etc.). Engineering & Cloud: Azure (or similar cloud platforms like …
and resource efficiency, contributing to the continuous development of their consulting offering and analytical capabilities as the company grows. Skills & Experience: Python (incl. pandas, NumPy, FastAPI, Dash/Plotly). Database development: e.g. SQL, PostgreSQL, SQLAlchemy, data warehousing, ETL pipelines. Cloud computing & DevOps: e.g. AWS (EC2, Lambda, S3), Docker, CI/…
R, Java, C++). Machine Learning & AI Frameworks: Experience with TensorFlow, PyTorch, Scikit-learn, or similar libraries. Data Manipulation & Analysis: Strong skills in Pandas, NumPy, and SQL for handling and analysing large datasets. Cloud Computing: Familiarity with cloud platforms such as AWS, Azure, or Google Cloud for AI model deployment.
a strong Python software engineering skill set with a solid data science background, so we will be looking for skills across: Python; Python frameworks (NumPy, Django, etc.); writing and implementing code; testing; quality assurance; software tool deployment; AWS (Lambda, AWS integrations, etc.); Metaflow & Prefect. Key will be working with Raster …
the data science team and its role within the wider business. What We’re Looking For: Strong technical foundation with proficiency in Python (Pandas, NumPy, Scikit-learn), SQL, and cloud platforms (GCP or AWS). Experience with modern data warehouses (BigQuery, Snowflake, Redshift). Proven experience in deploying machine learning …
experience as a Quant Developer or in a similar role. Strong programming skills in Python and C++. Experience with Python data science tools (Pandas, NumPy, Scikit-learn). Exceptional coding, debugging, and analytical abilities. Strong problem-solving and communication skills. Ownership mentality with the ability to multitask effectively. Commitment to …
ability to mentor colleagues, drive projects forward, and influence stakeholders. You should apply if you: Have a strong technical background, including proficiency in Python (NumPy, Pandas, Scikit-learn, etc.), SQL, and cloud platforms such as GCP or AWS. Have experience working with modern databases like BigQuery, Snowflake, or Redshift. Have …
ML algorithms and techniques, including LLMs, GenAI, and automated AI systems. Experience with machine learning frameworks (e.g., TensorFlow, PyTorch) and data science libraries (e.g., NumPy, pandas, scikit-learn). Proficiency in Python, R, or other relevant programming languages. Proficiency in working with large datasets, data wrangling, and data preprocessing. Experience …
Python code for data processing, API development, and integration with the Azure Databricks environment. Utilise relevant Python libraries and frameworks (e.g., Flask, Django, Pandas, NumPy). Collaborate with cross-functional teams to build and enhance banking applications. Work closely with UI/UX Designers to integrate visualizations seamlessly into web …