Desired Skills and Experience: Kafka and message bus/queue expertise; Kubernetes knowledge; Terraform and GitHub Actions skills; OpenTelemetry (OTEL) implementation; proficiency with NumPy and Pandas; systems integration experience; background in commodity trading (gas & power); quantitative finance knowledge; understanding of compliance and regulation, particularly Sarbanes-Oxley (SOx). Role Responsibilities …
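Purely as an illustration of the Kafka/message-bus skills this listing asks for, here is a minimal Python sketch of publishing an event with the kafka-python client; the broker address, topic name, and payload fields are assumptions for the example, not taken from the advert.

```python
# Illustrative only: publishing a JSON message to a Kafka topic with kafka-python.
# The broker address, topic name, and payload are assumed placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed local broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# A hypothetical trade event for a gas & power desk.
event = {"market": "UK_power", "price_gbp_mwh": 84.2, "volume_mw": 50}
producer.send("trade-events", value=event)  # topic name is an assumption
producer.flush()  # block until the broker has acknowledged the send
```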
Flakes); data pipelines and big data tech; Docker: both building and running; wide AWS and infrastructure knowledge, including production support; scientific computing, e.g. NumPy/SciPy/Pandas. Just state the word 'Salmon' anywhere in your application, just to prove you can read a job advert. We aim to …
integrity. Essential: • 3+ years of experience in data engineering, data architecture, or similar roles. • Expert proficiency in Python, including popular data libraries (Pandas, PySpark, NumPy, etc.). • Strong experience with AWS services—specifically S3, Redshift, Glue (Athena a plus). • Solid understanding of applied statistics. • Hands-on experience with large …
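As a hedged sketch of the Python/AWS combination this listing describes (reading data held in S3 into Pandas), assuming a hypothetical bucket, key, and column name:

```python
# Illustrative only: loading a CSV object from S3 into a Pandas DataFrame.
# Bucket name, key, and the "price" column are hypothetical.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-data-lake", Key="trades/2024/trades.csv")
df = pd.read_csv(io.BytesIO(obj["Body"].read()))

# A small applied-statistics check on one numeric column.
print(df["price"].describe())
```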
tools, e.g. PowerBI or Tableau; data extraction and manipulation languages, e.g. SQL, Python; knowledge of statistical software and packages such as R, MATLAB, Pandas, NumPy or SciPy; experience with algorithmic trading systems and low-latency trading; knowledge of financial markets is desirable, but experience of working in other big data …
Node.js, JavaScript (JS), and TypeScript (TS). Statistical Knowledge: Solid understanding of statistical concepts and methodologies. Data Manipulation & Analysis: Proficiency with tools like Pandas, NumPy, and Jupyter Notebooks. Experience with data visualization tools such as Matplotlib, Seaborn, or Tableau. …
and even have the ability to be client facing! Requirements: 5+ years' experience with data engineering; strong working knowledge of Python/Pandas/NumPy/ETL pipelines; strong AWS experience (ideally Lambda & Step Functions). Beneficial: experience or interest in the financial markets; experience in a client-facing role; previous start …
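To illustrate the Python/Pandas/NumPy/ETL plus AWS Lambda stack this advert names, here is a minimal sketch of a transform step shaped like a Lambda handler; the event payload and column names are assumptions, not from the advert.

```python
# Illustrative only: a small ETL-style transform with Pandas/NumPy written as an
# AWS Lambda handler. The event payload and column names are hypothetical.
import numpy as np
import pandas as pd

def handler(event, context):
    # Extract: records might really arrive from S3, an API, or a queue.
    df = pd.DataFrame(event["records"])

    # Transform: coerce prices to numbers, drop unparseable rows, add log price.
    df["price"] = pd.to_numeric(df["price"], errors="coerce")
    df = df.dropna(subset=["price"])
    df["log_price"] = np.log(df["price"])

    # Load: return cleaned rows; a real step might write back to S3 instead.
    return df.to_dict(orient="records")
```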
software engineering. Required Experience & Skills: Significant experience in analytics engineering, data modeling, or data engineering roles. Advanced proficiency in Python (including libraries such as NumPy and Pandas) and strong SQL skills. Proven track record of leading technical projects and mentoring engineering teams. Excellent communication skills, with the ability to bridge …
of hands-on development experience in Python or another object-oriented programming language. Strong proficiency in Python’s data ecosystem, including libraries such as NumPy, Pandas, and Matplotlib. Experience working with cloud platforms (AWS preferred) and Infrastructure-as-Code tools. Solid understanding of relational databases and SQL. Proven track …
Experience with distributed systems, parallel computing, and high-performance processing of large datasets. Strong experience in data pipelines, working with tools such as Pandas, NumPy, and SQL/NoSQL databases. Proven experience working in fast-paced environments, ideally within trading, financial services, or high-frequency environments. Proficiency in developing RESTful …
Didcot, Oxfordshire, South East, United Kingdom Hybrid / WFH Options
Richard Wheeler Associates
retrieval-augmented generation (RAG) systems; knowledge of optimization techniques for reducing LLM latency and computational costs. Software Engineering & Architecture: strong Python development skills (pandas, numpy, scikit-learn, PyTorch/TensorFlow); experience with web frameworks (Streamlit, FastAPI, or Flask) and frontend technologies; database design and optimization (PostgreSQL preferred) for large protein …
skills with experience in backend frameworks like Flask, FastAPI, or Django. Ability to write efficient, reusable, and scalable code. Experience with libraries like NumPy, Pandas, SciPy, or any others commonly used in data processing or modelling. Strong understanding of data structures, algorithms, and complexity analysis. ✅ Mathematical or Computational …
Development (RAD). Messaging/data streaming solution (Kafka, RabbitMQ, etc.). Database admin. Linux exposure. REST API. Good to have: Languages: Python (Pandas, NumPy, FastAPI, data manipulation); Web: JS, HTML, CSS, FE framework (React, Angular or similar); Ex Scala, Clojure, Groovy or other functional language. This role offers …
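A minimal sketch of the "Python: Pandas, NumPy, FastAPI, data manipulation" combination mentioned above, assuming a hypothetical endpoint path and payload schema (neither comes from the advert):

```python
# Illustrative only: a FastAPI endpoint that does a small Pandas aggregation.
# The /summary path and the Tick schema are hypothetical.
from typing import List

import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Tick(BaseModel):
    symbol: str
    price: float

@app.post("/summary")
def summarise(ticks: List[Tick]):
    # Aggregate the posted ticks per symbol with Pandas.
    df = pd.DataFrame([{"symbol": t.symbol, "price": t.price} for t in ticks])
    summary = df.groupby("symbol")["price"].agg(["mean", "min", "max"])
    return summary.to_dict(orient="index")
```

Run locally with, for example, `uvicorn app:app` (module name assumed) and POST a JSON list of ticks to `/summary`.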
PyCaret, Prophet, or custom implementations for time series; A/B testing frameworks (e.g., DoWhy, causalml). Programming & Data Tools: Python: strong foundation in Pandas, NumPy, matplotlib/seaborn, scikit-learn, TensorFlow, PyTorch, etc. SQL: advanced querying for large-scale datasets. Jupyter, Databricks, or notebook-based workflows for experimentation. Data Access …
developer with 5+ years working in Python or another object-oriented environment. Strong familiarity with tools and libraries for data processing and visualization (e.g., NumPy, Pandas, Matplotlib). Comfortable deploying applications in cloud ecosystems, particularly AWS, and using infrastructure automation tools. Experienced with building data workflows and managing databases; Airflow …
internal teams. Tech: while they use a range of technologies internally, familiarity with open-source tools is highly valued. Languages & Libraries: Python, R, Pandas, NumPy, scikit-learn, XGBoost, LightGBM, statsmodels. NLP & ML Frameworks: spaCy, Hugging Face Transformers, TensorFlow or PyTorch (where applicable). Data Engineering & Pipelines: dbt, Airflow, SQL, Spark, Dask …
years of experience in a quantitative, analytics, or developer role within a financial institution or trading environment. Strong proficiency in Python (e.g., Pandas, NumPy, Jupyter) and experience building data pipelines, analytical tools, or dashboards. SQL experience is a plus. Proficiency in Excel and data visualization platforms such as Power …
GCP) to store and process data. Document workflows, pipelines, and transformation logic for transparency. Key Skills & Experience: Strong hands-on experience in Python (Pandas, NumPy, PySpark). Experience building ETL/ELT processes. Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies (e.g., Snowflake, Databricks). Understanding of …
BeautifulSoup, or Selenium). Strong knowledge of data cleaning, standardization, and normalization techniques. Experience with data analysis and modeling using libraries such as Pandas, NumPy, scikit-learn, or TensorFlow. Familiarity with SQL and database management systems (e.g., PostgreSQL, MySQL). Experience with cloud platforms (e.g., AWS, Azure, GCP) and big …
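As an illustration of the cleaning, standardization, and normalization skills listed above, a minimal Pandas/NumPy/scikit-learn sketch on a made-up frame; the column names and values are invented for the example.

```python
# Illustrative only: cleaning messy values, then standardising to zero mean and
# unit variance. The example frame and column names are invented.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

raw = pd.DataFrame({
    "price": ["10.5", "n/a", "12.0", "11.25"],
    "volume": [100, 250, None, 175],
})

# Clean: coerce strings to numbers and drop rows that cannot be parsed.
clean = raw.apply(pd.to_numeric, errors="coerce").dropna()

# Standardise for downstream modelling; the result is a NumPy array.
scaled = StandardScaler().fit_transform(clean)
print(np.round(scaled, 3))
```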
exposure to capital markets or trading (equities and/or execution). Technical Skills: Proficient in Python with some exposure to data analysis (Pandas, NumPy) or similar (R, MATLAB). Solid background in object-oriented programming (Java preferred; C++ or C# also valued). Familiarity with Linux-based systems (configuration …