… its ecosystems; experience with other languages is a plus. Ability to deploy ML capabilities into systems. Familiarity with Python data science libraries (NumPy, SciPy, Pandas, Dask, spaCy, NLTK, scikit-learn). Commitment to writing clean, maintainable, and well-tested code. Proficiency in automation, system monitoring, and cloud platforms like AWS or …
… /TypeScript). Exposure to cloud platforms like Kubernetes/Cloud Foundry. Proficient in coding in one or more languages and frameworks such as Python, Pandas, Django, Java, Spring, SQL, MQ/Kafka. Demonstrated ability to problem-solve in real time and handle technical issues with a sense of urgency. Strong …
… a strong understanding of computer systems and how they operate. Excellent Python programming skills, including experience with relevant analytical and machine learning libraries (e.g., pandas, polars, numpy, sklearn, TensorFlow/Keras, PyTorch, etc.), in addition to visualization and API libraries (matplotlib, plotly, streamlit, Flask, etc.). Experience developing and implementing …
… years of software development experience in quantitative trading, with deep expertise in Java and/or Python. Proficient in Python's data science ecosystem (Pandas, NumPy, Scikit-learn), with strong debugging and analytical skills. Proven track record implementing trading algorithms and working with distributed systems in fast-paced front-office …
… paced environment. Detail-oriented and organized, demonstrating thoroughness and strong ownership of work. Desirable skills/experience: experience working with Python and data analysis libraries (pandas/polars/numpy); experience with financial mathematics, statistics, and a broad understanding of financial services/instruments; experience in JavaScript development, especially in AngularJS or …
… engineers. Familiarity with data-centric workflows and pipeline orchestration (e.g., Apache Airflow). Proficiency in data validation, anomaly detection, or debugging using tools like Pandas, Polars, or data.table/R. Experience working with AWS or other cloud platforms. Knowledge of building and monitoring high-performance, concurrent services. Additional Information: We …
… and Experience: Kafka and message bus/queue expertise; Kubernetes knowledge; Terraform and GitHub Actions skills; OpenTelemetry (OTEL) implementation; proficiency with NumPy and Pandas; systems integration experience; background in commodity trading (gas & power); quantitative finance knowledge; understanding of compliance and regulation, particularly Sarbanes-Oxley (SOx). Role Responsibilities: Collaborate with …
… large datasets. Ability to apply statistical techniques to validate models and algorithms. Data Manipulation & Analysis: Proficient in data manipulation and analysis using tools like Pandas, NumPy, and Jupyter Notebooks. Experience with data visualization tools such as Matplotlib, Seaborn, or Tableau to communicate insights effectively. Our offer: Cash: Depends on experience …
… Math, Physics, Engineering, or a related quantitative field. 2+ years of Python development proficiency, with quantitative analysis experience using packages such as numpy, pandas, scipy, scikit-learn, matplotlib, etc. Proficiency in a Linux environment (including shell scripting). 1+ years of experience with automation frameworks in software testing (e.g., PyTest, Cucumber …
… client needs shift. Role Requirements. Technical Skills: Substantial experience in data engineering, analytics engineering, or similar roles. Hands-on expertise with Python (NumPy/Pandas) and SQL. Proven experience designing and building robust ETL/ELT pipelines (dbt, Airflow). Strong knowledge of data pipelining, schema design, and cloud platforms …
… AWS SageMaker, and Azure Machine Learning. Experience in relevant data manipulation, machine learning, and statistical analysis coding packages (e.g. in Python: NumPy, scikit-learn, Pandas, Matplotlib, etc.). Strong skills in data exploration, cleansing, modelling, and presentation. Strong experience in testing data models and machine learning models. Strong experience in data …
… stacks and intuitive UX. Strong grounding in software development best practices. Professional experience with Python. Proficiency with common data science libraries such as pandas, numpy, and scipy. Comfortable with quickly evolving requirements. Additional Valuable Skills: experience designing and building front-end apps; understanding and appreciation of DevOps industry …
… experience. You’ll help streamline client onboarding, build robust integrations, and play a key role in evolving the platform. 🛠 Tech Stack & Methodology: Python (dataclasses, Pandas, asyncio); PostgreSQL; AWS (ECS, Lambda, SQS); Agile (XP), TDD, CI/CD, and a strong automation culture. If you're familiar with these tools, or eager …
… field. Strong proficiency in Python and writing scalable backend systems. Experience working with cloud platforms such as GCP, AWS, or Azure. Solid knowledge of Pandas for data analysis and manipulation. Familiarity with software testing best practices and writing clean, testable code. Competency in Git, CI/CD, and working within …
… big data tech. Docker: both building and running containers. Wide AWS and infrastructure knowledge, including production support. Scientific computing, e.g. NumPy/SciPy/pandas. Just state the word 'Salmon' anywhere in your application, just to prove you can read a job advert. We aim to improve all our colleagues …
… Agile methodology. You have strong analytical skills. You can communicate about complex subjects to non-technical stakeholders. You are familiar with Terraform, Python, Pandas, and NumPy. It is great if you have: Experience with Neural Networks/Deep Learning. Experience with information extraction, parsing, and segmentation. Experience with machine …
… Science, Engineering, or a related field (or equivalent practical experience). Technical Skills: Deep expertise in Python, including libraries such as Django, Flask, FastAPI, Pandas and NumPy. Strong knowledge of relational and non-relational databases (e.g., PostgreSQL, MongoDB, Redis). Proficiency with cloud platforms (e.g., AWS, Azure, GCP) and containerization …
… availability and integrity. Essential: • 3+ years of experience in data engineering, data architecture, or similar roles. • Expert proficiency in Python, including popular data libraries (Pandas, PySpark, NumPy, etc.). • Strong experience with AWS services, specifically S3, Redshift, Glue (Athena a plus). • Solid understanding of applied statistics. • Hands-on experience …
… on experience with AWS services like SageMaker and Bedrock, and programming skills such as Python, R, SQL, Java, Julia, Scala, Spark/NumPy/Pandas/scikit, JavaScript. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace …
… process efficiency and effectiveness. Technical Expertise: Advanced proficiency in Python with demonstrable programming experience, including data manipulation and analysis using libraries such as Pandas, NumPy, and SQLAlchemy. Extensive experience with the Dash framework for building web applications. In-depth knowledge of Impala or other SQL-on-Hadoop query engines. Understanding …
… Machine Learning capabilities and techniques into other systems. Are familiar with the Python data science stack through exposure to libraries such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, and scikit-learn. Take pride in writing clean, reusable, maintainable, and well-tested code. Demonstrate proficiency in automation, system monitoring, and cloud-native …