a technical and non-technical audience • Independent and self-driven learner, able to step outside of their area of expertise • Python; we work with asyncio, SQLAlchemy, FastAPI, Pydantic, NumPy, Pandas • SQL; performance tuning, schema design, monitoring in production, we mainly work with PostgreSQL • Cloud (AWS) deployments and monitoring, basic networking and security best practices • Command line familiarity, git, automated testing
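As a minimal sketch of how the stack in the listing above typically fits together (a FastAPI endpoint with a Pydantic response model backed by an async SQLAlchemy query against PostgreSQL), assuming an invented instruments table, a placeholder DSN and the asyncpg driver:

```python
# Minimal FastAPI + Pydantic + async SQLAlchemy sketch; table, DSN and model are invented.
from fastapi import FastAPI
from pydantic import BaseModel
from sqlalchemy import text
from sqlalchemy.ext.asyncio import create_async_engine

app = FastAPI()
# Placeholder connection string; assumes the asyncpg driver is installed.
engine = create_async_engine("postgresql+asyncpg://user:pass@localhost/demo")

class Instrument(BaseModel):
    id: int
    symbol: str

@app.get("/instruments/{instrument_id}", response_model=Instrument)
async def get_instrument(instrument_id: int) -> Instrument:
    # Raw SQL keeps the sketch short; no error handling for a missing row.
    async with engine.connect() as conn:
        row = (await conn.execute(
            text("SELECT id, symbol FROM instruments WHERE id = :id"),
            {"id": instrument_id},
        )).first()
    return Instrument(id=row.id, symbol=row.symbol)
```

In a real service the raw query would usually be replaced by ORM models and a session dependency, but the shape of the pieces is the same.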
of established Quant Analytics team. What You Will Bring • University degree or equivalent with proven and displayed competency in data interrogation. • Strong proficiency and experience in Python (e.g., NumPy, pandas, scikit-learn) and quantitative thinking - you enjoy working with data to unearth patterns, trends, and nuances. • Exceptional analytical, problem-solving, and communication skills, with the ability to translate and present
in languages such as Python. Solid understanding of machine learning concepts, algorithms, and libraries (e.g., scikit-learn, TensorFlow, PyTorch). Experience with data manipulation and analysis using tools like Pandas and NumPy. Familiarity with cloud computing platforms (e.g., AWS, Azure, GCP). Desired Qualifications: Master's degree. Demonstrated experience with the application of machine learning and artificial intelligence within the Department
tools (Airflow, Prefect, Temporal) or have built custom pipeline systems for multi-step autonomous processes. You bridge science and engineering. You are comfortable with scientific computing libraries (NumPy, SciPy, pandas) and understand scientific literature formats, databases (PubMed, arXiv), and academic data processing. What Sets You Apart: You have a research background. You are a former academic researcher who transitioned to
the adoption of best practices in data science across the organisation, lead other data science engineers MINIMUM QUALIFICATIONS Industry experience using Python for data science (e.g. numpy, scipy, scikit, pandas, etc.) and SQL or other languages for relational databases. Experience with a cloud platform such as (AWS, GCP, Azure etc.) Experience with common data science tools; statistical analysis, mathematical modelling
track record of handling high-visibility, customer-facing outputs. 1+ years' experience using Python (or another programming language e.g. R, C++, Java) and with the scientific computing stack (NumPy, Pandas, SciPy, scikit-learn, etc.) Familiarity with renewable energy technologies, market design, and regulatory frameworks within European power markets, specifically GB, Germany, Spain, Portugal, France, or Italy. Experience writing technical, report-style
Experience in both data engineering and machine learning, with a strong portfolio of relevant projects. Proficiency in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with
of the art in machine learning algorithms for text analytics 5. Demonstrated experience with machine learning frameworks such as PyTorch, Keras, TensorFlow 6. Demonstrated experience with data visualization tools (e.g., Tableau, Pandas, D3.js, ggplot, etc.) 7. Demonstrated experience with NoSQL data stores such as MongoDB or DynamoDB 8. Demonstrated experience using Natural Language Processing tools such as spaCy, NLTK, Stanford CoreNLP, or Gensim
field 5 years' experience in data integration, data warehousing, and proficiency in programming languages, as well as expertise with ETL tools and data integration platforms. Programming & Scripting: Python (e.g., Pandas, Requests), PowerShell, Bash, or equivalent scripting languages APIs & Integration: RESTful/SOAP APIs, OAuth, API authentication mechanisms Databases: SQL (PostgreSQL, MySQL, MSSQL), NoSQL (MongoDB, DynamoDB is a plus) Automation & Workflow
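As a rough sketch of the API-integration side listed above, the snippet below pulls records from a hypothetical REST endpoint with requests (bearer-token auth assumed) and flattens the JSON payload into a pandas DataFrame; the URL, token and field names are placeholders, not anything from the advert:

```python
# Hypothetical REST-to-DataFrame pull; endpoint, token and fields are placeholders.
import pandas as pd
import requests

API_URL = "https://api.example.com/v1/orders"  # placeholder endpoint
TOKEN = "..."                                  # e.g. an OAuth bearer token

resp = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"status": "open"},
    timeout=30,
)
resp.raise_for_status()                        # fail fast on HTTP errors

# Flatten the JSON records into a tabular frame for downstream loading.
orders = pd.json_normalize(resp.json()["results"])
print(orders.head())
```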
Have: 5+ years of experience building production-level AI or ML systems, including LLMs, agents, or complex automation frameworks 3+ years of experience with Python and Python tools, including Pandas or NumPy Experience with Large Language Models (LLMs), Machine Learning (ML), Deep Learning (DL), and Reinforcement Learning (RL) Experience with tools and AI agent frameworks such as TensorFlow, PyTorch, or
experience with Python and frameworks like Django/Flask/FastAPI. Database Expertise: Proficient with relational (PostgreSQL, MySQL) and NoSQL (MongoDB) databases. Data Analysis Skills: Experience using libraries like Pandas and NumPy. Software Development Best Practices: Understands object-oriented programming, Agile/Scrum methodologies, and version control (e.g., GitHub). Problem-Solving & Analytical Abilities: Demonstrated ability to solve complex problems
or data engineering. Ability to work standard European time-zone hours and legal authorisation to work in your country of residence. Strong experience with Python's data ecosystem (e.g., Pandas, NumPy) and deep expertise in SQL for building robust data extraction, transformation, and analysis pipelines. Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake
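A minimal sketch of the extraction-and-transformation work described above, assuming a PostgreSQL source and an invented events table: pull a slice with pandas.read_sql, aggregate it, and persist a columnar extract.

```python
# Minimal ETL sketch; connection string, table and column names are invented.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://user:pass@localhost/analytics")

# Extract: pull recent raw events.
events = pd.read_sql(
    text("SELECT user_id, event_type, created_at FROM events WHERE created_at >= :since"),
    engine,
    params={"since": "2024-01-01"},
    parse_dates=["created_at"],
)

# Transform: daily event counts per user and event type.
daily = (
    events.assign(day=events["created_at"].dt.date)
          .groupby(["user_id", "day", "event_type"])
          .size()
          .reset_index(name="n_events")
)

# Load: write a columnar extract for downstream analysis (needs pyarrow or fastparquet).
daily.to_parquet("daily_event_counts.parquet", index=False)
```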
model performance evaluation, hyperparameter tuning, and maintenance using tools like Vertex AI Pipelines. Cloud Computing (Google Cloud Platform - GCP Preferred) Technical Expertise & Tools Python: Advanced proficiency in data analysis (Pandas, NumPy), machine learning, API development (Flask/FastAPI), and writing clean, maintainable code. SQL: Expertise in querying, database design/optimization, stored procedures, functions, partitioning/clustering strategies for BigQuery
offs and varied opinions. Experience in a range of tool sets comparable with our own: Database technologies: SQL, Redshift, Postgres, DBT, Dask, Airflow, etc. AI Feature Development: LangChain, LangSmith, pandas, NumPy, scikit-learn, SciPy, Hugging Face, etc. Data visualization tools such as Plotly, Seaborn, Streamlit, etc. You are Able to chart a path on a long term journey through
familiarity with LLM/GenAI prompting and augmentation for textual analysis, with an interest in learning more. Experience working with commonly used data science libraries and frameworks, e.g. spaCy, pandas, NumPy, scikit-learn, Keras/TensorFlow, PyTorch, LangChain, Hugging Face Transformers, etc. Familiar with both on-premises and cloud-based platforms (e.g. AWS). Working understanding of ML Ops workflows and
Java, Python, C/C++, BASH, Docker, Kubernetes, Cloud, AWS, Azure, Spring, REST, Nifi, Linux, Windows, VMWare, Pandas. Due to federal contract requirements, United States citizenship and an active TS/SCI security clearance and polygraph are required for
Machine Learning Engineer - SaaS - London (Tech stack: Machine Learning Engineer, Python, TensorFlow, PyTorch, scikit-learn, Keras, Natural Language Processing (NLP), Hugging Face Transformers, Pandas, NumPy, Jupyter Notebooks, Matplotlib, Seaborn, Flask (for building APIs), FastAPI, Docker, MLflow, DVC (Data Version Control), AWS SageMaker, Azure Machine Learning, Google Cloud AI Platform, TensorFlow Serving, ONNX (Open Neural Network Exchange) We have several exciting … full training will be provided to fill any gaps in your skill set): Machine Learning Engineer, Python, TensorFlow, PyTorch, scikit-learn, Keras, Natural Language Processing (NLP), Hugging Face Transformers, Pandas, NumPy, Jupyter Notebooks, Matplotlib, Seaborn, Flask (for building APIs), FastAPI, Docker, MLflow, DVC (Data Version Control), AWS SageMaker, Azure Machine Learning, Google Cloud AI Platform, TensorFlow Serving, ONNX (Open Neural Network Exchange)
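As a small illustration of the NLP portion of that stack (a toy example, not the advertised product's pipeline), the sketch below wires a TF-IDF vectoriser and a logistic-regression classifier into a scikit-learn Pipeline on invented data:

```python
# Toy text-classification sketch with scikit-learn; the data and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

texts = [
    "refund my order please",
    "the delivery arrived late",
    "great product, very happy",
    "loved the quick support",
]
labels = ["complaint", "complaint", "praise", "praise"]

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("model", LogisticRegression(max_iter=1000)),
])
clf.fit(texts, labels)

print(clf.predict(["support was great"]))  # likely 'praise' on this toy data
```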
track drift, response quality and spend; implement automated retraining triggers. Collaboration - work with Data Engineering, Product and Ops teams to translate business constraints into mathematical formulations. Deployment: ship a model as a container, update an Airflow (or Azure Data Factory) job. Review: inspect dashboards, compare control vs. treatment, plan next experiment. Tech stack: Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow), SQL (Redshift, Snowflake or similar), AWS SageMaker → Azure ML migration, with Docker, Git, Terraform, Airflow/ADF. Optional extras: Spark, Databricks, Kubernetes. What you … yrs optimisation/recommender work at production scale (dynamic pricing, yield, marketplace matching). Mathematical optimisation know-how - LP/MIP, heuristics, constraint tuning, objective-function design. Python toolbox: pandas, NumPy, scikit-learn, PyTorch/TensorFlow; clean, tested code. Cloud ML: hands-on with AWS SageMaker plus exposure to Azure ML; Docker, Git, CI/CD, Terraform. SQL mastery for
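The LP/MIP and objective-function-design requirement above can be illustrated with a tiny linear program; the sketch below solves an invented two-product allocation toy with scipy.optimize.linprog (all coefficients and constraints are made up):

```python
# Tiny linear-programming sketch with SciPy; numbers and constraints are invented.
import numpy as np
from scipy.optimize import linprog

# Maximise 40*x1 + 30*x2 (profit); linprog minimises, so negate the objective.
c = np.array([-40.0, -30.0])

# Capacity constraints: 2*x1 + x2 <= 100 (machine hours), x1 + x2 <= 80 (stock).
A_ub = np.array([[2.0, 1.0],
                 [1.0, 1.0]])
b_ub = np.array([100.0, 80.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
print(res.x, -res.fun)  # optimal quantities and the maximised profit
```

Real pricing or yield problems with integrality or logical constraints would move to a MIP solver (e.g. PuLP or OR-Tools), but the objective-plus-constraints formulation is the same idea.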
BI). iii. Agile methodologies and utilization of project management tools (e.g., JIRA). iv. Statistical, mathematical, and database Python packages (e.g., NumPy, SQLAlchemy, or Pandas)
to stakeholders Ideal Profile 5+ years of experience in data science, preferably in manufacturing or industrial settings Strong knowledge of machine learning, statistics, and data modeling Expert in Python (pandas, scikit-learn, etc.) and SQL Experience with cloud platforms (Azure, AWS, or GCP) Familiarity with tools like MLflow, Airflow, Power BI Strong stakeholder management and communication skills Fluent in Dutch
We prefer candidates to hold at minimum a Master's degree. We have multiple openings! -Exceptional academic and industry experience with Python and especially with data handling libraries such as Pandas and PyArrow -Strong background in database management solutions, familiarity with databases such as MySQL and Oracle -Large-scale data processing and implementing batch processing pipelines in HPC or cloud architecture … Experience with Cloud Computing environments (AWS, GCloud, Azure) is a plus -Deep experience with programming and scripting languages, with a strong focus on Python, Pandas and NumPy. Other languages include: Shell, Perl, Java, C/C++, C#, Scala, etc. -Exceptional technical fluency in machine learning frameworks (such as TensorFlow, Scikit-Learn, etc.) Personal Qualities: -Ability to contextualize data as it
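As a sketch of the chunked, large-scale processing those bullets describe, assuming an invented CSV, column names and partition key, the snippet below streams the file through pandas in chunks and appends each chunk to a partitioned Parquet dataset via PyArrow:

```python
# Chunked CSV -> partitioned Parquet sketch; paths and column names are invented.
import pandas as pd
import pyarrow as pa
import pyarrow.parquet as pq

CSV_PATH = "measurements.csv"     # hypothetical large input file
OUT_DIR = "measurements_parquet"  # partitioned output dataset

for chunk in pd.read_csv(CSV_PATH, chunksize=500_000):
    # Light per-chunk cleaning before persisting.
    chunk = chunk.dropna(subset=["sample_id"])
    table = pa.Table.from_pandas(chunk, preserve_index=False)
    # Partition on an assumed 'run_date' column so batches can be processed independently.
    pq.write_to_dataset(table, root_path=OUT_DIR, partition_cols=["run_date"])
```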
Procession Systems, Washington, DC, United States (Hybrid/WFH options)
supporting data ingestion/ETL, data processing/analytic workloads, and presenting results/metrics across a range of technologies Linux (RedHat/Rocky, Alpine and distroless containers) Python (pandas/pyspark), bash, SQL, Terraform, data/file formats (json, parquet, etc.) Spark, Kafka, ELK, MinIO/Ceph, Nifi, Airflow Gitlab, Jira, Confluence AWS DESIRED QUALIFICATIONS: Familiarity with DoD and
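For the pandas/pyspark portion of that stack, a minimal PySpark batch job might look like the sketch below; the input path, column names and aggregation are invented, purely to show the shape of an ingest-and-summarise workload:

```python
# Minimal PySpark batch job; input path and column names are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-summarise").getOrCreate()

# Ingest: read one day of parquet-formatted events.
events = spark.read.parquet("s3a://example-bucket/events/dt=2024-01-01/")

# Analytic workload: message counts and average payload size per source system.
summary = (
    events.groupBy("source_system")
          .agg(F.count("*").alias("n_messages"),
               F.avg("payload_bytes").alias("avg_payload_bytes"))
)

summary.write.mode("overwrite").parquet("s3a://example-bucket/metrics/daily_summary/")
spark.stop()
```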