Redshift, AWS S3. Environments/Infra: AWS (required); AWS Lambda, Terraform (nice to have). Platforms: creating data pipelines within Databricks or an equivalent such as Jupyter Notebook; Power BI (nice to have). Engineering of enterprise models: microservices, APIs, AWS services to support models (SQS, SNS, etc.). Please note: the role is …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Confidential
development, including research and developing new propositions. Proficiency in Data Science and Machine Learning services from cloud providers (AWS, Azure, GCP); the Python and Jupyter ecosystem (Lab, Notebook) and related libraries and tooling; Python libraries for data management, statistical analysis, machine learning, and visualisation; machine learning frameworks such as TensorFlow …
we're looking for: 6+ years of professional experience with ML operations used in large-scale digital applications. Extensive experience with programming in Python, Jupyter notebooks, Spark, and Docker. Professional experience of designing, building, and managing bespoke MLOps in cloud environments, using tools such as SageMaker Processing, SageMaker Pipelines, Apache …
with data processing, feature engineering, and model evaluation techniques. Experience in cloud platforms such as Amazon Web Services and Google Cloud. Experience with Amazon SageMaker and Jupyter Notebooks. Experience with model servers such as Triton Inference Server. Experience working on AI/ML-based analytics products. Experience in microservices and event-driven architecture …
in order to fully understand the data; programming in Python or Ruby; utilizing AWS S3, MongoDB, PostgreSQL, AWS Redshift, or similar database technologies; using Jupyter notebooks and one or more statistical visualization or graphing toolkits such as Excel, Qlik Sense, or Tableau. Technical blog posts: read more about what our …
foundation in statistical modelling and machine learning. At least 5 years of professional experience in a data science or similar role. Expertise in Python, Jupyter Notebooks, and SageMaker (or similar), and familiarity with MLflow or equivalent model-management tools. Extensive experience in developing and deploying machine learning models in production, preferably …
learning and reinforcement learning models. Build processes for extracting, cleaning, and transforming data (SQL/Python). Ad-hoc data mining for insights using Python and Jupyter notebooks. Presenting insights and predictions in live dashboards using Tableau/Power BI. Design solutions that meet business objectives. Present findings back to clients through written …
an automated test/analytics module and a monitoring tool for the performance of trading algorithms and AI/ML models, using open-source tools (Jupyter Notebook, Plotly/Dash). What we'll offer you: a healthy, engaged, and well-supported workforce is better equipped to do its best work …
expert users (e.g., Data Scientists, Machine Learning Engineers, Mechanical Engineers) in enterprise situations is a plus. Experience in data science/machine learning, including Jupyter Notebooks, is a plus. Experience in any programming language such as Python, C, or Arduino is a plus. What we offer: be part of something …
set and strong adaptability to evolving project demands. Desirable: experience with cloud-based Data Science and Machine Learning tools (AWS, Azure, GCP); proficiency in Python, Jupyter, and popular machine learning frameworks (TensorFlow, PyTorch); knowledge of ethical AI principles and their application. About the company: our client is a leader in consulting services …
Logistic Regression, Random Forest, XGBoost) and modern deep learning algorithms (e.g., BERT, LSTM). Strong knowledge of SQL and Python's data-analysis ecosystem (Jupyter, pandas, scikit-learn, Matplotlib). Advanced techniques: familiarity with ensemble methods such as bagging and boosting; understanding of model evaluation and data pre-processing techniques (standardisation, normalisation …
strong proficiency in programming languages for data science, e.g., SQL, R, and Python, alongside the ability to use tools and packages such as Alteryx, Jupyter Notebook, R Markdown, TensorFlow, Keras, PyTorch, etc. Practical expertise in producing reproducible code and pipelines, including documentation, governance and assurance frameworks, automation, and code review …
Newcastle Upon Tyne, United Kingdom Hybrid / WFH Options
NHS Counter Fraud Authority
strong proficiency in programming languages for data science, e.g., SQL, R, and Python, alongside the ability to use tools and packages such as Alteryx, Jupyter Notebook, R Markdown, TensorFlow, Keras, PyTorch, etc. Practical expertise in producing reproducible code and pipelines, including documentation, governance and assurance frameworks, automation, and code review …