London, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions.
• Proficiency in Data Science and Machine Learning services from Cloud providers (AWS, Azure, GCP).
• Python and Jupyter ecosystem (Lab, Notebook) and related libraries and tooling.
• Python libraries for data management, statistical analysis, machine learning, and visualisation.
• Machine learning frameworks such as TensorFlow.
S3 Cloud Environments/Infra: AWS (required); AWS Lambda, Terraform (nice to have)
Data Platforms: creating data pipelines within Databricks or an equivalent such as Jupyter Notebook; PowerBI (nice to have)
Understanding of Enterprise Models for Engineering: microservices, APIs, AWS services to support models (SQS, SNS, etc.)
Preferred Qualifications: Master's
Manchester, England, United Kingdom Hybrid / WFH Options
Anson McCade
(Requirements as per the London listing above.)
Bristol, England, United Kingdom Hybrid / WFH Options
Anson McCade
(Requirements as per the London listing above.)
Skills and Experience: Experience with pre-silicon platforms such as Models, RTL simulation, emulation, or FPGA. Data analysis and visualization skills, such as with Jupyter Notebooks. In Return: In this role, you'll enjoy working in a highly stimulating collaborative environment with other software, hardware, and system teams across the
in order to fully understand the data; programming in Python or Ruby, utilizing AWS S3, MongoDB, PostgreSQL, AWS Redshift or similar database technologies; using Jupyter notebooks and one or more statistical visualization or graphing toolkits such as Excel, Qlik Sense or Tableau.
working in a tech team using a diverse tech stack including:
Backend: Python, FastAPI, PostgreSQL, Vespa, SQLAlchemy, Flask.
Frontend: React, Next.js.
Data Science: Python, Jupyter, PyTorch, Pandas, spaCy, Hugging Face, NumPy, Streamlit, Weights & Biases.
Infra: Pulumi, Docker, AWS App Runner, Step Functions, Grafana Cloud monitoring, Prefect.
Who you are: Must-haves
performance to inform design and implementation decisions. Software development experience and knowledge of C++11/14/17. Working knowledge of Python (NumPy, SciPy, Jupyter Notebooks, etc.). A measured, scientific approach to building complex engineering systems. Extra kudos if you have: The ability to understand and debug complex system
strong foundation in statistical methods and model building to extract valuable insights from data. Proficient in Python and its core data science libraries (Pandas, Jupyter Notebook, scikit-learn, NumPy, SciPy, etc.) for efficient data manipulation and analysis. Data Wrangling: You have the skills to tackle messy data sets with confidence
targets for experimental testing. Be familiar with NGS and associated pipelines. Collate and annotate reference sequences across multiple microorganisms. Be confident using Python and JupyterLab notebooks as a working and application development environment, along with Git as a version control system. Requirements include: MSc degree or equivalent in a
/utilities, banking or commodities) Proficient with Python - able to use Python to ingest CSVs or SQL query results and manipulate data in Jupyter Notebooks. Proficient with SQL & Excel. Previously used data visualisation tools (preferably Power BI). If this role is of interest, please apply with a copy of
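The CSV-ingestion-and-manipulation workflow this listing asks for can be sketched in a notebook cell as follows. This is a minimal illustration with an invented dataset and invented column names, not code from the employer; in practice the data would come from `pd.read_csv("file.csv")` or `pd.read_sql(query, connection)`.

```python
import io

import pandas as pd

# An inline CSV string keeps the sketch self-contained; a real notebook
# would read from a file or a SQL connection instead.
raw = io.StringIO(
    "region,product,revenue\n"
    "North,A,120\n"
    "North,B,80\n"
    "South,A,200\n"
)
df = pd.read_csv(raw)

# Typical notebook-style manipulation: aggregate revenue per region.
summary = df.groupby("region", as_index=False)["revenue"].sum()
print(summary)
```

From here the `summary` frame would usually feed a chart in Power BI or a plotting library.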
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
(Requirements as per the London listing above.)
Senior Data Scientist - London - Permanent
Python (Jupyter), SQL, Machine Learning, GIS, GCP, Renewables, Energy
***Please note this role does not offer sponsorship, and you must be based in the UK to apply***
I'm currently working with an exciting start-up focused on decarbonisation. They have received a new round … that we can show in our platform.
Essentials:
4+ years spread across Data Science, Engineering, and Analysis.
Excellent skills using Python/R and Jupyter notebooks, and SQL databases (MySQL/PostgreSQL).
Track record of building end-to-end machine learning models.
Experience with geographic data/GIS data.
Exposure … GCP tools (e.g. Dataflow, BigQuery, Vertex AI).
Experience working in a SaaS start-up is a massive bonus!
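An "end-to-end machine learning model" of the kind this listing mentions can be compressed into a few notebook lines. The sketch below uses synthetic data and an invented target rule purely for illustration; a real pipeline would pull features from MySQL/PostgreSQL and might be trained and served on GCP Vertex AI.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # three invented features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # invented target rule

# Split, scale, fit, evaluate: the core loop of an end-to-end model.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

Wrapping the scaler and classifier in one `Pipeline` keeps preprocessing and model together, which matters once the model is deployed rather than just scored in a notebook.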
Cheltenham, England, United Kingdom Hybrid / WFH Options
Anson McCade
(Requirements as per the London listing above.)
Python programming, but also operate with modern tech such as Snowflake, Airflow, dbt, Kubernetes/Docker and Cube.js, whilst also using Tableau CRM and Jupyter notebooks for gaining data insights. If you think there's a better or newer tool, you're free to use that too. The key thing
and Digital and TV attribution. Excellent data visualization and storytelling capabilities. Proficiency in coding, preferably in R or Python; experience with RStudio or Jupyter notebooks is a plus. Solid proficiency in Excel and PowerPoint.
Greater London, England, United Kingdom Hybrid / WFH Options
Annalect
data (e.g. cookie logs) and cloud technologies - AWS, Redshift, S3, Athena, GCP, BigQuery. Experience using Git for version control (e.g. Bitbucket). Experience with Jupyter Notebooks. Competencies: Enquiring/questioning mind; ability to pick up new tools; keen to embrace new technologies and able to overcome the challenges
algorithms with frameworks such as TensorFlow/PyTorch Proficiency in Python with associated data processing/machine learning toolkits (NumPy, SciPy, TensorFlow/PyTorch, Jupyter notebooks, etc.) Able to collaborate closely with internal and external teams and build trust Strong technical acumen and passion for learning Coordination of people and
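The NumPy/SciPy side of the toolkit named above often amounts to fitting simple models to measurements inside a notebook. Here is a small sketch with invented synthetic data (true slope 2, intercept 1, plus noise), recovered with `scipy.stats.linregress`:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = np.linspace(0, 10, 50)
# Synthetic measurements: y = 2x + 1 with a little Gaussian noise.
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

# Ordinary least-squares fit; the result carries slope, intercept, r-value, etc.
fit = stats.linregress(x, y)
print(f"slope={fit.slope:.2f} intercept={fit.intercept:.2f}")
```

With noise this small the fitted coefficients land very close to the true values, which is the kind of sanity check these listings expect candidates to run routinely.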
Newcastle upon Tyne, Tyne & Wear Hybrid / WFH Options
Real Staffing
and NCBI databases Be familiar with NextGen sequencing and associated pipelines Collate and annotate reference sequences across multiple microorganisms Be confident using Python and JupyterLab notebooks as a working and application development environment, along with Git as a version control system Generate, record, manipulate, analyse and present complex data
and across all levels. This ranges from data generation (storage technologies and data management), processing and analysis (high performance computing and technologies such as JupyterHub), to making visual outputs for end users (web technologies and virtualisation) to increase the reach and impact of PML science. About You: You will enjoy
This is a new position for a Senior Data Scientist with a global, data-driven company with cutting-edge technology who leverage data to serve as a true market differentiator. The focus of this role is to deliver data science
offline model delivery. Identifying business hypotheses worth pursuing. Hypothesis testing whilst including real-world constraints. Data structures, databases, and ETL processes. AWS, Snowflake, DBT, Jupyter notebooks, Spark, Mongo, and Postgres or similar. This is a pragmatic and humble organisation looking for like-minded people to help them deliver