technologies such as Apache Spark and the Hadoop ecosystem; edge technologies, e.g. NGINX, HAProxy; excellent knowledge of YAML or similar languages. Desirable Requirements: JupyterHub awareness; MinIO or similar S3 storage technology; Trino/Presto; RabbitMQ or other common queue technology, e.g. ActiveMQ; NiFi; Rego; familiarity with code development more »
forecasting, propensity modelling, predictive/prescriptive campaign performance analysis, and optimising marketing spend Programming proficiency in Python, SQL, Bash, and Excel, including experience with Jupyter notebooks, type-checking, functional programming, PyTest, Pandas, SciKit, PyTorch, CI/CD, and Git Experience with Docker, Kubernetes, and cloud platforms such as AWS, Databricks more »
working in a tech team using a diverse tech stack including: Backend: Python, FastAPI, PostgreSQL, Vespa, SQLAlchemy, Flask. Frontend: React, Next.js. Data Science: Python, Jupyter, PyTorch, Pandas, Spacy, Huggingface, Numpy, Streamlit, Weights and biases. Infra: Pulumi, Docker, AWS AppRunner, Step Functions, Grafana cloud monitoring, Prefect. Who you are Must haves more »
skills with SQL, working with large and complex data sets to extract insights and identify trends Advanced programming skills with Python, including experience with Jupyter notebooks, PyTest, Pandas, SciKit, PyTorch, type-checking, functional programming, CI/CD, and Git A broad background in machine learning for customer and marketing purposes more »
in order to fully understand the data; programming in Python or Ruby, utilizing AWS S3, MongoDB, PostgreSQL, AWS Redshift or similar database technologies; using Jupyter notebooks and one or more statistical visualization or graphing toolkits such as Excel, Qlik Sense or Tableau. Technical Blog Posts Read more about what our more »
strong proficiency in programming languages for data science, e.g., SQL, R and Python alongside the ability to use tools and packages such as Alteryx, Jupyter notebook, R Markdown, TensorFlow, Keras, Pytorch, Apache Spark etc. oPractical proficiency in producing reproducible code and pipelines including documentation, governance and assurance frameworks, automation and more »
Newcastle Upon Tyne, United Kingdom Hybrid / WFH Options
NHS Counter Fraud Authority
strong proficiency in programming languages for data science, e.g., SQL, R and Python alongside the ability to use tools and packages such as Alteryx, Jupyter notebook, R Markdown, TensorFlow, Keras, Pytorch, Apache Spark etc. oPractical proficiency in producing reproducible code and pipelines including documentation, governance and assurance frameworks, automation and more »
business intelligence in the healthcare field 5+ years of strong SQL querying experience, including creating and updating stored procedures Experience with Python (e.g. Jupyter) Exposure to statistical software such as SAS or R Healthcare Payer and/or Provider experience with claims, clinical and/or quality data Strong analytical more »
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. • Proficiency in Data Science and Machine Learning services from Cloud providers (AWS, Azure, GCP). • Python and Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. • Python libraries for data management, statistical analysis, machine learning, and visualisation. • Machine learning frameworks such as TensorFlow more »
Manchester, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. • Proficiency in Data Science and Machine Learning services from Cloud providers (AWS, Azure, GCP). • Python and Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. • Python libraries for data management, statistical analysis, machine learning, and visualisation. • Machine learning frameworks such as TensorFlow more »
ideal candidate if you possess experience in some of the following areas: - Implementing deep learning algorithms using PyTorch - Full-cycle ML model development, from Jupyter Notebook to deployment - Proficiency in the mathematical foundations of ML, including Linear Algebra and Statistics - Familiarity with modern CV and Natural Language Processing (NLP) techniques more »
Greater London, England, United Kingdom Hybrid / WFH Options
Annalect
data (e.g. cookie logs) and cloud technologies - AWS, Redshift, S3, Athena, GCP, BigQuery Experience of using git for version control (e.g. BitBucket) Experience with Jupyter Notebooks Competencies Enquiring/questioning mind Ability to pick up new tools and keen to embrace new technologies and the ability to overcome the challenges more »
impactful data-driven solutions. Proficiency in programming languages such as Python, R, or SQL, and experience with data manipulation and analysis tools such as Jupyter Notebook. Experience using data science platforms like Anaconda and Amazon SageMaker. Experience with stand-alone environments and open-source ML tools such as TensorFlow. more »
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. Proficiency in Data Science and Machine Learning services from Cloud providers (AWS, Azure, GCP). Python and Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. Python libraries for data management, statistical analysis, machine learning, and visualisation. Machine learning frameworks such as TensorFlow more »
Cheltenham, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. • Proficiency in Data Science and Machine Learning services from Cloud providers (AWS, Azure, GCP). • Python and Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. • Python libraries for data management, statistical analysis, machine learning, and visualisation. • Machine learning frameworks such as TensorFlow more »
S3 Cloud Environments/Infra: AWS (required), [AWS Lambda, Terraform] (nice to have) Data Platforms: Creating data pipelines within Databricks or equivalent such as Jupyter Notebook, PowerBI (nice to have) Understanding of Enterprise Models for Engineering: microservices, APIs, AWS services to support models (SQS, SNS, etc.) Preferred Qualifications: Master's more »
this is a route you see for yourself 📈 They're looking for you to: Work closely with the ML data scientists; look at their Jupyter notebooks/Python code and set up ML infra to process data, train models, and deploy them in a production environment Be senior enough to more »
sport (rules, terminology, insight). Proficiency in Python. Experience using relational databases and SQL. Familiarity with data manipulation and analysis libraries (e.g., pandas, numpy, jupyter, scikit-learn). Knowledge of machine learning and statistical methods (e.g. linear/logistic regression, decision trees, random forest, unsupervised methods) is preferred. Ability to more »
Redshift, AWS S3 Environments/Infra: AWS (required), [AWS Lambda, Terraform] (nice to have) Platforms: Creating data pipelines within Databricks or equivalent such as Jupyter Notebook, PowerBI (nice to have) Understanding of Enterprise Models for Engineering: microservices, APIs, AWS services to support models (SQS, SNS, etc.) Please note, the role is more »
Python programming but also operate with modern tech such as Snowflake, Airflow, DBT, Kubernetes/Docker and cubeJS, whilst also using Tableau CRM and Jupyter notebooks for gaining data insights. If you think there’s a better or newer tool, you’re free to use that too. The key thing more »
algorithms with frameworks such as Tensorflow/Pytorch Proficiency in Python with associated data processing/machine learning toolkits (Numpy, Scipy, Tensorflow/PyTorch, Jupyter notebooks, etc) Able to collaborate closely with internal and external teams and build trust Strong technical acumen and passion for learning Coordination of people and more »
a focus on Mid to High Frequency equities Strong Python programming experience (KDB+/Q would be a bonus) Good knowledge and understanding of Jupyter, Pandas, NumPy, Sklearn Demonstrated knowledge and understanding of mathematical modelling, statistical analysis and probability theory. Experienced in conducting alpha research Demonstrated success in building uncorrelated more »