technologies such as: Apache Spark and the Hadoop ecosystem; edge technologies, e.g. NGINX, HAProxy, etc. Excellent knowledge of YAML or similar languages. Desirable requirements: JupyterHub awareness; MinIO or similar S3 storage technology; Trino/Presto; RabbitMQ or other common queue technology, e.g. ActiveMQ; NiFi; Rego; familiarity with code development …
forecasting, propensity modelling, predictive/prescriptive campaign performance analysis, and optimising marketing spend. Programming proficiency in Python, SQL, Bash, and Excel, including experience with Jupyter notebooks, type-checking, functional programming, PyTest, Pandas, scikit-learn, PyTorch, CI/CD, and Git. Experience with Docker, Kubernetes, and cloud platforms such as AWS, Databricks …
skills with SQL, working with large and complex data sets to extract insights and identify trends. Advanced programming skills with Python, including experience with Jupyter notebooks, PyTest, Pandas, scikit-learn, PyTorch, type-checking, functional programming, CI/CD, and Git. A broad background in machine learning for customer and marketing purposes …
in order to fully understand the data; programming in Python or Ruby, utilizing AWS S3, MongoDB, PostgreSQL, AWS Redshift or similar database technologies; using Jupyter notebooks and one or more statistical visualization or graphing toolkits such as Excel, Qlik Sense or Tableau.
Newcastle Upon Tyne, United Kingdom Hybrid / WFH Options
NHS Counter Fraud Authority
strong proficiency in programming languages for data science, e.g. SQL, R and Python, alongside the ability to use tools and packages such as Alteryx, Jupyter Notebook, R Markdown, TensorFlow, Keras, PyTorch, Apache Spark, etc. Practical proficiency in producing reproducible code and pipelines, including documentation, governance and assurance frameworks, automation and …
working in a tech team using a diverse tech stack including: Backend: Python, FastAPI, PostgreSQL, Vespa, SQLAlchemy, Flask. Frontend: React, Next.js. Data Science: Python, Jupyter, PyTorch, Pandas, spaCy, Hugging Face, NumPy, Streamlit, Weights & Biases. Infra: Pulumi, Docker, AWS App Runner, Step Functions, Grafana Cloud monitoring, Prefect. Who you are: must-haves …
Manchester, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. • Proficiency in Data Science and Machine Learning services from cloud providers (AWS, Azure, GCP). • Python and the Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. • Python libraries for data management, statistical analysis, machine learning, and visualisation. • Machine learning frameworks such as TensorFlow …
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. • Proficiency in Data Science and Machine Learning services from cloud providers (AWS, Azure, GCP). • Python and the Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. • Python libraries for data management, statistical analysis, machine learning, and visualisation. • Machine learning frameworks such as TensorFlow …
Greater London, England, United Kingdom Hybrid / WFH Options
Annalect
data (e.g. cookie logs) and cloud technologies: AWS, Redshift, S3, Athena, GCP, BigQuery. Experience of using Git for version control (e.g. Bitbucket). Experience with Jupyter Notebooks. Competencies: an enquiring/questioning mind; ability to pick up new tools and keenness to embrace new technologies, and the ability to overcome the challenges …
impactful data-driven solutions. Proficiency in programming languages such as Python, R, or SQL, and experience with data manipulation and analysis tools such as Jupyter Notebook. Experience in using data science platforms like Anaconda and Amazon SageMaker. Experience with stand-alone environments and open-source tools for ML such as TensorFlow. …
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. Proficiency in Data Science and Machine Learning services from cloud providers (AWS, Azure, GCP). Python and the Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. Python libraries for data management, statistical analysis, machine learning, and visualisation. Machine learning frameworks such as TensorFlow …
Cheltenham, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. • Proficiency in Data Science and Machine Learning services from cloud providers (AWS, Azure, GCP). • Python and the Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. • Python libraries for data management, statistical analysis, machine learning, and visualisation. • Machine learning frameworks such as TensorFlow …
S3. Cloud Environments/Infra: AWS (required); [AWS Lambda, Terraform] (nice to have). Data Platforms: creating data pipelines within Databricks or an equivalent such as Jupyter Notebook; Power BI (nice to have). Understanding of Enterprise Models for Engineering: microservices, APIs, AWS services to support models (SQS, SNS, etc.). Preferred Qualifications: Master's …
cycle of high-impact projects. Assist in scoping and staging projects to include detailed milestones and delivery schedules. Use SQL and/or Python (Jupyter Notebooks) to prepare data, perform exploratory data analysis, and evaluate different modeling approaches. Engage in problem-solving and fault-finding, addressing issues in the data or approaches …
sport (rules, terminology, insight). Proficiency in Python. Experience using relational databases and SQL. Familiarity with data manipulation and analysis libraries (e.g. pandas, NumPy, Jupyter, scikit-learn). Knowledge of machine learning and statistical methods (e.g. linear/logistic regression, decision trees, random forests, unsupervised methods) is preferred. Ability to …
Redshift, AWS S3. Environments/Infra: AWS (required); [AWS Lambda, Terraform] (nice to have). Platforms: creating data pipelines within Databricks or an equivalent such as Jupyter Notebook; Power BI (nice to have). Understanding of Enterprise Models for Engineering: microservices, APIs, AWS services to support models (SQS, SNS, etc.). Please note, the role is …
Python programming, but also operate with modern tech such as Snowflake, Airflow, dbt, Kubernetes/Docker and Cube.js, whilst also using Tableau CRM and Jupyter notebooks for gaining data insights. If you think there’s a better or newer tool, you’re free to use that too. The key thing …
types. Energy trading lifecycle/activities and teams. Technology & tools: workflow management tools, e.g. JIRA, MS Azure DevOps; analytical tools: Python (including Pandas), SQL, Jupyter Notebooks; MS Office applications. If you are interested and want to learn more, please send your CV for immediate consideration.
algorithms with frameworks such as TensorFlow/PyTorch. Proficiency in Python with associated data processing/machine learning toolkits (NumPy, SciPy, TensorFlow/PyTorch, Jupyter notebooks, etc.). Able to collaborate closely with internal and external teams and build trust. Strong technical acumen and passion for learning. Coordination of people and …
a focus on mid-to-high-frequency equities. Strong Python programming experience (KDB/Q would be a bonus). Good knowledge and understanding of Jupyter, Pandas, NumPy, scikit-learn. Demonstrated knowledge and understanding of mathematical modelling, statistical analysis and probability theory. Experienced in conducting alpha research. Demonstrated success in building uncorrelated …
and Digital and TV attribution. Excellent data visualization and storytelling capabilities. Proficiency in coding, preferably in R or Python; experience with RStudio or Jupyter notebooks is a plus. Solid proficiency in Excel and PowerPoint.
targets for experimental testing. Be familiar with NGS and associated pipelines. Collate and annotate reference sequences across multiple microorganisms. Be confident using Python and JupyterLab notebooks as a working and application development environment, along with Git as a version control system. Requirements include: MSc degree or equivalent in a …