London, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions.
• Proficiency in Data Science and Machine Learning services from Cloud providers (AWS, Azure, GCP).
• Python and Jupyter ecosystem (Lab, Notebook) and related libraries and tooling.
• Python libraries for data management, statistical analysis, machine learning, and visualisation.
• Machine learning frameworks such as TensorFlow more »
Manchester, England, United Kingdom Hybrid / WFH Options
Anson McCade
Python programming but also operate with modern tech such as Snowflake, Airflow, DBT, Kubernetes/Docker and cubeJS, whilst also using Tableau CRM and Jupyter notebooks for gaining data insights. If you think there’s a better or newer tool, you’re free to use that too. The key thing is more »
in order to fully understand the data; programming in Python or Ruby, utilizing AWS S3, MongoDB, PostgreSQL, AWS Redshift or similar database technologies; using Jupyter notebooks and one or more statistical visualization or graphing toolkits such as Excel, Qlik Sense or Tableau. Technical Blog Posts Read more about what our more »
handling large data sets to create compelling reports for performance data insights with visualisations; DAX and dashboard design experience; knowledge in Excel, Python, Jupyter Notebooks + AWS; skills in SQL (Visual Basic); knowledge of main IT platforms (AWS, SAP, Microsoft platforms, etc.); ability to work under own initiative; written and verbal communication more »
skills with SQL, working with large and complex data sets to extract insights and identify trends Advanced programming skills with Python, including experience with Jupyter notebooks, PyTest, Pandas, SciKit, PyTorch, type-checking, functional programming, CI/CD, and Git A broad background in machine learning for customer and marketing purposes more »
South East London, England, United Kingdom Hybrid / WFH Options
Climate Policy Radar
will be working in a tech team using a diverse tech stack including: Backend: Python, FastAPI, PostgreSQL, Vespa, SQLAlchemy, Flask. Frontend: React, Next.js. Data Science: Python, Jupyter, PyTorch, Pandas, Spacy, Huggingface, Numpy, Streamlit, Weights & Biases. Infra: Pulumi, Docker, AWS AppRunner, Step Functions, Grafana cloud monitoring, Prefect. Who you are Must haves: Experience using Python more »
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
Cheltenham, England, United Kingdom Hybrid / WFH Options
Anson McCade
Greater London, England, United Kingdom Hybrid / WFH Options
Annalect
data (e.g. cookie logs) and cloud technologies (AWS, Redshift, S3, Athena, GCP, BigQuery); experience using git for version control (e.g. BitBucket); experience with Jupyter Notebooks. Competencies: enquiring/questioning mind; ability to pick up new tools; keen to embrace new technologies and the ability to overcome the challenges more »
cycle of high-impact projects Assist in scoping and staging projects to include detailed milestones and delivery schedules. Use SQL and/or Python (Jupyter Notebooks) to prepare data, perform exploratory data analysis, and evaluate different modeling approaches. Engage in problem-solving and fault-finding, addressing issues in the data or approaches more »
Redshift, AWS S3 Environments/Infra: AWS (required), [AWS Lambda, Terraform] (nice to have) Platforms: Creating data pipelines within Databricks or equivalent such as Jupyter Notebook, PowerBI (nice to have). Knowledge of Enterprise Models for Engineering: microservices, APIs, AWS services to support models (SQS, SNS, etc.) Please note, the role is more »
algorithms with frameworks such as TensorFlow/PyTorch Proficiency in Python with associated data processing/machine learning toolkits (NumPy, SciPy, TensorFlow/PyTorch, Jupyter notebooks, etc.) Able to collaborate closely with internal and external teams and build trust Strong technical acumen and passion for learning Coordination of people and more »
a focus on Mid to High Frequency equities Strong Python programming experience (KDB+/Q would be a bonus) Good knowledge and understanding of Jupyter, Pandas, Numpy, Sklearn Demonstrated knowledge and understanding of mathematical modelling, statistical analysis and probability theory. Experienced in conducting alpha research Demonstrated success in building uncorrelated more »
and Digital and TV attribution. Excellent data visualization and storytelling capabilities. Proficiency in coding, preferably in R or Python; experience with R Studio or Jupyter notebooks is a plus. Solid proficiency in Excel and PowerPoint. more »
This is a new position for a Senior Data Scientist with a global, data-driven company with cutting-edge technology that leverages data as a true market differentiator. The focus of this role is to deliver data science more »
technologies such as: Apache Spark and the Hadoop ecosystem; edge technologies, e.g. NGINX, HAProxy, etc.; excellent knowledge of YAML or similar languages. Desirable Requirements: JupyterHub awareness; MinIO or similar S3 storage technology; Trino/Presto; RabbitMQ or other common queue technology, e.g. ActiveMQ; NiFi; Rego; familiarity with code development more »