and evaluate highly innovative models for Natural Language Processing (NLP), Large Language Model (LLM), or Computer Vision projects. Use SQL to query and analyze the data. Use Python, Jupyter notebooks, and PyTorch to train/test/deploy ML models. Use machine learning and analytical techniques to create scalable solutions for business problems. Research and implement novel machine learning …
…state-of-the-art research areas (e.g., NLP, Transfer Learning, etc.) and modern Deep Learning algorithms (e.g., BERT, LSTM, etc.) Solid knowledge of SQL and Python's ecosystem for data analysis (Jupyter, Pandas, scikit-learn, Matplotlib, etc.) Understanding of model evaluation and data pre-processing techniques, such as standardisation, normalisation, and handling missing data Solid understanding of summary, robust, and nonparametric statistics; hypothesis …
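The pre-processing techniques named above (standardisation, normalisation, missing-data handling) can be sketched in plain Python. This is an illustrative stdlib-only version; in practice these roles would use pandas' fillna and scikit-learn's StandardScaler/MinMaxScaler.

```python
import math

def impute_mean(xs):
    """Replace missing values (None) with the mean of the observed values."""
    observed = [x for x in xs if x is not None]
    mean = sum(observed) / len(observed)
    return [mean if x is None else x for x in xs]

def standardise(xs):
    """Z-score standardisation: zero mean, unit variance."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return [(x - mean) / math.sqrt(var) for x in xs]

def normalise(xs):
    """Min-max normalisation to the [0, 1] range."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

data = [10.0, None, 14.0, 12.0]        # toy column with one missing value
filled = impute_mean(data)             # [10.0, 12.0, 14.0, 12.0]
scaled = normalise(filled)             # [0.0, 0.5, 1.0, 0.5]
```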
and the Hadoop Ecosystem Edge technologies e.g. NGINX, HAProxy etc. Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would be desirable for Data DevOps Engineer: JupyterHub awareness MinIO or similar S3 storage technology Trino/Presto RabbitMQ or other common queue technology e.g. ActiveMQ NiFi Rego Familiarity with code development, shell-scripting in Python, Bash …
and other Qualtrics products Acquire data from customers (usually SFTP or cloud storage APIs) Validate data with exceptional detail orientation (including audio data) Perform data transformations (using Python and Jupyter notebooks) Load the data via APIs or pre-built Discover connectors Advise our Sales Engineers and customers as needed on the data, integrations, architecture, best practices, etc. Build new AWS …
design and implement data engineering and AI/ML infrastructure. Things we're looking for: Proficiency in data analysis, insights generation and using cloud-hosted tools (e.g., BigQuery, Metabase, Jupyter). Strong Python and SQL skills, with experience in data abstractions, pipeline management and integrating machine learning solutions. Adaptability to evolving priorities and a proactive approach to solving impactful problems …
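The SQL-plus-Python workflow these roles describe can be sketched with Python's built-in sqlite3 standing in for a cloud warehouse such as BigQuery; the table and column names below are invented for illustration.

```python
import sqlite3

# In-memory SQLite as a stand-in for a warehouse; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# Aggregate spend per user - the kind of query a BI tool chart sits on.
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
# rows == [(1, 15.0), (2, 7.5)]
```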
include: 3+ years industry experience in a Data Science role and a strong academic background Python Data Science Stack: Advanced proficiency in Python, including pandas, NumPy, scikit-learn, and Jupyter Notebooks. Statistical & ML Modelling: Strong foundation in statistical analysis and proven experience applying a range of machine learning techniques to solve business problems (e.g., regression, classification, clustering, time-series …
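Of the modelling techniques this listing names (regression, classification, clustering), simple linear regression is the easiest to show end to end. This stdlib-only sketch fits the closed-form least-squares line; in practice scikit-learn's LinearRegression does this and much more.

```python
def fit_ols(xs, ys):
    """Closed-form simple linear regression: y ~ slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (
        sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        / sum((x - mx) ** 2 for x in xs)
    )
    return slope, my - slope * mx

# Perfectly linear toy data: y = 2x + 1
slope, intercept = fit_ols([1, 2, 3, 4], [3, 5, 7, 9])
# slope == 2.0, intercept == 1.0
```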
City of London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
DevOps Methodologies: experience of working on Agile projects Good understanding of SOA/microservices-based architectures Good understanding of OOP, SOLID principles and software design patterns Knowledge of Python (Jupyter notebooks) Benefits offered Bonus, Pension (9% non-contributory plus additional matched contributions), 4 x Life Assurance, Group Income Protection, Season Ticket Loan, GAYE, BUPA Private Medical, Private GP, Travel Insurance …
effectively and confidently Build great relationships with Data Science, Technology, Finance, Collections, Ops and other stakeholders What you'll need Excellent SQL skills Python data science stack (pandas, NumPy, Jupyter notebooks, Plotly/matplotlib, etc.) A drive to solve problems using data Experience in a management role What would be a bonus: Familiarity with Git Data visualization tool (Tableau, Looker …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would be desirable: JupyterHub awareness RabbitMQ or other common queue technology e.g. ActiveMQ NiFi Rego Familiarity with code development, shell-scripting in Python, Bash etc. To apply for this DV Cleared DevOps Engineer …
London, England, United Kingdom Hybrid / WFH Options
Xcede
building machine learning models for marketing use cases Advanced skills in segmentation, recommendation, campaign optimisation, forecasting, etc. Proficiency in Python, SQL, Bash, and Git. Familiarity with tools like Pandas, Jupyter notebooks, PyTorch. Experience with advanced techniques including CausalAI, NLP, RNNs, GraphAI, GenAI, and Computer Vision. Solid understanding of experimentation methods including A/B testing. Strong communication skills and ability …
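The A/B testing mentioned above usually comes down to a two-proportion z-test. A stdlib sketch with invented conversion counts; in production, statsmodels' proportions_ztest would typically be used instead.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic for an A/B experiment (pooled standard error)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical campaign: 200/1000 conversions in control, 260/1000 in variant.
z = two_proportion_z(200, 1000, 260, 1000)
# |z| > 1.96 means the lift is significant at the 5% level (two-sided)
```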
to generate analytics that enhance client service, partnering with traders, salespeople, and strategists across asset classes to promote platform adoption and deliver relevant, timely content. Key technologies include Python, Jupyter, Pandas, Trino, and SQL. RESPONSIBILITIES AND QUALIFICATIONS: Design and implement programmatic solutions to meet client needs Proficient in scripting languages such as Python Knowledge of data science and machine learning …
service. You will work closely with Traders, Salespeople, and Strats across asset classes to produce insights that promote platform adoption and deliver relevant, timely content. Technologies used include Python, Jupyter, Pandas, Trino, and SQL. RESPONSIBILITIES AND QUALIFICATIONS Passion for designing and implementing programmatic solutions to client needs Excellent programming skills in languages like Python Knowledge and experience in data science …
a more effective platform; Open to traveling to Octopus offices across Europe and the US. Our Data Stack: SQL-based pipelines built with dbt on Databricks Analysis via Python Jupyter notebooks PySpark in Databricks workflows for heavy lifting Streamlit and Python for dashboarding Airflow DAGs with Python for ETL running on Kubernetes and Docker Django for custom app/database …
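The Airflow DAGs in a stack like this boil down to running tasks in dependency order. This toy sketch with Python's stdlib graphlib shows the idea; the extract/transform/load tasks are made up, and real DAGs use airflow.DAG and operators rather than plain functions.

```python
from graphlib import TopologicalSorter

results = {}

def extract():
    # Stand-in for pulling raw data from a source system.
    results["raw"] = [1, 2, 3]

def transform():
    # Stand-in for a cleaning/enrichment step.
    results["clean"] = [x * 10 for x in results["raw"]]

def load():
    # Stand-in for writing to a warehouse table.
    results["loaded"] = sum(results["clean"])

tasks = {"extract": extract, "transform": transform, "load": load}
deps = {"transform": {"extract"}, "load": {"transform"}}  # task -> upstream

# Run tasks in topological (dependency) order, as a scheduler would.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()
# results["loaded"] == 60
```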
large-scale analytics platform. Proficiency in Python or R, including libraries like Pandas. Experience working with BI tools such as Looker, Tableau, Power BI, or Mode. Confidence working in Jupyter or similar notebook environments. A solid grasp of ETL processes and associated best practices. Familiarity with Git, Agile workflows, and DevOps tooling is advantageous. Exposure to customer/product analytics …
related or quantitative discipline, you will have insurance P&C experience. You will also be proficient in various coding languages (e.g. R, Python) and development environments (e.g. RStudio, Jupyter, VS Code). Alongside this, you will be experienced in data visualization and communication around this to present insights to a non-technical audience. In this key role, you will …
Strong problem-solving and analytical skills Experience in support roles, incident triage, and handling (SLAs) Linux system administration basics, Bash scripting, environment variables Experience with browser-based IDEs like Jupyter Notebooks Familiarity with Agile methodologies (SAFe, Scrum, Jira) Languages and Frameworks: JSON YAML Python (advanced proficiency, Pydantic bonus) SQL PySpark Delta Lake Bash Git Markdown Scala (bonus) Azure SQL Server …
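Where this listing pairs JSON with Pydantic as a bonus skill, the underlying idea is parse-then-validate. A hand-rolled stdlib sketch with invented field names and statuses; Pydantic's BaseModel replaces this boilerplate in practice.

```python
import json
from dataclasses import dataclass

# Hypothetical payload shape; Pydantic would declare this as a BaseModel.
@dataclass
class Job:
    id: int
    status: str

def parse_job(raw: str) -> Job:
    """Parse a JSON string and validate it before constructing the record."""
    payload = json.loads(raw)
    if not isinstance(payload.get("id"), int):
        raise ValueError("id must be an integer")
    if payload.get("status") not in {"queued", "running", "done"}:
        raise ValueError("unknown status")
    return Job(id=payload["id"], status=payload["status"])

job = parse_job('{"id": 7, "status": "done"}')
# job == Job(id=7, status="done")
```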
on solutions. Requirements 1-4 years of experience in a quantitative, analytics, or developer role within a financial institution or trading environment. Strong proficiency in Python (e.g., Pandas, NumPy, Jupyter) and experience building data pipelines, analytical tools, or dashboards. SQL experience is a plus. Proficiency in Excel and data visualization platforms such as Power BI, Tableau, or Plotly/…
Harrogate, Yorkshire, United Kingdom Hybrid / WFH Options
Northrop Grumman Corp. (AU)
/or Perl Experience with Atlassian tool suite (e.g., Confluence, JIRA, etc.) Experience working in a geographically diverse team and matrix organization Any experience of Flask Web Framework or Jupyter Notebooks is desirable. Security clearance: You must be able to gain and maintain the highest level of UK Government security clearance. Our requirement team is on hand to answer any …
of between £30,000 and £40,000 per annum depending upon experience. Key Skills/Experience: Programming skills in R or Python including use of notebooks (R Markdown/Jupyter) Experience in data mining, working with large datasets such as DNA- and RNA-sequencing, epigenomic, proteomic and protein-protein interaction data Experience in statistical analysis and handling of large data …
and supporting the business with both regular and ad hoc data deliverables 🛠 Tech you’ll work with: SQL Server (SSIS, SSRS, SSAS) Python AWS stack – Glue, Lambda, S3, EC2, Jupyter Power BI or Tableau (bonus) Excel (PowerPivot, VBA, lookups, advanced formulas) 🌱 You’ll also: Collaborate closely with our Data Engineers and Product Owner Own your solutions end-to-end, from …