Comfortable working with imperfect data, ambiguity, and evolving priorities. Bonus: experience with dbt, cloud data warehouses (e.g. BigQuery), or automated experimentation platforms. Technology: Python (incl. pandas, statsmodels, scikit-learn), Jupyter, dbt, SQL (BigQuery, PostgreSQL), Tableau or similar BI tools, GitHub, GCP, Docker (optional but useful) How we expect you to work: Collaboration: We work in cross-functional, autonomous squads where …
on our data, so you will need to understand how to develop your own models • Strong programming skills and experience working with Python, scikit-learn, SciPy, NumPy, pandas and Jupyter Notebooks are desirable. Experience with object-oriented programming is beneficial • Publications at top conferences, such as NeurIPS, ICML or ICLR, are highly desirable Why should you apply? • Highly competitive compensation …
effectively and confidently Build great relationships with Data Science, Technology, Finance, Collections, Ops and other stakeholders What you'll need Excellent SQL skills Python data science stack (pandas, NumPy, Jupyter notebooks, Plotly/matplotlib, etc.) A drive to solve problems using data Experience in a management role What would be a bonus: Familiarity with Git A data visualization tool (Tableau, Looker …
and experience in GA4, Google Search Console, Google Tag Manager, Looker Studio, Google Cloud Console (BigQuery), Google Apps Script Strong working knowledge of HTML, basic JavaScript, Python and Jupyter Notebooks as they relate to technical SEO analysis Proficiency in SEO audit tools such as SEMrush, Ahrefs, Screaming Frog, DeepCrawl, or similar Proficiency in gathering marketing insights for analysis and reporting …
service. You will work closely with Traders, Salespeople, and Strats across asset classes to produce insights that promote platform adoption and deliver relevant, timely content. Technologies used include Python, Jupyter, Pandas, Trino, and SQL. RESPONSIBILITIES AND QUALIFICATIONS Passion for designing and implementing programmatic solutions to client needs Excellent programming skills in languages like Python Knowledge and experience in data science …
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Kerv Digital for Digital Transformation
certifications • Azure Synapse: Synapse Link, on-demand SQL engine, dedicated SQL pool • Writing unit tests, Git version control • Awareness of reliability patterns in ETL pipelines • Use of Python in Jupyter notebooks for data processing • Azure storage technologies and cost/performance characteristics • Power BI, DAX, dataflows • Techniques and tools for sanitizing data prior to use • Awareness of Kimball modelling …
written communication skills, with the ability to explain data findings to both technical and non-technical audiences. Experience delivering data-driven insights to businesses. Familiarity with tools such as Jupyter Notebook and basic Python for data analysis. Some exposure to cloud platforms (e.g., AWS, GCP, or Azure) and interest in learning cloud-based data tools. Experience in using BI tools …
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
technology across the business. Machine Learning Engineer, key skills: Significant experience working as a Data Scientist/Machine Learning Engineer Solid knowledge of SQL and Python's ecosystem for data analysis (Jupyter, pandas, scikit-learn, Matplotlib) GCP and Vertex AI experience is desirable (developing GCP machine learning services) Time-series forecasting Solid understanding of computer science fundamentals, including data structures, algorithms, data modelling and …
UX development Communicate clearly and manage blockers proactively Your Profile: 1+ years of professional or internship engineering experience Solid foundation in software design patterns and data structures Familiar with Git, Jupyter, the command line, and agile workflows Experience with: React.js, Node.js, Python, CSS, TypeScript, unit testing AI/ML: LangChain, PyTorch, TensorFlow (basic understanding) Bonus: interest in ethical AI, UX design, and …
in HFT or MFT Deep understanding of probability and statistics, data modelling skills, scientific thinking Coding skills, particularly in Python, to implement and test research models using tools like Jupyter Notebooks Excellent communication skills to articulate ideas and engage in productive discussions within the team Humility and a collaborative mindset, willing to challenge and be challenged to refine ideas Compensation …
and dedicated time for your personal development What you'll be working with: • Backend: distributed, event-driven core Java (90% of the codebase), MySQL, Kafka • Data analytics: Python & Jupyter notebooks, Parquet, Docker • Testing: JUnit, JMH, JCStress, Jenkins, Selenium, many in-house tools • OS: Linux (Fedora for development, Rocky in production) The LMAX way is to use the right tool …
/AI/LLM Solutions. A passion for Generative AI, and an understanding of the strengths and weaknesses of generative LLMs and AI technologies. Comfortable working with Python and Jupyter Notebooks. Excellent communication skills - you can toggle seamlessly between presenting to CEOs and getting in the weeds or whiteboarding with technical audiences. High tolerance for ambiguity. You can identify …
workflows, and the data science ecosystem Familiarity with alpha modeling, signal construction, portfolio analytics, and modern investment strategies Exposure to modern programming languages (e.g. Python, R) and environments like Jupyter Notebooks A solution-oriented mindset and the ability to thrive in a fast-moving environment A strong desire to learn, adapt, and continuously grow both commercially and technically Based in …
workflow management tools (Airflow/Argo) Prior experience writing documentation for senior stakeholders; the ability to accurately abstract and summarize technical information is critical Python programming skills: PySpark, pandas, Jupyter Notebooks (3+ years in a professional environment) Prior experience working with Git in a professional environment Ability to work independently in a fast-paced environment; prioritize multiple tasks and projects …
adoption, user engagement, and client satisfaction to measure success and identify areas for improvement Stay Ahead of the Curve: Continuously monitor advancements in our integrated platforms (e.g. Excel, Jupyter and more) to identify new opportunities to drive user engagement and satisfaction You'll need to have: 5+ years of product management experience, demonstrating strong execution capabilities Strong understanding of …
Defender XDR, Entra, Purview). Create scripts, APIs, and orchestrations that reduce manual effort and improve speed and accuracy in security operations. - Tell Stories with Data: Use tools like Jupyter Notebooks, Kusto Query Language (KQL), and Python to query and visualize large-scale security datasets. Translate telemetry into insights and share narratives that influence decision-making across engineering and leadership … engineering, preferably in cloud-native or regulated environments. - Strong programming/scripting skills (Python preferred) with a focus on infrastructure and operations tooling. - Experience working with large datasets in Jupyter Notebooks and building dashboards or reports for security posture and compliance. - Strong communication skills with an ability to convey technical concepts to non-technical stakeholders. - Role is UK-based and …