with strategic goals Collaborate with the wider engineering team to ensure integrations follow best practices Partner with Product teams to incorporate customer support data into product development Provide technical guidance to support team members for troubleshooting complex issues Create documentation that bridges technical complexity for non-technical team … members Data & Analytics Build data pipelines that extract meaningful insights from support interactions Develop dashboards and reporting tools that measure support effectiveness Implement A/B testing frameworks for support automation initiatives Use data to identify patterns and opportunities for proactive support Create predictive … Jira and other support/workflow tools Understanding of the Shopify ecosystem and APIs Background in e-commerce or SaaS environments Knowledge of data visualization techniques and tools Experience with distributed systems Please don't hesitate to apply if you miss a few criteria but you believe you …
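For illustration only, here is a minimal sketch of the statistical core such an A/B testing framework might use: a two-proportion z-test comparing ticket resolution rates between a control and an automated-reply variant. The metric and counts are hypothetical, not taken from the listing.

```python
# Two-proportion z-test for a support-automation A/B test (illustrative).
from statistics import NormalDist

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Return the z statistic and two-sided p-value for the difference
    in resolution rates between control (a) and treatment (b)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts: did the automated-reply variant resolve more tickets?
z, p = two_proportion_z_test(successes_a=420, n_a=1000, successes_b=465, n_b=1000)
print(f"z={z:.2f}, p={p:.3f}")  # treatment wins at the 0.05 level if p < 0.05
```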
maintain scalable full-stack web applications using React on the frontend and Python (Django/FastAPI/Flask) on the backend. Collaborate closely with data scientists, ML engineers, and product managers to integrate AI-driven features into the platform. Develop and maintain RESTful APIs and backend services to support … core infrastructure and AI tools. Build and enhance dynamic frontend interfaces that deliver real-time insights and data visualisations. Work with large datasets and real-time data processing systems. Optimise code and system architecture for performance, scalability, and reliability. Troubleshoot, debug, and resolve complex technical challenges … ML models or tools into production applications. Exposure to real-time systems, messaging queues (e.g., Kafka, RabbitMQ), or event-driven architectures. Knowledge of data pipelines, analytics tooling, or large-scale data processing. Interest in cloud-native and serverless architectures. Familiarity with tools such as TensorFlow, PyTorch …
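As a hedged illustration of the Python backend work this role describes, here is a minimal FastAPI endpoint exposing a stand-in model prediction behind a typed REST route. The route, schema, and scoring function are invented for the sketch.

```python
# Minimal FastAPI service exposing a model score (illustrative only).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ScoreRequest(BaseModel):
    text: str

class ScoreResponse(BaseModel):
    score: float

def fake_model_score(text: str) -> float:
    # Placeholder for a real ML model integration.
    return min(len(text) / 100.0, 1.0)

@app.post("/score", response_model=ScoreResponse)
def score(req: ScoreRequest) -> ScoreResponse:
    """Expose a model prediction behind a typed REST endpoint."""
    return ScoreResponse(score=fake_model_score(req.text))

# Run locally with: uvicorn app:app --reload
```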
learning, chain-of-thought prompting) and embeddings to customise LLMs. Integrate AI functionalities into existing platforms and applications. Collaborate with engineers to define data requirements and develop data pipelines. Stay up to date with the latest advancements in AI and machine learning. Troubleshoot and debug AI … LLMs. Desirable but not essential: Experience with fine-tuning LLMs. Contributions to open-source LLM or NLP projects. Experience of working with health data and adhering to regulatory requirements. Familiarity with iPaaS platforms for prototyping. Experience working in healthcare software development. Familiarity with the cost aspects of …
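To make the prompting techniques named above concrete, here is a small, provider-agnostic sketch of few-shot prompt assembly. The example messages and labels are hypothetical, and no particular LLM API is assumed — the built string would be passed to whichever completion endpoint the platform uses.

```python
# Few-shot prompt construction for a toy urgency classifier (illustrative).
FEW_SHOT_EXAMPLES = [
    ("Patient reports mild headache.", "LOW"),
    ("Patient reports chest pain and shortness of breath.", "URGENT"),
]

def build_prompt(message: str) -> str:
    """Build a few-shot classification prompt from labelled examples."""
    lines = ["Classify the urgency of each message as LOW or URGENT.", ""]
    for text, label in FEW_SHOT_EXAMPLES:
        lines.append(f"Message: {text}\nUrgency: {label}\n")
    lines.append(f"Message: {message}\nUrgency:")
    return "\n".join(lines)

print(build_prompt("Patient asks about a repeat prescription."))
```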
re just getting started. Our mission? To empower one million businesses with the financial tools they deserve. We combine cutting-edge technology and data science with genuine human understanding to make finance feel less like a barrier and more like a superpower. Whether it's managing cash flow … to grow and succeed. We operate as a fully autonomous unit, bringing together specialists in back-end and front-end development, product management, data science, and analytics. This diversity of expertise allows us to tackle challenges from every angle, move fast, and deliver solutions that truly work for … business outcomes. Analytics excellence: Develop experiments to test new product concepts, iterating based on customer feedback and performance data. Build custom tools and data pipelines to investigate complex topics and enable the wider team to monitor the performance of products post-launch, using insights to inform future improvements. …
development of our Monolith Platform. This will be a lot of new feature development as we start to roll out new, cutting-edge data science tools and models which allow engineers to model complex physical systems using AI, reducing test times by up to 70%. Given we … new technologies and practices. There's a huge opportunity for cross-team collaboration in this role. You'll speak with DevOps, QA, Product, and Data Science regularly. Your skillset: You have a minimum of 7 years' experience working in Software Engineering. At least 3 years of experience coding in … years of experience coding in React. Team Lead experience. Experience working on Cloud Infrastructure - AWS or Azure. Nice to haves: Experience building data platforms/data pipelines. You've enjoyed being part of a fast-paced and growing Software Engineering company. …
re just getting started. Our mission? To empower one million businesses with the financial tools they deserve. We combine cutting-edge technology and data science with genuine human understanding to make finance feel less like a barrier and more like a superpower. Whether it's managing cash flow … to grow and succeed. We operate as a fully autonomous unit, bringing together specialists in back-end and front-end development, product management, data science, and analytics. This diversity of expertise allows us to tackle challenges from every angle, move fast, and deliver solutions that truly work for … business outcomes. Analytics leadership: Lead experiments to test new product concepts, iterating based on customer feedback and performance data. Build custom tools and data pipelines to investigate complex topics and enable the wider team to monitor the performance of products post-launch, using insights to inform future improvements. …
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Hillarys HR
coverings, operating across 136 entities in over 100 countries. Hunter Douglas is looking for a technical FP&A Lead with a passion for systems, data architecture, and scalable automation. This role goes far beyond classic reporting: it's about building the infrastructure and tools that power modern, insight-led … across the division make smarter, faster, and more informed decisions. You'll lead FP&A in a company that's exploring how big data, automation, and AI can elevate financial planning. Whether it's building predictive models, refining data pipelines, or embedding forecasting logic directly into … creation and provision for the UK division Support integration of AI into forecasting processes Own the design and enhancement of financial systems and data flows. Automate manual reporting and data consolidation using SQL, Python, or relevant tools. Collaborate with Data, Engineering, and Ops teams …
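As a minimal sketch of the reporting automation this role involves, the snippet below consolidates per-entity CSV extracts into a single divisional summary with pandas. The folder layout and column names (month, account, amount) are hypothetical.

```python
# Consolidating per-entity actuals into one divisional table (illustrative).
from pathlib import Path
import pandas as pd

def consolidate_reports(folder: str) -> pd.DataFrame:
    """Load every entity's monthly actuals and stack them into one frame."""
    frames = []
    for path in Path(folder).glob("*.csv"):
        df = pd.read_csv(path)
        df["entity"] = path.stem  # tag rows with the source entity
        frames.append(df)
    combined = pd.concat(frames, ignore_index=True)
    # Aggregate to a divisional view: totals by month and account line.
    return combined.groupby(["month", "account"], as_index=False)["amount"].sum()

summary = consolidate_reports("reports/uk_division")  # hypothetical path
print(summary.head())
```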
experienced and autonomous Game Analyst to drive analytics across our portfolio of games. You'll own critical parts of our analytics stack - from data modeling to dashboarding to performance storytelling - and play a key role in shaping how we understand and communicate game performance. You'll also mentor … ideas and grow in a playful and supportive group. Starting date: ASAP, permanent contract after a 6-month probationary period Reporting to: Head of Data Services Location: 100% remote from the UK, Portugal, or Sweden, with the option for hybrid/in-office work at our Stockholm or Lisbon office. … Please submit your application in English. Responsibilities Own and maintain game analytics dashboards and KPIs in Lightdash/Looker Design and iterate on data models, working closely with engineers and existing data pipelines Develop clear, actionable performance reports for clients and internal teams Advise on best …
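For flavour, here is a toy version of one KPI such dashboards surface: day-1 retention computed from a small sessions table. The column names and data are invented for the sketch.

```python
# Day-1 retention from a toy sessions table (illustrative).
import pandas as pd

sessions = pd.DataFrame({
    "player_id": [1, 1, 2, 3, 3],
    "day":       [0, 1, 0, 0, 2],
})
first_day = sessions.groupby("player_id")["day"].min()
cohort = first_day[first_day == 0].index                 # players who started on day 0
returned = set(sessions.loc[sessions["day"] == 1, "player_id"])
d1_retention = sum(p in returned for p in cohort) / len(cohort)
print(f"D1 retention: {d1_retention:.0%}")               # 33% in this toy data
```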
Barcelona, Luxembourg, Milan, Munich, Leipzig. The Risk Manager, Innovation supports Global Security Risk Management & Resilience by leading initiatives that transform complex risk-related data into actionable insights across Amazon's worldwide operations. This role blends traditional risk management expertise with advanced analytics, utilizing technology and automation to significantly … enhance risk identification, assessment, and response capabilities. Key responsibilities include designing robust data integration systems, implementing predictive risk models, and driving technological innovation to ensure risk insights are timely, accurate, and strategically actionable. Key job responsibilities • Lead strategic innovation projects, including the implementation of global data centralization and automation initiatives, to enhance GRMR's risk, resilience, and business continuity capabilities. • Collaborate on optimizing risk data pipelines, ensuring accurate global and regional risk insights. • Integrate advanced analytics, predictive modeling, AI, and machine learning to enhance risk assessment accuracy, streamline processes, and support real …
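As a loose illustration of the predictive-modelling element, the sketch below fits a simple incident-likelihood model on invented features. It is a generic scikit-learn example, not Amazon's method; the features and data are fabricated for the sketch.

```python
# Toy predictive risk model: incident likelihood from two features (illustrative).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical features, e.g. [prior incidents (scaled), site headcount (scaled)]
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = LogisticRegression().fit(X, y)
# Probability that a new site profile results in an incident.
print(model.predict_proba([[1.2, -0.3]])[0, 1])
```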
Junior–Mid Software Engineer (Data Pipelines) 📍 Onsite – London | Full-time We’re helping one of our partners — a seriously exciting AI company started by a team of top scientists — find a Software Engineer to join their data team. They’re building powerful platforms that help organisations … make smarter decisions, and the data ingestion & pipelines team plays a huge part in that. You’d be working on real infrastructure, solving real problems, and making an actual impact from day one. What you’ll be up to: Building and optimising data pipelines (think ETL … scraping, ingestion) Keeping data accurate, clean, and flowing smoothly Collaborating with engineers, analysts, and data scientists Playing around with modern tools, cloud platforms, and a very smart team You might be a good fit if you: Have 1–3 years of experience writing software (Python, Java …
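Here is a compact sketch of the extract-transform-load loop this team works on, with a hypothetical API endpoint and schema standing in for the real scraping and ingestion sources.

```python
# Toy ETL: pull JSON records, clean them, load into a local table (illustrative).
import sqlite3
import requests

def extract(url: str) -> list[dict]:
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return response.json()

def transform(records: list[dict]) -> list[tuple]:
    # Keep only complete rows and normalise whitespace.
    return [(r["id"], r["name"].strip()) for r in records if r.get("name")]

def load(rows: list[tuple]) -> None:
    with sqlite3.connect("ingest.db") as conn:  # commits on success
        conn.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER, name TEXT)")
        conn.executemany("INSERT INTO items VALUES (?, ?)", rows)

load(transform(extract("https://example.com/api/items")))  # hypothetical endpoint
```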
improvements in S&OP accuracy, automation, and plan optimality - partnering with teams across SCOT. You will be in charge of creating new automated data pipelines and processes, building the single source of truth for all operations teams relying on S&OP data to make business decisions. … Key job responsibilities Designing and implementing complex data models, developing advanced analytics solutions, and creating insightful dashboards and reports. Collaborate closely with cross-functional teams, including operations, finance, and product management, to identify opportunities for supply chain optimization. Translating business requirements into technical specifications, conducting in-depth data analysis, and communicating findings to both technical and non-technical stakeholders. Mentoring junior team members, driving best practices in data engineering and visualization, and contributing to the overall data strategy of the supply chain organization. BASIC QUALIFICATIONS Bachelor's degree Experience owning/driving …
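One plausible reading of the "single source of truth" responsibility is a reconciliation check between raw source rows and the published totals that downstream teams consume. A toy version, with hypothetical table and column names:

```python
# Reconciling published S&OP totals against their source rows (illustrative).
import pandas as pd

source = pd.DataFrame({"site": ["A", "A", "B"], "units": [10, 5, 7]})
sot = pd.DataFrame({"site": ["A", "B"], "units": [15, 7]})  # published totals

check = source.groupby("site", as_index=False)["units"].sum().merge(
    sot, on="site", suffixes=("_source", "_published")
)
mismatches = check[check["units_source"] != check["units_published"]]
assert mismatches.empty, f"S&OP totals drifted from source:\n{mismatches}"
```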
London, St. Ann's, United Kingdom Hybrid / WFH Options
SF Recruitment
A Python Software Engineer with key skills in Python, TypeScript, and an understanding of data pipelines is sought by a market-leading SaaS start-up based in the UK. With recent investment, this Python Software Engineer will work closely with the Head of Engineering on new product functionality aimed … express, angular etc - A product-focused, business-first approach to engineering - Experience working in fast-paced, dynamic start-up environments - Experience of data pipelines/ETL processes …/data platforms/data import processes etc would be a real plus - Cloud-native architecture exposure - AWS, GCP - Solid pipeline automation skills (CI/CD) In return this Python Software Engineer will receive - Base salary of up to £90,000 - Generous equity stake with …
Employment Type: Permanent
Salary: £75000 - £90000/annum Remote & equity & great progression
is growing fast! We're looking for smart, dedicated, driven individuals that want to help the world's most innovative organizations solve complex data management challenges. We envision a future without compromise for our customers, so we're creating a novel approach to data management that … We've raised $375M in capital and are backed by dozens of leading venture capital and strategic investors. Our flagship product, the WEKA Data Platform, is helping hundreds of the world's leading research organizations and enterprises, including 12 of the Fortune 50, to achieve first-to-market … "Everyone at WEKA works so hard, and we come together to do some really amazing things. It's a rush to be …
Quant Researchers. Your challenges will be varied, and will involve implementing new trading strategies, building new research frameworks and quant libraries, prototyping new data feeds, developing new portfolio construction techniques, or building risk analysis tools. The Team Quant Developers at Man Group are all part of our … pandas, scikit-learn to name a few of the open-source libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous … review, unit testing, refactoring and related approaches Strong knowledge of Python Proficient on Linux platforms with knowledge of various scripting languages Experience of data analysis techniques along with relevant libraries e.g. NumPy/SciPy/Pandas Relevant mathematical knowledge e.g. statistics, asset pricing theory, optimisation algorithms Advantageous: Experience …
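Since the stack above names Airflow for workflow management, here is a minimal DAG sketch in the Airflow 2.4+ style. The DAG id, task, and schedule are hypothetical, not the firm's actual pipelines.

```python
# Minimal Airflow DAG: one daily Python task (illustrative).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def refresh_signals():
    # Placeholder for pulling a data feed and recomputing research signals.
    print("signals refreshed")

with DAG(
    dag_id="daily_signal_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="refresh_signals", python_callable=refresh_signals)
```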
Linux/Unix CLI, Git, and testing. Experience with a major cloud platform (preferably AWS or Azure). Knowledge of data governance and security best practices. Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation It would also be great if you have … Basic familiarity with utilizing GPUs in the cloud environment Familiarity with monitoring tools and techniques for ML models and data pipelines Experience building APIs Demonstrable experience with modern CI/CD pipelines Knowledge of good DevOps and Data Engineering practices As a technology consultancy, we look …
drive improvements in S&OP accuracy, automation, and plan optimality - partnering with teams across SCOT. You will be responsible for creating new automated data pipelines and processes, building the single source of truth for all operations teams relying on S&OP data to make business decisions. … Key job responsibilities Designing and implementing complex data models, developing advanced analytics solutions, and creating insightful dashboards and reports. Collaborate closely with cross-functional teams, including operations, finance, and product management, to identify opportunities for supply chain optimization. Translating business requirements into technical specifications, conducting in-depth data analysis, and communicating findings to both technical and non-technical stakeholders. Mentoring junior team members, driving best practices in data engineering and visualization, and contributing to the overall data strategy of the supply chain organization. BASIC QUALIFICATIONS Bachelor's degree Experience owning/driving …
GitLab for building and provisioning CI/CD pipelines for Kubernetes deployments. Qualifications Experienced AWS DevOps engineer Proficient with AWS services like S3, Data Lakes, Glue, IAM, Data Pipeline, Lambda, QuickSight, Redshift, Fargate Deep understanding of AWS services, especially EKS and data-related services …
building real relationships, not wrestling with admin. Our platform spans planning, promotion and post-show follow-up, backed by an AWS serverless stack, data pipelines in Python, and cutting-edge LLM tooling. Why this role matters We are scaling fast, and the technical foundation you build today will … AWS serverless stack (SST.dev, Lambda, EventBridge, DynamoDB/PostgreSQL) and Remix/React front-end. 20% Architect systems that can ingest millions of data points (social, streaming, ticketing) in near-real-time and surface actionable insights for promoters. 15% Integrate AI workflows (LangGraph, OpenAI/Anthropic/Gemini … quality through PR reviews, automated testing (Vitest/Playwright), and IaC best practices (CDK, SST). 10% Mentor teammates and collaborate with Product, Data, and Artist Relations to translate business goals into resilient software. 5% Champion DevEx, proposing improvements to CI/CD, observability, and performance. You'll …
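A hedged sketch of one piece of the serverless stack described: a Lambda handler consuming an EventBridge event and persisting it to DynamoDB. The table name and event fields are invented for the sketch.

```python
# AWS Lambda handler: EventBridge event -> DynamoDB item (illustrative).
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("artist_metrics")  # hypothetical table name

def handler(event, context):
    """Persist a streaming/social metric event delivered via EventBridge."""
    detail = event.get("detail", {})  # hypothetical event shape
    table.put_item(Item={
        "artist_id": detail["artist_id"],
        "metric": detail["metric"],
        "value": detail["value"],
    })
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```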
Role Overview: As a Machine Learning Engineer at DeGould you will be responsible for building and maintaining our labelling, training and production inference data pipelines to produce high-quality datasets, models and services to power our automated vehicle inspection product. Following MLOps and DevOps best practices you will … AWS, GCP and on Edge to process photos from DeGould's ultra-high-resolution imaging photo booths. The objective is to convert this data into useful information that creates value for customers. DeGould is an exciting, multi-award-winning company in the software and AI sector. The company … or all of the following: Developing and championing robust MLOps frameworks and policies. Training and maintaining performant vehicle segmentation models. Labelling tasks and data quality. Designing and implementing reporting dashboards. Developing novel approaches from academic and industry research. Production model deployment and maintenance. Skills: Technical expertise in AI …
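Purely as an illustration of the production-inference side, here is a batch segmentation step with a stand-in PyTorch model. The architecture, class count, and input sizes are invented, not DeGould's; a real deployment would load exported weights rather than this placeholder layer.

```python
# Batch inference for a toy segmentation model (illustrative).
import torch
import torch.nn as nn

# Stand-in for a trained segmentation network; 5 classes is hypothetical.
model = nn.Conv2d(in_channels=3, out_channels=5, kernel_size=1)
model.eval()

@torch.no_grad()
def segment(batch: torch.Tensor) -> torch.Tensor:
    """Return a per-pixel class mask for a batch of normalised images."""
    logits = model(batch)          # (N, C, H, W)
    return logits.argmax(dim=1)    # (N, H, W) class indices

masks = segment(torch.rand(2, 3, 256, 256))
print(masks.shape)  # torch.Size([2, 256, 256])
```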
platform. Key job responsibilities Projects: Develop analytical models to assess problems, solutions and impact on business Understand a business problem and the available data, and identify which statistical techniques can be applied for the solution Responsible for giving insights to management for strategic planning Reporting: Own the design … key drivers of our business Partner with operations/business teams to consult, develop and implement KPIs, automated reporting/process solutions and data infrastructure improvements to meet business needs Enable effective decision making by retrieving and aggregating data from multiple sources and compiling it into a … digestible and actionable format Prepare and deliver business reviews to the senior management team regarding progress and roadblocks Data Management: Managing data pipelines and warehouses. Interface with other technology teams to extract, transform, and load data from a wide variety of data …
make a difference, apply now! What you'll be doing Working within a small, collaborative team, you'll play a key role in creating geospatial data solutions that help other businesses to understand and improve their impact around sustainability and more. In this role, you'll be working on building and … maintaining data pipelines to load and transform geospatial datasets using PostgreSQL/PostGIS. You'll also manage databases and queries to keep everything running smoothly and efficiently. You'll also be responsible for sourcing and cleaning new datasets, ensuring they're accurate and ready to use. Day to day, you'll be … using various tools to carry out geospatial analysis and create useful data visualisations. You'll also write Python scripts to automate data processing, quality checks, and data ingestion, making things faster and more efficient! You'll collaborate closely with your team and work closely with developers …
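To ground the PostGIS work described, here is a small sketch running a geospatial proximity query from Python with psycopg2. The connection string, table, column names, and coordinates are all hypothetical.

```python
# Counting sites within 1 km of a point using PostGIS (illustrative).
import psycopg2

conn = psycopg2.connect("dbname=geo user=etl")  # hypothetical DSN
with conn, conn.cursor() as cur:
    # Cast to geography so ST_DWithin's distance is in metres.
    cur.execute(
        """
        SELECT count(*)
        FROM sites
        WHERE ST_DWithin(
            geom::geography,
            ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
            1000
        )
        """,
        (-1.15, 52.95),  # lon/lat near Nottingham, purely illustrative
    )
    print(cur.fetchone()[0])
conn.close()
```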