Plumstead, Greater London, UK Hybrid / WFH Options
FalconSmartIT
Athena, QuickSight, and SageMaker, among others. Proficiency in Snowflake's architecture and familiarity with its features, including Snowpipe, Streams, and Tasks for real-time data processing. Additionally, familiarity with Streamlit and knowledge of DevOps practices for CI/CD pipelines. Should hold relevant technical certifications from AWS and Snowflake. The ability to conduct data landscape assessments and provide solutions for …
Search & Retrieval: Implement retrieval-augmented workflows: ingest documents, index embeddings (Pinecone, FAISS, Weaviate), and build similarity search features. Rapid Prototyping: Create interactive AI demos and proofs-of-concept with Streamlit, Gradio, or Next.js for stakeholder feedback; MLOps & Deployment: Implement CI/CD pipelines (e.g., GitLab Actions, Apache Airflow), experiment tracking (MLflow), and model monitoring for reliable production workflows; Cross-Functional … in Python; Deep expertise in prompt engineering and tooling like LangChain or LlamaIndex; Proficiency with vector databases (Pinecone, FAISS, Weaviate) and document embedding pipelines; Proven rapid-prototyping skills using Streamlit or equivalent frameworks for UI demos. Familiarity with containerization (Docker) and at least one orchestration/deployment platform; Excellent communication and ability to frame AI solutions in business terms. Nice …
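The retrieval-augmented workflow this listing describes (ingest documents, index embeddings, run similarity search) can be sketched in a few lines. The snippet below is a minimal illustration only, not taken from the employer; the embedding model, documents, and query are placeholder assumptions.

```python
# Minimal retrieval sketch: embed a handful of documents, index them with FAISS,
# and return the most similar ones for a query. Assumes the faiss-cpu and
# sentence-transformers packages; model choice and data are illustrative.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Streamlit lets you build data apps in pure Python.",
    "FAISS performs efficient similarity search over dense vectors.",
    "LangChain and LlamaIndex help orchestrate LLM pipelines.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
doc_vectors = model.encode(documents, normalize_embeddings=True)

index = faiss.IndexFlatIP(doc_vectors.shape[1])  # inner product = cosine on unit vectors
index.add(np.asarray(doc_vectors, dtype="float32"))

query_vector = model.encode(["How can I quickly prototype a data app?"], normalize_embeddings=True)
scores, ids = index.search(np.asarray(query_vector, dtype="float32"), 2)

for score, doc_id in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {documents[doc_id]}")
```

In a fuller RAG pipeline the retrieved chunks would be passed to an LLM as context, and a hosted vector database such as Pinecone or Weaviate would typically replace the in-memory index.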
Bedrock, Google Vertex AI, and Azure AI Studio.
• Integrate semantic search strategies and embeddings to improve LLM relevance and contextual understanding.
• Build demo-ready user interfaces with tools like Streamlit, Gradio, or React.
• Develop and maintain robust APIs to connect AI agents to internal and third-party services.
• Leverage SQL to query and integrate structured data into reasoning and knowledge …
clients to understand their needs and deliver production-ready, secure AI applications
Build and maintain CI/CD pipelines, deploying across AWS, GCP, or Azure
Rapidly prototype tools using Streamlit, LangChain, CrewAI, and dev tools like Cursor
Integrate third-party AI model APIs (e.g. OpenAI, Anthropic) and work with vector databases
Translate business problems into scalable data pipelines and deployable … data pipelines, and structured code for production)
Experience building and applying LLM-powered solutions (e.g. RAG, agentic frameworks)
Familiarity with cloud platforms like AWS, GCP, or Azure
Experience using Streamlit, vector databases, and integrating with model APIs
Apply today or send your CV to kat@thedatagals.co.uk
systems during both development and long-term support
Practical experience calling LLMs via APIs and dealing with varied responses
Bonus skills
Private equity or financial services experience
Azure, Postgres, Streamlit or equivalents
Fact extraction, Q&A and RAG on documents
Comfort working in small teams with fast iteration cycles
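As a rough illustration of "calling LLMs via APIs and dealing with varied responses", the hedged sketch below retries transient API errors and falls back to returning raw text when the model does not produce the requested JSON. It uses the OpenAI Python SDK; the model name, prompt, and helper function are placeholders, not taken from the listing.

```python
# Illustrative sketch: call an LLM API for fact extraction, retry on API errors,
# and cope with responses that are not valid JSON. Model name and prompt are
# placeholder assumptions; OPENAI_API_KEY is read from the environment.
import json
import time
from openai import OpenAI, APIError

client = OpenAI()

def extract_company_facts(text: str, retries: int = 3) -> dict:
    prompt = (
        "Return a JSON object with keys 'company_name' and 'fiscal_year_end' "
        "extracted from the following text:\n\n" + text
    )
    for attempt in range(retries):
        try:
            response = client.chat.completions.create(
                model="gpt-4o-mini",
                messages=[{"role": "user", "content": prompt}],
            )
            raw = response.choices[0].message.content or ""
            try:
                return json.loads(raw)          # happy path: model returned valid JSON
            except json.JSONDecodeError:
                return {"raw_answer": raw}      # varied response: keep the text for review
        except APIError:
            time.sleep(2 ** attempt)            # simple exponential backoff, then retry
    return {}

print(extract_company_facts("Acme Ltd's fiscal year ends on 31 March."))
```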
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Behavioural Insights Team
concepts such as Infrastructure as Code, CI/CD, or MLOps. Knowledge of Python-based data pipeline frameworks (e.g. Airflow, Metaflow) and Python-based product development packages (e.g. FastAPI, Streamlit). An interest in developing or improving full-stack product development skills. Who are we and what can we offer you? The Behavioural Insights Team is the world leader in …
trends using real-world metrics like JIRA and BlueOptima, and influence how we evolve as a high-performing engineering organisation. What You'll Do: Build dashboards and tools in Streamlit, Snowflake, and Python to measure and improve developer performance. Design and implement metrics frameworks using JIRA, BlueOptima, Git data, and more. Analyse engineering workflows to identify blockers, inefficiencies, and … and contribute to internal GenAI tools that help managers identify and address productivity bottlenecks. Support reporting to C-Suite leadership on developer productivity, workforce transformation, and team effectiveness. Skills: Streamlit, Snowflake, Python, AI, Jira, SQL, BlueOptima, DORA. Job Title: Developer Productivity & Engineering Analytics Engineer. Location: London, UK. Rate/Salary: 500.00 - 600.00 GBP Daily. Job Type: Contract. Trading as TEKsystems.
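For context on the kind of dashboard this role describes, here is a minimal, hypothetical Streamlit sketch that reads engineering metrics from Snowflake and charts them per team. The connection, table, and column names are invented for illustration, and a [connections.snowflake] entry in .streamlit/secrets.toml is assumed.

```python
# Hypothetical sketch of a developer-productivity dashboard: Streamlit querying
# Snowflake and charting cycle time per team. Table and column names are invented.
import streamlit as st

st.title("Developer Productivity Dashboard")

conn = st.connection("snowflake")           # Streamlit's built-in Snowflake connection
df = conn.query(
    "SELECT team, week, avg_cycle_time_days, prs_merged "
    "FROM analytics.engineering_metrics ORDER BY week",   # hypothetical table
    ttl=3600,                               # cache the result for an hour
)
df.columns = df.columns.str.lower()         # normalise Snowflake's upper-case column names

team = st.selectbox("Team", sorted(df["team"].unique()))
team_df = df[df["team"] == team]

left, right = st.columns(2)
left.metric("Avg cycle time (days)", round(team_df["avg_cycle_time_days"].mean(), 1))
right.metric("PRs merged", int(team_df["prs_merged"].sum()))

st.line_chart(team_df, x="week", y="avg_cycle_time_days")
```

Run with `streamlit run dashboard.py`; in practice the query would target whatever models hold the JIRA, Git, and BlueOptima data.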
Manchester Area, United Kingdom Hybrid / WFH Options
Harnham
/CD pipelines and modern software practices
Background in document extraction and reasoning (LLMs, RAG, agents)
Exposure to multimodal data (text, images, structured)
MLOps, model monitoring, and rapid prototyping (Streamlit, Dash, etc.)
3–4+ years in software or ML engineering roles
The Interview Process
Intro chat with Lead Data Scientist
Take-home technical challenge – Focused on structuring AI software
Final …
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
/CD pipelines and modern software practices
Background in document extraction and reasoning (LLMs, RAG, agents)
Exposure to multimodal data (text, images, structured)
MLOps, model monitoring, and rapid prototyping (Streamlit, Dash, etc.)
3-4+ years in software or ML engineering roles
The Interview Process
Intro chat with Lead Data Scientist
Take-home technical challenge - Focused on structuring AI software
Final …
persistent agent behaviours. Use cloud platforms like Amazon Bedrock, Google Vertex AI, and Azure AI Studio for model orchestration and deployment. Build and integrate UI prototypes using tools like Streamlit, Gradio, or React for demoable outputs. Develop robust API integrations to connect AI agents with internal and external services and data sources. Use SQL to query structured databases and integrate …
a global team and working across multiple stakeholder groups
Strong experience in a querying language such as SQL
Experience in a visualization or dashboarding tool is beneficial (e.g. Tableau, Streamlit, Looker, or Mode)
Excellent communication skills to summarize insights and recommendations to audiences of varying levels of technical sophistication
Familiarity with CRM platforms (e.g. Salesforce) and other sales enablement tools …
data platform and services using the below technologies:
Python as our main programming language
Databricks as our data lake platform
Kubernetes for data services and task orchestration
Terraform for infrastructure
Streamlit for data applications
Airflow purely for job scheduling and tracking
CircleCI for continuous deployment
Parquet and Delta file formats on S3 for data lake storage
Spark for data processing …
SciPy, Hugging Face, etc. Understanding of statistical and machine learning models. Knowledge of experimental design, statistical testing and model validation. Experience in data visualization tools such as Plotly, Seaborn, Streamlit, etc. would be an advantage. Understanding of data modelling and exposure to tools like Metaflow or Airflow is desirable. Understanding of common software practices like version control, continuous deployment, and …
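To make the "experimental design and statistical testing" point concrete, a minimal SciPy example is sketched below; the data is synthetic and purely illustrative, not drawn from the listing.

```python
# Illustrative sketch of basic experiment analysis with SciPy: a two-sample
# Welch's t-test on synthetic metric values for control vs. treatment groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=10.0, scale=2.0, size=500)    # e.g. session length, control group
treatment = rng.normal(loc=10.4, scale=2.0, size=500)  # e.g. session length, treatment group

t_stat, p_value = stats.ttest_ind(control, treatment, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected.")
```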
architectural discussions and solution designs focused on data mesh, multi-domain data governance, and inter-organizational collaboration. Provide hands-on technical guidance in the implementation of Snowflake native applications, Streamlit dashboards, and Snowpark Container Services. Stay current on developments in cloud platforms and data-sharing protocols, identifying integration opportunities with Snowflake's platform. Build strong relationships with key customers, acting … Field CTO. Snowflake Expertise: In-depth knowledge of Snowflake's collaboration features, including Data Sharing, Clean Rooms, Listings, and Organizations. Development: Experience building with Snowflake's application framework, including Streamlit, Snowpark, and Snowpark Container Services. Data Architecture: Deep understanding of modern data architecture patterns, such as data mesh, decentralized governance, and multi-tenant SaaS models. Cloud Platforms: Strong familiarity with …
mission-driven team focused on learning and climate impact
A plus but not essential:
⚡️ Power trading or electricity systems
📈 Time-series data
🎙 Comfortable chatting with clients
Tech: Python/Streamlit/Docker/GCP/Figma
regions. The role leverages a modern tech stack including SQL, Python, Airflow, Kubernetes, and various other cutting-edge technologies. You'll work with tools like dbt on Databricks, PySpark, Streamlit, and Django, ensuring robust data infrastructure that powers business-critical operations. What makes this role particularly exciting is the combination of technical depth and business impact. You'll be working …