Slough, South East England, United Kingdom Hybrid / WFH Options
Omnis Partners
and AI consultancy. Our client is expanding their Databricks practice and needs multiple specialists to join this high-impact initiative. 🎯 DATABRICKS TEAM ROLES: Senior Data Engineering Consultant - Databricks/Snowflake focus Senior ML Engineer - Databricks specialisation (MLOps) Senior Platform Engineer - Databricks deployment expert Senior Databricks ML Engineer - Premium specialist role 🏢 Join a Dedicated Databricks Centre of Excellence: You'll be More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Meraki Talent Limited
Technology) Experience with Alteryx or Python would be desirable Exposure to financial services or similar data-driven sectors is desirable Understanding of Financial and Management Accounts Prior experience with Snowflake and/or Workday Adaptive is highly desirable Our client is an equal opportunities employer and welcomes applications from all backgrounds. Please let us know should you require any adjustments. More ❯
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
testing, data governance, and observability. Lead roadmap planning and explore emerging technologies (e.g. GenAI). Ensure operational stability and support incident resolution. Tech Stack: Python, SQL, Airflow, AWS, Fivetran, Snowflake, Looker, Docker (You don't need to tick every box - if you've worked with comparable tools, that's great too.) What We're Looking For Proven track record managing More ❯
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
real-time data capabilities to better understand and predict user behaviour Supporting a data function embedded within the tech team - directly influencing architecture and systems decisions Tech Environment: AWS, Snowflake, Python, Airflow, Docker, Fivetran, Looker Real-time streaming technologies are a bonus What We're Looking For: Proven experience managing or mentoring Engineers (either as a Lead or in a More ❯
depth knowledge of S/4 HANA vs ECC differences in FI/CO modules. Experience with analytics tools and data platforms such as PowerBI, Qlik, Azure Data Lake, Snowflake, SAP B4H, and SAP Analytics Cloud. Bachelor's or Master's degree in Finance or Accounting. What you'll get in return The role offers a competitive day rate, in More ❯
Employment Type: Temporary
Salary: £475 - £575 per day, Pro-rata, Inc benefits
depth knowledge of S/4 HANA vs ECC differences in FI/CO modules. Experience with analytics tools and data platforms such as PowerBI, Qlik, Azure Data Lake, Snowflake, SAP B4H, and SAP Analytics Cloud. Bachelor's or Master's degree in Finance or Accounting. What you'll get in return The role offers a competitive day rate, in More ❯
Employment Type: Contract
Rate: £475 - £575/day Up to £575 Daily Rate In or Outside Scope
client engagements. Demonstrate expertise in LLMs (e.g., GPT-4, LLaMA) with practical implementations across financial domains. Build scalable AI architectures using cloud platforms (AWS, Azure, GCP) and integrate with Snowflake and Databricks. Ensure AI solutions adhere to governance standards and financial regulations, with a focus on ethical and compliant deployment. Your Profile: Skilled in identifying and articulating AI/ML More ❯
journey using modern cloud platforms. We specialise in the latest frameworks, reference architectures and technologies across AWS, Azure and GCP, along with various data platforms such as Databricks, Snowflake, Quantexa, Palantir and SAS. Your Role We are looking for strong Azure Solution Architects who are passionate about and focused on data solutions and Microsoft technology and who ideally have skills in More ❯
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS) Large-scale data environment Up to £70,000 plus benefits FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale data processing environment … have a robust Infrastructure background and a good understanding of the complexities that come with moving from one system to another. Let's talk tech. The platform integrates Python and Snowflake, and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!) You'll also have exposure to streaming platforms like Apache Kafka and be able to … develop and maintain ELT, and essentially bring a solid understanding of data warehousing concepts and best practice. Essentially, a strong Data Engineer and Snowflake enthusiast who can write solid SQL queries within Snowflake! You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow - from a Cloud perspective, good AWS exposure. Naturally you More ❯
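For candidates unfamiliar with the stack named in the advert above, here is a minimal, purely illustrative sketch of the kind of stream-to-warehouse step such a role involves. It uses an in-memory list as a stand-in for a Kafka topic and `sqlite3` as a local stand-in for Snowflake; all table and column names are hypothetical, not taken from the advert.

```python
import sqlite3
from collections import defaultdict

# Stand-in for a Kafka topic: in a real pipeline these events would be
# consumed from a broker via a Kafka client library.
events = [
    {"user_id": 1, "action": "view"},
    {"user_id": 1, "action": "click"},
    {"user_id": 2, "action": "view"},
]

# Aggregate events per user - a typical micro-batch transform step.
counts = defaultdict(int)
for e in events:
    counts[e["user_id"]] += 1

# Load the aggregates into a warehouse table; sqlite3 is a local
# stand-in for Snowflake, but the upsert pattern is the same idea.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE user_activity (user_id INTEGER PRIMARY KEY, events INTEGER)"
)
conn.executemany(
    "INSERT INTO user_activity (user_id, events) VALUES (?, ?) "
    "ON CONFLICT(user_id) DO UPDATE SET events = events + excluded.events",
    counts.items(),
)
conn.commit()
print(conn.execute(
    "SELECT user_id, events FROM user_activity ORDER BY user_id"
).fetchall())
```

The upsert (`ON CONFLICT ... DO UPDATE`) is what makes repeated micro-batches idempotent-ish at the row level, which is why this shape recurs in ELT loading code regardless of the warehouse behind it.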
Senior Data Engineer who thrives in technically complex environments and enjoys solving large-scale data pipeline challenges. You'll work with tools like AWS Glue, PySpark, Iceberg, Databricks, and Snowflake, collaborating with data scientists and stakeholders across multiple business units. Key Responsibilities: Design, build, and maintain scalable data pipelines and architectures. Implement secure and efficient data lakes and warehouses to … of data governance, quality, and security best practices. Experience working with market data and its applications. Excellent communication and stakeholder management skills. Nice to Have: Experience with Databricks and Snowflake. Exposure to machine learning and data science. Why Apply? Be part of a greenfield build with strategic visibility and long-term impact. Work with cutting-edge technologies in a More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Senior Data Engineer who thrives in technically complex environments and enjoys solving large-scale data pipeline challenges. You'll work with tools like AWS Glue, PySpark, Iceberg, Databricks, and Snowflake, collaborating with data scientists and stakeholders across multiple business units. Key Responsibilities: Design, build, and maintain scalable data pipelines and architectures. Implement secure and efficient data lakes and warehouses to … of data governance, quality, and security best practices. Experience working with market data and its applications. Excellent communication and stakeholder management skills. Nice to Have: Experience with Databricks and Snowflake. Exposure to machine learning and data science. Why Apply? Be part of a greenfield build with strategic visibility and long-term impact. Work with cutting-edge technologies in a More ❯
Analyst responsibilities will include: Support Influencer Marketing team: data cleaning, reporting, dashboards, automating manual processes. Support Performance Marketing team: performance reports and insights. Build dashboards in ThoughtSpot. Write SQL (Snowflake, some DBT). Present insights back to marketing stakeholders. YOUR SKILLS AND EXPERIENCE The successful Ecommerce Data Analyst will have the following skills and experience: SQL (essential). Experience in More ❯
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
Spectrum IT Recruitment
solving ability and a track record of meeting deadlines Excellent communication skills for cross-team collaboration Desirable skills (not essential, but a big plus): SSAS, SSRS, SSIS Kafka, MSK, Snowflake, Aurora DB, SNS AWS or Azure database management If you're ready to join a company that challenges limits, delivers excellence, and offers a truly rewarding career path, we want More ❯
deep dives Excellent communication - comfortable presenting to large senior audiences Solid grasp of digital marketing KPIs and strategy Nice-to-Haves: Exposure to personalisation/testing workflows Experience with Snowflake or Python for light wrangling Why Apply? Real strategic ownership. Work directly with high-profile clients. Friendly, expert team (10 heads). State pension + profit share. Hybrid setup - Tuesdays & Thursdays More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Addition
business strategy. Main Skills Needed: 4+ years' experience leading software projects and API development. Strong Python development skills (Java knowledge a plus). Experience with Azure Functions, Snowflake, Databricks, and cloud technologies. Familiarity with MLOps frameworks and API use in analytics/actuarial environments. Excellent communication and collaboration skills across technical and business teams. Strong problem-solving mindset More ❯
BI solutions (e.g., Power BI, Looker, Tableau, Datorama). Develop and maintain stakeholder dashboards that provide actionable insights. Write SQL queries to access and prepare data in BQ/Snowflake (with focus on GA4 data). (Nice to have) Experience with ContentSquare. Quality Assurance & Testing Carry out thorough end-to-end testing of dashboards to guarantee accuracy. Spot potential risks More ❯
London (City of London), South East England, United Kingdom
Bodhi
BI solutions (e.g., Power BI, Looker, Tableau, Datorama). Develop and maintain stakeholder dashboards that provide actionable insights. Write SQL queries to access and prepare data in BQ/Snowflake (with focus on GA4 data). (Nice to have) Experience with ContentSquare. Quality Assurance & Testing Carry out thorough end-to-end testing of dashboards to guarantee accuracy. Spot potential risks More ❯
results in a concise manner, both verbally and in writing Desirable Postgraduate qualification in a relevant field (e.g. Computer Science, Data Science, Operational Research) Experience with modern data platforms (e.g. Databricks, Snowflake, MS Fabric). Familiarity with MLOps practices and version control tools (e.g. Git). Experience with deployment and maintenance of ML models in production environments. Experience mentoring junior analysts, sharing More ❯
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
results in a concise manner, both verbally and in writing Desirable Postgraduate qualification in a relevant field (e.g. Computer Science, Data Science, Operational Research) Experience with modern data platforms (e.g. Databricks, Snowflake, MS Fabric). Familiarity with MLOps practices and version control tools (e.g. Git). Experience with deployment and maintenance of ML models in production environments. Experience mentoring junior analysts, sharing More ❯
foundations required for accurate reporting on Channel Performance. Key Tools and Technologies: Solid expertise in tools such as Google Tag Manager, Google Analytics (GA4), Segment/Server-Side Technologies, Snowflake and BigQuery. Tag Management: Familiarity with JavaScript and understanding of HTML/CSS for tag troubleshooting. Problem Solving: Strong problem-solving skills and attention to detail when validating data or More ❯
CPG, and Public Services. Consolidated revenues as of 12 months ending December 2024 totaled $13.8 billion. Responsibilities: Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse. Implement data transformations and build analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading … for data engineering processes, including version control with Git. Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data More ❯
London (City of London), South East England, United Kingdom
HCLTech
CPG, and Public Services. Consolidated revenues as of 12 months ending December 2024 totaled $13.8 billion. Responsibilities: Design, develop, and maintain robust and scalable ETL/ELT pipelines using Snowflake as the data warehouse. Implement data transformations and build analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading … for data engineering processes, including version control with Git. Monitor data pipelines, troubleshoot issues, and optimize performance for efficiency and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data More ❯
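The dbt responsibilities described in the listing above boil down to writing versioned SELECT statements that dbt materialises as tables or views, with Airflow scheduling the runs. Here is a minimal sketch of such a model, run against `sqlite3` as a local stand-in for Snowflake; the table and column names are illustrative, not from the advert.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Raw landed data - in the advert's stack this would sit in a Snowflake
# staging schema, loaded by the ELT pipeline.
conn.execute(
    "CREATE TABLE raw_orders "
    "(order_id INTEGER, customer TEXT, amount REAL, status TEXT)"
)
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?, ?)",
    [
        (1, "acme", 120.0, "complete"),
        (2, "acme", 80.0, "cancelled"),
        (3, "globex", 45.5, "complete"),
    ],
)

# A dbt model is essentially this SELECT kept under version control;
# dbt wraps it in the CREATE statement and handles materialisation.
model_sql = """
CREATE TABLE fct_customer_revenue AS
SELECT customer, SUM(amount) AS revenue, COUNT(*) AS orders
FROM raw_orders
WHERE status = 'complete'
GROUP BY customer
"""
conn.executescript(model_sql)
print(conn.execute(
    "SELECT customer, revenue, orders FROM fct_customer_revenue ORDER BY customer"
).fetchall())
```

Filtering out cancelled orders in the model (rather than in every dashboard query) is the core idea the listing's "analytical data models using dbt" phrase refers to: the business rule lives in one tested, documented place.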