Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
xarray, SciPy/PyMC/PyTorch, or similar). Experience validating models with historical data and communicating results to non-specialists. Exposure to real-time data engineering (Kafka, Airflow, dbt). Track record of turning research code into production services (CI/CD, containers, etc.). Strong SQL and data-management skills; experience querying large analytical databases (Snowflake highly desirable, but Redshift/ …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Experience working with financial data, risk modelling, or algorithmic trading is a plus. Familiarity with cloud platforms (AWS, GCP, or Azure) and modern data stack tools (e.g., Apache Airflow, dbt, Snowflake). Excellent communication and stakeholder management skills. Must be available to work onsite in London 3 days per week. What's on Offer: Competitive salary up to …
Bracknell, England, United Kingdom Hybrid / WFH Options
Evelyn Partners
and maintaining our MS SQL Server Data Warehouses and associated data feeds into and out of the warehouses, and developing on our new modern cloud data platform, requiring Snowflake, dbt and Azure Data Factory experience. Our data platforms support regulatory requirements, business intelligence & reporting needs and numerous system integrations. This role requires strong technical proficiency and a deep understanding … You will be critical in the development and support of the new Evelyn Data Platform, which is being engineered on Snowflake, utilising Azure Data Factory pipelines for data integration, dbt for data modelling, Azure BLOB Storage for data storage, and GitHub for version control and collaboration. The role will be working in an agile and collaborative environment within a growing … understanding of data warehousing concepts and principles • Snowflake/Cloud Engineering experience is required: • Proven experience working as a Data Engineer, preferably with expertise in Snowflake, ADF, Azure Storage, dbt and GitHub. • Extensive experience in designing, building, deploying, and supporting cloud-based data products and pipelines. • Experience with Azure Storage services such as Blob Storage, Data Lake Storage, or Azure …
years' experience as a Data Analyst, especially with experience in a systematic Hedge Fund or similar Quantitative Trading environment. Strong technical skills in Python, SQL and tools such as dbt, Snowflake, AWS S3, KDB and SQL Server. Solid understanding of financial instruments such as Equities, Futures, Forwards, CDS, IRS and ETFs, with deep knowledge in at least one asset class.
Lead by example: mentor engineers, contribute to technical standards, and drive team alignment. Work closely with stakeholders to translate business needs into scalable solutions. Tech environment includes Python, SQL, dbt, Databricks, BigQuery, Delta Lake, Spark, Kafka, Parquet, Iceberg. (If you haven’t worked with every tool, that’s totally fine — my client values depth of thinking and engineering craft over …)
to ensure adherence to data privacy regulations (e.g., GDPR) and internal governance standards. Lead evaluation and integration of data tools, platforms, and technologies (e.g., Snowflake, Databricks, Azure Synapse, Kafka, dbt, Power BI). Oversee data integration strategy across the enterprise, including ETL/ELT pipelines, APIs, and event-driven data streaming. Contribute to the development of a Data Center of … logical, physical), metadata management, and master data management (MDM). Deep understanding of data integration, transformation, and ingestion techniques using modern tools (e.g., Azure Data Factory, Boomi, Informatica, Talend, dbt, Apache NiFi). Strong familiarity with data warehousing, data lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure, …)
platform teams in a production environment. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines. Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar. Familiarity with data warehousing, ETL/ELT processes, and analytics engineering. Programming proficiency in Python, Scala or Java. Experience operating in a cloud-native environment (e.g. AWS, GCP, …)
Senior Engineering Manager – Data Platform, Slough. Client: Stott and May. Location: Slough, United Kingdom. Job Category: Other. EU work permit required: Yes. …
Duration: contract to run until 31/12/2025. Rate: up to £644 p/d Umbrella, inside IR35. We are seeking an experienced Data Modeller with proven expertise in the …
We have partnered with a company that empowers underwriters to serve their insureds more effectively. They are using advanced data intelligence tools to rebuild the way that underwriters share and exchange …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
step/complex analytics is essential for this role. Experience in cloud platforms (GCP – BigQuery, Azure – Synapse, Snowflake) and exposure to data science tools/languages such as Python, dbt, D3, GitHub, GCP/AWS would be advantageous. (This is not an advanced technical data science role, so advanced experience in these will not be relevant.) Experience in the …
Oxford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
serve major UK brands and internal teams alike. What you’ll be doing: Lead the design and build of secure, scalable data pipelines using Azure Data Factory, Snowflake, and DBT. Collaborate across Product, BI, Cloud, and IT to deliver data solutions that power real-world impact. Optimise performance, enforce governance, and ensure seamless integration of APIs, external systems, and databases … from modelling and scripting to permissions and performance tuning. What we’re looking for: 8+ years of experience in data engineering, with deep technical expertise across SQL, Snowflake, Azure, DBT, and Power BI. Strong Python skills and experience integrating third-party systems via APIs. A confident communicator who can bridge technical and non-technical teams. Proven ability to deliver efficient …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
serve major UK brands and internal teams alike. What you’ll be doing: Lead the design and build of secure, scalable data pipelines using Azure Data Factory, Snowflake, and DBT. Collaborate across Product, BI, Cloud, and IT to deliver data solutions that power real-world impact. Optimise performance, enforce governance, and ensure seamless integration of APIs, external systems, and databases … from modelling and scripting to permissions and performance tuning. What we’re looking for: 8+ years of experience in data engineering, with deep technical expertise across SQL, Snowflake, Azure, DBT, and Power BI. Strong Python skills and experience integrating third-party systems via APIs. A confident communicator who can bridge technical and non-technical teams. Proven ability to deliver efficient …
in e-commerce analytics with platforms like Google Analytics, Adobe Analytics, Amplitude, or similar. Proficient in data modeling, warehousing, and advanced SQL. Skilled in ETL processes, with tools like dbt and Sigma preferred. Experience with cloud platforms like GCP and BigQuery. Python skills are a plus but not mandatory. Interest or experience in AI products. Your responsibilities: Extract and integrate … data from various sources based on stakeholder needs. Consolidate data on platforms like dbt, Sigma, Census, Gravity and BigQuery for visualization. Manage the entire ETL cycle within our tech stack. Define and monitor KPIs, maintain documentation, and ensure data governance. Collaborate with the Analytics Team to enhance digital store sales using tools like Google Analytics, GA4, Contentsquare. Ready to advance your …
resolution, customer segmentation, and real-time personalization. Hands-on experience with agile product development methodologies. Excellent communication and stakeholder management skills. Knowledge of modern data tools (e.g., Snowflake, Databricks, dbt, Kafka). Understanding of machine learning workflows and personalization engines. Product certifications (e.g., SAFe, Pragmatic, CSPO). Key Success Metrics: Consistent development roll-outs of the Horizon CDP platform; increased …
As part of the technical team, this role offers hands-on experience across a range of leading technologies and platforms. Full training will be provided in SQL, Looker, and DBT for reporting, dashboarding, and data pipeline development. You’ll also get exposure to Python, Databricks, and Azure as you grow into the role. There is also future potential to gain … machine learning projects as the team evolves. Day-to-Day Responsibilities: Build and maintain SQL queries and Looker dashboards for reporting and visualisation. Develop and maintain data pipelines using DBT. Ingest and process data from various sources across the business. Collaborate with business stakeholders to understand reporting needs. Contribute to large-scale European data integration projects. Support internal projects such … technology. A proactive, entrepreneurial mindset with a desire to learn and solve problems. Excellent communication skills and confidence working with business stakeholders. Tech Stack You’ll Learn: SQL, Looker, DBT, Python, Databricks.
company that takes data seriously, where your work has a direct impact on product quality, decision making, and customer outcomes. Skills and Experience: Strong Python and SQL; experience working with GCP and dbt. If you are looking for a new challenge, then please submit your CV for initial screening and more details. Staff Data Engineer
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
rooted in wellness. What you’ll lead: Full strategic ownership of analytics across growth, health outcomes, customer journeys and commercial metrics. Building and evolving the modern data stack (BigQuery, dbt, Fivetran, Looker, Hex – with full autonomy to optimise). Managing and growing a high-performing analytics team. Driving business-wide decision-making through experimentation, modelling, and insight. Embedding a data-first … culture across product, marketing, ops, and finance. About you: 9+ years in analytics or data engineering, with deep SQL and dbt expertise (Python/R a bonus). Experience in a fast-growing D2C brand. Strong commercial acumen – comfortable working across CAC, LTV, churn, forecasting, etc. Confident engaging with senior stakeholders and shaping company-wide strategy. A hands-on builder with …
atmosphere. What are we looking for in the ideal candidate? Proficient in data modelling, data warehousing, and advanced SQL query crafting. Skilled in ETL processes, preferably with tools like dbt and Sigma. Knowledgeable in e-commerce, digital and marketing analytics, including GA4 and BigQuery. Experienced with cloud platforms and database technologies, particularly GCP. Python proficiency is desirable, but not mandatory. … Here’s what your role will entail: Extract and integrate data from diverse sources, aligning with stakeholder needs. Consolidate data on platforms (using dbt, Sigma, Census, Gravity) for optimal stakeholder visualization. Manage the entire ETL cycle within our tech stack. Define and monitor KPIs, maintain documentation, and uphold data governance standards. Collaborate with the Analytics Team, including the Head of …
wider business strategy. Experience working as the first/only data hire in a business. Familiarity with AI/ML. Experience with data orchestration and data warehousing. Experience with DBT and Terraform. Strong academic background. Please note: unfortunately, this role does not offer visa sponsorship.
a Lead Analyst to the team. THE ROLE AND RESPONSIBILITIES Analyse and enhance key metrics across customer acquisition, retention and revenue generation. Develop and maintain advanced data models using dbt. Ensure KPIs are aligned with overall business objectives, identifying areas for improvement and efficiency. Ensure data insights are clear and actionable for driving performance improvements. Use data to provide strategic … Strong technical experience in SQL (specifically window functions), Snowflake, data visualisation and Python/R. Strong background in advanced analytics, predictive modelling and exploratory analysis. Proven expertise in DBT and reporting. THE BENEFITS Up to £85,000 + bonuses + equity. Hybrid London. HOW TO APPLY If interested in the role please send your CV to [emailprotected] or via …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
generation. Collaborate closely with product and data to develop a live reporting portal (e.g. Metabase). Manage and mentor a small team of analysts. Tech Environment: Analytics: Tableau, SQL. Engineering: dbt, Python, Postgres, Azure, Prefect. Ideal Candidate: 4–6 years' experience in analytics, with some leadership exposure. Proven skills in SQL, Python, and dbt. Experience delivering operational insights and improving reporting …