with multiple stakeholders, both within and adjacent to the finance function. Deliver multiple projects, ensuring that stakeholder needs are met. Manage a suite of reporting and core data tables, ETL processes, Snowflake and Looker for data analysis and presentation. Requirements Experience in stakeholder management is required. Proficiency in handling data through ETL processes and Looker using SQL and Python is …
table. Requirements Python Proficiency: Strong Python skills with experience in building and monitoring production services or APIs. Experience with third-party APIs is essential. Data Pipelining: Experience with SQL, ETL, data modeling. Experienced with the lifecycle of building ML solutions. Speak AI language: Understand the fundamentals of ML/AI and communicate effectively with AI and Data Scientists. Infra: Deep …
a similar role Technical expertise with data models, data mining, and segmentation techniques Knowledge of programming languages (minimum Python & SQL) Hands-on experience with SQL database design Understanding of ETL, serverless and cloud computing (Google Cloud Platform preferred) Additional Information Publicis Media has fantastic benefits on offer to all of our employees. In addition to the classics, Pension, Life Assurance …
new tools and techniques to level-up how we work. Key Responsibilities Data Collection and Transformation Ingest and clean CSV datasets (e.g. via Excel Power Query, VBA or comparable ETL tools) Standardise irregular inputs into a repeatable pipeline (“raw → final”) Automate routine data-prep steps where possible Write prompts for AI tools to organise and standardise data Support with developing …
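The “raw → final” cleaning step described above can be sketched in Python. This is a minimal illustration only: the column names and sample data are invented, and a real pipeline would add validation and logging around this core.

```python
import csv
import io

def clean_rows(raw_csv: str) -> list[dict]:
    """Standardise an irregular CSV string into clean row dicts.

    Headers are normalised to snake_case, stray whitespace is stripped,
    and fully blank rows are dropped. Field names here are illustrative.
    """
    reader = csv.DictReader(io.StringIO(raw_csv))
    cleaned = []
    for row in reader:
        # Normalise headers and trim values.
        norm = {
            (k or "").strip().lower().replace(" ", "_"): (v or "").strip()
            for k, v in row.items()
        }
        # Drop rows that carry no usable data at all.
        if any(norm.values()):
            cleaned.append(norm)
    return cleaned

# Irregular input: ragged spacing, a blank row.
raw = "Site Name , Reading\n Alpha , 12 \n,\n Beta ,7\n"
rows = clean_rows(raw)
```

The same normalised-dict shape can then feed any downstream step (Power Query, VBA export, or a database load) without per-file special-casing.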
engineering team sits at the centre of everything we do at Plentific and is constantly tackling challenging problems, such as online payments, quoting, invoicing, booking, search/scoring algorithms, ETL, data pipelines, in-app messaging, real-time notifications and fraud prevention. Our backend engineers mostly work with Python and Django on an increasingly more service-oriented architecture. The rest of …
or equivalent) Strong SQL capability. Experience working on/leading a Workday data migration workstream for at least one end-to-end Workday implementation. Provide guidance on Workday Data Migration and ETL best practices. Work closely with clients to understand their needs and requirements. Support clients during data mapping and data validation. Good knowledge of Workday Reporting and experience with Data …
effectively with both technical and non-technical audiences. Ideally you would have the following Solid understanding of modern data platform architectures, components and concepts (e.g., data warehousing, data lakehouse, ETL/ELT processes, data orchestration, streaming data, data governance). Experience shaping data platform strategy and vision, aligning technical capabilities with evolving business needs and helping drive technical innovation. A …
City of London, London, United Kingdom Hybrid / WFH Options
Reward
🚀 We’re Hiring: Data Engineering Manager | Leadership | Cloud | Agile | London/Hybrid Are you a people-first technical leader who’s passionate about building high-performing teams and scalable data platforms? At Reward, data is at the core of everything More ❯
Our Analytics Engineering team Our goal in Growth and Marketing is to build Monzo into a global brand people love. We find exciting new ways to attract and engage customers, creating features and initiatives that generate buzz. We also focus …
executable decision logic within the NBA ecosystem. Advise on which KPIs to prioritise to improve the business outcomes of the NBA ecosystem. Mastery in data analytics and visualisation to extract the relevant signals and patterns for better decision-making and prioritisation for faster time-to-market offers. Must be able to work alongside Commercial teams (e.g. base management, pricing) and … inbound/outbound flows and manages API calls across Pega, Vlocity, and external systems Machine Learning Models via Vertex AI (GCP) AWS Glue + Lambda + Step Functions: Batch ETL workflows Kafka (Confluent Cloud): Real-time feature streams ingestion Kinesis Streams + Firehose: Traceability and telemetry routing Cloud Storage (S3, GCP buckets): Configuration, logs, traceability data Envoy Proxy + ALB …
Permanent Market Data Developer - Commodities - Python/Market Data/DMS/ETL/Data Warehousing/tick data As a Senior Market Data Developer, you will be responsible for developing market data capabilities, ensuring reliable and accurate consumption and distribution of data feeds. You will advance the development of our DMS, which aims to ensure market data reliability for the Marketing … building and delivering services for data parsing and distribution. Deliver and enhance data parsers and other data processing mechanisms aligned to a standardised data consumption/distribution model. Steward ETL coding standards: ensure that code is standardised, self-documenting and can be reliably tested. Lead a team of market data professionals on development projects. Act as SME for market data … integrity, among others. Understanding and experience of tooling and technology that support all aspects of the data solutions development life cycle in an agile environment. Detailed working knowledge of ETL/ELT and data warehousing methodologies and best practice, including dealing with EOD and tick data. Knowledge of different schema structures and design. Deep understanding of deployment and automation workflows. Knowledge in …
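A data parser of the kind this role describes might look like the sketch below. The `symbol,epoch_ms,price,size` wire format is an assumption made for illustration, not a real vendor feed; a production parser would also handle malformed lines and vendor-specific field sets.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Tick:
    """One trade tick, parsed into typed fields (illustrative schema)."""
    symbol: str
    ts: datetime
    price: float
    size: int

def parse_tick(line: str) -> Tick:
    """Parse one delimited tick line (hypothetical format) into a record."""
    symbol, epoch_ms, price, size = line.strip().split(",")
    return Tick(
        symbol=symbol,
        # Timestamps are normalised to UTC at the parsing boundary.
        ts=datetime.fromtimestamp(int(epoch_ms) / 1000, tz=timezone.utc),
        price=float(price),
        size=int(size),
    )

tick = parse_tick("BRN,1700000000000,81.25,5")
```

Keeping the typed record as the single internal representation is one way to meet the "standardised data consumption/distribution model" requirement: every downstream consumer sees the same shape regardless of the upstream feed.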
visualizations Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool Develop and document data processes, including data cleansing and matching, and ETL design You must have at least 1 to 2 years' experience of working with Actimize software. Candidates must have a Bachelor's degree in a related subject Candidates must have … proven and significant experience as a Data Analyst Strong PL/SQL skills Database development (SQL Server, Oracle) Data migration knowledge & experience Development and testing of Extract/Transform/Load (ETL) processes with large data sets, using enterprise-level tools including MS SSIS, Informatica etc. A procedural language such as C# or Python Master data management routines using Oracle SQL …
around 10 days per month for 3 months. Overview of role: Provide strategic and technical leadership in the design and development of Management Information Systems, with deep expertise in Extract-Transform-Load (ETL) operations to gather data from multiple sources at varying stages of digitisation. This includes building integrated, scalable databases and dashboards, as well as predictive indices and machine …
energy provider is seeking a Senior Data Analyst to support a major migration project from a legacy CRM platform. As Senior Data Analyst, you’ll take ownership of building ETL pipelines that extract data from Junifer and populate data models in BigQuery, helping to deliver clean, accurate data for business-critical analysis and insight. The Senior Data Analyst will have …
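The transform stage of a Junifer → BigQuery pipeline like the one described might be sketched as follows. The field names (`account_id`, `meter_serial`, `reading_kwh`) are invented for illustration; a real Junifer extract defines its own schema, and the actual load would use the BigQuery client rather than run standalone.

```python
def to_bigquery_rows(extracted: list[dict]) -> list[dict]:
    """Map raw extract dicts onto a target model, coercing types.

    Hypothetical fields: account IDs become strings, empty meter serials
    become NULLs, and readings are coerced to floats so the BigQuery
    schema stays consistent across source files.
    """
    rows = []
    for rec in extracted:
        rows.append({
            "account_id": str(rec["account_id"]),
            "meter_serial": rec.get("meter_serial") or None,
            "reading_kwh": float(rec["reading_kwh"]),
        })
    return rows
```

From there, the cleaned rows could be loaded with the `google-cloud-bigquery` client (e.g. `Client.load_table_from_json`), with the transform kept as a pure function so it can be unit-tested without any cloud credentials.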
Pandas, NumPy, MLFlow). Familiarity with cloud-based AI tools and infrastructure, especially within the AWS ecosystem. Strong understanding of data structures, algorithms, and statistical analysis. Experience working with ETL pipelines and structured/unstructured data. Must be available to attend quarterly company meetings in person.
of embeddings, vector databases (Pinecone, Weaviate, FAISS), and RAG pipelines NLP Fundamentals : Text preprocessing, language modelling, and semantic similarity Cloud Experience : AWS ecosystem knowledge (SageMaker, Lambda, etc.) Production Ready : ETL pipelines, version control, Agile methodologies It would be a major advantage if you have: Experience with 3D/GIS domains Familiarity with Unity, Unreal, or 3D modelling tools …
dashboards that directly informed product or GTM decisions. Our Data Stack We use a modern ELT setup: BigQuery (warehouse) dbt (transformation, deployed via GitHub Actions) Fivetran (ingestion) Census (reverse ETL) Metabase (BI/dashboarding) Plus: Posthog, Stripe, Hubspot, Intercom, Meta Ads, and our internal production database Interview Process: Submit your CV (no cover letter needed) Intro call with someone from …
programming experience in Python Proven experience working on the MS Azure cloud platform and its native tech stack in designing and building data & AI solutions Experience with data modeling, ETL processes, and data warehousing Knowledge of big data tools and frameworks such as Spark, Hadoop, or Kafka …
database and any SQL Jobs Defining SQL Standards and deployment processes Management of SQL Backups Management of Azure SQL Server Experience working in database design and management Experience with: ETL and Monitoring Tools Cloud-Based DB Solutions and Services (e.g. Azure) Relational and Non-Relational DBs Database Design and Management Considering Confidentiality and Privacy/Security of Sensitive Information Any …