London, South East, England, United Kingdom - Hybrid / WFH Options
Tenth Revolution Group
…with architects, analysts, and business stakeholders to unlock insights and enable innovation. What You'll Be Doing: Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks. Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform. Cleanse, transform, and model data to support BI tools (e.g. Power BI) and AI …
South West London, London, England, United Kingdom - Hybrid / WFH Options
Tenth Revolution Group
…with architects, analysts, and business stakeholders to unlock insights and enable innovation. What You'll Be Doing: Design and build robust, automated data pipelines using Azure Data Factory, Synapse, dbt, and Databricks. Integrate data from enterprise systems (e.g. Dynamics, iTrent, Unit4) into a unified data platform. Cleanse, transform, and model data to support BI tools (e.g. Power BI) and AI …
New Malden, Surrey, England, United Kingdom - Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Strong hands-on background in data engineering, with 5+ years working on modern data platforms. Experience leading cloud data migrations - GCP and BigQuery strongly preferred. Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling. Excellent understanding of data architecture, governance, and DevOps best practices. Proven leadership or team management experience within a regulated or mid-to-large tech …
…data management, governance, or investment operations. Deep familiarity with security master/reference data, market data workflows, and vendor management. Strong knowledge of the modern data stack (Snowflake, Databricks, dbt, Fivetran, etc.). Proven ability to lead data for advanced analytics, AI/ML, and regulatory initiatives. Experience in financial services, asset management, or fintech is strongly preferred. Strong stakeholder …
…take the initiative, and identify creative solutions to deliver outcomes in the face of obstacles. Knowledge of common data science tools around SQL-based data warehousing (e.g. Snowflake, Databricks, dbt), BI tools (e.g. Tableau, Looker), workflow orchestration, and MLOps. Excellent spoken and written English skills. Fluency with scripting in Python. Ability to work effectively across time zones. Teammates will …
…that's batch files or real-time streams. You'll have set up and worked with ETL and ELT tools like Dagster, AWS Glue, Azure Data Factory, Airflow or dbt, and you can decide what tools are right for the job. You'll have an understanding of how Node.js and TypeScript fit into a modern development environment and can speak …
…or Scala, SQL, and Apache Spark. You're highly proficient with Databricks and Databricks Asset Bundles, and have a strong understanding of data transformation best practices. Experience with GitHub, dbt, and handling large structured and unstructured datasets is essential. You're detail-oriented, self-motivated, and able to communicate complex technical concepts clearly. Experience with media/social platform data …
…data applications; Airflow purely for job scheduling and tracking; CircleCI for continuous deployment; Parquet and Delta file formats on S3 for data lake storage; Spark for data processing; dbt for data modelling; SparkSQL for analytics. Why else you'll love it here: Wondering what the salary for this role is? Just ask us! On a call with one of …
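To make the stack in that listing concrete, here is a minimal, purely illustrative sketch (not taken from the listing; bucket, path, and column names are hypothetical) of Spark reading a Delta table from an S3 data lake and answering an analytics question with SparkSQL:

```python
# Illustrative sketch of the stack described above (not from the listing):
# Spark reads a Delta table from S3 and runs a SparkSQL rollup. The bucket,
# path, and column names are hypothetical; the delta-spark and hadoop-aws
# packages are assumed to be available on the classpath.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("orders-analytics")
    # Standard Delta Lake session configuration.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Load a Delta table from the (hypothetical) lake and expose it to SQL.
orders = spark.read.format("delta").load("s3a://example-lake/silver/orders")
orders.createOrReplaceTempView("orders")

# "SparkSQL for analytics": a simple daily revenue rollup.
daily = spark.sql("""
    SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
""")
daily.show()
```

In a setup like this, dbt would typically own the SQL transformations (the "silver" layer above) while Airflow schedules the runs; the snippet only shows the Spark/Delta read path.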
…a team of Data Engineers. Experience with data modelling, data warehousing, and building ETL pipelines. Experience with AWS (S3, EKS, EC2, RDS) or similar cloud services, Snowflake, Fivetran, Airbyte, dbt, Docker, Argo. Experience in SQL, Python, and Terraform. Experience with building data pipelines and applications to stream and process datasets. Robust understanding of DevOps principles is required. Experience managing cloud …
…complex operational businesses. Experience using LLMs or AI tools to structure and extract meaning from unstructured data. Experience automating workflows and deploying model pipelines (e.g. MLflow, GCP Vertex, Airflow, dbt or similar). Exposure to business planning, pricing, or commercial decision-making. Familiarity with geospatial data. Experience in fast-scaling startups or operational teams. We're flexible on experience - if you …
…a leading Media & Entertainment provider. They're looking for multiple experienced Data Engineers to join on an initial 6-month contract basis, with scope for further extensions. -Python, AWS, dbt, SQL -Principal Data Engineer -6 Month Contract -Inside IR35 -£500-600/Day -Remote (occasional office visits may be required ad hoc) If you're interested in this role & meet …
…reporting, and integration projects. Deep understanding of data governance, consent management, and PII handling. Experience with: SQL, Python; Power BI (or equivalent BI tools such as Looker, Tableau, Omni); dbt, Airflow, Docker (preferred); Twilio Segment (or other CDPs such as mParticle, Salesforce Data Cloud). Exceptional stakeholder management and communication skills - able to translate complex data topics into business impact. Experience …
…problem-solving skills and a collaborative mindset. Experience in industries with large consumer marketplaces (e.g. programmatic media, travel, financial services) is a plus. Our Tech Stack: AWS Redshift, Postgres, dbt, Grafana, Power BI. Note: while our current environment is AWS, we are transitioning to Azure for new development. Why Join Us? At Kantar, you'll be part of a global leader …
…best practices for performance optimisation, scalability, and cost management, empowering teams to access and utilise data seamlessly. Streamline Data Pipelines: Lead the development and optimisation of data pipelines using dbt, enabling faster and more reliable data flows. Enhance Data Governance and Quality: Design and implement robust data governance frameworks, ensuring high data quality, compliance, and consistency. Develop Scalable Data Models …
…modern AI tools and models across the entire data and engineering stack. Tech Stack: AI Engineering: Python, LangChain, LlamaIndex, LangGraph; AI Models: OpenAI, Anthropic, Gemini, custom models; Data Tools: dbt, BigQuery; Databases: PostgreSQL, OpenSearch, Redis; Product Engineering: TypeScript (full-stack); Infrastructure: Serverless, AWS, Google Cloud, GitHub Actions. Requirements: Strong academic background in a relevant field. Proven track record of delivering …
…make customer-centric work possible - providing foundational tooling like address lookup and consent capture, and offering more advanced capabilities like our internal event stream platform and a fully managed dbt environment. Our services help teams track behavioural signals, enrich customer profiles, and transform raw data into decision-ready insight - all while maintaining a strong focus on data protection. We primarily …
Whetstone, Greater London, UK - Hybrid / WFH Options
RVU Co UK
…make customer-centric work possible - providing foundational tooling like address lookup and consent capture, and offering more advanced capabilities like our internal event stream platform and a fully managed dbt environment. Our services help teams track behavioural signals, enrich customer profiles, and transform raw data into decision-ready insight - all while maintaining a strong focus on data protection. We primarily …
…TensorFlow, Hugging Face). Proven MLOps, big data, and backend/API development experience. Deep understanding of NLP and LLMs. Proficient with cloud platforms (AWS/GCP/Azure), Airflow, dbt, Docker/Kubernetes. Strong collaboration, problem-solving, and coding best practices. Nice to have: LLM fine-tuning, streaming data, big data warehousing, open-source contributions.
…or similar. Prior experience or interest in working with geospatial data. Technologies we use: Programming languages: SQL, Python, LookML (+ Go for other backend services); Development tools and frameworks: dbt, Dagster, Airbyte, dlt, data-diff, Elementary; Data lake and warehouse: GCS, BigQuery; Analytics: Looker, Looker Studio and geospatial analytics tools. How we reward our team: Dynamic working environment with a …
…feature development process. Identifying tracking requirements to enable accurate reporting/measurement. Work with our developers to get these implemented. Helping to develop and maintain our business intelligence tool (dbt and Lightdash) by adding new metrics in line with emerging areas of business interest and product development. Be a passionate ZAVA data advocate, inspiring others to embrace and utilise data …
…and its data science ecosystem (e.g. pandas, scikit-learn, TensorFlow/PyTorch). Statistical methods and machine learning (e.g. A/B testing, model validation). Data pipelining tools like SQL, dbt, BigQuery, or Spark. A strong communicator with the ability to translate technical concepts into layman's terms for a non-technical audience. You're not afraid to challenge the status …
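As a hedged illustration of the "statistical methods (e.g. A/B testing)" requirement in the listing above - not part of the listing, and with made-up counts - a two-proportion z-test is one common way to compare conversion rates between a control and a variant:

```python
# Illustrative only (not from the listing): a two-proportion z-test, a common
# way to read out an A/B test. All counts below are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

conversions = [120, 145]  # successes: control vs. variant (hypothetical)
visitors = [2400, 2380]   # sample sizes (hypothetical)

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the variant's conversion rate
# differs from control; otherwise the data are consistent with no effect.
```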
…(f/m/d) to join our German and Western European intraday trading team. You should have strong Python skills, know how to manage a Redis cache, AWS S3, or dbt, and bring experience in the German or European power market. You'll work in a fast-paced, tech-driven environment and collaborate closely with traders and developers to turn data …