Data Analyst – City of London – Leading Brokerage – (£80k-£100k) + Bonus. Our client is a leading brokerage looking to enhance their Data team with the addition of a talented Data Analyst in London, joining their Operations department in a newly created position. You will be pivotal in analysing both structured and … a strong understanding of various financial products and the trading life cycle. The role: Advanced proficiency in SQL (T-SQL, PL/SQL, Databricks SQL). Knowledge of Kimball data modelling methodology. Experience using scripting languages such as Python, PowerShell etc. Experience with Microsoft Azure. Strong knowledge of ETL/ELT tools and experience navigating data pipelines.
Sr Cloud Data Architect | UK Location: United Kingdom (Work from Office – Onsite Presence is required) Language Requirement: Fluent in English (Spoken & Written) Key Responsibilities Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives. Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and … Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL … cloud-native architectures. Knowledge of regulatory data requirements across financial, healthcare, and telecom sectors. Familiarity with IaC (Terraform), GitOps, and CI/CD for data pipeline deployment. Strong communication, stakeholder management, and mentoring skills.
and GitHub. Your work will be pivotal in empowering data-driven decisions across our organization, significantly impacting product strategy and customer engagement. What to expect? Analytics Pipelines & Data Modeling: Convert existing product analyst queries and reports into efficient and maintainable analytics pipelines, primarily using dbt, and optimize data for self-service analytics in Tableau. … performance: Well-versed in BI platforms such as Tableau, with a focus on optimizing underlying data structures for dashboard performance. Ingestion and orchestration tools: Skilled in using pipeline orchestration and data ingestion tools such as Airflow and Stitch, along with Python scripting for integrating diverse data sources. Large-scale data processing … and unstructured data: Able to apply LLM APIs and vector databases to analyze unstructured inputs, including customer call transcripts. Testing and containerization: Comfortable using Docker for local pipeline testing and writing comprehensive dbt tests for analytics workflows. What's in it for you? Join an ambitious tech company reshaping the way people build digital experiences. Full-time …
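As an illustration of the orchestration pattern this listing describes, here is a minimal, hedged sketch of an Airflow DAG that builds and tests a dbt project from the command line. The DAG id, schedule, and project path are invented for the example; it assumes Airflow 2.x with the dbt CLI installed on the worker, and is not a description of this employer's actual setup.

```python
# Minimal sketch: Airflow DAG that runs and tests a dbt project via the dbt CLI.
# The dag_id, schedule, and /opt/analytics/dbt_project path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="analytics_dbt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Build the dbt models that back the Tableau self-service layer.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/analytics/dbt_project && dbt run",
    )

    # Run dbt tests so a broken model fails the pipeline rather than the dashboard.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/analytics/dbt_project && dbt test",
    )

    dbt_run >> dbt_test
```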
client of ours looking to hire for one of their teams that is developing an AI-first product to support commercial real estate investment decisions. Our application pulls data from a variety of sources, applies market-leading machine learning, and presents insights through innovative visualizations. After proving our product's value as an internal tool, we were acquired … to detail. Good communication skills and a team player. Proficient in Python: demonstrated experience working in teams of Python developers on large projects. Solid understanding of algorithms and data structures. Experience with building features using test-driven development. Solid experience using Git for version control. Solid grasp of data concepts (relational databases, data cleansing … applications. Experience with data engineering frameworks. Portfolio of past experience (e.g., demos of past work, contributions to open source, blogs, talks). Technical Stack: Data Pipeline Stack: Python 3, pandas, GeoPandas, boto3, Pydantic, Data Version Control (DVC) 2. Core API Stack: Python 3, Django 4. Infrastructure: AWS, EKS, Docker, ADO (for CI/…
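As a hedged illustration of how the pipeline stack above (pandas plus Pydantic) is often combined, the sketch below validates raw records against a schema before analysis. The Listing model and its fields are invented for the example and are not this client's actual schema.

```python
# Sketch: validate raw records with Pydantic before loading them into pandas.
# The Listing model and field names are hypothetical examples.
from typing import Optional

import pandas as pd
from pydantic import BaseModel, ValidationError


class Listing(BaseModel):
    listing_id: str
    city: str
    price_gbp: float
    floor_area_sqm: Optional[float] = None


def to_validated_frame(raw_records: list) -> pd.DataFrame:
    """Keep only records that pass schema validation; report the rest."""
    valid, rejected = [], []
    for record in raw_records:
        try:
            valid.append(Listing(**record).model_dump())  # Pydantic v2; use .dict() on v1
        except ValidationError as exc:
            rejected.append((record, str(exc)))
    if rejected:
        print(f"Rejected {len(rejected)} malformed record(s)")
    return pd.DataFrame(valid)


if __name__ == "__main__":
    rows = [
        {"listing_id": "a1", "city": "London", "price_gbp": 1_250_000},
        {"listing_id": "a2", "city": "Leeds", "price_gbp": "not a number"},  # fails validation
    ]
    print(to_validated_frame(rows))
```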
model building to leverage this tool for Capacity Planning, building Reporting Packages, and Finance Operations Volume Forecast Models. Owning the design, operations, and improvements for the Organization's Data Warehouse infrastructure. Maintain, improve, and manage all ETL pipelines and clusters. Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency. Define metrics and KPIs … solution delivery team, partners, BD, and other cross-functional stakeholders. Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Collaborate with data scientists, BIEs, and BAs to deliver high-quality data architecture and pipelines. A Day in the Life Scripting language such as Python preferred. Establish scalable, efficient, automated … squared). 2+ years of experience in Anaplan model building. PREFERRED QUALIFICATIONS Master's degree or Advanced technical degree. Knowledge of data modeling and data pipeline design. Experience with statistical analysis, correlation analysis. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace …
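Since the qualifications above call out Python scripting, correlation analysis and R-squared, here is a toy, self-contained sketch of how such a check might be scripted; the column names and figures are invented and carry no real data.

```python
# Toy example of correlation / R-squared analysis in pandas and numpy.
# Column names and data are invented for illustration only.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "forecast_volume": [100, 120, 140, 160, 180, 200],
    "actual_volume":   [ 98, 125, 138, 155, 185, 205],
})

# Pearson correlation between forecast and actuals.
r = df["forecast_volume"].corr(df["actual_volume"])

# R-squared of a simple linear fit of actuals on forecasts.
slope, intercept = np.polyfit(df["forecast_volume"], df["actual_volume"], deg=1)
predicted = slope * df["forecast_volume"] + intercept
ss_res = ((df["actual_volume"] - predicted) ** 2).sum()
ss_tot = ((df["actual_volume"] - df["actual_volume"].mean()) ** 2).sum()
r_squared = 1 - ss_res / ss_tot

print(f"Pearson r = {r:.3f}, R^2 = {r_squared:.3f}")
```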
Our client is looking for an experienced Senior Data Engineering Manager with a strong background in data engineering to lead an established team. Job Title: Senior Engineering Manager – Data Engineering Location: London Job Type: Contract or Full-time - 6 months About the Role Key Responsibilities: Lead and mentor a team of data engineers, supporting their growth and career development. Architect and manage the delivery of scalable, secure, and reliable data pipelines and platforms. Partner with data scientists, analysts, product managers, and stakeholders to deliver impactful data solutions. Define and implement long-term technical strategy for data infrastructure, including data lakes … warehouses, and streaming systems. Uphold engineering excellence through code reviews, technical guidance, and process improvements. Manage planning, execution, and delivery of multiple concurrent projects. Promote best practices in data quality, governance, and privacy across the organisation. Stay up to date with industry trends and emerging data technologies, and apply them where relevant. Qualifications: 10+ years of …
only will you directly contribute to our client deliverables, but you will have the opportunity to experiment with a range of cutting-edge techniques and deliver full-stack data science projects, from solution design through to deployment. We're looking for someone with a co-operative, can-do attitude who can build high-quality data engineering … If this sounds like you, we can't wait to hear from you! KEY RESPONSIBILITIES: Lead the design, development, and implementation of complex AI models and algorithms Define data collection, cleaning, and pre-processing strategies Architect and maintain robust data pipelines and data management systems Lead the training of machine learning models, ensuring high … performance and scalability Build processes for extracting, cleaning and transforming data (SQL/Python) Optimize model parameters and architectures for maximum efficiency and accuracy Ensure seamless integration of AI solutions with existing systems and applications Stay updated with the latest advancements in AI and machine learning Present findings to clients through written documentation, calls and presentations Be an …
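As a hedged sketch of the "extract, clean and transform (SQL/Python)" responsibility above: the example uses an in-memory SQLite table purely for illustration, with invented table and column names; a real pipeline would point at the client's warehouse rather than SQLite.

```python
# Minimal extract-clean-transform sketch combining SQL and pandas.
# Table and column names are hypothetical; SQLite is used only so the
# example is self-contained and runnable.
import sqlite3

import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id TEXT, amount TEXT, event_ts TEXT);
    INSERT INTO raw_events VALUES
        ('u1', '12.50', '2024-01-05'),
        ('u1', 'N/A',   '2024-01-06'),
        ('u2', '7.00',  '2024-01-05');
""")

# Extract with SQL, then clean and transform in pandas.
df = pd.read_sql_query("SELECT user_id, amount, event_ts FROM raw_events", conn)
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")   # 'N/A' becomes NaN
df = df.dropna(subset=["amount"])                              # drop unparseable rows
df["event_ts"] = pd.to_datetime(df["event_ts"])

# Aggregate into a tidy feature table for downstream modelling.
features = df.groupby("user_id")["amount"].agg(total="sum", events="count").reset_index()
print(features)
```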
Job Description - Data Warehouse Developer About Hiscox: At Hiscox, we care about our people. We hire the best people for the work, and we're committed to diversity and creating a truly inclusive culture, which we believe drives success. We embrace hybrid-working practices, balancing the ability to work remotely with the culture and energy we experience when … since 1901), we are young in many ways: ambitious and going places. If that sounds good to you, get in touch. Follow us on LinkedIn, Glassdoor, and Instagram. Data Warehouse Developer. Location: London. Band: Band II. Type: Permanent. The team: The Technology organisation comprises a Corporate Centre team and five business-aligned Technology teams partnering with our federated … put Technology at the heart of the business. The role: This role sits within the Group Enterprise Systems (GES) Technology team. The ideal candidate is an experienced Microsoft data warehouse developer (SQL Server, SSIS, SSAS) capable of working independently and within a team to deliver enterprise-class data warehouse solutions and analytics platforms. The role involves …
Octopus Investments and Octopus Money. Check out the Seccl website for the latest on our products and our mission to shape the future of investments. The role The Data Squad engineering manager's purpose is to lead and empower a high-performing team of data engineers to deliver scalable, reliable, and secure data capabilities. … This role is central to enabling internal and external data products, supporting business insights, and developing a modern data platform. The role will work closely with Product, Engineering, and Data Operations to align delivery with business outcomes, support the team's growth, and help shape our overall data strategy. On a typical … day you will: Guide and mentor a team of data engineers, fostering their professional growth and ensuring high performance, setting clear, measurable goals with regular performance reviews, development plans, and recognition. Collaborate with product managers and engineering leaders to define priorities and align data initiatives with business goals. Provide technical guidance and architectural oversight across the data …
juggling multiple projects and priorities with finesse. A knack for thriving in a fast-paced, innovation-driven environment, where adaptability is key. Strong analytical and quantitative skills, leveraging data and metrics to inform strategic decisions. Clear and compelling communication skills, capable of articulating data insights to diverse stakeholders. If you're ready to challenge the status … and business required for launch of the new initiative. Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions. Ensure data accuracy by validating data for new and existing tools. Apply data mining and quantitative analysis to understand ways to improve our content selection for customers … test, Chi-squared) - Experience with scripting languages (e.g., Python, Java, or R) - Master's degree or advanced technical degree - Knowledge of data modeling and data pipeline design - Experience with statistical analysis and correlation analysis Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need workplace …
Pavegen Pavegen is where movement meets meaning. We've evolved from a kinetic energy pioneer into a next-generation AI-powered engagement platform that transforms footfall into energy, data, and dynamic experiences. Our mission? To revolutionise how people connect with the spaces around them - through intelligent, sustainable, and interactive technology. That's where you'll come in. The … This is not a pure management role. You'll already understand and be in a position to architect complex integrated environments that connect hardware (kinetic and solar floors), data collection modules, cloud infrastructure, and customer-facing digital experiences, while adding new technologies such as AI, Digital Twin and more, working with our in-house software development team. You … ll lead the charge on building our smart infrastructure ecosystem - from IoT devices on the ground, through to digital twins, edge computing, and cloud data pipelines that drive actionable insight for our clients. Responsibilities Technology Vision & Strategy Own and evolve the company's technology roadmap to support the next generation of Pavegen's integrated systems. Lead the technical …
City of London, London, United Kingdom Hybrid / WFH Options
pubX
seeking a seasoned Senior Full-Stack Engineer to join our fully-remote global team. You'll play a pivotal role in designing, developing, and deploying scalable web applications, data pipelines, and APIs that integrate or report on advanced machine learning models. Collaborating closely with cross-functional teams, you'll help shape the future of ad tech through pub … tech by building systems that are both robust and innovative. 🔧 Responsibilities Architect and develop end-to-end web and data applications and APIs using modern frameworks and technologies. Collaborate with data scientists and product managers to translate business requirements into technical solutions. Support and collaborate closely with the operations team to ensure system reliability, observability, and …
Contact email: Job ref: ADE/HH/43 Start date: ASAP AWS Data Architect - Market Data We're seeking a hands-on AWS Data Architect to play a lead role in a high-impact initiative building a next-generation data platform from the ground up. This is a rare greenfield opportunity to … architect and engineer cutting-edge solutions that will power critical market data workflows across the business. As a senior technical leader, you'll not only set the architectural direction but also roll up your sleeves to build and optimize scalable data pipelines using the latest cloud-native tools and frameworks. You'll be instrumental in shaping … the technical foundation of our platform, from core design principles to implementation best practices. What You'll Do: Design and implement end-to-end data architecture on AWS using tools such as Glue, Lake Formation, and Athena Develop scalable and secure ETL/ELT pipelines using Python, PySpark, and SQL Drive decisions on data modeling, lakehouse …
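To illustrate the kind of Python/PySpark ETL named above, here is a minimal, hedged sketch of a batch job that lands partitioned Parquet for Athena to query. The bucket names, paths and columns are placeholders, and a real build would likely run on Glue or EMR with Lake Formation permissions applied rather than as a bare script.

```python
# Sketch of a PySpark batch job: read raw market data from S3, apply basic
# transformations, and write partitioned Parquet that Athena/Glue can query.
# Bucket names, paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("market_data_etl").getOrCreate()

raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/market-data/")

curated = (
    raw.withColumn("price", F.col("price").cast("double"))
       .withColumn("trade_date", F.to_date("trade_ts"))
       .dropDuplicates(["trade_id"])
       .filter(F.col("price").isNotNull())
)

# Partition by trade_date so Athena can prune partitions at query time.
(curated.write
        .mode("overwrite")
        .partitionBy("trade_date")
        .parquet("s3://example-curated-bucket/market-data/"))

spark.stop()
```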
Data Engineering Manager (x2) – London - Hybrid (2 days/week, usually Tues & Fri) – 12-month contract – Competitive day rate – ASAP start. About the Role: Join a critical data remediation programme within the Surveillance Team/Lab, part of the Markets Platform area … of a major financial services business. You will lead design and delivery of stable, scalable, and performant data solutions within a complex GCP architecture, driving simplification and improving pipeline automation to enable timely, secure data delivery to surveillance systems. Key Responsibilities: * Engineer stable, scalable, performant, accessible, testable, and secure data products aligned with endorsed technologies … data integration * Solid SQL expertise for querying, transforming, and troubleshooting data * Experience building robust ETL pipelines from diverse source systems * Familiarity with CI/CD pipeline automation and enhancement * Understanding of Agile delivery frameworks, including Jira and sprint planning * Knowledge of Terraform for infrastructure-as-code provisioning * Experience with containerization technologies to deploy and manage …
Hybrid Role with 2 days per week onsite in Central London. Skillset required: * Data Pipeline Expertise: Extensive experience in designing and implementing scalable ETL/ELT data pipelines in Azure Databricks, transforming raw data into usable datasets for analysis. * Azure Databricks Proficiency: Strong knowledge of Spark (SQL, PySpark) for data transformation … and processing within Databricks, along with experience building workflows and automation using Databricks Workflows. * Azure Data Services: Hands-on experience with Azure services like Azure Data Lake, Azure Blob Storage, and Azure Synapse for data storage, processing, and publication. * Data Governance & Security: Familiarity with managing data governance and security using … Databricks Unity Catalog, ensuring data is appropriately organized, secured, and accessible to authorized users. * Optimization & Performance Tuning: Proven experience in optimizing data pipelines for performance, cost-efficiency, and scalability, including partitioning, caching, and tuning Spark jobs. * Cloud Architecture & Automation: Strong understanding of Azure cloud architecture, including best practices for infrastructure-as-code, automation, and monitoring in …
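As a hedged illustration of the Databricks pipeline and tuning skills listed above: the sketch below caches a reused intermediate, repartitions before a partitioned write, and lands results in ADLS as Delta tables. Storage paths, table layout and column names are invented, and the Delta format assumes a Databricks (or otherwise delta-enabled) runtime.

```python
# Sketch of an Azure Databricks transformation step with simple tuning:
# cache a reused intermediate, repartition before a wide write, and store
# partitioned Delta in ADLS. Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook `spark` is already provided; the builder is only
# here so the sketch is self-contained for standalone runs.
spark = SparkSession.builder.appName("orders_curation").getOrCreate()

orders = spark.read.format("delta").load(
    "abfss://raw@exampleaccount.dfs.core.windows.net/orders"
)

# Cache because this intermediate feeds two downstream aggregates.
cleaned = (
    orders.filter(F.col("status") != "CANCELLED")
          .withColumn("order_date", F.to_date("order_ts"))
          .cache()
)

daily_totals = cleaned.groupBy("order_date").agg(F.sum("amount").alias("daily_amount"))
top_customers = cleaned.groupBy("customer_id").agg(F.count(F.lit(1)).alias("order_count"))

# Repartition on the partition column to avoid many small files per partition.
(daily_totals.repartition("order_date")
             .write.format("delta")
             .mode("overwrite")
             .partitionBy("order_date")
             .save("abfss://curated@exampleaccount.dfs.core.windows.net/daily_totals"))

top_customers.write.format("delta").mode("overwrite").save(
    "abfss://curated@exampleaccount.dfs.core.windows.net/top_customers"
)
```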
Engineering sub-team owns our global technical platform that supports these different processes and drives forward long-term solutions to enhance Group capabilities. We are looking for a Data Engineer to help achieve this goal - ideally someone who is comfortable diving into different tasks to support each team using a variety of coding languages across our platform setup … who enjoys developing relationships across the company while explaining technical processes in the most appropriate way, and who keeps an eye on scalable solutions to support data growth. This is therefore an exciting opportunity to take on a role that combines complex data engineering, visual analytics and business-critical need. What you'll do Supporting different … Strong aptitude with SQL, Python and Airflow; Experience in Kubernetes, Docker, Django, Spark and related monitoring tools for DevOps a big plus (e.g. Grafana, Prometheus); Experience with dbt for pipeline modeling also beneficial; Skilled at shaping needs into a solid set of requirements and designing scalable solutions to meet them; Able to quickly understand new domain areas and visualize …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Job Title: Lead Data Engineer Location: London (Hybrid) Salary: Up to £85,000 About the Role I am excited to be working with a leading client who is seeking an experienced Lead Data Engineer to join their team. This is an excellent opportunity to drive the design, development, and optimisation of an analytic data platform that will enable informed decision-making and business growth. In this leadership role, you will be pivotal in building scalable, efficient data pipelines and products, while also mentoring and guiding a … talented team of data engineers. Role Overview As the Lead Data Engineer, you will be responsible for overseeing the full lifecycle of data pipeline development: designing, developing, and maintaining robust data systems. You will play a crucial role in shaping the evolution of the company's data platform, ensuring …
Data Engineer – Azure | Permanent | UK – Hybrid Our client, a leading organisation within the Financial Services space, is seeking a talented Data Engineer to join their high-performing team. This is an exciting opportunity for a skilled data professional to play a critical role in modernising legacy platforms, delivering robust data solutions, and … contributing to business-wide insight initiatives. As a Data Engineer, you will be instrumental in implementing new technologies, designing and maintaining scalable ETL processes, and developing and administering cloud-based data platforms. This role requires a proactive individual with a passion for data quality, cloud technologies, and continuous improvement. Key Responsibilities: Design, build, and … the engineering function. Key Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. Proven experience with Azure infrastructure, including Infrastructure as Code and CI/CD pipeline development (GitHub/Azure DevOps). Strong proficiency in Python and SQL, with a solid understanding of data warehousing and ETL best practices. Experience working with both …
Overview: 3 contract data engineers to supplement the existing team during the implementation phase of a new data platform. Main Duties and Responsibilities: Write clean and testable code using PySpark and SparkSQL to enable our customer data products and business applications. Build and manage data pipelines and notebooks, deploying code in a structured, trackable and safe manner. Effectively create, optimise and maintain automated systems and processes across a given project(s) or technical domain. Analyse, profile and plan work, aligned with project priorities. Perform reviews of code, refactoring where necessary. Deploy code in a structured, trackable and safe manner. Document your data developments and operational procedures. Ensure adherence … in others. Working within the agile framework at Chambers. Skills and Experience: Excellent understanding of Data Lakehouse architecture built on ADLS. Excellent understanding of data pipeline architectures using ADF and Databricks. Excellent coding skills in PySpark and SQL. Excellent technical governance experience such as version control and CI/CD. Strong understanding of designing, constructing …
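As a hedged example of the PySpark/SparkSQL notebook work described above (the ADLS container, paths and schema below are invented placeholders, not this platform's real layout), a typical curation step might read raw data from the lake, shape it with Spark SQL, and write a curated, partitioned table:

```python
# Sketch of a notebook-style PySpark + Spark SQL step for a lakehouse on ADLS.
# Container, path and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

# `spark` already exists in a Databricks notebook; kept here so the sketch is self-contained.
spark = SparkSession.builder.appName("claims_curation").getOrCreate()

raw_claims = spark.read.parquet(
    "abfss://raw@examplelake.dfs.core.windows.net/claims/"
)
raw_claims.createOrReplaceTempView("raw_claims")

# Shape the data with Spark SQL so the logic stays reviewable alongside the notebook.
curated = spark.sql("""
    SELECT claim_id,
           UPPER(TRIM(region))          AS region,
           CAST(claim_amount AS DOUBLE) AS claim_amount,
           TO_DATE(claim_ts)            AS claim_date
    FROM raw_claims
    WHERE claim_amount IS NOT NULL
""")

(curated.write
        .mode("overwrite")
        .partitionBy("claim_date")
        .parquet("abfss://curated@examplelake.dfs.core.windows.net/claims/"))
```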