SQL development skills including: MS SQL Server, T-SQL, indexing, stored procedures, relational/dimensional modelling, data dashboards. Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery or Azure SQL. Working closely with key stakeholders including data architects, analysts, testers and managers. Working across the full SQL development life-cycle including design, development, documentation and testing. Advantageous …
SQL, indexing, relational/dimensional modelling, data dashboards. Building/optimising data pipelines and integrations across cloud platforms. Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery or Azure SQL. Working closely with key stakeholders including data architects, analysts, testers and managers. Working across the full SQL development life-cycle including design, development, documentation and testing. Advantageous …
data modelling and analysis Power BI experience in a business-facing environment Nice to have: Python for data analysis or scripting Familiarity with cloud data environments (e.g. Azure, Snowflake, Databricks) If you're a BI developer with London Market experience looking to make a real impact, this is a rare opportunity to help shape a data-driven future from the …
Finance/Actuarial processes within insurance, and a passion for solving complex business problems with smart data solutions. What you'll do: Build and optimise data solutions in Azure (Databricks, Data Factory, Azure SQL) Work hands-on with stakeholders across Finance and Actuarial teams Lead data engineering efforts in a dynamic, agile environment Collaborate with internal and external dev teams …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Fusion People
pension contributions Job type: Permanent/full-time or 6-12 month contract (both options available) Essential Experience: Strong Python programming knowledge, ideally with PySpark Knowledge of the Azure Databricks platform and its functionalities Adaptable with a willingness to work flexibly as organizational needs evolve Ability to work well within a team and collaborate with internal and external stakeholders Logical …
as recommendation systems, segmentation, and marketing optimisation. Proficiency in Python, SQL, Bash, and Git, with hands-on experience in Jupyter notebooks, Pandas, and PyTorch. Familiarity with cloud platforms (AWS, Databricks, Snowflake) and containerisation tools (Docker, Kubernetes). Strong problem-solving skills and a passion for driving measurable business impact. Knowledge of marketing measurement techniques like media mix modelling. Exposure to …
Data Engineer to play a key role in the creation of a brand-new data platform within the Azure ecosystem including Azure Data Factory (ADF), Synapse and PySpark/Databricks and Snowflake. You will be a data ingestion and ETL pipeline guru, tackling complex problems at source in order to retrieve the data and ensure it can flow upstream to …
Translate non-technical requirements into technical solutions Identify new opportunities for data-driven decision-making across the business What We’re Looking For Strong data engineering skills (Python, SQL, Databricks) Experience delivering end-to-end data science and analytics projects Ability to communicate with non-technical stakeholders and work collaboratively Intellectually curious, self-motivated, and highly hands-on Comfortable working …
/full-time or 6-12 month contract (both options available) Summary of essential experience required for the role: Strong Python programming knowledge, ideally PySpark Knowledge of the Azure Databricks platform and associated functionalities Adaptable, with a willingness to work flexibly as the needs of the organisation evolve. Working well within a team, and able to work closely with internal …
Salisbury, Wiltshire, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
with senior stakeholders to influence architectural decisions and standards You'll bring: Proven experience as a Lead Data Architect or Senior Data Solutions Architect Expertise in Kafka/Confluent, Databricks, Unity Catalog, and modern data lake/lakehouse architectures Strong grasp of cloud data platforms (AWS, Azure, GCP, Snowflake) Understanding of Data Mesh, Data Fabric, and data product-centric approaches …
a Data Engineer or Data Analyst role with a focus on technical delivery Strong hands-on expertise in SQL, Python, and Apache Spark in a production environment Experience with Databricks and Microsoft Azure is highly advantageous Exceptional problem-solving skills and a keen attention to detail Ability to communicate complex data concepts clearly to non-technical stakeholders Self-motivated, proactive …
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom
Randstad Technologies Recruitment
in a data specialist role with a passion for solving data-related problems. Expertise in SQL, Python, and Apache Spark, with experience working in a production environment. Familiarity with Databricks and Microsoft Azure is a plus. Financial Services experience is a bonus, but not required. Strong verbal and written communication skills, with the ability to explain complex data concepts. Exceptional …
Proficiency with digital analytics tools (e.g., Google Analytics or equivalent). Experience with visualization tools (e.g., Looker Studio, Power BI or equivalent). Experience with SQL environments (e.g., BigQuery, Databricks or equivalent). Strong analytical skills with the ability to draw concise insights from multiple data sets. Excellent stakeholder engagement and collaboration skills. Ability to manage multiple projects concurrently. Inquisitive …
lead Agile teams to deliver robust outcomes. Experienced in securing work through RFI/RFPs, bids, and presentations across public and private sectors. Skilled in data science platforms (e.g. Databricks, AzureML) and cloud services (AWS, Azure, GCP), with knowledge of tools like Terraform. Experienced in deploying solutions using Docker, Kubernetes, CI/CD tools. To be Considered: Please either apply …
City of London, London, United Kingdom Hybrid / WFH Options
KDR Talent Solutions
high-performing data science teams. Ability to build and communicate complex solutions to stakeholders across different levels and disciplines. Proficiency working with modern cloud-based tools (e.g., Azure ML, Databricks, Snowflake, SageMaker, etc.). Deep knowledge of machine learning techniques including predictive modelling, pattern recognition, and optimisation. Strong stakeholder management and product ownership skills. Experience with CI/CD tools …
to tackling new challenges A strong desire to learn and share knowledge Ideally, you'll also have Certification for at least one non-SAP data platform, ideally Snowflake or Databricks Knowledge of ETL tools like SAP Data Services, Matillion, Fivetran or SNP Glue Hands-on experience with data visualisation tools such as Power BI (including data modelling and extraction) Hands …
and Excellent. What you'll need to succeed Strong proficiency in SQL, Excel, Power BI, and DAX is essential. Experience with cloud-based data platforms such as Snowflake or Databricks is preferred. Expertise in T-SQL to write complex queries and stored procedures. Understanding of database optimisation, data mining, auditing, and segmentation. Skilled in data visualisation and statistical techniques such …
talented Data Analyst to specialize in Marketing & Data Storytelling. This remote role offers an exciting opportunity to work with cutting-edge technologies such as Athena/Presto, Python (notebooks), Databricks, Google Analytics, and top BI tools like Looker and Tableau. About Constructor Constructor is the only search and product discovery platform tailor-made for enterprise ecommerce where conversions matter. Constructor …
Distributed Systems: HDFS, Hadoop, Spark, Kafka Cloud: Azure or AWS Programming: Python, Java, Scala, PySpark – you’ll need two or more, Python preferred Data Engineering Tools: Azure Data Factory, Databricks, Delta Lake, Azure Data Lake SQL & Warehousing: Strong experience with advanced SQL and database design Bonus Points: Exposure to geospatial data or data science/ML pipelines What They're …