best practices. Develop robust data models, utilize advanced DAX formulas, and perform efficient data transformations. Conduct data profiling and quality assessments to ensure accuracy and completeness of outputs. Optimize SQL queries and transformation logic for efficient reporting. Establish standardized development practices for dashboards and reports. Collaborate with data engineers to define data requirements and influence pipeline design. Engage with business … analytics, or data science with impactful dashboard delivery. Strong analytical and problem-solving skills across large datasets. Technical expertise in: Power BI: DAX, data modeling, report optimization, workspace management. SQL: advanced querying, performance tuning, CTEs, joins, aggregations. Data visualization: UX design, storytelling, accessibility. Data profiling and quality techniques. Python (preferred): data exploration, automation, integration. Knowledge of data modeling, star/… deployment and release processes. Knowledge of insurance data, especially within the London Market and Lloyd’s, is highly advantageous. Relevant certifications (e.g., Microsoft Certified: Data Analyst Associate, Power BI, SQL) are a plus. More ❯
City of London, London, United Kingdom Hybrid / WFH Options
OTA Recruitment
of modern data modelling practices, analytics tooling, and interactive dashboard development in Power BI and Plotly/Dash. Key responsibilities: Designing and maintaining robust data transformation pipelines (ELT) using SQL, Apache Airflow, or similar tools. Building and optimizing data models that power dashboards and analytical tools. Developing clear, insightful, and interactive dashboards and reports using Power BI and Plotly/… data modeling techniques (e.g. dimensional, star/snowflake schemas) and analytics layer design to support business intelligence and self-serve reporting. Proficiency in analytics engineering tools such as Airflow, SQL, and version control systems like Git. Hands-on experience developing dashboards and reports using Power BI, Plotly/Dash, or other modern visualisation tools. Strong understanding of data governance, quality … AWS, Azure, or GCP, and working with data warehouse/lakehouse technologies such as Snowflake, BigQuery, Redshift, or Athena/Glue. Essential: Proficient in writing clean, efficient, and maintainable SQL and Python code, particularly for data transformation and analytics use cases. Strong understanding of data modeling concepts, including star/snowflake schemas and designing models optimized for reporting and dashboarding. More ❯
tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle. Create & maintain data pipelines using Airflow & Snowflake as primary tools. Create SQL stored procedures to perform complex transformations. Understand data requirements and design optimal pipelines to fulfil the use-cases. Creating logical & physical data models to ensure data integrity is maintained CI … Actions Tuning and optimizing data processes Qualifications Required Qualifications: · Bachelor's degree in Computer Science or a related field. · Proven hands-on experience as a Data Engineer. · Proficiency in SQL (any flavor), with experience using window functions and advanced features. · Excellent communication skills. · Strong knowledge of Python. · Familiarity with Azure Services such as Blobs, Functions, Azure Data Factory, Service Principal … using Git and Git Actions. · Knowledge of various data modeling techniques, including Star Schema, Dimensional models, and Data Vault. · Hands-on experience with: · Developing data pipelines (Snowflake), writing complex SQL queries. · Building ETL/ELT/data pipelines. · Kubernetes and Linux containers (e.g., Docker). · Related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux). More ❯
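Purely as an illustration of the SQL window-function and transformation work the data-engineering listing above describes (this is not code from any of the employers; the table, columns and data are invented), the sketch below runs a windowed aggregation through Python's built-in sqlite3 driver so it is self-contained. In the roles themselves the same SQL pattern would typically execute in Snowflake from an Airflow task or stored procedure.

```python
# Illustrative only: a window-function transformation of the kind mentioned above.
# Uses the built-in sqlite3 module (SQLite 3.25+ supports window functions) so the
# example runs anywhere; all table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2024-01-05', 120.0),
        (1, '2024-02-10',  80.0),
        (2, '2024-01-20', 200.0),
        (2, '2024-03-02',  50.0);
""")

# Running total per customer and a recency rank, both computed via window functions.
query = """
SELECT
    customer_id,
    order_date,
    amount,
    SUM(amount) OVER (PARTITION BY customer_id ORDER BY order_date) AS running_total,
    ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) AS recency_rank
FROM orders
ORDER BY customer_id, order_date;
"""

for row in conn.execute(query):
    print(row)
```

In a production pipeline this query would be parameterised and scheduled rather than run ad hoc against an in-memory database.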
only. Required Qualifications: Digital Data Architecture – Experience delivering cloud platforms like AWS, Azure, or GCP, with relevant TOGAF qualification (e.g., Enterprise Architect, Data Scientist, etc.) Programming – Strong proficiency in SQL and other programming languages, such as Python or Java, with an associated qualification (e.g., Microsoft Azure SQL, Oracle SQL, IBM SQL, etc.) Applicants who are part-qualified or with sufficient … Qualifications 1. Digital Data Architecture – Experience delivering cloud platforms like AWS, Azure, or GCP, with relevant TOGAF qualification (e.g., Enterprise Architect, Data Scientist, etc.) 2. Programming – Strong proficiency in SQL and other programming languages, such as Python or Java, with an associated qualification (e.g., Microsoft Azure SQL, Oracle SQL, IBM SQL, etc.) Applicants who are part-qualified or with sufficient More ❯
London, England, United Kingdom Hybrid / WFH Options
Cushon Money Limited
our Business Intelligence, Data Engineering and Data Science teams to work together closely. You’ll also be: Designing and documenting our core data models, and implementing them through efficient SQL-based ETL workflows that deliver clean, high-quality data into our data lake Working with business stakeholders to understand and refine management information, operational reporting and regulatory reporting needs and … working with large datasets in Python and relevant libraries including Pandas, Matplotlib, seaborn or similar would be beneficial. You’ll also need: The ability to write efficient and readable SQL code to manipulate data and perform calculations Experience in dashboarding software such as Tableau, Power BI or QuickSight Experience working with large cross functional teams in an agile environment managed … More ❯
London, England, United Kingdom Hybrid / WFH Options
Endeavour Recruitment Solutions
Technologies: Data Engineer ETL SQL Power BI Azure Data Warehouse We have an exciting hybrid working opportunity for a Data Engineer to join our client's growing Data team, playing a key role in surfacing data within their fast-growing Finance business on the South Coast. The role: Responsibility for designing, building, and implementing a robust and scalable data warehouse that … and associated systems are protected in the event of a disaster. Ensuring continuity of service by developing and maintaining backup and recovery procedures and testing these procedures regularly. Writing SQL views and stored procedures. Database modelling, development and optimisation. Skills: Strong SQL skills to be able to write complex queries and procedures for data extraction and analysis. Can work closely with More ❯
Business Intelligence Engineer Hybrid – London Up to £450 a day Inside IR35 6 Months Key Skills: Business Intelligence (Data modelling, data warehousing, Dashboarding) SQL & Python AWS (S3, Lambda, Glue, Redshift) The Senior Business Intelligence Engineer occupies a unique role at the intersection of technology, marketing, finance, statistics, data mining, and social science. We provide the key insight into customer behavior … downstream consumption - Work with customers to build Dashboards with the right KPIs, Metrics for decision making - Data Quality checks, ETL/ELT processes, automation Technical Requirements: - Strong proficiency in SQL and Python programming - Extensive experience with data modeling and data warehouse concepts - Advanced knowledge of AWS data services, including: S3, Redshift, AWS Glue, AWS Lambda - Experience with Infrastructure as Code More ❯
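As a rough sketch of the "data quality checks" the business-intelligence listing above calls for (the dataset, column names and rules are invented; in practice such checks would run inside an ETL/ELT job, for example on AWS Glue, or via a framework such as Great Expectations):

```python
# Illustrative data-quality checks of the kind referenced in the listing above.
# The DataFrame and the rules are hypothetical; a real pipeline would read from a
# warehouse table and alert or fail the job instead of printing.
import pandas as pd

df = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "customer_id": [10, 11, 11, None],
    "amount": [120.0, -5.0, 80.0, 60.0],
})

checks = {
    "order_id_is_unique": df["order_id"].is_unique,
    "customer_id_has_no_nulls": df["customer_id"].notna().all(),
    "amount_is_non_negative": (df["amount"] >= 0).all(),
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    print("Data quality checks failed:", ", ".join(failed))
else:
    print("All data quality checks passed.")
```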
scalable solutions for data analysis, integration, and eventual transformation. The chosen candidate will also use data mining to uncover patterns, anomalies and correlations in large data sets and use SQL to manipulate data and support with design of custom solutions for the Corporate Data Cloud platform and other technologies used for MDM. The MDM Solutions Architect will also be heavily … on data migration and reporting using ETL tools (Informatica, Boomi, DataStage, Matillion, etc.) and reporting tools (Power BI, Tableau) Experience with Snowflake storage and databases. Thorough experience writing complex SQL. Thorough experience translating business requirements to technical requirements and vice versa. Experience with data quality tools. Background in engineering, ideally with Java and Python. Understanding of the SDLC and agile More ❯
Analytics Possibility of remote work: Hybrid, 3 days/week from office is mandatory Contract duration: 12 to 24 months, depending on performance Location: London Required Core Skills: ETRM, SQL/Python, Power BI/Tableau Detailed Job Description: We are seeking a talented and experienced Technical Data Business Analyst within our Data & Analytics portfolio. As Product Manager/Business … or Energy domains. Experience of financial market data, data analytics, data modelling, data visualization is advantageous. Solid understanding of data analysis and experience working with data analytics tools (incl. SQL/Python), databases, and data visualization technologies (e.g., Power BI/Tableau/Other). Strong knowledge of product management methodologies and best practices, including product strategy, roadmap development, process More ❯
solving and analytical abilities, with a curious and inquisitive mind, and an openness to new ideas; Professional integrity and a respect for company values. Other requirements Proven experience with SQL, SSIS and SSAS. Proven experience with data modelling. Ability to create a strong relationship with stakeholders. Excellent communication skills with the ability to collaborate effectively with cross-functional teams. Self … multiple locations. Proven experience with understanding business requirements and translating these into technical deliverables. Motivated to expand technical skills. Experience with Microsoft BI Tools such as Power BI, SSRS & SQL Server. Experience with Azure Data Factory and Databricks (Python). Experience working with DevOps, IaC & CI/CD pipelines (e.g. Terraform and Databricks Asset Bundles). More ❯
Technology, Engineering, or a related field. Minimum 3 years of experience as a Database Designer & Senior Data Engineer, with hands-on experience in cloud based data services. Proficiency in SQL, Python, or Scala. Experience with AWS RDS, AWS Glue, AWS Kinesis, AWS S3, Redis and/or Azure SQL Database, Azure Data Lake Storage, Azure Data Factory. Knowledge of data More ❯
development with Python, using object-oriented and test-driven development techniques Knowledge and experience in software development with relational database management systems such as Oracle Database (using PL/SQL) or Microsoft SQL Server (using T-SQL) Knowledge and experience of automation and continuous integration with tools such as Azure DevOps Pipelines Experience of task automation with PowerShell 5 and More ❯
the consulting team into support. Skills and Attributes Required Experience working with Microsoft Data Analytics solutions on-premise and in the cloud. A good breadth of core technologies: T-SQL (Azure SQL Database, Synapse Serverless DB), Data Factory/Synapse Pipelines, SQL Server BI suite (SSIS/SSAS/SSRS), Power BI. Cloud Fundamentals - A solid grasp of Azure's More ❯
actionable insights to stakeholders. Your role will also involve predictive analytics, root-cause analysis, risk management collaboration, and ensuring compliance with industry standards. Proficiency in tools like Power BI, SQL, and Python, along with familiarity with PRISM or Primavera, is essential. Be part of a dynamic team driving project success through advanced analytics! About us: CMR is first and foremost … standards and regulatory guidelines. Stay updated on advancements in analytics tools, techniques, and best practices within project management. Analytics & Data Skills: Proficiency in using analytics tools (e.g., Power BI, SQL, Python) for data transformation, visualization, and predictive modelling. Experience with data integration and ETL processes, and familiarity with project management systems like PRISM or Primavera. Problem Solving & Insight Generation: Strong … analytics solutions and driving data-led decision-making on large, complex projects. Knowledge of data visualization tools (e.g., Power BI, Tableau), advanced Excel, and coding languages such as Python, SQL, or DAX. At CMR we have a strong culture driven by our 9 Core Principles. We look to build a community of people that have the same beliefs as we More ❯
3+ years in architecture roles, with deep experience designing solutions on Databricks and Apache Spark. Strong grasp of Delta Lake, Lakehouse architecture, and Unity Catalog governance. Expertise in Python, SQL, and optionally Scala; strong familiarity with dbt and modern ELT practices. Proven experience integrating Databricks with Azure services (e.g., Data Lake, Synapse, Event Hubs). Hands-on knowledge of CI … skills, able to bridge technical and business domains. Fluency in English; other European languages a plus. Technologies You’ll Work With Core: Databricks, Spark, Delta Lake, Unity Catalog, dbt, SQL, Python Cloud: Microsoft Azure (Data Lake, Synapse, Storage, Event Hubs) DevOps: Bitbucket/GitHub, Azure DevOps, Terraform Orchestration & Monitoring: Dagster, Airflow, Datadog, Grafana Visualization: Power BI Other: Confluence, Docker, Linux More ❯
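Purely as a sketch of the Lakehouse-style transformation work this architecture listing describes (paths, column names and the layout are assumptions, and running it needs a Spark session with Delta Lake available, for example a Databricks cluster, rather than a bare local PySpark install):

```python
# Hypothetical example: curate a raw CSV landing zone into a Delta table.
# Paths and columns are invented; assumes Delta Lake is configured (as on Databricks).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("curate_orders").getOrCreate()

raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/raw/orders/")  # hypothetical landing path
)

# Basic curation: de-duplicate, type the date column, drop negative amounts.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") >= 0)
)

(
    curated.write
    .format("delta")
    .mode("overwrite")
    .save("/mnt/curated/orders/")  # hypothetical Delta location
)
```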
reporting tools, offering guidance to key stakeholders. Maintain and optimise reporting outputs, identifying areas for enhancement and automation. Conduct ad-hoc analysis to meet dynamic business needs. Write complex SQL queries to extract and manipulate data across large-scale database environments. Support transformation and change initiatives by providing insights and reporting capabilities that improve data integrity and performance. Lead delivery … credit and financial information sources Qualifications and Skills Essential: Degree in a quantitative, scientific, or finance-related field. Extensive experience in BI, analytics, or MI (management information) roles. Strong SQL proficiency with experience querying large databases and writing scalable, maintainable code. Expertise in Tableau with a proven ability to build insightful, interactive dashboards and reports. Advanced Excel skills for complex More ❯
London, England, United Kingdom Hybrid / WFH Options
UBDS Group
connectors and working with REST/SOAP APIs and Azure functions Familiarity with Azure ecosystem including Azure Logic Apps, Functions, Service Bus and Entra ID for SSO Understanding of SQL databases (Azure SQL, SQL Server) Deep understanding of the Power Platform Centre of Excellence Toolkit, including setup & installation, usage and general Power Platform administration best practices. Creating and managing Power More ❯
e-commerce, then we want to hear from you! Key job responsibilities Responsibilities: • Understand the various operations across Amazon Now • Design and develop highly available dashboards and metrics using SQL, QuickSight, and Python • Understand the requirements of stakeholders and map them with the data sources/data warehouse • Own the delivery and backup of periodic metrics, dashboards to the leadership … scaling the 10-minute delivery service of Amazon. BASIC QUALIFICATIONS: - Bachelor's degree or equivalent - Experience defining requirements and using data and metrics to draw business insights - Experience with SQL or ETL - 2+ years of Excel or Tableau (data manipulation, macros, charts and pivot tables) experience - Knowledge of Microsoft Excel at an advanced level, including: pivot tables, macros, index/ More ❯
London, England, United Kingdom Hybrid / WFH Options
Fitch Ratings
automate early in conjunction with other engineers’ efforts to build systems. You may be a good fit if you have: 6+ years of ETL processes/data warehouse experience. Experience writing SQL and PL/SQL scripts for querying data. Experience with ETL processes (data warehouse) and BI tools (QlikView, Cognos, etc.). Proficient in analyzing source systems, staging areas, fact … and dimension tables in the target data warehouse. Hands-on experience with NoSQL databases, preferably MongoDB. Experience creating reusable code/scripts for an automated framework using Python (Pandas, NumPy, Requests, Boto3). Experience creating automation frameworks using Selenium (Python) and tools to support test automation for web applications. Experience working with Postman, SoapUI, GraphQL or similar API testing tools. Strong More ❯
problems Direct problem solving for projects or major phases of projects to resolve software technical issues Develop, test, debug, and implement software programs, applications and projects using Java, C#, SQL, JavaScript, or other related software engineering languages, as well as keeping abreast of emerging technologies impactful to CRD's business. Provide informed guidance and critical analysis of proposed changes during code … cloud providers (Azure, AWS, Google Cloud) Experience of 2-4 years in cloud-native development using Java and Spring Experience in Angular or React Experience in Snowflake Experience in SQL Server Knowledge of Kubernetes Experience in developing observable, operable cloud-native software that horizontally scales Experience in financial services developing solutions for Portfolio Management, Trading, Compliance, Post-Trade, IBOR or More ❯
knowledge and experience of designing, building, optimising, deploying and managing business-critical machine learning models using Azure ML in production environments. You must have good technical knowledge of Python, SQL and CI/CD, and be familiar with Power BI. A FTSE 250 company, they combine expertise and insight with advanced technology and analytics to address the needs of over … profile Essential Criteria Previous experience in designing, building, optimising, deploying and managing business-critical machine learning models using Azure ML in production environments. Experience in data wrangling using Python, SQL and ADF. Experience in CI/CD and DevOps/MLOps and version control. Familiarity with data visualization and reporting tools, ideally Power BI. Good written and verbal communication and interpersonal More ❯
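As a generic illustration of the model build-and-evaluate loop behind this machine-learning listing (scikit-learn and synthetic data are used purely for brevity; the advertised role centres on Azure ML, whose training-pipeline, registration and deployment steps are not shown here):

```python
# Generic, illustrative train/evaluate sketch; synthetic data, no Azure ML specifics.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for a real business dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate on the hold-out split before any deployment decision.
scores = model.predict_proba(X_test)[:, 1]
print(f"Hold-out ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```

In the role described, a model like this would typically be trained, versioned and promoted through Azure ML pipelines and CI/CD rather than run as a standalone script.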