Guildford, Surrey, United Kingdom Hybrid/Remote Options
Sussex Police
support the build of dashboards based on Microsoft Power BI. Developing R or Python scripts to produce predictive models for use within Power BI or reporting. Maintenance of Power BI, SQL and related reporting systems, helping to ensure a useful and reliable reporting function for the business. Developing data pipelines from Microsoft Fabric and integration products to exploit and maximise a … BI/Microsoft Analytics solutions, with a good grounding in all associated areas including: development, Enterprise Architecture, Governance, Rollout and Adoption. Knowledge or experience of Data Factory, Data Lake, Azure SQL Database, Azure SQL Data Warehouse, Azure Analysis Services, RStudio, Python, Power Apps or Microsoft Fabric. Considerable technical knowledge or experience in any of the following: Power BI Report Server … or cloud-based service, or similar visualisation experience. Databases including MS SQL, Oracle Database, data warehousing, SQL and XML. Microsoft Fabric experience. R or Python predictive modelling/classification. Experience in building dashboards, reports and cubes using SQL, MDX, DAX, Power Query M code, Power BI or other visualisation tools. Detailed understanding or experience of how to develop cloud-based …
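As an illustration of the R or Python predictive-modelling work described above, here is a minimal Python sketch (not taken from the posting; the column names, file name and model choice are assumptions): train a simple classifier and write scored rows to a file that a Power BI dataset could ingest, or that a Python step in Power Query could produce.

# Minimal sketch only: hypothetical incident data, illustrative feature names.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.DataFrame({
    "calls_last_30d": [3, 12, 7, 1, 9, 15, 2, 8],
    "priority_score": [0.2, 0.9, 0.5, 0.1, 0.7, 0.95, 0.15, 0.6],
    "repeat_incident": [0, 1, 1, 0, 1, 1, 0, 0],
})
X, y = df[["calls_last_30d", "priority_score"]], df["repeat_incident"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Train a classifier and check it against the small hold-out split.
model = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_train, y_train)
print("Hold-out accuracy:", model.score(X_test, y_test))

# Score every row and export for reporting; Power BI can read this file as a
# data source (or the same logic can run as a Python script step in Power Query).
df["predicted_risk"] = model.predict_proba(X)[:, 1]
df.to_csv("scored_incidents.csv", index=False)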
Power BI (Power Query, Power BI Desktop, Power BI Service, DAX, and M Query) Data/statistical analysis experience (e.g. R or SPSS) Programming experience beyond simple 'selects' (e.g. SQL, Python, Java) Excellent communication and presentation skills for client-facing interactions High attention to detail and good organisational skills Strong analytical skills able to understand processes, gather and shape requirements … Analyst) Experience of financial reporting, FP&A and Excel modelling Knowledge of data model design principles and ETL Understanding of Azure services (e.g., Azure Synapse, Azure Data Factory, Azure SQL) Knowledge of Power Automate and Power Apps integration with Power BI Knowledge of non-Microsoft data sources (Snowflake, Oracle, PostgreSQL, MySQL, Synapse, BigQuery) Performance optimisation (partitioning, understanding of columnar … database concept, SQL indexing) Troubleshooting (SQL Profiler or other profiler tools, Tabular Editor, DAX Studio, Power BI datasets, RLS security) Additional Information: All your information will be kept confidential according to EEO guidelines. At this time insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located.
impact through data engineering, software development, or analytics Demonstrated success in launching and scaling technical products or platforms Strong programming skills in at least two of the following: Python, SQL, Java Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams Deep knowledge of database technologies: Distributed systems (e.g., Spark, Hadoop, EMR); RDBMS (e.g., SQL Server …
we need you to have. These include: Must-haves: Proven track record as a developer in an insurance or financial services environment; Strong working knowledge of programming languages (e.g. SQL, C#, VBA, JavaScript, XML, JSON, Python); Experience building Excel-integrated applications and/or web-based pricing platforms; Strong database skills (SQL Server or similar) and data integration experience; Proactive … patterns and able to implement where appropriate; Working knowledge of the software delivery and development process; Strong comprehension of working with large data sets and being able to write complex SQL queries; Have qualifications in an insurance or reinsurance environment working with catastrophe and/or actuarial models; Previous Lloyd's and PRA experience; Knowledge of DevOps practice. Desirable skills: Experience …
Birmingham, West Midlands, England, United Kingdom Hybrid/Remote Options
Isio
engineering role, ideally as a senior or lead engineer. Strong, hands-on experience of Azure Data Factory for managing and orchestrating ETL processes. Experience with Microsoft Fabric products. Strong SQL experience, including SQL queries and stored procedures, and formal database design methodologies. Experience in setting up monitoring and data quality exception handling. Strong data modelling experience. Experience managing and developing …
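To make the monitoring and data quality exception handling point above concrete, here is a small illustrative Python sketch (column names, rules and thresholds are assumptions, not from the posting): validate a batch of rows and route failures to an exceptions set rather than loading them silently.

# Illustrative only: hypothetical pension-contribution batch and validation rules.
import pandas as pd

batch = pd.DataFrame({
    "member_id": [101, 102, None, 104],
    "contribution": [250.0, -10.0, 300.0, 125.5],
})

# One boolean column per rule; True means the row fails that rule.
flags = pd.DataFrame({
    "missing_member_id": batch["member_id"].isna(),
    "negative_contribution": batch["contribution"] < 0,
})

failed_mask = flags.any(axis=1)
exceptions = batch[failed_mask].copy()
exceptions["failed_rules"] = flags[failed_mask].apply(lambda r: ",".join(r.index[r]), axis=1)
clean = batch[~failed_mask]

# In a real pipeline the exceptions would be persisted (for example to an Azure
# SQL table) and surfaced through monitoring/alerting; here we just print them.
print(exceptions)
print(f"{len(clean)} rows passed, {len(exceptions)} routed to exception handling")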
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
Expertise in one or more modern cloud data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar. Understanding of data modelling principles, dimensional modelling, and database design. Proficiency in SQL and query optimization. Comprehensive knowledge of ETL/ELT processes and data pipeline architecture. Excellent communication skills with the ability to collaborate across cross-functional teams. Experience managing client relationships …
for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Need to program using languages such as SQL, Python, R … Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage, and Azure SQL Database. Experience working with large datasets and complex data pipelines. Experience with data architecture design and data pipeline optimization. Proven expertise with Databricks, including hands-on implementation experience and certifications. … Experience with SQL and NoSQL databases. Experience with data quality and data governance processes. Experience with version control systems (e.g., Git). Experience with Agile development methodologies. Excellent communication, interpersonal, and problem-solving skills. Experience with streaming data technologies (e.g., Kafka, Azure Event Hubs). Experience with data visualisation tools (e.g., Tableau, Power BI). Experience with DevOps tools and …
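For a flavour of the Databricks cleaning-and-aggregation work described above, here is a minimal PySpark sketch (table and column names are assumptions, and persisting in Delta format assumes the Delta Lake libraries available on Databricks):

# Illustrative only: clean a small claims dataset, then aggregate it with Spark SQL.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_cleaning").getOrCreate()

raw = spark.createDataFrame(
    [("C001", "2024-01-03", 1200.0), ("C002", "2024-01-04", None), ("C001", "2024-01-03", 1200.0)],
    ["claim_id", "claim_date", "amount"],
)

# Clean and enrich: drop duplicates, fill missing amounts, add a derived column.
cleaned = (
    raw.dropDuplicates(["claim_id", "claim_date"])
       .fillna({"amount": 0.0})
       .withColumn("claim_year", F.year(F.to_date("claim_date")))
)

# Spark SQL over a temporary view, as an alternative to the DataFrame API.
cleaned.createOrReplaceTempView("claims_clean")
summary = spark.sql(
    "SELECT claim_year, COUNT(*) AS n_claims, SUM(amount) AS total_amount "
    "FROM claims_clean GROUP BY claim_year"
)

# On Databricks this would typically be persisted as a Delta table, e.g.:
# summary.write.format("delta").mode("overwrite").saveAsTable("gold.claims_summary")
summary.show()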
and operate ELT/ETL pipelines using Microsoft-based tools (Data Factory, Fabric). Maintain a medallion architecture (Bronze–Gold) for trusted, refined datasets. Develop, optimize, and maintain complex SQL queries to support analytics and reporting requirements. Implement data quality, testing and observability; ensure lineage, accuracy and compliance. Enable self-serve analytics through well-documented models and transformation logic. Integrate … purpose data solutions. Proactively identify opportunities for continuous improvement. What you can already do: Minimum 3 years' experience in data engineering, data analytics, and BI. Proficiency in Python and SQL languages. Experience in delivering technology projects within a fast-paced business, medium-sized organisations. Deliver solutions within the appropriate framework and methodology whilst ensuring the supportability of services delivered. Experience … control systems (Git, GitHub) and CI/CD best practices. Excellent understanding of Power BI Service and Fabric. Strong grasp of data modelling and warehousing concepts such as MS SQL Server, Oracle and Snowflake. Knowledge of Infrastructure-as-Code (e.g., Terraform), identity and secrets management (IAM), and cloud cost optimization at scale. Knowledge of information principles, processes, and Master Data …
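The complex SQL mentioned above typically sits in the refined layers of a medallion (Bronze–Gold) architecture. Below is a small, self-contained Python/SQL sketch of that kind of windowed aggregation; the table and column names are hypothetical, and sqlite3 merely stands in for the warehouse engine so the example runs anywhere.

# Illustrative only: roll Silver-layer orders up into a Gold-style monthly summary
# with a running total per customer.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE silver_orders (order_id TEXT, customer_id TEXT, order_date TEXT, amount REAL);
    INSERT INTO silver_orders VALUES
        ('O1', 'C1', '2024-01-05', 120.0),
        ('O2', 'C1', '2024-02-10', 80.0),
        ('O3', 'C2', '2024-01-20', 200.0),
        ('O4', 'C2', '2024-03-02', 50.0);
""")

gold_query = """
    WITH monthly AS (
        SELECT customer_id,
               strftime('%Y-%m', order_date) AS order_month,
               SUM(amount) AS monthly_spend
        FROM silver_orders
        GROUP BY customer_id, order_month
    )
    SELECT customer_id,
           order_month,
           monthly_spend,
           SUM(monthly_spend) OVER (PARTITION BY customer_id ORDER BY order_month) AS running_spend
    FROM monthly
    ORDER BY customer_id, order_month;
"""
for row in conn.execute(gold_query):
    print(row)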
team. Identify opportunities to improve data quality, reliability, automation, and reuse throughout the data lifecycle. Experience and Skills Required: Strong hands-on experience with Azure Data Factory and Azure SQL. Proven expertise in Power BI, including creating intuitive and user-centred visualisations. Awareness or experience of Microsoft Fabric technologies. Solid understanding of data warehouse design principles, including Kimball or Inmon …
multiple data analytics tools (e.g. Power BI) Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling Proficiency in advanced programming languages (Python/PySpark, SQL) Experience in data pipeline orchestration (e.g. Airflow, Data Factory) Familiarity with DevOps and CI/CD practices (Git, Azure DevOps, etc.) Ability to communicate technical concepts to both technical and …
Design, build, and maintain scalable, cloud-based data pipelines and architectures to support advanced analytics and machine learning initiatives. Develop robust ELT workflows using tools like dbt, Airflow, and SQL (PostgreSQL, MySQL) to transform raw data into high-quality, analytics-ready datasets. Collaborate with data scientists, analysts, and software engineers to ensure seamless data integration and availability for predictive modeling … Workflow Orchestration: Production experience with Apache Airflow (Prefect, Dagster or similar), including authoring DAGs, scheduling workloads and monitoring pipeline execution. Data Modeling: Strong skills in dbt, including writing modular SQL transformations, building data models, and maintaining dbt projects. SQL Databases: Extensive experience with PostgreSQL, MySQL (or similar), including schema design, optimization, and complex query development. Infrastructure as Code: Production experience …
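As a concrete illustration of the Airflow-plus-dbt ELT orchestration described above, here is a minimal Airflow DAG sketch in Python; the DAG id, task commands and dbt project path are illustrative assumptions rather than anything from the posting.

# Illustrative only: load raw data, then run and test dbt models.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="elt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Load raw data into the warehouse (placeholder command).
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command="python /opt/pipelines/load_raw.py",
    )

    # Build analytics-ready tables with dbt, then run dbt tests against them.
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt")
    dbt_test = BashOperator(task_id="dbt_test", bash_command="dbt test --project-dir /opt/dbt")

    load_raw >> dbt_run >> dbt_test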
Nottingham, Nottinghamshire, England, United Kingdom Hybrid/Remote Options
BUZZ Bingo
data warehousing and modelling. Proficiency in C#, Python, Java, or Scala. Hands-on experience with ETL tools (e.g., SSIS) and orchestration tools (e.g., Azure Data Factory). Strong SQL skills and experience with relational databases (MSSQL, PostgreSQL, MySQL). Familiarity with Azure services (Fabric, Azure SQL, Synapse Analytics, Blob Storage) and hybrid cloud/on-prem solutions. Understanding of …
University College London Hospitals NHS Foundation Trust
day-to-day business of a teaching hospital and will work effectively with clinical, operational, finance, transformation, ICT, and other teams. In addition, you will possess: expert technical skills (SQL, EPIC Reporting Stack, R); innate problem-solving skills; a detailed understanding of the NHS Data Dictionary, Payment by Results and other commissioning options; significant experience with a broad set of NHS … a relevant Master's degree level qualification or equivalent experience. Experience Essential: Significant experience in an information analysis/processing role. Proven experience of working with relational databases, including complex SQL. Proven experience of using SQL Server to a high standard. Proven experience of using MS Excel to a high standard. Desirable: Experience in an Information Analysis role in an NHS or …
Southampton, Hampshire, South East, United Kingdom
Tetra Tech
and engineers, sharing ideas and driving innovation across the globe. Cutting-Edge Technology: Get hands-on with the full Esri product suite (ArcGIS Pro, Enterprise, AGOL), Microsoft Azure, MS SQL Server, PostgreSQL and the latest open-source platforms - plus AI/ML frameworks that power truly intelligent mapping and analytics. Diverse, High-Impact Projects: From single-discipline proofs-of-concept … additional experience in .NET, Java or modern JavaScript libraries. Experience with Web GIS development, including React and integration with Enterprise/PostgreSQL backends. Experience with Enterprise Geodatabases, databases and SQL for handling large datasets. Experience with deployment, CI/CD, and version control (e.g., Azure DevOps, Git). AI literacy, with the ability to communicate concepts clearly and support skill …
big data architecture and performance optimisation. CI/CD & Automation: Skilled in Jenkins, GitHub Actions, and Python scripting for automated ETL and Power BI deployments. Data Engineering: Proficient in SQL, Spark, EMR, and data warehousing; experienced in optimizing ETL pipelines and Power BI models (DAX, refresh scheduling). Containerization & DevOps: Hands-on with Docker and Kubernetes; experienced in scalable, portable …
Required Qualifications: ● 4+ years relevant industry experience in a data engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field ● Proficiency in writing SQL queries and knowledge of cloud-based databases like Snowflake, Redshift, BigQuery or other big data solutions ● Experience in data modelling and tools such as dbt, ETL processes, and data warehousing …
data and search capability to thousands of users across hundreds of clients across the globe. As the RDM Developer you will be hands-on, continually contributing to the predominantly SQL codebase and investigating/solving data issues. Our platform is SQL Server in Azure with a web-services layer. We are looking for individuals who will challenge ideas in a … ability to use initiative to make sure tasks are progressed and workload is prioritised accordingly. Responsibilities: Design and build scalable, efficient, and fault-tolerant data products using predominantly T-SQL on SQL Server in Azure and Azure DevOps. Work closely with the rest of the Data team to identify, refine and build solutions to enhance the Reference … tech-debt backlog. Mandatory Skills, Knowledge or Experience: At least 3 years of experience in data engineering in a fast-paced, large-scale production environment. Demonstrable expert skills in SQL, including knowledge of efficient and performant query design. Experience of programming/scripting languages (e.g. T-SQL/C#/PowerShell/Python). Experience building production systems utilising cloud systems …
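Since the role is predominantly T-SQL on SQL Server in Azure, here is a hedged Python sketch of that style of work; the server, database, table and stored-procedure names are placeholders rather than anything from the posting.

# Illustrative only: a parameterised T-SQL query and a stored-procedure call via pyodbc.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # hypothetical server
    "DATABASE=ReferenceData;"                 # hypothetical database
    "UID=svc_rdm;PWD=<secret>;Encrypt=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cur = conn.cursor()

    # Parameterised query: never concatenate values into T-SQL strings.
    cur.execute(
        "SELECT TOP (10) EntityId, EntityName FROM dbo.ReferenceEntity WHERE IsActive = ?",
        1,
    )
    for entity_id, entity_name in cur.fetchall():
        print(entity_id, entity_name)

    # Execute a (hypothetical) stored procedure that refreshes a reference set.
    cur.execute("EXEC dbo.usp_RefreshReferenceSet @SetName = ?", "countries")
    conn.commit()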
for backlog management, sprint planning, and CI/CD. Technical Skills. Azure Data Factory: Expert in building, automating, and optimising ETL pipelines. Azure Synapse Analytics: Strong experience with dedicated SQL pools, data warehousing concepts, and performance tuning. Power BI: Advanced experience managing enterprise models, datasets, and governance processes. SQL: Expert-level proficiency in query design, optimisation, and data transformation. Azure …
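One common automation task behind the Power BI dataset governance mentioned above is triggering a refresh programmatically. The sketch below uses the Power BI REST API's refresh endpoint; the dataset ID and the way the Azure AD access token is obtained are placeholders/assumptions.

# Illustrative only: queue a refresh of an enterprise Power BI dataset.
import requests

ACCESS_TOKEN = "<Azure AD bearer token for the Power BI service>"  # placeholder
DATASET_ID = "00000000-0000-0000-0000-000000000000"                # hypothetical

url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes"
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
# A 202 Accepted response indicates the refresh was queued.
resp.raise_for_status()
print("Refresh queued:", resp.status_code)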
working closely with the Data Analytics team, focusing on ETL processes, creating roadmaps for their central data warehouse and working with stakeholders to gather requirements. Technical Skills. Languages: Python, SQL. Databases: Microsoft SQL Server (Azure SQL Database, SQLite). ETL Tools: SSIS, Azure Data Factory. Cloud Platforms: SaaS and PaaS environments. Reporting: Power BI. Productivity Tools: Advanced Excel skills. Comprehensive knowledge …
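As a light illustration of the Python-and-SQL ETL work in this stack, here is a minimal sketch; the column names, table name and the SQLite connection string (standing in for Azure SQL Database) are assumptions.

# Illustrative only: extract inline CSV data, transform with pandas, load to a staging table.
import io
import pandas as pd
from sqlalchemy import create_engine

# Extract: in practice this would be a file, API or SSIS/Data Factory handoff;
# inline CSV keeps the sketch self-contained.
csv_data = io.StringIO(
    "Order ID,Order Date,Gross Amount,Discount\n"
    "O1,2024-01-05,120.0,10.0\n"
    "O2,2024-02-10,80.0,0.0\n"
    ",2024-02-11,55.0,5.0\n"
)
raw = pd.read_csv(csv_data)

# Transform: normalise column names, drop rows without an order id, derive net amount.
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
raw = raw.dropna(subset=["order_id"])
raw["order_date"] = pd.to_datetime(raw["order_date"], errors="coerce")
raw["net_amount"] = raw["gross_amount"] - raw["discount"]

# Load: write to a staging table (SQLite here; an Azure SQL Database connection
# string would replace this in practice).
engine = create_engine("sqlite:///warehouse.db")
raw.to_sql("stg_sales", engine, if_exists="replace", index=False)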