is preserved, and you have opportunities to develop your career in this established SaaS organisation. We seek candidates with 1 to 3 years of experience who are: proficient in SQL, including writing and refining queries and stored procedures; familiar with AWS Redshift, PostgreSQL, or similar OLAP/OLTP databases; skilled in Python, especially for AWS Lambda functions; and knowledgeable about event … Infrastructure as Code. Our client's tech stack leverages AWS services such as Lambda, Step Functions, and Redshift, providing a robust platform for scalable solutions. You'll dive into SQL, Python, and Amazon States Language within Step Functions, all while maintaining a DevOps approach that encourages automation, CI/CD, and continuous improvement. Seize this exciting chance to advance your …
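For a feel of the Python-plus-SQL work this role describes, the sketch below shows a minimal AWS Lambda handler submitting a parameterised query to Redshift through the boto3 Redshift Data API, which a Step Functions state machine could then poll. The table, environment-variable and parameter names are invented for illustration and are not taken from the advert.

```python
import os

import boto3

# Client for the Redshift Data API, which lets Lambda run SQL
# without managing persistent database connections.
redshift_data = boto3.client("redshift-data")


def lambda_handler(event, context):
    """Submit a parameterised query to Redshift and return the statement id."""
    sql = """
        SELECT order_date, COUNT(*) AS orders
        FROM sales.orders                      -- illustrative table name
        WHERE order_date >= :start_date
        GROUP BY order_date
        ORDER BY order_date;
    """
    response = redshift_data.execute_statement(
        WorkgroupName=os.environ["REDSHIFT_WORKGROUP"],  # or ClusterIdentifier for a provisioned cluster
        Database=os.environ["REDSHIFT_DATABASE"],
        SecretArn=os.environ["REDSHIFT_SECRET_ARN"],     # credentials held in Secrets Manager
        Sql=sql,
        Parameters=[{"name": "start_date", "value": event["start_date"]}],
    )
    # The Data API is asynchronous; a Step Functions state machine can poll
    # describe_statement until the query completes.
    return {"statement_id": response["Id"]}
```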
EC2M, Coleman Street, Greater London, United Kingdom
Devonshire Hayes Recruitment Specialists Ltd
process from a lead position. Strong test automation experience (10 years minimum); experience of using test management tools (e.g. Azure DevOps, HP ALM, Jira); Visual Studio 2019 or higher; SQL; experience of MS Office, MS Project and Visio; SDLC methodology; testing in an Agile (Scrum) environment; integrating automated test cases with CI/CD frameworks and tools (Azure DevOps).
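As a hedged illustration of the "automated test cases integrated with CI/CD" requirement, the sketch below is a small pytest module of the kind an Azure DevOps pipeline could run (for example with `pytest --junitxml=results.xml` followed by a publish-test-results step). The pricing rule under test is a stand-in written purely for this example.

```python
import pytest


def calculate_discount(order_value: float) -> float:
    """Toy business rule: 5% discount on orders of 1,000 or more."""
    if order_value < 0:
        raise ValueError("order value cannot be negative")
    return order_value * 0.05 if order_value >= 1_000 else 0.0


@pytest.mark.parametrize(
    ("order_value", "expected_discount"),
    [
        (0, 0.0),       # nothing ordered, no discount
        (999, 0.0),     # just below the threshold
        (1_000, 50.0),  # threshold reached, 5% applies
    ],
)
def test_calculate_discount(order_value, expected_discount):
    assert calculate_discount(order_value) == pytest.approx(expected_discount)


def test_negative_order_value_rejected():
    with pytest.raises(ValueError):
        calculate_discount(-10)
```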
Wrexham, Abergele or Bangor, United Kingdom Hybrid / WFH Options
Betsi Cadwaladr University Health Board
insight Department has a fantastic opportunity for an enthusiastic professional to join our Development Team as a Data Engineer. We are looking for a motivated person who possesses excellent SQL skills. The ideal candidate will work flexibly as part of a small team responsible for maintaining and developing the department's Data Warehouse. We offer a full induction, flexible working … The data warehouse supports the organisation in strategically and operationally managing health services across North Wales. The following are therefore essential: knowledge of data warehousing concepts and techniques; advanced SQL skills; experience of ETL processes; in-depth knowledge of the Microsoft SQL Server platform and client tools (SSMS); and an understanding of the importance of thorough testing. The applicant must possess excellent … qualifications, knowledge or experience. Desirable: Microsoft Certified qualifications in Data Engineering or a similar subject. Experience (essential): experience of working in a data engineering or similar role; working knowledge of SQL query language; evidence of experience with Windows PC and server operating systems. Experience (desirable): knowledge of working with different systems; experience of working with databases. Aptitude and abilities (essential): ability to …
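To give a flavour of the ETL work against Microsoft SQL Server described above, here is a minimal, assumption-laden sketch of one load step using the pyodbc driver. The server, staging table and column names are placeholders, not anything from the Health Board's actual warehouse.

```python
import pyodbc

# Connect to the warehouse server (illustrative connection details).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=dw-server;DATABASE=DataWarehouse;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # batch the inserts for speed

rows = [
    ("2024-04-01", "Wrexham", 128),
    ("2024-04-01", "Bangor", 94),
]

# Load into a staging table; a separate stored procedure would typically
# merge staging rows into the warehouse fact tables.
cursor.executemany(
    "INSERT INTO staging.Attendances (AttendanceDate, Site, Attendances) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
cursor.close()
conn.close()
```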
Work with engineers and analysts to design, implement, and maintain reliable, observable ETL/ELT workflows using Airflow and managed cloud services. Focus on Python-first implementations, high-quality SQL, Airflow orchestration, and query engines such as Athena, Trino, or ClickHouse. Required Skills: Hands-on software development and data engineering experience. Must have a strong financial or trading background. Strong Python … SQL skills: writing clean, testable code following SOLID principles. Hands-on experience using Athena, Trino, ClickHouse, or other distributed SQL engines and knowledge of cost/scan optimization. Experience with cloud platforms (AWS/Azure/GCP): working with object storage, managed query services, and data catalogs.
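The sketch below illustrates the Python-first Airflow orchestration and Athena querying this role mentions; it is written against recent Airflow 2.x and boto3, and the database, table and S3 output location are placeholders.

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def run_athena_query(**_):
    """Submit a daily aggregation to Athena and return the execution id."""
    athena = boto3.client("athena")
    response = athena.start_query_execution(
        QueryString="""
            SELECT symbol, SUM(quantity * price) AS notional
            FROM trades                         -- illustrative table name
            WHERE trade_date = current_date
            GROUP BY symbol
        """,
        QueryExecutionContext={"Database": "markets"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    return response["QueryExecutionId"]


with DAG(
    dag_id="daily_trade_summary",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="summarise_trades", python_callable=run_athena_query)
```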
architecture framework. Collaborate with internal teams leading governance and process standardisation. Define best practices for data lineage, metadata management, and documentation. Advise on integration and harmonisation of reporting platforms (SQL, Power BI, Tableau, Python). Support the creation of robust controls and standardised processes for regulatory reporting. Ensure architectural recommendations align with business goals and regulatory requirements. Essential Experience: previous experience … strategy, governance, and data management frameworks; strong understanding of data lineage, metadata, and documentation standards; experience working with insurance or financial services data; familiarity with reporting tools and platforms (SQL, Power BI, Tableau, Python); ability to work with cross-functional teams and influence senior stakeholders; experience designing scalable data architectures for global organisations; knowledge of regulatory reporting requirements in insurance; experience …
London (City of London), South East England, United Kingdom
Capgemini
build a more sustainable, more inclusive world. Your Role: Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. You will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and data science initiatives. Key Responsibilities: design and implement scalable data models and transformation pipelines using DBT on Snowflake; write efficient and maintainable PL/SQL code for complex data processing and transformation tasks; collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions; optimize Snowflake performance through query tuning, clustering, and resource management; ensure data quality, integrity … and enhance CI/CD pipelines for DBT projects. Required Qualifications: 5+ years of experience in data engineering or a related field; strong hands-on experience with DBT (modular SQL development, testing, documentation); proficiency in Snowflake (data warehousing, performance tuning, security); advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages; solid understanding of …
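By way of a hedged illustration of the Snowflake transformation work a DBT model would typically encapsulate, the sketch below runs an incremental MERGE through the Snowflake Python connector. The account details, schema and table names are invented for the example; this is not Capgemini's actual pipeline.

```python
import os

import snowflake.connector

# Connection details come from the environment (placeholder variable names).
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# Incremental upsert of the kind a dbt incremental model generates.
merge_sql = """
MERGE INTO fct_payments AS target
USING staging.payments AS source
    ON target.payment_id = source.payment_id
WHEN MATCHED THEN UPDATE SET target.amount = source.amount,
                             target.updated_at = source.updated_at
WHEN NOT MATCHED THEN INSERT (payment_id, amount, updated_at)
    VALUES (source.payment_id, source.amount, source.updated_at);
"""

cur = conn.cursor()
try:
    cur.execute(merge_sql)
    print(cur.fetchone())  # MERGE reports rows inserted/updated
finally:
    cur.close()
    conn.close()
```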
London (City of London), South East England, United Kingdom
Codex
Maximo implementations (ideally managing teams if at the senior level). Proven experience working within asset-intensive industries (Utilities, Transportation, Oil & Gas, Manufacturing, etc.). Hands-on experience with databases (Oracle, SQL Server, DB2) and SQL scripting. Proficiency in at least one programming language or environment (e.g. Java, Python, or Linux/Solaris). Excellent communication and problem-solving skills.
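To illustrate the "databases plus SQL scripting" side of a Maximo role, here is a small sketch using the python-oracledb driver. The WORKORDER table, its columns and the WAPPR status follow common Maximo naming conventions but should be treated as assumptions for this example.

```python
import oracledb

# Illustrative read-only connection to a Maximo database.
conn = oracledb.connect(user="maximo_ro", password="example", dsn="maxdb_host/MAXDB")

query = """
    SELECT wonum, description, status, reportdate
    FROM maximo.workorder
    WHERE status = :status
      AND reportdate >= TRUNC(SYSDATE) - 30
    ORDER BY reportdate DESC
"""

with conn.cursor() as cur:
    # Work orders waiting on approval in the last 30 days (assumed status code).
    cur.execute(query, status="WAPPR")
    for wonum, description, status, reportdate in cur:
        print(wonum, status, reportdate, description)

conn.close()
```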
experience in quantitative analytics or data modelling. Deep understanding of predictive modelling, machine learning, clustering and classification techniques, and algorithms. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau …
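For context on the clustering and classification techniques listed, a compact scikit-learn sketch on synthetic data follows; it is a generic illustration, not a description of the employer's models.

```python
from collections import Counter

from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic features and labels standing in for a modelling dataset.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Unsupervised step: segment observations into clusters.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_train)
print("Cluster sizes:", Counter(clusters))

# Supervised step: a gradient-boosted classifier as the predictive model.
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
print("Test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```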
able to work with a loose brief and deliver tangible results. Strong communicator, comfortable explaining what the data means to colleagues at all levels. Experience with Power BI or SQL would be beneficial but not essential. Why Apply? Join a stable, well-respected organisation where your analysis directly supports operational and performance improvement. Opportunity to modernise and automate legacy reporting …
modelling tools (Visio, Lucidchart, Miro, Jira, Confluence). Skilled in documenting use cases, user stories, acceptance criteria, and process maps. Familiarity with data governance, security, integration principles, and basic SQL/reporting for validation. Core Strengths Analytical Thinking: Clarifies complex requirements using structured approaches. Stakeholder Management: Builds consensus across multi-level teams. Business Acumen: Understands finance, HR, and operational workflows.
Shops & Onsite Shop, Sports & Social Club and more. Skills Required: ideally degree qualified in a STEM or Data Engineering subject, with experience in the following subjects, technologies and tools: SQL technologies (e.g. MS SQL, Oracle); NoSQL technologies (e.g. MongoDB, InfluxDB, Neo4J); data exchange and processing (e.g. ETL, ESB, API); development skills (e.g. Python); big data technologies knowledge …
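A toy sketch tying together the SQL, NoSQL and ETL skills listed above: extract from a relational source, reshape in Python, and load into MongoDB. The in-memory SQLite source and the database/collection names exist only to keep the example self-contained.

```python
import sqlite3

from pymongo import MongoClient

# Extract: a relational source (in-memory SQLite stands in for MS SQL/Oracle
# so the sketch runs anywhere).
sql_conn = sqlite3.connect(":memory:")
sql_conn.executescript(
    """
    CREATE TABLE sensor_readings (sensor_id TEXT, recorded_at TEXT, value REAL);
    INSERT INTO sensor_readings VALUES
        ('pump-01', '2024-04-01T10:00:00', 7.3),
        ('pump-02', '2024-04-01T10:00:00', 5.1);
    """
)
rows = sql_conn.execute(
    "SELECT sensor_id, recorded_at, value FROM sensor_readings"
).fetchall()

# Transform: reshape tuples into documents.
documents = [
    {"sensor_id": s, "recorded_at": t, "value": v} for s, t, v in rows
]

# Load: write the documents to a MongoDB collection (assumes a local MongoDB).
mongo = MongoClient("mongodb://localhost:27017")
mongo.telemetry.readings.insert_many(documents)

sql_conn.close()
mongo.close()
```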
with overall business objectives. Evaluate emerging technologies and recommend areas for continuous improvement. Design, build, and maintain API integrations to enable seamless data flow between key business systems. Use SQL, JSON, OData, and REST APIs to manage and develop data warehousing solutions in line with best practices. Leverage Microsoft Power Apps and related tools to enhance system functionality and user … and Azure cloud services. Familiarity with legal sector applications (e.g. iManage, 3E, Intapp, Mimecast, Tessian, Litera) is advantageous. Experience with API development, data integration, and automation tools. Proficiency in SQL, JSON, OData, and REST APIs; experience with data warehousing best practices. Working knowledge of Microsoft Power Apps and low-code/no-code integration tools. Strong grasp of IT security …
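As a hedged example of the API integration and data-staging work described, the sketch below pulls records from a hypothetical OData endpoint with requests and stages the JSON for a downstream warehouse load; the URL, query options and field names are invented for illustration.

```python
import json

import requests

# Hypothetical OData endpoint exposed by a line-of-business system.
ENDPOINT = "https://example-erp.local/odata/TimeEntries"

response = requests.get(
    ENDPOINT,
    params={"$filter": "entryDate ge 2024-01-01", "$top": 500},  # OData query options
    headers={"Authorization": "Bearer <token>", "Accept": "application/json"},
    timeout=30,
)
response.raise_for_status()

records = response.json().get("value", [])  # OData wraps results in a "value" array

# Stage the raw payload; a downstream SQL job would merge it into the warehouse.
with open("time_entries.json", "w", encoding="utf-8") as fh:
    json.dump(records, fh, indent=2)

print(f"Staged {len(records)} time entries")
```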