Cardiff, South Glamorgan, Wales, United Kingdom (Hybrid/Remote Options)
Artis Recruitment
… closely with cross-functional teams to deliver high-quality technical solutions. Participate in secure-by-design development practices and code reviews. Support CI/CD workflows and assist with performance optimisation. Provide guidance to junior team members where required. Ideal Background: 5+ years’ commercial full stack development experience. Degree in Computer Science or similar discipline. Strong capability with …
… real-time dashboards. You'll have the chance to: Build responsive, scalable frontend applications in Angular & TypeScript. Collaborate with .NET/Cloud Engineers, & contribute to CI/CD pipelines & frontend performance optimisation. Join a carbon-negative scale-up at a pivotal point of growth, with stacks of progression opportunity. Salary to £55k + bonus + shares + 25 days' leave …
Cardiff, Wales, United Kingdom (Hybrid/Remote Options)
Undisclosed
… implementing the same on the system with business teams. Implementation of Standard Operating Procedures (SOPs). Communication between client and offshore teams. Providing solutions for business requirements and applying performance optimization techniques. Performing code reviews for custom-developed programs and completing end-to-end testing. Leading the ABAP team and involvement in project management. Design and develop incoming and …
… solving complex challenges, optimising large-scale systems, and influencing strategy within a collaborative, forward-thinking team. What you’ll be doing: Lead the design and implementation of scalable, high-performance data architectures and pipelines. Define and enforce best practices for data engineering, including coding standards, testing, and documentation. Mentor and guide engineers, fostering collaboration and technical excellence. Translate complex … business requirements into reliable, well-structured data solutions. Optimise data workflows for performance, reliability, and cost efficiency. Drive adoption of modern data tools and technologies across the organisation. Ensure robust data governance, security, and compliance. Troubleshoot and resolve complex data issues, delivering long-term solutions. Work with analytics, product, and engineering teams to support advanced analytics and machine learning … skills you’ll need: Extensive experience designing and building large-scale data pipelines and ETL processes. Strong proficiency in SQL and Python. Deep understanding of data modelling, warehousing, and performance optimisation. Proven experience with cloud platforms (AWS, Azure, or GCP) and their data services. Hands-on experience with big data frameworks (e.g. Apache Spark, Hadoop). Strong knowledge of …