Senior Data Engineer
Join Kainos and Shape the Future

At Kainos, we're problem solvers, innovators, and collaborators, driven by a shared mission to create real impact. Whether we're transforming digital services for millions, delivering cutting-edge Workday solutions, or pushing the boundaries of technology, we do it together. We believe in a people-first culture, where your ideas are valued, your growth is supported, and your contributions truly make a difference. Here, you'll be part of a diverse, ambitious team that celebrates creativity and collaboration. Ready to make your mark? Join us and be part of something bigger.

MAIN PURPOSE OF THE ROLE & RESPONSIBILITIES IN THE BUSINESS:

As a Senior Data Engineer (Senior Associate) at Kainos, you will be responsible for designing and developing data processing and data persistence software components for solutions which handle data at scale. Working in agile teams, Senior Data Engineers provide strong development leadership and take responsibility for significant technical components of data systems. You will work within a multi-skilled agile team to design and develop large-scale data processing software to meet user needs in demanding production environments.

Your Responsibilities Will Include:
- Working to develop data processing software primarily for deployment in Big Data technologies. The role encompasses the full software lifecycle including design, code, test and defect resolution.
- Working with Architects and Lead Engineers to ensure the software supports non-functional needs.
- Collaborating with colleagues to resolve implementation challenges and ensure code quality and maintainability remain high, leading by example on code quality.
- Working with operations teams to ensure operational readiness.
- Advising customers and managers on the estimated effort and technical implications of user stories and user journeys.
- Coaching and mentoring team members.

Minimum (Essential) Requirements:
- Strong software development experience in one of Java, Scala, or Python
- Software development experience with data-processing platforms from vendors such as AWS, Azure, GCP, or Databricks
- Experience of developing substantial components for large-scale data processing solutions and deploying into a production environment
- Proficient in SQL and SQL extensions for analytical queries
- Solid understanding of ETL/ELT data processing pipelines and design patterns
- Aware of key features and pitfalls of distributed data processing frameworks, data stores and data serialisation formats
- Able to write quality, testable code and has experience of automated testing
- Experience with Continuous Integration and Continuous Deployment techniques
- A keen interest in AI technologies
- Experience of performance tuning
- Experience of data visualisation and complex data transformations
- Experience with streaming and event-processing architectures, including technologies such as Kafka and change-data-capture (CDC) products
- Expertise in continuous improvement and sharing input on data best practices
- Practical experience with AI technologies, tools, processes and delivery