Data Engineer

Role: Data Engineer

Location: Glasgow

Hybrid Work: 3 days per week in the office, 2 days WFH

Salary: Competitive, circa £50,000 - £60,000

Tech stack: Python, AWS, CI/CD, ETL, Data Warehouse

We CANNOT sponsor or accept anyone on a PSW or Graduate Visa.

**This role is exclusive to Cortech, so you MUST apply via this advert**

We are looking for a Data Engineer capable of deploying and maintaining infrastructure:

  • Python (must)
  • API design and frameworks, ideally FastAPI (Flask or Django experience, in that order of preference, would also be a good indicator)
  • Experience with AWS ideally (Azure or GCP would also be a good indicator)
  • Experience with Infrastructure as Code, ideally AWS CDK (Terraform or Serverless Framework also relevant)

Great to have but can be taught:

  • Data and stream processing, e.g. AWS Firehose and ETL platforms
  • Experience with authentication frameworks
  • CI/CD with GitHub Actions (GitLab, TeamCity, or CircleCI also relevant)

This is data-related software development: you will be responsible for the whole lifecycle, including developing API endpoints for the data pipeline.

You must be able to write infrastructure code in Terraform or AWS CDK, although this can be taught.

We design and develop across a full stack of disciplines - Mechanical, Electronic, Electrical and Software Engineering. Within the Digital team, we develop software for IoT edge devices, cloud services, frontend UIs, AI/ML models for computer vision, and data analysis.

We are seeking a talented and enthusiastic Data Engineer to join our AI/ML team. We are a medium-sized enterprise, so you will work closely with everyone in the business. If this kind of direct visibility, and the opportunity to shine through your collaboration and merit, appeals to you, this is the place for you.

As a Data Engineer, you will have the opportunity to work closely with experienced professionals and gain valuable hands-on experience across the entire product development lifecycle.

Responsibilities of the role

  • Collaborate with cross-functional teams (e.g., data scientists, software engineers, ML/AI Engineers and product managers) to translate business requirements into technical specifications and deliver impactful solutions.
  • Develop and maintain robust and scalable data pipelines using AWS services (e.g., SageMaker, EC2, S3, Lambda) and other relevant technologies.
  • Stay abreast of the latest advancements in data pipeline research and explore new opportunities to apply these innovations to our business.
  • Contribute to the development and improvement of our infrastructure and best practices.

Experience & Skills:

  • Master's or Ph.D. in Computer Science, Computer Engineering, or a related field.
  • Proficiency in Python
  • Strong experience with AWS services, including SageMaker, EC2, S3, Lambda, etc.
  • Experience with cloud-native development and deployment methodologies.
  • Ability to work independently and as part of a collaborative team.

General Skills

  • Excellent problem-solving skills and the ability to think creatively to overcome technical challenges.
  • A passion for learning and staying updated with the latest industry trends and best practices.
  • Strong communication and teamwork skills, with the ability to collaborate effectively with cross-functional teams; your default should be openness and transparency.
  • Desire to take the initiative and self-start when necessary.
  • Flexibility: we pride ourselves on doing whatever is necessary to make the whole organisation successful.

Bonus Points:

  • Experience with distributed computing and large-scale data processing.

How to apply?

Please send a CV to danni@cortechtalentsolutions.co.uk

Job Details

Company
Cortech Talent Solutions Ltd
Location
Glasgow, Scotland, United Kingdom