3 of 3 Remote/Hybrid Apache Jobs in the North East

Lead Test Engineer (SC Cleared)

Hiring Organisation
Scrumconnect Ltd
Location
City, Newcastle Upon Tyne, United Kingdom
Employment Type
Permanent
Salary
GBP 55,000 Annual
Looking for a Test Engineer to join a large-scale cloud data engineering programme, operating across a modern AWS-native technology stack including Apache Airflow, Amazon Athena, AWS Glue, S3, EMR, and DynamoDB. You will own testing across automated pipelines, data workflows, and cloud infrastructure, identifying risks and championing test …

Framework Development
- Identify, evaluate, and implement new test frameworks
- Enhance existing frameworks to improve testing confidence and coverage

Pipeline & Data Testing
- Validate data pipelines using Apache Airflow, AWS Glue, Athena, and EMR
- Ensure data integrity, transformation accuracy, and performance under load
- Analyse data in multiple formats to validate new functionality ...
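The data-integrity and transformation-accuracy checks described in this role can be illustrated with a minimal sketch. The field names and the transformation rule here are hypothetical; in the real programme a check like this would run against Athena or Glue outputs rather than in-memory rows:

```python
# Minimal sketch of a data-quality check of the kind described above.
# The records and the 'amount -> amount_pennies' rule are hypothetical;
# a production test would query the pipeline's actual source and target.

def validate_transformation(source_rows, transformed_rows):
    """Check row-count integrity and that amounts survive transformation."""
    errors = []
    if len(source_rows) != len(transformed_rows):
        errors.append("row count mismatch")
    for src, dst in zip(source_rows, transformed_rows):
        # Hypothetical rule: 'amount_pennies' must equal amount * 100.
        if dst["amount_pennies"] != round(src["amount"] * 100):
            errors.append(f"bad amount for id={src['id']}")
    return errors

source = [{"id": 1, "amount": 12.50}, {"id": 2, "amount": 3.99}]
transformed = [{"id": 1, "amount_pennies": 1250}, {"id": 2, "amount_pennies": 399}]
assert validate_transformation(source, transformed) == []
```

A real suite would wrap checks like this in pytest and parameterise them per pipeline stage, so a failed transformation surfaces as a named, diagnosable assertion rather than a silent data drift.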

SC Cleared Python Developer - AWS - £500 - IR35 - Occasional office - Newcastle

Hiring Organisation
RecOps
Location
City, Newcastle Upon Tyne, United Kingdom
Employment Type
Contract
Contract Rate
GBP 500 Daily
This role is inside IR35 and is fully remote with occasional office meetings in Leeds. Must have SC clearance. Key skills required: Python, AWS, Terraform, Apache Spark, Airflow, Docker, GitLab, security scanning agents (e.g. Trivy, Trend Micro, Wiz), Jupyter Notebooks. If the above sounds like you, please apply now for immediate consideration. ...

Data Engineer - Snowflake

Hiring Organisation
DXC Technology
Location
Newcastle upon Tyne, Tyne and Wear, United Kingdom
Employment Type
Full Time
Salary
GBP 50,000 - 80,000 Annual
- Build and maintain robust data pipelines using Snowpipe and COPY INTO.
- Develop modular, testable data transformations with dbt.
- Orchestrate workflows and manage dependencies using Apache Airflow.
- Leverage Snowpark for advanced data applications and processing.
- Use Terraform to manage infrastructure as code.
- Implement CI/CD pipelines to streamline deployments. ...
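The workflow described above is, at its core, a dependency graph of the kind Airflow manages. The task names below are hypothetical, and a real DAG would use Airflow operators; this stdlib-only sketch just shows the dependency ordering the role calls for:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps matching the duties above: load raw data
# via Snowpipe/COPY INTO, transform with dbt, test, then publish.
# Airflow expresses the same dependencies with operators and >> edges;
# here the standard library resolves the run order.
dag = {
    "copy_into_raw": set(),          # COPY INTO raw tables
    "dbt_run": {"copy_into_raw"},    # dbt transformations
    "dbt_test": {"dbt_run"},         # dbt data tests
    "publish_marts": {"dbt_test"},   # expose models downstream
}

run_order = list(TopologicalSorter(dag).static_order())
print(run_order)
# → ['copy_into_raw', 'dbt_run', 'dbt_test', 'publish_marts']
```

The design point is that each step declares only its upstream dependencies; the scheduler (Airflow in the role, `graphlib` in this sketch) derives a valid execution order, which keeps the pipeline modular and testable.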