11 of 11 Apache Jobs in the North West

Senior Data Engineer

Hiring Organisation
SRG
Location
Manchester, Lancashire, England, United Kingdom
Employment Type
Full-Time
Salary
£70,000 - £80,000 per annum
experience working with data warehouses. Solid understanding of ETL/ELT design and data pipeline optimisation. Experience with big data frameworks such as Apache Spark, and exposure to streaming technologies like Kafka. Strong problem-solving skills and the ability to collaborate effectively across teams. What's on offer: Opportunity ...

Senior Data Engineer Python AWS

Hiring Organisation
Client Server
Location
Cheshire East, Cheshire, UK
have strong hands-on experience of building scalable data pipelines in cloud-based environments using tools such as DBT, AWS Glue, AWS Lake Formation, Apache Spark and Amazon Redshift. You have a good knowledge of data modelling, ELT design patterns, data governance and security best practices. You're collaborative ...

Data Engineer

Hiring Organisation
Peaple Talent
Location
Manchester Area, United Kingdom
Azure or AWS. Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using Big Data technologies such as Apache Spark or PySpark. Great communication skills, engaging effectively with senior stakeholders. Nice to haves: Azure/AWS Data Engineering certifications; Databricks certifications. What ...

Systems Administrator

Hiring Organisation
Applause IT Recruitment Ltd
Location
Crewe, Cheshire, United Kingdom
Employment Type
Permanent
Linux estate while supporting and improving cloud infrastructure in AWS. Day-to-day, you'll be: Supporting and maintaining a Linux server environment (Ubuntu, Apache/Nginx, MySQL/Postgres) Supporting and developing AWS infrastructure (EC2, ECS, Lambda, VPC, Route53, S3, RDS, CloudWatch) Automating builds, scaling, patching and monitoring ...

Systems Operations Engineer, Linux, AWS

Hiring Organisation
OCC Group
Location
Crewe, Cheshire, United Kingdom
Employment Type
Permanent
patching - Collaborating with Dev & DevOps on infrastructure needs - Identifying improvements to efficiency and service quality - Documenting systems & upholding security standards. Your toolkit: Linux (Ubuntu), Apache, PHP, MySQL, PostgreSQL, Nginx, Postfix, Git; AWS: EC2, ECS, Lambda, VPC, Route53, S3, RDS, CloudWatch, CloudFormation; Automation & config: Puppet, Ansible, Terraform, scripting languages Python/Bash ...

Automation Engineer

Hiring Organisation
RealityMine
Location
Trafford Park, Greater Manchester, UK
language used for automation (e.g. Python, JavaScript, TypeScript). · Experience integrating automated tests into CI/CD pipelines, or using orchestration platforms such as Apache Airflow · Practical experience working with real and/or virtual devices (Android, iOS, emulators/simulators) and dealing with the challenges of scale, stability ...

Data Platform Engineer

Hiring Organisation
RealityMine
Location
Trafford Park, Greater Manchester, UK
looking for: · 3+ years of professional Python development experience as a Data Platform Engineer, using AWS. · Experience developing and deploying Serverless technologies, provisioning Apache Airflow, and using infrastructure as code such as AWS CloudFormation or Terraform. · Takes a collaborative approach to problem solving and breaking down complex problems, whilst … with software engineering requirements and specifications. · Good interpersonal skills, positive can-do attitude and willing to help other members of the team. · Understanding of Apache Spark, Apache Trino or other big data and machine learning systems. · Exposure to using AI tools to enhance productivity and quality of coding ...

Implementation Consultant

Hiring Organisation
Heywood
Location
Altrincham, England, United Kingdom
coding skills (any version, but we use Oracle) Experience coding in at least one dynamic scripting language (e.g., Python, Node.js, PowerShell or Apache Groovy) Business intelligence reporting experience (using tools like BIRT, Tableau or Power BI) Knowledge of REST APIs and standard HTTP methods and data migration; either ...

Data Engineer

Hiring Organisation
Searchability NS&D
Location
Manchester Area, United Kingdom
hold active Enhanced DV Clearance (West) Competitive Salary DOE - 6% bonus, 25 days holiday, clearance bonus Experience in Data Pipelines, ETL processing, Data Integration, Apache, SQL/NoSQL Who Are We? Our client is a trusted and growing supplier to the National Security sector, delivering mission-critical solutions that … Engineer Should Have: Active eDV clearance (West) Willingness to work full-time on-site in Manchester when required. Required technical experience in the following: Apache Kafka, Apache NiFi, SQL and NoSQL databases (e.g. MongoDB), ETL processing languages such as Groovy, Python or Java. To be Considered: Please either ...

Data Engineer

Hiring Organisation
Searchability®
Location
Greater Manchester, England, United Kingdom
optimise scalable data pipelines within a Databricks Lakehouse environment. You’ll work with both streaming and batch data pipelines using technologies such as Apache Spark Structured Streaming, Delta Live Tables, and Kafka, ensuring data flows efficiently and reliably across the organisation. Working closely with BI, product, and engineering teams …/7 Collaborative office environment with regular social and charity events And Much More!!! DATA ENGINEER – ESSENTIAL SKILLS Strong experience with Databricks and Apache Spark (PySpark or Scala) Experience building streaming and batch pipelines using Spark Structured Streaming Hands-on experience with Kafka (MSK) and real-time data ingestion ...

Python Developer

Hiring Organisation
Maxwell Bond
Location
Manchester, United Kingdom
Employment Type
Contract
Contract Rate
£400/day Outside IR35
requirements 3+ years of Python in production environments. Strong AWS experience across core services. Strong SQL skills for querying, transformation, and optimisation. Experience with Apache Spark or serverless architectures. Core responsibilities and skills: Build and maintain real-time and batch ETL pipelines using Python on large, complex datasets. Write … across large data volumes. Model and transform data into well-structured, reusable datasets for downstream use. Work with distributed data processing tools such as Apache Spark to handle high-volume workloads. Improve existing pipelines by identifying bottlenecks and refactoring code and queries. Develop and maintain data workflows, scheduling ...