4 of 4 PySpark Jobs in the North West

Lead Data Engineer

Hiring Organisation
TPXImpact Holdings Plc
Location
Manchester, North West, United Kingdom
Employment Type
Permanent
Salary
£65,000
e.g. Power BI). Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling. Proficiency in advanced programming languages (Python/PySpark, SQL). Experience in data pipeline orchestration (e.g. Airflow, Data Factory). Familiarity with DevOps and CI/CD practices (Git, Azure DevOps etc.). Ability ...
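For context on the kind of work this advert describes, below is a minimal, illustrative PySpark sketch of an ELT step that loads a staging dataset into a simple star schema (one dimension plus one fact table). The paths, table names and columns are hypothetical examples, not details from the advert; in practice a job like this would be scheduled by an orchestrator such as Airflow or Data Factory.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_dimensional_load").getOrCreate()

# Hypothetical staging data landed by an upstream ingestion job.
staging = spark.read.parquet("/lake/staging/orders")

# Build a customer dimension: deduplicate and assign a surrogate key.
dim_customer = (
    staging
    .select("customer_id", "customer_name", "country")
    .dropDuplicates(["customer_id"])
    .withColumn("customer_sk", F.monotonically_increasing_id())
)

# Build the fact table by resolving the surrogate key and keeping measures only.
fact_orders = (
    staging
    .join(dim_customer.select("customer_id", "customer_sk"), "customer_id", "left")
    .select("order_id", "customer_sk", "order_date", "quantity", "net_amount")
)

# Write both tables to the warehouse layer of the lake.
dim_customer.write.mode("overwrite").parquet("/lake/warehouse/dim_customer")
fact_orders.write.mode("overwrite").parquet("/lake/warehouse/fact_orders")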

Data Engineer

Hiring Organisation
Hyperloop Recruitment
Location
Liverpool, Merseyside, United Kingdom
Employment Type
Permanent
Salary
£55,000
Development and optimisation of scalable data pipelines in a cloud-based environment, building and managing ETL processes using tools such as Azure, Databricks, PySpark and SQL ...
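As an illustration of the ETL work this advert describes, the sketch below shows a small PySpark job of the kind typically run on Azure Databricks: read raw CSV files, apply basic cleaning, and write a curated Delta table. The paths, column names and the assumption that Delta Lake is available on the cluster (as it is on Databricks) are illustrative, not details from the advert.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales_etl").getOrCreate()

# Extract: raw files landed in the lake (path is hypothetical).
raw = spark.read.option("header", True).csv("/mnt/raw/sales/")

# Transform: type the columns, drop obviously bad rows, derive a load timestamp.
clean = (
    raw
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("sale_date", F.to_date("sale_date", "yyyy-MM-dd"))
    .filter(F.col("amount").isNotNull() & F.col("sale_date").isNotNull())
    .withColumn("loaded_at", F.current_timestamp())
)

# Load: write a curated Delta table (assumes Delta Lake on the cluster).
clean.write.format("delta").mode("overwrite").save("/mnt/curated/sales")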

Senior Database Engineer

Hiring Organisation
Conferma Ltd
Location
Manchester, North West, United Kingdom
Employment Type
Permanent
Salary
£70,000
… Maintain/implement data warehousing solutions and manage large-scale data storage systems (e.g. Microsoft Fabric). Build and optimise SQL queries, stored procedures, PySpark notebooks and database objects to ensure data performance and reliability. Migrate and modernise legacy databases to cloud-based architectures. Database administration: administer, monitor … normalisation, indexing, query optimisation). Strong experience with ETL/ELT tools, e.g. Azure Data Factory, Databricks, Synapse Pipelines, SSIS. Experience with Python, PySpark, or Scala for data processing. Familiarity with CI/CD practices. Experience with data lake, data warehouse and Medallion architectures. Understanding of API integrations ...
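The advert mentions PySpark notebooks and a Medallion architecture; below is a minimal sketch of a bronze-to-silver promotion step of the kind that wording suggests. The layer paths, table name and deduplication rule are hypothetical examples, not the employer's actual implementation.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Bronze: raw ingested records, possibly containing duplicates and late updates.
bronze = spark.read.parquet("/lakehouse/bronze/payments")

# Keep only the latest version of each payment by event timestamp.
latest = Window.partitionBy("payment_id").orderBy(F.col("event_ts").desc())

silver = (
    bronze
    .withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)
    .drop("rn")
    .withColumn("processed_at", F.current_timestamp())
)

# Silver: cleaned, deduplicated records ready for gold-layer modelling.
silver.write.mode("overwrite").parquet("/lakehouse/silver/payments")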

Data Engineer

Hiring Organisation
Hyperloop Recruitment
Location
Liverpool, Merseyside, North West, United Kingdom
Employment Type
Permanent, Work From Home
Salary
£55,000
Development and optimisation of scalable data pipelines in a cloud-based environment, building and managing ETL processes using tools such as Azure, Databricks, PySpark and SQL. The role would suit a strong Data Engineer with 2+ years' commercial experience working with cloud-based data tech (Azure & AWS), Python … relevant languages for ETL), relational and non-relational databases. Key skills required: proficiency in implementing ETL pipelines; Azure/AWS cloud proficiency; Python/PySpark; strong SQL skills for database management; SSRS/SSIS; ability to develop, optimise & maintain complex SQL queries & stored procedures (T-SQL). The role ...
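Since this advert pairs PySpark with T-SQL and relational database skills, here is an illustrative sketch of a PySpark extract from a SQL Server source over JDBC, staged into the lake before further pipeline processing. The connection string, table name and credentials are placeholders, and in practice secrets would come from a secret store rather than being hard-coded.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqlserver_extract").getOrCreate()

# Hypothetical SQL Server connection details (placeholder values only).
jdbc_url = "jdbc:sqlserver://example-db.database.windows.net:1433;database=sales"

# Assumes the Microsoft SQL Server JDBC driver jar is on the Spark classpath.
orders = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.Orders")          # source table (placeholder)
    .option("user", "etl_user")               # placeholder credentials
    .option("password", "REPLACE_ME")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# Stage the extract in the lake for downstream transformation steps.
orders.write.mode("overwrite").parquet("/lake/staging/orders")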