1 to 8 of 8 Permanent PySpark Jobs in London with Remote Work Options
London Area, United Kingdom · Hybrid / WFH Options · Tata Consultancy Services
Using GitHub for version control, you will champion DevOps practices to ensure seamless collaboration and automation across the data engineering lifecycle. Your proficiency in SQL, PySpark, and Python will be instrumental in transforming raw data into valuable insights, while your familiarity with Kafka will enable real-time data processing. … Responsibilities: Lead the design, development, and maintenance of Azure-based data pipelines and analytical solutions using Databricks, Synapse, and other relevant services. Leverage SQL, PySpark, and Python to perform data transformations, aggregations, and analysis on large datasets. Architect data storage solutions using Azure SQL Database, Azure Data Lake Storage …
Greater London, England, United Kingdom · Hybrid / WFH Options · Agora Talent
… that come with scaling a company • The ability to translate complex and sometimes ambiguous business requirements into clean, maintainable data pipelines • Excellent knowledge of PySpark, Python and SQL fundamentals • Experience contributing to complex shared repositories. What's nice to have: • Prior early-stage B2B SaaS experience involving client …
London Area, United Kingdom · Hybrid / WFH Options · Tata Consultancy Services
… vision to convert data requirements into logical models and physical solutions; experience with data warehousing solutions (e.g. Snowflake) and data lake architecture, Azure Databricks/PySpark and ADF; retail data model standards (ADRM); communication, organisational and time management skills; the ability to innovate, support change and problem solve; … attitude, with the ability …
London Area, United Kingdom · Hybrid / WFH Options · HENI
… scientific Python toolset. Our tech stack includes Airbyte for data ingestion, Prefect for pipeline orchestration, and AWS Glue for managed ETL, along with Pandas and PySpark for pipeline logic implementation. We utilize Delta Lake and PostgreSQL for data storage, emphasizing the importance of data integrity and version control in our … testing frameworks. Proficiency in Python and familiarity with modern software engineering practices, including 12-factor, CI/CD, and Agile methodologies. Deep understanding of Spark (PySpark), Python (Pandas), orchestration software (e.g. Airflow, Prefect), and databases, data lakes, and data warehouses. Experience with cloud technologies, particularly AWS Cloud services, with a …
London Area, United Kingdom · Hybrid / WFH Options · MBN Solutions
… Quality and Information Security principles. Experience with Azure and ETL tools such as ADF and Databricks. Advanced database and SQL skills, along with Python, PySpark and Spark SQL. Strong understanding of data model design and implementation principles, plus data warehousing design patterns and implementation. Benefits: £50k-£60k DOE; mainly home based …
Salary Guide: PySpark, London
- 10th Percentile: £52,500
- 25th Percentile: £57,500
- Median: £70,000
- 75th Percentile: £93,438
- 90th Percentile: £107,500
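Percentile figures like those above are computed from a sample of advertised salaries. A minimal sketch in plain Python using linear interpolation (the sample salaries below are invented for illustration; they are not the data behind this guide):

```python
# Minimal sketch, assuming linear-interpolation (inclusive) percentiles.
# The salary figures are invented for illustration -- NOT the survey data.

salaries = [52_000, 55_000, 58_000, 65_000, 70_000, 78_000, 95_000, 110_000]

def percentile(data, p):
    """Return the p-th percentile of data, p in [0, 100]."""
    s = sorted(data)
    k = (len(s) - 1) * p / 100                      # fractional rank
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)    # neighbouring ranks
    return s[lo] + (s[hi] - s[lo]) * (k - lo)       # interpolate between them

for p in (10, 25, 50, 75, 90):
    print(f"{p}th percentile: £{percentile(salaries, p):,.0f}")
```

At the scale of a real salary survey, the same statistic would typically come from a library routine (e.g. NumPy's `percentile` or PySpark's `approxQuantile`) rather than a hand-rolled function.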