1 to 25 of 29 Permanent PySpark Jobs in London
London Area, United Kingdom Hybrid / WFH Options Durlston Partners
building a massively distributed cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services This role would focus on various areas of Data Engineering including: End more »
London Area, United Kingdom Hexegic
advice to analytical users on how they can access and utilise the new datasets. Qualities Comfortable with Python - ideally experience with Apache Spark and PySpark Previous data analytics software experience Able to scope new integrations and translate analytical user needs into technical requirements. UK based – data analytics system can more »
London Area, United Kingdom Prism Digital
Lead Azure Data Engineer | PySpark (Python) & Synapse | Tech for Good/Charity Rate: £500-650 per day Duration: 6-12 months IR35: Outside Location: Remote with occasional travel to London (Once every two months max) Essential skills required: Azure – solid experience required of the Azure Data ecosystem Python - ESSENTIAL … as PySpark is used heavily. You will be tested on PySpark Azure Synapse - ESSENTIAL as it is used heavily Spark Azure Data Lake/Databricks/Data Factory Be happy to act as a lead and mentor to the other permanent Azure Data Engineers This is the chance … tool experience. Familiar with building Catalogs and lineage This is an urgent contract, so if you are interested apply ASAP. Lead Azure Data Engineer | PySpark (Python), Synapse, Data Lake | Tech for Good/Charity more »
London Area, United Kingdom Hexegic
incorrect or not received on time. outages with the end users of a data pipeline What We Value reading and writing code in Python, PySpark and Java. understanding of Spark and interest in learning the basics of tuning Spark jobs. pipeline monitoring team members should be able to use more »
Chiswick, England, United Kingdom Square One Resources
SQL Server and relational databases. Solid understanding of the Azure data engineering stack, including Azure Synapse and Azure Data Lake. Programming skills in Python, PySpark, and T-SQL. Nice to haves: Familiarity with broader Azure Data Solutions, such as Azure ML Studio. Previous experience with Azure DevOps and knowledge more »
London Area, United Kingdom Hybrid / WFH Options MBN Solutions
Quality and Information Security principles Experience with Azure, ETL Tools such as ADF and Databricks Advanced Database and SQL skills, along with SQL, Python, PySpark, Spark SQL Strong understanding of data model design and implementation principles Data warehousing design patterns and implementation Benefits : £50-£60k DOE Mainly home based more »
London Area, United Kingdom Hybrid / WFH Options Janus Henderson Investors
and ensuring best practices are understood and followed. Technical Skills and Qualifications Expert knowledge in Python including libraries/frameworks such as pandas, numpy, PySpark Good understanding of OOP, software design patterns, and SOLID principles Good experience in Docker Good experience in Linux Good experience in Airflow Good knowledge more »
London Area, United Kingdom Stanbrook Consulting
or similar frameworks. Problem-Solving : Strong problem-solving skills and the ability to think quickly in a dynamic environment. Technical Skills: Proficiency in Python, PySpark, Synapse, Databricks Strong SQL skills. CI/CD practices PyTest or similar Desirable Skills and Experience: Azure/Cloud services Jenkins, Octopus, and Git. more »
London Area, United Kingdom Harrington Starr
of Python Experience developing in the cloud (AWS preferred) Solid understanding of libraries like Pandas and NumPy Experience in data warehousing tools like Snowflake, PySpark, Databricks Commercial experience with performant database programming in SQL Capability to solve complex technical issues, anticipating risks before they arise. Please apply today more »
Greater London, England, United Kingdom Mars
understand consumers. Hands-on data engineering/development experience, preferably in a cloud/big data environment Skilled in at least one of Python, PySpark, SQL or similar Experience in guiding or managing roles in insight or data functions, delivering data projects and insight to inspire action and drive more »
London Area, United Kingdom Tredence Inc
Azure Cloud platform Knowledge of orchestrating workloads on the cloud Ability to set and lead the technical vision while balancing business drivers Strong experience with PySpark, Python programming Proficiency with APIs, containerization and orchestration is a plus Qualifications: Bachelor's and/or master's degree About you: You are more »
London Area, United Kingdom Mars
understand consumers. Hands-on data engineering/development experience, preferably in a cloud/big data environment Skilled in at least one of Python, PySpark, SQL or similar Experience in guiding or managing roles in insight or data functions, delivering data projects and insight to inspire action and drive more »
London Area, United Kingdom Hybrid / WFH Options Michael James Associates
can offer you exposure to the latest technologies. We are looking for a senior Data Engineer who has solid Python skills as well as PySpark, Databricks and SQL, as well as Data Modelling and Azure Data Factory. Azure DevOps would be a distinct advantage. Strong communication and business more »
London Area, United Kingdom Hybrid / WFH Options Tata Consultancy Services
GitHub for version control, you will champion DevOps practices to ensure seamless collaboration and automation across the data engineering lifecycle. Your proficiency in SQL, PySpark, and Python will be helpful in transforming raw data into valuable insights, while your familiarity with Kafka will enable real-time data processing capabilities. … responsibilities: Lead the design, development, and maintenance of Azure-based data pipelines and analytical solutions using Databricks, Synapse, and other relevant services. Leverage SQL, PySpark, and Python to perform data transformations, aggregations, and analysis on large datasets. Architect data storage solutions using Azure SQL Database, Azure Data Lake Storage more »
London Area, United Kingdom Axtria - Ingenious Insights
Mart. Utilize Vector Databases, Cosmos DB, Redis, and Elasticsearch for efficient data storage and retrieval. Demonstrate proficiency in programming languages including Python, Spark, Databricks, PySpark, SQL, and ML Algorithms. Implement Machine Learning models and algorithms using PySpark, scikit-learn, and other relevant tools. Manage Azure DevOps, CI/… environments, Azure Data Lake, Azure Data Factory, Microservices architecture. Experience with Vector Databases, Cosmos DB, Redis, Elasticsearch. Strong programming skills in Python, Spark, Databricks, PySpark, SQL, ML Algorithms, Gen AI. Knowledge of Azure DevOps, CI/CD pipelines, GitHub, Kubernetes (AKS). Experience with MLOps tools such more »
London Area, United Kingdom Simon James IT Ltd
systems and offer improvements that will help reduce technical/code/engineering debt. Key Skills: Extensive experience with Machine Learning and Spark/PySpark Recommendation systems, pattern recognition, data mining, artificial intelligence Modern Parallel Computing; distributed clusters, multicore servers, GPUs Experience with developing machine learning models at more »
London Area, United Kingdom InfoCepts
and industry standards for the organization. Strong experience with Azure cloud services like Azure, ADF, ADLS, Synapse Proficiency in querying languages such as SQL, PySpark, Python and familiarity with data visualization tools (e.g. Power BI). Strong communication skills to gather business requirements from stakeholders and propose best more »
London Area, United Kingdom Higher - AI recruitment
experience-related problems such as workforce management, demand forecasting, or root cause analysis Strong visualisation skills including experience with Tableau Familiarity with Databricks and PySpark for data manipulation and analysis Familiarity with Git-based source control methodologies, including branching and pull requests A self-starter, passionate about converting data more »
Greater London, England, United Kingdom Hybrid / WFH Options Agora Talent
come with scaling a company • The ability to translate complex and sometimes ambiguous business requirements into clean and maintainable data pipelines • Excellent knowledge of PySpark, Python and SQL fundamentals • Experience in contributing to complex shared repositories. What’s nice to have: • Prior early-stage B2B SaaS experience involving client more »
London Area, United Kingdom Hybrid / WFH Options Tata Consultancy Services
vision to convert data requirements into logical models and physical solutions. with data warehousing solutions (e.g. Snowflake) and data lake architecture, Azure Databricks/PySpark & ADF. retail data model standards - ADRM. communication skills, organisational and time management skills. to innovate, support change and problem solve. attitude, with the ability more »
London Area, United Kingdom Vallum Associates
directly with clients. Supporting clients in platform discovery, integration, training, and collaboration on data science projects. Proficiency in technical skills, particularly Python, R, SQL, PySpark, and JavaScript. Assisting users in mastering the platform. Analysing diverse data and ML applications. Providing strategic insights to ensure customer success. Collaborating with customers more »
London Area, United Kingdom Xcede
of 4 years commercial experience Strong proficiency in AWS and relevant technologies Good communication skills Preferably experience with some or all of the following: PySpark, Kubernetes, Terraform Experience in a start-up/scale-up or fast-paced environment is desirable. HOW TO APPLY Please register your interest by more »
London Area, United Kingdom Investigo
London. Responsibilities: Collaborate with cross-functional teams to gather requirements and implement solutions. Develop and maintain data processing applications using Python. Optimise and tune PySpark jobs for performance and scalability. Ensure data quality, reliability, and integrity throughout the data processing pipelines. Technical Requirements: Python: Proficiency in Python programming. Object-Oriented Design: Solid understanding of object-oriented principles and design patterns. PySpark: Experience with PySpark for data processing and analytics. Azure: Familiarity with Azure services and cloud platforms. Financial Services Background: Knowledge of financial markets, instruments, and related data. more »
London Area, United Kingdom Hybrid / WFH Options HENI
scientific Python toolset. Our tech stack includes Airbyte for data ingestion, Prefect for pipeline orchestration, AWS Glue for managed ETL, along with Pandas and PySpark for pipeline logic implementation. We utilize Delta Lake and PostgreSQL for data storage, emphasizing the importance of data integrity and version control in our … testing frameworks. Proficiency in Python and familiarity with modern software engineering practices, including 12-factor, CI/CD, and Agile methodologies. Deep understanding of Spark (PySpark), Python (Pandas), orchestration software (e.g. Airflow, Prefect) and databases, data lakes and data warehouses. Experience with cloud technologies, particularly AWS Cloud services, with a more »
London Area, United Kingdom Harrington Starr
start interviewing ASAP. Responsibilities: Azure Cloud Data Engineering using Azure Databricks Data Warehousing Data Engineering Very strong with the Microsoft Stack ESSENTIAL knowledge of PySpark clusters Python & C# Scripting experience Experience of message queues (Kafka) Experience of containerization (Docker) FINANCIAL SERVICES EXPERIENCE (Energy/commodities trading) If you have more »
Salary Guide: PySpark, London
10th Percentile: £52,500
25th Percentile: £57,500
Median: £70,000
75th Percentile: £92,500
90th Percentile: £102,000