Permanent PySpark Jobs in Slough

23 of 23 Permanent PySpark Jobs in Slough

Senior Data Engineer

Slough, South East England, United Kingdom
Quantum Technology Solutions Inc
working with unstructured data and NLP-related datasets. Proficiency in one programming language, preferably Python with experience in data processing libraries such as Pandas, PySpark, or Dask. Familiarity with MLOps and deploying AI/ML models into production environments. Knowledge of Retrieval-Augmented Generation (RAG) frameworks or interest in …

Senior AWS Data Engineer

Slough, South East England, United Kingdom
Hybrid / WFH Options
Radley James
working in cloud-native environments (AWS preferred) Strong proficiency with Python and SQL Extensive hands-on experience in AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation) Solid understanding …

Senior Big Data Engineer (Databricks) - RELOCATION TO ABU DHABI

Slough, South East England, United Kingdom
SoftServe
cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. IF YOU ARE Experienced with Python/PySpark Proficient working with Databricks Lakehouse architecture and principles Having 2+ years of designing data models, building ETL pipelines, and wrangling data to solve business …

Machine Learning Engineer with Data Engineering expertise

Slough, South East England, United Kingdom
Tadaweb
machine learning, with a strong portfolio of relevant projects. Proficiency in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL …

Data Engineer - Microsoft Fabric

Slough, South East England, United Kingdom
Agile
and the broader Azure ecosystem. Requirements Proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services. Knowledge of using PySpark in notebooks for data analysis and manipulation. Strong proficiency with SQL and data modelling. Experience with modern ELT/ETL tools within the Microsoft …

Lead Engineer

Slough, South East England, United Kingdom
Fractal
System Integration, Application Development or Data-Warehouse projects, across technologies used in the enterprise space. Software development experience using: Object-oriented languages (e.g., Python, PySpark) and frameworks Stakeholder Management Expertise in relational and dimensional modelling, including big data technologies. Exposure across the full SDLC process, including testing and deployment. …

Data Architect (GCP)

Slough, South East England, United Kingdom
Hybrid / WFH Options
Anson McCade
data warehouses and data lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, DataProc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and data quality frameworks. Proficiency with BI/reporting tools such as Looker or …

Data Engineer

Slough, South East England, United Kingdom
Hybrid / WFH Options
83zero
similar tools is valuable (Collibra, Informatica Data Quality/MDM/Axon etc.) Data Architecture experience is a bonus Python, Scala, Databricks Spark and PySpark with Data Engineering skills Ownership and ability to drive implementation/solution design …

Data Engineer

Slough, South East England, United Kingdom
Hybrid / WFH Options
La Fosse
Collaborate across technical and non-technical teams Troubleshoot issues and support wider team adoption of the platform What You’ll Bring: Proficiency in Python, PySpark, Spark SQL or Java Experience with cloud tools (Lambda, S3, EKS, IAM) Knowledge of Docker, Terraform, GitHub Actions Understanding of data quality frameworks Strong …

Data & AI Science Consultant

Slough, South East England, United Kingdom
Accenture
coding languages e.g. Python, R, Scala, etc. (Python preferred) Proficiency in database technologies e.g. SQL, ETL, No-SQL, DW, and Big Data technologies e.g. PySpark, Hive, etc. Experienced working with structured and also unstructured data e.g. text, PDFs, JPGs, call recordings, video, etc. Knowledge of machine learning modelling techniques …

Data Engineer

Slough, South East England, United Kingdom
Realm
ready for deployment. Clear communication and the capacity to articulate technical choices effectively are crucial. Must-Have Skills: 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL) 5+ years SQL Python Azure Excellent client-facing communication skills Experience deploying Databricks pipelines Experience provisioning Databricks as code Nice to Have …

Data Scientist

Slough, South East England, United Kingdom
Arrows
queries for huge datasets. Has a solid understanding of blockchain ecosystem elements like DeFi, Exchanges, Wallets, Smart Contracts, mixers and privacy services. Databricks and PySpark Analysing blockchain data Building and maintaining data pipelines Deploying machine learning models Use of graph analytics and graph neural networks If this sounds like …

Senior Manager Wholesale IRB Credit Risk Consultant

Slough, South East England, United Kingdom
Hybrid / WFH Options
Carnegie Consulting Limited
in programming languages and data structures such as SAS, Python, R, SQL is key. With Python background, particularly familiarity with pandas/polars/pyspark, pytest; understanding of OOP principles; git version control; knowledge of the following frameworks a plus: pydantic, pandera, sphinx Additionally, experience in any or all …

Data Engineer

Slough, South East England, United Kingdom
Scrumconnect Consulting
Engineering Experience required ACTIVE SC is mandatory Essential requirements: Azure Data Factory and Synapse data solution provision Azure DevOps Microsoft Azure Power BI Python PySpark Dimensional Data Model Semantic Data Models, including integration to Power BI Data Engineering Capabilities Business analysis to understand service needs and document them accurately …

GCP Data Engineer (Java, Spark, ETL)

Slough, South East England, United Kingdom
Staffworx
Future Talent Pool - GCP Data Engineer, London, hybrid role - digital Google Cloud transformation programme Proficiency in programming languages such as Python, PySpark and Java develop ETL processes for Data ingestion & preparation SparkSQL CloudRun, DataFlow, CloudStorage GCP BigQuery Google Cloud Platform Data Studio Unix/Linux Platform Version control tools …

Data Lead

Slough, South East England, United Kingdom
Hybrid / WFH Options
Trust In SODA
following areas: Data Warehousing (Databricks) Data Modelling (Medallion Arch, Facts/Dimensions) Azure Data Stack (DataFactory/Synapse) Visualisation (PowerBI) Coding Best Practice (Python, PySpark) Real-Time Processing (SQL) Insurance/Finance Experience Startup experience Leadership/Line Management experience. What’s in it for you? Remote First Working …

Test Lead

Slough, South East England, United Kingdom
Mastek
sprint planning sessions. Monitor data pipeline executions and investigate test failures or anomalies. Document test results, defects, and quality metrics. Preferred qualifications: Experience with PySpark or notebooks in Databricks. Exposure to Azure DevOps, Unit Testing frameworks, or Great Expectations for data testing. Knowledge of data warehousing or medallion architecture …

Senior Data Engineer

Slough, South East England, United Kingdom
Mastek
performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout … years of experience in data engineering, with at least 3+ years of hands-on experience with Azure Databricks. Strong proficiency in Python and Spark (PySpark) or Scala. Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns. Extensive experience with Azure data services, including Azure Data …

Head of Data & AI

Slough, South East England, United Kingdom
Careerwise
ensuring data assets are leveraged for maximum operational and commercial impact. Technology Ownership: Stay current with the latest developments in AI/ML, Databricks, PySpark, and Power BI; evaluate and integrate advancements to improve data pipelines, data science workflows, and reporting capabilities. Data Infrastructure: Oversee the design, implementation, and … cybersecurity, or software distribution. Should have experience in GenAI, Graph, Neo4j, Azure Databricks. Expertise in cloud-native data platforms, with strong proficiency in Databricks, PySpark, and Power BI. Solid understanding of AI/ML applications in real-world business use cases. Strong knowledge of data governance, data warehousing …

Senior Software Engineer

Slough, South East England, United Kingdom
Kantar Media
Azure Storage, for scalable cloud solutions. Ensure seamless integration with AWS S3 and implement secure data encryption/decryption practices. Python Implementation: Utilize Python and PySpark for processing large datasets and integrating with cloud-based data solutions. Team Leadership: Code review and mentor a team of 3 engineers, fostering best … and reliable operations. Required 5-7 years of experience in software development with a focus on production-grade code. Proficiency in Java, Python, and PySpark; experience with C++ is a plus. Deep expertise in Azure services, including Azure Storage, and familiarity with AWS S3. Strong understanding of data security …

Senior Data Engineer

Slough, South East England, United Kingdom
Hybrid / WFH Options
Burns Sheehan
Senior Data Engineer 💰 £90,000-£100,000 + 10% bonus 🖥️ Databricks, Snowflake, Terraform, PySpark, Azure 🌍 London, hybrid working (2 days in office) 🏠 Leading property data & risk software company We are partnered with an industry-leading property data and risk software company, helping businesses in the UK make better decisions. … are experienced with Databricks and Lakehouse architecture for efficient data management. You are experienced building and optimising data pipelines and have strong SQL and PySpark skills. You have a strong understanding of the Azure stack. You have experience with DevOps practices and infrastructure-as-code, preferably Terraform. You have … membership Gym on-site Cycle to work and electric car schemes …

Data Engineer

Slough, South East England, United Kingdom
Hybrid / WFH Options
Empiric
Hiring: SC Cleared Data Engineer | Public Sector | Hybrid London Are you a Data Engineer with active SC Clearance and strong experience in Python and PySpark? Join a dynamic and growing consultancy that partners with government and public sector organisations to deliver cutting-edge architecture and programme solutions. Role Overview … travel as needed) Sector: Public Sector/Consultancy Salary: Competitive (based on experience) Key Skills: SC Clearance (active) Strong Python programming Proven experience with PySpark and data pipelines Ready for a new challenge? If you're looking to take your data engineering career to the next level and work …

Head of Data and AI

Slough, South East England, United Kingdom
Hybrid / WFH Options
Careerwise
with data governance policies, industry regulations, and best practices to protect data privacy and security. Innovation: Stay up to date with the latest Databricks, PySpark, and Power BI features and industry trends to continuously improve data capabilities, ensuring relevance to a Cyber Security Value Added Distributor. Qualifications: Education: Bachelor's … of experience in data management, analytics, data architecture, and AI, with at least 5 years in a leadership role. Technical Skills: Proficiency in Databricks, PySpark, Knowledge Graphs, Neo4j (graph database and analytics), Power BI, SSRS, Azure Data Factory, and AI technologies. Strong understanding of data architecture, ETL processes, and …