Permanent PySpark Jobs in Wolverhampton

9 of 9 Permanent PySpark Jobs in Wolverhampton

Data Engineer

Wolverhampton, Midlands, United Kingdom
Hybrid / WFH Options
Aspia Space
… and integrity. Essential:
- 3+ years of experience in data engineering, data architecture, or similar roles.
- Expert proficiency in Python, including popular data libraries (Pandas, PySpark, NumPy, etc.).
- Strong experience with AWS services, specifically S3, Redshift, and Glue (Athena a plus).
- Solid understanding of applied statistics.
- Hands-on experience …
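The "applied statistics" requirement alongside Pandas/PySpark/NumPy boils down to fluency with operations like the following. A minimal plain-Python sketch (in Pandas or PySpark these reduce to a single `describe()` call; the `readings` data is invented for illustration):

```python
import statistics

def summarise(values):
    """Descriptive statistics of the kind Pandas/PySpark expose via describe()."""
    clean = [v for v in values if v is not None]  # drop missing readings first
    return {
        "count": len(clean),
        "mean": statistics.mean(clean),
        "stdev": statistics.stdev(clean) if len(clean) > 1 else 0.0,
        "min": min(clean),
        "max": max(clean),
    }

readings = [12.0, None, 15.5, 14.0, 13.5]
print(summarise(readings))
```

Handling missing values before computing statistics is the part interviews for roles like this tend to probe.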

Senior Data Engineer

Wolverhampton, Midlands, United Kingdom
JSS Transform
… the project is an integration between two platforms. Skills required:
- Azure Databricks
- Snowflake
- Azure Data Lake
- Python
- At least one of: NumPy, Pandas, or PySpark
- SQL
- Financial Services experience
- At least one data-related Azure certification

Nice to have:
- PowerBI
- Azure Synapse
- C# or Java
- Data Factory
- Insurance experience …

Data Engineer

Wolverhampton, Midlands, United Kingdom
CACTUS
… extraction, transformation, and loading (ETL) processes.
- Ensure data pipelines are efficient, reliable, and scalable.

Data Transformation & Processing:
- Implement data transformations using Spark (PySpark or Scala) and other relevant tools.
- Perform data cleaning, validation, and enrichment.
- Ensure data quality and consistency.

Azure Databricks Implementation:
- Work …
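The cleaning, validation, and enrichment duties listed above can be sketched as follows. A plain-Python illustration of the same logic (a real Databricks job would express this over PySpark DataFrames with filters and casts; the field names are hypothetical):

```python
REQUIRED = {"id", "amount"}

def clean(records):
    """Validate and normalise raw records, dropping any that fail validation."""
    out = []
    for rec in records:
        if not REQUIRED <= rec.keys():
            continue  # reject records missing required fields
        try:
            amount = float(rec["amount"])
        except (TypeError, ValueError):
            continue  # reject unparseable amounts
        out.append({"id": str(rec["id"]).strip(), "amount": amount})
    return out

raw = [
    {"id": " a1 ", "amount": "10.5"},  # valid after trimming and casting
    {"id": "a2"},                      # missing amount -> dropped
    {"id": "a3", "amount": "n/a"},     # unparseable amount -> dropped
]
print(clean(raw))
```

The "data quality and consistency" requirement is exactly this kind of reject-or-normalise decision, applied at scale.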

Data Engineer

Wolverhampton, Midlands, United Kingdom
Hybrid / WFH Options
Mars
… on a multi-year digital transformation journey where your work will unlock real impact.

🌟 What you'll do
- Build robust data pipelines using Python, PySpark, and cloud-native tools
- Engineer scalable data models with Databricks, Delta Lake, and Azure tech
- Collaborate with analysts, scientists, and fellow engineers to deliver …
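"Build robust data pipelines" in practice means chaining transformation stages so each one is testable in isolation. A toy plain-Python sketch of that shape (in PySpark the stages would be DataFrame transformations; the stage names and data here are invented):

```python
def pipeline(*stages):
    """Compose transformation stages into a single callable, run left to right."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

# Hypothetical stages: filter out bad rows, then derive a new column.
def drop_negative(rows):
    return [r for r in rows if r["value"] >= 0]

def add_double(rows):
    return [{**r, "double": r["value"] * 2} for r in rows]

etl = pipeline(drop_negative, add_double)
print(etl([{"value": 3}, {"value": -1}, {"value": 5}]))
```

Keeping each stage a pure function is what makes a pipeline "robust": stages can be unit-tested and reordered without side effects.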

Senior Data Architect

Wolverhampton, Midlands, United Kingdom
Anson McCade
… at scale.
- Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub.
- Strong programming skills in PySpark, Python, and SQL.
- Proficiency in ETL processes, data mining, and data storage principles.
- Experience with BI and data visualisation tools, such as Looker or …

Data Engineer

Wolverhampton, Midlands, United Kingdom
Realm
… ready for deployment. Clear communication and the capacity to articulate technical choices effectively are crucial.

Must-Have Skills:
- 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL)
- 5+ years SQL
- Python
- Azure
- Excellent client-facing communication skills
- Experience deploying Databricks pipelines
- Experience provisioning Databricks as code

Nice to Have: …

Data Engineer

Wolverhampton, Midlands, United Kingdom
The Green Recruitment Company
… Utilities sector
- Experience leading technical projects

Skills & Technologies required:
- Proficiency in cloud-based data engineering tools (ADF, Synapse Analytics, S3, Lambda)
- Proficiency in using PySpark notebooks for ELT
- Fostering and cultivating a culture of best practices
- Strong analytical and problem-solving skills
- Ability to work independently and as part …
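"PySpark notebooks for ELT" means loading raw data first and transforming it in place, with the transform step typically an aggregation. A plain-Python equivalent of one such step (in PySpark this would be `df.groupBy("site").sum("kwh")`; the utilities-flavoured data is invented):

```python
from collections import defaultdict

def group_sum(rows, key, value):
    """Aggregate rows by key, summing a value column, groupBy-style."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

usage = [
    {"site": "A", "kwh": 10.0},
    {"site": "B", "kwh": 4.5},
    {"site": "A", "kwh": 2.5},
]
print(group_sum(usage, "site", "kwh"))
```

The ELT twist versus classic ETL is that this aggregation runs after the raw data has already landed in the warehouse or lake.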

Founding Machine Learning Engineer

Wolverhampton, Midlands, United Kingdom
Hybrid / WFH Options
Opus Recruitment Solutions
- Strong hands-on experience with ML frameworks (PyTorch, TensorFlow, Keras).
- Proficiency in Python and C/C++.
- Experience with scalable data tools (e.g., PySpark, Kubernetes, Databricks, Apache Arrow).
- Proven ability to manage GPU-intensive data processing jobs.
- 4+ years of applied research or industry experience.
- Creative problem …

Data Modeler

Wolverhampton, Midlands, United Kingdom
Hybrid / WFH Options
N Consulting Global
Contract duration: 6 months, Inside IR35
Location: London

JOB DETAILS
Role Title: Data Modeler/Technical Architect
Required Core Skills: Databricks, AWS, Python, PySpark, data modelling
Minimum years of experience: 6 years

Job Description:
- Must have hands-on experience in designing, developing, and maintaining data pipelines and data … strong working knowledge of moving/transforming data across layers (Bronze, Silver, Gold) using ADF, Python, and PySpark.
- Must have hands-on experience with PySpark, Python, AWS, and data modelling.
- Must have experience in ETL processes.
- Must have hands-on experience in Databricks development.
- Good to have: experience in developing …
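The Bronze/Silver/Gold layers mentioned above follow the medallion pattern: raw ingest (Bronze), cleaned and conformed data (Silver), aggregated business views (Gold). A plain-Python sketch of the idea (real implementations move data between Delta tables with ADF and PySpark; the record shape here is hypothetical):

```python
def to_silver(bronze):
    """Bronze -> Silver: deduplicate on id and drop records with missing values."""
    seen, silver = set(), []
    for rec in bronze:
        if rec["id"] in seen or rec.get("amount") is None:
            continue  # skip duplicates and incomplete records
        seen.add(rec["id"])
        silver.append(rec)
    return silver

def to_gold(silver):
    """Silver -> Gold: aggregate cleaned data into a reporting-ready view."""
    return {"total_amount": sum(r["amount"] for r in silver)}

bronze = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 10.0},  # duplicate -> dropped in Silver
    {"id": 2, "amount": None},  # missing value -> dropped in Silver
    {"id": 3, "amount": 5.0},
]
print(to_gold(to_silver(bronze)))
```

Keeping Bronze untouched and pushing all quality rules into the Bronze-to-Silver step is what lets a pipeline be replayed when those rules change.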