Permanent PySpark Jobs in Nottingham

9 of 9 Permanent PySpark Jobs in Nottingham

Data Engineer - Microsoft Fabric

Nottingham, England, United Kingdom
JR United Kingdom
… and the broader Azure ecosystem. Requirements: proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services; knowledge of using PySpark in notebooks for data analysis and manipulation; strong proficiency with SQL and data modelling; experience with modern ELT/ETL tools within the Microsoft …
Posted:
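
The Fabric role above asks for PySpark in notebooks for data analysis and manipulation. Below is a minimal sketch of that kind of notebook work; the table and column names are illustrative assumptions, not taken from the advert.

```python
# Illustrative notebook-style analysis; the table and column names are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Fabric notebook a SparkSession is normally provided; building one here
# keeps the example self-contained.
spark = SparkSession.builder.appName("notebook-analysis").getOrCreate()

sales = spark.read.table("sales_orders")  # hypothetical lakehouse table

# Derive a simple monthly summary per region.
monthly = (
    sales
    .withColumn("order_month", F.date_trunc("month", F.col("order_date")))
    .groupBy("order_month", "region")
    .agg(
        F.sum("net_amount").alias("revenue"),
        F.countDistinct("customer_id").alias("customers"),
    )
    .orderBy("order_month")
)

monthly.show(12, truncate=False)
```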

Senior Data Architect

Nottingham, England, United Kingdom
JR United Kingdom
… at scale. Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming skills in PySpark, Python, and SQL. Proficiency in ETL processes, data mining, and data storage principles. Experience with BI and data visualisation tools, such as Looker or …
Posted:
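
The architect role above pairs PySpark with GCP services such as BigQuery and Dataproc. The sketch below shows that pattern, assuming a Dataproc cluster with the spark-bigquery connector on the classpath; the project, dataset, table, and bucket names are placeholders.

```python
# Assumes the spark-bigquery connector is available; all names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bq-aggregation").getOrCreate()

# Read a hypothetical events table from BigQuery.
events = (
    spark.read.format("bigquery")
    .option("table", "example-project.analytics.events")
    .load()
)

# Aggregate to daily counts per event type.
daily = (
    events
    .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
    .agg(F.count(F.lit(1)).alias("event_count"))
)

# Write the summary back to BigQuery via a temporary GCS bucket.
(
    daily.write.format("bigquery")
    .option("table", "example-project.reporting.daily_events")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```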

Founding Machine Learning Engineer

Nottingham, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Strong hands-on experience with ML frameworks (PyTorch, TensorFlow, Keras). Proficiency in Python and C/C++. Experience with scalable data tools (e.g., PySpark, Kubernetes, Databricks, Apache Arrow). Proven ability to manage GPU-intensive data processing jobs. 4+ years of applied research or industry experience. Creative problem …
Posted:

Data Engineer, Python, PySpark, and SQL, AWS

Nottingham, England, United Kingdom
JR United Kingdom
Client: Athsai. Location: Nottingham, United Kingdom. Job Category: Other. EU work permit required: Yes. Posted: 06.06.2025. Expiry Date: 21.07.2025.
Job Description: … a crucial role in designing and implementing robust data pipelines, ensuring the integrity and accessibility of data across various platforms. Required Skills: proficient in PySpark and AWS; strong experience in designing, implementing, and debugging ETL pipelines; expertise in Python, PySpark, and SQL; in-depth knowledge of Spark and …
Posted:
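
Several of the adverts here, including this AWS role, centre on designing and debugging ETL pipelines in PySpark. The sketch below shows the basic extract-transform-load shape such a pipeline can take; the S3 paths, schema, and column names are hypothetical.

```python
# A minimal ETL sketch; paths, schema and column names are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed in an assumed S3 bucket.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3a://example-raw-bucket/orders/2025/06/")
)

# Transform: basic de-duplication, typing, and a derived column.
clean = (
    raw
    .dropDuplicates(["order_id"])
    .filter(F.col("order_status").isNotNull())
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("total_gbp", F.col("quantity") * F.col("unit_price"))
)

# Load: write partitioned Parquet to a curated layer.
(
    clean.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://example-curated-bucket/orders/")
)
```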

Data Engineer

Nottingham, England, United Kingdom
Hybrid / WFH Options
Accelero
… our Nottingham office. As a key member of the engineering team, you'll help design, build, and maintain robust data pipelines using Python and PySpark, enabling powerful analytics and smarter business decisions across the organisation.
What You'll Be Doing: design and build scalable ETL/ELT data pipelines … using Python and PySpark; lead and support data migration initiatives across legacy and cloud-based platforms; collaborate with analysts, data scientists, and stakeholders to deliver high-quality, reliable data solutions; ensure best practices in data engineering, including quality, testing, and performance tuning; contribute to the evolution of our data …
What We're Looking For: 3+ years' experience as a Data Engineer or in a similar role; strong hands-on experience with Python and PySpark; proven experience in data migration and transformation projects; solid understanding of cloud-based data platforms (e.g., AWS, Azure, GCP – nice to have); strong problem …
Posted:

Senior Data Engineer, London Hybrid (6+ Years)

Nottingham, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
… will play a crucial role in designing, developing, and maintaining data architecture and infrastructure. The successful candidate should possess a strong foundation in Python, PySpark, SQL, and ETL processes, with a demonstrated ability to implement solutions in a cloud environment. Experience: 6-9 years. Location: London. Job Type: Hybrid … Permanent.
Mandatory Skills: design, build, and maintain data pipelines using Python, PySpark, and SQL; develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS/Azure/GCP; collaborate with data scientists and business analysts to understand their data needs and develop solutions … of our data solutions.
Qualifications: minimum 6+ years of total experience; at least 4+ years of hands-on experience using the mandatory skills - Python, PySpark, SQL.
Posted:

Data Engineer Azure Python PySpark

Nottingham, East Midlands
Hybrid / WFH Options
Client Server
Data Engineer (Azure, Python, PySpark), Nottingham/WFH, to £65k. Are you a tech-savvy Data Engineer seeking an opportunity to join a growing tech company where you can make a real difference and progress your career? You could be joining a rapidly growing Microsoft Solution Partner that provide … to-target mappings and re-engineer manual data flows to enable scaling and repeatable use. You will use a range of technology including Python, PySpark, Azure Synapse Analytics, Azure Data Factory, Microsoft Fabric and Data Flows. Location/WFH: you can work from home most of the time, meeting … be within an hour commute to Nottingham).
About you: you have experience as a Data Engineer; you have strong Python coding skills and PySpark experience; you have experience with Azure Data Factory, Azure Synapse Analytics, Data Flows, and data migrations; you are collaborative with excellent communication skills, have a …
Employment Type: Permanent
Salary: £50,000 - £65,000
Posted:
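
This role and the two Senior Data Engineer roles below both mention documenting source-to-target mappings and re-engineering manual data flows. One hedged illustration of turning such a mapping into a repeatable PySpark transform follows; the mapping, paths, and table names are invented, and the curated schema is assumed to exist.

```python
# Sketch: a documented source-to-target mapping expressed as a repeatable transform.
# The mapping, landing path and target table are assumptions for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("source-to-target").getOrCreate()

# source column -> (target column, target type)
MAPPING = {
    "cust_no": ("customer_id", "string"),
    "cust_nm": ("customer_name", "string"),
    "open_dt": ("account_opened_date", "date"),
    "bal_amt": ("balance", "decimal(18,2)"),
}

source = spark.read.parquet("/mnt/landing/customers/")  # assumed landing path

# Apply the mapping as renames plus casts, in one pass.
target = source.select(
    [F.col(src).cast(dtype).alias(tgt) for src, (tgt, dtype) in MAPPING.items()]
)

target.write.mode("overwrite").saveAsTable("curated.customers")
```

Keeping the mapping in one data structure means the same document that analysts review is also what the pipeline executes, which is one way to make a manual flow repeatable.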

Senior Data Engineer Azure Cloudera ERP

Nottingham, England, United Kingdom
Hybrid / WFH Options
ZipRecruiter
Documenting source-to-target mappings. Re-engineering manual data flows to enable scaling and repeatability. You will use a range of technologies including Python, PySpark, Cloudera, Azure Synapse Analytics, Azure Data Factory, Microsoft Fabric, and Data Flows. Location/WFH: you can work from home most of the time … Nottingham twice a month (must be within an hour commute).
About you: experience as a Senior Data Engineer; strong Python coding skills and PySpark experience; experience with Azure Data Factory, Azure Synapse Analytics, Data Flows, data migrations, Microsoft Fabric, and Cloudera; ERP experience, particularly with SAP or Microsoft …
Posted:

Senior Data Engineer Azure Cloudera ERP

Nottingham, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
… to-target mappings and re-engineer manual data flows to enable scaling and repeatable use. You will use a range of technology including Python, PySpark, Cloudera, Azure Synapse Analytics, Azure Data Factory, Microsoft Fabric and Data Flows. Location/WFH: you can work from home most of the time … within an hour commute to Nottingham).
About you: you have experience as a Senior Data Engineer; you have strong Python coding skills and PySpark experience; you have experience with Azure Data Factory, Azure Synapse Analytics, Data Flows, data migrations, Microsoft Fabric and Cloudera; you have ERP experience, particularly …
Posted: