4 of 4 PySpark Jobs in Cambridge

Senior Data Engineer

Hiring Organisation
Alexander Associates Technical Recruitment
Location
Cambridge, Cambridgeshire, UK
Employment Type
Full-time
Engineering, with a minimum of 3 years of hands-on Azure Databricks experience delivering production-grade solutions. Strong programming proficiency in Python and Spark (PySpark) or Scala, with the ability to build scalable and efficient data processing applications. Advanced understanding of data warehousing concepts, including dimensional modelling, ETL/ ...

Product Manager III - Growth Marketing

Hiring Organisation
Expedia Group
Location
Cambridge, Cambridgeshire, UK
Employment Type
Full-time
machine learning operations. Excellent communication skills with the ability to influence cross-functional teams and stakeholders. Experience with Python, SQL, Java, HQL, Hive, PySpark, or similar programming languages preferred. Well-versed in design thinking principles. Strong problem-solving and analytical skills. Accommodation requests: If you need assistance with ...

Manager Strategic Analytics

Hiring Organisation
American Express
Location
Cambridge, Cambridgeshire, UK
Employment Type
Full-time
skills with the ability to communicate effectively, build relationships, and influence at all levels. Technical Skills: Familiarity with Google BigQuery, SQL, R, Python, Lumi, PySpark, and Tableau. Experience in the merchant business preferred. Experience in Machine Learning/Cloud/Gen AI is a plus. Preferred Qualifications: Bachelor's ...

Data Engineer

Hiring Organisation
Quantum World Technologies Inc
Location
Cambridge, Cambridgeshire, UK
Employment Type
Full-time
Lake for optimized storage and analytics.
Data Governance & Cataloging: Establish data cataloging, lineage, and metadata management for improved discoverability.
Performance Optimization: Tune Spark/PySpark jobs for efficiency in large-scale data processing.
Data Modelling & Quality: Develop dimensional/data vault models and enforce data quality checks.
Collaboration: Work … integrate with Azure/AWS/GCP data ecosystems.
Primary Skills (Must-Have):
Databricks – Architecture, Delta Lake, Lakehouse, Unity Catalog/Data Catalog
PySpark (optimization, UDFs, Delta operations)
SQL (advanced querying, performance tuning)
Data Lake/Warehouse best practices
Secondary Skills (Nice-to-Have):
Python (for scripting & automation)
Data ...