3 of 3 Remote SAS Macro Jobs

Lead PySpark Engineer - Data, SAS, AWS

Hiring Organisation
Randstad Digital
Location
London, United Kingdom
Employment Type
Contract
Contract Rate
£350 - £380 per day
Lead Data Engineer - PySpark/AWS/Python/SAS - Financial Sector

As a Lead PySpark Engineer, you will design, develop, and fix complex data processing solutions using PySpark on AWS. You will work hands-on with code, modernising legacy data workflows and supporting large-scale SAS … deliver production-ready data pipelines in a financial services environment.

Essential Skills - PySpark & Data Engineering:
- Minimum 5 years of hands-on PySpark experience.
- SAS to PySpark migration experience.
- Proven ability to write production-ready PySpark code.
- Strong understanding of data and data warehousing concepts, including: ETL/ELT, Data ...
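The SAS-to-PySpark migration skill listed above typically means mapping SAS DATA step logic onto PySpark DataFrame operations. A minimal sketch of that mapping, assuming a hypothetical DATA step and column names (none of this comes from the listing), with the row-level logic rendered in dependency-free Python so it runs without a Spark cluster:

```python
# Hypothetical Base SAS step (illustrative only, not from the listing):
#   data work.high_value;
#       set work.transactions;
#       where amount > 1000;
#       fee = amount * 0.01;
#   run;
#
# A PySpark rendering of the same step would be roughly:
#   df.filter(df.amount > 1000).withColumn("fee", df.amount * 0.01)
#
# Dependency-free sketch of the identical row logic:

transactions = [
    {"id": 1, "amount": 500.0},
    {"id": 2, "amount": 2500.0},
    {"id": 3, "amount": 1200.0},
]

def migrate_step(rows):
    """Apply the WHERE filter and derived column from the DATA step."""
    out = []
    for row in rows:
        if row["amount"] > 1000:                 # where amount > 1000;
            new = dict(row)
            new["fee"] = row["amount"] * 0.01    # fee = amount * 0.01;
            out.append(new)
    return out

high_value = migrate_step(transactions)
print(high_value)  # rows 2 and 3, each with a derived "fee" column
```

Conversion tools and manual refactors alike ultimately have to preserve this row-level semantics, which is why migration roles ask for production-ready PySpark rather than line-by-line transliteration.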

Senior Data Analyst

Hiring Organisation
ARM
Location
Sweden
Employment Type
Contract
You will also be expected to contribute subject-matter expertise during the study planning and design phases. The main programming language for this role is SAS (e.g., Base SAS, SAS/STAT, SAS Macro, PROC SQL), though other languages such as R/RStudio or Python … used.

Responsibilities:
- Implements cohort selection, variable derivation, data management, analyses, and production of results for the types of studies described above, using SAS or another of the programming languages, with little or no supervision.
- Conducts double programming and develops programming specifications.
- Reviews programs developed by other programmers ...
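The cohort-selection work described above is commonly written as a PROC SQL query. As a hedged illustration, with an entirely hypothetical study (the diagnosis code, table, and column names are invented for the example), the same selection expressed in plain Python:

```python
# Hypothetical PROC SQL cohort selection (illustrative only):
#   proc sql;
#       create table cohort as
#       select subject_id, min(visit_date) as first_visit
#       from visits
#       where diagnosis = 'T2D'
#       group by subject_id;
#   quit;
#
# Equivalent selection logic in plain Python:

visits = [
    {"subject_id": "A", "visit_date": "2021-01-05", "diagnosis": "T2D"},
    {"subject_id": "A", "visit_date": "2020-06-01", "diagnosis": "T2D"},
    {"subject_id": "B", "visit_date": "2021-03-10", "diagnosis": "HTN"},
    {"subject_id": "C", "visit_date": "2021-02-20", "diagnosis": "T2D"},
]

def select_cohort(rows, diagnosis="T2D"):
    """Keep matching subjects with their earliest (ISO-dated) visit."""
    first_visit = {}
    for row in rows:
        if row["diagnosis"] != diagnosis:
            continue
        sid = row["subject_id"]
        if sid not in first_visit or row["visit_date"] < first_visit[sid]:
            first_visit[sid] = row["visit_date"]
    return first_visit

cohort = select_cohort(visits)
print(cohort)  # subjects A and C with their earliest visit dates
```

Double programming, also mentioned above, means a second programmer independently implements the same specification (often in a second language such as R or Python) and the two outputs are compared for agreement.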

PySpark Developer

Hiring Organisation
Randstad Digital
Location
London, United Kingdom
Employment Type
Contract, Work From Home
Contract Rate
£300 - £350 per day
modernisation project, transitioning legacy data workflows into a high-performance AWS cloud environment. This is a hands-on technical role focused on converting legacy SAS code into production-ready PySpark pipelines within a complex financial services landscape.

Key Responsibilities:
- Code Conversion: Lead the end-to-end migration of SAS … code (Base SAS, Macros, DI Studio) to PySpark using automated tools (SAS2PY) and manual refactoring.
- Pipeline Engineering: Design, build, and troubleshoot complex ETL/ELT workflows and data marts on AWS.
- Performance Tuning: Optimise Spark workloads for execution efficiency, partitioning, and cost-effectiveness.
- Quality Assurance: Implement clean coding principles ...
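The partitioning side of the performance-tuning responsibility often comes down to choosing a sensible partition count before a heavy shuffle. A minimal sketch, assuming the common rule of thumb of roughly 128 MB per partition (the figure is an assumption, not from the listing); in PySpark the result would feed `df.repartition(n)`:

```python
# Partition sizing heuristic for Spark workloads (assumed ~128 MB
# target per partition, a widely used rule of thumb):

def target_partitions(input_bytes, target_mb=128, min_parts=1):
    """Estimate a repartition() count so each partition is ~target_mb."""
    parts = input_bytes // (target_mb * 1024 * 1024)
    return max(min_parts, parts or min_parts)

# e.g. a 10 GiB input:
print(target_partitions(10 * 1024**3))  # 80
```

Too few partitions under-uses the cluster and risks spilling; too many adds scheduling and small-file overhead, which is also a cost concern on AWS.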