London, South East England, United Kingdom Hybrid / WFH Options
trg.recruitment
Rate: Up to £600 per day
📆 Contract: 6 months (Outside IR35, potential to go perm)
🛠 Tech Stack: Azure Data Factory, Synapse, Databricks, Delta Lake, PySpark, Python, SQL, Event Hub, Azure ML, MLflow
We've partnered with a new AI-first professional services consultancy that's taking on the Big … and supporting team capability development.
What You Need:
✔ 5+ years in data engineering or backend cloud development
✔ Strong Python, SQL, and Databricks skills (especially PySpark & Delta Lake)
✔ Deep experience with Azure: Data Factory, Synapse, Event Hub, Azure Functions
✔ Understanding of MLOps tooling like MLflow and integration with AI pipelines
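For context on the kind of work this stack implies, here is a minimal sketch of a PySpark job writing to Delta Lake with an MLflow-tracked step. All paths, table names, and columns are illustrative assumptions, not details from the listing:

```python
from pyspark.sql import SparkSession, functions as F
import mlflow

spark = (
    SparkSession.builder
    .appName("ingest-events")  # hypothetical job name
    .getOrCreate()
)

# Read raw events (path and schema are assumptions) and do a simple cleanup;
# the source is assumed to already carry event_id and event_date columns.
raw = spark.read.json("abfss://raw@lake.dfs.core.windows.net/events/")
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Append to a Delta table, partitioned by date for downstream queries
# (requires the delta-spark package on the cluster).
(clean.write.format("delta")
      .mode("append")
      .partitionBy("event_date")
      .save("abfss://curated@lake.dfs.core.windows.net/events/"))

# Track the run in MLflow so pipeline executions are auditable.
with mlflow.start_run(run_name="ingest-events"):
    mlflow.log_metric("rows_ingested", clean.count())
```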
of data for critical client projects. How to develop and enhance your knowledge of agile ways of working and working in an open source stack (PySpark/PySQL). Quality engineering professionals utilize Accenture delivery assets to plan and implement quality initiatives to ensure solution quality throughout delivery. As a … practices and contribute to data analytics insights and visualization concepts, methods, and techniques. We are looking for experience in the following skills: Palantir, Python, PySpark/PySQL, AWS or GCP. Set yourself apart: Palantir Certified Data Engineer; certified cloud data engineering (preferably AWS). What's in it for you …
data feeds
Subject matter expertise with relational databases and NoSQL
Building and operating high-performance data processing pipelines using Lambda, Step Functions and PySpark
Building high-quality User Interface/User Experiences with the React framework and WebGL
Designing and operating large scale graph databases using Apache Cassandra …
Sponsor's data graphing tool traversal capabilities built upon Apache Gremlin
Building and operating high-performance data processing pipelines using Lambda, Step Functions and PySpark on the Sponsor's infrastructure with EMR
Working with the Sponsor's enterprise services used for Data Management, including the enterprise catalog service (and …
Senior Data Engineer Wiltshire - 3 days in office £65,000 About The Company The company operates in both B2B and D2C markets, providing food solutions to institutions and individuals. With over 30 years of experience and a presence in 400 …
/Chantilly, VA. You'll be part of a collaborative Agile team working on modern, cloud-native software solutions using open-source technologies like PySpark, FastAPI, Docker, and AWS. Our developers take ownership at every level - from translating user stories and writing clean, testable code, to deploying containerized microservices … FastAPI and similar frameworks. Package and deploy code in containerized (Docker-based) environments. Required Skills: Proficiency with Python and data libraries (e.g., Pandas, NumPy, PySpark). Familiarity with FastAPI or similar frameworks for web API development. Experience writing automated tests using PyTest and mocking tools. Understanding of software development … through testing and CI/CD integration. Package, test, and deploy solutions using Docker in cloud environments. Required Skills: Strong knowledge of Python libraries (PySpark, Pandas, NumPy). Hands-on experience with FastAPI and test automation frameworks. Working knowledge of Agile delivery models and version control systems. Preferred: Experience …
Sr. Hadoop with SQL, Hive
Work Location: Tampa, FL
Duration: Full time
Job Description:
Mandatory Certificate: Databricks Certified Developer - Apache Spark 3.0
Skills: Python, PySpark, Spark SQL, Hadoop, Hive
Responsibilities: Ensure effective Design, Development, Validation and Support activities in line with client needs and architectural requirements. Ensure continual knowledge … our clients navigate their next in their digital transformation journey.
Requirements: A good professional with at least 6-10 yrs of experience in Big Data, PySpark, Hive, Hadoop and PL/SQL. Good knowledge of AWS and Snowflake. Good understanding of CI/CD and system design. Candidate with prior experience working on technologies on …
and reports using BI tools like Oracle Business Intelligence Enterprise Edition (OBIEE) and Oracle Analytics Cloud (OAC). Proficiency in programming languages such as PySpark and PL/SQL for data processing and automation. A strong understanding of the Azure cloud platform and its services for deploying and managing … cloud-based solutions. Experience with cloud services such as Azure Data Services, ADLS and AKS. Experience with Python and PySpark for distributed data processing, along with proficiency in NumPy, Pandas and other data manipulation libraries. Experience in optimizing big data architectures for high availability and performance. Strong problem-solving …
Senior Software Engineer Location: Springfield, VA (100% onsite) Salary: Up to $210k (DOE) Clearance: Active Top Secret with SCI eligibility and CI Polygraph Benefits: 100% employer-paid healthcare Role Overview: Join to work on challenging projects in software engineering. Collaborate …
of data for critical client projects. How to develop and enhance your knowledge of agile ways of working and working in an open source stack (PySpark/PySQL). Quality engineering professionals utilize Accenture delivery assets to plan and implement quality initiatives to ensure solution quality throughout delivery. As a … team members to provide regular progress updates and raise any risks/concerns/issues. Core skills we're working with include: Palantir, Python, PySpark/PySQL, AWS or GCP. What's in it for you: At Accenture, in addition to a competitive basic salary, you will also have …
contract data engineers to supplement the existing team during the implementation phase of a new data platform. Main Duties and Responsibilities: Write clean and testable code using PySpark and Spark SQL scripting languages to enable our customer data products and business applications. Build and manage data pipelines and notebooks, deploying code in a … Experience: Excellent understanding of Data Lakehouse architecture built on ADLS. Excellent understanding of data pipeline architectures using ADF and Databricks. Excellent coding skills in PySpark and SQL. Excellent technical governance experience, such as version control and CI/CD. Strong understanding of designing, constructing, administering, and maintaining data warehouses.
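As an illustration of the "clean and testable" PySpark code this role describes, a minimal sketch: a pure transformation function that can be unit-tested against a local SparkSession. Column names and the example schema are assumptions:

```python
from pyspark.sql import DataFrame, SparkSession, functions as F

def active_customers(orders: DataFrame, days: int = 30) -> DataFrame:
    """Customers with at least one order in the last `days` days."""
    cutoff = F.date_sub(F.current_date(), days)
    return (
        orders.filter(F.col("order_date") >= cutoff)
              .groupBy("customer_id")
              .agg(F.count("*").alias("recent_orders"))
    )

# A test (PyTest or a plain script) can exercise it on a tiny local DataFrame:
if __name__ == "__main__":
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.createDataFrame(
        [("c1", "2099-01-01")], ["customer_id", "order_date"]
    ).withColumn("order_date", F.to_date("order_date"))
    active_customers(df).show()
```

Keeping transformations as pure DataFrame-in/DataFrame-out functions is what makes notebook code deployable and testable through version control and CI/CD.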
Knowledge of the cloud software deployment process is also key, including familiarity with parallel processing in Python, large data processing using packages such as PySpark, and optimizing Python code for deployment as part of a containerized architecture. Role Highlights: Work with the Chief Engineer and Technical Leads to translate … Experience leading discussions with customer stakeholders to align on requirements and technical implementations. 7+ years' demonstrated experience with: Data Processing Python Libraries such as PySpark, Pandas and NumPy. Experience with API development in Python using Python libraries such as FastAPI. Experience with Unit Testing Frameworks in PyTest and Mocking …
Knowledge of the cloud software deployment process is also key, including familiarity with parallel processing in Python, large data processing using packages such as PySpark, and optimizing Python code for deployment as part of a containerized architecture. Role Highlights: Work with the Chief Engineer and Technical Leads to translate … with customer stakeholders to align on requirements and technical implementations. Required Skills: Minimum 3-5 years' experience with: Data Processing Python Libraries such as PySpark, Pandas and NumPy. Experience with API development in Python using Python libraries such as FastAPI. Experience with Unit Testing Frameworks in PyTest and Mocking …
Knowledge of the cloud software deployment process is also key, including familiarity with parallel processing in Python, large data processing using packages such as PySpark, and optimizing Python code for deployment as part of a containerized architecture. Responsibilities: Work with the Chief Engineer and Technical Leads to translate requirements … integrating Python code into Docker containers as part of a distributed architecture. Requirements: 1-3 years' experience with: Data Processing Python Libraries such as PySpark, Pandas and NumPy. Experience with API development in Python using libraries such as FastAPI. Experience with Unit Testing Frameworks in PyTest and Mocking. Preferred …
Knowledge of the cloud software deployment process is also key, including familiarity with parallel processing in Python, large data processing using packages such as PySpark, and optimizing Python code for deployment as part of a containerized architecture. Responsibilities: Work with the Chief Engineer and Technical Leads to translate requirements … Experience leading discussions with customer stakeholders to align on requirements and technical implementations. Requirements: 7+ years' experience with: Data Processing Python Libraries such as PySpark, Pandas and NumPy. Experience with API development in Python using libraries such as FastAPI. Experience with Unit Testing Frameworks in PyTest and Mocking. Preferred …
requirements into user stories and develop new functionality within existing software systems.
- Implement Python-based data processing functionality using open-source libraries such as PySpark, Pandas, and NumPy.
- Build automated test frameworks using PyTest and mocking tools to ensure code quality within CI/CD deployment pipelines.
- Develop and … scalability in large data environments.
Requirements
Minimum Qualifications:
- 1-3 years of professional experience in Python development.
- Experience with data processing libraries such as PySpark, Pandas, and NumPy.
- Proficiency in developing APIs using FastAPI or similar frameworks.
- Hands-on experience with unit testing frameworks, including PyTest and mocking practices.
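To make the FastAPI-plus-PyTest requirement concrete, a minimal sketch; the endpoint and test names are illustrative assumptions:

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient

app = FastAPI()

@app.get("/health")
def health() -> dict:
    # Trivial liveness endpoint of the kind used by deployment probes.
    return {"status": "ok"}

# PyTest discovers test_* functions; TestClient calls the app in-process,
# so the test runs inside a CI/CD pipeline without a live server.
client = TestClient(app)

def test_health() -> None:
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}
```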
Are you a data-driven problem solver with a passion for machine learning and predictive analytics? Do you thrive on building models that drive real business impact, optimizing processes, and uncovering insights from complex datasets? Are you both detail-oriented …
Mid-Level Cloud Software Engineer.
Responsibilities: The Cloud Software Engineer will create, test, and maintain the various analytics running in Pig and/or PySpark to create various reports for the users. The Engineer will also provide report updates and create new reports based on requirements. Ensure the data … analytics; event driven analytics; set of analytics orchestrated through rules engine. Experience documenting data models, schemas, data element dictionaries, and other technical specifications. Pig, PySpark, Piranhas.
Requirements: TS/SCI Clearance with FS Polygraph. Bachelor's degree in Computer Science or related discipline from an accredited college or university. Four … analytics; event driven analytics; set of analytics orchestrated through rules engine. Experience documenting data models, schemas, data element dictionaries, and other technical specifications. Pig, PySpark, Piranhas.
Desired: Python, Java, Scala, Apache NiFi, Ansible, Agile experience, experience deploying applications in a cloud environment, MongoDB.
Benefits for this position include: 401K …
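For a sense of the report-generating PySpark analytics described above, a minimal sketch; the source path and fields are assumptions, and the listing's Pig and Piranhas components are not shown:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-report").getOrCreate()

# Hypothetical event source; fields `timestamp` and `event_type` are assumed.
events = spark.read.parquet("s3://bucket/events/")

# Aggregate events per day and type - the shape of a typical user report.
report = (
    events.groupBy(F.to_date("timestamp").alias("day"), "event_type")
          .count()
          .orderBy("day", "event_type")
)

report.write.mode("overwrite").csv("s3://bucket/reports/daily/", header=True)
```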
a sophisticated, proprietary methodology. To do this, we use cutting-edge technology: AWS services, an EMR cluster for Big Data, exploitation of AI models with PySpark processes, and web dev environments with Spring Boot servers & Java 11 and React & Redux front ends. We're an ambitious, talented & international team of 20 eager to …/PL-SQL Relevant experience with Python Relevant experience with ETLs & E/R models Relevant experience working with Data & Development teams Experience with PySpark Basic experience with AWS services (EMR, Lambdas, …) Fluency in Spanish and English Relevant experience working with financial data This gives extra points: Basic …
Senior Data Analyst - Pricing Data Engineering & Automation, CUO Global Pricing Let's care for tomorrow. Whether it's aircraft, international business, offshore wind parks or Hollywood film productions, Allianz Commercial has an extensive range of risks covered when it comes …
Newcastle upon Tyne, Tyne & Wear Hybrid / WFH Options
Client Server
e.g. Data Science, Mathematics, Statistics, Physics, Computer Science, Informatics or Engineering You have strong experience with analytics and data manipulation software e.g. R, Python, PySpark, SAS, SQL, SPSS You take a consultative approach and have polished communication and stakeholder management skills You're able to work independently and take … and inclusive environment Health and wellbeing support Volunteering opportunities Pension Apply now to find out more about this Data Scientist/Consultant (R, Python, PySpark, SAS, SQL, SPSS) opportunity. At Client Server we believe in a diverse workplace that allows people to play to their strengths and continually learn.
Job Description: Engineering Lead - GCP Data
Key Responsibilities: 12+ years of overall IT experience with 10+ years in building data warehouse/datamart solutions. Experience in implementing an end-to-end data platform for analytics on cloud, from ingestion to …
City, Edinburgh, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
Job Title: MS Fabric Architect
Location: Edinburgh, UK (Hybrid)
FTC Contract of 6 months
Job Description: The MS Fabric Architect role requires expertise in designing and implementing scalable data solutions with a focus on cloud architectures and Microsoft Fabric technologies.