with machine learning frameworks, such as TensorFlow, PyTorch, or scikit-learn. Intermediate project management experience, with the ability to prioritize and manage competing demands. Advanced proficiency in Python, SQL, PySpark, or similar analytic tools. Strong problem-solving and analytical skills, with the ability to think strategically and balance technical depth with business acumen. Strong communication skills, both written and …
You should be experienced in a data engineering role, demonstrating a strong track record of designing, building, and maintaining data pipelines and data architectures. Required Skills - Proficiency in Python, PySpark, and SQL for data manipulation and querying. Experience with containerisation technologies, specifically Kubernetes and Docker. Proven experience in designing and implementing data pipelines, working with big data technologies and architectures. …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Oliver James
Azure. Especially Synapse, ADF and Power BI (Datasets and Reports). Ideally SSIS, SSRS, SSAS, with some understanding of Power App design and delivery. Proficient in SQL and Python (PySpark). Understanding of data modelling concepts. Experience of working with code management & deployment tools. Proficient in debugging, monitoring, tuning and troubleshooting BI solutions. Knowledge and a proven track record …
London (City of London), South East England, United Kingdom
Morela Solutions
on experience or strong interest in working with Foundry as a core platform
Forward-Deployed Engineering – delivering real time solutions alongside users and stakeholders
Broader Skillsets of Interest:
Python & PySpark – for data engineering and workflow automation
Platform Engineering – building and maintaining scalable, resilient infrastructure
Cloud (AWS preferred) – deploying and managing services in secure environments
Security Engineering & Access Control – designing …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
with deploying services in Docker and Kubernetes. Experience in creating production-grade code and applying SOLID programming principles, including test-driven development (TDD) approaches. Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL). Experience in source-control software, e.g. GitHub. Proficient at communicating results in a concise manner, both verbally and in writing. Experience in data and model monitoring is …
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
with deploying services in Docker and Kubernetes. Experience in creating production-grade code and applying SOLID programming principles, including test-driven development (TDD) approaches. Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL). Experience in source-control software, e.g. GitHub. Proficient at communicating results in a concise manner, both verbally and in writing. Experience in data and model monitoring is …
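Both listings above pair Python with test-driven development (TDD). As a minimal sketch of that workflow, the test below would be written before the code it exercises; the `pricing` module and `loss_ratio` function are hypothetical names invented for the example.

```python
# test_pricing.py - a TDD-style test written before the implementation exists.
# The pricing module and loss_ratio function are hypothetical placeholders.
import pytest
from pricing import loss_ratio

def test_loss_ratio_basic():
    # 50 in claims against 200 in premiums should give 0.25
    assert loss_ratio(claims=50.0, premiums=200.0) == pytest.approx(0.25)

def test_loss_ratio_zero_premiums_raises():
    # Division by zero should surface as a clear, typed error
    with pytest.raises(ValueError):
        loss_ratio(claims=50.0, premiums=0.0)
```

In TDD these tests fail first, and the implementation is then written to make them pass.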
environment. Experience in line management or team leadership, with a track record of developing and supporting engineers. Strong proficiency in: AWS (Lambda, Glue, ECS, S3, etc.). Python and PySpark (data pipelines, APIs, automation). TypeScript and React (frontend development). Excellent communication and stakeholder management skills. Demonstrated expertise in technical design and architecture of distributed systems. Familiarity with …
Nottingham, Nottinghamshire, East Midlands, United Kingdom Hybrid / WFH Options
Littlefish
roles. Ability to manage multiple projects and priorities in a fast-paced environment. Experience with SSRS/SSAS (Tabular with DAX & OLAP with MDX)/SSIS. Experience with Databricks, PySpark, and other data science tools. Experience of using Azure DevOps/Git. Microsoft Certified on Data (e.g. Fabric). This is a client-facing role, so the following are essential …
Guildford, South East England, United Kingdom Hybrid / WFH Options
BP Energy
systems, and wants to have a direct impact on data-driven decision-making. Key Responsibilities: Design, build, and maintain scalable and reliable ETL/ELT data pipelines using Python, PySpark, and SQL. Develop and manage data workflows and orchestration using tools such as Airflow or similar. Optimize data processes for performance, scalability, and cost-efficiency, particularly in cloud environments. … security and compliance best practices are followed across data systems and processes. Required Qualifications: 5+ years of experience in data engineering or a related field. Proficiency in Python and PySpark for data processing and automation. Strong command of SQL for data querying, transformation, and performance tuning. Deep experience with cloud platforms, preferably AWS (e.g., S3, Glue, Redshift, Athena, EMR …
Sunbury, South East England, United Kingdom Hybrid / WFH Options
BP Energy
systems, and wants to have a direct impact on data-driven decision-making. Key Responsibilities: Design, build, and maintain scalable and reliable ETL/ELT data pipelines using Python, PySpark, and SQL. Develop and manage data workflows and orchestration using tools such as Airflow or similar. Optimize data processes for performance, scalability, and cost-efficiency, particularly in cloud environments. … security and compliance best practices are followed across data systems and processes. Required Qualifications: 5+ years of experience in data engineering or a related field. Proficiency in Python and PySpark for data processing and automation. Strong command of SQL for data querying, transformation, and performance tuning. Deep experience with cloud platforms, preferably AWS (e.g., S3, Glue, Redshift, Athena, EMR …
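The two BP Energy listings above describe the same core workload: PySpark ETL steps orchestrated by a tool like Airflow on AWS. As a rough sketch of one such pipeline step, the example below reads raw CSV from object storage, types and aggregates it, and writes curated Parquet; the bucket paths and column names are invented for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_spend_etl").getOrCreate()

# Extract: raw CSV landed in object storage (paths are placeholders)
orders = spark.read.option("header", True).csv("s3://example-raw/orders/")

# Transform: cast the amount column, then aggregate spend per customer
daily_spend = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("total_spend"))
)

# Load: write Parquet for downstream SQL querying
daily_spend.write.mode("overwrite").parquet("s3://example-curated/daily_spend/")
```

In practice a step like this would typically be triggered by an Airflow task (for example via spark-submit or an AWS Glue job), which is the orchestration half of the role.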
fast-paced environments. Experience working with Software Engineers to provide feedback and validate the technical implementation for custom applications. Experience working with a scripting language like Python, SQL, Spark, PySpark or similar. Must be able to work on-site in Herndon, VA, with the ability to work in Springfield, VA as needed. Preferred Qualifications: Comfortable with briefing government leadership. …
understanding of AI/ML/DL and Statistics, as well as coding proficiency using related open-source libraries and frameworks. Significant proficiency in SQL and languages like Python, PySpark and/or Scala. Can lead and work independently, as well as play a key role in a team. Good communication and interpersonal skills for working in a multicultural work …
and complexity projects
• Bachelor's degree in Computer Science or related field (or 4 additional years of relevant SWE experience)
• Strong background in analytics development
• Proficiency in Pig and PySpark
Desired:
• Experience with patch management and IAVA tracking
• Programming skills in Python, Java, or Scala
• Familiarity with NiFi and Ansible
• Experience working in Agile environments
Security Clearance Required: TS …
a minimum of 6-8 years of core Python development experience. Strong competency with Python libraries such as Pandas and NumPy for data wrangling, analysis, and manipulation. Expertise in PySpark for large-scale data processing and loading into databases. Proficiency in data querying and manipulation with Oracle and PostgreSQL. Skilled in API programming, handling JSON, CSV, and other unstructured …
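As a small, self-contained illustration of the Pandas/NumPy data wrangling this listing asks for, the sketch below loads a CSV, coerces a dirty numeric column, derives a flag, and aggregates. The file and column names are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical input file and columns, for illustration only.
df = pd.read_csv("transactions.csv", parse_dates=["created_at"])

# Coerce dirty numeric data: unparseable values become NaN and are dropped.
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df = df.dropna(subset=["amount"])

# Derive a feature and summarize with named aggregations.
df["is_weekend"] = df["created_at"].dt.dayofweek >= 5
summary = df.groupby("is_weekend")["amount"].agg(
    total="sum", mean="mean", spread=np.std
)
print(summary)
```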
maximize data processing efficiency and real-time insights. Collaborate closely with cross-functional teams to ensure seamless integration and delivery of analytics solutions. Required Skills and Experience: Experience in PySpark, Pig or Piranhas. Understanding of Map/Reduce and/or streaming analytics methodologies. Desired Experience: Patch management, Python, Java, Scala, NiFi, Ansible, Lambda Functions, AWS. Need 6 years …
Data Management
Required:
• Strong background in analytics development
• Pig
• PySpark
• Piranhas
Desired:
• Patch management and IAVA tracking
• Python
• Java
• Scala
• NiFi
• Ansible
• Experience working in an Agile environment
Security Clearance Required: TS/SCI with Poly
About Avid Technology Professionals
Avid Technology Professionals, LLC (ATP) is a premier provider of software and systems engineering, and acquisition program management services for …
Title: GCP Data Engineer. Location: Philadelphia, PA (candidates willing to relocate can be submitted). GCP Data Engineer - GCP Dataflow and Apache Beam (key skills). Primary skills: PySpark, Spark, Python, Big Data, GCP, Apache Beam, Dataflow, Airflow, Kafka and BigQuery, GFO, Google Analytics. JavaScript is a must. Strong experience with Dataflow and BigQuery. The candidate should have experience leading the team or … an enterprise-scale Customer Data Platform (CDP). Experience in at least one programming language (Python strongly preferred), cloud computing platforms (e.g. GCP), big data tools such as Spark/PySpark, columnar datastores (BigQuery preferred), DevOps processes/tooling (CI/CD, GitHub Actions), infrastructure-as-code frameworks (Terraform), BI tools (e.g. DOMO, Tableau, Looker), pipeline orchestration (e.g. Airflow). Fluency …
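The Dataflow and Apache Beam requirement above corresponds to pipelines along the lines of the sketch below, which reads JSON events from Cloud Storage, filters them, and appends to BigQuery. The bucket, table, and field names are placeholders, not details from the listing.

```python
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Runs locally by default; pass --runner=DataflowRunner (plus project and
# region options) to execute the same pipeline on GCP Dataflow.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "ParseJson" >> beam.Map(json.loads)
        | "KeepPurchases" >> beam.Filter(lambda e: e.get("type") == "purchase")
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.purchases",
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```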
related field.
• Demonstrated experience with intermediate to advanced full stack development.
• Demonstrated experience with the ability to consume complex REST APIs.
• Demonstrated experience with tools, languages, and frameworks: Python, PySpark, JavaScript, Vue, Nuxt.js, and Viz.js.
• Demonstrated experience with AWS services: S3 and EC2.
• Demonstrated experience with databases: relational, graph (Neo4J/Graph-Tool), and NoSQL/document (MongoDB). …
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
Knowledge of the technical differences between packages for some of these model types would be an advantage. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW's Radar software is preferred. Proficient at communicating results in a concise …