on experience or strong interest in working with Foundry as a core platform Forward-Deployed Engineering – delivering real-time solutions alongside users and stakeholders Broader Skillsets of Interest: Python & PySpark – for data engineering and workflow automation Platform Engineering – building and maintaining scalable, resilient infrastructure Cloud (AWS preferred) – deploying and managing services in secure environments Security Engineering & Access Control – designing …
London (City of London), South East England, United Kingdom
Morela Solutions
on experience or strong interest in working with Foundry as a core platform Forward-Deployed Engineering – delivering real-time solutions alongside users and stakeholders Broader Skillsets of Interest: Python & PySpark – for data engineering and workflow automation Platform Engineering – building and maintaining scalable, resilient infrastructure Cloud (AWS preferred) – deploying and managing services in secure environments Security Engineering & Access Control – designing …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
with deploying services in Docker and Kubernetes Experience writing production-grade code and applying SOLID programming principles, including test-driven development (TDD) approaches Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL) Experience in source-control software, e.g., GitHub Proficient at communicating results concisely, both verbally and in writing Experience in data and model monitoring is …
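The TDD approach this listing asks for can be sketched in miniature: the test is written first (and fails), then the smallest implementation that passes it is added. The function, its name, and its thresholds are invented for illustration, not taken from the listing:

```python
# TDD sketch: in practice the test function below is written first and run
# (failing) before the implementation exists. All names and thresholds here
# are hypothetical.

def band_for_premium(premium: float) -> str:
    """Return a pricing band for an annual premium (illustrative rules)."""
    if premium < 0:
        raise ValueError("premium must be non-negative")
    if premium < 500:
        return "low"
    if premium < 1500:
        return "medium"
    return "high"

def test_band_for_premium():
    # These assertions define the behaviour before the code is written.
    assert band_for_premium(0) == "low"
    assert band_for_premium(750) == "medium"
    assert band_for_premium(2000) == "high"

test_band_for_premium()
```

Under pytest the `test_` function would be collected and run automatically; calling it directly here just shows the red-green loop without any framework.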
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
with deploying services in Docker and Kubernetes Experience writing production-grade code and applying SOLID programming principles, including test-driven development (TDD) approaches Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL) Experience in source-control software, e.g., GitHub Proficient at communicating results concisely, both verbally and in writing Experience in data and model monitoring is …
environment. Experience in line management or team leadership, with a track record of developing and supporting engineers. Strong proficiency in: AWS (Lambda, Glue, ECS, S3, etc.). Python and PySpark (data pipelines, APIs, automation). TypeScript and React (frontend development). Excellent communication and stakeholder management skills. Demonstrated expertise in technical design and architecture of distributed systems. Familiarity with …
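A minimal sketch of the Python/Lambda side of a stack like the one listed: a handler that aggregates records from an event payload. The event shape and field names are assumptions for illustration; a real Glue/Lambda pipeline would be driven by actual AWS event sources:

```python
import json

def handler(event, context):
    """Lambda-style entry point. The `records`/`amount` event shape is
    hypothetical, chosen only to illustrate the handler pattern."""
    records = event.get("records", [])
    total = sum(r.get("amount", 0) for r in records)
    return {
        "statusCode": 200,
        "body": json.dumps({"count": len(records), "total": total}),
    }

# Local invocation with a fake event — no AWS account needed to unit-test this:
resp = handler({"records": [{"amount": 10}, {"amount": 32}]}, None)
```

Keeping the handler a plain function over a plain dict is what makes it testable locally before it is ever deployed behind an AWS trigger.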
Nottingham, Nottinghamshire, East Midlands, United Kingdom Hybrid / WFH Options
Littlefish
roles. Ability to manage multiple projects and priorities in a fast-paced environment. Experience with SSRS/SSAS (Tabular with DAX & OLAP with MDX)/SSIS. Experience with Databricks, PySpark, and other data science tools. Experience using Azure DevOps/Git. Microsoft Certified on Data (e.g. Fabric) This is a client-facing role, so the following are essential …
fast-paced environments. Experience working with Software Engineers to provide feedback and validate the technical implementation for custom applications. Experience working with a scripting language like Python, SQL, Spark, PySpark or similar. Must be able to work on-site in Herndon, VA, with the ability to work in Springfield, VA as needed. Preferred Qualifications: Comfortable with briefing government leadership. …
understanding of AI/ML/DL and Statistics, as well as coding proficiency using related open-source libraries and frameworks. Significant proficiency in SQL and languages like Python, PySpark and/or Scala. Can lead and work independently, as well as play a key role in a team. Good communication and interpersonal skills for working in a multicultural work …
and complexity projects
• Bachelor's degree in Computer Science or related field (or 4 additional years of relevant SWE experience)
• Strong background in analytics development
• Proficiency in Pig and PySpark
Desired:
• Experience with patch management and IAVA tracking
• Programming skills in Python, Java, or Scala
• Familiarity with NiFi and Ansible
• Experience working in Agile environments
Security Clearance Required: TS …
a minimum of 6-8 years of core Python development experience. Strong competency with Python libraries such as Pandas and NumPy for data wrangling, analysis, and manipulation. Expertise in PySpark for large-scale data processing and loading into databases. Proficiency in data querying and manipulation with Oracle and PostgreSQL. Skilled in API programming, handling JSON, CSV, and other unstructured …
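The Pandas/NumPy wrangling this listing calls for can be sketched briefly. The column names and data are invented; the pattern — clean missing values, then aggregate — is the generic one:

```python
import numpy as np
import pandas as pd

# Illustrative wrangling sketch; the dataset and column names are invented.
df = pd.DataFrame({
    "customer": ["a", "b", "a", "c"],
    "amount": [100.0, np.nan, 250.0, 75.0],
})

# Treat missing amounts as zero, then aggregate per customer.
df["amount"] = df["amount"].fillna(0.0)
totals = df.groupby("customer")["amount"].sum()
```

The same `groupby`/`agg` shape carries over almost verbatim to PySpark DataFrames (`df.groupBy("customer").sum("amount")`), which is why fluency in one tends to transfer to the other.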
maximize data processing efficiency and real-time insights. Collaborate closely with cross-functional teams to ensure seamless integration and delivery of analytics solutions. Required Skills and Experience: Experience in PySpark, Pig or Piranhas. Understanding of Map/Reduce and/or streaming analytics methodologies. Desired Experience: Patch management Python Java Scala NiFi Ansible Lambda Functions AWS Need 6 years …
Data Management
Required:
• Strong background in analytics development
• Pig
• PySpark
• Piranhas
Desired:
• Patch management and IAVA tracking
• Python
• Java
• Scala
• NiFi
• Ansible
• Experience working in Agile environment
Security Clearance Required: TS/SCI with Poly
About Avid Technology Professionals
Avid Technology Professionals, LLC (ATP) is a premier provider of software and systems engineering, and acquisition program management services for …
Title: GCP Data Engineer Location: Philadelphia, PA (candidates willing to relocate may be submitted) GCP Data Engineer – GCP Dataflow and Apache Beam (key skills) Primary skills: PySpark, Spark, Python, Big Data, GCP, Apache Beam, Dataflow, Airflow, Kafka and BigQuery; GFO, Google Analytics; JavaScript is a must. Strong experience with Dataflow and BigQuery. Should have experience leading a team or
… an enterprise-scale Customer Data Platform (CDP). Experience in at least one programming language (Python strongly preferred), cloud computing platforms (e.g., GCP), big data tools such as Spark/PySpark, columnar datastores (BigQuery preferred), DevOps processes/tooling (CI/CD, GitHub Actions), infrastructure as code frameworks (Terraform), BI tools (e.g., DOMO, Tableau, Looker), pipeline orchestration (e.g., Airflow). Fluency …
related field. • Demonstrated experience with intermediate to advanced full stack development. • Demonstrated experience with the ability to consume complex REST APIs. • Demonstrated experience with tools, languages, and frameworks: Python, PySpark, JavaScript, Vue, Nuxt.js, and Viz.js. • Demonstrated experience with AWS services: S3 and EC2. • Demonstrated experience with databases: relational, graph (Neo4J/Graph-Tool), and NoSQL/document (MongoDB). …
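On the "consume complex REST APIs" point, the parsing half can be sketched with the standard library alone. The response below is canned rather than fetched live, and its endpoint-style payload shape (nested `data.items`, a `next_page` pagination field) is invented for illustration:

```python
import json

# A canned body such as urllib.request.urlopen(url).read() might return;
# the payload shape is hypothetical.
raw = '''{
  "data": {"items": [
    {"id": 1, "tags": ["alpha"]},
    {"id": 2, "tags": ["beta", "gamma"]}
  ]},
  "next_page": null
}'''

payload = json.loads(raw)
items = payload["data"]["items"]
# Flatten the nested tag lists across all items.
all_tags = [t for item in items for t in item["tags"]]
```

With a live API, the same parsing code would sit inside a loop that keeps requesting while `next_page` is non-null — the usual pagination pattern for "complex" REST responses.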
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
Knowledge of the technical differences between different packages for some of these model types would be an advantage. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL) A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science) Experience with WTW's Radar software is preferred Proficient at communicating results in a concise …
team members on a regular basis
• Adhere to all established policies regarding security, completion of mandatory training courses and timesheet submission
Required Experience:
• Experience developing in Python
• Experience with the PySpark application programming interface (API) to process large datasets in a distributed cluster
• Ability to efficiently balance simultaneous, high-profile projects and deliver those projects in an effective manner is …
etc. Knows when and why to use specific algorithms depending on the business use case. Can fine-tune/train models and evaluate performance. Technical Skills: Proficient in Python, PySpark, and working with MongoDB JSON data. Must be comfortable working directly with raw/complex datasets and drawing insights. Experience in model selection based on data exploration. Bonus (Not …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
technical concepts to a range of audiences. Able to provide coaching and training to less experienced members of the team. Essential skills: Programming Languages such as Spark, Java, Python, PySpark, Scala (minimum of 2). Extensive Data Engineering hands-on experience (coding, configuration, automation, delivery, monitoring, security). ETL Tools such as Azure Data Factory (ADF) and Databricks or … UK, and you MUST have the Right to Work in the UK long-term without the need for Company Sponsorship. KEYWORDS Senior Data Engineer, Coding Skills, Spark, Java, Python, PySpark, Scala, ETL Tools, Azure Data Factory (ADF), Databricks, HDFS, Hadoop, Big Data, Cloudera, Data Lakes, Azure Data, Delta Lake, Data Lake, Databricks Lakehouse, Data Analytics, SQL, Geospatial Data, FME …
mathematical models, methods, and/or techniques (e.g. algorithm development) to study issues and solve problems, engineering (electrical or computer), and/or high performance computing. Preferred: Python & PySpark experience
role for you. Key Responsibilities: Adapt and deploy a cutting-edge platform to meet customer needs Design scalable generative AI workflows (e.g., using Palantir) Execute complex data integrations using PySpark and similar tools Collaborate directly with clients to understand their priorities and deliver impact Why Join? Be part of a mission-driven startup redefining how industrial companies operate Work …