work across the full technology stack, with a primary focus on React.js for front-end development and Node.js for back-end services. You'll manage complex data flows in PostgreSQL, utilize Redis for caching and performance optimization, and implement robust messaging systems such as Kafka. The role also includes responsibility for CI/CD pipelines, ensuring seamless deployment across … culture of innovation, collaboration, and continuous technical improvement.
Key Responsibilities
• Lead the design and development of dynamic, responsive web applications using React.js, Node.js, and related technologies.
• Build and optimize PostgreSQL databases, including schema design, complex queries, and performance tuning.
• Implement and manage Redis for caching, session management, and real-time data access.
• Develop and maintain CI/CD pipelines to … frameworks, and industry best practices, applying them where relevant.
Skills & Experience
• Proven experience as a Full Stack Developer with strong proficiency in React.js and Node.js.
• Solid understanding of PostgreSQL and Redis for data management and caching.
• Experience with Kafka or similar message streaming platforms.
• Strong knowledge of CI/CD tools (e.g., Jenkins, GitLab CI, GitHub Actions).
• Familiarity …
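The "Redis for caching, session management" bullet above describes a set-with-expiry/get pattern. A minimal sketch of that pattern, using a plain in-memory Python class as a stand-in for Redis (the class, its method names, and the `session:42` key are illustrative assumptions, not the employer's code):

```python
import time

class TTLCache:
    """In-memory cache with per-key expiry, mimicking the Redis
    SETEX/GET pattern (illustrative stand-in only, not a Redis client)."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def setex(self, key, ttl_seconds, value):
        # Store the value together with its absolute expiry time.
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # Lazily evict expired entries on access.
            del self._store[key]
            return None
        return value

cache = TTLCache()
cache.setex("session:42", ttl_seconds=60, value={"user": "alice"})
print(cache.get("session:42"))  # {'user': 'alice'}
```

In a real deployment a Redis client (e.g. `SETEX`/`GET` over the wire) replaces the dictionary, which is what makes the cache shared across Node.js service instances.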
experience - ideally hands-on with setup, configuration, and performance tuning. Understanding of Shopware-specific configuration and data structures. Technical background in server management, networking, and database administration (MySQL/PostgreSQL). Demonstrated ability to collaborate with non-technical colleagues - must be colleague-facing, with clear communication skills. Background in E-commerce platform management (Shopware, Magento, Shopify, or similar). Experience …
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd
Engineer (Python focused, with trading background)
Location: Canary Wharf, UK (hybrid: 3 days onsite, 2 days remote)
Role Type: 6-month contract with possibility of extension
Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, energy trading experience
Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in … developing applications in Python. Exposure to Python libraries such as NumPy, pandas, Beautiful Soup, Selenium, pdfplumber, Requests, etc. Proficient in SQL programming, particularly PostgreSQL. Knowledge of DevOps practices such as CI/CD, Jenkins, and Git. Experience working with AWS (S3) and Azure Databricks. Experienced in delivering projects with Agile and Scrum methodologies. Able to coordinate with teams …
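The "Proficient in SQL programming" requirement above can be sketched with a parameterised aggregate query. This uses the stdlib `sqlite3` module purely as a stand-in for PostgreSQL (an assumption for runnability; with `psycopg` the connection call and `%s` parameter style would differ, but the SQL is the same), and the `trades` table is a hypothetical example:

```python
import sqlite3

# sqlite3 (stdlib) stands in for PostgreSQL here so the example is
# self-contained; the table and data are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, product TEXT, volume REAL)")
conn.executemany(
    "INSERT INTO trades (product, volume) VALUES (?, ?)",
    [("power", 120.0), ("gas", 80.5), ("power", 60.0)],
)

# Parameterised aggregate: total traded volume for one product.
row = conn.execute(
    "SELECT SUM(volume) FROM trades WHERE product = ?", ("power",)
).fetchone()
print(row[0])  # 180.0
```

Passing values as parameters (`?` here, `%s` in psycopg) rather than formatting them into the SQL string is the practice interviewers usually probe for, since it prevents SQL injection.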
collaborate with cross-functional teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments.
Key Responsibilities
• Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark
• Collaborate with frontend/backend developers using Node.js or React
• Implement best practices in data modelling, ETL processes and performance optimisation
• Contribute to containerised deployments … is desirable, not essential
• Operate within Agile teams and support DevOps practices
What We're Looking For
• Proven experience as a Data Engineer in complex environments
• Strong proficiency in PostgreSQL and either Airflow or Spark
• Solid understanding of Node.js or React for integration and tooling
• Familiarity with containerisation technologies (Docker/Kubernetes) is a plus
• Excellent communication and stakeholder engagement …
London, South East, England, United Kingdom Hybrid / WFH Options
Noir
Java Developer - Financial Technology - London/Hybrid (INSIDE IR35) (Key skills: Java, Spring Boot, Kubernetes, AWS EKS, Amazon Redshift, PostgreSQL, RESTful APIs, CI/CD, Microservices, iOS/Android Native, Agile, Financial Services) Are you a highly skilled Java Developer with a passion for building scalable, high-performance financial systems? Do you enjoy working with cutting-edge technologies across backend … Kubernetes, and integrated with modern mobile and analytics platforms. As part of a collaborative, Agile development team, you will be responsible for building robust microservices, managing data pipelines across PostgreSQL and Amazon Redshift, and delivering performant, secure RESTful APIs used across web and mobile platforms. You'll work closely with iOS and Android engineers to ensure seamless end-to-end …
include:
• Designing, implementing and maintaining backend services and REST/GraphQL APIs using Python and Django (or Django REST Framework).
• Building performant, secure data models and database schemas (Postgres).
• Writing automated tests (unit/integration) and participating in code review processes.
• Collaborating with frontend engineers to define interfaces and deliver product features.
• Working with DevOps/Platform teams … to reliability and performance.
Tech Stack & Skills
Core skills:
• Strong Python development experience (5+ years preferred) with production Django/Django REST Framework work.
• Solid relational database experience, ideally Postgres (schema design, query optimisation).
• Test-driven development practices and experience with pytest or equivalent.
• Experience working with RESTful APIs and/or GraphQL.
• Familiarity with containerisation and cloud deployment …
data flows with Azure Data Factory
• Manage real-time data streams with Kafka/NATS
• Ensure security and compliance using Keycloak and best-practice access controls
• Design and optimise PostgreSQL and Elasticsearch databases
What We're Looking For
• Strong Python skills for automation and data engineering
• Hands-on ETL/ELT experience in hybrid environments
• Proven expertise in hybrid cloud …/orchestration experience (Docker, GitHub, Kubernetes)
• Practical knowledge of Azure Data Factory
• Event streaming experience (Kafka/NATS)
• Understanding of security practices across multi-platform environments
• Strong database development (PostgreSQL, Elasticsearch)
Nice to Have
• 5+ years' data engineering experience across cloud and on-prem
• Azure or related certifications
• Some DevOps engineering exposure
Clearance
This role requires Security Clearance (ACTIVE SC …
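The "ETL/ELT experience" bullet above follows a standard extract, transform, load shape. A stdlib-only sketch of the pattern (the field names and the list used as a sink are invented; in the role described, the source might be Azure Data Factory or a Kafka topic and the sink PostgreSQL):

```python
import csv
import io

def extract(raw_csv: str):
    """Extract: parse CSV rows from a source (a string here for
    self-containment; a real pipeline would read from storage or a topic)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: normalise fields and drop incomplete records."""
    return [
        {"name": r["name"].strip().lower(), "value": float(r["value"])}
        for r in rows
        if r.get("value")  # skip rows with a missing value
    ]

def load(rows, sink):
    """Load: append into a sink (a plain list standing in for a database)."""
    sink.extend(rows)
    return len(rows)

raw = "name,value\n Alice ,10\nBob,\nCarol,2.5\n"
sink = []
load(transform(extract(raw)), sink)
print(sink)  # [{'name': 'alice', 'value': 10.0}, {'name': 'carol', 'value': 2.5}]
```

Keeping the three stages as separate pure functions is what makes a pipeline like this easy to schedule and retry from an orchestrator such as Airflow or Data Factory.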
• Design and implement RESTful APIs for biogas trading, inventory management, and regulatory reporting
• Build complex business logic for RINS credit calculations, LCFS compliance, and carbon credit trading
• Work with PostgreSQL databases and Entity Framework Core for data persistence
• Implement authentication and authorization using Azure Active Directory and Microsoft Identity Platform
• Develop email workflow systems and automated reporting for regulatory compliance … FluentValidation
• Work with Docker and OpenShift for containerized deployments
Required Technical Skills
• 5+ years of experience with .NET development (C#, ASP.NET Core)
• Strong experience with Entity Framework Core and PostgreSQL
• Proficiency in RESTful API design and OpenAPI/Swagger documentation
• Experience with authentication systems (OAuth2, JWT, Azure AD)
• Knowledge of containerization (Docker) and Kubernetes/OpenShift
• Experience with background job …
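The "RINS credit calculations" bullet above boils down to volume-times-equivalence-value arithmetic. A deliberately simplified, hypothetical sketch — the real RINS and LCFS rules are set by EPA/CARB regulation, and the equivalence values, fuel-type names, and function below are all assumptions, not the actual compliance logic:

```python
from decimal import Decimal

# HYPOTHETICAL equivalence values for illustration only; real values
# come from the applicable regulation, not from this table.
EQUIVALENCE_VALUES = {
    "ethanol": Decimal("1.0"),
    "biogas_cng": Decimal("1.7"),
}

def rin_credits(fuel_type: str, gallons: Decimal) -> Decimal:
    """Simplified model: credits = volume x fuel-type equivalence value,
    rounded to two decimal places for reporting."""
    try:
        ev = EQUIVALENCE_VALUES[fuel_type]
    except KeyError:
        raise ValueError(f"no equivalence value configured for {fuel_type!r}")
    return (gallons * ev).quantize(Decimal("0.01"))

print(rin_credits("biogas_cng", Decimal("1000")))  # 1700.00
```

Using `Decimal` rather than `float` for monetary/credit arithmetic avoids binary rounding drift, which matters for regulatory reporting regardless of the exact formula; the same point applies to C#'s `decimal` in the .NET stack the ad describes.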
Senior Data Engineer | 3-Month Contract (Possible Extension) | £550-£650 p/day | Fully Remote | GCP, MongoDB, PostgreSQL | Start: Immediately
About the Role
On behalf of a leading healthcare technology company, we're looking for a Senior Data Engineer to join on an initial 3-month contract. This is an exciting opportunity to help shape scalable, data-driven systems that support …
Responsibilities
• Design, build, and maintain data pipelines and architectures in Google Cloud Platform (GCP).
• Develop MongoDB solutions from scratch, managing live/streaming data.
• Write, optimize, and tune PostgreSQL queries for large-scale workloads.
• Work with microservices, event-driven systems, and distributed architectures to support complex data flows.
• Collaborate with product, data science, and engineering teams to ensure data …
• Proven expertise in GCP data engineering tools (BigQuery, Dataflow, Pub/Sub, Composer, Cloud Storage).
• Strong hands-on MongoDB experience (schema design, aggregation pipelines, performance tuning).
• Advanced PostgreSQL query optimization skills.
• Solid coding ability in Python, Java, or Scala.
• Experience with microservices, event-driven architectures, and distributed systems.
• Familiarity with DevOps/SRE practices is a …
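The "optimize and tune PostgreSQL queries" bullet above usually starts with reading a query plan and adding an index. A sketch using the stdlib `sqlite3` module as a stand-in so it is runnable here (an assumption: in PostgreSQL you would use `EXPLAIN ANALYZE` via psycopg, but `CREATE INDEX` and the scan-vs-index-search distinction carry over; the `events` table is invented):

```python
import sqlite3

# sqlite3 stands in for PostgreSQL; table and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, patient_id INTEGER, ts TEXT)"
)
conn.executemany(
    "INSERT INTO events (patient_id, ts) VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

query = "SELECT * FROM events WHERE patient_id = ?"

# Plan before indexing: a full table scan over all 1000 rows.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchall()

conn.execute("CREATE INDEX idx_events_patient ON events (patient_id)")

# Plan after indexing: an index search on patient_id.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchall()

print(plan_before[0][-1])  # a SCAN over the events table
print(plan_after[0][-1])   # a SEARCH using idx_events_patient
```

The habit the ad is asking for is exactly this loop: inspect the plan, change the schema or query, and confirm the plan actually improved rather than assuming it did.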
/day Start: ASAP
Key Skills:
• Experience with Python and modern frameworks such as FastAPI or Flask.
• Understanding of SQLModel/SQLAlchemy and relational database design.
• Knowledge of PostgreSQL and query optimisation.
• Extensive exposure to Celery.
• Familiarity with AWS and CI/CD pipelines.
• (Bonus) Interest in data platforms, analytics, or cloud-native architecture.
• (Bonus) Experience in data-heavy …
Responsibilities:
• Design, develop, and maintain backend services using Python and FastAPI to power a data and intelligence platform.
• Build and optimise database models with SQLModel and SQLAlchemy.
• Integrate with PostgreSQL databases and external APIs.
• Contribute to feature development, testing, and deployment.
• Collaborate with the frontend and data teams to deliver scalable solutions.
• Own features end-to-end - from understanding the …
environments. The role involves designing and deploying new instances, implementing robust backup/restore processes, and managing end-to-end data migration. Alongside MongoDB, you'll also work with Postgres and other database technologies, ensuring performance, security, and reliability in a cloud-native setting.
What We're Looking For
We're keen to speak with engineers who have hands-on experience with …
• NoSQL databases (MongoDB essential)
• Postgres
• PaaS database services across GCP/AWS/Azure
• GitOps practices (GitLab/ArgoCD)
• Ansible for automation
• RabbitMQ (or similar integration services)
• Running databases in public cloud, ideally with PaaS offerings
Why This Role?
This is a high-impact contract where you'll play a key role in ensuring the reliability and scalability of data systems …