• Deep understanding of software architecture, object-oriented design principles, and data structures
• Extensive experience developing microservices using Java and Python
• Experience with distributed computing frameworks such as Hive/Hadoop and Apache Spark
• Good experience in test-driven development and automating test cases using Java/Python
• Experience in SQL/NoSQL (Oracle, Cassandra) database design
• Demonstrated ability to be proactive … HR related applications
• Experience with the following AWS services: Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda
• Working experience with Terraform
• Experience creating workflows for Apache Airflow

About Roku: Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build and monetize large audiences, and provide …
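The test-driven development skill named in this listing can be illustrated with a minimal, library-free Python sketch. The `slugify` helper and its tests are hypothetical, invented for illustration; in TDD the test cases below would be written before the function body:

```python
import unittest

def slugify(title: str) -> str:
    """Lowercase a title and join its words with hyphens (illustrative helper)."""
    words = [w.strip().lower() for w in title.split()]
    return "-".join(w for w in words if w)

class TestSlugify(unittest.TestCase):
    # In a TDD workflow these assertions exist first and fail until
    # slugify() is implemented to satisfy them.
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_extra_spaces(self):
        self.assertEqual(slugify("  Data   Engineer "), "data-engineer")

if __name__ == "__main__":
    # exit=False so the runner can be embedded without killing the process.
    unittest.main(argv=["tdd-sketch"], exit=False)
```

The same red-green-refactor loop applies in Java with JUnit; only the test runner changes.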
… of robust, future-proof data solutions.

Key Skills:
• Experience developing modern data stacks and cloud data platforms.
• Capable of engineering scalable data pipelines using ETL/ELT tools, e.g. Apache Spark, Airflow, dbt.
• Expertise with cloud data platforms, e.g. AWS (Redshift, Glue), Azure (Data Factory, Synapse), Google Cloud (BigQuery, Dataflow).
• Proficiency in data processing languages, e.g. Python, Java …
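The extract-transform-load shape behind the ETL/ELT tooling listed above can be sketched without any framework. A minimal Python illustration, with an invented CSV layout and field names standing in for a real source:

```python
import csv
import io

# Extract: in a real pipeline this would read from S3, a database, or an API.
RAW = """user_id,country,amount
1,UK,10.50
2,US,7.25
1,UK,4.75
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Cast types and aggregate revenue per country, a typical transform step.
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + float(row["amount"])
    return totals

def load(totals, target):
    # Load: an in-memory dict stands in for a warehouse table here.
    target.update(totals)
    return target

warehouse = load(transform(extract(RAW)), {})
print(warehouse)  # {'UK': 15.25, 'US': 7.25}
```

Tools like Spark, Airflow, and dbt add distribution, scheduling, and SQL modelling on top, but each stage maps onto one of these three functions.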
• … or MS degree in Computer Science or equivalent
• Experience developing Finance or HR related applications
• Working experience with Tableau
• Working experience with Terraform
• Experience creating workflows for Apache Airflow and Jenkins

Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
… Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet.
• Strong understanding of IT concepts, including security, IAM, Key Vault, and networking.
• Exposure to Apache Airflow and dbt is a bonus.
• Familiarity with agile principles and practices.
• Experience with Azure DevOps pipelines.

The "Nice to Haves":
• Certification in Azure or related technologies.
• Experience with …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fruition Group
… best practices for data security and compliance. Collaborate with stakeholders and external partners.

Skills & Experience:
• Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda).
• Proficient in Python, Apache Spark, and SQL.
• Experience in data warehouse design and data migration projects.
• Cloud data platform development and deployment.
• Expertise across data warehouse and ETL/ELT development in AWS …
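The warehouse-design and SQL skills asked for here usually start from a star schema: fact tables referencing dimension tables. A hedged, minimal sketch using Python's built-in sqlite3 (table names, columns, and data are all invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A tiny star schema: one fact table referencing one dimension table.
cur.executescript("""
CREATE TABLE dim_product (
    product_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL
);
CREATE TABLE fact_sales (
    sale_id    INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    quantity   INTEGER NOT NULL
);
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 3), (2, 1, 2), (3, 2, 5)])

# A typical analytical query: total quantity sold per product.
cur.execute("""
SELECT p.name, SUM(f.quantity)
FROM fact_sales f JOIN dim_product p USING (product_id)
GROUP BY p.name ORDER BY p.name
""")
print(cur.fetchall())  # [('gadget', 5), ('widget', 5)]
```

In Redshift the same schema would add distribution and sort keys, but the fact/dimension split and the aggregate-over-join query pattern are identical.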
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom
Randstad Technologies Recruitment
… institutions, alongside a proven record of relevant professional experience.
• Proven experience in a data specialist role with a passion for solving data-related problems.
• Expertise in SQL, Python, and Apache Spark, with experience working in a production environment.
• Familiarity with Databricks and Microsoft Azure is a plus.
• Financial Services experience is a bonus, but not required.
• Strong verbal and …
… of distributed systems, databases (PostgreSQL, Databricks, ClickHouse, Elasticsearch), and performance tuning.
• Familiarity with modern web frameworks and front-end technologies (React, Vue, Angular, etc.)
• Experience with data processing frameworks (Apache Spark, Kafka, Airflow, Dagster, or similar)
• Experience with cloud platforms (AWS, GCP, Azure) and containerization technologies (Docker/Kubernetes)
• Experience with testing frameworks
• Strong analytical skills and a data …
… Azure)
• Experience managing PKI/X.509 certificate infrastructure.
• Extensive experience supporting and implementing TLS/SSL certificate management systems.
• Proficient with token-based authentication services, Perfect Forward Secrecy (PFS), Apache, Nginx, HAProxy.
• Solid knowledge of Linux security and system operations.

Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support …
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
In Technology Group
… warehousing.
• Proficiency in Python or another programming language used for data engineering.
• Experience with cloud platforms (e.g., Azure, AWS, or GCP) is highly desirable.
• Familiarity with tools such as Apache Airflow, Spark, or similar is a plus.

What’s On Offer:
• Competitive salary between £45,000 – £55,000, depending on experience.
• Flexible hybrid working – 3 days on-site in …
Chester, Cheshire, United Kingdom Hybrid / WFH Options
Leidos LLC
… built features
• Integrate and extend Codice Alliance tools with the existing catalogue
• Build secure and modular services for ingesting, indexing, and querying geospatial and imagery data
• Work with OSGi, Apache Karaf, and other modular Java platforms
• Ensure compliance with data security, access control, and audit requirements
• Create Design and Build documentation derived from customer requirements
• Own the creation and … of software design documentation
• Lead on Software Change Control Packaged Releases
• Generate release notes

Required Experience:
• Strong Java development experience, especially in modular or distributed systems
• Familiarity with OSGi, Apache Karaf, and the DDF architecture
• Experience with REST APIs, secure data handling, and geospatial data formats
• Experience with build tools (Maven), version control (Git), and CI/CD pipelines …
… of a forward-thinking company where data is central to strategic decision-making. We’re looking for someone who brings hands-on experience in streaming data architectures, particularly with Apache Kafka and Confluent Cloud, and is eager to shape the future of scalable, real-time data pipelines. You’ll work closely with both the core Data Engineering team and the Data Science function, bridging the gap between model development and production-grade data infrastructure.

What You’ll Do:
• Design, build, and maintain real-time data streaming pipelines using Apache Kafka and Confluent Cloud.
• Architect and implement robust, scalable data ingestion frameworks for batch and streaming use cases.
• Collaborate with stakeholders to deliver high-quality, reliable datasets to live …

… experience in a Data Engineering or related role.
• Strong experience with streaming technologies such as Kafka, Kafka Streams, and/or Confluent Cloud (must-have).
• Solid knowledge of Apache Spark and Databricks.
• Proficiency in Python for data processing and automation.
• Familiarity with NoSQL technologies (e.g., MongoDB, Cassandra, or DynamoDB).
• Exposure to machine learning pipelines or close collaboration …
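Much of the streaming work this role describes reduces to windowed aggregation over an unbounded stream of keyed, timestamped events. A library-free Python sketch of tumbling-window counting; the event shape and 60-second window are invented, standing in for what Kafka Streams or Spark Structured Streaming would do over real Kafka records:

```python
from collections import defaultdict

WINDOW_SECONDS = 60

def tumbling_window_counts(events):
    """Count events per key per 60-second tumbling window.

    Each event is a (timestamp_seconds, key) pair, a stand-in for a
    Kafka record's event time and message key.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "clicks"), (42, "clicks"), (61, "clicks"), (70, "views")]
print(tumbling_window_counts(events))
# {(0, 'clicks'): 2, (60, 'clicks'): 1, (60, 'views'): 1}
```

A production pipeline adds what this sketch omits: consuming from partitioned topics, handling late and out-of-order events with watermarks, and emitting results back to a sink topic or table.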
… evolution of our technical stack through the implementation and adoption of new technologies. You will report to the leadership within the Persistence Infrastructure Team.

Your Impact:
• Provisioning and maintaining Apache Pulsar infrastructure on Kubernetes for Event Driven Architecture
• Developing and deploying software and tools for managing the lifecycle of persistence services, such as Kubernetes operators, configuration management tools, shell … hardening activities
• Developing automation to remove manual tasks
• Developing and maintaining observability dashboards and alerting
• Collaborating with Software Engineers and users across the business

Your Qualifications:
• Operational experience with Apache Pulsar or Kafka
• Experience working with Kubernetes
• Experience in Linux system administration
• Familiarity with CI/CD pipeline tooling
• Comfortable with scripting for automation

Preferred Skills:
• Software development skills …
Sheffield, Yorkshire, United Kingdom Hybrid / WFH Options
Reach Studios Limited
… Azure etc.)

What You'll Need
Must-haves:
• Comprehensive experience in a DevOps or SRE role, ideally in a multi-project environment
• Deep experience with web stacks: Nginx/Apache, PHP-FPM, MySQL, Redis, Varnish, Elasticsearch
• Proven expertise in managing and optimising Cloudflare across DNS, security, performance, and access
• Experience with Magento 2 infrastructure and deployment
• CI/CD …
… like GPT, DALL-E, and Stable Diffusion for real-world applications.
• Build advanced ML solutions for classification, recommendations, and more.
• Create scalable AI pipelines using tools like MLflow and Apache Spark.
• Collaborate with teams to design AI systems for diverse industries.

What You’ll Bring:
• 4+ years of AI/ML experience, including Databricks expertise.
• Proficiency in Python and …
Newcastle Upon Tyne, Tyne And Wear, United Kingdom
Endeavour Recruitment Solutions
… experience. Experience will include a solid mix of:
• At least 3 years' experience of using, building and maintaining Solaris, AIX, HP-UX, SUSE or RedHat
• RDBMS (Oracle)
• Application Server (Apache, JBoss, Tomcat, WebLogic)
• Networking fundamentals (Firewalls, Routers, TCP/IP, Load Balancers)
• High Availability and Clustering
• UNIX/Linux Scripting
• Veritas Volume Manager and File System Management
• Virtualisation technologies …
Sheffield, Yorkshire, United Kingdom Hybrid / WFH Options
Reach Studios Limited
… on PHP platforms and frameworks, with web applications utilising the power of Laravel for back-end processes. Front-ends are built in Alpine.js/React/Next.js, rendered by Apache or NGINX servers. Both REST and GraphQL APIs are utilised to integrate and extend the services we provide to our array of clients. We also partner with multiple server …