Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Hamilton Barnes
Pub/Sub, Dataflow, and BigQuery. Key Responsibilities: Develop Scalable Solutions: Lead the creation of scalable and dependable data streaming solutions on GCP using Apache Kafka and associated technologies. Optimize Kafka Setup: Customize Kafka brokers, topics, partitions, and replication to guarantee the highest performance and reliability of data streams. … Connectors: Apply your expertise to set up Kafka connectors for batch processing, managing both source and sink connectors to seamlessly integrate data. Python and Apache Beam Proficiency: Utilize Python and Apache Beam to craft tailored data processing logic and transformations within pipelines, enabling swift and effective data analysis. … Bring: Hands-On Kafka Configuration: Proven expertise in configuring Kafka connectors for batch processing, optimizing their number for improved performance. Python and Dataflow/Apache Beam Proficiency: Skilled in Python and Dataflow/Apache Beam, adept at developing custom data processing logic within pipelines. Streaming Data Management: Demonstrated more »
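The "tailored data processing logic" such roles describe usually takes the shape of small, per-element transform functions that a Dataflow/Beam pipeline applies (e.g. inside a ParDo or Map step). A minimal stdlib-only sketch of that shape — the record fields and the threshold are illustrative, not taken from the posting:

```python
import json

def parse_event(raw: str) -> dict:
    """Deserialize one message payload (assumed to be JSON)."""
    return json.loads(raw)

def enrich(event: dict) -> dict:
    """Derive a field from the record, as a Beam DoFn might per element."""
    out = dict(event)
    out["high_value"] = event.get("amount", 0) >= 100  # threshold is illustrative
    return out

def run_batch(messages):
    """Apply parse + enrich over a batch, mimicking a pipeline's map stages."""
    return [enrich(parse_event(m)) for m in messages]

batch = ['{"id": 1, "amount": 250}', '{"id": 2, "amount": 40}']
print(run_batch(batch))
```

In a real Beam pipeline the same functions would be wrapped in `beam.Map`/`beam.ParDo`; keeping them as plain functions like this makes the transform logic unit-testable outside the runner.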
Cambridge, Impington, Cambridgeshire, United Kingdom Hybrid / WFH Options
Pure Resourcing Solutions Limited
principles, standards, and best practices. Experience with ISO 27001 is highly advantageous. Knowledge of web hosting technologies highly advantageous, including any of the following: Apache, Nginx, MySQL, MongoDB, Django and PWAs, their rolling updates, red/black deployments and roll-backs. Experience with virtualisation technologies such as Docker and more »
Employment Type: Permanent
Salary: £45000 - £60000/annum Hybrid working - scale-up business
Newport, Gwent, Wales, United Kingdom Hybrid / WFH Options
Maclean Moore Ltd
Developer. ROLE: GCP DATA ENGINEER LOCATION: NEWPORT OR CARDIFF (HYBRID) IR35 STATUS: INSIDE LENGTH: 6 MONTHS Required experience: Expertise in Python and Dataflow/Apache Beam. Experience in handling streaming data. Strong experience in database replication using message-based CDC. Experience in using Kafka implementations in a secured cloud more »
development, deployment of large-scale data streaming pipelines in GCP. Work on data streaming POCs. Experience required: Expertise in Python and Dataflow/Apache Beam. Experience in handling streaming data. Strong experience in database replication using message-based CDC. Experience in using Kafka implementations in a secured cloud more »
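Message-based CDC, as both postings describe, replays a stream of change events (inserts, updates, deletes) captured from a source database against a replica. A toy stdlib-only illustration of the apply side — the event schema here is invented for the example, not a real CDC wire format:

```python
def apply_cdc_event(replica: dict, event: dict) -> None:
    """Apply one change event to an in-memory replica keyed by primary key."""
    op, key = event["op"], event["key"]
    if op in ("insert", "update"):
        replica[key] = event["row"]
    elif op == "delete":
        replica.pop(key, None)

replica = {}
events = [
    {"op": "insert", "key": 1, "row": {"name": "alice"}},
    {"op": "update", "key": 1, "row": {"name": "alicia"}},
    {"op": "insert", "key": 2, "row": {"name": "bob"}},
    {"op": "delete", "key": 2},
]
for e in events:
    apply_cdc_event(replica, e)
print(replica)  # → {1: {'name': 'alicia'}}
```

In production the events would typically arrive on a Kafka topic (e.g. from a connector such as Debezium) and ordering per key is what makes this last-write-wins apply correct.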
technical direction Capacity to work independently or as part of a team and operate to tight deadlines Advantageous But Not Essential: Great knowledge of Apache, specifically mod_rewrite Good working knowledge of Linux, including command line Server administration experience Ecommerce experience Smarty Salary will be £60K - £65K. Send your more »
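mod_rewrite, called out above, is the Apache httpd module for rule-based URL rewriting. A typical ecommerce-style rule set — the paths and parameter names are illustrative only:

```apache
# Assumes mod_rewrite is loaded and AllowOverride permits these directives
RewriteEngine On
# Only rewrite requests that do not match a real file or directory
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Map pretty product URLs to a front controller (illustrative paths)
RewriteRule ^product/([0-9]+)$ index.php?product_id=$1 [L,QSA]
```

The `[L,QSA]` flags stop further rule processing and append the original query string, a common pattern for CMS and shop front controllers.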
opportunity for an experienced DevOps engineer to join our expanding team, managing our servers. You must have performance tuning experience with Linux (preferably CentOS 6.X), Apache 2.X, MySQL 5.X and PHP 5.X. Overview: We currently develop and host a bespoke high-performance e-commerce platform on behalf of our established … running of Aurora Commerce production and development servers What You Need for this Position: Required Technologies with performance tuning experience: Linux - preferably CentOS 6.X Apache 2.X MySQL 5.X PHP 5.X Required Skills: *Solid Linux systems engineering/administration experience *At least 5 years' experience working in a high-traffic, production more »
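Performance tuning on that stack usually starts with the Apache MPM worker pool and keep-alive settings. An illustrative prefork snippet for the Apache 2.2-era setup a CentOS 6 box would run — the numbers are placeholders to be sized against per-process memory and total RAM, not recommendations:

```apache
<IfModule prefork.c>
    # Sizes are illustrative; tune against RSS per child and available RAM
    StartServers         10
    MinSpareServers      10
    MaxSpareServers      20
    MaxClients          150
    MaxRequestsPerChild 10000
</IfModule>
# Short keep-alive holds connections open for assets without tying up workers
KeepAlive On
KeepAliveTimeout 2
```

(In Apache 2.4 `MaxClients` and `MaxRequestsPerChild` were renamed `MaxRequestWorkers` and `MaxConnectionsPerChild`.) MySQL tuning on such a stack typically follows the same pattern of sizing buffers, notably `innodb_buffer_pool_size`, against available memory.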
of the day-to-day role): Virtualisation with KVM/QEMU Scripting skills (PowerShell, Python, Bash, Ruby, Perl, etc.) LAMP (or components of - Linux, Apache, MySQL, PHP) OpenStack (or similar cloud software platform) VMware ESXi Hypervisor Data Modelling tools (XML, NETCONF, YANG, JSON) Must have the ability to break more »
independently or as part of a team and operate to tight deadlines Sense of humour please Advantageous But Not a Must: Great knowledge of Apache, specifically mod_rewrite Good working knowledge of Linux, including command line Server administration experience Ecommerce experience Smarty Competitive Salary Career Growth and Financial Stability more »
Job Description Must have excellent Liferay and Java skills, Spring MVC, Apache CXF, Dozer (a fast and flexible framework for mapping back and forth between Java Beans) & XML: A minimum of 5 years' work experience in Software Development Experience with implementing service-oriented architecture (SOA) Designs and develops Enterprise more »
independently or as part of a team and operate to tight deadlines Sense of humour please Advantageous But Not a Must: Great knowledge of Apache, specifically mod_rewrite Good working knowledge of Linux, including command line Server administration experience Ecommerce experience Smarty Competitive Salary Career Growth and Financial Stability more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry more »
LIN Buses Serial Buses (RS485/RS232, etc.) SPI/I2C Python Go XML JSON HTML CSS Web backend servers (Angular, Django, Node.js, React, Apache or similar) Web Sockets IP video and video routing Familiarity with Systems serving Real Time Information via Web Sockets Use of DDS and interfacing more »
up and learn new technologies and frameworks Nice to have: Knowledge of databases, SQL Familiarity with Boost ASIO Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers Experience with gRPC, HTTP/REST and WebSocket protocols Experience with Google Cloud/AWS and/ more »
Other skills we are looking for you to demonstrate include: Experience of data storage technologies: Delta Lake, Iceberg, Hudi Sound knowledge and understanding of Apache Spark, Databricks or Hadoop Ability to take business requirements and translate these into tech specifications Knowledge of Architecture best practices and patterns Competence in more »
offs explicit and understandable to others REQUIREMENTS 7+ years' coding experience, including 3 years in a dedicated ML Engineering role 2+ years’ experience with Apache Spark Experience working with GB+ scale data Experience with deployed ML services Experience deploying multiple ML projects across different environments Productionisation experience in at more »
Azure Data Lake, Azure Databricks or GCP Cloud Dataproc. Familiarity with big data technologies and distributed computing frameworks, such as Hadoop, Spark, or Apache Flink. Experience scaling an “API-Ecosystem”, designing and implementing “API-First” integration patterns. Experience working with authentication and authorisation protocols/patterns. Other Information more »
structures. Experience of API (REST) development, Docker, and Kubernetes. Familiarity with IntelliJ, Subversion and Maven. Exposure to one or more of the following technologies: Apache Storm, OpenSearch, Cassandra and Kafka. Ability to work within a hybrid Agile methodology. Understand the design and development approaches required to build a scalable more »
Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes) Ability to create more »
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes) Ability to create more »
Python client for Google BigQuery Advanced SQL (GoogleSQL, MySQL) Google Cloud Services Advanced BigQuery Advanced Google Cloud Storage Google Dataform Google Cloud Function Advanced Apache Airflow Basic Tableau: ability to create basic visualisations Ability to integrate multiple data sources and databases into one system Able to create database schemas more »
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Wyoming Interactive
and experienced in Front-end tools/compilers Experience with database management (MySQL, Aurora, AWS RDS configuration), git source control, and web server configuration (Apache, Nginx). Familiarity with Docker for development environments and CI/CD pipelines. Infrastructure and Cloud Services: Experience designing and managing infrastructure in AWS more »
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Wyoming Interactive
and experienced in Front-end tools/compilers Experience with database management (MySQL, Aurora, AWS RDS configuration), git source control, and web server configuration (Apache, Nginx). Familiarity with Docker for development environments and CI/CD pipelines. Infrastructure and Cloud Services: Experience designing and managing robust, high availability more »
Drupal Magento BigCommerce Laravel Proficient in setting up development and staging environments. Proficient in using and altering MySQL databases. Familiarity with server structures, specifically Apache and Nginx. Familiarity with domain management and DNS records. Familiarity with Agile and Waterfall work environments. Nice to haves: Familiarity with digital marketing/ more »