Verve Graphic Design & Marketing Ltd, 1 Darwin Court, Clayton Way, Oxon Business Park, Bicton Heath, Shrewsbury, England
VERVE GRAPHIC DESIGN & MARKETING LTD
experience and knowledge of frontend technologies: CSS, HTML, JavaScript, jQuery, Bootstrap · Good experience and knowledge of backend technologies: PHP, MySQL · Basic understanding of SSH, Apache, Nginx, Git, Magento module development, WordPress plugin development · The ability to build code into CMS systems such as WordPress & Magento · Strong developer's mindset …
designing and building robust, scalable, distributed data systems and pipelines, using open source and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). … Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments: e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices: e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, git/version control, containers etc. Strong …
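Orchestration tools like the Airflow and Dagster mentioned above boil down to running tasks in dependency order. A minimal, dependency-free sketch of that idea using only the standard library; all task names here are hypothetical, not from any listing:

```python
# Minimal sketch of what an orchestrator (e.g. Airflow/Dagster) schedules:
# a set of tasks plus dependencies, executed in topological order.
# Pure stdlib (Python 3.9+); all task names are hypothetical.
from graphlib import TopologicalSorter

ran = []
tasks = {
    "extract": lambda: ran.append("extract"),
    "transform": lambda: ran.append("transform"),
    "load": lambda: ran.append("load"),
}
# Each key depends on the set of upstream tasks in its value.
deps = {"transform": {"extract"}, "load": {"transform"}}

for name in TopologicalSorter(deps).static_order():
    tasks[name]()          # run each task only after its upstreams

print(ran)   # ['extract', 'transform', 'load']
```

Real orchestrators add scheduling, retries and backfills on top of this core, but the dependency-graph model is the same.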
Potters Bar, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Senitor Associates Limited
level of technical support and customer service to clients, and potentially to our in-house team. Responsibilities and technologies: Server Administration - Linux servers running Apache/PHP; understanding of WHM and cPanel for management of the server; 301 Redirects; Domain Management - transferring domains in and out; DNS Management - making sites …
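The 301-redirect duty above typically comes down to a few lines of Apache configuration. A minimal sketch, assuming mod_rewrite is enabled; the domain names are placeholders, not from the listing:

```apacheconf
# Hypothetical example: permanently redirect all traffic on an old
# domain to a new one. Domain names are placeholders.
<VirtualHost *:80>
    ServerName old-example.co.uk
    RewriteEngine On
    RewriteRule ^(.*)$ https://www.new-example.co.uk$1 [R=301,L]
</VirtualHost>
```

For a whole-site move with no URL rewriting, the simpler `Redirect 301 / https://www.new-example.co.uk/` directive achieves the same result.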
for improvements to the system and processes. What you'll need to succeed: Windows Server, Microsoft System Centre (SCOM, SCCM, SCSM), SQL Server, RabbitMQ, Apache ZooKeeper, Elasticsearch, Cassandra. Desirable skills: Windows/Linux knowledge, interest in automation (tools around it: PowerShell/Ansible Tower/GitLab). What you'll get …
specialized functionalities within WordPress sites. • Familiarity with caching mechanisms like Varnish and with transitioning processes in-house • Competence in configuring and maintaining server environments on Apache/Nginx with PHP 8.2 • Ability to manage multiple environments (production, staging, development) using Git branches (main, staging, dev) • Knowledge of Node.js for tasks …
a good understanding of CVEs and their remediation, with a good knowledge of security. You must also have good experience with configuration parameters in Apache & Tomcat, administrative security settings in AIX, Solaris, RHEL and common infrastructure, including Active Directory, GPO and Kerberos. Please apply ASAP to discuss further.
experience in web hosting, cloud services and load balancing. The Requirements: Practical experience with load balancing: F5/Cloudflare. Practical web hosting experience with Apache/Tomcat. Strong Linux system admin experience; proven understanding of technologies like BIND DNS, SMTP. Container management experience: Docker/EC2. Good experience with …
data engineering or a similar role. > Proficiency in programming languages such as Python, Java, or Scala. > Strong experience with data processing frameworks such as Apache Spark, Apache Flink, or Hadoop. > Hands-on experience with cloud platforms such as AWS, Google Cloud, or Azure. > Experience with data warehousing technologies …
learning management systems or content management systems) Strong knowledge of customer centric service management processes Experience with web hosting platforms and security standards (e.g. Apache) Demonstrated ability to adapt to an ever-changing technical landscape. Extensive experience of working with a diverse range of stakeholders and external partners to …
Certified Solutions Architect, AWS Certified Data Analytics Specialty, or AWS Certified Big Data Specialty. Experience with other big data and streaming technologies such as Apache Spark, Apache Flink, or Apache Beam. Knowledge of containerization and orchestration technologies such as Docker and Kubernetes. Experience with data lakes, NoSQL …
comfortable designing and constructing bespoke solutions and components from scratch to solve the hardest problems. Adept in Java, Scala, and big data technologies like Apache Kafka and Apache Spark, they bring a deep understanding of engineering best practices. This role involves scoping and sizing, and indeed estimating and … be considered. Key responsibilities of the role are summarised below Design and implement large-scale data processing systems using distributed computing frameworks such as Apache Kafka and Apache Spark. Architect cloud-based solutions capable of handling petabytes of data. Lead the automation of CI/CD pipelines for …
of the company's data infrastructure. You will work with some of the most innovative tools in the market, including Snowflake, AWS (Glue, S3), Apache Spark, Apache Airflow and DBT! The role is hybrid, with 2 days in the office in central London, and the company is offering …
workplace where each employee's privacy and personal dignity is respected and protected from offensive or threatening behaviour, including violence and sexual harassment. Role: Apache Spark Application Developer. Skills Required: Hands-on experience as a software engineer in a globally distributed team working with the Scala and Java programming languages (preferably …
cloud-based data storage technologies such as Google BigQuery, Amazon S3, and Redshift. Hands-on experience with data processing frameworks and tools such as Apache Spark, Apache Beam, and TensorFlow. Proficiency in programming languages such as Python, Java, or Scala. Solid understanding of data modeling concepts and database …
Birmingham, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark would be a plus; Airflow would be a bonus. Role overview: If you're looking to work with a team …
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark would be a plus; Airflow would be a bonus. Role overview: If you're looking to work with a team …
Ability to design and implement data warehousing solutions using Azure Synapse Analytics. Azure Databricks: Proficiency in using Azure Databricks for data processing and analytics. Apache Spark: Deep understanding of Apache Spark for large-scale data processing. Azure Blob Storage and Azure Data Lake Storage: Expertise in setting up …
development (ideally AWS) and container technologies Strong communication and interpersonal skills Experience managing projects and working with external third party teams Ideally experience with Apache Spark or Apache Flink (but not essential) Please note, this role is unable to provide sponsorship. If this role sounds of interest and …
system. · Significant experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise … scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience with Linux and containerisation. What you’ll get in return: · Competitive base salary · Up to 20% bonus · 25 days holiday · BAYE, SAYE & Performance share schemes · 7% pension · Life Insurance · Work Away Scheme · Flexible benefits package · Excellent staff travel benefits …
Scala, Kotlin, Spark, Google Pub/Sub, Elasticsearch, BigQuery, PostgreSQL, Kubernetes, Docker, Airflow. Key Responsibilities: Designing and implementing scalable data pipelines using tools such as Apache Spark, Google Pub/Sub etc. Optimising data storage and retrieval systems for maximum performance using both relational and NoSQL databases. Continuously monitoring and improving … Data Infrastructure projects, as well as designing and building data intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark. Expert knowledge in one or more of the following languages - Python, Scala, Java, Kotlin. Deep knowledge of data modelling, data access, and data …
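The pipelines described above follow a read → transform → write shape regardless of engine. A dependency-free sketch of that shape in plain Python; in production the same stages would be distributed Spark jobs fed by Pub/Sub and landing in BigQuery or PostgreSQL, and every name below is hypothetical:

```python
# Dependency-free sketch of the read -> transform -> write pipeline shape.
# In the stack described, extract would consume Pub/Sub messages, transform
# would be a distributed Spark job, and load would write to BigQuery/Postgres.
# All names are hypothetical.
from dataclasses import dataclass

@dataclass
class Event:
    user_id: str
    amount: float

def extract(raw_rows):
    """Parse raw rows (stand-in for Pub/Sub messages) into typed events."""
    return [Event(user_id=r["user_id"], amount=float(r["amount"]))
            for r in raw_rows]

def transform(events):
    """Aggregate spend per user -- the kind of step Spark would distribute."""
    totals = {}
    for e in events:
        totals[e.user_id] = totals.get(e.user_id, 0.0) + e.amount
    return totals

def load(totals, sink):
    """Write results to a sink (stand-in for a warehouse table)."""
    sink.update(totals)
    return sink

raw = [{"user_id": "a", "amount": "3.5"},
       {"user_id": "b", "amount": "1.0"},
       {"user_id": "a", "amount": "0.5"}]
warehouse = {}
load(transform(extract(raw)), warehouse)
print(warehouse)   # {'a': 4.0, 'b': 1.0}
```

Keeping each stage a pure function of its input is what makes the same logic easy to port between a local script and a distributed engine.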
so any knowledge of cross domain solutions or air-gapped environments is a plus. AWS as initial hosting provider. Containerised apps using Docker and Kubernetes. Apache Jena, Elastic, PostGIS, Kafka, Apache NiFi, AWS Cognito. HTTP REST, GraphQL, SPARQL interfaces. Web apps based on HTML/CSS/JavaScript frameworks …
Experienced creating data pipelines in a cloud (preferably AWS) environment. CI/CD experience. Containerization experience (Docker, Kubernetes, etc.). Experience with SQS/SNS, Apache Kafka, RabbitMQ. Other interesting/bonus skills – Airflow, Trino, Apache Iceberg, Postgres, MongoDB. You *must* be eligible to work in your chosen country.