Birmingham, West Midlands (County), United Kingdom
Workday
structures. Experience of REST API development, Docker, and Kubernetes. Familiarity with IntelliJ, Subversion and Maven. Exposure to one or more of the following technologies: Apache Storm, OpenSearch, Cassandra and Kafka. Ability to work within a hybrid Agile methodology. Understand the design and development approaches required to build a scalable …
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google Cloud …
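The basic Python skills named in the listing above (os, csv) amount to the kind of small file-transformation task sketched below. This is an illustrative example only; the file names and the `amount` column are hypothetical, not from the listing:

```python
import csv
import os
import tempfile

def filter_rows(in_path, out_path, min_amount):
    """Copy rows whose 'amount' column (hypothetical schema) is at least min_amount."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if float(row["amount"]) >= min_amount:
                writer.writerow(row)

# Usage: build a temp CSV, filter it, and read back the result.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "in.csv")
dst = os.path.join(tmp, "out.csv")
with open(src, "w", newline="") as f:
    f.write("id,amount\n1,5.0\n2,50.0\n3,12.5\n")
filter_rows(src, dst, 10.0)
with open(dst, newline="") as f:
    kept = list(csv.DictReader(f))
```

In a real pipeline the same filtering would more likely be done with pandas or in SQL, but the shape of the task is the same.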
workplace where each employee's privacy and personal dignity is respected and protected from offensive or threatening behaviour, including violence and sexual harassment. Role: Apache Spark Application Developer Skills Required: Hands-on experience as a software engineer in a globally distributed team working with the Scala and Java programming languages (preferably …
system. · Significant experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise … scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience with Linux and containerisation What you’ll get in return · Competitive base salary · Up to 20% bonus · 25 days holiday · BAYE, SAYE & Performance share schemes · 7% pension · Life Insurance · Work Away Scheme · Flexible benefits package · Excellent staff travel benefits …
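The "comfortable writing efficient SQL" requirement above usually comes down to knowing how indexes change a query plan. A minimal sketch using Python's built-in sqlite3 module — the table and column names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO trades (symbol, qty) VALUES (?, ?)",
    [("ABC", 10), ("XYZ", 5), ("ABC", 7)],
)

# Indexing the filter column lets the planner seek the index
# instead of scanning every row of the table.
conn.execute("CREATE INDEX idx_trades_symbol ON trades (symbol)")

total = conn.execute(
    "SELECT SUM(qty) FROM trades WHERE symbol = ?", ("ABC",)
).fetchone()[0]

# EXPLAIN QUERY PLAN shows whether the index is actually used.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(qty) FROM trades WHERE symbol = ?", ("ABC",)
).fetchall()
```

The same habit — checking the plan, not just the result — carries over to DB2, SQL Server, or any other engine mentioned in these listings.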
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday and …
of the curve with emerging tech trends. As a Linux Engineer - Is this you? 🧐 Proficient with Linux OS, kernel tweaks, and web server management - Apache - NGINX Versed in server hardware from giants like HP and Dell. Scripting experience in Python or Ansible. Network savvy with the veins of tech …
management and data governance open source platform that we will teach you. Other technologies in use in our space: RESTful services, Maven/Gradle, Apache Spark, Big Data, HTML5, AngularJS/ReactJS, IntelliJ, GitLab, Jira. Cloud Technologies: You’ll be involved in building the next generation of finance systems …
explain and present the findings of technical work to non-expert audiences Fluency with Python machine learning and data science packages (pandas, scikit-learn, Apache Spark, Dask, TensorFlow, etc.) or experience with programming languages and willingness to learn Python For engineering, experience in a DevOps role, ideally in a …
experience in web hosting, cloud services and load balancing. The Requirements: Practical experience with load balancing; F5/Cloudflare Practical web hosting experience with Apache/Tomcat Strong Linux system admin experience; proven understanding of technologies like BIND DNS, SMTP Container management experience; Docker/EC2 Good experience with …
Birmingham, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark is a plus; Airflow would be a bonus. Role overview: If you're looking to work with a team …
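The distributed computing frameworks mentioned in these listings (Apache Spark in particular) are built around the map/reduce pattern. A single-process sketch of that pattern in plain Python — this is a stand-in for illustration, not actual Spark code, and the input lines are made up:

```python
from collections import Counter
from functools import reduce

lines = [
    "spark makes distributed processing simple",
    "distributed processing scales out",
]

# "Map" each line to its own word counts, then "reduce" by merging the
# partial counts -- the same shape as Spark's map/reduceByKey, minus the cluster.
mapped = [Counter(line.split()) for line in lines]
word_counts = reduce(lambda a, b: a + b, mapped, Counter())
```

In Spark the `mapped` step would run on many executors in parallel and the merge would happen in a shuffle; the logic per record is unchanged.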
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark is a plus; Airflow would be a bonus. Role overview: If you're looking to work with a team …
developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or Power BI Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds …
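Pipeline tools like the Apache Airflow named above execute tasks as a DAG in dependency order. The sketch below mimics that scheduling with the standard library's `graphlib` instead of Airflow itself; the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL pipeline: extract feeds both transform and validate,
# and load may only run once both of those are done.
dag = {
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

# static_order() yields a valid execution order for the whole DAG.
order = list(TopologicalSorter(dag).static_order())
```

An Airflow DAG file expresses the same edges with operators and `>>` dependencies, and the scheduler adds retries, backfills, and parallel execution on top.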
Scala Kotlin Spark Google PubSub Elasticsearch BigQuery, PostgreSQL Kubernetes, Docker, Airflow Key Responsibilities Designing and implementing scalable data pipelines using tools such as Apache Spark, Google PubSub etc. Optimising data storage and retrieval systems for maximum performance using both relational and NoSQL databases. Continuously monitoring and improving … Data Infrastructure projects, as well as designing and building data intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark Expert knowledge in one or more of the following languages - Python, Scala, Java, Kotlin Deep knowledge of data modelling, data access, and data …
Basingstoke, England, United Kingdom Hybrid / WFH Options
Intec Select
cross-functionally across the business to understand the requirements of the products Designing and implementing performance-related data ingestion pipelines from multiple sources using Apache Spark Integrating end-to-end data pipelines, ensuring a high level of quality is maintained Working with an Agile delivery/DevOps methodology to …
Ability to design and implement data warehousing solutions using Azure Synapse Analytics. Azure Databricks: Proficiency in using Azure Databricks for data processing and analytics. Apache Spark: Deep understanding of Apache Spark for large-scale data processing. Azure Blob Storage and Azure Data Lake Storage: Expertise in setting up …
Experience Required Excellent knowledge of HTML, CSS, PHP, and MySQL. Ability to optimise code and database queries for speed. jQuery, JavaScript, Java, Python. Linux, Apache, SVN. Kotlin, Swift, SVG animation. Video editing and web design. Interested? For more information on this PHP Web Developer opportunity, apply directly to this …
Rickmansworth, Hertfordshire, South East, United Kingdom
Mobilize Financial Services
build, operate and manage a complex production environment. Familiarity with Red Hat-based Linux versions Experience of web application server architectures, security, protocols and technologies (Apache Web Server, HAProxy, Tomcat), configuration and optimization Understanding of DR/BCP business processes Comfortable liaising with business users as well as technical teams …
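For the Apache Web Server/Tomcat configuration point above, the typical arrangement is httpd fronting Tomcat as a reverse proxy. A minimal, hypothetical virtual host sketch — host name and ports are placeholders, and `mod_proxy`/`mod_proxy_http` must be enabled for it to work:

```apache
<VirtualHost *:80>
    ServerName app.example.com

    # Forward all requests to a Tomcat HTTP connector on localhost:8080,
    # rewriting Location headers in responses back to the public host.
    ProxyPreserveHost On
    ProxyPass        "/" "http://127.0.0.1:8080/"
    ProxyPassReverse "/" "http://127.0.0.1:8080/"
</VirtualHost>
</VirtualHost syntax shown for httpd 2.4; production setups would add TLS and timeouts.>
```

The last line above is a comment placeholder only; in a real config the block ends at `</VirtualHost>` and TLS, logging, and proxy timeouts would be layered on separately.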
learning management systems or content management systems) Strong knowledge of customer-centric service management processes Experience with web hosting platforms and security standards (e.g. Apache) Demonstrated ability to adapt to an ever-changing technical landscape. Extensive experience of working with a diverse range of stakeholders and external partners to …
Terraform/Docker/Kubernetes. Write software using either Java/Scala/Python. The following are nice to have, but not required - Apache Spark jobs and pipelines. Experience with any functional programming language. Database design concepts. Writing and analysing SQL queries. Application over … VIOOH Our recruitment team will …
Greater London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work) Have public cloud experience with AWS or other cloud providers Have an understanding of Apache products such as Kafka and Flink Good knowledge of development using CI/CD Bonus points if you have knowledge of: Web products Financial markets …
Kubernetes. Other tech components used: Google Cloud App Engine, Cloud Functions, Cloud Endpoints, Kubernetes Engine, Compute Engine, Cloud SQL, Cloud Spanner and BigQuery, Dataflow, Apache Beam, DialogFlow Architecture: Microservices patterns, Event-driven architectures and message queues, Relational and non-relational databases If you are interested in a confidential conversation …
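The event-driven architecture and message-queue items above share one core pattern: producers publish events to a queue, and consumers process them asynchronously. An in-process stand-in using the standard library — a real system would use Pub/Sub, Kafka, or similar, and the event names here are invented:

```python
import queue
import threading

events = queue.Queue()
processed = []

def consumer():
    # Drain events until the producer signals completion with a None sentinel.
    while True:
        event = events.get()
        if event is None:
            break
        processed.append(event.upper())

t = threading.Thread(target=consumer)
t.start()

# Producer side: publish events, then signal shutdown.
for name in ["order_created", "order_paid"]:
    events.put(name)
events.put(None)
t.join()
```

The decoupling is the point: the producer never waits on the consumer's work, only on the queue handoff, which is what lets real message brokers absorb traffic spikes.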
. • Troubleshooting network issues (tcpdump/Wireshark). • Scripting capabilities (sh/Bash/Python/Perl). • Configuration of common services (DNS/Apache/NGINX/Postfix/Squid/SSH/iptables). • Understanding of clustering services, enabling High Availability failover. • Experience with enterprise hardware and …
Linux System administration (preferred: Red Hat certification) Kubernetes Ansible Puppet Network analysis: tcpdump, Wireshark Shell Scripting Python Secondary Skills: SaltStack Ansible Puppet Kubernetes Keycloak Apache Python Bash Prometheus Grafana Splunk Responsibility: System Administration: Install, configure, and maintain Linux operating systems on both physical and virtual machines. Shell Scripting: Develop …