business context. Commercially minded, thinking about ways to increase revenue and profitability. Proficiency in data manipulation tools (Python, Pandas, Spark, SQL) and data visualization tools (Apache Superset, Tableau, Power BI, ggplot2) and MS Excel. Grasp of pricing strategies, market dynamics, and consumer behaviour in the online space is a plus. …
for seamless data integration. * Understanding of DevOps best practices for SQL and Power BI projects, including DACPAC, CI/CD, and versioning. * Familiarity with Apache Spark for big data processing. * Additional development experience in Python or related technologies. * Experience gained within the Media, Travel or Broadcast Media sectors would …
Employment Type: Permanent
Salary: £65000 - £70000/annum Hybrid, Health, Dental, Extra Hols
EC2N, Broad Street, Greater London, United Kingdom
James Joseph Associates
and team effectiveness Participating in Client Service and Technology meetings Handling production incidents and related problem management activities TECH STACK: Linux, Apache, MySQL, OO Perl AWS CloudWatch Monitoring and Alerting Systems Ticketing/workflow Systems (e.g. Jira) KEY SKILLS/EXPERIENCE REQUIRED: Good degree in Computer …
Python client for Google BigQuery Advanced SQL (GoogleSQL, MySQL) Google Cloud Services Advanced BigQuery Advanced Google Cloud Storage Google Dataform Google Cloud Functions Advanced Apache Airflow Basic Tableau: ability to create basic visualisations Ability to integrate multiple data sources and databases into one system Able to create database schemas …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Wyoming Interactive
and experienced in Front-end tools/compilers Experience with database management (MySQL, Aurora, AWS RDS configuration), git source control, and web server configuration (Apache, Nginx). Familiarity with Docker for development environments and CI/CD pipelines. Infrastructure and Cloud Services: Experience designing and managing infrastructure in AWS …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Wyoming Interactive
and experienced in Front-end tools/compilers Experience with database management (MySQL, Aurora, AWS RDS configuration), git source control, and web server configuration (Apache, Nginx). Familiarity with Docker for development environments and CI/CD pipelines. Infrastructure and Cloud Services: Experience designing and managing robust, high availability …
development of technical solutions, including websites, REST APIs, SDKs, using .NET (C#/JavaScript) and/or Java; experience with hosting platforms, e.g. IIS, Apache, Nginx. Experience with database development and/or administration using SQL Server (preferred), MySQL or PostgreSQL. About working for us: We're on an …
are in compliance with specifications. Should understand the Banking domain. Should have Core Banking knowledge. Familiarity with databases (e.g. MySQL, MongoDB), web servers (e.g. Apache) and UI/UX design. Excellent communication and teamwork skills. …
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies (AWS, GCP, Azure). Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong programming skills in software development. A cloud-agnostic mindset, with …
experience in data engineering. Experienced in building ETL data pipelines. Relational database experience with PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming. A desire to work in the commodities/trading sector. Permanent/Full-Time Employment. Hybrid …
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes.
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry …
as Hadoop and Spark. Experience with data warehousing technologies such as Redshift, Snowflake, or BigQuery. Experience with data pipeline and ETL tools such as Apache NiFi, Airflow, or Glue. Knowledge of data governance and security best practices. Strong problem-solving and analytical skills. Ability to work well in a …
. • Troubleshooting network issues (tcpdump/Wireshark). • Scripting capabilities (sh/Bash/Python/Perl). • Configuration of common services (DNS/Apache/NGINX/Postfix/Squid/SSH/iptables). • Understanding of clustering services, enabling High Availability failover. • Experience with enterprise hardware and …
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to Have Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure What will be your key responsibilities? A technical expert and leader on the Petcare …
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Microlise
computing concepts and experience working with hybrid or private cloud platforms is a plus Demonstrable technical experience working within Microsoft, Red Hat, and Apache data and software engineering environments A team-oriented individual with a passion for engineered excellence and the ability to lead and motivate a team …
London, England, United Kingdom Hybrid / WFH Options
Element Materials Technology
in: Building a modular Kubernetes-centric platform, with Pulumi, Terraform, and Argo. Implementing service mesh and configuration management for microservices. Operating critical infrastructure like Apache Pulsar or Kafka and Keycloak. Developing a multi-Cloud approach supporting Azure, Alibaba, and GCP. Implementing collection, dashboards, and alerts for logs and metrics.
Drupal Magento BigCommerce Laravel Proficient in setting up development and staging environments. Proficient in using and altering MySQL databases. Familiarity with server structures, specifically Apache and Nginx. Familiarity with domain management and DNS records. Familiarity with Agile and Waterfall work environments. Nice to haves: Familiarity with digital marketing/…
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Maclean Moore
basis. ROLE: GCP DATA ENGINEER LOCATION: NEWPORT OR CARDIFF (HYBRID) IR35 STATUS: INSIDE LENGTH: 6 MONTHS Required experience: Expertise in Python and Dataflow/Apache Beam Experience in handling streaming data Strong experience in database replication using message-based CDC Experience in using Kafka implementations in a secured cloud …
MS Azure in particular is recommended, as Microsoft Fabric is integrated within Azure services. Experience of designing robust, secure and compliant capabilities. Strong understanding of Apache Spark, including its architecture, components, and how to create, monitor, optimize, and scale Spark jobs. Experienced working in a DevOps/Agile team. Experience …
development, deployment of large-scale data streaming pipelines in GCP. Work on a Data Streaming POC. Experience required: Expertise in Python and Dataflow/Apache Beam Experience in handling streaming data Strong experience in database replication using message-based CDC Experience in using Kafka implementations in a secured cloud …
South East London, England, United Kingdom Hybrid / WFH Options
Hunter Bond
s Degree in Computer Science, Engineering (or other related STEM subject). 5+ years' experience in data engineering; 2+ years in a leadership role. Experience working with Apache Spark, Azure Data Factory and other data pipeline tools. Strong programming skills. Impeccable communication skills. Precise attention to detail. Pioneering attitude. If you are a Lead Data Engineer and …