City of London, London, United Kingdom Hybrid / WFH Options
GCS Ltd
harnessing diverse AWS services. Key Requirements: High level of experience in both SQL and Python programming (10+ years) Experience managing data engineering pipelines using Apache Airflow Proficiency in CI/CD pipelines and automation Git proficiency for version control (branching strategies and repo management) Competent in monitoring tools such more »
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
and practices and tools like Jira and Confluence. What technical skills you will have Experience with general Cloud products (Cloud SQL, BigQuery, RedShift, Snowflake, Apache Beam, Spark) or similar products. Experience with open-source data-stack tools such as Airflow, Airbyte, dbt, Kafka etc. Awareness of data visualisation tools more »
to production, providing subject matter expertise on the .Net stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool more »
Luton, England, United Kingdom Hybrid / WFH Options
Ventula Consulting
models and following best practices. The ability to develop pipelines using SageMaker, MLflow or similar frameworks. Strong experience with data programming frameworks such as Apache Spark. Understanding of common Data Science and Machine Learning models, libraries and frameworks. This role provides a competitive salary plus excellent benefits package. In more »
Greater London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work) Have public cloud experience with AWS or other cloud providers Have an understanding of Apache products such as Kafka and Flink Good knowledge of development using CI/CD Bonus points if you have knowledge of: Web products Financial markets more »
for seamless data integration. * Understanding of DevOps best practices for SQL and Power BI projects, including DACPAC, CI/CD, and versioning. * Familiarity with Apache Spark for big data processing. * Additional development experience in Python or related technologies. * Experience gained within the Media, Travel or Broadcast sectors would more »
Employment Type: Permanent
Salary: £65000 - £70000/annum Hybrid, Health, Dental, Extra Hols
Skills & Experience At least 10 years' experience working with JavaScript or Python/Java Previous experience deploying Software into the Cloud EKS, Docker, Kubernetes Apache Spark or NiFi Microservice architecture experience Experience with AI/ML systems more »
Leeds, England, United Kingdom Hybrid / WFH Options
Harvey Nash
websites and web apps using HTML, PHP, JavaScript Full stack development, Bootstrap, SQL Best practice PHP with an emphasis on secure development practices Linux, Apache/Nginx, PostgreSQL/MySQL, Bootstrap stack Creating scalable, clean and resilient solutions through code Version control through Git to manage the codebase efficiently more »
Salisbury, Wiltshire, South West, United Kingdom Hybrid / WFH Options
Sopra Steria Limited
bring: Good knowledge of application installation and troubleshooting on Linux-based systems (CentOS/Red Hat Linux/Rocky 8) Web Layer experience such as Apache, Tomcat Shell Scripting/BASH, Python - Knowledge of scripts for automation Git, Bitbucket, Jira, Azure DevOps/TFS Appreciation for security (standard methodology and more »
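The scripting-for-automation work this listing describes (Python alongside a web layer such as Apache or Tomcat) might look like the sketch below: a small, stdlib-only script that tallies server errors per path from Apache combined-format access-log lines. The log format and sample lines are assumptions for illustration, not taken from the listing.

```python
import re
from collections import Counter

# Matches the start of an Apache common/combined log line:
# client, identd, user, [timestamp], "METHOD path ...", status code.
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)[^"]*" (\d{3})')

def count_errors(lines):
    """Tally HTTP 5xx responses per request path from access-log lines.
    (Illustrative automation sketch; assumes Apache combined format.)"""
    errors = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group(4).startswith("5"):
            errors[m.group(3)] += 1  # group(3) is the request path
    return errors

# Made-up sample lines in Apache combined format.
sample = [
    '10.0.0.1 - - [01/Jan/2024:00:00:01 +0000] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Jan/2024:00:00:02 +0000] "GET /api/data HTTP/1.1" 500 128',
    '10.0.0.3 - - [01/Jan/2024:00:00:03 +0000] "GET /api/data HTTP/1.1" 502 64',
]
print(count_errors(sample))  # → Counter({'/api/data': 2})
```

In practice a script like this would be scheduled via cron or a systemd timer, which is the kind of routine operational automation the role alludes to.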
highest standards in database design and operations. The Person We're looking for someone who is: Experienced as a Data Engineer, demonstrating proficiency in Apache Spark and cloud-based technologies, especially Microsoft Azure and Databricks. Skilled in programming, particularly Python, and familiar with data integration tools and ETL frameworks. more »
You will require a blend of the following: - Strong SQL Query experience Knowledge of Azure Data Factory is desirable Python Development knowledge Knowledge of Apache Airflow. Experience with enterprise DBMS and, ideally, Google BigQuery. Experience in Google Cloud Platform services. Any knowledge of Power BI and Tableau is useful. more »
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes. more »
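The pipeline work described above (Airflow orchestrating Python and SQL transforms that load into BigQuery) typically centres on small, testable transform functions wired into Airflow tasks. Below is a minimal, dependency-free sketch of such a transform step; the function name, CSV schema, and field rules are illustrative assumptions, not details from the listing.

```python
import csv
import io

def transform_rows(raw_csv: str) -> list:
    """Illustrative transform step for a data pipeline: parse raw CSV,
    drop incomplete records, and normalise types before the load stage.
    (In an Airflow DAG this would run inside a PythonOperator; the
    schema here is a made-up example.)"""
    rows = []
    for record in csv.DictReader(io.StringIO(raw_csv)):
        if not record.get("user_id") or not record.get("amount"):
            continue  # enforce data quality: skip incomplete records
        rows.append({
            "user_id": record["user_id"].strip(),
            "amount": round(float(record["amount"]), 2),  # normalise to 2 dp
        })
    return rows

# Made-up input: one clean row, one missing user_id, one unrounded amount.
raw = "user_id,amount\n42,19.991\n,5.00\n7,3.5\n"
print(transform_rows(raw))
# → [{'user_id': '42', 'amount': 19.99}, {'user_id': '7', 'amount': 3.5}]
```

Keeping the transform pure like this makes it easy to unit test outside the scheduler, which is one common way teams maintain data quality in Airflow-based pipelines.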
Crewe, Cheshire, United Kingdom Hybrid / WFH Options
OCC Computer Personnel
Kubernetes/Docker or other container technologies. Scripting skills including Python and Bash. Strong Linux systems administration, and Linux and database technologies such as Ubuntu, Apache, PHP, MySQL, PostgreSQL, Nginx, Mercurial and Git This is an exciting opportunity where you will be involved in planning and implementing system migrations, modernisations more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, Dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry more »
London, England, United Kingdom Hybrid / WFH Options
Austin Fraser
a plus: Cutting-Edge Tech: Experience with containerisation, Kubernetes, and observability platforms. Workflow Wizardry: Familiarity with data orchestration tools like Airflow and ETL with Apache Beam. Data Visionary: Knowledge of DataVault (DV2) and data management concepts. Location: Our opportunities are available in London Victoria and Bracknell. Choose the work more »
in: Building a modular Kubernetes-centric platform, with Pulumi, Terraform, and Argo. Implementing service mesh and configuration management for microservices. Operating critical infrastructure like Apache Pulsar or Kafka and Keycloak. Developing a multi-Cloud approach supporting Azure, Alibaba, and GCP. Implementing collection, dashboards, and alerts for logs and metrics. more »
London, England, United Kingdom Hybrid / WFH Options
Parkopedia
of web development principles (HTTP, RESTful APIs) and data structures Experience with data retrieval, transformation, and manipulation techniques using Python and tools such as Apache Airflow Commercial experience with AWS and IaC (Terraform/CDK/CloudFormation) Applicable understanding of API security, common exploits and secure development practices, including more »