Leeds, England, United Kingdom Hybrid / WFH Options
Harvey Nash
websites and web apps using HTML, PHP, JavaScript. Full-stack development, Bootstrap, SQL. Best-practice PHP with an emphasis on secure development practices. Linux, Apache/Nginx, PostgreSQL/MySQL, Bootstrap stack. Creating scalable, clean and resilient solutions through code. Version control through Git to manage the codebase efficiently
and AI models. Data Engineer Required Experience: Data engineering experience (2+ years). Cloud platform proficiency (e.g., AWS, Azure, GCP). Data pipeline development (e.g., Airflow, Apache Spark). SQL proficiency, database design. Visualization tools knowledge (e.g., Tableau, Power BI, Looker). Data Engineer Application Process: This is a 1-year contract requirement with
to use Java (for a very small amount of scripting work). Have public cloud experience with AWS or other cloud providers. Have an understanding of Apache products such as Kafka and Flink. Good knowledge of development using CI/CD. Bonus points if you have knowledge of: web products, financial markets. They are a very
and team-working skills. Nice to haves: Degree in Computer Science or similar. Experience with NoSQL databases: Mongo, Cassandra, Redis. Real-time streaming: Apache Storm/Kafka Streams. Infrastructure knowledge: Ansible, Puppet, AWS, Kubernetes
problem-solving and communication skills. Proficiency in scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Solid understanding of software development best practices, including version control (Git), testing, and code review processes.
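The DAG model mentioned above is what orchestrators like Airflow are built on: each task runs only after all of its upstream dependencies have completed. A minimal sketch of that idea in plain Python using the standard library's `graphlib` (the task names are invented for illustration; this shows the model, not the Airflow API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: one extract step feeds two transforms,
# which both feed a single load step. Each key depends on the
# tasks in its value set.
dag = {
    "transform_orders": {"extract"},
    "transform_users": {"extract"},
    "load_warehouse": {"transform_orders", "transform_users"},
}

# static_order() yields every task after all of its dependencies,
# which is exactly the scheduling guarantee a DAG orchestrator provides.
order = list(TopologicalSorter(dag).static_order())
print(order)  # "extract" comes first, "load_warehouse" last
```

Airflow adds scheduling, retries, and distributed execution on top of this ordering guarantee, but the dependency-resolution core is the same.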
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Our Commitment to Diversity and Inclusion: At Databricks, we are committed
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to Have: Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure. What will be your key responsibilities? A technical expert and leader on the Petcare
LIN buses, serial buses (RS485/RS232 etc.), SPI/I2C, Python, Go, XML, JSON, HTML, CSS. Web backend servers (Angular, Django, Node.js, React, Apache or similar), Web Sockets, IP video and video routing. Familiarity with systems serving Real Time Information via Web Sockets. Use of DDS and interfacing
highest standards in database design and operations. The Person We're looking for someone who is: Experienced as a Data Engineer, demonstrating proficiency in Apache Spark and cloud-based technologies, especially Microsoft Azure and Databricks. Skilled in programming, particularly Python, and familiar with data integration tools and ETL frameworks.
management, HTML, Git, SASS, JavaScript (including jQuery, Foundation.js and Vue.js), SLIM (3) and TWIG, JSON, XML, and REST APIs. * Working knowledge of web servers (Apache), DevOps practices, package managers (Composer and NPM), and unit testing (PHPUnit). * Strong analytical problem-solving skills. * Familiarity with Ubuntu Linux. Desirable: * Experience with
needed: Proficiency in AWS services relevant to data engineering such as S3, Glue, EMR, Athena, and Lambda. Experience with data pipeline orchestration tools like Apache Airflow. Understanding of data modelling principles and best practices. Hands-on experience with Snowflake and Redshift cloud data warehousing solutions. Familiarity with DBT (Data Build
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies (AWS, GCP, Azure). Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong programming skills in software development. A cloud-agnostic mindset, with
You will require a blend of the following: Strong SQL query experience. Knowledge of Azure Data Factory is desirable. Python development knowledge. Knowledge of Apache Airflow. Experience with enterprise DBMS and, ideally, Google BigQuery. Experience in Google Cloud Platform services. Any knowledge of Power BI and Tableau is useful.
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes.
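The transform-and-load work described above follows a common pattern: clean raw records in code, bulk-load them into a warehouse table, then aggregate with SQL. A trimmed-down illustration using Python's built-in sqlite3 as a stand-in for a warehouse like BigQuery (the table and column names are invented for the example):

```python
import sqlite3

# Raw events with inconsistent casing; the transform step normalises
# them before the load step writes to the reporting table.
raw_events = [("ALICE", 120), ("bob", 80), ("Alice", 30)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (customer TEXT, amount INTEGER)")

# Transform: normalise customer names to lowercase.
clean = [(name.lower(), amount) for name, amount in raw_events]

# Load: bulk insert, then aggregate with SQL as an analyst would.
conn.executemany("INSERT INTO spend VALUES (?, ?)", clean)
totals = dict(conn.execute(
    "SELECT customer, SUM(amount) FROM spend GROUP BY customer"
))
print(totals)  # totals == {'alice': 150, 'bob': 80}
```

On GCP the load step would target BigQuery rather than SQLite, and an Airflow DAG would sequence the extract, transform, and load tasks, but the shape of the work is the same.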
Crewe, Cheshire, United Kingdom Hybrid / WFH Options
OCC Computer Personnel
Kubernetes/Docker or other container technologies. Scripting skills including Python and Bash. Strong Linux systems admin, Linux and database technologies such as Ubuntu, Apache, PHP, MySQL, PostgreSQL, Nginx, Mercurial and Git. This is an exciting opportunity where you will be involved in planning and implementing system migrations, modernisations
Python or Bash for automation and infrastructure management tasks. Basic understanding of CI/CD pipelines and version control systems like Git. Exposure to Apache Airflow, Informatica, or similar data integration tools.
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry
in Computer Science, Software Engineering, or a related field. Proven experience as a Senior Software Developer, with a strong background in LAMP stack applications (Linux, Apache, MySQL, PHP). Proficiency in front-end technologies such as HTML, CSS, JavaScript, and modern frameworks like React or Angular. Strong experience with database design and
working in a Product organisation and ideally Fintech • Practical hands-on knowledge of the Java technology stack (J2EE/Spring etc.) • Experience of working with Apache NiFi • Experienced in working with AWS, Docker, Jenkins etc., rolling out AWS environments and environment strategy • Have sound database experience, preferably Oracle • Experience in
various open-source software and containerization. Experience with cloud systems (e.g., AWS, Azure, and GCP) and APIs. Understanding of databases and web servers (Nginx, Apache). Scripting skills (e.g., shell scripts, Bash, Python). Second language (French, German, and/or Spanish). If you are an experienced Technical Service Engineer based
London, England, United Kingdom Hybrid / WFH Options
Austin Fraser
a plus: Cutting-Edge Tech: Experience with containerisation, Kubernetes, and observability platforms. Workflow Wizardry: Familiarity with data orchestration tools like Airflow and ETL with Apache Beam. Data Visionary: Knowledge of DataVault (DV2) and data management concepts. Location: Our opportunities are available in London Victoria and Bracknell. Choose the work
Jenkins, Bamboo, Concourse etc. Monitoring utilising products such as: Prometheus, Grafana, ELK, Filebeat etc. Observability - SRE. Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop ecosystem. Edge technologies e.g. NGINX, HAProxy etc. Excellent knowledge of YAML or similar languages. Desirable Requirements: JupyterHub awareness, MinIO