Backend & frontend. Familiarity with computer vision libraries and frameworks such as OpenCV, TensorFlow, and PyTorch. Knowledge of databases (e.g., MySQL, MongoDB), web servers (e.g., Apache), and UI/UX design principles. Relevant BSc/MSc degree, e.g. Computer Science, ML, Computer Vision, or a related technical subject. This is …
Languages: Python, Bash, Go. Network Modelling: YANG. API Protocols: RESTCONF, NETCONF. Platforms: ServiceNow, GitHub, Azure, AWS. Data: XML, JSON. Other: Azure DevOps, Git, Linux, Apache, MySQL. Ideal Candidate: Strong experience in automation and systems integration. Proficient in Python and automation using Ansible. Familiarity with ServiceNow, GitHub workflows, and network …
or similar role. Proficiency with Databricks and its ecosystem. Strong programming skills in Python, R, or Scala. Experience with big data technologies such as Apache Spark, Databricks. Knowledge of SQL and experience with relational databases. Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud). Strong analytical and problem …
applications. Experience with NoSQL and in-memory databases (MongoDB, CouchDB, Redis, or others). Experience with analytical databases and processing data at large scale (ClickHouse, Apache Druid, or others). Experience with analyzing and tuning database queries. Experience with Event-Driven Architecture. Can't find the position you're looking for …
and Responsibilities: Design and implement highly scalable and reactive backend systems in Java. Utilize reactive programming frameworks like Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Leverage Java concurrency features including multithreading and Executor Services to ensure optimal performance. Apply functional programming paradigms in Java to write clean …
experience in data engineering, with a strong understanding of modern data technologies (e.g., cloud platforms like AWS, Azure, GCP, and data tools such as Apache Spark, Kafka, dbt, etc.). Proven track record of leading and managing data engineering teams in a consultancy or similar environment. Strong expertise in …
London, South East England, United Kingdom Hybrid / WFH Options
Solytics Partners
technical specifications; assured code/systems against standards; championed data standards. Metadata Management: designed/developed data catalogues or metadata repositories; used tools like Apache Atlas/Hive Metastore/AWS Glue/AWS DataZone. Data Design: designed data lakes/data warehouses/data lakehouses/data pipelines …
Nice to Have: Experience in retail and/or e-commerce. Knowledge of Big Data, Distributed Computing, and streaming technologies like Spark Structured Streaming or Apache Flink. Additional programming skills in PowerShell or Bash. Understanding of Databricks Ecosystem components. Familiarity with Data Observability or Data Quality Frameworks.
Tools: GitLab CI, Terraform, Ansible, Helm Charts, Python, PowerShell, REST APIs. Kubernetes: Experience building and managing Kubernetes clusters and application delivery. Applications: Familiarity with Apache NiFi, Elastic ECK, Artifactory. Secret Management: Expertise in using HashiCorp Vault. Operating Systems: Solid experience with Red Hat and Windows environments. Apply today via …
knowledge of geoscience and well data (Geology, Geophysics, Petrophysics, Wells, etc.). Familiarity with data visualization tools (Power BI, Tableau) or workflow automation tools (Apache Airflow). Exposure to Azure SQL, cloud storage solutions, or geoscience data platforms. Previous internship or coursework involving data reconciliation, validation, or geoscience records …
Excel, QuickSight, Power BI). Data analysis and statistics. KPI design. PREFERRED QUALIFICATIONS: Power BI and Power Pivot in Excel. AWS fundamentals (IAM, S3, …). Python. Apache Spark/Scala. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a …
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits …
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
and BI solutions. Ensure data accuracy, integrity, and consistency across the data platform. Knowledge, Skills and Experience: Essential: Strong expertise in Databricks and Apache Spark for data engineering and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and …
Critical Skills (must have): PHP 8+ and the Laravel framework. MySQL. Proficiency in Laravel. Solid understanding of server management (e.g., Linux, Docker, Nginx/Apache). Knowledge of Agile/Scrum methodologies. Outcome of the Role: A smooth and secure platform experience for buyers and sellers. Scalability to support …
BE NICE FOR THE DATA ENGINEER TO HAVE…. Cloud-based architectures. Microservice architecture or serverless architecture. Messaging/routing technologies such as Apache NiFi/RabbitMQ. TO BE CONSIDERED…. Please either apply by clicking online or email me directly at dominic.barbet@searchability.com. For further information please …
Contract iO Associates are currently partnered with a customer in the defence space who need multiple DV-cleared DevOps Engineers. Requirements: - AWS - Ansible - Kubernetes - Apache NiFi - GitLab - Terraform If you're a DevOps engineer looking for a contract offering up to £500 A DAY OUTSIDE IR35, then send an …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
months. Key Skills: CI/CD (GitLab CI, Terraform, Ansible, Helm Charts, Python, PowerShell, REST APIs). Kubernetes cluster build and application delivery. Application Experience: Apache NiFi, Elastic ECK, Artifactory. Secret Management: HashiCorp Vault. Operating Systems: Red Hat & Windows. This is an Outside IR35 contract paying between £400-£425 per …