up and learn new technologies and frameworks. Nice to have: knowledge of databases, SQL. Familiarity with Boost.Asio. Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers. Experience with gRPC, HTTP/REST and WebSocket protocols. Experience with Google Cloud/AWS and/ more »
Greenford, London, United Kingdom Hybrid / WFH Options
Indotronix Avani UK Ltd
Java, Python, and Ruby. Experience in database design under MS SQL, MySQL, Firebird, or similar servers. Experience with web servers such as IIS and Apache. Experience in web design using HTML, JSON, JavaScript, etc. Experience in API design. Degree in electronics engineering/IT - Programming or more »
developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or Power BI. Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds more »
Microservice frameworks • working knowledge of client-side web technologies (React, JavaScript) • experience with messaging frameworks (e.g. TIBCO, Kafka) • experience with web servers running Tomcat, Apache • exposure to Azure Cloud services (e.g. Azure AKS, CI/CD) • knowledge of open-source market edge technologies such as cache frameworks, monitoring tools, etc. more »
a production setting. Knowledge of developing real-time data stream systems (ideally Kafka). Proven track record in developing data systems using PySpark and Apache Spark for batch processing. Capable of managing data intake from various sources, including data streams, unstructured data, relational databases, and NoSQL databases. Extensive knowledge more »
pipelines solutions for the ingestion, transformation, and serving of data, as well as solutions for the orchestration of pipeline components (e.g. AWS Step Functions, Apache Airflow). Good understanding of data modelling, algorithm, and data transformation techniques to work with data platforms. Working knowledge of cloud development practices (AWS more »
have a valid visa as we are not able to sponsor. Technical Stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow. Skills: years of experience in Python scripting and in developing applications in Python; exposure to Python-oriented algorithm libraries such as NumPy, pandas more »
and harnessing diverse AWS services. Key Requirements: High level of experience in both SQL and Python programming (10+ years). Experience managing data engineering pipelines using Apache Airflow. Proficiency in CI/CD pipelines and automation. Git proficiency for version control (branching strategies and repo management). Competent in monitoring tools such as more »
Flask, Tornado or Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality Preparing data for predictive and prescriptive modelling Hands on more »
to production, providing subject matter expertise on the .Net stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool more »
London, England, United Kingdom Hybrid / WFH Options
Client Server
to production, providing subject matter expertise on the .Net stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool more »
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. Ideal Candidate more »
Greater London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work) Have public cloud experience with AWS or other cloud providers Have an understanding of Apache products such as Kafka and Flink Good knowledge of development using CI/CD Bonus points if you have knowledge of: Web products Financial markets more »
the following: - Experience of working in an Agile product delivery framework - Experience with PHP 8+ and the Laravel framework - Experience with Linux, NGINX (or Apache), MySQL server. LEMP/LAMP stack. - Experience of writing unit tests with test frameworks (PHPUnit, Codeception, etc.) Although we have a dedicated QA Test more »