London, England, United Kingdom Hybrid / WFH Options
Aventum Group
… Profisee), Snowflake Data Integration, Azure Service Bus, Delta Lake, BigQuery, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI, Azure ML is a plus. Experience with Azure …
… work is largely down to you. It can be entirely back end. Otherwise, the stack includes Redux Saga, AG Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with the …
… up and learn new technologies and frameworks. Nice to have: knowledge of databases and SQL. Familiarity with Boost.Asio. Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers. Experience with gRPC, HTTP/REST and WebSocket protocols. Experience with Google Cloud/AWS and/…
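Several listings on this page name columnar serialization formats. As a quick illustration of the Arrow/Parquet pairing mentioned here, below is a minimal sketch in Python using pyarrow; the column names and file path are invented for the example.

```python
# Minimal sketch: columnar serialization with Apache Arrow and Parquet.
# Column names and the output path are illustrative placeholders.
import pyarrow as pa
import pyarrow.parquet as pq

# Build an in-memory Arrow table from Python lists.
table = pa.table({
    "symbol": ["VOD.L", "BARC.L"],
    "price": [72.4, 185.1],
})

# Round-trip through Parquet: write to disk, read back, verify equality.
pq.write_table(table, "quotes.parquet")
restored = pq.read_table("quotes.parquet")
assert restored.equals(table)
```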
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
… in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration - enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, dbt, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure as code …
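To give a concrete shape to the orchestration tooling this listing names, here is a minimal Apache Airflow DAG sketch (Airflow 2.x assumed; the `schedule` argument replaced `schedule_interval` in 2.4). The DAG id, schedule, and task bodies are placeholders, not anything from the role.

```python
# Minimal sketch of an orchestration DAG in Apache Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")

def transform():
    print("clean and model")

with DAG(
    dag_id="example_etl",           # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2  # run transform only after extract succeeds
```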
Greenford, London, United Kingdom Hybrid / WFH Options
Indotronix Avani UK Ltd
… Java, Python, and Ruby. Experience in database design under MS SQL, MySQL, Firebird, or similar servers. Experience with web servers such as IIS and Apache, or similar servers. Experience in web design using HTML, JSON, JavaScript, etc. Experience in API design. Degree in electronics engineering/IT - Programming or …
… developing and optimising ETL pipelines. Version Control: experience with Git for code collaboration and change tracking. Data Pipeline Tools: proficiency with tools such as Apache Airflow. Cloud Platforms: familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or Power BI. Delivery Tools: familiarity with agile backlogs, code repositories, automated builds …
… pipeline solutions for the ingestion, transformation, and serving of data, as well as solutions for the orchestration of pipeline components (e.g. AWS Step Functions, Apache Airflow). Good understanding of data modelling, algorithms, and data transformation techniques to work with data platforms. Working knowledge of cloud development practices (AWS …
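For the orchestration side this listing mentions, a minimal sketch of starting an AWS Step Functions execution from Python with boto3 might look like the following; the state machine ARN and input payload are placeholders.

```python
# Minimal sketch: kicking off a pipeline run with AWS Step Functions.
# The ARN and input below are placeholders, not real resources.
import json
import boto3

sfn = boto3.client("stepfunctions")

response = sfn.start_execution(
    stateMachineArn="arn:aws:states:eu-west-1:123456789012:stateMachine:etl",
    input=json.dumps({"date": "2024-01-01"}),
)
print(response["executionArn"])  # handle for polling or stopping the run
```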
… or Operations Center. Knowledge of and proficiency in using computer and Microsoft Office applications (Word, Excel, Access, Outlook). Familiarity and knowledge of the following: Apache Spark, Kubernetes, Kafka, SIMP Project, Ansible, Docker, Git, Red Hat Enterprise Linux (RHEL), Suricata, Zeek, Kibana, Logstash, Elasticsearch, Neo4j, PostgreSQL, AWS Cloud and NiFi …
Other skills we are looking for you to demonstrate include: experience of data storage technologies (Delta Lake, Iceberg, Hudi); sound knowledge and understanding of Apache Spark, Databricks or Hadoop; ability to take business requirements and translate these into tech specifications; knowledge of architecture best practices and patterns; competence in …
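As a small illustration of the Delta Lake storage technology this listing leads with, here is a PySpark sketch. It assumes the delta-spark package is on the classpath, and the table contents and path are invented.

```python
# Minimal sketch: writing and reading a Delta Lake table from PySpark.
# Assumes the delta-spark package is available; the path is a placeholder.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("delta-demo")
         .config("spark.sql.extensions",
                 "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

df = spark.createDataFrame([(1, "open"), (2, "closed")], ["id", "status"])
df.write.format("delta").mode("overwrite").save("/tmp/delta/events")

# Readers see a consistent snapshot of the table.
spark.read.format("delta").load("/tmp/delta/events").show()
```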
… able to deploy code via CI/CD platforms (e.g. GitHub Actions, Jenkins). Proficiency in working with distributed computing frameworks such as Apache Spark, and in data modelling, database systems, and SQL optimisation techniques. Experience with cloud platforms (e.g. AWS, Azure, Google Cloud) and associated services (e.g. S3 …
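One concrete example of the SQL optimisation techniques this listing pairs with Spark is a broadcast join hint, sketched below with made-up data: shipping a small dimension table to every executor avoids a shuffle on the large side.

```python
# Minimal sketch: broadcasting a small dimension table in a Spark join.
# Data and session configuration are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("join-demo").getOrCreate()

facts = spark.createDataFrame([(1, 100.0), (2, 50.0)], ["cust_id", "amount"])
dims = spark.createDataFrame([(1, "UK"), (2, "DE")], ["cust_id", "country"])

# broadcast() hints the planner to replicate `dims` to all executors.
joined = facts.join(broadcast(dims), "cust_id")
joined.explain()  # physical plan should show a BroadcastHashJoin
```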
… to production, providing subject matter expertise on the .NET stack and contributing to technical design discussions. You'll use a range of technology, including Apache Flink with Java for large-scale data processing, and will be able to assess and recommend new and emerging technologies, using the best tool …
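The role above uses Flink's Java API; purely to keep these examples in one language, here is the same source-transform-sink shape sketched with PyFlink, with invented input values.

```python
# Minimal sketch of a Flink streaming pipeline via the Python API (PyFlink).
# The Java DataStream API has the same overall shape.
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()

# Source -> transform -> sink; nothing runs until execute() is called.
(env.from_collection([1, 2, 3, 4])
    .map(lambda x: x * 10)
    .filter(lambda x: x > 15)
    .print())

env.execute("demo_pipeline")  # placeholder job name
```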
… integration architecture and design skills. Good communication skills. Desirable skills: JavaScript, with React/Vue being even better. Docker/Kubernetes. Linux - basic sysadmin (Apache, Nginx). SQL/Oracle/PostgreSQL/MongoDB/DynamoDB. Message queues - RabbitMQ or similar. AWS or GCP. This is an office-based position. DCS …
… Terraform/Docker/Kubernetes. Write software using either Java/Scala/Python. The following are nice to have, but not required: Apache Spark jobs and pipelines. Experience with any functional programming language. Database design concepts. Writing and analysing SQL queries. Application: VIOOH. Our recruitment team will …
… Modelling. Experience with one or more of these programming languages: Python, Scala/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical-path approaches, how to iterate to build value, and engaging with stakeholders actively at all stages. Able to deal with …
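For the Spark-and-Kafka combination this listing highlights, a minimal Structured Streaming consumer might look like the sketch below. It assumes the spark-sql-kafka connector package is on the classpath; the broker address and topic name are placeholders.

```python
# Minimal sketch: consuming a Kafka topic with Spark Structured Streaming.
# Requires the spark-sql-kafka connector; broker/topic are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-demo").getOrCreate()

stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "trades")
          .load())

# Kafka rows carry key/value as binary; cast before processing.
values = stream.selectExpr("CAST(value AS STRING) AS payload")

query = values.writeStream.format("console").start()
query.awaitTermination()
```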
… technologies, languages, and techniques in the rapidly evolving world of high-volume data processing. Technologies we use: Development languages/frameworks: Java/Scala, Apache Spark, Kafka, Vertica, JavaScript (React/Redux), MicroStrategy. Amazon: EMR, Step Functions, SQS, Lambda and AWS cloud-native architectures. DevOps tools: Terraform or Cloud…