Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with Dremio is a nice-to-have. Experience with data stack technologies such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, dbt. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation more »
Strong understanding of AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with data stack technologies such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, dbt. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation more »
learning management systems or content management systems) Strong knowledge of customer-centric service management processes. Experience with web hosting platforms and security standards (e.g. Apache). Demonstrated ability to adapt to an ever-changing technical landscape. Extensive experience of working with a diverse range of stakeholders and external partners to more »
Our client is a leading supporter of major UK programmes including Apache, Chinook, C-17, P-8A, and Wedgetail, employing over 1,400 workers. They offer engineering sustainment and support for UK armed forces aircraft, along with training. By combining their UK defence business with global strategies, they're more »
to develop unit test cases. Help in backlog grooming. Key skills: Extensive experience in developing big data pipelines in the cloud using big data technologies such as Apache Spark. Expertise in performing complex data transformation using Spark SQL queries. Experience in orchestrating data pipelines using Apache Airflow. Proficiency in Git-based more »
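Orchestrating data pipelines, as in the Airflow requirement above, comes down to running tasks in dependency order. A minimal sketch using the standard library's `graphlib` to order a hypothetical extract → transform → load graph (the task names are invented for illustration; Airflow expresses the same idea with DAGs of operators):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: two extracts feed a transform, which feeds a load.
# Each node maps to the set of tasks that must complete before it.
dag = {
    "transform": {"extract_orders", "extract_customers"},
    "load": {"transform"},
}

# static_order() yields tasks so that every task appears after
# all of its upstream dependencies.
order = list(TopologicalSorter(dag).static_order())
assert order.index("transform") > order.index("extract_orders")
assert order[-1] == "load"
```

A real scheduler adds retries, scheduling intervals, and parallel execution of independent tasks; the dependency-respecting order is the core idea.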
comfortable designing and constructing bespoke solutions and components from scratch to solve the hardest problems. Adept in Java, Scala, and big data technologies like Apache Kafka and Apache Spark, they bring a deep understanding of engineering best practices. This role involves scoping and sizing, and indeed estimating and … be considered. Key responsibilities of the role are summarised below Design and implement large-scale data processing systems using distributed computing frameworks such as Apache Kafka and Apache Spark. Architect cloud-based solutions capable of handling petabytes of data. Lead the automation of CI/CD pipelines for more »
Westminster, Colorado, United States Hybrid / WFH Options
Maxar Technologies
Prior experience with CI/CD technologies such as Jenkins Prior experience with any of the following: Trino/Starburst, dbt (core or cloud), Apache Superset, OpenMetadata, Apache Airflow, Tableau. Prior experience with RDS databases or Postgres. Agile software development lifecycle experience These skills would be amazing: Holds more »
Highly skilled in applying SCD, CDC and DQ/DV frameworks. Familiar with JIRA & Confluence. Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake. Desire to continually keep up with advancements in data engineering practices. Knowledge of AWS cloud and Python is a plus. … Requirements 5+ years of IT experience with a major focus on data warehouse/database related projects. Must have exposure to technologies such as dbt, Apache Airflow, Snowflake. Experience in data platforms: Snowflake, Oracle, SQL Server, MDM etc. Expertise in writing SQL and database objects - stored procedures, functions, and views. … Hands-on experience in ETL/ELT and data security, SQL performance optimization, and job orchestration tools and technologies, e.g. dbt, APIs, Apache Airflow, etc. Experience in data modeling and relational database design. Well-versed in applying SCD, CDC, and DQ/DV frameworks. Demonstrated ability to write new more »
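The SCD framework named above usually means Type 2 slowly changing dimensions: instead of overwriting a changed attribute, the current row is closed out and a new versioned row appended. A plain-Python sketch of that upsert logic, with an assumed record shape (`key`, `value`, `valid_from`, `valid_to`); this is an illustration, not the dbt/Snowflake implementation the role refers to:

```python
from datetime import date

def scd2_upsert(dimension, incoming, today=None):
    """Apply a Type 2 slowly-changing-dimension update.

    dimension: list of dicts with keys key, value, valid_from, valid_to
               (valid_to is None for the current version of a key).
    incoming:  dict with keys key and value.
    """
    today = today or date.today().isoformat()
    for row in dimension:
        if row["key"] == incoming["key"] and row["valid_to"] is None:
            if row["value"] == incoming["value"]:
                return dimension          # no change: keep history as-is
            row["valid_to"] = today       # close out the old version
            break
    # append the new current version (also handles brand-new keys)
    dimension.append({"key": incoming["key"], "value": incoming["value"],
                      "valid_from": today, "valid_to": None})
    return dimension

dim = [{"key": 1, "value": "London", "valid_from": "2023-01-01", "valid_to": None}]
dim = scd2_upsert(dim, {"key": 1, "value": "Bristol"}, today="2024-06-01")
# the London row is closed on 2024-06-01 and a current Bristol row is appended
```

In a warehouse this is typically a `MERGE` statement or a dbt snapshot; the row-versioning logic is the same.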
South East London, England, United Kingdom Hybrid / WFH Options
Maclean Moore
given requirement. Ensure to develop unit test cases. Help in backlog grooming. Key skills: Extensive experience in developing big data pipelines in the cloud using big data technologies such as Apache Spark. Expertise in performing complex data transformation using Spark SQL queries. Experience in orchestrating data pipelines using Apache Airflow. Proficiency in Git-based version control tools. Proficiency more »
City of London, London, United Kingdom Hybrid / WFH Options
Client Server
to production, providing subject matter expertise on the .Net stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool …/CD and Infrastructure as Code (Terraform) You're familiar with other languages such as Java and are open to learning new things e.g. Apache Flink You've worked on systems that require high throughput and low latency You enjoy problem solving and have great communication and collaboration skills more »
features from idea to production unattended. Also, actively manages and escalates risk and customer-impacting issues. Responsibilities Install and maintain JBoss application server and Apache platforms. End-to-end setup of Virtual Machines/servers with pre-requisites like file systems, backups, logging, monitoring, etc. required for the application … if you have: Experience using containerized platforms including Kubernetes, Docker and OpenShift Experience in JBoss 7.x/8.x, Red Hat Linux, Red Hat OpenJDK, Oracle Java, Apache 2.x Experience in Java-based applications Experience in Recovery Collection Applications, including Debt Management and Recovery Possess technical knowledge on AWS and GCP cloud more »
Go. - Significant experience with Hadoop, Spark and other distributed processing platforms and frameworks. - Experience working with open table/storage formats like Delta Lake, Apache Iceberg or Apache Hudi. - Experience of developing and managing real-time data streaming pipelines using change data capture (CDC), Kafka and Apache more »
Engineering, providing DevOps support, and/or RHEL administration for mission-critical platforms, ideally Kafka. 4+ years of combined experience with Kafka (Confluent Kafka, Apache Kafka, Amazon MSK) 4+ years of experience with Ansible automation Must be able to obtain and maintain a Public Trust. Contract requirement. Selected candidate … Solid experience using version control software such as Git/Bitbucket including peer reviewing Ansible playbooks Hands-on experience administrating Kafka platform (Confluent Kafka, Apache Kafka, Amazon MSK) via Ansible playbooks or other automation. Understanding of Kafka architecture, including partition strategy, replication, transactions, tiered storage, and disaster recovery strategies. … STAND OUT FROM THE CROWD (Desired Skills) Showcase your knowledge of modern development through the following experience or skills: Preferred Confluent Certified Administrator for Apache Kafka (CCAAK) or Confluent Certified Developer for Apache Kafka (CCDAK) Practical experience with event-driven applications and at least one event processing framework more »
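Partition strategy, one of the Kafka architecture topics listed above, largely comes down to how a record key maps deterministically to a partition. A simplified sketch of the hash-then-modulo idea behind Kafka's default partitioner; note this uses `zlib.crc32` purely for illustration, whereas Kafka itself hashes keys with murmur2:

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.

    Stand-in for Kafka's default partitioner: Kafka uses murmur2 on the
    serialized key; crc32 here just illustrates hash % partition_count.
    """
    return zlib.crc32(key) % num_partitions

# Records with the same key always land on the same partition,
# which is what preserves per-key ordering in Kafka.
p1 = partition_for(b"customer-42", 12)
p2 = partition_for(b"customer-42", 12)
assert p1 == p2
```

This is also why changing the partition count of an existing topic breaks key-to-partition affinity: the modulo changes, so existing keys may map elsewhere.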
pipelines Know your way around Unix-based operating systems Experience working with any major cloud provider (AWS, GCP, Azure) Fluency in English Experience using Apache Airflow Experience using Docker Experience using Apache Spark Benefits: Salary £40-50K per annum dependent on skills and experience 25 Days annual more »
service incidents. Responsible for the technical design, development, installation, monitoring and ongoing support and maintenance of a diverse set of middleware technologies including WebSphere, Apache, Tomcat, and JBoss. The role is a technical, hands-on opportunity with a heavy focus on automation, resilient design and deployment of middleware ready … impacting issues within the day-to-day role to management. Responsibilities Strong engineering experience in installation and maintenance of WebSphere application server, Tomcat, and Apache platforms Implement DevOps practices through GitOps framework Implement Configuration Management and Infrastructure as Code (e.g. Terraform, Python, Chef, Ansible, and Bash) Achieves product commitments more »
Scala Kotlin Spark Google PubSub Elasticsearch BigQuery, PostgreSQL Kubernetes, Docker, Airflow Key Responsibilities Designing and implementing scalable data pipelines using tools such as Apache Spark, Google PubSub etc. Optimising data storage and retrieval systems for maximum performance using both relational and NoSQL databases. Continuously monitoring and improving … Data Infrastructure projects, as well as designing and building data intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark Expert knowledge in one or more of the following languages - Python, Scala, Java, Kotlin Deep knowledge of data modelling, data access, and data more »
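The distributed computing model behind frameworks like Spark can be sketched locally: data is split into partitions, a map stage runs on each partition independently, and a reduce stage merges the per-partition results. A toy word count using only Python built-ins stands in for what a Spark job would distribute across executors (the serial loop here replaces the cluster):

```python
from collections import Counter
from itertools import chain

def word_count(partitions):
    """Toy map/reduce over 'partitions' (lists of text lines).

    In Spark this would be an RDD/DataFrame job spread across executors;
    here each stage runs serially to show the shape of the computation.
    """
    # map stage: each partition tokenises its own lines independently
    mapped = [chain.from_iterable(line.split() for line in part)
              for part in partitions]
    # shuffle + reduce stage: merge the per-partition token streams
    total = Counter()
    for part in mapped:
        total.update(part)
    return dict(total)

counts = word_count([["spark spark kafka"], ["kafka airflow"]])
# counts combines tokens from both partitions into one tally
```

Because the map stage never looks across partitions, it parallelises trivially; only the merge needs coordination, which is the expensive "shuffle" step in a real cluster.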
development (ideally AWS) Knowledge and ideally hands-on experience with data streaming, event-based architectures and Kafka. Strong communication and interpersonal skills. Experience with Apache Spark or Apache Flink would be ideal, but not essential. Please note, this role is unable to provide sponsorship. If this role sounds more »
as well as distributed computing frameworks. Familiarity with data warehousing concepts and dimensional modelling. Strong knowledge of ETL principles and experience with tools like Apache Spark or AWS Glue. Experience with real-time data processing frameworks such as Apache Kafka or AWS Kinesis. Experience with version control systems more »
Greater Bristol Area, United Kingdom Hybrid / WFH Options
J&C Associates Ltd
so any knowledge of cross domain solutions or air gapped is a plus AWS as initial hosting provider * Containerised apps using Docker and Kubernetes * Apache Jena * Elastic * PostGIS * Kafka * Apache NiFi * AWS Cognito * HTTP REST, GraphQL, SPARQL interfaces * Web apps based on HTML/CSS/Javascript frameworks more »
Experienced creating data pipelines on a cloud (preferably AWS) environment CI/CD experience Containerization experience (Docker, Kubernetes, etc.) Experience with SQS/SNS, Apache Kafka, RabbitMQ Other interesting/bonus skills – Airflow, Trino, Apache Iceberg, Postgres, MongoDB You *must* be eligible to work in your chosen country more »