West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
exciting and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours.

The role will include:
- Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi.
- Implement data ingestion, transformation, and integration processes, ensuring data quality and security.
- Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards.
- Manage and … experience working as a Data Engineer in secure or regulated environments.
- Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization.
- Strong experience with Apache NiFi for building and managing complex data flows and integration processes.
- Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control.
- Familiarity with data governance …
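The ingestion duties above (transformation with data quality and anonymization before indexing) can be sketched in plain Python. This is a minimal illustration only, not the employer's pipeline; the field names (`user_email`, `payload`) and index name are hypothetical:

```python
import hashlib
import json

def anonymize(record, sensitive_fields=("user_email",)):
    """Replace sensitive fields with a stable SHA-256 digest, so records
    remain joinable on the hashed value without exposing the raw data."""
    out = dict(record)
    for field in sensitive_fields:
        if field in out:
            out[field] = hashlib.sha256(out[field].encode("utf-8")).hexdigest()
    return out

def to_bulk_actions(records, index="events"):
    """Render records as Elasticsearch bulk-API action lines (NDJSON):
    one action metadata line followed by one document line per record."""
    lines = []
    for rec in records:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(rec, sort_keys=True))
    return "\n".join(lines) + "\n"

raw = [{"user_email": "alice@example.com", "payload": "login"}]
clean = [anonymize(r) for r in raw]
bulk = to_bulk_actions(clean)
```

The resulting NDJSON string is what a client would POST to the Elasticsearch `_bulk` endpoint; hashing rather than deleting the identifier is one common anonymization trade-off.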
tuning skills. Preferred Qualifications: Strong communication skills and demonstrated ability to engage with business stakeholders and product teams. Experience in data modeling, data warehousing (e.g., Snowflake, AWS Glue, EMR, Apache Spark), and working with data pipelines. Leadership experience, whether technical mentorship, team leadership, or managing critical projects. Familiarity with Infrastructure as Code (IaC) tools like Terraform, CloudFormation …
similar). Experience with ETL/ELT tools, APIs, and integration platforms. Deep knowledge of data modelling, warehousing, and real-time analytics. Familiarity with big data technology principles (e.g., Spark, Hadoop) and BI tools (e.g., Power BI, Tableau). Strong programming skills (e.g., SQL, Python, Java, or similar languages). Ability to exercise a substantial degree of independent professional …
We are seeking 3 Data Engineers to join our defence & security client on a contract basis. Key skills required for this role: DV cleared, Data Engineer, ETL, Elastic Stack, Apache NiFi.

Important: DV Cleared - Data Engineer - ELK & NiFi - Outside IR35
Location: Worcester
Duration: 6 months initial contract
Security: Active DV clearance required

In this role, you will help design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. These positions are onsite in Worcester and require active UK DV clearance.

Key Responsibilities:
- Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack and Apache NiFi.
- Implement data ingestion, transformation, and integration processes, ensuring data quality … the ability to obtain it.
- Experience as a Data Engineer in secure or regulated environments.
- Expertise in the Elastic Stack for data ingestion, transformation, and visualization.
- Strong experience with Apache NiFi for managing complex data flows.
- Knowledge of security practices for handling sensitive data.
- Understanding of data governance, quality, and compliance standards in secure settings.
- Experience managing large-scale …
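Pipelines of the kind this role describes are commonly expressed as a Logstash configuration sitting between ingestion and Elasticsearch. The fragment below is a generic sketch only; the port, index pattern, and field names are assumptions, not the client's configuration:

```
input {
  beats { port => 5044 }                      # receive events from lightweight shippers
}
filter {
  mutate { remove_field => ["password"] }     # strip a sensitive field before indexing
  date { match => ["timestamp", "ISO8601"] }  # parse the event timestamp
}
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "secure-events-%{+YYYY.MM.dd}"   # daily indices aid retention policies
    ssl   => true
  }
}
```

In a secure environment the same flow is often fronted or replaced by Apache NiFi processors, with NiFi handling routing and provenance and Logstash (or NiFi's Elasticsearch processors) handling the final indexing step.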
with SQL, NoSQL, and data visualization tools. Strong analytical and problem-solving skills. Experience with social media analytics and user behavior analysis. Knowledge of big data technologies like Hadoop, Spark, Kafka. Familiarity with AWS machine learning services such as SageMaker and Comprehend. Understanding of data governance and security in AWS. Excellent communication and teamwork skills. Attention to detail and …
West Bromwich, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
across both on-premise and cloud-based data systems. Clear communicator, able to translate complex data concepts to cross-functional teams. Bonus points for experience with: big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams; genomic data formats and tools; cold and hot storage management, ZFS/RAID systems, or tape storage; AI/LLM tools …
of data points per day and create highly available data processing and REST services to distribute data to different consumers across PWM.

Technologies used include:
Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy - a data management and data governance platform
Programming Languages: Java, Scala, Scripting
Database Technologies: MongoDB, ElasticSearch, Cassandra, MemSQL, Sybase IQ/ASE
Micro Service Technologies: REST … new tech stacks

SKILLS AND EXPERIENCE WE ARE LOOKING FOR
- Computer Science, Mathematics, Engineering, or other related degree at bachelor's level
- Java, Scala, Scripting, REST, Spring Boot, Jersey
- Kafka, Spark, Hadoop, MongoDB, ElasticSearch, MemSQL, Sybase IQ/ASE
- 3+ years of hands-on experience on relevant technologies

ABOUT GOLDMAN SACHS
At Goldman Sachs, we commit our people, capital and …
software version control with Git or SVN. Capable of presenting technical issues and successes to team members and Product Owners. Nice to Have: Experience with any of these: Python, Spark, Kafka, Kinesis, Kinesis Analytics, BigQuery, Dataflow, BigTable, and SQL. Enthusiastic about learning and applying new technologies (growth mindset). Ability to build new solutions and support our data solutions …
HBase, Elasticsearch). Ability to build, operate, maintain, and support cloud infrastructure and data services. Skills to automate and optimize data engineering pipelines. Experience with big data technologies (Databricks, Spark). Development of custom security applications, APIs, AI/ML models, and advanced analytics technologies. Experience with threat detection in Azure Sentinel, Databricks, MPP Databases (Snowflake), or Splunk. Expertise …
Stafford, England, United Kingdom Hybrid / WFH Options
Energy Job Search
R, MATLAB, or C++. Familiarity with cloud platforms (AWS, Azure, Google Cloud) and microservices architecture. Nice-to-Have Requirements: Experience with data modeling, containerization (Docker, Kubernetes), and distributed computing (Spark, Scala). Familiarity with GraphDB, MongoDB, SQL/NoSQL, and other DBMS technologies. Understanding of system automation, protection, and diagnostics in relevant sectors. Experience with deep learning algorithms, reinforcement …
successes to team members and Product Owners. Has experience of people management or a desire to manage individuals on the team. Nice to Have: Experience with some of these: Python, Spark, Kafka, Kinesis, Kinesis Analytics, BigQuery, Dataflow, BigTable, and SQL. Enthusiastic about learning and applying new technologies (growth mindset). Ability to build new solutions and support our data solutions …
and reporting needs, translating them into technical solutions and scalable BI products. Design and implement robust data pipelines and workflows using modern data processing frameworks such as DBT and Apache Spark. Evaluate and select BI tools and technologies to enhance reporting capabilities and user experience. Optimize the performance and scalability of data systems, continuously improving data ingestion, transformation, and … modern cloud data warehouse platforms, preferably Snowflake. Advanced proficiency in data modeling, data warehousing, and dimensional modeling concepts. Strong command of data transformation and pipeline tools, such as DBT, Apache Spark, or equivalent. Expertise in implementing data governance frameworks, including data quality management, metadata management, and data security practices. Experience leading cross-functional data governance initiatives and councils.
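The dimensional modeling mentioned above splits denormalized source rows into a fact table plus dimension tables keyed by surrogate keys. A minimal Python sketch, with an entirely hypothetical order/customer schema rather than any employer's model:

```python
def build_star_schema(raw_orders):
    """Split denormalized order rows into a customer dimension and an
    order fact table, assigning surrogate customer keys on first sight."""
    customer_dim = {}            # natural key (email) -> surrogate key
    dim_rows, fact_rows = [], []
    for row in raw_orders:
        nk = row["customer_email"]
        if nk not in customer_dim:
            sk = len(customer_dim) + 1
            customer_dim[nk] = sk
            dim_rows.append({"customer_sk": sk, "email": nk, "country": row["country"]})
        fact_rows.append({
            "customer_sk": customer_dim[nk],   # foreign key into the dimension
            "order_id": row["order_id"],
            "amount": row["amount"],
        })
    return dim_rows, fact_rows

raw = [
    {"order_id": 1, "customer_email": "a@x.com", "country": "UK", "amount": 10.0},
    {"order_id": 2, "customer_email": "a@x.com", "country": "UK", "amount": 5.5},
    {"order_id": 3, "customer_email": "b@y.com", "country": "FR", "amount": 7.25},
]
dims, facts = build_star_schema(raw)
```

In practice the same split is expressed as DBT SQL models or Spark jobs; the point of the sketch is only the fact/dimension separation and surrogate keying.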
Lambda). Strong background in data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus). Knowledge of cloud security, networking, and cost optimization as it relates to data platforms. Experience in total cost of ownership estimation and managing …
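Total-cost-of-ownership estimation, as mentioned above, is largely arithmetic over assumed unit costs: infrastructure spend plus staffing over the platform's lifetime. The figures below are invented placeholders, not real cloud or salary pricing:

```python
def platform_tco(monthly_compute, monthly_storage_gb, storage_price_per_gb,
                 engineers, engineer_monthly_cost, months=36):
    """Rough multi-year TCO: infrastructure plus staffing.
    All inputs are caller-supplied assumptions, not vendor list prices."""
    infra = (monthly_compute + monthly_storage_gb * storage_price_per_gb) * months
    people = engineers * engineer_monthly_cost * months
    return {"infrastructure": infra, "staffing": people, "total": infra + people}

estimate = platform_tco(
    monthly_compute=4_000.0,     # hypothetical monthly cluster spend
    monthly_storage_gb=50_000,   # hypothetical stored data volume
    storage_price_per_gb=0.02,   # hypothetical per-GB-month price
    engineers=2,
    engineer_monthly_cost=9_000.0,
)
```

Even a toy model like this makes the usual point visible: over a three-year horizon, staffing typically dominates infrastructure, which is why cost-optimization work targets both.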