Data Engineer (Informatica/Teradata/Data Warehouse). Apply. Locations: Two PNC Plaza (PA374); Birmingham - Brock (AL112); Dallas Innovation Center - Luna Rd (TX270); Strongsville Technology Center (OH537). Time type: Full time.
West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
exciting and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data governance …
We are seeking 3 Data Engineers to join our defence & security client on a contract basis. Key skills required for this role: DV cleared, Data Engineer, ETL, Elastic Stack, Apache NiFi. Important: DV Cleared - Data Engineer - ELK & NiFi - Outside IR35. Location: Worcester. Duration: 6 months initial contract. Security: Active DV clearance required. In this role, you will help design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. These positions are onsite in Worcester and require active UK DV clearance. Key Responsibilities: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality … the ability to obtain it. Experience as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack for data ingestion, transformation, and visualization. Strong experience with Apache NiFi for managing complex data flows. Knowledge of security practices for handling sensitive data. Understanding of data governance, quality, and compliance standards in secure settings. Experience managing large-scale …
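Both of the roles above centre on ingestion pipelines that enforce data quality and protect sensitive fields before records reach Elasticsearch. A minimal sketch of that kind of transform step, as it might run inside a NiFi processor or Logstash filter stage, is shown below; the record schema, field names, and salt value are hypothetical, not taken from either posting.

```python
import hashlib
from typing import Optional

REQUIRED_FIELDS = {"id", "timestamp", "payload"}  # hypothetical record schema

def pseudonymize(value: str, salt: str = "rotate-me") -> str:
    # One-way hash so the raw identifier never reaches the search index.
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

def transform(record: dict) -> Optional[dict]:
    # Quality gate: drop malformed records; pseudonymize the identifier.
    if not REQUIRED_FIELDS.issubset(record):
        return None  # a real pipeline would route this to a dead-letter queue
    out = dict(record)
    out["id"] = pseudonymize(record["id"])
    return out

records = [
    {"id": "user-42", "timestamp": "2024-01-01T00:00:00Z", "payload": "ok"},
    {"timestamp": "2024-01-01T00:00:01Z", "payload": "missing id"},
]
clean = [r for r in map(transform, records) if r is not None]
print(len(clean))  # 1
```

In a production Elastic Stack deployment the same gate would typically be expressed as a Logstash filter or a NiFi route, with the salt held in a secrets store rather than in code.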
all employees and ensuring that our workplaces are free from discrimination. We aim to treat all employees and potential future employees fairly, with dignity and respect. Does this opportunity spark your interest? We eagerly look forward to receiving your application.
West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
researching new technologies and software versions. Working with cloud technologies and different operating systems. Working closely alongside Data Engineers and DevOps engineers. Working with big data technologies such as Spark. Demonstrating stakeholder engagement by communicating with the wider team to understand the functional and non-functional requirements of the data and the product in development and its relationship to … networks into production. Experience with Docker. Experience with NLP and/or computer vision. Exposure to cloud technologies (e.g. AWS and Azure). Exposure to big data technologies. Exposure to Apache products, e.g. Hive, Spark, Hadoop, NiFi. Programming experience in other languages. This is not an exhaustive list, and we are keen to hear from you even if you do not meet every requirement.
proficiency in Python and experience with container systems (Docker, Kubernetes). Proven experience with AWS services (Lambda, S3, EMR, DynamoDB). Hands-on experience with big data technologies such as Hadoop, Spark, Snowflake, and Vertica. Familiarity with CI/CD, DevOps practices, and automated testing frameworks. Ability to work in agile, fast-paced environments and lead technical projects. Excellent communication and collaboration skills.
similar). Experience with ETL/ELT tools, APIs, and integration platforms. Deep knowledge of data modelling, warehousing, and real-time analytics. Familiarity with big data technology principles (e.g., Spark, Hadoop) and BI tools (e.g., Power BI, Tableau). Strong programming skills (e.g. SQL, Python, Java, or similar languages). Ability to exercise a substantial degree of independent professional judgement.
of data points per day and create highly available data processing and REST services to distribute data to different consumers across PWM. Technologies used include: Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy (a data management and data governance platform). Programming Languages: Java, Scala, Scripting. Database Technologies: MongoDB, ElasticSearch, Cassandra, MemSQL, Sybase IQ/ASE. Micro Service Technologies: REST … new tech stacks. SKILLS AND EXPERIENCE WE ARE LOOKING FOR: Computer Science, Mathematics, Engineering or other related degree at bachelor's level. Java, Scala, Scripting, REST, Spring Boot, Jersey. Kafka, Spark, Hadoop, MongoDB, ElasticSearch, MemSQL, Sybase IQ/ASE. 3+ years of hands-on experience with relevant technologies. ABOUT GOLDMAN SACHS: At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow.
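The role above describes rolling up large volumes of per-day data points for distribution over REST. A minimal sketch of that daily-rollup shape is below; the tuple layout, account names, and "keep the last value per day" rule are illustrative assumptions, not the firm's actual logic (which the posting says is built in Java/Scala on Kafka and Spark).

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical shape: (timestamp, account, value) tuples as they might
# arrive on a Kafka topic before being served to consumers over REST.
points = [
    ("2024-03-01T09:00:00", "acct-1", 100.0),
    ("2024-03-01T15:30:00", "acct-1", 102.5),
    ("2024-03-02T10:00:00", "acct-1", 101.0),
]

def daily_rollup(points):
    # Bucket points by (account, calendar day), keeping the latest value;
    # this sketch assumes points arrive in timestamp order.
    latest = {}
    for ts, acct, value in points:
        day = datetime.fromisoformat(ts).date()
        latest[(acct, day)] = value
    return latest

rollup = daily_rollup(points)
print(len(rollup))  # 2 distinct (account, day) buckets
```

At production scale the same grouping would be a keyed Kafka Streams or Spark aggregation rather than an in-memory dict.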
/ec2), infrastructure automation (Terraform), CI/CD platform (GitHub Actions & Admin), and password/secret management (HashiCorp Vault). Strong data-related programming skills: SQL/Python/Spark/Scala. Database technologies relating to Data Warehouse/Data Lake/Lakehouse patterns, and relevant experience handling structured and unstructured data. Machine Learning: experience …
West Bromwich, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
across both on-premise and cloud-based data systems. Clear communicator, able to translate complex data concepts to cross-functional teams. Bonus points for experience with: big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams; genomic data formats and tools; cold and hot storage management, ZFS/RAID systems, or tape storage; AI/LLM tools.
software version control with Git or SVN. Capable of presenting technical issues and successes to team members and Product Owners. Nice to have: experience with any of these: Python, Spark, Kafka, Kinesis, Kinesis Analytics, BigQuery, Dataflow, BigTable, and SQL. Enthusiastic about learning and applying new technologies (growth mindset). Ability to build new solutions and support our data solutions.
Birmingham, England, United Kingdom Hybrid / WFH Options
Autodesk
architecture, and processing skills with varied unstructured data representations · Processing unstructured data, such as 3D geometric data · Large-scale, data-intensive systems in production · Distributed computing frameworks, such as Spark, Dask, Ray Data, etc. · Cloud platforms such as AWS, Azure, or GCP · Docker · Documenting code, architectures, and experiments · Linux systems and bash terminals. Preferred Qualifications: · Databases and/or data warehousing technologies, such as Apache Hive, Iceberg, etc. · Data transformation via SQL and dbt · Orchestration platforms such as Apache Airflow, Argo Workflows, etc. · Data catalogs and metadata management tools · Vector databases · Relational and object databases · Kubernetes · Computational geometry, such as mesh or boundary representation data processing · Analyzing data …
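The qualifications above combine distributed computing frameworks (Spark, Dask, Ray Data) with 3D geometric data processing. A minimal sketch of how those fit together is a map/reduce over mesh chunks, shown here serially in plain Python; the mesh representation (lists of vertex tuples) and the bounding-box task are illustrative assumptions.

```python
from functools import reduce

# Hypothetical: each "mesh" is a list of (x, y, z) vertices. A framework
# like Spark or Dask would run the per-chunk bounding boxes in parallel
# and then merge them; this sketch mimics the same map/reduce shape.
meshes = [
    [(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)],
    [(-1.0, 0.5, 0.0), (0.5, 0.5, 4.0)],
]

def local_bbox(vertices):
    # Per-chunk work: axis-aligned bounding box of one mesh.
    xs, ys, zs = zip(*vertices)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def merge_bbox(a, b):
    # Associative combine step, safe to apply in any order across workers.
    (al, ah), (bl, bh) = a, b
    return (tuple(map(min, al, bl)), tuple(map(max, ah, bh)))

global_bbox = reduce(merge_bbox, map(local_bbox, meshes))
print(global_bbox)  # ((-1.0, 0.0, 0.0), (1.0, 2.0, 4.0))
```

Because the merge is associative, the same two functions drop directly into `rdd.map(...).reduce(...)` in Spark or `dask.bag` without changing the logic.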
Worcester, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Engineering (open to professionals from various data engineering backgrounds: data pipelines, ML engineering, data warehousing, analytics engineering, big data, cloud, etc.). Technical Exposure: experience with tools like SQL, Python, Apache Spark, Kafka, cloud platforms (AWS/GCP/Azure), and modern data stack technologies. Formal or Informal Coaching Experience: any previous coaching, mentoring, or training experience, formal or informal.
Birmingham, England, United Kingdom Hybrid / WFH Options
FIND | Creating Futures
Engineering (open to professionals from various data engineering backgrounds: data pipelines, ML engineering, data warehousing, analytics engineering, big data, cloud, etc.). Technical Exposure: experience with tools like SQL, Python, Apache Spark, Kafka, cloud platforms (AWS/GCP/Azure), and modern data stack technologies. Formal or Informal Coaching Experience: any previous coaching, mentoring, or training experience, formal or informal.
Job Description Summary: GE Vernova is accelerating the path to more reliable, affordable, and sustainable energy, while helping our customers power economies and deliver vital electricity for health, safety, and security.
and reporting needs, translating them into technical solutions and scalable BI products. Design and implement robust data pipelines and workflows using modern data processing frameworks such as dbt and Apache Spark. Evaluate and select BI tools and technologies to enhance reporting capabilities and user experience. Optimize the performance and scalability of data systems, continuously improving data ingestion, transformation, and … modern cloud data warehouse platforms, preferably Snowflake. Advanced proficiency in data modeling, data warehousing, and dimensional modeling concepts. Strong command of data transformation and pipeline tools, such as dbt, Apache Spark, or equivalent. Expertise in implementing data governance frameworks, including data quality management, metadata management, and data security practices. Experience leading cross-functional data governance initiatives and councils.
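The dimensional-modeling requirement above amounts to splitting denormalised source rows into dimension and fact tables. A minimal sketch of that star-schema split follows; the order/customer schema and surrogate-key scheme are illustrative assumptions, and a dbt model or Spark job would express the same split declaratively in SQL or DataFrames.

```python
# Hypothetical raw orders as they might land in a staging table.
raw_orders = [
    {"order_id": 1, "customer": "Acme", "region": "EMEA", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "region": "EMEA", "amount": 80.0},
    {"order_id": 3, "customer": "Beta", "region": "APAC", "amount": 50.0},
]

def build_star(rows):
    # Split denormalised rows into a customer dimension (with surrogate
    # keys) and a fact table that references the dimension by key.
    dim_customer, fact_orders = {}, []
    for row in rows:
        key = (row["customer"], row["region"])
        cust_id = dim_customer.setdefault(key, len(dim_customer) + 1)
        fact_orders.append({"order_id": row["order_id"],
                            "customer_id": cust_id,
                            "amount": row["amount"]})
    return dim_customer, fact_orders

dim, facts = build_star(raw_orders)
print(len(dim), len(facts))  # 2 3
```

The payoff is the usual one for warehousing: customer attributes live in one place, so a change to a customer record touches one dimension row instead of every fact row.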
cloud and AI deployment experience. Stakeholder management skills. Skills: expertise in designing and optimizing generative AI models, prompt engineering, AI algorithms, scalable cloud solutions, version control, Linux, Docker, Hadoop, Spark, Elasticsearch, and strong problem-solving and communication skills. Business development experience is a plus. Why KPMG? Work with diverse, exciting clients across industries and globally. Engage in impactful projects.
Job Description Summary: GE Vernova is accelerating the path to more reliable, affordable, and sustainable energy, helping our customers power economies and deliver vital electricity for health, safety, and security. Are you excited about the opportunity to electrify and decarbonize …
looking for a Data Scientist to join its innovative team. This role requires hands-on experience with machine learning techniques and proficiency in data manipulation libraries such as Pandas, Spark, and SQL. As a Data Scientist at PwC, you will work on cutting-edge projects, using data to drive strategic insights and business decisions. If you have strong analytical … machine learning frameworks and tooling (e.g. scikit-learn) and deep learning frameworks (e.g. PyTorch, TensorFlow). Understanding of machine learning techniques. Experience with data manipulation libraries (e.g. Pandas, Spark, SQL). Git for version control. Cloud experience (we use Azure/GCP/AWS). Skills we'd also like to hear about: evidence of modelling experience applied …
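The Pandas data-manipulation skill the listing asks for usually shows up as feature preparation before modelling: grouping raw events by an entity and aggregating them into model inputs. A minimal sketch is below; the customer/spend columns are hypothetical, not from any PwC dataset.

```python
import pandas as pd

# Hypothetical raw transactions; each row is one purchase event.
df = pd.DataFrame({
    "customer": ["a", "a", "b", "b", "b"],
    "spend": [10.0, 20.0, 5.0, 5.0, 10.0],
})

# Named aggregation: one row per customer, with total and mean spend
# as candidate model features.
features = (df.groupby("customer")["spend"]
              .agg(total="sum", mean="mean")
              .reset_index())
print(features.shape)  # (2, 3)
```

The same groupby-aggregate pattern translates directly to Spark (`groupBy().agg()`) or SQL (`GROUP BY`) when the data no longer fits in memory.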
Bromsgrove, Worcestershire, United Kingdom Hybrid / WFH Options
Talk Recruitment
Stress-testing, performance-tuning, and optimization skills. Debugging in multi-threaded environments. Eligible to work in the UK. Desirable Skills: technologies such as Zookeeper, Terraform, Ansible, Cassandra, RabbitMQ, Kafka, Spark, Redis, MongoDB, Cosmos DB, Xsolla Backend (AcceleratXR), Pragma, PlayFab, Epic Online Services, Unity Game Services, Firebase, Edgegap, Photon. Game engine experience with Unreal or Unity. Web application development experience (NodeJS).
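The "debugging in multi-threaded environments" requirement above typically means spotting unsynchronised shared state. A minimal sketch of the canonical case, with a lock making a shared increment atomic, is below; the worker count and iteration count are arbitrary.

```python
import threading

# Shared mutable state: the classic hazard in multi-threaded backends.
counter = 0
lock = threading.Lock()

def worker(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:            # without this, increments can interleave and be lost
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000
```

With the lock the result is deterministically 40000; without it, the read-modify-write in `counter += 1` can interleave between threads, which is exactly the kind of intermittent failure stress-testing is meant to surface.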