data layer on Azure Synapse Analytics, SQL DW, and Cosmos DB. The data engineer is proficient in Azure Data Platform components, including ADLS2, Blob Storage, SQL DW, Synapse Analytics with Spark and SQL, Azure Functions with Python, Azure Purview, and Cosmos DB. They are also proficient in Azure Event Hub and Stream Analytics, Managed Streaming for Apache Kafka, Azure Databricks with Spark, and other open source technologies like Apache Airflow and dbt, Spark/Python, or Spark/Scala. Preferred Education: Bachelor's Degree. Required Technical and Professional Expertise: Commercial experience as a Data Engineer or similar role, with a strong emphasis on Azure technologies. Proficiency in Azure data services (Azure SQL Database, Azure Synapse Analytics, Azure Data Factory, Azure Databricks). Experience with data modeling, data warehousing, and big data processing (Hadoop, Spark, Kafka). Strong understanding of SQL and NoSQL databases, data modeling, and ETL/ELT processes. Proficiency in at least one programming language (Python, C#, Java). Experience with CI/CD pipelines and tools (Azure DevOps, Jenkins). Knowledge More ❯
Data Engineer (Informatica/Teradata/Datawarehouse) page is loaded Data Engineer (Informatica/Teradata/Datawarehouse) Apply locations Two PNC Plaza (PA374) Birmingham - Brock (AL112) Dallas Innovation Center - Luna Rd (TX270) Strongsville Technology Center (OH537) time type Full time More ❯
West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
exciting and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data governance More ❯
technical and professional experience Preferred Skills: Experience working within the public sector. Knowledge of cloud platforms (e.g., IBM Cloud, AWS, Azure). Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop). Understanding of data warehousing concepts and experience with tools like IBM Cognos or Tableau. Certifications: While not required, the following certifications would be highly beneficial … ABOUT BUSINESS UNIT IBM Consulting is IBM's consulting and global professional services business, with market leading More ❯
We are seeking 3 Data Engineers to join our defence & security client on a contract basis. Key skills required for this role: DV cleared, Data Engineer, ETL, Elastic Stack, Apache NiFi. Important: DV Cleared - Data Engineer - ELK & NiFi - Outside IR35. Location: Worcester Duration: 6 months initial contract Security: Active DV clearance required In this role, you will help design … develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. These positions are onsite in Worcester and require active UK DV clearance. Key Responsibilities: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality … the ability to obtain it. Experience as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack for data ingestion, transformation, and visualization. Strong experience with Apache NiFi for managing complex data flows. Knowledge of security practices for handling sensitive data. Understanding of data governance, quality, and compliance standards in secure settings. Experience managing large-scale More ❯
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
for data models, ETL processes, and BI solutions. Ensure data accuracy, integrity, and consistency across the data platform. Knowledge, Skills and Experience: Essential: Strong expertise in Databricks and Apache Spark for data engineering and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience More ❯
Ibstock, England, United Kingdom Hybrid / WFH Options
Ibstock Plc
comprehensive documentation for data models, ETL processes, and BI solutions. Ensure data accuracy, integrity, and consistency across the data platform. Knowledge, Skills and Experience: Strong expertise in Databricks and Apache Spark for data engineering and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience More ❯
all employees and ensuring that our workplaces are free from discrimination. We aim to treat all employees and potential future employees fairly, with dignity and respect. Does this opportunity spark your interest? We eagerly look forward to receiving your application. More ❯
relevant technology: Azure Platform, Azure Data Services, Databricks, Power BI, SQL DW, Snowflake, Big Query, and Advanced Analytics. Proven ability to understand low-level data engineering solutions and languages (Spark, MPP, Python, Delta, Parquet). Experience with Azure DevOps & CICD processes, software development lifecycle including infrastructure as code (Terraform). Understand data warehousing concepts, including dimensional modelling, star schema More ❯
West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
researching new technologies and software versions Working with cloud technologies and different operating systems Working closely alongside Data Engineers and DevOps engineers Working with big data technologies such as Spark Demonstrating stakeholder engagement by communicating with the wider team to understand the functional and non-functional requirements of the data and the product in development and its relationship to … networks into production Experience with Docker Experience with NLP and/or computer vision Exposure to cloud technologies (e.g. AWS and Azure) Exposure to big data technologies Exposure to Apache products, e.g. Hive, Spark, Hadoop, NiFi Programming experience in other languages This is not an exhaustive list, and we are keen to hear from you even if you More ❯
obtain UK security clearance. We do not sponsor visas. Preferred Skills and Experience Public sector experience Knowledge of cloud platforms (IBM Cloud, AWS, Azure) Experience with big data frameworks (Apache Spark, Hadoop) Data warehousing and BI tools (IBM Cognos, Tableau) Additional Details Seniority level: Mid-Senior level Employment type: Full-time Job function: Information Technology Industries: IT Services More ❯
proficiency in Python and experience with container systems (Docker, Kubernetes) Proven experience with AWS services (Lambda, S3, EMR, DynamoDB) Hands-on experience with big data technologies such as Hadoop, Spark, Snowflake, and Vertica Familiarity with CI/CD, DevOps practices, and automated testing frameworks Ability to work in agile, fast-paced environments and lead technical projects Excellent communication and More ❯
similar). Experience with ETL/ELT tools, APIs, and integration platforms. Deep knowledge of data modelling, warehousing, and real-time analytics. Familiarity with big data technology principles (e.g., Spark, Hadoop) and BI tools (e.g., Power BI, Tableau). Strong programming skills (e.g. SQL, Python, Java, or similar languages). Ability to exercise a substantial degree of independent professional More ❯
/ec2), Infrastructure automation (Terraform), and CI/CD platform (GitHub Actions & Admin), Password/Secret management (HashiCorp Vault). Strong data-related programming skills: SQL/Python/Spark/Scala. Database technologies in relation to Data Warehousing/Data Lake/Lakehouse patterns and relevant experience when handling structured and unstructured data Machine Learning - Experience More ❯
PhD degree in Computer Science, Engineering, Mathematics, Physics or a related field. Hands-on experience with LLMs, RAG, LangChain, or LlamaIndex. Experience with big data technologies such as Hadoop, Spark, or Kafka. The estimated total compensation range for this position is $75,000 - $90,000 ( USD base plus bonus). Actual compensation for the position is based on a More ❯
of data points per day and create highly available data processing and REST services to distribute data to different consumers across PWM. Technologies used include: Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy - a data management and data governance platform Programming Languages: Java, Scala, Scripting Database Technologies: MongoDB, ElasticSearch, Cassandra, MemSQL, Sybase IQ/ASE Micro Service Technologies: REST … new tech stacks SKILLS AND EXPERIENCE WE ARE LOOKING FOR Computer Science, Mathematics, Engineering or other related degree at bachelor's level Java, Scala, Scripting, REST, Spring Boot, Jersey Kafka, Spark, Hadoop, MongoDB, ElasticSearch, MemSQL, Sybase IQ/ASE 3+ years of hands-on experience on relevant technologies ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and More ❯
West Bromwich, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
across both on-premise and cloud-based data systems Clear communicator, able to translate complex data concepts to cross-functional teams Bonus points for experience with: Big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams Genomic data formats and tools Cold and hot storage management, ZFS/RAID systems, or tape storage AI/LLM tools More ❯
software version control with Git or SVN. Capable of presenting technical issues and successes to team members and Product Owners. Nice to Have Experience with any of these: Python, Spark, Kafka, Kinesis, Kinesis Analytics, BigQuery, Dataflow, BigTable, and SQL. Enthusiastic about learning and applying new technologies (growth mindset). Ability to build new solutions and support our data solutions More ❯
Birmingham, England, United Kingdom Hybrid / WFH Options
Autodesk
architecture, and processing skills with varied unstructured data representations · Processing unstructured data, such as 3D geometric data · Large scale, data-intensive systems in production · Distributed computing frameworks, such as Spark, Dask, Ray Data etc. · Cloud platforms such as AWS, Azure, or GCP · Docker · Documenting code, architectures, and experiments · Linux systems and bash terminals Preferred Qualifications o Databases and/… or data warehousing technologies, such as Apache Hive, Iceberg etc. o Data transformation via SQL and DBT. o Orchestration platforms such as Apache Airflow, Argo Workflows, etc. o Data catalogs and metadata management tools o Vector databases o Relational and object databases o Kubernetes o computational geometry such as mesh or boundary representation data processing o analyzing data More ❯
Lincoln, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Engineering (open to professionals from various data eng. backgrounds — data pipelines, ML Eng, data warehousing, analytics engineering, big data, cloud etc.) Technical Exposure: Experience with tools like SQL, Python, Apache Spark, Kafka, Cloud platforms (AWS/GCP/Azure), and modern data stack technologies Formal or Informal Coaching Experience: Any previous coaching, mentoring, or training experience — formal or More ❯