Site Reliability Engineer or a similar role, with a focus on data infrastructure management. Proficiency in data technologies, such as relational databases, data warehousing, big data platforms (e.g., Hadoop, Spark), data streaming (e.g., Kafka), and cloud services (e.g., AWS, GCP, Azure). Ideally some programming skills in languages like Python, Java, or Scala, with experience in automation and scripting.
quality of our tools and applications through bug fixes and code refactoring. Leverage the latest data technologies and programming languages, including Python, Scala, and Java, along with systems like Spark, Kafka, and Airflow, within cloud services such as AWS. Ensure the ongoing maintenance, troubleshooting, optimization, and reliability of data systems, including timely resolution of unexpected issues. Stay abreast of … Strong knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB) and data modeling principles. Proven ability to design, build, and maintain scalable data pipelines and workflows using tools like Apache Airflow or similar. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Nice to have: Hands-on experience with data warehouse and lakehouse architectures (e.g., Databricks, Snowflake … or similar). Experience with big data frameworks (e.g., Apache Spark, Hadoop) and cloud platforms (e.g., AWS, Azure, or GCP).
with SQL, NoSQL, data visualization, and statistical tools. Strong analytical and problem-solving skills. Experience with social media analytics and user behavior. Familiarity with big data tools like Hadoop, Spark, Kafka. Knowledge of AWS ML services such as SageMaker and Comprehend. Understanding of data governance and security in AWS. Excellent communication and teamwork skills. Attention to detail and ability …
Proven experience as a Data Scientist, with a focus on AI and machine learning, including hands-on experience with Gen AI technologies. Experience with big data technologies (e.g., Hadoop, Spark). Knowledge of cloud platforms (e.g., AWS, Azure, GCP). Knowledge of financial instruments, markets, and risk management. Excellent problem-solving skills and attention to detail. Strong communication and …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
PA Consulting
committed digital practitioner. You'll have: Experience in the design and deployment of production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, Scala, Spark, SQL. Perform tasks such as writing scripts, extracting data using APIs, writing SQL queries, etc. Work closely with other engineering teams to integrate data engineering components into production systems.
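Purely for illustration, here is a minimal sketch of the kind of ingestion-to-consumption pipeline work this role describes, written in PySpark. The S3 paths, view name, and column names are hypothetical, not details from the posting.

```python
# Illustrative sketch only: ingest raw CSV, clean/aggregate with Spark SQL,
# publish curated Parquet. Paths and columns are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ingest-to-consumption-sketch").getOrCreate()

# Ingest: load raw data from a hypothetical landing zone.
raw = spark.read.option("header", "true").csv("s3://example-landing/trades/")

# Transform: register a temporary view and clean/aggregate with SQL.
raw.createOrReplaceTempView("trades_raw")
curated = spark.sql("""
    SELECT trade_date, instrument, SUM(CAST(notional AS DOUBLE)) AS total_notional
    FROM trades_raw
    WHERE notional IS NOT NULL
    GROUP BY trade_date, instrument
""")

# Consume: write a partitioned Parquet dataset for downstream teams.
curated.write.mode("overwrite").partitionBy("trade_date").parquet("s3://example-curated/trades/")

spark.stop()
```

In practice a job like this would sit inside an orchestrator and be integrated with the production systems the posting mentions.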
deploying large-scale data pipelines, ETL processes, and data lakes using Databricks. Demonstrable experience with cloud services; exposure to all three public clouds (AWS, GCP, Azure) is beneficial. Solid understanding of Spark architecture, distributed computing, and cloud-based data engineering principles. Proficiency in programming languages such as Python or SQL. Strong leadership skills with experience managing and growing high-performing technical …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Citi
science solutions that are Accurate, Reliable, Relevant, Consistent, Complete, Scalable, Timely, Secure, Nimble. Olympus is built on a big data platform and technologies under the Cloudera distribution, such as HDFS, Hive, Impala, Spark, YARN, Sentry, Oozie, and Kafka. Our team interfaces with a vast client base and works in close partnership with Operations, Development and other technology counterparts running the application production platform … conduct and business practices, and escalating, managing and reporting control issues with transparency. Skills & Qualifications: Working knowledge of the various components and technologies under the Cloudera distribution, such as HDFS, Hive, Impala, Spark, YARN, Sentry, Oozie, and Kafka. Very good knowledge of analyzing cluster bottlenecks: performance tuning, effective resource usage, capacity planning, and investigation. Perform daily performance monitoring of the cluster.
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Citigroup Inc
/CD pipeline. Qualifications: Relevant experience in an Application Development role. Demonstrated execution capabilities. Strong analytical and quantitative skills; data-driven and results-oriented. Experience with Core Java required (Spark a plus). Experience with SQL. Experience working with Hadoop, Hive, Sqoop and other technologies in Cloudera's CDP distribution. Understanding of version control (git). Experience working as part of … an agile team. Excellent written and oral communication skills. Technical Skills: Strong knowledge of Java. Some knowledge of Hadoop, Hive, SQL, Spark. Understanding of Unix shell scripting. CI/CD pipelines. Maven or Gradle experience. Predictive analytics (desirable). PySpark (desirable). Trade Surveillance domain knowledge (desirable). Education: Bachelor’s/University degree or equivalent experience. What we’ll provide you: By …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
JR United Kingdom
As a Data Engineer, you will: Design and deploy production data pipelines from ingestion to consumption within a big data architecture. Work with technologies such as Python, Java, Scala, Spark, and SQL to extract, clean, transform, and integrate data. Build scalable solutions using AWS services like EMR, Glue, Redshift, Kinesis, Lambda, and DynamoDB. Process large volumes of structured and …
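As a hedged illustration of how a few of the AWS services named above (Lambda, Kinesis, DynamoDB) might fit together, a minimal Python Lambda handler could look like the sketch below; the stream name, table name, and payload fields are assumptions, not details from the role.

```python
# Illustrative sketch only: validate an incoming event, stream it to Kinesis,
# and persist a keyed copy in DynamoDB. Names are hypothetical.
import json

import boto3

kinesis = boto3.client("kinesis")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-events")  # hypothetical table name


def handler(event, context):
    # Assumes an API Gateway-style payload with a JSON body.
    record = json.loads(event["body"])

    # Basic validation before the record enters the pipeline.
    if "event_id" not in record:
        return {"statusCode": 400, "body": "missing event_id"}

    # Stream the raw event for downstream consumers (e.g. an EMR/Glue job).
    kinesis.put_record(
        StreamName="example-event-stream",  # hypothetical stream name
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=record["event_id"],
    )

    # Persist a keyed copy for low-latency lookups.
    # (Real code would convert any float values to Decimal for DynamoDB.)
    table.put_item(Item=record)

    return {"statusCode": 200, "body": "ok"}
```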
continuous delivery. Excellent problem-solving skills and a collaborative mindset. Agile development experience in a team setting. Bonus Skills (nice to have): Experience with big data tools like Hadoop, Spark, or Scala. Exposure to fraud, payments, or financial services platforms. Understanding of cloud-native development and container orchestration. Knowledge of test-driven development and modern code quality practices. What …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Data Intellect Limited
Python, SQL, and/or Scala. Knowledge of two or more common cloud ecosystems (Azure, AWS, GCP), with expertise in at least one. Deep experience with distributed computing using Apache Spark. Working knowledge of CI/CD for production deployments. Working knowledge of MLOps. Familiarity with designing and deploying performant end-to-end data architectures. Experience with technical project …
Belfast, County Antrim, Sydenham, City of Belfast, United Kingdom
Experis
Role: Data Engineer. Location: Belfast. Duration: Long Term Contract Opportunity - Rolling Contract. Rate: Market Rates - Inside IR35. Job Description: Fujitsu's Decision Intelligence practice in the UK helps organisations bridge the gap between data and insights, empowering businesses to make …
with MLOps practices and model deployment pipelines. Proficient in cloud AI services (AWS SageMaker/Bedrock). Deep understanding of distributed systems and microservices architecture. Expert in data pipeline platforms (Apache Kafka, Airflow, Spark). Proficient in both SQL (PostgreSQL, MySQL) and NoSQL (Elasticsearch, MongoDB) databases. Strong containerization and orchestration skills (Docker, Kubernetes). Experience with infrastructure as code (Terraform, CloudFormation) …
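For illustration only, a minimal Airflow DAG showing the extract/transform/load orchestration pattern implied by the pipeline tooling above. It assumes a recent Airflow 2.x install; the DAG id, schedule, and task bodies are placeholders, not taken from the posting.

```python
# Illustrative sketch only: a three-step extract -> transform -> load DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # e.g. pull a batch of messages from Kafka or an upstream API
    print("extracting")


def transform():
    # e.g. submit a Spark job or run in-process transformations
    print("transforming")


def load():
    # e.g. write curated output to PostgreSQL or Elasticsearch
    print("loading")


with DAG(
    dag_id="example_feature_pipeline",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3
```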
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Citi
science solutions that are Accurate, Reliable, Relevant, Consistent, Complete, Scalable, Timely, Secure, Nimble. Olympus is built on a big data platform and technologies under the Cloudera distribution, such as HDFS, Hive, Impala, Spark, YARN, Sentry, Oozie, and Kafka. Our team interfaces with a vast client base and works in close partnership with Operations, Development and other technology counterparts running the application production platform … Experience in an Application Support role. Hands-on experience in supporting applications built on Hadoop. Working knowledge of the various components and technologies under the Cloudera distribution, such as HDFS, Hive, Impala, Spark, YARN, Sentry, Oozie, and Kafka. Experienced in Linux. Very good knowledge of analyzing cluster bottlenecks: performance tuning, effective resource usage, capacity planning, and investigation. Perform daily performance monitoring …
Lambda). Strong background in data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus). Knowledge of cloud security, networking, and cost optimization as it relates to data platforms. Experience in total cost of ownership estimation and managing …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Danske Bank
Job Description: Your Career, Your Danske Bank … You're Danske Bank. Your role: Senior Analytics Engineer, Data Analytics, Technology & Digital Development. Location: Donegall Square West, Belfast, Northern Ireland/Hybrid Working. Why you'll want to work with us: At Danske …
Systems Architecture - AWS Data Engineer. Job Description. Location: UK - will require travel to customer site (Belfast). Job Summary: We are seeking a skilled and experienced AWS Data Engineer to join our team. The successful candidate will be responsible for implementing …
associated Data Science tooling. Experience in technical communication with both business stakeholders and technical peers. Experience working with big data concepts, strategies, methodologies, and tools such as MongoDB, Snowflake, Spark, or Hadoop. Knowledge and experience of deploying enterprise-scale data science products. Experience in coaching and mentoring team members. Experience and skills we'd love: Computer Vision Expertise: Practical …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Acord (Association for Cooperative Operations Research and Development)
experience with Monitoring tools such as ITRS Geneos, AppDynamics. Good experience with Log Aggregation tools such as ELK, Splunk, Grafana (GEM) is preferred. Experience working with Oracle Database, Hadoop, Apache Spark, Hive, Starburst. Experience with Middleware solutions such as Tibco EMS, Kafka. Good written and verbal communication skills. What we can offer you: The SMBF Production Management organization …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
eFinancialCareers
Experience with Monitoring tools such as ITRS Geneos, AppDynamics. Good experience with Log Aggregation tools such as ELK, Splunk, Grafana (GEM) is preferred. Experience working with Oracle Database, Hadoop, Apache Spark, Hive, Starburst. Experience with Middleware solutions such as Tibco EMS, Kafka. Good written and verbal communication skills. What we can offer you: The SMBF Production Management organization …
building solutions for the Compliance Data Archival & Reporting application. Core Java, Spring and Hibernate will be the primary technologies used, supplemented by strong technical knowledge of Oracle databases. Knowledge of Spark and Big Data is desirable but not mandatory. The candidate should have hands-on experience in designing and developing solutions. The candidate will apply internal standards for re-use, architecture … unit test frameworks like JUnit and Mockito. Experience in any cloud technologies like OpenShift/PCF/AWS/GCP. Experience with building distributed systems using solutions such as Spark and Big Data technologies would be preferred but not mandatory. Knowledge of Big Data querying tools (Cloudera stack or similar), e.g. Hive or Impala, would be preferred but not mandatory.
conduct and business practices, and escalating, managing and reporting control issues with transparency. Qualifications: Experience in a product development/product management environment. Relevant experience with Core Java and Spark. Experience in systems analysis and programming of Java applications. Experience using big data technologies (e.g. Java Spark, Hive, Hadoop). Ability to manage multiple/competing priorities and manage …