Manchester, England, United Kingdom Hybrid / WFH Options
Ripjar
in production and have a curiosity and interest in learning more. In this role, you will be using Python (specifically PySpark) and Node.js for processing data, backed by various Hadoop stack technologies such as HDFS and HBase. MongoDB and Elasticsearch are used for indexing smaller datasets. Airflow & NiFi are used to co-ordinate the processing of data, while Jenkins … More ❯
for continuous improvement. Work alongside other engineers on the team to elevate technology and consistently apply best practices. Qualifications for Software Engineer Hands-on experience working with technologies like Hadoop, Hive, Pig, Oozie, MapReduce, Spark, Sqoop, Kafka, Flume, etc. Strong DevOps focus and experience building and deploying infrastructure with cloud deployment technologies like Ansible, Chef, Puppet, etc. Experience More ❯
s degree in Computer Science, Engineering, Mathematics, or related field. - Proven experience (5+ years) in developing and deploying data engineering pipelines and products - Strong proficiency in Python - Experience with Hadoop, Kafka or Spark - Experience leading/mentoring junior team members - Strong communication and interpersonal skills, with the ability to effectively communicate complex technical concepts to both technical and non More ❯
Hertfordshire, England, United Kingdom Hybrid / WFH Options
Queen Square Recruitment
Skilled in statistical modelling and data visualization tools (e.g., Tableau, Seaborn, Matplotlib). Able to independently lead complex projects from design to deployment. Desirable: Experience with big data tools (Hadoop, Spark) and cloud platforms (AWS, GCP, Azure). MSc or PhD in Computer Science, Data Science, AI or related disciplines. Ready to shape the future of AI? Apply now More ❯
Employment Type: Contract
Rate: Unspecified Competitive Day Rate Inside IR35
building finance systems on cloud platforms like AWS S3, Snowflake, etc. Preferred Qualifications Interest or knowledge in investment banking or financial instruments. Experience with big data concepts, such as Hadoop for Data Lake. Experience with near real-time transactional systems like Kafka. Experience in Business Process Management (BPM). ABOUT GOLDMAN SACHS At Goldman Sachs, we dedicate our people More ❯
Experience with cloud technologies such as AWS S3, Snowflake, etc. Preferred Qualifications Interest or knowledge in investment banking or financial instruments. Experience with big data concepts and tools like Hadoop for Data Lake. Experience with near real-time transactional systems like Kafka. Experience in Business Process Management (BPM). About Goldman Sachs At Goldman Sachs, we dedicate our people More ❯
Maidenhead, England, United Kingdom Hybrid / WFH Options
BookFlowGo
and deploying real-time pricing or recommendation systems Deep technical knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch), cloud platforms (AWS, GCP, Azure), and big data tools (e.g., Spark, Hadoop) Experience designing and/or implementing business-critical operational algorithms Clear communication skills; experienced in communicating with senior stakeholders and translating complex technical solutions into business impact Excellent problem More ❯
Experience with data analysis and visualization tools (e.g., Matplotlib, Seaborn, Tableau). Ability to work independently and lead projects from inception to deployment. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is desirable. TCS is consistently voted a Top Employer in the UK and globally. Our competitive salary packages feature pension, health More ❯
your cloud/platform engineering capabilities Experience working with Big Data Experience of data storage technologies: Delta Lake, Iceberg, Hudi Proven knowledge and understanding of Apache Spark, Databricks or Hadoop Ability to take business requirements and translate these into tech specifications Competence in evaluating and selecting development tools and technologies Sound like the role you have been looking for More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
Ripjar
in production and have a curiosity and interest in learning more. In this role, you will be using Python (specifically PySpark) and Node.js for processing data, backed by various Hadoop stack technologies such as HDFS and HBase. MongoDB and Elasticsearch are used for indexing smaller datasets. Airflow & NiFi are used to co-ordinate the processing of data, while Jenkins … More ❯
visualization tools such as QuickSight, Tableau, Looker, or QlikSense. Ability to create well-documented, scalable, and reusable data solutions. Desirable Skills Experience with big data technologies such as Hadoop, MapReduce, or Spark. Exposure to microservice-based data APIs. Familiarity with data solutions in other public cloud platforms. AWS certifications (e.g., Solutions Architect Associate, Big Data Specialty More ❯
with cloud platforms (preferably Azure). Experience with source control tools (e.g., Git) and CI/CD pipelines. Familiarity with big data or NoSQL technologies (e.g., MongoDB, Cosmos DB, Hadoop). Exposure to data analytics tools (Power BI, Tableau) or machine learning workflows. Knowledge of data governance, GDPR, and data compliance practices. Why Join Be part of a growing More ❯
with data visualization tools such as QuickSight, Tableau, Looker, or QlikSense. Ability to create well-documented, scalable, and reusable data solutions. Experience with big data technologies such as Hadoop, MapReduce, or Spark. Exposure to microservice-based data APIs. Familiarity with data solutions in other public cloud platforms. AWS certifications (e.g., Solutions Architect Associate, Big Data Specialty More ❯
and continuous delivery Excellent problem-solving skills and a collaborative mindset Agile development experience in a team setting Bonus Skills (nice to have) Experience with big data tools like Hadoop, Spark, or Scala Exposure to fraud, payments, or financial services platforms Understanding of cloud-native development and container orchestration Knowledge of test-driven development and modern code quality practices More ❯
Telford, England, United Kingdom Hybrid / WFH Options
Supermercados Guanabara
/machine learning libraries Background in agile software development environments Ability to estimate effort, manage dependencies, and communicate effectively with technical and non-technical stakeholders Desirable skills: Experience with Hadoop and Jenkins AWS or Azure certifications Familiarity with Java If you would like to learn more about the role, please apply through the advert and we will be in More ❯
skills Formal training or certification on software engineering concepts and applied experience Experience in dealing with large amounts of data; Data Engineering skills are desired Proven experience in Spark, Hadoop, Databricks and Snowflake Hands-on practical experience delivering system design, application development, testing, and operational stability Advanced in one or more programming languages, e.g. Java or Python Advanced More ❯
BASIC QUALIFICATIONS - 3+ years of experience in cloud architecture and implementation - Bachelor's degree in Computer Science, Engineering, related field, or equivalent experience - Experience in databases (e.g. SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) - Experience in consulting, design and implementation of serverless distributed solutions - Experience in software development with an object-oriented language PREFERRED QUALIFICATIONS - AWS experience preferred, with proficiency in More ❯
Docker and orchestration tools like Kubernetes. Familiarity with Infrastructure as Code (IaC) tools such as Terraform or CloudFormation. Knowledge of data engineering and experience with big data technologies like Hadoop, Spark, or Kafka. Experience with CI/CD pipelines and automation, such as using Jenkins, GitLab, or CircleCI. ABOUT BUSINESS UNIT IBM Consulting is IBM's consulting and global More ❯
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Danske Bank
adapt messages for different stakeholder groups, including technical and non-technical audiences. Preferred 3rd Level degree in IT or related STEM discipline. Good working knowledge of Python. Experience with Hadoop and Spark. 1+ years' experience with IBM DataStage. Knowledge of cloud technologies such as AWS, Databricks, Snowflake. HOW WE WORK Our belief is that we are "Better when More ❯
External Description Reach beyond with Liberty IT; for this is where you'll find the super challenges, where you'll be given the scope and the support to go further, dig deeper and fly higher. We won't stand over More ❯