Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
our customers to match their customer and client lists against our extensive database of companies across many different countries. Currently built on a Python, Hadoop/Hive and AWS tech stack, the product is constantly evolving and is going through a very exciting phase where we seek to more »
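As a rough, hypothetical sketch of the list-matching work this product does (table and column names are invented; the actual Creditsafe pipeline is not described in the listing), a PySpark job on the stack above might normalise company names before joining a client list against the company database:

```python
# Minimal sketch: normalise names, then join a client list against a company table.
# Table and column names (client_list, company_db, client_name, company_name, company_id) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("client-matching").getOrCreate()

def normalise(col):
    # Lower-case, strip punctuation and common legal suffixes so near-identical names join.
    c = F.lower(F.trim(col))
    c = F.regexp_replace(c, r"[^a-z0-9 ]", "")
    return F.trim(F.regexp_replace(c, r"\b(ltd|limited|plc|inc)\b", ""))

clients = spark.table("client_list").withColumn("name_key", normalise(F.col("client_name")))
companies = spark.table("company_db").withColumn("name_key", normalise(F.col("company_name")))

matches = (clients.join(companies, on="name_key", how="left")
                  .select("client_name", "company_name", "company_id"))
matches.write.mode("overwrite").parquet("s3://example-bucket/matches/")  # illustrative output path
```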
on experience with data visualization tools (Tableau, Power BI, matplotlib, seaborn). Knowledge of cloud platforms (AWS, GCP, Azure) and distributed computing tools (Spark, Hadoop) is a plus. Solid understanding of statistical methods, hypothesis testing, and experimental design. Soft Skills: Strong analytical and problem-solving skills. Excellent communication skills … and understanding of how data impacts business strategies. Preferred Qualifications: 3+ years of experience in a data science role. Experience with big data technologies (Hadoop, Spark). Familiarity with deep learning techniques and frameworks. Experience in A/B testing, causal inference, and other experimental design methodologies. For further more »
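For the hypothesis-testing and A/B-testing skills listed above, a minimal illustration on synthetic data (Welch's t-test is just one reasonable choice of test, not the employer's prescribed method) might look like:

```python
# Minimal A/B test sketch on synthetic data: compare mean order value between two variants.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(loc=50.0, scale=12.0, size=1_000)    # variant A (synthetic)
treatment = rng.normal(loc=51.5, scale=12.0, size=1_000)  # variant B (synthetic)

# Welch's t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis: the variants differ at the 5% level.")
else:
    print("No statistically significant difference detected.")
```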
including stakeholder management. Have experience in one of the following technologies: • ETL toolset (Talend, Pentaho, SAS DI, Informatica, etc.) • Database (Client, RDS, Redshift, MySQL, Hadoop, Postgres, etc.) • Job Scheduling toolset (Job Scheduler, TWS, etc.) • Programming and scripting languages (PL/SQL, SQL, Unix, Java, Python, Hive, HiveQL, HDFS, Impala more »
month contract based in Glasgow, 2 days per week. This role falls Out of Scope of IR35 and is paying £400 per day. - PySpark, Hadoop Big Data knowledge - Coding Skills (Java, Shell, Python) - DevOps best practices - SDLC processes - Can perform installation and configuration of these tools on cloud or more »
Experience of Big Data technologies/Big Data Analytics. C++, Java, Python, Shell Script, R, Matlab, SAS Enterprise Miner, Elasticsearch and understanding of the Hadoop ecosystem. Experience working with large data sets, experience working with distributed computing tools like Map/Reduce, Hadoop, Hive, Pig, etc. Advanced use more »
CI/CD pipelines and containerization technologies such as Docker and Kubernetes is a plus. Preferred Qualifications: Experience with big data technologies like Spark, Hadoop, or Kafka. Knowledge of data governance, security, and compliance frameworks in the cloud. Azure certifications such as Azure Data Engineer Associate or Azure more »
Caret, Scikit-learn, Keras, TensorFlow, PyTorch is beneficial. * Experience working with large data sets, simulation/optimisation and distributed computing tools (Map/Reduce, Hadoop, Hive, Spark, Gurobi, Arena, etc. more »
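To make the Map/Reduce pattern referenced in this and similar listings concrete, here is a minimal PySpark RDD sketch of the classic word count (the HDFS path is illustrative, not taken from any of these roles):

```python
# Classic MapReduce pattern expressed as Spark RDD operations: map to (key, 1), then reduce by key.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount").getOrCreate()
sc = spark.sparkContext

lines = sc.textFile("hdfs:///data/example.txt")  # illustrative HDFS path
counts = (lines.flatMap(lambda line: line.split())   # map: split lines into words
               .map(lambda word: (word.lower(), 1))  # map: emit (word, 1) pairs
               .reduceByKey(lambda a, b: a + b))     # reduce: sum counts per word

for word, count in counts.takeOrdered(10, key=lambda kv: -kv[1]):
    print(word, count)
```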
a Data Engineer or in a similar role. Strong proficiency in SQL and experience with ETL frameworks. Experience with big data tools such as Hadoop, Spark, Kafka, etc. Proficiency in programming languages such as Python or Java. Experience with cloud platforms (AWS, Azure, or Google Cloud). Strong understanding more »
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
G.Digital
practice. What you need 😎 Commercial experience applying ML models. Python, SQL or R. Cloud data platform exposure - AWS ideally. Any big data technologies like Hadoop would be great. Strong communication and stakeholder engagement. What’s on offer to you? 💰 £55k. Bonus of up to 10% paid quarterly. Flexible working more »
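As a minimal, hedged sketch of "applying ML models" in Python (entirely synthetic data, with no connection to the client's actual use case), training and evaluating a classifier might look like:

```python
# Minimal supervised-learning sketch: train and evaluate a classifier on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

probs = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_test, probs):.3f}")
```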
Enso Recruitment is working on behalf of our client to find a talented Data Scientist. If you are passionate about machine learning and predictive modelling, and thrive in a dynamic, data-driven environment, this role offers an exciting chance to more »
Manchester, North West, United Kingdom Hybrid / WFH Options
Corecruitment International
qualification preferred). 10+ years of experience in FP&A, ideally within the leisure, hospitality, or related sectors. Expertise in Big Data technologies (e.g., Hadoop, SQL, Python, BigQuery). Proven experience in managing financial planning and budgeting processes in large organizations. Strong background in real estate investment and financial more »
5+ years of experience in software engineering with strong expertise in Java and related frameworks. Proven experience with big data technologies such as Apache Hadoop, Apache Spark, Apache Kafka, or other distributed systems. Experience building data pipelines for processing large datasets in batch and real-time. Familiarity with cloud more »
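A minimal PySpark Structured Streaming sketch of the real-time side of such a pipeline (broker, topic and sink paths are placeholders; the listing itself is Java-centric, so treat this as an illustration of the pattern rather than the employer's stack):

```python
# Minimal streaming sketch: consume a Kafka topic and append raw events to Parquet.
# Requires the spark-sql-kafka connector on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-to-parquet").getOrCreate()

events = (spark.readStream
               .format("kafka")
               .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
               .option("subscribe", "events")                     # hypothetical topic
               .load()
               .select(F.col("value").cast("string").alias("payload"),
                       F.col("timestamp")))

query = (events.writeStream
               .format("parquet")
               .option("path", "hdfs:///data/events/")              # illustrative sink
               .option("checkpointLocation", "hdfs:///chk/events/")
               .outputMode("append")
               .start())
query.awaitTermination()
```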
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis
including stakeholder management. Have experience in one of the following technologies: ETL toolset (Talend, Pentaho, SAS DI, Informatica, etc.) Database (Oracle, RDS, Redshift, MySQL, Hadoop, Postgres, etc.) Job Scheduling toolset (Job Scheduler, TWS, etc.) Programming and scripting languages (PL/SQL, SQL, Unix, Java, Python, Hive, HiveQL, HDFS, Impala more »
role, preferably within an IT consultancy. Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL). Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka). Experience with cloud platforms (e.g., AWS, Google Cloud, Azure) and data warehousing solutions (e.g., Redshift, BigQuery). Proficient in programming languages more »
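As a small illustration of the SQL-plus-relational-database side of this role, a sketch using psycopg2 against PostgreSQL (connection details and the orders table are placeholders):

```python
# Minimal relational-query sketch: run an aggregate query against PostgreSQL.
import psycopg2

conn = psycopg2.connect(
    host="localhost", dbname="analytics", user="analyst", password="changeme"  # placeholders
)
try:
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT customer_id, COUNT(*) AS orders, SUM(total) AS revenue
            FROM orders                        -- hypothetical table
            WHERE created_at >= %s
            GROUP BY customer_id
            ORDER BY revenue DESC
            LIMIT 10;
            """,
            ("2024-01-01",),
        )
        for customer_id, orders, revenue in cur.fetchall():
            print(customer_id, orders, revenue)
finally:
    conn.close()
```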
processing and analytical pipelines for big data, relational databases, NoSQL and data warehouse solutions. * Extensive experience providing practical direction within the AWS Native and Hadoop * Experience with private and public cloud architectures, pros/cons, and migration considerations. * Minimum of 5 years of hands-on experience in AWS and more »
DURATION: 12 MONTH INITIAL CONTRACT IR35 STATUS: INSIDE We are looking for a highly skilled Data Engineer with expertise in Python, Java, PySpark, and Hadoop Big Data to join our team. The successful candidate will be responsible for designing, building, and maintaining systems to collect, store, and analyse large more »
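A minimal sketch of the collect-store-analyse batch work described above (paths, schema and column names are invented for illustration):

```python
# Minimal batch pipeline sketch: ingest raw CSV from HDFS, clean it, aggregate, store as Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-pipeline").getOrCreate()

raw = (spark.read
            .option("header", "true")
            .csv("hdfs:///landing/transactions/*.csv"))  # illustrative source

clean = (raw.dropna(subset=["account_id", "amount"])               # hypothetical columns
            .withColumn("amount", F.col("amount").cast("double"))
            .withColumn("txn_date", F.to_date("txn_ts")))

daily = (clean.groupBy("account_id", "txn_date")
              .agg(F.sum("amount").alias("daily_total"),
                   F.count("*").alias("txn_count")))

daily.write.mode("overwrite").partitionBy("txn_date").parquet("hdfs:///curated/daily_totals/")
```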
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Bowerford Associates
optimal data extraction, transformation and loading using leading cloud technologies including Azure and AWS. Leverage Big Data Technologies - you will utilise tools such as Hadoop, Spark and Kafka to design and manage large-scale data processing systems. Extend and maintain the Data Warehouse - you will be supporting and enhancing more »
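To illustrate the loading step of such an ETL flow, a minimal PySpark sketch that appends a transformed dataset to a warehouse table over JDBC (the URL, credentials and table name are placeholders, the warehouse technology is not specified in the listing, and the appropriate JDBC driver must be on the classpath):

```python
# Minimal ETL load sketch: push a transformed DataFrame into a warehouse table over JDBC.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("warehouse-load").getOrCreate()

transformed = (spark.read.parquet("s3://example-bucket/curated/orders/")  # illustrative source
                    .withColumn("load_ts", F.current_timestamp()))

(transformed.write
            .format("jdbc")
            .option("url", "jdbc:postgresql://warehouse-host:5432/dwh")  # placeholder warehouse
            .option("dbtable", "analytics.orders_fact")                  # placeholder table
            .option("user", "etl_user")
            .option("password", "changeme")
            .mode("append")
            .save())
```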