South Ruislip, England, United Kingdom Hybrid / WFH Options
ANSON MCCADE
… Azure, or GCP. Hands-on skills in data modeling, ETL processes, and programming with SQL, Python, or Java. Familiarity with big data tools like Hadoop, Spark, or Kafka. Clear communication and problem-solving abilities, making complex ideas easy to understand. Bonus Points For: Certifications like AWS Data Analytics or …
South Ruislip, England, United Kingdom Hybrid / WFH Options
ANSON MCCADE
… solutions (AWS, Azure, GCP). Hands-on experience with database management systems like Oracle, SQL Server, or PostgreSQL. Familiarity with big data technologies, including Hadoop, Spark, and Kafka. Proficiency in data modeling, ETL frameworks (e.g., Talend, Apache NiFi), and programming languages like SQL, Python, or Java. Knowledge of data …
South Ruislip, England, United Kingdom Hybrid / WFH Options
ANSON MCCADE
… Tools: Proficiency in SQL, Python, or Java, and familiarity with ETL tools like Talend and Apache NiFi. Big Data Acumen: Experience with technologies like Hadoop, Spark, and Kafka. Exceptional communication skills to convey technical concepts to varied audiences, and strong problem-solving abilities. Preferred Qualifications: Certifications in data management …
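Since the listings above repeatedly ask for hands-on Spark alongside SQL and Python, here is a minimal PySpark ETL sketch of the kind of work they describe; the bucket paths, column names, and app name are hypothetical, chosen purely for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Hypothetical input location and schema: raw JSON events with a
# "timestamp" and an "event_type" field.
events = spark.read.json("s3://example-bucket/raw-events/")

# Transform: aggregate events per day and type.
daily_counts = (
    events
    .withColumn("day", F.to_date("timestamp"))
    .groupBy("day", "event_type")
    .agg(F.count("*").alias("n_events"))
)

# Load: write the aggregate back out as Parquet.
daily_counts.write.mode("overwrite").parquet("s3://example-bucket/daily-counts/")
```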
City of London, London, Vintry, United Kingdom Hybrid / WFH Options
Deerfoot Recruitment Solutions Limited
… JUnit). Experience testing backend systems or APIs, with knowledge of REST, JSON, or Thrift. Bonus: Familiarity with Selenium WebDriver, Jenkins, big data technologies (Hadoop, Kafka), or performance testing tools like JMeter. Why Apply? Competitive bonus structure (from 4%, up to 8% after 3 years). Comprehensive benefits including …
Employment Type: Permanent
Salary: £90,000 - £120,000/annum + Great Benefits Package
… frameworks (e.g., TensorFlow, PyTorch, Scikit-learn). Experience with tools like MLflow, Kubeflow, or similar platforms for managing ML pipelines. Hands-on experience with Hadoop, Spark, or distributed computing frameworks. Proficiency in SQL and NoSQL databases for accessing and preprocessing large datasets. Familiarity with cloud ML services (e.g. …
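As a point of reference for the MLflow experience this listing mentions, a minimal experiment-tracking sketch; the model, synthetic dataset, and metric are stand-ins chosen for illustration, not anything from the listing:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in dataset; a real pipeline would load and preprocess its own data.
X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))

    # Log the hyperparameter, metric, and fitted model so runs are comparable.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, "model")
```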
… building. Experience with machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn). Proficiency in SQL for querying large datasets. Familiarity with tools like Spark, Hadoop, or cloud data warehouses (e.g., BigQuery, Redshift). Strong understanding of statistical modelling, hypothesis testing, and A/B testing methodologies. Experience with …
… team. Experience in A/B hypothesis testing (confidence intervals and p-values, linear regression to generalised additive models, time series). Knowledge of Spark, Hadoop, or a similar platform. Experience with CI/CD, including Git and Docker. Experience with data visualisation platforms like Tableau, Power BI, Looker Studio, etc. Experience …
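The confidence intervals and p-values this listing names come from standard two-sample tests; a minimal sketch of a two-proportion z-test for an A/B conversion experiment (the counts below are made-up illustration data):

```python
from math import sqrt
from scipy.stats import norm

def ab_test(conv_a, n_a, conv_b, n_b, alpha=0.05):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))  # two-sided test
    # Confidence interval for the difference, using the unpooled standard error.
    se_diff = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    margin = norm.ppf(1 - alpha / 2) * se_diff
    return p_value, (p_b - p_a - margin, p_b - p_a + margin)

p, ci = ab_test(conv_a=120, n_a=2400, conv_b=156, n_b=2450)
print(f"p-value: {p:.4f}, 95% CI for uplift: ({ci[0]:.4f}, {ci[1]:.4f})")
```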
… large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale …
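For the real-time Kafka processing mentioned above, a minimal consumer sketch using the kafka-python client (one common choice); the topic name, broker address, and message fields are hypothetical:

```python
import json

from kafka import KafkaConsumer

# Hypothetical topic and broker address, for illustration only.
consumer = KafkaConsumer(
    "clickstream-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # A real-time processing step (filtering, enrichment, etc.) would go here.
    print(event.get("event_type"), message.offset)
```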
… Oracle, SQL Server, PostgreSQL) and data warehousing technologies. Experience with cloud-based data solutions (AWS, Azure, GCP). Familiarity with big data technologies like Hadoop, Spark, and Kafka. Technical Skills: Proficiency in data modelling (ERD, normalization) and data warehousing concepts. Strong understanding of ETL frameworks and tools (e.g., Talend …
… cluster analysis, dimensionality reduction, neural networks). Hands-on development experience working with version control systems (we use Git) and Big Data and Cloud platforms (Hadoop, Spark, Amazon Web Services (AWS), Google Cloud (GCP)) a strong asset. Regression and machine learning principles. Practical experience designing and applying data science processes …
Stevenage, England, United Kingdom Hybrid / WFH Options
Capgemini Engineering
… visualization tools (e.g., Matplotlib, Seaborn, Tableau).
• Ability to work independently and lead projects from inception to deployment.
• Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure) is desirable.
• MSc or PhD in Computer Science, Data Science, Artificial Intelligence, or a related field is …
… with SQL databases. Nice to have: Experience in Financial Services/Fintech or Payments. Familiar with distributed general-purpose cluster-computing (e.g. Spark, Dask, Hadoop). Experience with Docker. Experience with AWS or at least another common cloud platform (GCP/Azure). Familiar with the unix shell and shell …
A very exciting opportunity! The following skills/experience are required: Strong Data Architect background. Experience in Data Technologies, to include: Finbourne LUSID, Snowflake, Hadoop, Spark. Experience in Cloud Platforms: AWS, Azure, or GCP. Previously worked in Financial Services: Understanding of data requirements for equities, fixed income, private assets …
…s degree, or experience in a professional field or the military - Experience as a technical specialist in design and architecture - Experience with databases (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) - Experience with cloud-based solutions (AWS or equivalent), systems, networks, and operating systems - A UK national and able to obtain UK …
Darlington, County Durham, United Kingdom Hybrid / WFH Options
Additional Resources
Excellent problem-solving abilities with the capacity to convert business requirements into analytical solutions. Experience with big data technologies and distributed computing frameworks (e.g., Hadoop, Spark) would be beneficial. Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and knowledge of database systems (SQL, NoSQL) would be preferred. What's …
… and data warehousing technologies (e.g., SQL Server, Snowflake, Databricks). - Experience with cloud platforms (AWS, Azure, or Google Cloud) and big data technologies (Spark, Hadoop, Kafka). Technical Skills - Expertise in SQL, Python (Pandas, NumPy), and data modelling. - Experience with data pipeline orchestration tools (e.g., Airflow, DBT). - Proficiency …
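To illustrate the pipeline-orchestration experience requested above, a minimal Airflow DAG sketch (assuming Airflow 2.x; the DAG id, task names, and placeholder callables are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables; a real DAG would pull, reshape, and persist data here.
def extract():
    print("extract raw data")

def transform():
    print("transform into the warehouse schema")

def load():
    print("load into the warehouse")

with DAG(
    dag_id="etl_sketch",              # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow >= 2.4 spelling of schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three steps strictly in sequence.
    extract_task >> transform_task >> load_task
```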
… building and optimising data pipelines and distributed data systems. - Strong expertise in cloud platforms (AWS, GCP, or Azure) and modern data technologies (Spark, Kafka, Hadoop, or similar). - Proficiency in programming languages such as Python, Scala, or Java. - Experience working on AI/ML-driven platforms, with knowledge of …
City Of Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
Salary: £85,000 to £100,000 depending on experience. Location: Hybrid working in either Bristol, Manchester, or London. Are you a skilled Data Engineer who can deliver Data Platforms? Are you familiar with working on agile delivery-led projects? Data Engineers …
AWS Data Engineer. Salary: £50,000 - £95,000 + Bonus + Pension + Private Healthcare. Location: London/UK Wide, Hybrid working. * To be successfully appointed to this role, you must be eligible for Security Check (SC) and/or …
… or Django, Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow, or Argo. Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition, development of data sets, and improving data quality. Preparing data for predictive and prescriptive modelling. Hands-on coding experience …
Greater London, England, United Kingdom Hybrid / WFH Options
Phaidon International
… PyTorch, Scikit-learn). Knowledge of data visualization tools (e.g., Power BI, Tableau). Proficiency in Python, R, SQL, and big data frameworks (e.g., Hadoop, Spark). Hands-on experience with machine learning libraries such as TensorFlow, Scikit-learn, or PyTorch. Experience working with large, complex datasets and designing …