of the following: Python, SQL, Java. Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams. Deep knowledge of database technologies: distributed systems (e.g., Spark, Hadoop, EMR), RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL), NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j). Solid understanding of software engineering best practices - code reviews, testing frameworks, CI/CD, and …
ETL processes. Proficiency in Python. Experience with cloud platforms (AWS, Azure, or GCP). Knowledge of data modelling, warehousing, and optimisation. Familiarity with big data frameworks (e.g. Apache Spark, Hadoop). Understanding of data governance, security, and compliance best practices. Strong problem-solving skills and experience working in agile environments. Desirable: Experience with Docker/Kubernetes, streaming data (Kafka …
data modelling, warehousing, and performance optimisation. Proven experience with cloud platforms (AWS, Azure, or GCP) and their data services. Hands-on experience with big data frameworks (e.g. Apache Spark, Hadoop). Strong knowledge of data governance, security, and compliance. Ability to lead technical projects and mentor junior engineers. Excellent problem-solving skills and experience in agile environments. Desirable: Experience …
communication skills. We're excited if you have 7+ years of experience delivering multi-tier, highly scalable, distributed web applications. Experience with distributed computing frameworks: Hive/Hadoop, Apache Spark, Kafka, Airflow. Working with programming languages Python, Java, SQL. Experience building ETL (Extraction, Transformation and Loading) solutions using PySpark. Experience in SQL/NoSQL database design …
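For illustration only (not part of any listing above): a minimal sketch of the kind of PySpark ETL work several of these roles describe - extract raw CSV data, apply a simple transformation, and load the result as partitioned Parquet. The paths, column names, and filter logic are hypothetical placeholders, not details taken from any posting.

```python
# Minimal PySpark ETL sketch. All paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw CSV files with a header row, inferring the schema.
raw = (
    spark.read.option("header", True)
         .option("inferSchema", True)
         .csv("s3://example-bucket/raw/orders/")
)

# Transform: drop duplicate orders, keep valid rows, derive a date column for partitioning.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write the cleaned data as Parquet, partitioned by order date.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```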
London, South East, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
problem-solving skills, and the ability to think critically and analytically. Strong experience in documentation and data dictionaries. Knowledge of big data technologies and distributed computing frameworks such as Hadoop and Spark. Excellent communication skills to collaborate effectively with cross-functional teams and present insights to business stakeholders. Please can you send me a copy of your CV if …
City of London, London, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
problem-solving skills, and the ability to think critically and analytically. Strong experience in documentation and data dictionaries. Knowledge of big data technologies and distributed computing frameworks such as Hadoop and Spark. Excellent communication skills to collaborate effectively with cross-functional teams and present insights to business stakeholders. Please can you send me a copy of your CV if …
Crewe, Cheshire, United Kingdom Hybrid/Remote Options
Manchester Digital
AI model development. Expertise in Python, R, or Julia, with proficiency in pandas, NumPy, SciPy, scikit-learn, TensorFlow, or PyTorch. Experience with SQL, NoSQL, and big data technologies (Spark, Hadoop, Snowflake, Databricks, etc.). Strong background in statistical modelling, probability theory, and mathematical optimization. Experience deploying machine learning models to production (MLOps, Docker, Kubernetes, etc.). Familiarity with AWS …
Manchester, Lancashire, United Kingdom Hybrid/Remote Options
CHEP UK Ltd
such as Python, R, and SQL for data analysis and model development. Experience working with cloud computing platforms including AWS and Azure, and familiarity with distributed computing frameworks like Hadoop and Spark. Deep understanding of supply chain operations and the ability to apply data science methods to solve real-world business problems effectively. Strong foundational knowledge in mathematics and …
Flows, Conduct>It, Express>It, Metadata Hub, and PDL. Hands-on experience with SQL, Unix/Linux shell scripting, and data warehouse concepts. Familiarity with big data ecosystems (Hadoop, Hive, Spark) and cloud platforms (AWS, Azure, GCP) is a plus. Proven ability to troubleshoot complex ETL jobs and resolve performance issues. Experience working with large-scale datasets and …
in data modelling, data warehousing, and ETL development. Hands-on experience with Azure Data Factory, Azure Data Lake, and Azure SQL Database. Exposure to big data technologies such as Hadoop, Spark, and Databricks. Experience with Azure Synapse Analytics or Cosmos DB. Familiarity with data governance frameworks (e.g., GDPR, HIPAA). Experience implementing CI/CD pipelines using Azure DevOps …
Python) and other database applications; · Understanding of PC environment and related software, including Microsoft Office applications; · Knowledge of data engineering using data stores including MS SQL Server, Oracle, NoSQL, Hadoop or other distributed data technologies. Experience using data visualization tools is a plus; · Experienced with Excel to aggregate, model, and manage large data sets; · Familiar with Microsoft Power BI …
e.g. MS SQL, Oracle). NoSQL technologies skills (e.g. MongoDB, InfluxDB, Neo4j). Data exchange and processing skills (e.g. ETL, ESB, API). Development (e.g. Python) skills. Big data technologies knowledge (e.g. Hadoop stack). Knowledge in NLP (Natural Language Processing). Knowledge in OCR (Optical Character Recognition). Knowledge in Generative AI (Artificial Intelligence) would be advantageous. Experience in containerisation technologies (e.g. Docker) would …
utilising strong communication and stakeholder management skills when engaging with customers. Significant experience of coding in Python and Scala or Java. Experience with big data processing tools such as Hadoop or Spark. Cloud experience; GCP specifically in this case, including services such as Cloud Run, Cloud Functions, BigQuery, GCS, Secret Manager, Vertex AI, etc. Experience with Terraform. Prior experience …
commercial impact. Understanding of MLOps vs DevOps and broader software engineering standards. Cloud experience (any platform). Previous mentoring experience. Nice to have: Snowflake or Databricks; Spark, PySpark, Hadoop or similar big data tooling; BI exposure (Power BI, Tableau, etc.). Interview process: video call - high-level overview and initial discussion; in-person technical presentation - based on a provided example …
Apache Commons Suite & Maven, SQL databases such as Oracle MySQL, PostgreSQL, etc. Hands-on experience in utilizing Spring Framework (Core, MVC, Integration and Data). Experience with Big Data/Hadoop and NoSQL databases is a big plus. Experience with Play framework and Angular is a big plus. Business acumen: strong problem-solving abilities and capable of articulating specific technical topics …
Stevenage, Hertfordshire, South East, United Kingdom Hybrid/Remote Options
MBDA
e.g. MS SQL, Oracle...). NoSQL technologies skills (e.g. MongoDB, InfluxDB, Neo4j...). Data exchange and processing skills (e.g. ETL, ESB, API...). Development (e.g. Python) skills. Big data technologies knowledge (e.g. Hadoop stack). Knowledge in NLP (Natural Language Processing). Knowledge in OCR (Optical Character Recognition). Knowledge in Generative AI (Artificial Intelligence) would be advantageous. Experience in containerisation technologies (e.g. Docker) would …
Docker and orchestration tools like Kubernetes. Familiarity with Infrastructure as Code (IaC) tools such as Terraform or CloudFormation. Knowledge of data engineering and experience with big data technologies like Hadoop, Spark, or Kafka. Experience with CI/CD pipelines and automation, such as using Jenkins, GitLab, or CircleCI. As an equal opportunities employer, we welcome applications from individuals of …
and social benefits (e.g. UK pension scheme). What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache NiFi, Apache HBase, Apache Airflow, OpenSearch. Proficiency in cloud-native technologies such as containerization and Kubernetes. Strong knowledge of DevOps tools (Terraform, Ansible, ArgoCD, GitOps, etc.). Proficiency in …
Preferred Qualifications: Experience converting research studies into tangible real-world changes. Knowledge of AWS platforms such as S3, Glue, Athena, SageMaker. Experience with big data technologies such as AWS, Hadoop, Spark, Pig, Hive, etc. PhD in Industrial/Organizational Psychology or related field. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central …
across a global estate. Strong domain knowledge of Data & AI, including familiarity with cloud data and AI platforms (e.g. MS Azure, AWS, Google Cloud) and big data technologies (e.g. Hadoop, Spark). Excellent leadership, communication, and interpersonal skills. Ability to think strategically and solve complex business scenarios. Working knowledge of TOGAF, DAMA-DMBOK, or similar frameworks is a plus. We are …
Scripting * Reviewing other people's and AI-generated code * data engineering project. Some other highly valued skills may include: * AI prompt engineering for code generation * Docker/Kubernetes * Hadoop (HDFS, HBase, Kafka, Spark, etc.) LA International is an HMG-approved ICT Recruitment and Project Solutions Consultancy, operating globally from the largest single site in the UK as an …
overview. Should have hands-on experience in creating reports in Microsoft Excel. Capable of understanding ITIL terminology and various Service Management and Software Development Lifecycle terminologies. Knowledge of Hadoop and ITIL - alerting and monitoring, change management, problem management and incident management. Knowledge of service protection and change exception handling. Knowledge and understanding of the Banking domain and IT Infrastructure …