AWS data platforms and their respective data services. Solid understanding of data governance principles, including data quality, metadata management, and access control. Familiarity with big data technologies (e.g., Spark, Hadoop) and distributed computing. Proficiency in SQL and at least one programming language (e.g., Python, Java). 6-month contract, Inside IR35. Immediately available. London, up to 2 times a month More ❯
e.g. MS SQL, Oracle) NoSQL technologies skills (e.g. MongoDB, InfluxDB, Neo4J) Data exchange and processing skills (e.g. ETL, ESB, API) Development (e.g. Python) skills Big data technologies knowledge (e.g. Hadoop stack) Knowledge in NLP (Natural Language Processing) Knowledge in OCR (Optical Character Recognition) Knowledge in Generative AI (Artificial Intelligence) would be advantageous Experience in containerisation technologies (e.g. Docker) would More ❯
understanding of classical and modern ML techniques, A/B testing methodologies, and experiment design. Solid background in ranking, recommendation, and retrieval systems. Familiarity with large-scale data tools (Hadoop, BigQuery, Amazon EMR, etc.). Experience with BI tools and visualization platforms such as Tableau, Qlik, or MicroStrategy. Bonus: Experience with geospatial data and advanced analytics platforms. More ❯
a data science team, mentoring junior colleagues and driving technical direction. Experience working with Agile methodologies in a collaborative team setting. Extensive experience with big data tools, such as Hadoop and Spark, for managing and processing large-scale datasets. Extensive experience with cloud platforms, particularly Microsoft Azure, for building and deploying data science solutions. Why Join? You'll be More ❯
as well as programming languages such as Python, R, or similar. Strong experience with machine learning frameworks (e.g., TensorFlow, Scikit-learn) as well as familiarity with data technologies (e.g., Hadoop, Spark). About Vixio: Our mission is to empower businesses to efficiently manage and meet their regulatory obligations with our unique combination of human expertise and Regulatory Technology (RegTech More ❯
ETL/ELT processes. Proficiency in AWS data platforms and services. Solid understanding of data governance principles (data quality, metadata, access control). Familiarity with big data technologies (Spark, Hadoop) and distributed computing. Advanced SQL skills and proficiency in at least one programming language (Python, Java). Additional Requirements Immediate availability for an October start. Must be UK-based More ❯
bring: Significant experience in data engineering, including leading or mentoring technical teams. Deep understanding of cloud environments such as Azure, AWS, or Google Cloud Platform, and tools like Synapse, Hadoop, or Snowflake. Hands-on experience with programming languages such as Python, Java, or Scala. Strong knowledge of data architecture, modelling, and governance. A track record of delivering complex data More ❯
Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues as of 12 months ending December 2024 totaled $13.8 billion. Experience : Minimum 10+ Years Strong Knowledge in Hadoop, Kafka, SQL/NoSQL Specialization in designing and implementing large-scale data pipelines, ETL processes, and distributed systems Should be able to work independently with minimal help/guidance More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Lorien
Provide Environment Management representation in daily scrums, working groups, and ad-hoc meetings. Required Skillsets: Strong skills and experience with data technologies such as IBM DB2, Oracle, MongoDB, Hive, Hadoop, SQL, Informatica, and similar tech stacks. Attention to detail and strong ability to work independently and navigate complex target end state architecture (Tessa). Strong knowledge and experience with More ❯
Stevenage, Hertfordshire, England, United Kingdom Hybrid / WFH Options
MBDA
e.g. MS SQL, Oracle...) NoSQL technologies skills (e.g. MongoDB, InfluxDB, Neo4J...) Data exchange and processing skills (e.g. ETL, ESB, API...) Development (e.g. Python) skills Big data technologies knowledge (e.g. Hadoop stack) Knowledge in NLP (Natural Language Processing) Knowledge in OCR (Optical Character Recognition) Knowledge in Generative AI (Artificial Intelligence) would be advantageous Experience in containerisation technologies (e.g. Docker) would More ❯
technologies, e.g. AWS, Azure. Good knowledge of Linux, its development environments and tools Have experience in object-oriented methodologies, design patterns Understanding of Big Data technologies such as Hadoop, Spark Understanding of security implications and secure coding Proven grasp of software development lifecycle best practices, agile methods, and conventions, including Source Code Management, Continuous Integration Practical experience with More ❯
Apache Commons Suite & Maven, SQL databases such as Oracle, MySQL, PostgreSQL etc. Hands-on experience in utilizing Spring Framework (Core, MVC, Integration and Data) Experience with Big Data/Hadoop and NoSQL Database is a big plus Experience with Play framework, Angular is a big plus Business Acumen: Strong problem-solving abilities and capable of articulating specific technical topics More ❯
MongoDB, InfluxDB, Neo4J). Familiarity with data exchange and processing methods (e.g. ETL, ESB, API). Proficiency in development languages such as Python. Knowledge of big data technologies (e.g. Hadoop stack). Understanding of NLP (Natural Language Processing) and OCR (Optical Character Recognition). Knowledge of Generative AI would be advantageous. Experience in containerisation technologies (e.g. Docker) would be More ❯
Bolton, England, United Kingdom Hybrid / WFH Options
Anson McCade
with noSQL databases (e.g. MongoDB, InfluxDB, Neo4J). • Strong data exchange and processing expertise (ETL, ESB, API). • Programming experience, ideally in Python. • Understanding of big data technologies (e.g. Hadoop stack). • Knowledge of NLP and OCR methodologies. • Familiarity with Generative AI concepts and tools (advantageous). • Experience with containerisation (e.g. Docker) is desirable. • Background in industrial and/ More ❯
Stevenage, England, United Kingdom Hybrid / WFH Options
Anson McCade
with noSQL databases (e.g. MongoDB, InfluxDB, Neo4J). • Strong data exchange and processing expertise (ETL, ESB, API). • Programming experience, ideally in Python. • Understanding of big data technologies (e.g. Hadoop stack). • Knowledge of NLP and OCR methodologies. • Familiarity with Generative AI concepts and tools (advantageous). • Experience with containerisation (e.g. Docker) is desirable. • Background in industrial and/ More ❯
challenges. You are proficient in Python, with experience using PySpark and ML libraries such as scikit-learn, TensorFlow, or Keras. You are familiar with big data technologies (e.g., Hadoop, Spark), cloud platforms (AWS, GCP), and can effectively communicate technical concepts to non-technical stakeholders. Accommodation requests If you need assistance with any part of the application or recruiting More ❯
Docker and orchestration tools like Kubernetes. Familiarity with Infrastructure as Code (IaC) tools such as Terraform or CloudFormation. Knowledge of data engineering and experience with big data technologies like Hadoop, Spark, or Kafka. Experience with CI/CD pipelines and automation, such as using Jenkins, GitLab, or CircleCI. As an equal opportunities employer, we welcome applications from individuals of More ❯
Docker and orchestration tools like Kubernetes. Familiarity with Infrastructure as Code (IaC) tools such as Terraform or CloudFormation. Knowledge of data engineering and experience with big data technologies like Hadoop, Spark, or Kafka. Experience with CI/CD pipelines and automation, such as using Jenkins, GitLab, or CircleCI. ABOUT BUSINESS UNIT IBM Consulting is IBM's consulting and global More ❯
and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes Strong knowledge of DevOps tools (Terraform, Ansible, ArgoCD, GitOps, etc.) Proficiency in More ❯
at collaborating with cross-functional teams Strong background and experience in Data Ingestion, Transformation, Modeling and Performance Tuning Should have experience in designing and developing dashboards Strong Knowledge in Hadoop, Kafka, SQL/NoSQL Should have experience in creating a roadmap to improve platform Observability Experience in leading mid-scale teams with strong communication skills Experience in Machine Learning and More ❯
Stevenage, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Henderson Scott
with SQL & NoSQL databases (e.g. MS SQL, MongoDB, Neo4J) Python skills for scripting and automation ETL and data exchange experience (e.g. APIs, ESB tools) Knowledge of Big Data (e.g. Hadoop) Curiosity about AI, particularly NLP, OCR or Generative AI Bonus Points For Docker/containerisation experience Any previous work in industrial, aerospace or secure environments Exposure to tools like More ❯
technologies (e.g., MongoDB, InfluxDB, Neo4J). Experience with data exchange and processing (ETL, ESB, APIs). Proficiency in Python or similar programming languages. Familiarity with big data frameworks (e.g., Hadoop ecosystem). Desirable Skills: Understanding of NLP (Natural Language Processing) and OCR (Optical Character Recognition). Exposure to Generative AI concepts and tools. Experience with containerisation (e.g., Docker). More ❯