City of London, London, United Kingdom Hybrid / WFH Options
Robert Half
ETL/ELT processes. Experience with data integration tools (e.g., Apache Kafka, Talend, Informatica) and APIs. Familiarity with big data technologies (e.g., Hadoop, Spark) and real-time streaming. Expertise in cloud security, data governance, and compliance (e.g., GDPR, HIPAA). Strong SQL skills and proficiency in at …
infrastructure and its impact on data architecture. Data Technology Skills: A solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes, SQL, NoSQL databases is a …
engineer Proficiency in programming languages such as Python, Java, or Scala. Strong experience with relational databases (e.g., PostgreSQL, MySQL) and big data technologies (e.g., Hadoop, Spark). Experience with Elasticsearch and CloudSearch. Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform. Experience with …
in Python, SQL, and one or more of: R, Java, Scala. Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB). Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi). Bonus: experience with BI tools, API integrations, and graph databases. Apply now.
Experience in commodities markets or broader financial markets. Knowledge of quantitative modeling, risk management, or algorithmic trading. Familiarity with big data technologies like Kafka, Hadoop, Spark, or similar. Why Work With Us? Impactful Work: Directly influence the profitability of the business by building technology that drives trading decisions. Innovative …
Mars Wrigley Confectionery UK (SLO, WAL, ISB &PAD)
or platform management roles, with 5+ years in leadership positions. Expertise in modern data platforms (e.g., Azure, AWS, Google Cloud) and big data technologies (e.g., Spark, Kafka, Hadoop). Strong knowledge of data governance frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level architecture design and implementation. Hands-on …
Account team within the Services group (TTS) and is responsible for building a scalable, high-performance data platform on big data technologies (Spark, Scala, Hive, Hadoop) along with Kafka/Java and AI technologies to support core account data needs across multiple lines of business. As a tenant on the …
Python and SQL programming languages. Hands-on experience with cloud platforms like AWS, GCP, or Azure, and familiarity with big data technologies such as Hadoop or Spark. Experience working with relational and NoSQL databases. Strong knowledge of data structures, data modelling, and database schema design. Experience in supporting …
collaborate with technical teams to build scalable solutions. Expertise in programming languages such as Python, R, SQL, and familiarity with big data technologies like Hadoop, Spark, and cloud platforms (e.g., AWS, GCP, Azure). Exceptional problem-solving and analytical thinking. Strong leadership, team-building, and mentoring skills. Excellent communication …
with distributed systems as they pertain to data storage and computing. Experience with Redshift, Oracle, NoSQL, etc. Experience with big data technologies such as Hadoop, Hive, Spark, EMR. Bachelor's degree. PREFERRED QUALIFICATIONS: Experience working on and delivering end-to-end projects independently. Experience providing technical leadership and mentoring …
Engineering: Proficiency in developing and maintaining real-time data pipelines. Experience with ETL processes, Python, and SQL. Familiarity with big data technologies like Apache Hadoop and Apache Spark. MLOps & Deployment: Experience deploying and maintaining ML inference pipelines. Proficiency with Docker and Kubernetes. Familiarity with the AWS cloud platform. The Perfect …
infrastructure-as-code (e.g., Terraform, CloudFormation), CI/CD pipelines, and monitoring (e.g., CloudWatch, Datadog). Familiarity with big data technologies like Apache Spark, Hadoop, or similar. ETL/ELT tools and creating common data sets across on-prem (IBM DataStage ETL) and cloud data stores. Leadership & Strategy: Lead Data …
Keycloak. Experience with web frameworks such as FastAPI, Spring Boot, and Express. Demonstrated hands-on experience working with Hadoop, Apache Spark, and their related ecosystems. A candidate must be a US Citizen and requires an active/current TS/SCI with Polygraph …
and virtual environments. Experience with network traffic inspection tools (e.g., Suricata, Arkime, Zeek). Knowledge of big data technologies (e.g., Elasticsearch, Apache Hadoop, Spark, Kafka). Relevant Certifications: Certifications in Cloud Engineering (e.g., Amazon Web Services (AWS) Solutions Architect - Associate; Microsoft Certified: Azure Fundamentals; Google Associate …
with unstructured datasets. Engineering best practices and standards. Experience with data warehouse software (e.g., Snowflake, Google BigQuery, Amazon Redshift). Experience with data tools: Hadoop, Spark, Kafka, etc. Code versioning (GitHub integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing …
Service Catalog, CloudFormation, Lake Formation, SNS, SQS, EventBridge. Language & Scripting: Python and Spark. ETL: dbt. Good to Have: Airflow, Snowflake, Big Data (Hadoop), and Teradata. Responsibilities: Serve as the primary point of contact for all AWS-related data initiatives and projects. Responsible for leading a team of …
London, South East England, United Kingdom Hybrid / WFH Options
Mastek
required for this role. Knowledge of cloud-based database solutions (AWS, Azure, Google Cloud) is an advantage. Preferred Qualifications: Experience with big data technologies (Hadoop, Spark, Snowflake) is a plus. Certification in database technologies or testing methodologies. Knowledge of scripting languages like Python or Shell scripting for test automation. …
AWS Databases: MSSQL, PostgreSQL, MySQL, NoSQL. Cloud: AWS (preferred), with working knowledge of cloud-based data solutions. Nice to Have: Experience with graph databases, Hadoop/Spark, or enterprise data lake environments. What You’ll Bring: Strong foundation in computer science principles (data structures, algorithms, etc.). Experience building enterprise …