wrangling, modelling, feature engineering and deployment. Equally relevant are programming knowledge and a gift for coding in R, Python, SQL, SAS or Spark. You'll also need the ability to deliver projects, collaborate with operational teams and explain technical concepts to non-specialists. What more »
MLOps experience. Experience with cloud computing platforms such as AWS, Azure, or GCP (Google Cloud Platform). Familiarity with big data technologies such as Apache Hadoop, Spark, or Kafka. Experience deploying machine learning models in production environments. Contributions to open-source machine learning projects or research publications in more »
product experimentation, Causal AI, and advanced statistical techniques. Deep knowledge of data science tools (e.g., scikit-learn, TensorFlow, PyTorch) and big data technologies (e.g., Spark). Proficiency in Python for data manipulation, model building, and scripting. Strong communication skills to present findings to both technical and non-technical audiences. more »
ingestion pipelines. Requirements: Proven experience working with Python, Java or C#. Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive. Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation and driving continual more »
in the architecture of data workloads in EMR clusters. Designs system architecture to integrate easily with other AWS services. Strong background in technologies like Spark, Hive and PySpark. Key Experience: Experience of managing the full life cycle of a data platform solution in AWS. Experience of leading AWS cloud more »
Manchester, North West, United Kingdom Hybrid / WFH Options
N Brown Group
and practices and tools like Jira and Confluence. What technical skills you will have: Experience with general Cloud products (Cloud SQL, BigQuery, Redshift, Snowflake, Apache Beam, Spark) or similar products. Experience with open-source data-stack tools such as Airflow, Airbyte, dbt, Kafka etc. Awareness of data visualisation more »
analytics. The Client would also like to see experience of managing and leading a team of Data Scientists. Should have experience of Scala/Spark and Hadoop. Initially this is a 3-month contract assignment in Canary Wharf, with the likelihood that it will go on beyond that point. Location more »
network programming. Experience in building and enhancing compute, storage, and data platforms with exposure to open-source products like Kubernetes, Knative, Ceph, Rook, Cassandra, Spark, NATS and the like. Hands-on with infrastructure-as-code tools and automation, such as Terraform, Ansible, or Helm. The role: Tech Lead responsible more »
or Rust. Experience in building and enhancing compute, storage, and data platforms with exposure to open-source products like Kubernetes, Knative, Ceph, Rook, Cassandra, Spark, NATS etc. Hands-on experience with IaC tools and automation, such as Terraform, Ansible, or Helm. Active engagement or contributions to the open-source more »
capital while also giving them the flexibility to decide when and how they use it. LIQUiDITY is backed by top global financial institutions, including Spark Capital, MUFG, Apollo, and Meitav Dash. We are looking for Credit Directors who are ready to build the next generation of machine-learning-powered more »
Responsibilities: As a pivotal member of our rapidly expanding business, your role involves: Designing and Maintaining Data Architectures and Pipelines: Create and optimize robust data architectures and pipelines. Assemble intricate data sets that align with both functional and non-functional more »
of scalable backend services using Go, Rust, and other relevant technologies. Spearhead the adoption of cutting-edge technologies like Kubernetes, Knative, Rook, Cassandra, and Spark to enhance our product offerings and infrastructure. Set the technical direction for the team, ensuring alignment with the company’s strategic goals and industry more »
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
Senitor Associates Limited
within Software Engineering to explore new technologies. Contribute to a team culture that prioritizes diversity, equity, inclusion, and respect. Required Skills: Expertise in Java, Spark, SQL, relational and NoSQL databases, with a focus on performance optimization. A thorough understanding of the Software Development Life Cycle and agile methodologies, including CI more »
. Preferably experience in setting up data platforms and setting standards, not just pipelines. Preferably experience in a distributed data processing environment/framework (e.g. Spark or Flink). Technologies: Java, Kotlin, Python (candidate is not expected to be proficient in every one, but should be open to learning the others) Kubernetes … Apache Pulsar GCP, BigQuery, BigTable, Spark (Note: we are at an early stage; these might change if the team decides there is a better fit) #ICBengineering #ICBcareers ABOUT US J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's more »
on Kubernetes with Helm/Terraform. Good to have: prior experience dealing with streaming and batch compute frameworks like Spring Kafka, Kafka Streams, Flink, Spark Streaming, or Spark. Experience with large-scale computing platforms, such as Hadoop, Hive, Spark, NoSQL stores. Experience with developing large-scale data pipelines more »
an easy and developer-friendly platform to create, manage and monitor their own Kafka infrastructure. Help drive our stream processing platform, whether it be Apache Flink, Spark, or another relevant technology. Collaborate with cross-functional teams to design, develop, and implement scalable and reliable solutions. Troubleshoot and resolve more »
Proficiency in working with large datasets and databases (e.g., SQL, NoSQL). Hands-on experience with data processing frameworks/libraries (e.g., Pandas, NumPy, Spark). Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Preferred Qualifications: Experience with distributed computing platforms (e.g., Hadoop, Apache more »
PACKAGED GOODS (RCG) Extensive experience working with large data sets, with hands-on technology skills to design and build robust Big Data solutions using the Spark/PySpark framework and industry-standard frameworks like Databricks, Azure DevOps and other tools/technologies on the Azure Cloud Platform. Key Roles and Responsibilities: Understand the … business development and ensuring high levels of client satisfaction during delivery. Skills: Must have strong hands-on technical skills in Python, PySpark, Azure Databricks, Spark, ETL; Cloud (Azure preferred). Good to have knowledge of ADF, CI/CD more »
you'll spearhead backend and data engineering and mentor team members. Tech stack: Athena; Python, Flask, Redis, Postgres, React, Plotly, Docker, SQL, Athena & EMR Spark, ECS and Temporal. This is a 50/50 split between tech and leadership. Your background: 8+ years coding experience, 4+ years need to … cloud infrastructure (AWS, Azure, GCP). Start-up experience is preferred but not essential. The extras: Experience working on AI-based products. Distributed computing experience (Spark, MPI, etc.) Experience orchestrating workflows, particularly within distributed system environments. Knowledge of MLOps principles and practices, especially in implementing them within production settings. What more »
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Net2Source Inc
/Onsite: Hybrid. Extensive experience working with large data sets, with hands-on technology skills to design and build robust Big Data solutions using the Spark/PySpark framework and industry-standard frameworks like Databricks, Azure DevOps and other tools/technologies on the Azure Cloud Platform. Key Roles and Responsibilities: Understand the … business development and ensuring high levels of client satisfaction during delivery. Skills: Must have strong hands-on technical skills in Python, PySpark, Azure Databricks, Spark, ETL; Cloud (Azure preferred). Good to have knowledge of ADF, CI/CD more »
EC3V 1LT Working Arrangements: Hybrid, 2-3 days p/w in office Salary: £75,000-£85,000 Industry: Insurance Tech Stack: Python, SQL, Spark, Azure 👩🏻💻 Great opportunity for a talented Engineer (Python, SQL, Spark, Azure) to join a market-leading cyber insurance company. The Company 🚀 Tech-driven … business around the world to provide unique, competitive and secure insurance packages. The Role ✨ They are seeking a highly pragmatic Engineer (Python, SQL, Spark, Azure) to help build out their new data platform. You (Python, SQL, Spark, Azure) will work closely with Architects, Data Scientists and Software Engineers … to build out a greenfield platform that can be used to provide business-critical insights. The ideal candidate (Python, SQL, Spark, Azure) will be comfortable working with non-technical colleagues to build a platform that benefits the entire business. Desired Skills ⚙️ Python (SQL Server, Azure SQL) Databricks Airflow, Kafka more »