ML/DL libraries such as TensorFlow, PyTorch, or JAX. Knowledge of data analytics concepts, including data warehouse technical architectures, ETL, and reporting/analytics tools and environments (such as Apache Beam, Hadoop, Spark, Pig, Hive, MapReduce, Flume). Customer-facing experience across discovery, assessment, execution, and operations. Demonstrated excellent communication, presentation, and problem-solving skills. Experience in project …
London, England, United Kingdom Hybrid / WFH Options
Qh4 Consulting
Familiarity with legacy systems (e.g. C#) and willingness to interact with them where necessary. Exposure to Cloudera Data Platform or similar big data environments. Experience with tools such as Apache Hive, NiFi, Airflow, Azure Blob Storage, and RabbitMQ. Background in investment management or broader financial services, or a strong willingness to learn the domain. The Role Offers: The …
security. Strong Cloudera experience, with expertise in Cloudera Data Platform (CDP), Cloudera Manager, and Cloudera Navigator. Strong knowledge of the Hadoop ecosystem and related technologies such as HDFS, YARN, Hive, Impala, Spark, and Kafka. Strong AWS services/architecture experience, with hands-on expertise in cloud-based deployments (AWS, Azure, or GCP). Strong big data experience, including …
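As a small illustration of the Spark-plus-Kafka experience this listing names, here is a minimal Structured Streaming read from Kafka in PySpark. It is a sketch only: the broker address and topic name are hypothetical, and it assumes the Spark-Kafka connector is on the cluster's classpath.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-sketch").getOrCreate()

    # Subscribe to a Kafka topic as a streaming DataFrame; the broker
    # and topic names here are hypothetical placeholders.
    stream = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "example-topic")
        .load()
    )

    # Echo raw records to the console; runs until interrupted.
    query = stream.writeStream.format("console").start()
    query.awaitTermination()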
tackle business problems. Comfort with rapid prototyping and disciplined software development processes. Experience with Python, ML libraries (e.g. spaCy, NumPy, SciPy, Transformers, etc.), data tools and technologies (Spark, Hadoop, Hive, Redshift, SQL), and toolkits for ML and deep learning (Spark ML, TensorFlow, Keras). Demonstrated ability to work on multi-disciplinary teams with diverse skillsets. Deploying machine learning models and …
technologies (PostgreSQL, MySQL, RDS). US citizenship and an active TS/SCI with Polygraph security clearance required. Desired Experience: Experience with distributed databases and streaming tools (Hadoop, Spark, YARN, Hive, Trino). Experience with Remote Desktop Protocol (RDP) technologies. Experience with data access control, specifically Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC). Familiarity with data science …
We're excited if you have 4+ years of relevant work experience in Analytics, Business Intelligence, or Technical Operations. Mastery of SQL, Python, and ETL using big data tools (Hive/Presto, Redshift). Previous experience with web frameworks for Python such as Django/Flask is a plus. Experience writing data pipelines using Airflow. Fluency in Looker and/…
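The Airflow requirement above is concrete enough to illustrate. A minimal sketch of such a pipeline, assuming Airflow 2.4+ (for the schedule argument); the DAG id, schedule, and task body are hypothetical placeholders, not details from the posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_and_load():
        # Placeholder for the real work, e.g. a Presto/Redshift query
        # followed by a transform-and-load step.
        print("extract from warehouse, transform, load downstream")


    with DAG(
        dag_id="example_daily_etl",        # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)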
Sales acumen, identifying and managing sales opportunities at client engagements. An understanding of database technologies, e.g. SQL, ETL, NoSQL, DW, and big data technologies, e.g. Hadoop, Mahout, Pig, Hive, etc.; an understanding of statistical modelling techniques, e.g. classification and regression techniques, neural networks, Markov chains, etc.; an understanding of cloud technologies, e.g. AWS, GCP, or Azure. A track …
analytics. Practical experience in coding languages, e.g. Python, R, Scala, etc. (Python preferred). Proficiency in database technologies, e.g. SQL, ETL, NoSQL, DW, and big data technologies, e.g. PySpark, Hive, etc. Experience working with structured and also unstructured data, e.g. text, PDFs, JPGs, call recordings, video, etc. Knowledge of machine learning modelling techniques and how to fine-tune those …
RELOCATION ASSISTANCE: No relocation assistance available. CLEARANCE TYPE: Secret. TRAVEL: Yes, 10% of the time. Description: At Northrop Grumman, our employees have incredible opportunities to work on revolutionary systems that impact people's lives around the world today, and for …
London, England, United Kingdom Hybrid / WFH Options
Axiom Software Solutions Limited
basic scripts; Pydantic experience would be a bonus), SQL, PySpark, Delta Lake, Bash (both CLI usage and scripting), Git, Markdown, Scala (bonus, not compulsory), Azure SQL Server as a Hive Metastore (bonus). Technologies: Azure Databricks, Apache Spark, Delta Tables, data processing with Python, Power BI (integration/data ingestion), JIRA.
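The Databricks/PySpark/Delta stack listed above lends itself to a short illustration. A minimal sketch, assuming a Spark session with Delta Lake support (available by default on Azure Databricks, otherwise via the delta-spark package); the data and output path are hypothetical.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("delta-sketch").getOrCreate()

    # Toy data standing in for a real ingestion source.
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])

    # Write as a Delta table; the path is a placeholder. On Azure Databricks
    # the "delta" format works out of the box; elsewhere it needs the
    # delta-spark package installed and configured.
    df.write.format("delta").mode("overwrite").save("/tmp/example_delta_table")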
relevant experience in several areas of Data Mining, Classical Machine Learning, Deep Learning, NLP, and Computer Vision. Experience with large-scale/big data technology, such as Hadoop, Spark, Hive, Impala, PrestoDB. Hands-on capability developing ML models using open-source frameworks in Python and R and applying them to real client use cases. Proficient in one of the deep learning stacks such as PyTorch or TensorFlow. Working knowledge of parallelisation and async paradigms in Python, Spark, Dask, Ray. An awareness of and interest in economic, financial, and general business concepts and terminology. Excellent written and verbal command of English. Strong problem-solving, analytical, and quantitative skills. A professional attitude and service orientation, with the ability to work …
continuous improvement. Work alongside other engineers on the team to elevate technology and consistently apply best practices. Qualifications for Software Engineer: Hands-on experience working with technologies like Hadoop, Hive, Pig, Oozie, MapReduce, Spark, Sqoop, Kafka, Flume, etc. Strong DevOps focus and experience building and deploying infrastructure with cloud deployment technologies like Ansible, Chef, Puppet, etc. Experience with …
Strong background in data structures and algorithms. Experience working with at least one machine learning framework (TensorFlow, PyTorch, XGBoost, etc.). Experience working with big data technologies (Spark, Kafka, Hive, Databricks, feature stores, etc.). Experience working with containerisation, deployment, and orchestration technologies (Docker, Kubernetes, Airflow, CI/CD pipelines, etc.). Experience with automated testing, including unit, functional, and …
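To make the ML-framework requirement concrete, here is a minimal single training step in PyTorch, one of the frameworks named above; the model, data, and hyperparameters are toy placeholders, not anything specified in the listing.

    import torch
    from torch import nn

    model = nn.Linear(4, 1)                                   # toy model
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()

    x = torch.randn(8, 4)   # fake input batch
    y = torch.randn(8, 1)   # fake targets

    # One standard training step: forward, loss, backward, update.
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    print(f"loss: {loss.item():.4f}")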
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
Strong knowledge of LLM algorithms and training techniques. Experience deploying models in production environments. Nice to have: experience in GenAI/LLMs; familiarity with distributed computing tools (Hadoop, Hive, Spark); background in banking, risk management, or capital markets. Why Join? This is a unique opportunity to work at the forefront of AI innovation in financial services …
SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) - Experience with one or more scripting languages (e.g., Python, KornShell) PREFERRED QUALIFICATIONS - Experience with big data technologies such as Hadoop, Hive, Spark, EMR - Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you …
Demonstrated experience in microservice architecture using Spring Framework, Spring Boot, Tomcat, AWS, Docker containers, or Kubernetes solutions. 5. Demonstrated experience in big data solutions (Hadoop ecosystem, MapReduce, Pig, Hive, DataStax, etc.) in support of a screening and vetting mission.
SCI with Polygraph security clearance required. Desired Qualifications: Familiarity with AWS CDK, Terraform, Packer. Design Concepts: REST APIs. Programming Languages: JavaScript/NodeJS. Processing Tools: Presto/Trino, MapReduce, Hive. The Swift Group and Subsidiaries are an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual …
problem-solving and communication skills. • Strong understanding of systems and infrastructure concepts. • Nice-to-have experience with: MLOps collaboration and robust deployment pipelines; distributed systems like Cloudera, Hive, Spark, or Hadoop; document indexing/search platforms and GPU-based workloads; data tools such as R, SQL/NoSQL, EMR, and Gurobi; visualization libraries: Plotly …