Azure SQL Database, HDInsight, and Azure Machine Learning Studio. Data Storage & Databases: SQL & NoSQL Databases: Experience with databases like PostgreSQL, MySQL, MongoDB, and Cassandra. Big Data Ecosystems: Hadoop, Spark, Hive, and HBase. Data Integration & ETL: Data Pipelining Tools: Apache NiFi, Apache Kafka, and Apache Flink. ETL Tools: AWS Glue, Azure Data Factory, Talend, and Apache …
data architectures, Lambda-type architectures - Proficiency in writing and optimizing SQL - Knowledge of AWS services including S3, Redshift, EMR, Kinesis and RDS - Experience with open-source data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) - Ability to write code in Python, Ruby, Scala or other platform-related big data languages - Knowledge of professional software engineering practices & best practices for the …
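To make the SQL-plus-Spark requirement above concrete, here is a minimal PySpark sketch; the S3 paths, table name, and columns are hypothetical, and it assumes a Spark runtime (for example on EMR) with S3 access already configured.

    from pyspark.sql import SparkSession

    # Hypothetical example: aggregate daily order totals from Parquet files on S3.
    spark = SparkSession.builder.appName("orders-daily-rollup").getOrCreate()

    orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical path
    orders.createOrReplaceTempView("orders")

    # Filter early and aggregate in SQL so Spark prunes data before the shuffle.
    daily = spark.sql("""
        SELECT order_date, country, SUM(amount) AS total_amount, COUNT(*) AS order_count
        FROM orders
        WHERE order_date >= '2024-01-01'
        GROUP BY order_date, country
    """)

    daily.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-bucket/curated/orders_daily/"  # hypothetical path
    )

The same pattern ports to Scala or SQL-only engines; the key optimization idea is pushing the filter and aggregation into the query rather than collecting raw data.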
and well-tested solutions to automate data ingestion, transformation, and orchestration across systems. Own data operations infrastructure: Manage and optimise key data infrastructure components within AWS, including Amazon Redshift, Apache Airflow for workflow orchestration and other analytical tools. You will be responsible for ensuring the performance, reliability, and scalability of these systems to meet the growing demands of data … pipelines, data warehouses, and leveraging AWS data services. Strong proficiency in DataOps methodologies and tools, including experience with CI/CD pipelines, containerized applications, and workflow orchestration using Apache Airflow. Familiarity with ETL frameworks, with bonus experience in big data processing (Spark, Hive, Trino) and data streaming. Proven track record - You've made a demonstrable impact …
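As a rough illustration of the Airflow orchestration mentioned above, a minimal daily DAG might look like the sketch below; the DAG id, task names, and extract/load functions are placeholders, and it assumes Airflow 2.x.

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull data from a source system (e.g. an API or an S3 prefix).
        print("extracting")

    def load_to_redshift():
        # Placeholder: copy staged data into Redshift, e.g. via a COPY statement.
        print("loading")

    with DAG(
        dag_id="example_daily_ingest",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load_to_redshift", python_callable=load_to_redshift)
        extract_task >> load_task

In practice the Python callables would be replaced by provider operators (Redshift, S3, dbt, etc.), but the dependency wiring with >> stays the same.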
years of experience working on mission-critical data pipelines and ETL systems. 5+ years of hands-on experience with big data technology, systems and tools such as AWS, Hadoop, Hive, and Snowflake. Expertise with common software engineering languages such as Python, Scala, Java, SQL and a proven ability to learn new programming languages. Experience with workflow orchestration tools such … certification/s. Strong data visualization skills to convey information and results clearly. Experience with DevOps tools such as Docker, Kubernetes, Jenkins, etc. Experience with event messaging frameworks like Apache Kafka. The hiring range for this position in Santa Monica, California is $136,038 to $182,490 per year, in Glendale, California is $136,038 to $182,490 per …
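For the event-messaging requirement, a minimal producer sketch is shown below; it assumes the kafka-python client, and the broker address, topic name, and event payload are placeholders.

    import json
    from kafka import KafkaProducer  # assumes the kafka-python package is installed

    # Hypothetical broker and topic; events are serialized as JSON bytes.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    producer.send("orders", {"order_id": 123, "amount": 42.50})
    producer.flush()   # block until buffered messages are delivered
    producer.close()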
SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) - Experience with one or more scripting languages (e.g., Python, KornShell) PREFERRED QUALIFICATIONS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience with any ETL tool such as Informatica, ODI, SSIS, BODI, Datastage, etc. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you …
S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions - Experience building large-scale, high-throughput, 24x7 data systems - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience providing technical leadership and mentoring other engineers on data engineering best practices. Our inclusive culture empowers Amazonians to deliver the best results for our customers. …
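A hedged sketch of writing to a Kinesis stream with boto3 follows; the stream name and record shape are hypothetical, and AWS credentials and region are assumed to come from the standard configuration chain.

    import json
    import boto3

    # Hypothetical clickstream event pushed to a hypothetical Kinesis data stream.
    kinesis = boto3.client("kinesis")

    event = {"user_id": "u-123", "action": "click", "ts": "2024-01-01T00:00:00Z"}
    kinesis.put_record(
        StreamName="example-clickstream",          # hypothetical stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event["user_id"],             # controls shard routing
    )

For higher throughput, put_records (the batched variant) is normally preferred; Firehose offers a similar put_record call when the target is S3 or Redshift delivery rather than a raw stream.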
Sales acumen, identifying and managing sales opportunities at client engagements; an understanding of database technologies e.g. SQL, ETL, No-SQL, DW, and big data technologies e.g. Hadoop, Mahout, Pig, Hive, etc.; an understanding of statistical modelling techniques e.g. classification and regression techniques, Neural Networks, Markov chains, etc.; an understanding of cloud technologies e.g. AWS, GCP or Azure; a track …
analytics; practical experience in coding languages e.g. Python, R, Scala, etc. (Python preferred); proficiency in database technologies e.g. SQL, ETL, No-SQL, DW, and big data technologies e.g. PySpark, Hive, etc.; experience working with structured and unstructured data e.g. text, PDFs, JPGs, call recordings, video, etc.; knowledge of machine learning modelling techniques and how to fine-tune those …
East London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
Strong knowledge of LLM algorithms and training techniques. Experience deploying models in production environments. Nice to have: experience in GenAI/LLMs; familiarity with distributed computing tools (Hadoop, Hive, Spark); background in banking, risk management, or capital markets. Why Join? This is a unique opportunity to work at the forefront of AI innovation in financial services.
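As a minimal illustration of serving a trained language model behind production code, the sketch below wraps a Hugging Face pipeline; the model name is just a small public example and is not tied to this role.

    from transformers import pipeline  # assumes the transformers package is installed

    # Load a small, publicly available model purely for illustration.
    generator = pipeline("text-generation", model="distilgpt2")

    result = generator("Credit risk models should", max_new_tokens=30, num_return_sequences=1)
    print(result[0]["generated_text"])

In a real deployment this call would typically sit behind a web service with batching, monitoring, and model-version pinning rather than being invoked ad hoc.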
SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) - Experience with one or more scripting languages (e.g., Python, KornShell) PREFERRED QUALIFICATIONS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience with any ETL tool such as Informatica, ODI, SSIS, BODI, Datastage, etc. - Knowledge of cloud services such as AWS or equivalent. Our inclusive culture empowers Amazonians to …
your team or organization - Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience hiring, developing and promoting engineering talent - Experience communicating to senior management and customers verbally and in writing PREFERRED QUALIFICATIONS - Experience with AWS Tools and Technologies (Redshift …
We're excited if you have 4+ years of relevant work experience in Analytics, Business Intelligence, or Technical Operations; mastery of SQL, Python, and ETL using big data tools (Hive/Presto, Redshift); previous experience with web frameworks for Python such as Django/Flask is a plus; experience writing data pipelines using Airflow; fluency in Looker and …
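A small sketch of the Presto/Trino side of such an ETL step is shown below; it assumes the trino Python client, and the coordinator host, catalog, schema, and table are placeholders.

    import trino  # assumes the trino Python client package is installed

    # Hypothetical coordinator and Hive-backed catalog.
    conn = trino.dbapi.connect(
        host="presto.example.internal",
        port=8080,
        user="etl_user",
        catalog="hive",
        schema="analytics",
    )

    cur = conn.cursor()
    cur.execute("""
        SELECT event_date, COUNT(*) AS events
        FROM web_events
        WHERE event_date >= DATE '2024-01-01'
        GROUP BY event_date
        ORDER BY event_date
    """)
    for event_date, events in cur.fetchall():
        print(event_date, events)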
Python (preferred) and C++; experience working with structured and unstructured data (e.g., text, PDFs, images, call recordings, video); proficiency in database and big data technologies including SQL, NoSQL, PySpark, Hive, etc. Cloud & AI Ecosystems: experience working with cloud platforms such as AWS, GCP, or Azure; understanding of API integration and deploying solutions in cloud environments; familiarity or hands-on …
Experience starting the front-end buildout from scratch by coordinating across multiple business and technology groups; experience building complex single-page applications using Ab Initio/Hadoop/Hive/Kafka/Oracle and modern MOM technologies; experience with Linux/Unix platforms; experience with SCMs like Git and tools like JIRA; familiar with the …
permissions - Experience with non-relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases) - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the …
e.g. R, SAS, or MATLAB) - Experience with statistical models, e.g. multinomial logistic regression - Experience in data applications using large-scale distributed systems (e.g., EMR, Spark, Elasticsearch, Hadoop, Pig, and Hive) - Experience working with data engineers and business intelligence engineers collaboratively - Demonstrated expertise in a wide range of ML techniques PREFERRED QUALIFICATIONS - Experience as a leader and mentor on a …
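As a small, self-contained illustration of the multinomial logistic regression mentioned above (using scikit-learn and a toy dataset rather than anything specific to this listing):

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Three-class toy dataset; with the lbfgs solver, LogisticRegression fits a
    # multinomial (softmax) model for multi-class targets.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    model = LogisticRegression(solver="lbfgs", max_iter=500)
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))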
Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues of $13+ billion. Job Description: Spark - Must Have; Scala - Must Have; Hive/SQL - Must Have. Scala/Spark:
• Good big data resource with the below skillset: Spark, Scala, Hive/HDFS/HQL
• Linux-based Hadoop ecosystem … HDFS, Impala, Hive, HBase, etc.)
• Experience in big data technologies; real-time data processing platform (Spark Streaming) experience would be an advantage (see the PySpark sketch after this list)
• Consistently demonstrates clear and concise written and verbal communication
• A history of delivering against agreed objectives
• Ability to multi-task and work under pressure
• Demonstrated problem solving and decision-making skills
• Excellent analytical and process-based skills …
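A PySpark sketch touching the Hive and streaming points above is shown below (the listing emphasises Scala, but the same calls exist in the Scala API); it uses Spark's built-in rate source so it runs without a Kafka dependency, and the Hive database and table names are placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("hive-and-streaming-demo")
             .enableHiveSupport()          # lets spark.sql read Hive tables
             .getOrCreate())

    # Batch side: query a Hive table (hypothetical database/table names).
    daily = spark.sql("SELECT trade_date, COUNT(*) AS trades FROM risk_db.trades GROUP BY trade_date")
    daily.show()

    # Streaming side: the built-in rate source stands in for a real feed (e.g. Kafka),
    # aggregated over 1-minute event-time windows and printed to the console.
    stream = (spark.readStream.format("rate").option("rowsPerSecond", 5).load()
              .groupBy(F.window("timestamp", "1 minute")).count())

    query = stream.writeStream.outputMode("complete").format("console").start()
    query.awaitTermination(30)   # run briefly for the demo
    query.stop()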
Business Research Analyst - II, RBS Tech: As a Research Analyst, you'll collaborate with experts to develop cutting-edge ML and Gen AI/LLM solutions for business needs. You'll drive product pilots, demonstrating innovative thinking and customer focus.
As a Research Analyst, you'll collaborate with experts to develop cutting-edge ML solutions for business needs. You'll drive product pilots, demonstrating innovative thinking and customer focus. You'll build scalable solutions, write high-quality code, and develop …
Business Research Analyst - II, RBS Size/Fit: As a Research Analyst II (RA), you'll collaborate with experts to develop GenAI, LLM and ML solutions for business needs. You'll drive product pilots, demonstrating innovative thinking and customer focus.
Location: London. Other locations: Primary Location Only. Date: Jul 3, 2025. Requisition ID: L4 Tech Lead - Innovation Hive - Open to Flexible Working. About our team: We are responsible for the innovation and development of human-centric products within EY. Our approach is to prototype ideas, get feedback from our customers …