relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases) • Experience with big data technologies such as Hadoop, Hive, Spark, EMR, and Snowflake, and with Data Mesh principles • Team player • Proactive and resilient • A passion for social good Our Mission Statement: We are an …
as Python, R, or Java. Solid understanding of data manipulation, modeling, and analysis techniques. Familiarity with big data technologies and distributed computing frameworks (e.g., Hadoop, Spark) is a plus. Experience with cloud platforms and services (e.g., AWS, Azure, GCP) for machine learning model deployment and management is advantageous. Excellent …
in our relationships. Key Skills: Strong Snowflake development experience Hands-on experience with Data Modelling/Data Migration and related tools Sound knowledge of Hadoop architecture Strong knowledge of SQL Experience with Jenkins and Astronomer Why Narwal? Opportunity to shape the future of a rapidly growing company. Competitive salary and …
Principal Data Engineer (SVP) - AI We’re searching for a Principal Data Engineer/Scientist with experience in generative AI to join a business that provides crucial services, often focusing on saving programs and businesses from the brink of collapse.
and classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau …)
within a typical retail trading environment is key. Experience required: A background in leveraging hands-on skills using tools such as Python, R, Spark, Hadoop, SQL and cloud-based platforms such as GCP, Azure and AWS to manipulate and analyse various data sets in large volumes Background in data …
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
and NoSQL databases • Programming languages and frameworks such as Python or Spark • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits: • Base Salary: £45,000 - £75,000 (DoE) • Discretionary Bonus: Circa 10% per annum • DV Bonus: Circa £5,000 • Flex Fund: £5,000 • Health: Private …
proofs of concept. Develop monitoring strategies for infrastructure, platforms and applications, aligning with enterprise strategy and overall industry trends. Big Data technologies such as Hadoop, Spark, Kafka, etc. Hadoop: 5+ years; Kafka: 3+ years; Spark: 4+ years; PySpark: 3+ years
data solutions (AWS, Azure or GCP), engineering languages including Python, SQL, Java, and pipeline management tools e.g., Apache Airflow. Familiarity with big data technologies such as Hadoop or Spark. If this opportunity is of interest, or you know anyone who would be interested in this role, please send your CV and …
solving skills and creativity. Google Cloud Professional Cloud Architect or Professional Cloud Developer certification Very desirable to have hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink) Proven expertise in designing and constructing data lakes and data …
Data and Artificial Intelligence, Senior Vice President We are searching for a Senior Vice President of Data and Artificial Intelligence - someone with hands-on experience designing AI solutions to solve complex business problems. Your new role is a leadership position …
rulemaking. What you'll need to succeed Extensive Business and Data Analysis experience Strong SQL and Excel skills Data Visualisation experience Experience with Python, Hadoop or Big Data What you'll get in return An exciting opportunity to join an international organisation working with a major financial services organisation.
in Apache Iceberg, Spark, Big Data 3+ years of Big Data project development experience Hands-on experience in working areas like Apache Iceberg & Spark, Hadoop, Hive Must have knowledge of a database, e.g. Postgres, Oracle, or MongoDB Excellent knowledge of SDLC processes and DevOps (Jira, Jenkins pipelines) Working in Agile …
would be an advantage Data visualization – tools like Tableau Master data management (MDM) – concepts and expertise in tools like Informatica & Talend MDM Big data – Hadoop ecosystem, distributions like Cloudera/Hortonworks, Pig and Hive Data processing frameworks – Spark & Spark Streaming Hands-on experience with multiple databases like PostgreSQL …
experience, Experience with cloud computing platforms such as AWS, Azure, or GCP (Google Cloud Platform). Familiarity with big data technologies such as Apache Hadoop, Spark, or Kafka. Experience deploying machine learning models in production environments. Contributions to open-source machine learning projects or research publications in relevant conferences …
learn). Understanding of database technologies (ETL) and SQL proficiency for data manipulation, data mining and querying. Knowledge of Big Data tools (Spark or Hadoop a plus). Power BI, dashboard design/development. Regulatory Awareness/Compliance Uphold Regulatory/Compliance requirements relevant to your role, escalating areas …
/Kotlin. Familiarity with Kotlin or willingness to learn. Industrial experience with AWS/GCP/Azure. Knowledge of common data products such as Hadoop, Spark, Airflow, PostgreSQL, S3, etc. Problem-solving/troubleshooting skills and attention to detail. 👋 About Us High-quality data access and provisioning shouldn't …
experience in ETL technical design, automated data quality testing, QA, documentation, data warehousing, data modelling, and data wrangling. Proficiency in RDBMS, ETL pipelines, Python, Hadoop, SQL, and a solid grasp of modern code development practices. Ability to manage multiple data and analytic systems with an awareness of decentralised data …
City of London, London, United Kingdom Hybrid / WFH Options
ECS Resource Group
continuously enhancing our technical capabilities. Must-Have Skills: Proficiency in Python and Java, with experience in any flavor of Spark. In-depth knowledge of Hadoop ecosystem components. Expertise in CI/CD pipelines and a solid understanding of DevOps practices. Hands-on experience with containerization technologies. Strong experience in …
As an IT Specialist, you'll need broad expertise across various areas of the technology/software domain. Proficiency in AWS or Big Data, Hadoop or other SQL databases, Lucene, Spark, web app development (JavaScript, Node.js), Docker, Jenkins, Git, Python, or Ruby would be highly beneficial. Key Responsibilities: Meet …
software engineer in a globally distributed team working with the Scala and Java programming languages (preferably both) Experience with big-data technologies Spark/Databricks and Hadoop/ADLS is a must Experience with any one cloud platform: Azure (preferred), AWS or Google Experience building data lakes and data …
NumPy, Spark). Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Preferred Qualifications: Experience with distributed computing platforms (e.g., Hadoop, Apache Kafka). Familiarity with cloud computing services (e.g., AWS, GCP, Azure). Knowledge of financial markets and trading concepts. Previous exposure to DevOps …
Greater London, England, United Kingdom Hybrid / WFH Options
Oliver Bernard
essential: - Proven experience as an Architect and excellent knowledge of Big Data - Great understanding of Cloud e.g. Azure and/or AWS - Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark - Excellent experience of ETL, data warehousing and handling a variety of data types - Very …