in the financial services or hedge fund industry. Technical Skills: Proficiency in Python and SQL. Experience with relational and NoSQL databases. Knowledge of big data frameworks (e.g., Hadoop, Spark, Kafka). Understanding of financial markets and trading systems. Strong analytical, problem-solving, and communication skills. Familiarity with DevOps tools and practices. This is an exciting opportunity more »
Must have 8 years' experience with Relational Databases like Oracle, NoSQL Databases and/or Big Data technologies (e.g. Oracle, SQL Server, Postgres, Spark, Hadoop, other Open Source). Must have experience in Data Security Solutions, Identity and Access Management and Data Security Access Management. Must have 3 years' experience more »
database management. Cloud Platform: AWS for cloud infrastructure. Programming Languages: JavaScript for front-end development and Java for back-end processes. Big Data Technologies: Hadoop, Spark, or Kafka for handling large-scale data processing. What We Need from You Essential Skills: Technical Proficiency: Expertise in React.js, front-end technologies more »
solving skills and creativity. Google Cloud Professional Cloud Architect or Professional Cloud Developer certification. Very desirable to have hands-on experience with ETL tools, Hadoop-based technologies (e.g. Spark), and batch/streaming data pipelines (e.g. Beam, Flink, etc.). Proven expertise in designing and constructing data lakes and data more »
TensorFlow, PyTorch). Solid understanding of ML and data pipeline architectures and best practices. Experience with big data technologies and distributed computing (e.g., Spark, Hadoop) is a plus. Proficient in SQL and experienced with relational databases. Strong analytical and problem-solving skills, with a keen attention to detail. Knowledge more »
/Kotlin. Familiarity with Kotlin or willingness to learn. Industrial experience with AWS/GCP/Azure. Knowledge of common data products such as Hadoop, Spark, Airflow, PostgreSQL, S3, etc. Problem solving/troubleshooting skills and attention to detail. 👋 About Us High-quality data access and provisioning shouldn't more »
in software engineering, computer science or a similar field. Comfortable programming in Python and Scala (or Java). Knowledgeable in Big Data technologies, in particular Hadoop, Hive, and Spark. Experience in building real-time applications, preferably in Spark. Good understanding of machine learning pipelines and machine learning frameworks such as more »
Power BI and Sigma. • Experience with programming languages such as Python, R, and/or Julia. • Familiarity with data processing frameworks like Spark or Hadoop is a plus. • Solid understanding of statistical analysis techniques, data mining methods, and machine learning algorithms. • Strong analytical and problem-solving skills with the more »
consulting environment • Current or previous consulting experience highly desirable • Experience of working with companies in the finance sector highly desirable • Platform implementation experience (Apache Hadoop, Kafka, Storm and Spark, Elasticsearch and others) • Experience around data integration & migration, data governance, data mining, data visualisation, database modelling in an agile delivery more »
with containerization technologies (e.g., Docker, Kubernetes) and microservices architecture. Experience with ‘serverless’ technologies. Familiarity with data management and processing tools (e.g., Apache Kafka, Spark, Hadoop) would be beneficial. Knowledge of the Strangler pattern and experience transitioning legacy services to modern infrastructure. Solid understanding of networking, security, and data privacy more »
Data Factory, Azure Databricks, Azure Data Lake, Kafka and Spark Streaming, Azure EventHub/IoT Hub, and Azure Stream Analytics • Experience with Big Data tools: Hadoop, Spark, Kafka • Experience with stream processing systems: Spark Streaming, Kafka • Experience with object-oriented/functional scripting languages: Python preferred. more »
Hertfordshire, England, United Kingdom Hybrid / WFH Options
Hydrogen Group
in programming languages such as Python and strong SQL skills Expertise with cloud platforms (e.g., Azure, GCP) and big data technologies (e.g., Spark, Flink, Hadoop) Strong understanding of Kubernetes, CI/CD, and Terraform Knowledge of ETL and ELT frameworks and orchestration tools like Airflow Experience in hiring, developing more »
A genuine passion for renewable energy and sustainability. > Desirable Skills: Experience with time series analysis and forecasting. Familiarity with big data technologies such as Hadoop, Spark, or similar. Knowledge of energy systems, grid management, or related areas. Experience with cloud platforms like AWS, Google Cloud, or Azure. What's more »
analytical Python solutions, installable with pip/conda · Experience working with relational databases, and SQL-like operations · Experience processing big data, ideally in a Hadoop/Spark environment, would be highly beneficial · Understanding of Continuous Integration/Continuous Delivery (CI/CD) & DevOps processes, and experience applying them within more »
best of breed Java toolsets - focused on microservices architectures, powerful front- and back-end frameworks, RESTful services, and everything from NoSQL databases like MongoDB and Hadoop, high-performance data grids like Hazelcast, to multi-node relational systems. You will be working in a Scrum Team of cross-functional skills in more »
JMeter or similar tools Web services technology such as REST, JSON or Thrift Testing web applications with Selenium WebDriver Big data technology such as Hadoop, MongoDB, Kafka or SQL Network principles and protocols such as HTTP, TLS and TCP Continuous integration systems such as Jenkins or Bamboo Continuous delivery more »
visualizations for diverse audiences. Experience with data manipulation, querying, and modeling using SQL databases (e.g., Redshift, PostgreSQL) and familiarity with big data technologies (e.g., Hadoop, Spark) is desirable. Proficiency in query performance tuning and analysis. Experience using Google Analytics to create custom reports and extract information from GA4. Familiarity with more »
in log management tools to troubleshoot issues as well as identify useful analytics data. Preferred: Experience in Microsoft Azure services and Databricks; Spark, Redshift, Hadoop MapReduce or other Big Data frameworks; code management tools (Git, Sbt, Maven); PySpark, Scala or other functional programming languages; analytics tools such as more »
to 10 years' IT Architecture experience working in a software development, technical project management, digital delivery, or technology consulting environment • Platform implementation experience (Apache Hadoop, Kafka, Storm and Spark, Elasticsearch and others) • Experience around data integration & migration, data governance, data mining, data visualisation, database modelling in an agile delivery more »
engineer Advanced SQL skills and relational database management. Object-oriented programming languages, like Python, Java, and Scala. Experience with distributed computing frameworks, such as Hadoop or Spark. Data pipelines and workflow management tools (e.g. Airflow). Cloud-based solutions (AWS). Strong project management and organizational skills. Excellent problem-solving, communication more »
About the role A Payments FinTech are currently seeking a Data Engineering Lead (Python, Hadoop & SQL) to lead and mentor a talented team of data engineers and scientists as they look to simplify the bank through developing innovative data-driven solutions, allowing them to be commercially successful through insight, and more »