or Django, Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality. Preparing data for predictive and prescriptive modelling. Hands-on coding experience …
with rapid prototyping and disciplined software development processes. Experience with Python, ML libraries (e.g. spaCy, NumPy, SciPy, Transformers, etc.), data tools and technologies (Spark, Hadoop, Hive, Redshift, SQL), and toolkits for ML and deep learning (SparkML, TensorFlow, Keras). Demonstrated ability to work on multi-disciplinary teams with diverse skillsets …
CD tooling. Scripting experience (Python, Perl, Bash, etc.). ELK (Elastic Stack). JavaScript. Cypress. Linux experience. Search engine technology (e.g., Elasticsearch). Big data technology experience (Hadoop, Spark, Kafka, etc.). Microservice and cloud-native architecture. Desirable Skills: Able to demonstrate experience of troubleshooting and diagnosis of technical issues. Able to demonstrate …
to large datasets, emphasizing scalability and performance optimization, with proficiency in Gen AI methodologies. Exhibit strong proficiency in working with distributed computing frameworks like Hadoop and Spark, specializing in map-reduce programming and model training, incorporating Gen AI techniques. Possess experience deploying models on cloud platforms such as Google …
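As a purely illustrative aside on the map-reduce programming style this listing names (not part of any posting): a minimal PySpark sketch of a word count, assuming a local Spark install; the input path and app name are hypothetical.

    from pyspark.sql import SparkSession

    # Build a session; on a cluster this would target YARN or Kubernetes instead.
    spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()

    # Hypothetical input path; any line-oriented text file works.
    lines = spark.sparkContext.textFile("hdfs:///data/sample.txt")

    counts = (
        lines.flatMap(lambda line: line.split())   # map: emit one record per word
             .map(lambda word: (word, 1))          # map: key each word with a count of 1
             .reduceByKey(lambda a, b: a + b)      # reduce: sum the counts per key
    )

    print(counts.take(10))
    spark.stop()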
technologies such as EC2, ECS, EMR, AWS Lambda, DynamoDB, S3, Kinesis, SQS, SES, CloudWatch… Knowledge or experience with big data technologies such as Spark, Hadoop, Redshift, Snowflake, Kafka, Flink, Druid, ClickHouse… is highly desirable. Benefits: Flexible work arrangements to support a healthy work-life balance. 25 vacation days (excluding …
in the financial services or hedge fund industry. Technical Skills: Proficiency in Python and SQL. Experience with relational and NoSQL databases. Knowledge of big data frameworks (e.g., Hadoop, Spark, Kafka). Understanding of financial markets and trading systems. Strong analytical, problem-solving, and communication skills. Familiarity with DevOps tools and practices. This is an exciting opportunity …
Must have 8 years' experience with relational databases like Oracle, NoSQL databases, and/or big data technologies, e.g. Oracle, SQL Server, Postgres, Spark, Hadoop, other open source. Must have experience in Data Security Solutions, Identity and Access Management, and Data Security Access Management. Must have 3 years' experience …
in software engineering, computer science or a similar field. Comfortable programming in Python and Scala (or Java). Knowledgeable in big data technologies, in particular Hadoop, Hive, and Spark. Experience in building real-time applications, preferably in Spark. Good understanding of machine learning pipelines and machine learning frameworks such as …
consulting environment • Current or previous consulting experience highly desirable • Experience of working with companies in the finance sector highly desirable • Platform implementation experience (Apache Hadoop, Kafka, Storm and Spark, Elasticsearch and others) • Experience around data integration & migration, data governance, data mining, data visualisation, database modelling in an agile delivery …
best of breed Java toolsets - focused on microservices architectures, powerful front- and backend frameworks, RESTful services, and everything from NoSQL databases like MongoDB and Hadoop, high-performance data grids like Hazelcast to multi-node relational systems. You will be working in a Scrum Team of cross-functional skills in …
JMeter or similar tools. Web services technology such as REST, JSON or Thrift. Testing web applications with Selenium WebDriver. Big data technology such as Hadoop, MongoDB, Kafka or SQL. Network principles and protocols such as HTTP, TLS and TCP. Continuous integration systems such as Jenkins or Bamboo. Continuous delivery …
in log management tools to troubleshoot issues as well as identify useful analytics data. Preferred: Experience in Microsoft Azure services and Databricks. Spark, Redshift, Hadoop MapReduce or other big data frameworks. Code management tools (Git, sbt, Maven). PySpark, Scala or other functional programming languages. Analytics tools such as …
to 10 years' IT Architecture experience working in a software development, technical project management, digital delivery, or technology consulting environment • Platform implementation experience (Apache Hadoop, Kafka, Storm and Spark, Elasticsearch and others) • Experience around data integration & migration, data governance, data mining, data visualisation, database modelling in an agile delivery …
About the role: A Payments FinTech is currently seeking a Data Engineering Lead (Python, Hadoop & SQL) to lead and mentor a talented team of data engineers and scientists as they look to simplify the bank through developing innovative data-driven solutions, allowing them to be commercially successful through insight, and …
at scale. What we expect from you: Strong experience building Python packages, installable with pip/conda. Experience processing big data, ideally in a Hadoop/Spark environment. Experience working with relational databases and SQL-like operations. Experience with Airflow/orchestration tooling is beneficial. Understanding of Continuous Integration …
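For context on the Airflow/orchestration tooling mentioned above, a minimal sketch only, assuming Airflow 2.4+; the DAG and task names are hypothetical, not from any listing.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw data from source systems")

    def transform():
        print("clean and aggregate into reporting tables")

    # Two dependent tasks run daily; extract must finish before transform starts.
    with DAG(
        dag_id="etl_sketch",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t1 = PythonOperator(task_id="extract", python_callable=extract)
        t2 = PythonOperator(task_id="transform", python_callable=transform)
        t1 >> t2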
Warehousing: Familiarity with data warehousing concepts and technologies, including star and snowflake schema designs. Big Data Technologies: Understanding of big data platforms such as Hadoop, Spark, and tools like Hive, Pig, and HBase. Data Integration: Ability to integrate data from disparate sources using middleware or integration tools as well …
and NoSQL databases. Programming languages such as Spark or Python. Amazon Web Services, Microsoft Azure or Google Cloud, and distributed processing technologies such as Hadoop. Benefits: Base Salary: £45,000 - £75,000 (DoE). Discretionary Bonus: circa 10% per annum. DV Bonus: circa £5,000. Flex Fund: £5,000. Health: Private …
London, England, United Kingdom Hybrid / WFH Options
Global Relay
partners. Preferred Requirements: Experience or strong interest in blockchain and other Web 3.0 technologies. Experience with OLAP technologies such as Presto/Trino, Spark, Hadoop, Athena, or BigQuery is a plus. Experience in Golang or any other strongly-typed programming language. Experience mentoring and supporting fellow engineers. Our Selection …
following: .NET (VB, C#, ASP.NET, .NET Core). MVC Framework. Python. JavaScript (React, Bootstrap frameworks). Database design. SQL/SQL Server. NoSQL technologies, e.g., MongoDB, Hadoop, etc. If you’re the right person for the role, you’ll bring experience of working on a range of applications across the development …
more). Experience in data mining, data warehousing, ETL. Experience in handling large volumes of data on SQL, NoSQL and big data databases. Experience in the Hadoop ecosystem: Hadoop, Spark, Hive, and/or Scala. Experience in programming languages: PHP, Python, C++/Java. Experience in web development in Laravel …
EC1N, Farringdon Without, Greater London, United Kingdom
Damia Group Ltd
your key responsibilities will be to: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. Skills/Experience Required of the SC Cleared DevSecOps Engineer: Strong operational procedures knowledge …
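As a rough, purely illustrative sketch of the Spark-plus-RDS combination this role describes: reading a MySQL table over JDBC and running a simple aggregate. The hostname, credentials, table, and columns are hypothetical, and the MySQL Connector/J driver is assumed to be on Spark's classpath.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rds-transform-sketch").getOrCreate()

    # Read a table from a hypothetical AWS RDS MySQL instance over JDBC.
    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:mysql://example-rds.eu-west-1.rds.amazonaws.com:3306/sales")
        .option("dbtable", "orders")
        .option("user", "report_user")
        .option("password", "********")
        .option("driver", "com.mysql.cj.jdbc.Driver")
        .load()
    )

    # Simple transform: daily order totals, the kind of aggregate a Tableau report might sit on.
    daily = orders.groupBy("order_date").sum("amount")
    daily.show()
    spark.stop()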
Employment Type: Permanent
Salary: £50,000 - £65,000/annum plus 15% cash flex and 10% bonus
data engineering concepts and technologies. Experience working in a cloud environment. Experience with modern and traditional data warehousing and data processing technologies and concepts (Hadoop, PySpark, streaming data) …