Flower Mound, Texas, United States Hybrid / WFH Options
Epsilon
Organizational, motivational, and interpersonal skills. Additional, But Not Required Skills: Spark libraries, Scala programming, Python programming, SQL queries, experience with distributed computing, experience with Hadoop and cloud databases. Additional Information About Epsilon: Epsilon is a global advertising and marketing technology company positioned at the center of Publicis Groupe. Epsilon more »
Fort Worth, Texas, United States Hybrid / WFH Options
Epsilon
Organizational, motivational, and interpersonal skills. Additional, But Not Required Skills: Spark libraries, Scala programming, Python programming, SQL queries, experience with distributed computing, experience with Hadoop and cloud databases. Additional Information About Epsilon: Epsilon is a global advertising and marketing technology company positioned at the center of Publicis Groupe. Epsilon more »
in Apache Iceberg, Spark, Big Data 3+ years of Big Data project development experience Hands-on experience in working areas like Apache Iceberg & Spark, Hadoop, Hive Must have knowledge of any database, e.g. Postgres, Oracle, MongoDB Excellent knowledge of SDLC processes and DevOps (Jira, Jenkins pipelines) Working in Agile more »
The following skills/experience is essential: -Proven experience as a Lead Big Data Engineer with excellent knowledge of Big Data -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark, etc. -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
following skills/experience is essential: -Proven experience as an Architect and excellent knowledge of Big Data -Excellent experience across Azure -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark, etc. -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
Greater London, England, United Kingdom Hybrid / WFH Options
Oliver Bernard
following skills/experience is essential: -Proven experience as an Architect and excellent knowledge of Big Data -Excellent experience across Azure -Excellent knowledge of Hadoop and tools such as HBase/Hive and Spark, etc. -Excellent experience of ETL, data warehousing and handling a variety of data types -Very more »
as it pertains to data storage and computing • Experience with data modeling, warehousing and building ETL pipelines • Experience with big data technologies such as Hadoop, Hive, Spark, EMR • Experience programming with at least one programming language such as C++, C#, Java, Python, Golang, PowerShell, Ruby • Experience with non-relational more »
decision-making skills Exposure to working with REST APIs Any of the following skills would be an added bonus: Has run code across Hadoop/MapReduce clusters Has code running in a production environment Used SAS before (or at least can decipher SAS code) Worked with very large more »
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
G.Digital
experience applying ML models Python, SQL or R Any cloud data platforms would be great - AWS, Azure or GCP Any big data technologies like Hadoop would be great Strong communication and stakeholder engagement What’s on offer to you? 💰 £85-95k, bonus of up to 10%, flexible working more »
As an IT Specialist, you'll need broad expertise across various areas of the technology/software domain. Proficiency in AWS or Big Data, Hadoop or other SQL databases, Lucene, Spark, web app development (JavaScript, Node.js), Docker, Jenkins, Git, Python, or Ruby would be highly beneficial. Key Responsibilities: Meet more »
/Platform Lifecycle Management in the area of Production Data Warehouse & Reporting Solutions Continuous enhancement of our Production Reporting Platforms using various technologies (Oracle, Hadoop, Denodo) Define, design, build and enhance business intelligence solutions Partner with management, application owners, key business customers and team members for the delivery more »
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
BeTechnology Group
in Python development Deep knowledge of distributed and scalable systems, including proficiency with PostgreSQL, RabbitMQ, and Cassandra. Familiarity with big data technologies such as Hadoop, Spark, or Kafka. Strong problem-solving skills and the ability to troubleshoot complex issues in distributed systems. Excellent communication and collaboration skills, with more »
Cambridge, Cambridgeshire, East Anglia, United Kingdom Hybrid / WFH Options
Be Technology
in Python development Deep knowledge of distributed and scalable systems, including proficiency with PostgreSQL, RabbitMQ, and Cassandra. Familiarity with big data technologies such as Hadoop, Spark, or Kafka. Strong problem-solving skills and the ability to troubleshoot complex issues in distributed systems. Excellent communication and collaboration skills, with more »
Data Analytics stack (IS, AS, RS) Power BI, DAX MDS Azure Data Lakes Supporting: Azure ML .Net/HTML5 Azure infrastructure R, Python PowerShell Hadoop, Data Factory Principles: Data Modelling Data Warehouse Theory Data Architecture Master Data Management Data Science WHY ADATIS? There’s a long list of reasons more »
other quantitative discipline Database marketing experience/knowledge Ability to program in newer and emerging languages such as R and Python; working knowledge of Hadoop and other big data technologies Additional Information About Epsilon: Epsilon is a global advertising and marketing technology company positioned at the center of Publicis more »
Romsey, Hampshire, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure Programming language experience, e.g. Java, Python, Node.js or SQL Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
Woking, Surrey, United Kingdom Hybrid / WFH Options
CBSbutler Holdings Limited trading as CBSbutler
Experience with cloud technologies, e.g. AWS or Azure Programming language experience, e.g. Java, Python, Node.js or SQL Data technologies experience, e.g. PostgreSQL, MongoDB, Kafka, Hadoop If you are interested in discussing this DevOps Engineer role further, please apply or send a copy of your updated CV to (url removed more »
infrastructure and/or use case scaling, particularly in a production environment Familiarity with DevOps principles and practices Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms is of benefit The ability to communicate in English with business fluency; German language skills are a plus more »
rulemaking. What you'll need to succeed Extensive Business and Data Analysis experience Strong SQL and Excel skills Data Visualisation experience Experience with Python, Hadoop or Big Data What you'll get in return An exciting opportunity to join an international organisation working with a major financial services organisation. more »
the initial point of escalation for software development issues within a specific area of responsibility Requirements Have experience of working with databases: SQL Server, MySQL, Hadoop, Oracle or any other RDBMS. Be able to partner with and lead internal and external technology resources in solving complex business needs Be able to more »