in computer science, Mathematics, Statistics or a similar engineering discipline. Working knowledge of Linux. Knowledge of network protocols and operation. Data analysis and visualization in Hadoop, Splunk. Data manipulation using tools like MapReduce or SQL. Experience with network security products and solutions. HTML, CSS and JavaScript experience. Python. If you are more »
industry experience. Experience in distributed system design. Experience with Pure/Alloy. Working knowledge of open-source tools such as AWS Lambda, Prometheus. Spark, Hadoop or Snowflake knowledge would be a plus. Additional Information Location: This role can be delivered in a hybrid nature from one of these offices more »
Docker, Kubernetes). CI/CD pipelines and tools (e.g. DBT, Jenkins, GitLab CI). Desirable: Experience with analytics tools and frameworks (e.g., Apache Spark, Hadoop). SQL. SageMaker, DataRobot. Google Cloud and Azure data platforms. Metadata-driven frameworks to ingest, transform and manage data more »
programming language, ideally Python but can also be Java or C/C++. SQL experience. Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau). Get in touch with Ella Alcott - Ella@engagewithus.com more »
looking for you to demonstrate include: Experience of data storage technologies: Delta Lake, Iceberg, Hudi. Sound knowledge and understanding of Apache Spark, Databricks or Hadoop. Ability to take business requirements and translate these into tech specifications. Knowledge of architecture best practices and patterns. Competence in evaluating and selecting development more »
and classification techniques, and algorithms. Fluency in a programming language (Python, C, C++, Java, SQL). Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau) more »
Experience with RDS databases like MySQL or PostgreSQL, and experience with NoSQL stores like Redis or MongoDB; * Experience with Big Data technologies like the Hadoop ecosystem is a plus. * Excellent writing, proofreading and editing skills. Able to create documentation that can express cloud architectures using text and more »
Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create data pipelines on a more »
solving skills and creativity. Google Cloud Professional Cloud Architect or Professional Cloud Developer certification. Highly desirable to have hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink). Proven expertise in designing and constructing data lakes and data more »
requiring data analysis and visual support. Skills: Experienced in programming languages such as Python and/or R, big data tools such as Hadoop, or data visualization tools such as Tableau. Strong SQL proficiency required. The ability to communicate effectively in writing, including conveying complex information and promoting more »
for Data & Analytics Platforms. Experience with relational databases like Oracle, NoSQL databases and/or Big Data technologies (e.g. Oracle, SQL Server, Postgres, Spark, Hadoop, other open source). Must have experience in Data Security Solutions (Identity and Access Management and Data Security Access Management). Must have experience of more »
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create data pipelines on a more »
experience. Demonstrate in-depth knowledge of large-scale data platforms (Databricks, Snowflake) and cloud-native tools (Azure Synapse, Redshift). Experience of analytics technologies (Spark, Hadoop, Kafka). Have familiarity with Data Lakehouse architecture, SQL Server, DataOps, and data lineage concepts. Demonstrate in-depth knowledge of large-scale data platforms (Databricks more »
NumPy, Spark). Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Preferred Qualifications: Experience with distributed computing platforms (e.g., Hadoop, Apache Kafka). Familiarity with cloud computing services (e.g., AWS, GCP, Azure). Knowledge of financial markets and trading concepts. Previous exposure to DevOps more »
best-of-breed Java toolsets - focused on microservices architectures, powerful front- and backend frameworks, RESTful services, and everything from NoSQL databases like MongoDB and Hadoop, high-performance data grids like Hazelcast, to multi-node relational systems. You will be working in a Scrum team of cross-functional skills in more »
share knowledge with the team. Qualifications You will have expertise within the following: Java and Python development knowledge (Essential). Previous experience with Spark or Hadoop (Essential). Trino or Airflow (Desirable). Architecture and capabilities: designing and implementing complex solutions with a focus on scalability and security. Excellent communication and collaboration more »
Chicago, Illinois, United States Hybrid / WFH Options
Request Technology - Robyn Honquest
CLI and IAM etc. (required) Experience with distributed message brokers using Kafka (required) Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink etc. (required) Experience working with various types of databases like Relational, NoSQL, Object more »
reliability and availability of the company's software products. Data Processing Pipelines : You'll design and implement data processing pipelines using technologies like Kafka, Hadoop, Hive, Storm, or Zookeeper, enabling real-time and batch processing of data from the blockchain. Hands-on Team Leadership : As a hands-on leader more »
Organizational, motivational, and interpersonal skills. Additional, But Not Required Skills: Spark libraries; Scala programming; Python programming; SQL queries; experience with distributed computing; experience with Hadoop and cloud databases; demonstrated proficiency working with business and technical teams to integrate algorithms into product platforms on large data sets. Additional Information About more »