in big data technology, with experience ranging across platform architecture, data management, data architecture, and application architecture. High proficiency working with the Hadoop platform, including Spark/Scala, Kafka, Spark SQL, HBase, Impala, Hive, and HDFS in multi-tenant environments. Solid base in data technologies such as warehousing, ETL, MDM, DQ, BI, and analytical tools; extensive experience in metadata management and data
system; distributions like Cloudera/Hortonworks, Pig, and Hive. Data processing frameworks: Spark and Spark Streaming. Hands-on experience with multiple databases such as PostgreSQL, Snowflake, Oracle, MS SQL Server, and NoSQL (HBase/Cassandra, MongoDB). Experience in a cloud data ecosystem (AWS, Azure, or GCP) in the data engineering space, with at least a few complex, high-volume data projects as an
in low-latency applications. Financial background preferable. Spark expertise (micro-batching, EOD/real-time). Python. In-memory databases. SQL skills & RDBMS concepts. Linux. Hadoop ecosystem (HDFS, Impala, Hive, HBase, etc.). Python, R, or equivalent scripting language(s). Excellent Excel analysis skills. Good understanding of Investment Banking data. A history of delivering against agreed objectives. Ability to multi-task
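The "micro-batching" pattern this listing names (as popularized by Spark Streaming) can be sketched in plain Python. This is a toy illustration with made-up prices, not any employer's actual stack: an unbounded iterator is consumed in fixed-size batches, and each batch is processed as a unit.

```python
from itertools import islice

def micro_batches(stream, batch_size):
    """Yield fixed-size batches from a (possibly unbounded) iterator."""
    it = iter(stream)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Toy usage: process a "stream" of prices in batches of 3,
# computing one aggregate (the mean) per batch.
prices = [101.2, 100.9, 101.5, 102.0, 101.8, 101.1, 100.7]
batch_averages = [sum(b) / len(b) for b in micro_batches(prices, 3)]
```

Real micro-batch engines add scheduling, fault tolerance, and backpressure on top of this core loop; the batching itself is the simple part shown here.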
Hadoop, Spark, and related technologies YOUR PROFILE Expertise in Hadoop, Spark & Scala. Experience in developing complex data transformation workflows (ETL) using big data technologies. Good expertise in Hive, Impala, HBase. Hands-on experience fine-tuning Spark jobs. Experience with Java and distributed computing. ABOUT CAPGEMINI Capgemini is a global business and technology transformation partner, helping organizations to accelerate their
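The extract-transform-load (ETL) workflows mentioned in several of these listings follow one shape regardless of engine: pull raw records, clean and type them, and write them to a target store. A minimal sketch in plain Python (the CSV data and field names are hypothetical; a real pipeline would read from HDFS/Kafka and write to a warehouse):

```python
import csv
import io

# Hypothetical raw input with the usual dirt: stray whitespace, a missing value.
RAW = """id,name,amount
1, Alice ,12.50
2,Bob,
3, Carol ,7.25
"""

def extract(text):
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: trim whitespace, drop rows with missing amounts, cast types."""
    out = []
    for r in rows:
        if not r["amount"].strip():
            continue  # data-quality rule: reject incomplete records
        out.append({"id": int(r["id"]),
                    "name": r["name"].strip(),
                    "amount": float(r["amount"])})
    return out

def load(rows, target):
    """Load: append to the target store (a list stands in for a database)."""
    target.extend(rows)

warehouse = []
load(transform(extract(RAW)), warehouse)
```

The same three stages map directly onto Spark: `extract` becomes a `spark.read`, `transform` a chain of DataFrame operations, and `load` a `write` to the sink.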
Java and/or Python development. 2+ years of experience working with relational databases such as MySQL, Postgres, etc. 2+ years of experience with NoSQL databases like Bigtable, Cassandra, HBase, etc. Experience with schema design and data modeling. Strong understanding of large-scale distributed data processing. Experience developing extract-transform-load (ETL) processes. Experience with distributed messaging systems
or functional programming generally. Exposure to highly concurrent, asynchronous backend technologies, such as Ktor, http4k, http4s, Play, RxJava, etc. Exposure to DynamoDB or similar NoSQL databases, such as Cassandra, HBase, Bigtable, or Cosmos DB. Exposure to Git workflows, and the ability to tailor the workflow to project needs. Exposure to containerised application deployment using Docker, Amazon ECS, Kubernetes, etc.
a major RDBMS such as DB2, Oracle. • Exposure to and hands-on experience with microservices, distributed caches (Redis, Couchbase), and cloud technologies. • Good-to-have: knowledge and experience in big data, including HBase and Impala concepts. • Experienced with XML parsing (including schemas), JSON, and third-party libraries like Guava, Lombok. • Well versed with design standards & frameworks; experience in working on multiple technologies.
may be substituted for a bachelor's degree. Discretionary Requirements - Cloud Experience: Shall have three (3) years' demonstrated work experience with a distributed scalable Big Data Store (NoSQL) such as HBase, CloudBase/Accumulo, Bigtable, etc.; shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.; shall have demonstrated work
Monitoring utilities, disaster recovery processes/tools. Experience in troubleshooting and problem resolution. Experience in system integration. Knowledge of the following: Hadoop, Flume, Sqoop, MapReduce, Hive/Impala, HBase, Kafka, Spark Streaming. Experience of ETL tools incorporating big data. Shell scripting, Python. Beneficial Skills: Understanding of LAN, WAN, VPN and SD networks. Hardware and cabling set-up experience
Java development Test-driven development Reviewing code, including AI-generated code Data engineering projects Valuable additional skills include: AI Prompt Engineering for code generation Docker/Kubernetes Hadoop (HDFS, HBase, Kafka, Spark, etc.) This role will be based out of our Glasgow Office. Purpose of the role Design, develop, and improve software using various engineering methodologies to provide capabilities
Java and other high-level languages such as C, C++. Shall have demonstrated ability to work with open-source (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, CloudBase/Accumulo, Bigtable. Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig. Shall have demonstrated work experience with the
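Several of these postings require experience with the MapReduce programming model. Its three phases (map, shuffle, reduce) can be illustrated with the classic word count in plain Python — a single-process sketch of the model only; Hadoop distributes each phase across machines:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    """Map: emit a (key, value) pair per word — here, (word, 1)."""
    return [(word.lower(), 1) for word in doc.split()]

def shuffle(pairs):
    """Shuffle: group all values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate the grouped values per key — here, a sum."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["hadoop hive pig", "hive pig pig"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(d) for d in docs)))
```

Tools like Hive and Pig, named alongside MapReduce in these listings, compile higher-level queries down to exactly this pattern.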
/Linux operating environment. Within the last 5 years, a minimum of 3 years' experience with open-source (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, Apache Accumulo, and/or Bigtable. Within the last 3 years, a minimum of 1 year's experience with requirements analysis and design for 1 or more object-oriented
similar scope, type and complexity. • Bachelor's degree in Computer Science or related discipline. • Cloud experience; demonstrated work experience with: o Distributed scalable Big Data Store (NoSQL) such as HBase, CloudBase/Accumulo, Bigtable, etc. o MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc. o Hadoop Distributed File System (HDFS) o Serialization such as
Ashburn, Virginia, United States Hybrid / WFH Options
A1C Partners
Load Balancing and Enterprise Service Bus frameworks (preferably in a cloud environment) • Agile Scrum, and possibly experience leading a Scrum team as a Scrum Master or equivalent • PostgreSQL, DynamoDB, HBase, MongoDB, Cassandra • Master's in Computer Science or related field Customer Requirements • Clearance - Ability to obtain and hold a public trust position and favorable suitability based on a CBP Background
Ashburn, Virginia, United States Hybrid / WFH Options
TAIG (Tactical Analytic & Intelligence Group)
Load Balancing and Enterprise Service Bus frameworks (preferably in a cloud environment) • Agile Scrum, and possibly experience leading a Scrum team as a Scrum Master or equivalent • PostgreSQL, DynamoDB, HBase, MongoDB, Cassandra • Good understanding of DevSecOps practices and tools • Master's in Computer Science or related field Customer Requirements • Clearance - Ability to obtain and hold a public trust position and
installation, evaluation, enhancement, maintenance, and problem diagnosis/resolution. - Shall have demonstrated ability to work with open-source (NoSQL) products that support highly distributed, massively parallel computation needs such as HBase, Accumulo, Bigtable, et cetera. - Shall have at least one (1) year of experience developing software with high-level languages such as Java, C, C++, et cetera. - Shall have
Requirements: Systems Engineering (DevOps) The Contractor shall coordinate with multiple entities, including mission partners, to ensure tools meet defined requirements. The Contractor shall apply DevOps principles and philosophies to continuously deliver high-value enhancements to software in a service-based
Build and optimize systems, tools, and validation strategies to support new features. Help design/build distributed real-time systems and features. Use big data technologies (e.g. Spark, Hadoop, HBase, Cassandra) to build large-scale machine learning pipelines. Develop new systems on top of real-time streaming technologies (e.g. Kafka, Flink). 5+ years' software development experience. 5+ years' experience … in Java, Shell, Python development. Excellent knowledge of relational databases, SQL, and ORM technologies (JPA2, Hibernate) is a plus. Experience in Cassandra, HBase, Flink, Spark, or Kafka is a plus. Experience in the Spring Framework is a plus. Experience with test-driven development is a plus. Must be located in Ireland
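The real-time streaming aggregations that Kafka/Flink roles like this one involve are often expressed as windowed computations over timestamped events. A toy stand-in in plain Python shows the idea of a tumbling (non-overlapping, fixed-width) window; event names and timestamps are illustrative, and a real engine would also handle out-of-order arrival and watermarks:

```python
from collections import defaultdict

def tumbling_counts(events, window_seconds):
    """Count events per key within non-overlapping, fixed-width time windows.

    events: iterable of (epoch_seconds, key) pairs.
    Returns {window_start_time: {key: count}}.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = ts - (ts % window_seconds)
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

# Toy usage: 5-second tumbling windows over a short event stream.
events = [(0, "click"), (3, "view"), (7, "click"), (11, "click")]
result = tumbling_counts(events, 5)
```

In Flink the equivalent is a keyed stream with `TumblingEventTimeWindows`; the per-window grouping and aggregation above is the logical core of that operator.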