… knowledge of YAML or similar languages. The following technical skills and experience would be desirable for the Data DevOps Engineer role: JupyterHub awareness; MinIO or similar S3 storage technology; Trino/Presto; RabbitMQ or another common queue technology, e.g. ActiveMQ; NiFi; Rego; familiarity with code development and shell-scripting in Python, Bash, etc. As a Cloud Technical Architect, you will be responsible …
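As a rough illustration of the scripting skills this listing names (YAML, Python, an S3-compatible store such as MinIO), here is a minimal, hedged sketch in Python. The config file name, its keys, and the endpoint address are assumptions for the example, not details from the listing.

```python
# Minimal sketch: read a YAML config and list buckets on S3-compatible storage (e.g. MinIO).
# The config file, its keys, and the endpoint below are illustrative assumptions.
import boto3
import yaml

with open("config.yaml") as f:  # hypothetical config file
    cfg = yaml.safe_load(f)

s3 = boto3.client(
    "s3",
    endpoint_url=cfg["s3"]["endpoint"],  # e.g. "http://localhost:9000" for a local MinIO
    aws_access_key_id=cfg["s3"]["access_key"],
    aws_secret_access_key=cfg["s3"]["secret_key"],
)

for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```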
… TS/SCI or TS/SCI with CI Polygraph. Desired Experience: experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. The role may require some on-call work. The Swift Group and Subsidiaries are an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for …
… close the workspace during regular business hours as needed. Preferred Requirements: • Experience with big data technologies such as Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. • Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Benefits: $152,000-$198,000 salary per year, depending on experience; 11 federal holidays …
… an active TS/SCI with Polygraph security clearance is required. Desired Qualifications: Familiarity with AWS CDK, Terraform, Packer. Design Concepts: REST APIs. Programming Languages: JavaScript/Node.js. Processing Tools: Presto/Trino, MapReduce, Hive. The Swift Group and Subsidiaries are an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race …
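This listing names Presto/Trino as a processing tool and Node.js as the language; purely as an illustrative sketch (kept in Python for consistency with the other examples here, and assuming the trino client package plus a hypothetical coordinator host, catalog, and table), a query against a Trino cluster might look like:

```python
# Illustrative sketch only: query a Trino/Presto cluster from Python.
# Host, catalog, schema, and table names are assumptions for the example.
import trino  # pip install trino

conn = trino.dbapi.connect(
    host="trino.example.internal",  # hypothetical coordinator host
    port=8080,
    user="analyst",
    catalog="hive",
    schema="default",
)
cur = conn.cursor()
cur.execute("SELECT event_type, COUNT(*) FROM events GROUP BY event_type")
for row in cur.fetchall():
    print(row)
```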
… or Go. Must have a DoD 8140/8570 compliance certification (e.g. Security+). Preferred: experience with big data technologies such as Hadoop, Spark, MongoDB, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc.; experience with containers and Kubernetes …
… Kubernetes, AWS, Helm, Terraform, Vault, Grafana, ELK Stack, New Relic. Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, BigQuery. Master's degree in Computer Science or an engineering-related field. Get to know us better... YouGov is a global online research company, offering insight into what …
… or a similar science or engineering discipline. Strong Python and other programming skills (Java and/or Scala desirable); strong SQL background; some exposure to big data technologies (Hadoop, Spark, Presto, etc.). NICE TO HAVES OR EXCITED TO LEARN: some experience designing, building and maintaining SQL databases (and/or NoSQL); some experience with designing efficient physical data models/…
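For the "Python plus SQL plus some Spark exposure" combination this listing asks for, a minimal PySpark sketch is below. The Parquet path and column names are hypothetical; the point is simply running a SQL aggregation over a Spark DataFrame.

```python
# Minimal PySpark sketch: load a dataset and run a SQL aggregation over it.
# The file path and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-over-spark-example").getOrCreate()

events = spark.read.parquet("s3a://example-bucket/events.parquet")
events.createOrReplaceTempView("events")

daily_counts = spark.sql(
    "SELECT event_date, COUNT(*) AS n FROM events GROUP BY event_date ORDER BY event_date"
)
daily_counts.show()
spark.stop()
```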
… ingest millions of data points per day and develop highly available data processing and REST services to distribute data across PWM. Technologies used include: Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy (a data management and governance platform). Programming Languages: Java, Scala, scripting. Microservice Technologies: REST, Spring Boot, Jersey. Build and CI/CD Technologies: Gradle, Jenkins, GitLab, SVN. Cloud …
… points per day and create highly available data processing and REST services to distribute data to different consumers across PWM. Technologies used include: Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy (a data management and data governance platform). Programming Languages: Java, Scala, scripting. Database Technologies: MongoDB, Elasticsearch, Cassandra, MemSQL, Sybase IQ/ASE. Microservice Technologies: REST, Spring Boot …
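Both of the listings above describe Kafka-fed ingestion feeding downstream services. As a hedged sketch only (the listings name Java/Scala; this is written in Python for consistency with the other examples, and the broker address, topic name, and message shape are assumptions), producing and consuming data points with the kafka-python client could look like this:

```python
# Hedged sketch of Kafka ingestion using the kafka-python client.
# Broker address, topic name, and message shape are assumptions for illustration.
import json
from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("pwm.datapoints", {"account": "A123", "metric": "balance", "value": 1050.25})
producer.flush()

consumer = KafkaConsumer(
    "pwm.datapoints",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # downstream services would distribute this to consumers
    break
```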
… excited if you have: 4+ years of relevant work experience in Analytics, Business Intelligence, or Technical Operations; mastery of SQL, Python, and ETL using big data tools (Hive/Presto, Redshift); previous experience with web frameworks for Python such as Django/Flask (a plus); experience writing data pipelines using Airflow; fluency in Looker and/or Tableau …
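Since this listing asks for data pipelines written with Airflow, here is a minimal, hedged sketch of an Airflow DAG. The DAG id, schedule, and the extract logic are placeholders; in practice the task body might run a Hive/Presto or Redshift query.

```python
# Hedged sketch of a simple Airflow pipeline; dag_id, schedule, and the
# extract logic are placeholders, not taken from the listing.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_daily_metrics(**context):
    # Placeholder: in practice this might run a Hive/Presto or Redshift query.
    print("extracting metrics for", context["ds"])


with DAG(
    dag_id="daily_metrics",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_daily_metrics",
        python_callable=extract_daily_metrics,
    )
```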
… petabyte-scale data warehouse for fast SQL analytics; Amazon Athena, for serverless SQL queries directly on S3 data; Amazon EMR, a managed big data platform for Spark, Hive, and Presto workloads; AWS Glue, a managed ETL service for data preparation and transformation; AWS Lake Formation, for simplified data lake creation and management; Amazon SageMaker Studio, a unified IDE for …
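To make the "serverless SQL queries directly on S3 data" point concrete, below is a hedged sketch of running an Athena query from Python with boto3. The region, database, table, and S3 output location are illustrative assumptions.

```python
# Hedged sketch: run a serverless SQL query on S3 data via Amazon Athena.
# Region, database, table, and the S3 output location are illustrative assumptions.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="SELECT event_type, COUNT(*) FROM events GROUP BY event_type",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-query-results/"},
)
query_id = response["QueryExecutionId"]

# Poll until the query finishes, then print the first page of results.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```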