(Mandatory) Demonstrated experience in large-scale data migration efforts. (Mandatory) Demonstrated experience with database architecture, performance design methodologies, and system-tuning recommendations. Preference for familiarity with Glue, Hive, and Iceberg or similar. Demonstrated experience with Python, Bash, and Terraform. Demonstrated experience with DevSecOps solutions and tools. Demonstrated experience implementing CI/CD pipelines using industry-standard processes. Desired Skills … Demonstrated experience with Data Quality and Data Governance concepts. Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache NiFi and other ETL tools. Demonstrated experience with Apache Spark.
isn't just another data role; this is your chance to engineer solutions that truly matter. Key Responsibilities: Design, develop, and optimize scalable data pipelines using technologies such as Apache Spark, Apache Iceberg, Trino, OpenSearch, AWS EMR, NiFi, and Kubernetes containers. Ingest and move structured and unstructured data using approved methods into enterprise or local storage systems. … AWS, Kubernetes). Deep understanding of working with diverse data types and formats, including structured, semi-structured, and unstructured data. Familiarity with data ingestion tools and platforms such as Apache NiFi, Spark, and related open-source technologies. Demonstrated ability to collaborate across teams, including data scientists, software engineers, data stewards, and mission partners. Knowledge of data governance principles, metadata
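The ingestion responsibility above (routing structured and unstructured data into approved storage) can be pictured with a toy sketch. Everything here is hypothetical: the function name, the path prefixes, and the JSON-based structured check are illustrative stand-ins for what a NiFi flow or Spark job would do at scale.

```python
import json
from pathlib import PurePosixPath

def route_record(payload: str, name: str) -> str:
    """Route a record to a storage prefix by whether it parses as JSON.

    Hypothetical helper: real pipelines would do this with NiFi flows or
    Spark jobs; the prefixes and names here are illustrative only.
    """
    try:
        json.loads(payload)  # structured: well-formed JSON
        prefix = "structured"
    except ValueError:       # json.JSONDecodeError subclasses ValueError
        prefix = "raw"       # unstructured: keep the bytes as-is
    return str(PurePosixPath(prefix) / name)
```

With this sketch, `route_record('{"id": 1}', "event.json")` lands under the `structured/` prefix, while free text lands under `raw/`.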
MUST be a US Citizen with a U.S. Government clearance - Intel with Polygraph. NOTE: Must have
standard processes. TS/SCI with poly required to start. Desired Experience: Experience with the Sponsor's data environment and on-premises compute structure. Experience with Glue, Hive, and Iceberg or similar technologies. Experience with Terraform. Experience with DevSecOps solutions and tools. Experience with Data Quality and Data Governance concepts. Experience maintaining, supporting, and improving the ETL … process using Apache NiFi or similar tools. Experience with Apache Spark. Equal Opportunity Employer/Veterans/Disabled. Accommodations: If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to access job openings or apply for a job on this
environment. Proficiency in Spark/PySpark, Azure data technologies, Python or Scala, and SQL. Experience with testing frameworks like pytest or ScalaTest. Knowledge of open table formats such as Delta, Iceberg, or Apache Hudi. Experience with CI/CD workflows using Azure DevOps, GitHub Actions, and version control systems like Git. Understanding of cloud infrastructure and Infrastructure as Code … Scrum or Kanban. Nice-to-have skills: Experience in retail or e-commerce. Knowledge of Big Data and Distributed Computing. Familiarity with streaming technologies like Spark Structured Streaming or Apache Flink. Additional programming skills in PowerShell or Bash. Understanding of Databricks Ecosystem components. Experience with Data Observability or Data Quality frameworks. Additional Information: What's in it for you
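The testing-framework requirement above can be made concrete with a minimal pytest-style sketch. The transformation and test are hypothetical stand-ins; a real suite would exercise Spark/PySpark jobs, but the same shape applies: a pure function plus a `test_*` function using bare asserts.

```python
def normalise_skus(rows):
    """Trim and upper-case SKU codes, dropping rows with blank SKUs.

    Hypothetical transformation used only to illustrate the test shape.
    """
    return [
        {**row, "sku": row["sku"].strip().upper()}
        for row in rows
        if row["sku"].strip()
    ]

def test_normalise_skus():
    # pytest auto-discovers test_* functions and reports bare asserts
    rows = [{"sku": " ab-1 "}, {"sku": "   "}]
    assert normalise_skus(rows) == [{"sku": "AB-1"}]
```

Running `pytest` against a file containing this collects and executes `test_normalise_skus` automatically, with no test-runner boilerplate.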
B4CORP Company Information: B4Corp is a small defense contracting company that focuses on providing an optimum
test enterprise-level software solutions using tools and techniques such as BDD, Data Reconciliation, Source Control, TDD, and Jenkins. Documenting configurations, processes, and best practices. Knowledge of file formats such as JSON, Iceberg, and Avro. Basic knowledge of AWS technologies like IAM roles, Lake Formation, Security Groups, CloudFormation, and Redshift. Big Data/Data Warehouse testing experience. Experience in the financial services domain. Mentoring experience
JVM-based language. A good understanding of data technologies, ideally with experience with relational databases, big data, data warehouses and marts, and stream processing technologies (e.g. Kafka, S3, Flink, Snowflake, Iceberg). Product-first mindset and a desire to design and build tools that solve real user problems. Mastery of fundamental software engineering practices like good design documentation, unit testing, and peer
Want to drive a top brand's Data team with 1m+ users? If you love building software in Python, implementing robust data pipelines, and driving best practices, you may be interested
multiple heterogeneous data sources. Good knowledge of warehousing and ETLs. Extensive knowledge of popular database providers such as SQL Server, PostgreSQL, Teradata, and others. Proficiency in technologies in the Apache Hadoop ecosystem, especially Hive, Impala, and Ranger. Experience working with open file and table formats such as Parquet, Avro, ORC, Iceberg, and Delta Lake. Extensive knowledge of automation and
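The file and table formats listed above differ mainly in layout: Avro is row-oriented, while Parquet and ORC store data column by column (with table formats like Iceberg and Delta Lake layered on top). A toy pure-Python sketch of that pivot, with illustrative names and no format library involved:

```python
def to_columnar(rows):
    """Pivot row-oriented records (Avro-like) into columns (Parquet/ORC-like).

    Toy illustration only; real formats add encodings, compression, and
    on-disk metadata that this sketch ignores.
    """
    columns = {}
    for row in rows:
        for key, value in row.items():
            columns.setdefault(key, []).append(value)
    return columns
```

The columnar layout is what lets a scan read only the columns a query touches instead of whole records.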
Build the future of the AI Data Cloud. Join the Snowflake team. Snowflake is seeking an accomplished Principal Sales Specialist, Data Engineering Platform (Openflow and Iceberg/Polaris) to drive sales execution for our ingestion workloads in the EMEA markets. Reporting to the Managing Director, Data Engineering Platform Sales, this is a strategic role that works closely with the