CI/CD with Jenkins, TeamCity, CircleCI, or a similar tool. PREFERRED QUALIFICATIONS - One or more web and application server technologies (e.g., Apache HTTPD, Apache Tomcat, Nginx, GlassFish, JBoss, Puma, Passenger, IIS). - Scripting or development experience using Python, Node.js/JavaScript, PowerShell or C# (.NET), Ruby, or Go.
Science, Math, or Financial Engineering degree. Strong knowledge of other programming languages, e.g. JavaScript, TypeScript, Kotlin. Strong knowledge of data orchestration technologies, e.g. Apache Airflow, Dagster, AWS Step Functions. Understanding of ETL/ELT workflows, data modeling, and performance optimization for both batch and real-time processing.
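The batch side of an ETL/ELT workflow like the one described above can be sketched in plain Python. This is a minimal illustration only: the record shape and field names are hypothetical, and a production pipeline would run these steps under an orchestrator such as Airflow or Dagster rather than as a single script.

```python
# Minimal batch ETL sketch: extract raw records, transform them,
# and load the result into an in-memory "warehouse" table.
# Record shape and field names are hypothetical.

def extract():
    # Stand-in for reading from an API, file, or source database.
    return [
        {"id": "1", "amount": "10.50", "region": "EU"},
        {"id": "2", "amount": "3.25", "region": "US"},
        {"id": "3", "amount": "bad", "region": "EU"},  # malformed row
    ]

def transform(rows):
    # Cast types and drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({
                "id": int(row["id"]),
                "amount": float(row["amount"]),
                "region": row["region"],
            })
        except ValueError:
            # A real pipeline would route bad rows to a dead-letter store.
            continue
    return clean

def load(rows):
    # Aggregate per region, mimicking a load into a reporting table.
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

warehouse = load(transform(extract()))
print(warehouse)  # {'EU': 10.5, 'US': 3.25}
```

The same three stages map directly onto tasks in an orchestrated DAG, which is why experience with one usually transfers to the other.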
performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise … or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus.
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience working with process-scheduling platforms like Apache Airflow. Open to working in GS proprietary technology like Slang/SecDB. An understanding of compute resources and the ability to interpret performance metrics
Strong communication skills and the ability to work in a team. Strong analytical and problem-solving skills. PREFERRED QUALIFICATIONS Experience with Kubernetes deployment architectures. Apache NiFi experience. Experience building trading controls within an investment bank. ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas to
end-to-end solutions, learn from experts, and leverage various technologies depending on the team, including Java, JavaScript, TypeScript, React, APIs, MongoDB, Elasticsearch, Kafka, Kubernetes, Apache Flink, and Kafka CDC; be able to innovate and incubate new ideas; have an opportunity to work on a broad range of problems, often dealing
diagram of proposed tables to enable discussion. Good communicator, comfortable presenting ideas and outputs to technical and non-technical users. Experience with Apache Airflow, including creating DAGs. Ability to work within Agile, considering minimum viable products, story pointing, and sprints. More information: Enjoy fantastic perks like
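"Creating DAGs" in Airflow means declaring tasks and their dependencies in a Python module that the scheduler picks up. A minimal sketch might look like the following; the DAG name, schedule, and task callables are all illustrative, and the file assumes `apache-airflow` is installed (it is not run directly, but parsed by the Airflow scheduler).

```python
# Illustrative Airflow DAG definition (assumes apache-airflow is installed).
# DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # placeholder: pull data from a source system

def transform():
    ...  # placeholder: clean and reshape the extracted data

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)

    # The >> operator declares ordering: extract runs before transform.
    t_extract >> t_transform
```

The `>>` dependency syntax is what makes the workflow a directed acyclic graph: Airflow derives the execution order and retry behavior from these declarations rather than from the order of the code.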
Experience in mentoring junior team members. Experience with Oracle/relational databases and/or MongoDB. Experience with GitLab CI/CD pipelines. Knowledge of Apache NiFi. Experience with JavaScript/TypeScript and React. Experience with Elasticsearch and Kibana. Knowledge of Hibernate. Proficiency in the Atlassian suite: Bitbucket, Jira
AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and AI/ML data engineering on AWS. Benefits: Competitive salary and performance
Proficiency in version control tools like Git ensures effective collaboration and management of code and data models. Experience with workflow automation tools, such as Apache Airflow, is crucial for streamlining and orchestrating complex data processes. Skilled at integrating data from diverse sources, including APIs, databases, and third-party systems
Science or equivalent. Experience in developing Finance- or HR-related applications. Working experience with Tableau. Working experience with Terraform. Experience in creating workflows for Apache Airflow and Jenkins. Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees
experience working with relational and non-relational databases (e.g., Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g., Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure
Proficiency in data visualization libraries (Plotly, Seaborn). Solid understanding of database design principles and normalization. Experience with ETL tools and processes, and with Apache Oozie or similar workflow management tools. Understanding of Machine Learning and AI concepts is a plus. Leadership & Interpersonal Skills: Proven track record of managing
quickly and apply new skills. Desirable: Solid understanding of microservices development. Working knowledge of SQL and NoSQL databases. Familiar with, or able to quickly learn, Apache NiFi, Apache Airflow, Apache Kafka, Keycloak, serverless computing, GraphQL, APIs, and APIM. Good skills working with JSON, XML, and YAML files. Knowledge of Python
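Working with JSON and XML files from Python, as asked for above, is covered entirely by the standard library (YAML needs a third-party package such as PyYAML, so it is left out of this sketch). The payload structure below is invented for illustration.

```python
# Reading JSON and XML with the Python standard library.
# The payload shape is hypothetical, for illustration only.
import json
import xml.etree.ElementTree as ET

# Parse a small JSON document into native dicts/lists.
payload = json.loads('{"service": "ingest", "enabled": true, "ports": [8080, 8443]}')
assert payload["enabled"] and 8443 in payload["ports"]

# Parse an equivalent XML fragment and read the same fields.
doc = ET.fromstring(
    "<service name='ingest' enabled='true'>"
    "<port>8080</port><port>8443</port>"
    "</service>"
)
ports = [int(p.text) for p in doc.findall("port")]
print(doc.get("name"), ports)  # ingest [8080, 8443]
```

Note the structural difference: JSON maps directly onto Python types, while XML requires navigating an element tree and casting text content yourself.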
data within multiple EDRMS and Content Management Systems. Understanding of streaming data technologies and methodologies. Experience in mainstream Cloud Data Lakehouse platforms (such as Apache Spark, Microsoft Fabric, Databricks, Snowflake) and associated industry-standard, portable data formats (e.g., Delta Lake, Iceberg, Parquet, CSV, JSON, Avro, ORC, and XML).