with Data Quality and Data Governance concepts and experience. 11. (Desired) Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache NiFi and other ETL tools. 12. (Desired) Demonstrated experience with Apache Spark.
Python Experience with containerization, including Podman, Docker, and Kubernetes Ability to develop and deploy services on cloud platforms, including AWS Ability to process and ingest large-scale data using Apache Spark Top Secret clearance Bachelor's degree Nice If You Have: Experience with developing ETL and ELT pipelines with Apache NiFi and Databricks Ability to perform message queuing … and real-time streaming with Apache Kafka Ability to perform Scala programming Clearance: Applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information; Top Secret clearance is required. Compensation: At Booz Allen, we celebrate your contributions, provide you with opportunities and choices, and support your total well-being.
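For context on the Spark ingestion skill this listing names, here is a minimal PySpark sketch; the bucket paths, column names, and app name are placeholders invented for illustration, not details from the posting.

```python
# Minimal PySpark ingestion sketch; paths, columns, and app name are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("event-ingest").getOrCreate()

# Read raw JSON events from object storage (hypothetical bucket).
raw = spark.read.json("s3a://example-bucket/raw/events/")

# Light ETL: drop malformed rows, normalize the timestamp, deduplicate.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Write partitioned Parquet for downstream consumers.
clean.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/curated/events/"
)
```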
technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have: Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam, or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka … skills. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Desirable Skills: Designing Databricks-based solutions.
technologies. Experience with Terraform. Experience with DevSecOps solutions and tools. Experience with Data Quality and Data Governance concepts. Experience maintaining, supporting, and improving the ETL process using Apache NiFi or similar tools. Experience with Apache Spark. Equal Opportunity Employer/Veterans/Disabled. Accommodations: If you are a qualified individual with a disability or a disabled …
in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management Proficiency in Python, SQL, Scala, or Java Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory Strong understanding of data architecture principles, data modelling, and data governance Experience with cloud-based data platforms, including Azure and/or …
notebook development, and automation capabilities. Strong expertise in designing and implementing data ingestion pipelines, data transformations, and data quality processes using Databricks. Experience with big data technologies such as Apache Spark, Apache Hive, Delta Lake, and Hadoop. Solid understanding of data governance principles, data modeling, data cataloging, and metadata management. Hands-on experience with cloud platforms like AWS.
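As a sketch of the Delta Lake work this listing describes, the snippet below upserts a batch of updates into an existing Delta table with SQL MERGE; the table, landing path, and key column are hypothetical.

```python
# Hypothetical Delta Lake upsert on Databricks; assumes the target table exists.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

# Batch of changed rows from a (made-up) landing zone.
updates = spark.read.parquet("/mnt/landing/customers/")
updates.createOrReplaceTempView("customer_updates")

# MERGE is the usual Delta pattern for idempotent, quality-controlled loads.
spark.sql("""
    MERGE INTO analytics.customers AS t
    USING customer_updates AS s
    ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```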
Arlington, Virginia, United States Hybrid / WFH Options
Full Visibility LLC
AWS S3, Azure Blob, MinIO, or similar) Proficiency in data parsing and transformation, handling structured and unstructured data Hands-on experience with ETL tools and data workflow orchestration (e.g., Apache Airflow, Luigi, Prefect) Strong programming skills in Python, SQL, or Scala Experience with open-source data processing tools (e.g., Kafka, Spark, Flink, Hadoop) Familiarity with database technologies (PostgreSQL, MySQL, …)
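To make the orchestration requirement concrete, here is a minimal Airflow DAG in the extract-transform-load shape this listing describes; the task bodies, schedule, and DAG id are assumptions for illustration.

```python
# Minimal Airflow DAG sketch; task bodies are stubs and all names are invented.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")         # e.g. from an API, S3, or MinIO

def transform():
    print("parse and clean records")  # structured/unstructured parsing step

def load():
    print("load into the warehouse")  # e.g. PostgreSQL

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # `schedule_interval` on Airflow versions before 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> transform_task >> load_task
```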
and technologies: Required Skills: JavaScript (front-end frameworks preferred) Java or Python programming languages Prior experience working with one or more of the following: Red Hat Linux, Postgres, AWS, Apache Experience collaborating cross-functionally with subcontractors, local engineers, and dispersed teammates Preferred Skills: Angular and AngularJS experience (especially AngularJS to Angular code migration) Docker/Kubernetes/Containerization experience.
these languages: JavaScript (front-end frameworks preferred), Java, or Python. Prior experience working with one or more of the following environments and applications: Red Hat Linux, Postgres, AWS, and Apache Experience working on a Development team within SAFe or other Agile Release Trains Experience with Angular and/or AngularJS Experience with Git and contributing to open-source projects.
technical and professional experience Preferred Skills: Experience working within the public sector. Knowledge of cloud platforms (e.g., IBM Cloud, AWS, Azure). Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop). Understanding of data warehousing concepts and experience with tools like IBM Cognos or Tableau. Certifications: While not required, the following certifications would be highly beneficial: … ABOUT BUSINESS UNIT IBM Consulting is IBM's consulting and global professional services business, with market-leading capabilities in …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Anson Mccade
tools like JUnit, Git, Jira, MongoDB, and React Familiarity with cloud platforms (especially AWS), microservices, and containerisation DV clearance (or eligibility to obtain it) Nice to Have: Experience with Apache NiFi, JSF, Hibernate, Elasticsearch, Kibana, or AWS services like EC2, Lambda, EKS; CI/CD pipeline expertise using GitLab; knowledge of secure, scalable architectures for cloud deployments.
Science or a related field. Experience working on and shipping live service games. Experience working on Spring Boot projects. Experience deploying software/services on Kubernetes. Experience working with Apache Spark and Iceberg.
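A brief sketch of what "working with Apache Spark and Iceberg" can look like in practice; the catalog configuration, warehouse path, and table names below are placeholders, and the Iceberg Spark runtime JAR is assumed to be on the classpath.

```python
# Hypothetical Iceberg-on-Spark setup; catalog name, warehouse path, and
# table names are invented. Requires the iceberg-spark-runtime package.
from pyspark.sql import SparkSession

spark = (SparkSession.builder.appName("iceberg-example")
         .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
         .config("spark.sql.catalog.demo.type", "hadoop")
         .config("spark.sql.catalog.demo.warehouse", "/tmp/iceberg-warehouse")
         .getOrCreate())

players = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "player"])

# DataFrameWriterV2 (Spark 3+) writes the frame out as an Iceberg table.
players.writeTo("demo.db.players").using("iceberg").createOrReplace()
```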
Science, Software Engineering, or a related field. Extensive experience with a broad range of data technologies, platforms, and languages, such as SQL, Python, Java, and data orchestration tools like Apache Airflow. Demonstrated expertise in modern data architecture principles, including data lakes, data warehousing, ETL processes, and real-time data processing. Proficiency in Agile methodologies tailored for data-centric projects.
NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase. Demonstrated experience maintaining, upgrading, troubleshooting, and managing software, hardware, and networks (specifically the hardware and networks piece). Demonstrated experience with Apache NiFi. Demonstrated experience with Extract, Transform, and Load (ETL) processes.
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom
Randstad Technologies Recruitment
institutions, alongside a proven record of relevant professional experience. Proven experience in a data specialist role with a passion for solving data-related problems. Expertise in SQL, Python, and Apache Spark, with experience working in a production environment. Familiarity with Databricks and Microsoft Azure is a plus. Financial Services experience is a bonus, but not required. Strong verbal and …
and engineering practices. Key competencies include: Microsoft Fabric expertise: Designing and delivering data solutions using Microsoft Fabric, including Pipelines, Notebooks, and Dataflows Gen2. Programming and query languages: Proficiency in Python, Apache Spark, and KQL (Kusto Query Language). End-to-end data solution delivery: Experience with Data Governance, Migration, Modelling, ETL/ELT, Data Lakes, Warehousing, MDM, and BI. Engineering delivery …
Washington, Washington DC, United States Hybrid / WFH Options
Guydo Consulting, Inc
experience OR No degree with 8+ years of professional experience Java, JavaScript, C#, TypeScript, Node.js, Angular, Spring, Maven, Python Both relational SQL and NoSQL database experience Web Application development (Apache, Tomcat, JBoss, Nginx, Node.js, IIS) Write well-designed, testable, efficient full-stack code. Design, develop code, test, and/or modify enterprise-wide systems and/or applications software.
Computer Science, Engineering, or a related field, or equivalent industry experience. Preferred Qualifications Experience or interest in mentoring junior engineers. Familiarity with data-centric workflows and pipeline orchestration (e.g., Apache Airflow). Proficiency in data validation, anomaly detection, or debugging using tools like Pandas, Polars, or data.table/R. Experience working with AWS or other cloud platforms. Knowledge of …
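As a small illustration of the Pandas-based validation work mentioned above; the column names, sample rows, and checks are all invented:

```python
# Tiny data-validation sketch with Pandas; data and checks are made up.
import pandas as pd

df = pd.DataFrame({"order_id": [1, 2, 2, 4], "amount": [9.99, -5.0, 19.5, None]})

checks = {
    "duplicate_ids": int(df["order_id"].duplicated().sum()),
    "null_amounts": int(df["amount"].isna().sum()),
    "negative_amounts": int((df["amount"] < 0).sum()),
}
print(checks)  # {'duplicate_ids': 1, 'null_amounts': 1, 'negative_amounts': 1}
```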
the development and adherence to data governance standards. Data-Driven Culture Champion: Advocate for the strategic use of data across the organization. Skills-wise, you'll definitely need: Expertise in Apache Spark Advanced proficiency in Python and PySpark Extensive experience with Databricks Advanced SQL knowledge Proven leadership abilities in data engineering Strong experience in building and managing CI/CD …
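In the spirit of the CI/CD and PySpark expertise listed, a sketch of a small, unit-testable transformation; the function, columns, and sample data are invented:

```python
# A pure PySpark transformation, easy to unit test in a CI pipeline.
# Function and column names are illustrative only.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

def add_revenue(df: DataFrame) -> DataFrame:
    """Derive a revenue column; pure functions like this are simple to test."""
    return df.withColumn("revenue", F.col("units") * F.col("unit_price"))

if __name__ == "__main__":
    # Local session so the same code runs in a CI job without a cluster.
    spark = SparkSession.builder.master("local[1]").appName("ci-demo").getOrCreate()
    sample = spark.createDataFrame([(3, 2.5), (10, 1.0)], ["units", "unit_price"])
    add_revenue(sample).show()
```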
Leverage full-stack technologies including Java, JavaScript, TypeScript, React, APIs, MongoDB, Elastic Search, DMN, BPMN, and Kubernetes. Utilize data-streaming technologies such as Kafka CDC, Kafka topics, EMS, and Apache Flink. Innovate and incubate new ideas. Work on a broad range of problems involving large data sets, real-time processing, messaging, workflows, and UI/UX. Drive the full …
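To ground the Kafka CDC mention, a minimal consumer sketch using the kafka-python client; the topic, broker address, group id, and event fields are placeholders:

```python
# Minimal Kafka consumer for a hypothetical CDC topic (kafka-python client).
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders.cdc",                         # invented change-data-capture topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="order-workflow",
)

for message in consumer:
    change = message.value                # one CDC event as a dict
    # 'op' and 'after' follow the common Debezium-style envelope (an assumption).
    print(change.get("op"), change.get("after"))
```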
data-driven performance analysis and optimization Strong communication skills and ability to work in a team Strong analytical and problem-solving skills PREFERRED QUALIFICATIONS Experience with Kubernetes deployment architectures Apache NiFi experience Experience building trading controls within an investment bank ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital, and ideas to help our clients, shareholders, and …
full-stack technologies including Java, JavaScript, TypeScript, React, APIs, MongoDB, Elastic Search, DMN, BPMN, and Kubernetes; leverage data-streaming technologies including Kafka CDC, Kafka topics and related technologies, EMS, and Apache Flink; be able to innovate and incubate new ideas; have an opportunity to work on a broad range of problems, often dealing with large data sets, including real-time …
Traditional Test Automation Suites such as Micro Focus Unified Functional Testing (UFT) • Development tools/technology such as Java, J2EE technologies, Oracle, Postgres, XML, XSLT, WSDL, SOAP, JSP, Apache Groovy, and jQuery • Ability to convey technical information to non-technical individuals • Ability to work independently • Candidate must have strong communication skills Job Requirements: Key Tasks and Responsibilities • Evaluate …