as Informatica PowerCenter and BDM, AutoSys, Airflow, and SQL Server Agent. Experience with cloud platforms, preferably AWS. Strong knowledge of AWS cloud services, including EMR, RDS Postgres, Redshift, Athena, S3, and IAM. Solid understanding of data warehousing principles and best practices. Strong proficiency in SQL for data manipulation and reporting …
Arlington, Virginia, United States Hybrid / WFH Options
Amazon
DESCRIPTION The Amazon Web Services Professional Services (ProServe) team is seeking a skilled Delivery Consultant to join our team at Amazon Web Services (AWS). In this role, you'll work closely with customers to design, implement, and manage AWS solutions that meet their technical requirements … experience - 3+ years of experience with data warehouse architecture, ETL/ELT tools, data engineering, and large-scale data manipulation using technologies like Spark, EMR, Hive, Kafka, and Redshift - Experience with relational databases, SQL, and performance tuning, as well as software engineering best practices for the development lifecycle, including … with large datasets and extracting value from them - Experience leading large-scale data engineering and analytics projects using AWS technologies like Redshift, S3, Glue, EMR, Kinesis, Firehose, and Lambda, as well as experience with non-relational databases and implementing data governance solutions Amazon is an equal opportunity …
data technologies such as Hadoop, Hive, Spark, EMR. Experience with ETL tools like Informatica, ODI, SSIS, BODI, or DataStage. Our inclusive culture empowers Amazon employees to deliver the best results for our customers. If you have a disability and need workplace accommodations during the application and hiring process …
methodology, continuous integration and delivery (CI/CD), and related tools. Knowledge of Docker or container-based systems. Knowledge of AWS services such as EMR, Connect, CloudFormation, Lambda, SNS, and SQS. Knowledge of Chef. AWS or RHEL certification is preferred. At Cognizant you will experience an exciting mix of …
to automate quality control processes. Requirements: 3+ years of hands-on Python development experience for production systems. Proficient in cloud data computing technologies (EMR, Databricks) within leading cloud platforms (AWS, Google Cloud, or Azure). Skilled in designing and implementing data pipelines using distributed storage platforms. Demonstrated analytical skills …
V, OpenStack, VMware, RHVE. Experience with Azure/GCP and AWS. Experience in automation of performance testing. Exposure to data environments is a plus (Airflow, EMR, SageMaker, Ray, TensorFlow, MLflow, Kubeflow, Dask). Working conditions: occasional out-of-hours conferencing with overseas colleagues; occasional out-of-hours or weekend …
recruiting a Software Engineer focussing on Python for our Software Team. Our Tech Stack: Python, FastAPI, Redis, Postgres, React, Plotly, Docker, Athena SQL, Athena & EMR Spark, ECS, Temporal, AWS, Azure. What you can expect as a Software Engineer at Monolith AI: As a Senior/Lead Software Engineer at …
solutions that truly matter. Key Responsibilities: Design, develop, and optimize scalable data pipelines using technologies such as Apache Spark, Apache Iceberg, Trino, OpenSearch, AWS EMR, NiFi, and Kubernetes containers. Ingest and move structured and unstructured data using approved methods into enterprise or local storage systems. Create robust ETL processes …
data stack, including (but not limited to) Spark/Hadoop, Kafka, Aerospike/DynamoDB. Experience with the AWS tech stack, including but not limited to EMR, Athena, EKS. Expert knowledge of multi-threading, memory models, etc. Understanding of database fundamentals and MySQL knowledge. Experience with CI/CD tools such as Jenkins …
with DevOps. Experience with Kubernetes and Helm charts. Experience with Docker and/or Podman. Containerization best practices. Experience with AWS services (EKS, Lambdas, EMR, ECR, VPC, etc.). Experience with Terraform. Desired skills: Terraform and Packer with AWS (AMIs, environment, configuration automation, etc.). Database ORM tools (Flyway, etc.). CI …
Are you interested in a rapidly growing business? The Amazon ARTS team is responsible for creating core analytics tech capabilities and data engineering services for ROW (the rest of the world, excluding NA and EU) countries. It comprises platform development, research science, software development, and data engineering. ARTS develops … and lead it to a smooth launch with limited manager guidance; 3) help stakeholders and junior teammates ramp up with team-internal and Amazon tech products (e.g., Redshift Cluster, Datanet, Query Performance); 4) develop data pipelines (daily/intra-day/real-time) independently; 5) routine KTLO on … QUALIFICATIONS - 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. - Experience with big data technologies such as Hadoop, Hive, Spark, EMR - Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions. Our inclusive culture empowers Amazonians …
for customer data. Implement the principle of least-privilege access. Ensure reliability and security of microservices. Skills and requirements: Proficiency in AWS services such as EC2, RDS, EMR, VPC networking, S3, etc. Experience with programming languages like Python, Ruby, Java, Groovy, and Bash. Familiarity with infrastructure-as-code tools like Terraform and …
years of experience in scaling SaaS services and infrastructure Goal-driven, positive attitude with strong communication skills Proficient in AWS services, including EC2, RDS, EMR, VPC networking, S3, etc. Comfortable with one or more of: Python, Ruby, Java, Groovy, Bash Working knowledge with one or more of: Terraform, Ansible …
JDK 1.8) and preferably some experience with Spring Boot. Good to have experience using AWS cloud services (e.g., EC2, S3, Lambda, MSK, ECS, EMR, RDS, Athena, etc.). Experience working with Maven, Jenkins, Git, etc. Understanding of database concepts and working knowledge of any of the vendors (preferably Oracle …
are looking for someone who can demonstrate an aptitude or willingness to learn some or all of the following technologies: AWS (S3, IAM, RDS, EMR, EC2, etc.), Linux commands, Trino, Apache Spark, Node.js, JavaScript, Preact.js, Postgres, MySQL, HTML, CSS. Target salary range is $125k-$150k or more, depending on …
Relocation assistance: No. Visa sponsorship eligibility: No. Job description: Java full-stack developer with L3 support experience. Should be very strong in AWS services: EC2, S3, EMR, Lambda, CloudWatch, and more. Java with microservices (no UI needed). MongoDB experience. Design, develop, and deploy web applications using AWS services such …
Mathematics, Physics, or a related field - Good understanding of distributed computing environments - Prior working experience with AWS - any or all of EC2, S3, EBS, EMR … Amazon is an equal opportunities employer, and we value your passion to discover, invent, simplify and build. We welcome applications from all … members of society irrespective of age, sex, disability, sexual orientation, race, religion or belief. Amazon is strongly committed to diversity within its community and especially welcomes applications from South African citizens who are members of designated groups who may contribute to Employment Equity within the workplace and the … Equity will be considered when appointing potential candidates. We are required by law to verify your ability to work lawfully in South Africa. Amazon requires that you submit a copy of either your identity document or your passport and any applicable work permit if you are a foreign …
Engineer has the following responsibilities: Design, develop, and implement robust data pipelines and data warehousing solutions using AWS services (e.g., Redshift, Glue, S3, Lambda, EMR, Kinesis). Optimize data storage, retrieval, and processing for performance and scalability. Manage and maintain AWS data infrastructure, ensuring high availability and reliability. Develop … years of proven experience in data engineering, with a strong focus on AWS services. Proven expertise in AWS data services (Redshift, Glue, S3, Lambda, EMR, Kinesis, etc.). Strong proficiency in SQL and Python. Experience integrating data with Microsoft Power BI. Excellent communication and presentation skills, with the ability …
day in a scalable fashion using AWS technologies? Do you want to create the next-generation tools for intuitive data access? If so, Amazon Finance Technology (FinTech) is for you! Amazon's Financial Technology team is looking for a passionate, results-oriented, inventive Data Engineer who … understands how to deal with large sets of data and transactions and will help us deliver on a new generation of software, leveraging Amazon Web Services. The candidate is passionate about technology and wants to be involved with real business problems. Our platform serves Amazon's … experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with SQL PREFERRED QUALIFICATIONS - Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions - Experience with non-relational databases/data stores (object storage, document or key-value stores, graph …
Have you ever ordered a product from Amazon and been amazed at how fast it gets to you? Every day Amazon engineers are relentlessly working to decrease the time between Click to Deliver for your products. The Amazon Fulfillment Technologies (AFT) team owns all … of the software and infrastructure which powers Amazon's world-class fulfillment engine. Our team is building complex, massive data systems to capture data during every step in the automated pipeline and use that data to proactively predict efficiency and cost improvements to deliver the packages fast to … years of SQL experience - Experience with data modeling, warehousing and building ETL pipelines PREFERRED QUALIFICATIONS - Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions - Experience with non-relational databases/data stores (object storage, document or key-value stores, graph …
databases, and services. Cloud infrastructure is deployed using AWS C2S available services within the IDL VPC boundaries. Commonly used AWS tools include Managed OpenSearch, EMR, EC2, Lambdas, etc. Microservices and Kubernetes container orchestration are used to deploy applications in a managed way. Working with the ISSM and various networking …
Architect, AWS Certified Cloud Practitioner, SysOps Administrator (Associate), Certified DevOps Engineer, or Big Data (Specialty). 5 to 8+ years' experience utilizing Amazon Web Services to create cloud-based applications such as (but not limited to) Lambda, SWF, SNS, SQS, Elasticsearch, CloudWatch, CloudFormation, S3, RDS, and Bedrock. …
opportunities for leveraging AWS services and implement effective metrics and monitoring processes. Your Skills and Experience: Experience of AWS tools (e.g. Athena, Redshift, Glue, EMR), Java, Scala, Python, Spark. Experience of developing enterprise-grade ETL/ELT data pipelines and demonstrable knowledge of applying data engineering best practices (coding … practices to DS, unit testing, version control, code review). Big data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc, or GCP Cloud Data Fusion. Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub, and Spark Streaming. Experience of working with CI/CD technologies, Git, Jenkins …