better world for future generations? AWS Utility Computing (UC) provides product innovations - from foundational services such as Amazon's Simple Storage Service (S3) and Amazon Elastic Compute Cloud (EC2), to consistently released new product innovations that continue to set AWS's services and features apart … of Things (IoT), Platform, and Productivity Apps services in AWS, including support for customers who require specialized security solutions for their cloud services. Amazon Web Services is seeking the industry's top technical experts to grow our team dedicated to expanding our Databases suite of services. We are … DevOps philosophies, yet is not constrained by how "things are usually done" and is willing to deconstruct and reinvent systems, processes, and tools. Amazon has a fast-paced environment where we "Work Hard, Have Fun, Make History." On a typical day engineers might deep dive to root cause …
member of our team, you will contribute to the design, development, and deployment of applications hosted on AWS, leveraging its serverless infrastructure, including DynamoDB, S3, Lambda, and CloudFront. Your experience with Infrastructure as Code (IaC) under AWS will be essential in streamlining deployment processes and ensuring efficient infrastructure management. … backend, and React for frontend. Design and implement scalable, secure, and performant RESTful APIs. Build serverless applications on AWS leveraging services such as DynamoDB, S3, Lambda, CloudFront, and API Gateway. Write efficient and scalable infrastructure using AWS CDK (Infrastructure as Code). Collaborate with cross-functional teams to define … with Node.js for backend development. Experience with React.js for building dynamic and responsive frontend interfaces. Hands-on experience with AWS serverless services: Lambda, DynamoDB, S3, CloudFront, API Gateway. Experience with AWS CDK for Infrastructure as Code. Experience with RESTful API design and development. Familiarity with GraphQL (highly desirable but …
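The serverless pattern this listing describes (Lambda behind API Gateway returning JSON) can be sketched in a few lines. This is a minimal illustration, not code from the listing: the handler name, the `id` path parameter, and the response fields are invented for the example, though the event and response shapes follow the standard API Gateway proxy integration.

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler sketch for a REST endpoint behind
    API Gateway: reads a path parameter and returns a JSON response."""
    # pathParameters may be absent or None on some invocations.
    item_id = (event.get("pathParameters") or {}).get("id", "unknown")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": item_id}),
    }

# Local invocation with a stubbed API Gateway proxy event:
resp = handler({"pathParameters": {"id": "42"}}, None)
print(resp["statusCode"], resp["body"])  # → 200 {"id": "42"}
```

In production the same handler would sit behind an API Gateway route and typically read from or write to DynamoDB; locally it can be exercised with stubbed events as above.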
Systems Engineer/DevOps Engineer, Managed Operations Job ID: Amazon Development Centre Ireland Limited - D94 AWS is set to introduce the inaugural European Sovereign Cloud (ESC), marking a significant development in utility computing (UC). To spearhead this initiative, we are actively seeking experienced systems engineers with a … team, you will play a pivotal role in building and leading operations and development teams dedicated to delivering high-availability AWS services, including EC2, S3, Dynamo, Lambda, and Bedrock, exclusively for EU customers. For more information on ESC please check out our blog: Your responsibilities will encompass overseeing the … AWS Utility Computing (UC). AWS Utility Computing (UC) provides product innovations - from foundational services such as Amazon's Simple Storage Service (S3) and Amazon Elastic Compute Cloud (EC2), to consistently released new product innovations that continue to set AWS's services and features apart …
deep passion and desire to engineer and operate the world's largest cloud computing infrastructure to provide a better world for future generations? Amazon Web Services (AWS) builds and operates some of the largest internet infrastructure on the planet; providing companies of all sizes with an infrastructure web … driven by learning and innovation. AWS Utility Computing (UC) provides product innovations - from foundational services such as Amazon's Simple Storage Service (S3) and Amazon Elastic Compute Cloud (EC2), to consistently released new product innovations that continue to set AWS's services and features apart … to support and operate some of the most comprehensive artificial intelligence and machine learning services in the world. From SageMaker to Bedrock to Amazon Q, the AI/ML team manages some of the most exciting, rapidly evolving services in AWS. If you join us, you'll be …
RedHat Linux 6 or 7 (Certification is not required) Experience with shell scripting in a Linux/UNIX environment Familiarity with AWS services (EC2, S3, CloudWatch, etc.) Knowledge of Kubernetes and Terraform Experience with storage systems, including working with server disks, file systems, volume management software (e.g., LVM), and …
Experience working with large datasets. Exceptional communication and collaboration skills. Advantageous: Any operational experience supporting real-time systems. Working knowledge of Amazon S3, Airflow, Kafka, Docker. …
• As well as realisation of data security and data governance requirements. • Azure Data Factory, Azure Synapse, Azure Data Lake Storage Gen 2. • AWS S3, AWS Glue and CloudFormation. • Databricks configuration, notebooks and pipelines. • Support I-SP-IPA platform architects in the realisation of responsibilities for the Data Analytics Platform. …
SQL coding skills You have experience with big data frameworks and tools including Spark You have a good knowledge of AWS data services (e.g. S3, Redshift, EMR, Glue) You have strong analytical, problem-solving and critical thinking skills You have excellent communication skills and experience of working across teams …
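The "SQL coding skills" these data-engineering listings ask for usually mean analytical aggregation queries against a warehouse. A self-contained sketch of that kind of query, with `sqlite3` standing in for a warehouse engine such as Redshift, and the table and column names invented for the example:

```python
import sqlite3

# In-memory database standing in for a warehouse table (e.g. in Redshift).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("eu", 10.0), ("eu", 30.0), ("us", 5.0)],
)

# Typical analytical aggregation: total revenue per region, largest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM orders GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # → [('eu', 40.0), ('us', 5.0)]
```

The same `GROUP BY`/`ORDER BY` shape carries over directly to Redshift or Spark SQL; only the connection layer changes.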
and how these can be practically applied. Experience with Python or other scripting languages Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, DataPlex, Looker) About Our Process We can be flexible with …
Google Ads) Data Quality & Operations: Ensure quality, reliability, and stability across data models and pipelines Create secure, efficient data shares for external partners (e.g., S3, SFTP, Snowflake Data Sharing) Contribute to the evolution and continuous improvement of our data platform, tooling, and processes Stakeholder Enablement: Support business users across …
Web browser, HDMI, Bluetooth, WiFi/Ethernet. Develop anomaly detection techniques and data-driven solutions to identify and address system issues. Apply AWS technologies (S3, Athena, QuickSight) for data management and visualization. Architect and implement DevOps solutions for operational improvement. Ensure timely and high-quality software releases across …
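A common baseline for the anomaly-detection work this listing alludes to is a z-score filter over a metric series. A minimal stdlib sketch; the threshold, function name, and sample latency data are invented for the example:

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Flag points whose z-score exceeds the threshold — a simple
    data-driven way to surface outliers in a system metric."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # constant series: nothing can be an outlier
    return [v for v in values if abs(v - mu) / sigma > threshold]

latencies = [102, 98, 101, 99, 100, 97, 103, 500]  # one obvious spike
print(zscore_anomalies(latencies, threshold=2.0))  # → [500]
```

In practice the threshold is tuned per metric, and a rolling window replaces the global mean/stdev so the detector adapts to drift; results would then feed a dashboard such as QuickSight.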
expertise RedHat Linux 6 or 7 (being certified is not essential). Shell scripting experience in Linux/UNIX environment. Knowledge of AWS (EC2, S3, CloudWatch, etc.) Knowledge of Kubernetes and Terraform Storage experience, i.e. working with server disks, file systems and volume management software (like LVM) and understanding …
PostgreSQL, MySQL, MongoDB, Cassandra). Experience with data pipeline orchestration tools. Experience with cloud-based data platforms such as AWS, GCP, or Azure (e.g., S3, BigQuery, Azure Data Lake Storage). Comfortable working in fast-paced, agile environments, managing uncertainty and ambiguity. Preferred qualifications: Additional legal knowledge as evidenced …
through cloud platforms, utilising new LLM models and building out new pipelines. Desired Skills ⚙️ Python, SQL Tableau, Power BI Terraform, Bedrock AWS (DMS, Redshift, S3) Azure (Synapse, Microsoft AI) If you are a skilled Engineer (Python, SQL, Tableau, AWS, Azure) who is interested in this role then please apply …
systems. Ensure security across the Data & Analytics technology stack, which consists primarily of: Oracle tools, Snowflake, Postgres, and various AWS services (SageMaker, Lambda, Step Functions, DMS, S3, etc.) in the AWS Cloud. Data Security Engineer - Your Background The ideal Data Security Engineer will have: Experience in a similar role, in both …
St. Albans, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Premier Foods
Premier Foods and to contribute to driving business performance through effective data-driven decision-making. The Key Requirements... Strong experience with AWS services (e.g., S3, Redshift, RDS) Proficiency in Matillion ETL Strong SQL skills Experience with data warehousing concepts and best practices Analytical mindset Self-motivated, results-driven Logical …
MATLAB, SAS, and InfoSphere • Experience with a combination of database tools and cloud components supporting ETL, such as: SQL, MySQL, and Data Lakes, AWS S3, IBM Cloud, and/or Microsoft Azure Data Explorer. • Experience working with data in a variety of structured and unstructured formats. • Experience with …