Functions, and Kinesis. Work with structured and unstructured data from multiple sources, ensuring efficient data ingestion, transformation, and storage. Develop and optimize data lake and data warehouse solutions using Amazon S3, Redshift, Athena, and Lake Formation. Implement data governance, security, and compliance best practices, including IAM roles, encryption, and access controls. Monitor and optimize performance of data workflows … engineering with a strong focus on AWS cloud technologies. Proficiency in Python, PySpark, SQL, and AWS Glue for ETL development. Hands-on experience with AWS data services, including Redshift, Athena, Glue, EMR, and Kinesis. Strong knowledge of data modeling, warehousing, and schema design. Experience with event-driven architectures, streaming data, and real-time processing using Kafka or Kinesis. Expertise …
to design innovative data solutions that address complex business requirements and drive decision-making. Your skills and experience Proficiency with AWS Tools: Demonstrable experience using AWS Glue, AWS Lambda, Amazon Kinesis, Amazon EMR, Amazon Athena, Amazon DynamoDB, Amazon CloudWatch, Amazon SNS and AWS Step Functions. Programming Skills: Strong experience with modern programming languages …
forward. It's an exciting time, and to continue our growth, we are recruiting a Senior Software Engineer focusing on Python for our Software Team. Our Tech Stack: AWS, Athena SQL, Athena Spark, ECS, Azure, Azure Synapse SQL & Spark, Python, Flask, FastAPI, Redis, Postgres, React, Plotly, Docker. We will potentially add GCP and on-premise in the …
Platform Engineer to join our team and help develop and maintain a high scale, fully serverless cloud data platform. Our product ingests and processes large volumes of data into Amazon S3 using a modern architecture based on AWS serverless services and enables powerful querying and analytics through Amazon Athena. In this role, you'll work on a system …
Experience working with big-data stack, including (but not limited to) Spark/Hadoop, Kafka, Aerospike/DynamoDB Experience with AWS tech stack, including but not limited to EMR, Athena, EKS Expert knowledge of multi-threading, memory model, etc. Understanding of database fundamentals and MySQL knowledge Experience with CI/CD tools such as Jenkins, Graphite, Grafana and Docker Knowledge of …
of orchestration tooling (e.g., Airflow, Dagster, Azure Data Factory, Fivetran) Desirable: - Experience deploying AI/ML models in production environments - Familiarity with AWS data services (e.g., S3, Glue, Kinesis, Athena) - Exposure to real-time data streaming and analytics paradigms #LI-RJ1 Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork …
and customise them for different use cases. Develop data models and Data Lake designs around stated use cases to capture KPIs and data transformations. Identify relevant AWS services, such as Amazon EMR, Redshift, Athena, Glue, and Lambda, to design an architecture that can support client workloads/use-cases; evaluate pros/cons among the identified options to arrive at …
Use Terraform to automate infrastructure provisioning, deployment, and configuration, ensuring efficiency and repeatability in cloud environments. Database Design & Optimisation: Design and optimise complex SQL queries, and relational databases (e.g., Amazon Redshift, PostgreSQL, MySQL) to enable fast, efficient data retrieval and analytics. Data Transformation: Apply ETL/ELT processes to transform raw financial data into usable insights for business intelligence … understanding of data engineering concepts, including data modelling, ETL/ELT processes, and data warehousing. Proven experience with AWS services (e.g., S3, Redshift, Lambda, ECS, ECR, SNS, EventBridge, CloudWatch, Athena etc.) for building and maintaining scalable data solutions in the cloud. Technical Skills (must have): Python: Proficient in Python for developing custom ETL solutions, data processing, and integration with …
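To illustrate the kind of ETL/ELT transformation work this role describes, here is a minimal Python sketch of a cleaning step for raw financial records. The record shape and field names are hypothetical; a real pipeline would read from S3 and load into Redshift via Glue or a COPY command.

```python
from datetime import datetime, timezone

def transform(raw_records):
    """Clean raw financial records into analytics-ready rows.

    Hypothetical field names, shown only as an ETL sketch: drop rows
    missing required fields, normalise identifiers, and store money
    in minor units to avoid float rounding downstream.
    """
    cleaned = []
    for rec in raw_records:
        # Skip rows missing the fields downstream models rely on.
        if not rec.get("account_id") or rec.get("amount") is None:
            continue
        cleaned.append({
            "account_id": rec["account_id"].strip().upper(),
            # Minor units (pence) keep aggregations exact.
            "amount_pence": round(float(rec["amount"]) * 100),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return cleaned
```

In a production pipeline the same logic would typically live in a Glue job or Lambda, with the provisioning of those resources handled by Terraform.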
Collaborate with development teams to design and implement automated tests for microservices, emphasizing Spring Boot and Java-based architectures. Implement testing strategies for AWS data lakes (e.g., S3, Glue, Athena) with a focus on schema evolution, data quality rules, and performance benchmarks, prioritizing data lake testing over traditional SQL approaches. Automate data tests within CI/CD workflows to … maintain scalable test automation frameworks, with a focus on backend, API, and data systems using tools like Pytest and Postman. Expertise in Pandas, SQL, and AWS analytics services (Glue, Athena, Redshift) for data profiling, transformation, and validation within data lakes. Solid experience with AWS (S3, Lambda, EMR, ECS/EKS, CloudFormation/Terraform) and understanding of cloud-native architectures …
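The schema-drift and data-quality testing this listing describes can be sketched in plain Python with Pytest-style assertions. The table schema and quality rules below are invented for illustration; in a real suite the rows would come from an Athena query against the lake, and the checks could equally be expressed with Pandas.

```python
EXPECTED_SCHEMA = {"order_id": str, "quantity": int, "price": float}

def check_quality(rows, schema=EXPECTED_SCHEMA):
    """Return a list of violation messages for a batch of rows.

    Two layers of checks, mirroring common data-lake test suites:
    schema checks (detect column drift) and row-level quality rules.
    """
    violations = []
    for i, row in enumerate(rows):
        # Schema check: unexpected or missing columns signal drift.
        if set(row) != set(schema):
            violations.append(f"row {i}: columns {sorted(row)} unexpected")
            continue
        for col, typ in schema.items():
            if not isinstance(row[col], typ):
                violations.append(f"row {i}: {col} is not {typ.__name__}")
        # Example quality rule: quantities must be positive.
        if isinstance(row.get("quantity"), int) and row["quantity"] <= 0:
            violations.append(f"row {i}: non-positive quantity")
    return violations
```

A CI job would run such checks against fresh partitions and fail the pipeline when `check_quality` returns a non-empty list.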
case to adopt new technologies Develop new tools and infrastructure using Python (Flask/FastAPI) or Java (Spring Boot) and relational data backend (AWS - Aurora/Redshift/Athena/S3) Support users and operational flows for quantitative risk, senior management and portfolio management teams using the tools developed Qualifications/Skills Required Advanced degree in computer science …
structured queries Hands-on experience with dbt, building and maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance technical …
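As a small sketch of the dimensional-modelling skills this listing asks for, the following Python mirrors what a dbt staging-plus-dimension model pair does in SQL: deduplicate source rows on a natural key and assign stable surrogate keys. The entity and column names are invented for illustration.

```python
def build_customer_dim(source_rows):
    """Deduplicate source rows into a customer dimension table.

    One row per natural key (customer_id), with a surrogate key
    (customer_sk) assigned in first-seen order -- the same shape a
    dbt dimension model would materialise.
    """
    dim, seen = [], {}
    for row in source_rows:
        natural_key = row["customer_id"]
        if natural_key in seen:
            continue  # keep the first occurrence only
        surrogate_key = len(dim) + 1
        seen[natural_key] = surrogate_key
        dim.append({
            "customer_sk": surrogate_key,
            "customer_id": natural_key,
            "name": row["name"],
        })
    return dim
```

Fact tables would then join on `customer_sk` rather than the natural key, which is the core idea behind dimensional modelling.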
data architecture principles and how these can be practically applied. Experience with Python or other scripting languages Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker) About Our Process We can be flexible with the structure of our interview process if …
and data ingestion tools such as Airflow and Stitch, along with Python scripting for integrating diverse data sources. Large-scale data processing: Proficient with distributed query engines like AWS Athena or SparkSQL for working with datasets at the scale of billions of rows. Event streaming data: Experienced in working with live streamed event data, including transforming and modeling real …
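The event-streaming work described above, transforming and modelling live event data, can be sketched as a generator pipeline. The event fields and the 30-minute session gap are assumptions for illustration; a real consumer would read records from Kinesis or Kafka rather than an in-memory iterable.

```python
import json

def parse_events(lines):
    """Yield decoded events, dropping malformed payloads."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # would go to a dead-letter queue in production

def sessionize(events, gap_seconds=1800):
    """Assign an incrementing session id per user whenever the gap
    between consecutive events exceeds gap_seconds."""
    last_seen, session_ids = {}, {}
    for ev in events:
        user, ts = ev["user_id"], ev["ts"]
        if user not in last_seen or ts - last_seen[user] > gap_seconds:
            session_ids[user] = session_ids.get(user, 0) + 1
        last_seen[user] = ts
        yield {**ev, "session": session_ids[user]}
```

Because both stages are generators, the pipeline processes events one at a time with constant memory, which is what makes the same shape workable on live streams.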
Amazon Last Mile - Routing and Planning DE Design, implement, and support data warehouse/data lake infrastructure using AWS big data stack, Python, Redshift, QuickSight, Glue/Lake Formation, EMR/Spark/Scala, Athena etc. • Extract huge volumes of structured and unstructured data from various sources (Relational/Non-relational/NoSQL database) and message streams … support for the interview or onboarding process, please visit for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner. Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status. …
Delivery Consultant - Data Analytics, AWS Professional Services Job ID: Amazon Web Services Australia Pty Ltd Are you a Data Analytics specialist? Do you have Real-time Data Analytics, Data Warehousing, Big Data, Modern Data Strategy, Data Lake and Data engineering experience? Do you like to solve the most complex and high scale (billions+ records) data challenges in the world … high impact projects that use the latest data analytics technologies? Would you like a career path that enables you to progress with the rapid adoption of cloud computing? At Amazon Web Services, we're hiring highly technical cloud architects specialised in data analytics to collaborate with our customers and partners to derive business value from the latest in analytics services. … decisions and desired customer outcomes. Key job responsibilities - Expertise - Collaborate with pre-sales and delivery teams to help partners and customers learn and use services such as AWS Glue, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic Map Reduce (EMR), Amazon Kinesis, Amazon Redshift, Amazon Athena, AWS Lake Formation …
with data privacy regulations. Technical Competencies The role is a hands-on technical leadership role with advanced experience in most of the following technologies Cloud Platforms: AWS (Amazon Web Services): Knowledge of services like S3, EC2, Lambda, RDS, Redshift, EMR, SageMaker, Glue, and Kinesis. Azure: Proficiency in services like Azure Blob Storage, Azure Data Lake, VMs, Azure … Lake Formation, Azure Purview. Data Security Tools: AWS Key Management Service (KMS), Azure Key Vault. Data Analytics & BI: Visualization Tools: Tableau, Power BI, Looker, and Grafana. Analytics Services: AWS Athena, Amazon QuickSight, Azure Stream Analytics. Development & Collaboration Tools: Version Control: Git (and platforms like GitHub, GitLab). CI/CD Tools: Jenkins, Travis CI, AWS CodePipeline, Azure DevOps. …
and maintaining robust, cloud-native data pipelines Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake Implementing Lakehouse architectures using tools like Databricks or Snowflake Collaborating closely with engineers, analysts, and client teams to deliver value-focused data … ve got solid experience working with Python, SQL, Spark and data pipeline tools such as dbt or Airflow You're comfortable working across cloud platforms - especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer) You have a good understanding of data modelling, data warehousing and performance optimisation You care deeply about data quality …
with Python coupled with strong SQL skills. In addition, you will also have a strong desire to work with Docker, Kubernetes, Airflow and AWS data technologies such as Athena, Redshift, EMR and various other tools in the AWS ecosystem. You would be joining a team of 25+ engineers across mobile, web, data and platform. We look for engineers …
Wakefield, Yorkshire, United Kingdom - Hybrid / WFH Options
Flippa.com
Senior Data Engineer - Data Infrastructure and Architecture: C-4 Analytics C-4 Analytics is a fast-growing, private, full-service digital marketing company that excels at helping automotive dealerships increase sales, increase market share, and lower cost per acquisition. We …
Summary Yelp engineering culture is driven by our values: we're a cooperative team that values individual authenticity and encourages creative solutions to problems. All new engineers deploy working code their first week, and we strive to broaden individual impact …
We're Hiring: Senior Contact Centre Engineer (Amazon Connect) 📍 Location: Remote (UK Residents only) 📣 Type: Full Time, permanent 📅 Hours: 09:00-17:30 📢 About Us At Kensington Mortgages, we've been leading the way in providing specialist mortgage solutions for over 25 years. We're dedicated to offering people a chance to secure a mortgage … Engineer and shape the future of our cloud-based communication systems. We are looking for an experienced hands-on engineer to support the development, deployment, and maintenance of our Amazon Connect-based contact centre environment. This role plays a key part in delivering digital routing solutions, supporting internal users including training, and ensuring our systems run smoothly and securely. … and working with 3rd party service providers. ☑️ Experience, Knowledge, Skills Strong experience designing, deploying and maintaining contact centre environments (Avaya, Genesys etc.) including SIP Solid hands-on knowledge of Amazon Connect, AWS Lambda, Lex Bots and DynamoDB Strong background in designing and implementing call routing systems, including routing profiles, queues, callbacks, emergency messaging, hours of operation, and holiday routing …
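The holiday-routing work this role mentions is typically implemented as a small Lambda invoked from an Amazon Connect contact flow. The sketch below follows Connect's Lambda integration (parameters arrive under `Details.Parameters`, and the response must be a flat dict of string key/value pairs), but the holiday list, parameter name, and attribute name are all invented for illustration.

```python
# Illustrative closure dates; a real system would source these from
# DynamoDB or a config store rather than hard-coding them.
HOLIDAYS = {"2025-12-25", "2025-12-26", "2026-01-01"}

def lambda_handler(event, context):
    """Return contact attributes telling the flow whether to play a
    holiday closure message.

    Amazon Connect passes flow parameters in Details.Parameters and
    requires a flat string-to-string dict in the response.
    """
    params = event.get("Details", {}).get("Parameters", {})
    call_date = params.get("callDate", "")  # set by the contact flow
    return {"isHoliday": "true" if call_date in HOLIDAYS else "false"}
```

In the contact flow, an "Invoke AWS Lambda function" block would pass `callDate` and then branch on the returned `isHoliday` attribute to reach either the closure message or the normal routing profile.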