Sunbury-On-Thames, London, United Kingdom Hybrid / WFH Options
BP Energy
• PySpark for data processing and automation
• Strong command of SQL for data querying, transformation, and performance tuning
• Deep experience with cloud platforms, preferably AWS (e.g., S3, Glue, Redshift, Athena, EMR, Lambda); experience with Azure or GCP is a plus
• Experience building and managing data lakes and data warehouses
• Strong understanding of distributed systems and big data processing
• Experience …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Lorien
junior engineers and sharing best practices
• Staying ahead of the curve with emerging data technologies
What You'll Bring:
• Solid hands-on experience with AWS (Glue, Lambda, S3, Redshift, EMR)
• Strong Python, SQL, and PySpark skills
• Deep understanding of data warehousing and lakehouse concepts
• Problem-solving mindset with a focus on performance and scalability
• Excellent communication skills across technical …
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
documentation for each project including ETL mappings, code use guide, code location and access instructions.
• Design and optimize data pipelines using tools such as Spark, Apache Iceberg, Trino, OpenSearch, EMR cloud services, NiFi, and Kubernetes containers
• Ensure the pedigree and provenance of the data is maintained such that access to the data is protected
• Clean and preprocess data to …
Arlington, Virginia, United States Hybrid / WFH Options
CGI
with using the Joint Enterprise Modeling and Analytics (JEMA) framework or Apache NiFi to develop data processing workflows
• Experience in AWS services including but not limited to SageMaker, S3, EMR, and IAM.
CGI is required by law in some jurisdictions to include a reasonable estimate of the compensation range for this role. The determination of this range includes various …
Falls Church, Virginia, United States Hybrid / WFH Options
Rackner
moving Agile DevSecOps team that builds secure, scalable data platforms, and get paid weekly.
What You'll Do:
• Build OpenAPI-compliant APIs, data schemas, and pipelines in AWS (S3, RDS, EMR)
• Develop with Python (FastAPI, Django, Flask) and JavaScript (Node.js, Vue, React)
• Deploy containerized workloads in Kubernetes (AWS EKS, Rancher) with CI/CD
• Apply DevSecOps and security-first practices from …
Reston, Virginia, United States Hybrid / WFH Options
SRC
deploying in Python
- Experience with LLM or AI/ML pipelines and secure deployment of ML models in production
- Experience deploying cloud-native architectures on AWS, including S3, EMR, EKS, IAM, and Redshift
- Experience with SQL, including optimizing queries with window functions, CTEs, and large datasets
- Familiarity with Apache Airflow
- Experience implementing FedRAMP/FISMA controls
- Experience …
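The SQL skills this listing asks for, a CTE feeding a window function, can be sketched with Python's built-in sqlite3 module; the table and column names here are illustrative, not taken from the listing:

```python
import sqlite3

# In-memory database standing in for a warehouse table (toy data, illustrative names).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 100.0), ("east", 250.0), ("west", 75.0), ("west", 300.0)],
)

# A CTE (WITH clause) pre-aggregates per region; a window function then
# ranks regions by total revenue in the same statement.
query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM orders
    GROUP BY region
)
SELECT region, total,
       RANK() OVER (ORDER BY total DESC) AS revenue_rank
FROM region_totals
ORDER BY revenue_rank
"""
rows = conn.execute(query).fetchall()
for region, total, rank in rows:
    print(region, total, rank)
```

The same CTE-plus-window pattern carries over to Redshift or Postgres; only the connection layer changes.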
City of London, London, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
on data projects
• You have experience of establishing data analytics and supporting AI solutions
• You have good working experience of AWS (e.g. S3, Kinesis, Glue, Redshift, Lambda and EMR) and/or Azure data services (e.g. ADF, Synapse, Fabric, Azure Functions)
• You have advanced client and stakeholder management skills
What's in it for you: As a Data …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
will have strong Python development skills (must be able to design and write clean, maintainable, and testable code). Extensive AWS expertise, particularly across:
• Lambda
• Glue
• Glue Data Catalog
• EMR services
• API Gateway
Relational database experience with Aurora Postgres (including query performance tuning). Spark experience, including pipelines using Spark on data stored in Iceberg table format in S3 …
Arlington, Virginia, United States Hybrid / WFH Options
Elder Research, Inc
software development lifecycle from design to deployment.
Required Skills/Experience:
• Hands-on experience with data engineering tools such as Hadoop, Cloudera, and Apache Spark
• Proficiency with AWS services including EMR Studio
• Familiarity with CI/CD pipelines, GitHub, and version control workflows
• Experience working with or maintaining an Analytics Repository
• Collaborate with law enforcement, regulatory bodies, and other stakeholders …
Arlington, Virginia, United States Hybrid / WFH Options
Amazon
Description: The Amazon Web Services Professional Services (ProServe) team is seeking a skilled Delivery Consultant to join our team at Amazon Web Services (AWS). In this role, you'll work closely with customers to design, implement, and manage AWS solutions that meet their technical requirements and business objectives. You'll be a key player in …
… Engineering, related field, or equivalent experience
- 3+ years of experience with data warehouse architecture, ETL/ELT tools, data engineering, and large-scale data manipulation using technologies like Spark, EMR, Hive, Kafka, and Redshift
- Experience with relational databases, SQL, and performance tuning, as well as software engineering best practices for the development lifecycle, including coding standards, reviews, source control …
… a track record of working with large datasets and extracting value from them
- Experience leading large-scale data engineering and analytics projects using AWS technologies like Redshift, S3, Glue, EMR, Kinesis, Firehose, and Lambda, as well as experience with non-relational databases and implementing data governance solutions
Amazon is an equal opportunity employer and does not discriminate …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
data platform, with responsibilities including:
• Designing and implementing scalable data pipelines using Python and Apache Spark
• Building and orchestrating workflows using AWS services such as Glue, Lambda, S3, and EMR Serverless
• Applying best practices in software engineering: CI/CD, version control, automated testing, and modular design
• Supporting the development of a lakehouse architecture using Apache Iceberg
• Collaborating with …
… engineering fundamentals: ETL/ELT, schema evolution, batch processing
• Experience or strong interest in Apache Spark for distributed data processing
• Familiarity with AWS data tools (e.g., S3, Glue, Lambda, EMR)
• Strong communication skills and a collaborative mindset
• Comfortable working in Agile environments and engaging with stakeholders
Bonus Skills:
• Experience with Apache Iceberg or similar table formats (e.g., Delta Lake) …
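The extract-transform-load shape these pipeline responsibilities describe can be sketched in miniature. This pure-Python stand-in mirrors the structure a PySpark job would take (extract from a source, aggregate, load to a sink); all names and the toy data are illustrative, not from the listing:

```python
from dataclasses import dataclass


@dataclass
class Reading:
    sensor: str
    value: float


def extract() -> list[Reading]:
    # Stand-in for reading raw records from S3 or another source (toy data).
    return [Reading("a", 1.0), Reading("a", 3.0), Reading("b", 10.0)]


def transform(rows: list[Reading]) -> dict[str, float]:
    # Per-sensor aggregation -- the step a Spark groupBy/agg performs at scale.
    totals: dict[str, float] = {}
    for r in rows:
        totals[r.sensor] = totals.get(r.sensor, 0.0) + r.value
    return totals


def load(totals: dict[str, float]) -> None:
    # Stand-in for writing to an Iceberg table or warehouse.
    for sensor, total in sorted(totals.items()):
        print(f"{sensor}: {total}")


totals = transform(extract())
load(totals)
```

Keeping extract, transform, and load as separate functions is what makes the "automated testing and modular design" practice in the listing possible: each stage can be unit-tested with in-memory data before the job ever touches Glue or EMR.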
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
and fine-tune query performance on Aurora Postgres and other relational databases.
• Architect and manage data solutions on AWS using serverless technologies such as Lambda, Glue, Glue Data Catalog, EMR Serverless, and API Gateway
• Implement and manage large-scale data processing with Spark (Iceberg tables in S3, Gold layer in Aurora Postgres)
• Collaborate with data scientists, analysts, and …
… extensible, and testable code
• Proven experience with relational databases (Aurora Postgres preferred), including performance optimisation
• Extensive AWS experience, particularly with serverless data engineering tools (Lambda, Glue, Glue Data Catalog, EMR Serverless, API Gateway, S3)
• Solid Spark experience with large-scale data pipelines and data lakehouse architectures (Iceberg format a plus)
• Hands-on experience with data modelling and …
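The query-performance-tuning skill this listing emphasises usually starts with reading the query plan before and after adding an index. A minimal sketch using Python's bundled sqlite3 as a stand-in for Aurora Postgres (where the equivalent tool is EXPLAIN/EXPLAIN ANALYZE); the table and index names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i % 50, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)


def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); join the details.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))


lookup = "SELECT COUNT(*) FROM events WHERE user_id = 7"
print("before index:", plan(lookup))  # typically a full scan of events

conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
print("after index:", plan(lookup))   # now a search using idx_events_user
```

The same workflow scales up directly: on Postgres you would compare EXPLAIN ANALYZE output, checking that a sequential scan became an index scan after the index was created.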
Fort Lauderdale, Florida, United States Hybrid / WFH Options
Vegatron Systems
then will eventually be sitting in Fort Lauderdale, FL. Candidates should be senior Data Engineers with big data tools (Hadoop, Spark, Kafka) as well as AWS (cloud services: EC2, EMR, RDS, Redshift) and NoSQL. This is a phone and Skype to hire. Candidates in Florida with a LinkedIn profile preferred but not required.
Essential Duties and Responsibilities:
• Past experience …
… or Google Cloud
• Experience with big data tools: Hadoop, Spark, Kafka, etc.
• Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
• Experience with AWS cloud services: EC2, EMR, RDS, Redshift
• Experience with stream-processing systems: Storm, Spark-Streaming, etc.
• Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
• Strong hands-on personality …
IDEAL CANDIDATE: Experience …