London, England, United Kingdom Hybrid / WFH Options
DoiT International
production experience with Kubernetes (K8s). Hands-on production experience with cloud platforms (AWS, Azure, GCP). Experience with data engineering: streaming and batch processing, Spark, Trino, Iceberg, ClickHouse, Parquet. Be your truest self. Work on your terms. Make a difference. We are home to a global team of incredible talent who …
of data technologies, ideally with experience with relational databases, big data, data warehouses and marts, and stream processing technologies (e.g. Kafka, S3, Flink, Snowflake, Iceberg). Product-first mindset and a desire to design and build tools that solve real user problems. Mastery of fundamental software engineering practices like good …
/or Python. Experience with tools like Terraform to provision AWS cloud services. Knowledge of AWS Glue, AWS Athena, and AWS S3. Understanding of Apache Parquet and open table formats such as Delta, Iceberg, and Hudi. Experience with Test-Driven Development using JUnit, Mockito, or similar tools. Extensive …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
big data technologies (e.g., Spark, Databricks, Delta Lake, BigQuery). Familiarity with eventing technologies (e.g., Event Hubs, Kafka) and file formats such as Parquet, Delta, Iceberg. Want to learn more? Get in touch for an informal chat. …
Nottingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
big data technologies (e.g., Spark, Databricks, Delta Lake, BigQuery). Familiarity with eventing technologies (e.g., Event Hubs, Kafka) and file formats such as Parquet, Delta, Iceberg. Interested in learning more? Get in touch for an informal chat. …
Sheffield, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
big data technologies (e.g., Spark, Databricks, Delta Lake, BigQuery). Familiarity with eventing technologies (e.g., Event Hubs, Kafka) and file formats such as Parquet, Delta, Iceberg. Interested in learning more? Get in touch for an informal chat. …
Stoke-on-Trent, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
big data technologies (e.g., Spark, Databricks, Delta Lake, BigQuery). Familiarity with eventing technologies (e.g., Event Hubs, Kafka) and file formats such as Parquet, Delta, Iceberg. Want to learn more? Get in touch for an informal chat. …
Richmond, Virginia, United States Hybrid / WFH Options
DKMRBH Inc
AWS cloud to support data integration and data storage. Required, 7 years: hands-on experience using AWS database services including S3, Aurora, PostgreSQL, DynamoDB, Iceberg, and Snowflake. Required, 5 years: core experience configuring data management practices on AWS using native services such as AWS Glue, Amazon DataZone …
Your day-to-day:
- Wrangle and draw meaningful insights from massive amounts of unstructured textual data using the latest tools and technologies like Spark, Iceberg, Athena, and Amazon SageMaker
- Apply unsupervised learning algorithms across billions of email interactions to identify emerging threat patterns
- Work with state-of-the-art machine …
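The "unsupervised learning" duty above can be illustrated with a minimal sketch: k-means clustering over toy numeric feature vectors in pure Python. This is a hypothetical illustration only, with invented feature names; work at the scale the listing describes would use Spark MLlib or SageMaker rather than hand-rolled code.

```python
# Minimal k-means sketch (Lloyd's algorithm), pure Python, no ML libraries.
# Feature vectors and centroid seeds below are invented for illustration.

def kmeans(points, centroids, iters=10):
    """Assign each point to its nearest centroid, then recompute centroids."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            # Nearest centroid by squared Euclidean distance.
            idx = min(range(len(centroids)),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(p, centroids[i])))
            clusters[idx].append(p)
        # Recenter each cluster; keep the old centroid if a cluster is empty.
        centroids = [
            tuple(sum(col) / len(col) for col in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Toy features, e.g. (links per email, recipient count) -- two obvious groups.
points = [(1.0, 2.0), (1.5, 1.8), (8.0, 8.0), (9.0, 11.0)]
centroids, clusters = kmeans(points, centroids=[(1.0, 1.0), (9.0, 9.0)])
print(centroids)
```

With clean threat-pattern features, the emerging cluster would surface as a centroid far from the baseline traffic cluster.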
the future of the AI Data Cloud. Join the Snowflake team. Snowflake is seeking an accomplished Principal Sales Specialist, Data Engineering Platform (Openflow and Iceberg/Polaris) to drive sales execution for our ingestion workloads in the EMEA markets. Reporting to the Managing Director, Data Engineering Platform Sales, this …
database architecture and performance, implementing DevSecOps practices, and building CI/CD pipelines using Python, Bash, and Terraform. Preferred candidates will have experience with Apache Spark, Apache NiFi, data governance, and ETL standardization. Familiarity with Glue, Hive, and Iceberg or similar technologies is a plus.
Tasks Performed … queries.
• Plan and execute large-scale data migrations.
• Improve database performance through architecture and tuning.
• Create and maintain data flows using ETL tools like Apache NiFi.
• Manage infrastructure as code using Python, Bash, and Terraform.
• Integrate security into development and deployment workflows.
• Build and support automated CI/CD … large-scale data migration efforts.
• Demonstrated experience with database architecture, performance design methodologies, and system-tuning recommendations. Preference for familiarity with Glue, Hive, and Iceberg or similar.
• Demonstrated experience with Python, Bash, and Terraform.
• Demonstrated experience with DevSecOps solutions and tools.
• Demonstrated experience implementing CI/CD pipelines using …
help craft innovative solutions. To be successful as a Data Engineer, you should have experience with:
• CI/CD Pipelines: GitLab
• Cloud Technologies such as AWS, Iceberg, Snowflake, Databricks
• Containerization/Orchestration: Docker/Kubernetes
• Framework Components: Spark/Scala/Kafka
• Unix: Scripting and Config
Other Highly Valued Skills Include …
• Automation: Python/Bash Scripting
• Databases: Teradata, Oracle
• Workflow Management: Apache Airflow
You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. …
with Spark or dbt on Starburst.
• Use SQL to transform data into meaningful insights
• Build and deploy infrastructure with Terraform
• Implement DDL and DML with Iceberg
• Do code reviews for your peers
• Orchestrate your pipelines with DAGs on Airflow
• Participate in SCRUM ceremonies (standups, backlogs, demos, retros, planning)
• Secure data … diagram of proposed tables to enable discussion
• Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users
• Worked on Apache Airflow before to create DAGs
• Ability to work within Agile, considering minimum viable products, story pointing and sprints
More information: Enjoy fantastic perks like …
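The "use SQL to transform data into meaningful insights" duty can be sketched with Python's built-in sqlite3 module. This is a hypothetical example (the table and column names are invented); in the role described, the same pattern would run as SQL on Starburst/Trino or Spark rather than SQLite:

```python
import sqlite3

# Hypothetical raw table of row-level trades, invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("ABC", 100, 10.0), ("ABC", 50, 12.0), ("XYZ", 200, 5.0)],
)

# A typical SQL transformation: collapse row-level data into an
# insight-ready summary (total quantity and volume-weighted avg price).
rows = conn.execute(
    """
    SELECT symbol,
           SUM(qty) AS total_qty,
           ROUND(SUM(qty * price) / SUM(qty), 2) AS vwap
    FROM trades
    GROUP BY symbol
    ORDER BY symbol
    """
).fetchall()

for symbol, total_qty, vwap in rows:
    print(symbol, total_qty, vwap)
```

The same SELECT would typically be materialized as a downstream table (the DDL/DML step the listing mentions) and scheduled as a task in an Airflow DAG.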
experience include: Proven desire to expand your cloud/platform engineering capabilities. Experience working with Big Data. Experience of data storage technologies: Delta Lake, Iceberg, Hudi. Proven knowledge and understanding of Apache Spark, Databricks or Hadoop. Ability to take business requirements and translate these into tech specifications. Competence …
using Python and pandas within a financial environment. Strong knowledge of relational databases and SQL. Familiarity with various technologies such as S3, Kafka, Airflow, Iceberg. Proficiency working with large financial datasets from various vendors. A commitment to engineering excellence and pragmatic technology solutions. A desire to work in an … understanding of financial markets. Experience working with hierarchical reference data models. Proven expertise in handling high-throughput, real-time market data streams. Familiarity with distributed computing frameworks such as Apache Spark. Operational experience supporting real-time systems. Equal Opportunity Workplace: We are proud to be an equal opportunity workplace. We do not discriminate based …
Job Description We know that people want great value combined with an excellent experience from a bank they can trust, so we launched our digital bank, Chase UK, to revolutionise mobile banking with seamless journeys that our customers love. We …
Role Description: As a Senior Lead within Software Engineering, you'll design and implement functionalities, focusing on Data Engineering tasks. You'll be working with semi-structured data to ingest and distribute it on a Microsoft Fabric-based platform, modernizing …
Join to apply for the Senior Lead Software Engineer role at LSEG (London Stock Exchange Group). Role Description: As a Senior Lead within Software Engineering, you’ll design and implement functionalities, focusing on Data Engineering tasks. You’ll be working …