robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes Strong …
. Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake). Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes). Familiarity with data orchestration tools (e.g., Prefect, Apache Airflow). Familiarity with CI/CD pipelines and DevOps practices. Familiarity with Infrastructure-as-Code tools (e.g., Terraform, AWS CDK). Employee Benefits: At Intelmatix, our benefits package …
fully documented and meet appropriate standards for security, resilience and operational support. Skills & Experience Required Essential: Hands-on experience developing data pipelines in Databricks, with a strong understanding of Apache Spark and Delta Lake. Proficient in Python for data transformation and automation tasks. Solid understanding of AWS services, especially S3, Transfer Family, IAM, and VPC networking. Experience integrating data …
Good work ethic, self-starter, and results-oriented Additional Preferred Qualifications: Domain knowledge in the Financial Industry and Capital Markets is a plus. Experience with Big Data technologies (e.g. Kafka, Apache Spark, NoSQL) Knowledge of BI tools such as Power BI, MicroStrategy etc. Exposure to Python and Scala Exposure to the Salesforce ecosystem About S&P Global Ratings At S&P Global …
Gloucester, Gloucestershire, United Kingdom Hybrid / WFH Options
Navtech, Inc
of Science Degree in software engineering or a related field Proficiency in English, spoken and written Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM frameworks …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
Navtech, Inc
of Science Degree in software engineering or a related field Proficiency in English, spoken and written Nice-to-Haves: Experience with ETL/ELT pipeline design and tools (e.g., Apache Airflow). Familiarity with Change Data Capture (CDC) solutions. Knowledge of database services on other cloud platforms (e.g., Azure SQL Database, Google Cloud Spanner). Understanding of ORM frameworks …
and scalable environments for our data platforms. Leveraging cloud-native technologies and AWS tools such as AWS S3, EKS, Glue, Airflow, Trino, and Parquet, you will prepare to adopt Apache Iceberg for greater performance and flexibility. You'll address high-performance data workloads, ensuring seamless execution of massive queries, including 600+ billion-row queries in Redshift, by designing and …
experience working as a Software Engineer on large software applications Proficient in many of the following technologies - Python, REST, PyTorch, TensorFlow, Docker, FastAPI, Selenium, React, TypeScript, Redux, GraphQL, Kafka, Apache Spark. Experience working with one or more of the following database systems - DynamoDB, DocumentDB, MongoDB Demonstrated expertise in unit testing and tools - JUnit, Mockito, PyTest, Selenium. Strong working knowledge …
with Interface/API data modelling. Experience with CI/CD GitHub Actions (or similar) AWS fundamentals (e.g., AWS Certified Data Engineer) Knowledge of Snowflake/SQL Knowledge of Apache Airflow Knowledge of dbt Familiarity with Atlan for data catalog and metadata management Understanding of Iceberg tables Who we are: We're a business with a global reach that empowers …
data ecosystem (e.g., Pandas, NumPy) and deep expertise in SQL for building robust data extraction, transformation, and analysis pipelines. Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake, with a focus on scalability and performance optimization. PREFERRED QUALIFICATIONS: Solid understanding of cloud infrastructure, particularly AWS, with practical experience using Docker, Kubernetes, and implementing …
developing and implementing enterprise data models. Experience with Interface/API data modelling. Experience with CI/CD GitHub Actions (or similar) Knowledge of Snowflake/SQL Knowledge of Apache Airflow Knowledge of dbt Familiarity with Atlan for data catalog and metadata management Understanding of Iceberg tables Who we are: We're a business with a global reach that empowers …
or MS degree in Computer Science or equivalent Experience in developing Finance or HR related applications Working experience with Tableau Working experience with Terraform Experience in creating workflows for Apache Airflow and Jenkins Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive …
as Docker/Kubernetes. Expertise in problem solving and analysing global-scale systems and critical production service environments. Strong debugging, testing/validation, and analytics/SQL skills; Apache. Experience working with Agile methodologies (Scrum) and cross-functional teams. Understand the mechanical sympathy between software workloads and the demand they place on the underlying hardware. Verbally and cognitively …
ideally in a high-ownership, fast-paced environment. Nice to have: Experience working in the Payments, Fintech, or Financial Crime domain (e.g., fraud detection, AML, KYC). Experience with Apache Flink or other streaming data frameworks is highly desirable. Experience working in teams, building and maintaining data science and AI solutions. Experience integrating with third-party APIs, especially in …
REST APIs and integration techniques Familiarity with data visualization tools and libraries (e.g. Power BI) Background in database administration or performance tuning Familiarity with data orchestration tools, such as Apache Airflow Previous exposure to big data technologies (e.g. Hadoop, Spark) for large-scale data processing Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and …
indexing, partitioning. Hands-on IaC development experience with Terraform or CloudFormation. Understanding of ML development workflows and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform, or the Lakehouse architecture. Experience with data quality …
Gloucester, Gloucestershire, South West, United Kingdom Hybrid / WFH Options
Omega Resource Group
GitLab) Contributing across the software development lifecycle from requirements to deployment Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite What's on Offer: Hybrid working and flexible schedules (4xFlex) Ongoing training and career development Exciting projects within the UK's secure …
Hucclecote, Gloucestershire, United Kingdom Hybrid / WFH Options
Omega Resource Group
GitLab) Contributing across the software development lifecycle from requirements to deployment Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite What's on Offer: Hybrid working and flexible schedules (4xFlex) Ongoing training and career development Exciting projects within the UK's secure …
Gloucester, Gloucestershire, South West, United Kingdom
Anson Mccade
Python Strong experience developing on Linux Version control using Git Agile development (SCRUM) Working with both relational databases (Oracle) and NoSQL (MongoDB) Experience with GitLab CI/CD Pipelines, Apache NiFi, and Atlassian tools (Jira, Bitbucket, Confluence) Front-end skills: JavaScript/TypeScript, React Search and analytics tools: Elasticsearch, Kibana Nice to Have: Experience developing for AWS Cloud (EC2 …
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
knowledge of Kafka, Confluent, and event-driven architecture Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures Strong architectural understanding across AWS, Azure, GCP, and Snowflake Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java) Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures Strong grasp of data governance …
collaboratively Proficiency in multiple programming languages Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience with process-scheduling platforms such as Apache Airflow Open to working with proprietary GS technologies such as Slang/SecDB Understanding of compute resources and performance metrics Knowledge of distributed computing, including parallel and cloud processing …
and reliability across our platform. Working format: full-time, remote. Schedule: Monday to Friday (an 8+1-hour working day). Responsibilities: Design, develop, and maintain data pipelines using Apache Airflow. Create and support data storage systems (data lakes/data warehouses) based on AWS (S3, Redshift, Glue, Athena, etc.). Integrate data from various sources, including mobile …
MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language, such as Java or Scala. Willingness to mentor more junior members of the team. Strong analytical and problem …
data into a data platform using Fivetran. Experience of developing BI dashboards using Power BI. Knowledge of security concepts relevant to Azure. Experience of workflow management tools such as Apache Airflow. Interested in the role? Complete the online application. We look forward to getting to know you. Discover more about LGT Wealth Management. A message from our CEO Ben …