… pipelines • Hands-on experience with Agile (Scrum) methodologies • Database experience with Oracle and/or MongoDB • Experience using the Atlassian suite: Bitbucket, Jira, and Confluence • Desirable Skills: Knowledge of Apache NiFi • Front-end development with React (JavaScript/TypeScript) • Working knowledge of Elasticsearch and Kibana • Experience developing for cloud environments, particularly AWS (EC2, EKS, Fargate, IAM, S3, Lambda) • Understanding …
… products or platforms • Strong knowledge of SQL and experience with large-scale relational and/or NoSQL databases • Experience testing data pipelines (ETL/ELT), preferably with tools like Apache Airflow, dbt, Spark, or similar • Proficiency in Python or a similar scripting language for test automation • Experience with cloud platforms (AWS, GCP, or Azure), especially in data-related services • Familiarity …
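Several of these listings pair Python scripting with pipeline test automation. As a concrete illustration, here is a minimal pytest-style sketch, assuming a hypothetical transform step `clean_orders`; the function, schema, and column names are invented for illustration, not taken from any listing above.

```python
# A minimal sketch of data-pipeline test automation in Python, assuming a
# hypothetical transform step `clean_orders` that deduplicates records and
# enforces non-null keys; names and schema are illustrative only.
import pandas as pd

def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicate order IDs, then drop rows missing a customer key."""
    return df.drop_duplicates(subset="order_id").dropna(subset=["customer_id"])

def test_clean_orders_removes_duplicates_and_nulls():
    raw = pd.DataFrame({
        "order_id": [1, 1, 2, 3],
        "customer_id": ["a", "a", None, "c"],
    })
    out = clean_orders(raw)
    assert out["order_id"].is_unique          # no duplicate orders survive
    assert out["customer_id"].notna().all()   # no null customer keys survive
```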
Gloucester, Gloucestershire, South West, United Kingdom
Anson Mccade
… Python • Strong experience developing on Linux • Version control using Git • Agile development (SCRUM) • Working with both relational databases (Oracle) and NoSQL (MongoDB) • Experience with GitLab CI/CD Pipelines, Apache NiFi, and Atlassian tools (Jira, Bitbucket, Confluence) • Front-end skills: JavaScript/TypeScript, React • Search and analytics tools: Elasticsearch, Kibana • Nice to Have: Experience developing for AWS Cloud (EC2 …
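This role pairs a relational database (Oracle) with a NoSQL store (MongoDB). As a hedged sketch of the NoSQL side, here is a minimal pymongo example; the connection string, database, and collection names are placeholders, not details from the listing.

```python
# A minimal MongoDB query sketch with pymongo; the connection string,
# database, and collection names are invented for illustration.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
events = client["app_db"]["events"]

# Insert one document, then fetch the most recent events for a user.
events.insert_one({"user": "alice", "action": "login", "ts": 1700000000})
for doc in events.find({"user": "alice"}).sort("ts", -1).limit(10):
    print(doc["action"], doc["ts"])
```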
… with multiple languages • Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) • Experience working with process-scheduling platforms like Apache Airflow • Willingness to work in GS proprietary technology like Slang/SECDB • An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory …
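Process-scheduling platforms like Apache Airflow come up repeatedly in these listings. Below is a minimal Airflow DAG sketch; the DAG id, schedule, and task bodies are illustrative assumptions, not anything from the role above.

```python
# A minimal Apache Airflow DAG sketch; names and schedule are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull source data")

def load():
    print("write to warehouse")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older releases use `schedule_interval`
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract runs before load
```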
… A degree in computer science, engineering, mathematics, or a related technical field • Experience with object-oriented programming preferred • General familiarity with some of the technologies we use: Python, Apache Spark/PySpark, Java/Spring, Amazon Web Services, SQL, relational databases • Understanding of data structures and algorithms • Interest in data modeling, visualisation, and ETL pipelines • Knowledge of financial …
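For the Spark/PySpark and ETL interests this listing mentions, a small PySpark sketch follows; the file paths, columns, and business logic are invented for illustration.

```python
# A small PySpark ETL sketch; paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV with a header row.
orders = spark.read.csv("/tmp/orders.csv", header=True, inferSchema=True)

# Transform: keep completed orders and aggregate revenue per customer.
revenue = (
    orders.filter(F.col("status") == "completed")
          .groupBy("customer_id")
          .agg(F.sum("amount").alias("total_revenue"))
)

# Load: write the result out as Parquet.
revenue.write.mode("overwrite").parquet("/tmp/revenue_by_customer")
```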
… translate concepts into easily understood diagrams and visuals for both technical and non-technical people alike. AWS cloud products (Lambda functions, Redshift, S3, AmazonMQ, Kinesis, EMR, RDS (Postgres)). Apache Airflow for orchestration. dbt for data transformations. Machine learning for product insights and recommendations. Experience with microservices using technologies like Docker for local development. Apply engineering best practices to …
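Since AWS products such as Lambda and S3 recur across these roles, here is a small boto3 sketch of a Lambda-style handler reading from S3; the bucket, key, and handler wiring are assumptions for illustration only.

```python
# A small boto3 sketch of a Lambda-style handler reading an object from
# S3; the bucket and key are placeholders, not from any listing above.
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Illustrative Lambda-style entry point that fetches one S3 object."""
    obj = s3.get_object(Bucket="example-bucket", Key="data/latest.json")
    body = obj["Body"].read().decode("utf-8")
    print(f"read {len(body)} bytes")
    return {"status": "ok"}
```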
… field. Technical Skills Required: Hands-on software development experience with Python, and experience with modern software development and release-engineering practices (e.g., TDD, CI/CD). Experience with Apache Spark or other distributed data-programming frameworks. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with cloud infrastructure like AWS or …
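As a hedged illustration of querying a cloud warehouse such as Snowflake from Python, the sketch below uses the snowflake-connector-python package; the account, credentials, and table are placeholders.

```python
# A hedged sketch of querying Snowflake from Python; account, credentials,
# warehouse, and table names are all invented for illustration.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # assumed account identifier
    user="my_user",
    password="...",
    warehouse="ANALYTICS_WH",
    database="PROD",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Filter before aggregating so the warehouse can prune partitions.
    cur.execute(
        "SELECT customer_id, SUM(amount) FROM orders "
        "WHERE order_date >= '2024-01-01' GROUP BY customer_id"
    )
    for customer_id, total in cur.fetchall():
        print(customer_id, total)
finally:
    conn.close()
```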
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
… Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and dbt is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience with …
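The data formats this role names (JSON, CSV, Parquet) can be moved between with a few lines of pandas; the sketch below is illustrative, with invented file names, and the Parquet step assumes pyarrow or fastparquet is installed.

```python
# A minimal sketch of converting between CSV, JSON, and Parquet with
# pandas; the file paths are placeholders.
import pandas as pd

df = pd.read_csv("raw/events.csv")  # ingest CSV
df.to_json("staging/events.json", orient="records", lines=True)  # line-delimited JSON
df.to_parquet("curated/events.parquet")  # columnar Parquet (needs pyarrow or fastparquet)
```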
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
… strategies. Strong experience in IaC, with the ability to guide how infrastructure is deployed into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks, or Hadoop. Good understanding of the architectures involved in modern data-system design (data warehouses, data lakes, data meshes). Ability to create data pipelines on a cloud …
Nottingham, England, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS Platform) • Technologies: Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS • Environment: Large-scale data environment, fully remote (UK), microservices architecture • Salary: up to £70,000 plus benefits • About the Role: Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures, and data pipelines within an enterprise-scale data-processing … infrastructure background, understanding of system migrations, and experience with data warehousing concepts. • Technical Skills: Deep understanding of SQL and NoSQL databases (MongoDB or similar); experience with streaming platforms like Apache Kafka; development and maintenance of ELT pipelines; knowledge of data warehousing best practices; high proficiency in Apache Kafka and Apache Airflow; strong AWS experience • Additional Attributes: Agile …
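For the Kafka streaming side of roles like this, here is a minimal consumer sketch using the kafka-python client; the topic, broker address, and group id are placeholders rather than details of the platform above.

```python
# A minimal Apache Kafka consumer sketch with the kafka-python client;
# topic, broker, and group id are invented for illustration.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                              # assumed topic name
    bootstrap_servers=["localhost:9092"],  # assumed broker address
    group_id="etl-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    order = message.value
    # In a real ELT pipeline this record would be validated and loaded
    # into the warehouse; here we just print it.
    print(order)
```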
… the contract. Benefits include medical, dental, vision, 401(k) with company matching, and life insurance. Rate: $80 - $86/hr W2. Responsibilities: Develop, optimize, and maintain data ingestion flows using Apache Kafka, Apache NiFi, and MySQL/PostgreSQL. Develop within AWS cloud services such as Redshift, SageMaker, API Gateway, QuickSight, and Athena. Coordinate with data owners to ensure proper … analysis, data visualization, and machine learning techniques. Proficiency in programming languages such as Python, R, and Java. Experience building modern data pipelines and ETL processes with tools like Apache Kafka and Apache NiFi. Proficiency in Java, Scala, or Python programming. Experience managing or testing API Gateway tools and REST APIs. Knowledge of traditional databases like Oracle, MySQL …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Eden Scott
… cutting-edge technologies. About the Role: You’ll be part of an agile, cross-functional team building a powerful data platform and intelligent search engine. Working with technologies like Apache Lucene, Solr, and Elasticsearch, you'll contribute to the design and development of scalable systems, with opportunities to explore machine learning, AI-driven categorisation models, and vector search. What … You’ll Be Doing: Design and build high-performance data pipelines and search capabilities. Develop solutions using Apache Lucene, Solr, or Elasticsearch. Implement scalable, test-driven code in Java and Python. Work collaboratively with Business Analysts, Data Engineers, and UI Developers. Contribute across the stack, from the React/TypeScript front end to Java-based backend services. Leverage cloud infrastructure … leading data sets. Continuous improvements to how data is processed, stored, and presented. Your Profile: Strong experience in Java development, with some exposure to Python. Hands-on knowledge of Apache Lucene, Solr, or Elasticsearch (or willingness to learn). Experience in large-scale data processing and building search functionality. Skilled with SQL and NoSQL databases. Comfortable working in Agile …
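As a hedged taste of the search work this role describes, the sketch below indexes and queries a document with the official Elasticsearch Python client (8.x-style API); the index name, document, and local cluster URL are assumptions.

```python
# A hedged Elasticsearch sketch with the official 8.x Python client;
# index name, document, and cluster URL are invented for illustration.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

# Index one document, then run a full-text match query against it.
es.index(index="articles", id="1", document={"title": "Vector search in Lucene"})
es.indices.refresh(index="articles")  # make the document searchable now

resp = es.search(index="articles", query={"match": {"title": "lucene"}})
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```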
… Strong knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB) and data modeling principles. Proven ability to design, build, and maintain scalable data pipelines and workflows using tools like Apache Airflow or similar. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Nice to have: Hands-on experience with data warehouse and lakehouse architectures (e.g., Databricks, Snowflake … or similar). Experience with big data frameworks (e.g., Apache Spark, Hadoop) and cloud platforms (e.g., AWS, Azure, or GCP).
Sheffield, England, United Kingdom Hybrid / WFH Options (the same Autodesk posting also appears for Bath, Birmingham, Cambridge, and Stockton-on-Tees)
Autodesk
… such as AWS, Azure, or GCP · Docker · Documenting code, architectures, and experiments · Linux systems and bash terminals. Preferred Qualifications: · Databases and/or data-warehousing technologies, such as Apache Hive, Iceberg, etc. · Data transformation via SQL and dbt. · Orchestration platforms such as Apache Airflow, Argo Workflows, etc. · Data catalogs and metadata-management tools · …
… in Microsoft Fabric and Databricks, including data-pipeline development, data warehousing, and data-lake management • Proficiency in Python, SQL, Scala, or Java • Experience with data-processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory • Strong understanding of data-architecture principles, data modelling, and data governance • Experience with cloud-based data platforms, including Azure and/or …
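Of the data-processing frameworks this listing names, Apache Beam has a particularly compact Python API. The sketch below counts elements of an in-memory collection on the local DirectRunner; the data is invented for illustration.

```python
# A tiny Apache Beam pipeline run on the default local DirectRunner;
# the input collection is a placeholder.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["alpha", "beta", "alpha"])  # in-memory source
        | "Count" >> beam.combiners.Count.PerElement()         # (element, count) pairs
        | "Print" >> beam.Map(print)                           # emit results to stdout
    )
```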