Extensive Oracle Fusion and Oracle Integration Cloud experience. Oracle EBS experience. Experience of managed file transfer solutions such as GoAnywhere. Knowledge of the Apache Airflow platform. This is an exciting opportunity to play a pivotal role within a huge transformation while working with critical national infrastructure. If More ❯
cloud security and Identity and Access Management (IAM). Knowledge of vector databases and retrieval-augmented generation (RAG). Data pipeline development experience using Airflow, Spark, or similar tools. Experience with AWS Lambda and DynamoDB Streams. Why you'll love working at WorkBuzz - Our culture is fast-paced and More ❯
data analytics platform on AWS, employing the AWS Cloud Development Kit (CDK). Construct resilient and scalable data pipelines using SQL/PySpark/Airflow to effectively ingest, process, and transform substantial data volumes from diverse sources into a structured format, ensuring data quality and integrity. Devise and implement More ❯
remove blockers to keep delivery on track. Your Qualifications: Strong familiarity with data integration, transformation, and orchestration workflows; experience with tools like Informatica, SnapLogic, Airflow, or Databricks is a plus. Experience with Anaplan, other planning applications, or multi-dimensional modeling tools is highly desirable. Working knowledge of API-driven More ❯
data quality concepts, methodologies, and best practices. Proficiency in SQL and data querying for data validation and testing purposes. Hands-on experience with Snowflake, Airflow or Matillion would be ideal. Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases, and data More ❯
Office skills, especially MS Excel. Proficiency with SQL (particularly SQL Server or Oracle). Proficiency with Python. Experience with Azure (preferred but not required). Airflow/cloud computing (preferred but not required). Ability to lead and own workstreams from start to finish. The ideal candidate will be curious, persevering More ❯
experience working with CI/CD pipelines and containerization technologies like Docker and Kubernetes. Ideally, some familiarity with data workflow management tools such as Airflow as well as big data technologies such as Apache Spark/Ignite or other caching and analytics technologies. A working knowledge of FX More ❯
ability to work in an Agile/Scrum environment. Preferred Qualifications: AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Associate. Experience with Airflow for workflow orchestration. Exposure to big data frameworks such as Apache Spark, Hadoop, or Presto. Hands-on experience with machine learning pipelines and More ❯
Github integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start More ❯
Engineering languages such as Python, Scala, Java, SQL and a proven ability to learn new programming languages Experience with workflow orchestration tools such as Airflow Detailed problem-solving approach, coupled with a strong sense of ownership and drive A bias to action and a passion for delivering high-quality … visualization skills to convey information and results clearly Experience with DevOps tools such as Docker, Kubernetes, Jenkins, etc. Experience with event messaging frameworks like Apache Kafka The hiring range for this position in Santa Monica, California is $136,038 to $182,490 per year, in Glendale, California is More ❯
with a track record of leading effective agile and lean software teams. Have a strong background in DevOps deploying, managing and maintaining services using Airflow, Docker, Terraform and AWS CLI tools to achieve infrastructure-as-code and automated deployments. Have excellent knowledge of AWS services (ECS, IAM, EC2, S3 … DynamoDB, MSK). Our Technology Stack: Python and Scala Starburst and Athena Kafka and Kinesis DataHub MLflow and Airflow Docker and Terraform Kafka, Spark, Kafka Streams and KSQL dbt AWS, S3, Iceberg, Parquet, Glue and EMR for our Data Lake Elasticsearch and DynamoDB More information: Enjoy fantastic perks More ❯
automation. Working alongside experienced senior engineers, you'll bring a data engineering mindset to the team, building sophisticated systems that parallel orchestration tools like Airflow or Temporal. Rather than creating individual pipelines, you'll develop the frameworks and tools that allow users to create their own pipelines efficiently, while … the opportunity to work on Cloud Infrastructure, whether it be AWS, Azure or GCP. You've got experience with orchestration frameworks such as Temporal, Airflow or Dagster. You've enjoyed being part of a fast-paced and growing Software Engineering company. You're not More ❯
services experience is desired but not essential. API development (FastAPI, Flask) Tech stack: Azure, Python, Databricks, Azure DevOps, ChatGPT, Groq, Cursor AI, JavaScript, SQL, Apache Spark, Kafka, Airflow, Azure ML, Docker, Kubernetes and many more. Role Overview: We are looking for someone who is as comfortable developing AI More ❯
london, south east england, united kingdom Hybrid / WFH Options
Aventis Solutions
Scala. AI Frameworks: Extensive experience with AI frameworks and libraries, including TensorFlow, PyTorch, or similar. Data Processing: Expertise in big data technologies such as Apache Spark and Hadoop, and experience with data pipeline tools like Apache Airflow. Cloud Platforms: Strong experience with cloud services, particularly AWS, Azure, or Google More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for scalable performance. Infrastructure Automation: Implement Infrastructure … Integrate cloud-based data services with data lakes and warehouses. Build and automate CI/CD pipelines with Jenkins, GitLab CI/CD, or Apache Airflow. Develop automated test suites for data pipelines, ensuring data quality and transformation integrity. Monitoring & Performance Optimization: Monitor data pipelines with tools like Prometheus More ❯
Analytics, Software Development, and Data Engineering. You'll play a pivotal role in designing complex data pipelines, managing data warehouses using tools like dbt and Airflow, and enabling predictive analytics solutions for our diverse group of clients. As part of a consultancy, you'll have the opportunity to work closely … in a dynamic consultancy environment where creativity, autonomy, and teamwork are valued. Cutting-Edge Tools: Gain hands-on experience with leading tools like dbt, Airflow, Snowflake, and cloud platforms. Key Responsibilities Design and manage data warehouses using SQL, NoSQL, and cloud platforms. Develop ETL/ELT pipelines using Airflow More ❯
pipelines using Python and pandas within a financial environment. Strong knowledge of relational databases and SQL. Familiarity with various technologies such as S3, Kafka, Airflow, Iceberg. Proficiency working with large financial datasets from various vendors. A commitment to engineering excellence and pragmatic technology solutions. A desire to work in … working with hierarchical reference data models. Proven expertise in handling high-throughput, real-time market data streams. Familiarity with distributed computing frameworks such as Apache Spark. Operational experience supporting real-time systems. Equal Opportunity Workplace We are proud to be an equal opportunity workplace. We do not discriminate based More ❯
Wakefield, Yorkshire, United Kingdom Hybrid / WFH Options
Flippa.com
data. Libraries & Tools: Terraform, Flask, Pandas, FastAPI, Dagster, GraphQL, SQLAlchemy, GitLab, Athena. Your Trusted Companions: Docker, Snowflake, MongoDB, Relational Databases (e.g. MySQL, PostgreSQL), Dagster, Airflow/Luigi, Spark, Kubernetes. Your AWS Kingdom: Lambda, Redshift, EC2, ELB, IAM, RDS, Route53, S3: the building blocks of cloud mastery. Your Philosophy: Continuous … or machine learning libraries like NumPy, matplotlib, seaborn, scikit-learn, etc. Experience in building ETL/ELT processes and data pipelines with platforms like Airflow, Dagster, or Luigi. What's important for us: Academically Grounded: Bachelor's or Master's degree in Computer Science, Data Engineering, or related field. More ❯
reporting specific to each internal department. Required Experience & Skills: Technical Expertise: Proven experience with AWS/Azure/GCP (preferably AWS), Redshift/BigQuery, Apache Spark/EMR, Docker, Python, and SQL. Hands-on expertise with workflow orchestration tools (Prefect, Mage, or Airflow). Experience with MPP architectures More ❯
AI solutions using the Databricks Lakehouse (Delta Lake, Unity Catalog, MLflow). Design and lead the development of modular, high-performance data pipelines using Apache Spark and PySpark. Champion the adoption of Lakehouse architecture (bronze/silver/gold layers) to ensure scalable, governed data platforms. Collaborate with stakeholders … DevOps, and data reliability engineering (DRE) best practices across Databricks environments. Integrate with cloud-native services and orchestrate workflows using tools such as dbt, Airflow, and Databricks Workflows. Drive performance tuning, cost optimisation, and monitoring across data workloads. Mentor engineering teams and support architectural decisions as a recognised Databricks … expert. Essential Skills & Experience: Demonstrable expertise with Databricks and Apache Spark in production environments. Proficiency in PySpark, SQL, and working within one or more cloud platforms (Azure, AWS, or GCP). In-depth understanding of Lakehouse concepts, medallion architecture, and modern data warehousing. Experience with version control, testing frameworks More ❯
companies where years-long behemoth projects are the norm, our projects are fast-paced, typically 2 to 4 months long. Most are delivered using Apache Spark/Databricks on AWS/Azure and require you to directly manage the customer relationship alone or in collaboration with a Project Manager. … Databricks (PySpark, SQL, Delta Lake, Unity Catalog); You have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, and Step Functions); You're proficient in SQL and Python, using them to transform and optimize data like a pro; You know … at DATAPAO, meaning that you'll get access to Databricks' public and internal courses to learn all the tricks of Distributed Data Processing, MLOps, Apache Spark, Databricks, and Cloud Migration from the best. Additionally, we'll pay for various data & cloud certifications, you'll get dedicated time for learning More ❯
Engineering languages such as Python, Scala, Java, SQL and a proven ability to learn new programming languages Experience with workflow orchestration tools such as Airflow Deep understanding of end-to-end pipeline design and implementation. Attention to detail and quality with excellent problem solving and interpersonal skills Preferred Qualifications … audience. Experience with DevOps tools such as Docker, Kubernetes, Jenkins, etc. Innate curiosity about consumer behavior and technology Experience with event messaging frameworks like Apache Kafka A fan of movies and television is a strong plus. Required Education Bachelor's degree in Computer Science, Information Systems, Software, Electrical or More ❯
responsible for the storing and processing of most of our data. You will work with other Data Engineers using tools such as Python, SQL, Airflow and Prefect to ensure quality and collaborate with other teams across the business. What you'll be working on: Implementing and supporting ETLs and … You should apply if you have: 3+ years of data engineering, analytics or machine learning experience Advanced skills in Python and SQL Experience with Airflow, Prefect or other task orchestration tools Familiarity with modern data stack - you know the current trends and what tools to use for the job … solutions Experience with documentation of system architecture Pandas, Jupyter, Plotly dbt, Kafka BI tools such as Tableau, Metabase and Superset The current tech stack: Airflow ClickHouse dbt Python MongoDB PostgreSQL MariaDB Kafka K8s AWS FXC Intelligence is a leading provider of cross-border payments data and intelligence, providing some More ❯