Technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have: hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure platforms), specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, and Dataflow/Airflow/ADF. Excellent consulting experience, with the ability to design and build solutions and actively contribute to RfP responses. Ability to act as the SPOC for all technical discussions across industry groups. Excellent design experience, with the entrepreneurial skills to own and lead solutions for clients. Excellent ETL and data modelling skills. Excellent communication skills. Ability to define the monitoring, alerting, and deployment strategies for various services. A minimum of 5 years’ experience in a similar role. Ability to lead and mentor architects. Required Skills: Mandatory Skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark and Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Preferred Skills: Designing Databricks-based …
London, England, United Kingdom Hybrid / WFH Options
AlphaSights
and well-tested solutions to automate data ingestion, transformation, and orchestration across systems. Own data operations infrastructure: manage and optimise key data infrastructure components within AWS, including Amazon Redshift, Apache Airflow for workflow orchestration, and other analytical tools. You will be responsible for ensuring the performance, reliability, and scalability of these systems to meet the growing demands of … data pipelines, data warehouses, and leveraging AWS data services. Strong proficiency in DataOps methodologies and tools, including experience with CI/CD pipelines, containerized applications, and workflow orchestration using Apache Airflow. Familiar with ETL frameworks, and bonus experience with Big Data processing (Spark, Hive, Trino) and data streaming. Proven track record – you’ve made a demonstrable impact in your …
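For readers unfamiliar with the orchestration work this posting describes, the core idea behind a workflow orchestrator such as Apache Airflow is running tasks in dependency order. The sketch below is a deliberately simplified, library-free illustration of that idea using only the standard library; the task names and dependency graph are invented for the example, not taken from the listing.

```python
# Minimal, library-free sketch of dependency-ordered task execution,
# the core idea behind workflow orchestrators like Apache Airflow.
# Task names and the dependency graph are illustrative only.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run callables in an order that respects their dependencies."""
    executed = []
    for name in TopologicalSorter(deps).static_order():
        tasks[name]()          # in Airflow, an operator would run here
        executed.append(name)
    return executed

tasks = {
    "extract": lambda: None,   # e.g. pull files from S3
    "transform": lambda: None, # e.g. clean and reshape records
    "load": lambda: None,      # e.g. COPY into Amazon Redshift
}
# load depends on transform, which depends on extract
deps = {"transform": {"extract"}, "load": {"transform"}}

print(run_pipeline(tasks, deps))  # → ['extract', 'transform', 'load']
```

In a real deployment each task would be an Airflow operator and the scheduler, rather than a loop, would decide when each runs.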
London, England, United Kingdom Hybrid / WFH Options
KennedyPearce Consulting
S3, Redshift, RDS, Glue, Lambda, IAM). Strong expertise in Terraform. Proficient in SQL for querying relational databases and handling large datasets. Experience with data pipeline orchestration tools (e.g., Apache Airflow, AWS Step Functions). Familiarity with CI/CD pipelines and version control systems (e.g., Git). Knowledge of data warehousing concepts and best practices. Benefits: Competitive …
writing, optimization techniques, data modeling, and database performance tuning. Skilled in working with large datasets, building stored procedures, functions, and triggers, and implementing ETL processes. Have used products like Apache Airflow, dbt, GitLab/GitHub, and BigQuery. Demonstrable experience in data modelling, including working with denormalised data structures, testing, asserts, and data cleansing. The other stuff we are looking …
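To make the "testing, asserts and data cleansing" skills above concrete, here is a minimal plain-Python sketch of a cleansing step guarded by assertions. The field names and validation rules are invented for illustration and are not from the listing.

```python
# Minimal sketch of data cleansing with assertions. Field names and
# validation rules are invented for illustration.

def cleanse(records):
    """Drop rows with missing ids, strip whitespace, and deduplicate."""
    seen = set()
    cleaned = []
    for row in records:
        if row.get("id") is None:
            continue                      # cleansing: drop incomplete rows
        row = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        if row["id"] in seen:
            continue                      # cleansing: deduplicate on id
        seen.add(row["id"])
        cleaned.append(row)
    # asserts: fail fast if the cleansed output violates expectations
    assert all(r["id"] is not None for r in cleaned)
    assert len({r["id"] for r in cleaned}) == len(cleaned)
    return cleaned

raw = [{"id": 1, "name": " Ada "}, {"id": None}, {"id": 1, "name": "Ada"}]
print(cleanse(raw))  # → [{'id': 1, 'name': 'Ada'}]
```

In a dbt project the same expectations would typically live in schema tests (e.g. `not_null`, `unique`) rather than inline asserts.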
standards while collaborating with cross-functional teams to optimise data flows and support data-driven decision-making. What we'd like to see from you: 3–5 years of experience in data integration, orchestration, or automation roles. Solid experience with orchestration tools (e.g., Apache Airflow, MuleSoft, Dell Boomi, Informatica Cloud). Familiarity with cloud data platforms (e.g., AWS, Microsoft Azure, Google Cloud Platform) and related data movement technologies, including AWS Lambda and …
data engineering or a related field. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. There’s no place quite like BFS and we’re proud of that. And it …
Snowflake Architect – ETL, Airflow, AWS, SQL, Python, and ETL tools (StreamSets, dbt), RDBMS. A Snowflake Architect is required for a long-term project with a fast-growing company. Responsibilities: - Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS. - Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets. - Collaborate with data … warehousing concepts and data modeling. - Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions. - Understanding of/hands-on experience with orchestration solutions such as Airflow. - Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability.
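To illustrate the "design, develop, and maintain robust data pipelines and ETL processes" responsibility above, below is a minimal extract-transform-load sketch. SQLite stands in for a warehouse such as Snowflake so the example stays self-contained; table names, columns, and figures are all illustrative.

```python
# Minimal ETL sketch. SQLite stands in for a warehouse like Snowflake;
# in practice the load step would use the Snowflake connector and an
# orchestrator such as Airflow would schedule the run. Names are illustrative.
import sqlite3

def extract():
    # Placeholder source: a real pipeline might read S3 files or an API.
    return [("2024-01-01", "120.50"), ("2024-01-02", "98.00")]

def transform(rows):
    # Parse amounts to floats; a real pipeline would also validate and cleanse.
    return [(day, float(amount)) for day, amount in rows]

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS daily_sales (day TEXT, amount REAL)")
    conn.executemany("INSERT INTO daily_sales VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
total = conn.execute("SELECT SUM(amount) FROM daily_sales").fetchone()[0]
print(total)  # → 218.5
```

The non-functional requirements the listing names (availability, operability) would show up here as idempotent loads, retries, and monitoring rather than extra business logic.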
managers, analysts, our supply partners, and our travelers. Our work spans a variety of datasets and ML models, and a diverse technology stack ranging from Spark, SageMaker, Airflow, Databricks, Kubernetes, AWS and much more! What you will do: work in a cross-functional team of Machine Learning Engineers and Machine Learning Scientists to design and code large … preferably in Spark. Good understanding of machine learning pipelines and machine learning frameworks such as TensorFlow and PyTorch. Familiar with cloud services (e.g., AWS) and workflow orchestration tools (e.g., Airflow). Experience working with Agile/Scrum methodologies. Familiar with the e-commerce or travel industry.
Experience with MarTech integration and customer data platforms (CDPs) such as Adobe Experience Platform, Salesforce Data Cloud, or Segment. Hands-on experience with tools like SQL, Python, Spark, Kafka, Airflow, or dbt.
Data Storage & Databases: SQL & NoSQL databases: experience with databases like PostgreSQL, MySQL, MongoDB, and Cassandra. Big Data ecosystems: Hadoop, Spark, Hive, and HBase. Data Integration & ETL: data pipelining tools: Apache NiFi, Apache Kafka, and Apache Flink. ETL tools: AWS Glue, Azure Data Factory, Talend, and Apache Airflow. AI & Machine Learning: frameworks: TensorFlow, PyTorch, scikit-learn, Keras …
and guidance to colleagues at all levels. The ideal candidate must be proficient in an agile delivery environment; excellent knowledge of dbt and Snowflake, and of other technologies including SQL, Airflow, Power BI and Azure Data Factory, would be beneficial. The RAC engineering team revolves around a platform mindset; as a Senior Data Engineer, you will extend this culture and ensure … influence, interface with the business, and make sense of complicated or incomplete requests. What you will need: great knowledge of technologies including but not limited to dbt, SQL, Snowflake, Airflow, Azure Data Factory, Power BI. Be able to work with minimal supervision in a dynamic and timeline-sensitive work environment. A strong understanding of agile data development methodologies, values, and …
of our team in the UK. Responsibilities – Technical Leadership: design and implement scalable data architectures and pipelines; work across cloud platforms (Azure, GCP, AWS) and orchestration tools (e.g. dbt, Airflow); build AI-ready data infrastructure for analytics and data science use cases; lead technical delivery across multiple client engagements. Client Engagement & Strategy: act as a technical advisor and thought … of SQL and Spark. Hands-on with Azure and Databricks (GCP/AWS also valued). Skilled in data governance, ingestion/transformation, and metadata management. Familiarity with tools like Airflow, dbt, Power BI, Tableau. Exposure to machine learning or advanced analytics techniques. Strategic & Leadership Skills: ability to translate technical solutions into business value for senior stakeholders; team leadership, mentoring …
revenue data in a cloud environment. Ensure models are robust, explainable, and production-ready. Tech environment: Python (NumPy, Pandas, scikit-learn); SQL; GCP (BigQuery, Cloud Functions); deployment: Docker, Kubernetes, Airflow; Git, CI/CD pipelines; Tableau (optional). Requirements: proven expertise in time series forecasting across multiple projects; strong Python and SQL skills; experience deploying models into production in a …