Lead Data Engineer. Location: York - 2 days per week onsite/3 days WFH. Start: ASAP. Duration: 6-12 months. Requirements: Snowflake, DBT/Airflow, Azure, PowerBI/DAX, traditional SQL, JIRA, Confluence (GitHub/BitBucket), Azure Data Factory …
expertise with either AWS or GCP. Strong Python experience. Exposure to medallion architecture. 👍 Bonus points for: supporting experience with tech like Athena, Redshift, BigQuery, Airflow, Kinesis, Kafka (or similar); the ability to contribute to technical decisioning; experience working with a high volume of data pipelines …
to onboard strategies and desks efficiently. Working with time-series data sets, and building the data infra for the desk. Stack: Python, SQL, MongoDB, Airflow. The firm can offer market-leading salaries, collaboration with London's best technologists, and a hybrid work model. If this sounds of interest …
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond®
ensuring best practice and quality in data transformation and modelling. Essential experience with tech including: GCP, SQL and DBT. Preferably working experience with: Kafka, Dataform, Airflow, Tableau, PowerBI, Redshift, Snowflake, Terraform and BigQuery. This position does not offer Visa Sponsorship; please refrain from applying if you require sponsorship at any …
requirements and achieving impactful outcomes. A detail-oriented mindset and a reputation for maintaining high standards in your work. Tech Stack: dbt, SQL, Snowflake, Airflow, AWS, Tableau. DE&I is also at the heart of the business and they strongly believe that ensuring diversity of background and experience will …
range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark and Hive, among others. You’ll need to come from a strong academic background with some commercial experience in a data-heavy …
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar orchestration tools): experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar reporting tools): knowledge of Qlik Sense Cloud or similar reporting tools such …
algorithms. Expertise in popular data science platforms such as Alteryx and Python, including libraries and frameworks like NumPy, SciPy, Pandas, NLTK, TensorFlow, PyTorch, and Airflow. Strong understanding of statistical analysis, encompassing distributions, statistical testing, regression, and other techniques. Experience handling unstructured data sets. Familiarity with software engineering principles and …
Bradford, England, United Kingdom Hybrid / WFH Options
HCLTech
Pub/Sub, Dataflow, Dataproc, BigQuery, Cloud SQL). Knowledge of containers and container orchestration. CI/CD experience. Version control (Git). Orchestration tools (Airflow or Cloud Composer) …
our customers. Skills and experience we’re looking for: Experience of creating advanced visualisations in Tableau. Experience in SQL and Python. Experience of AWS, Airflow, S3 and working with Snowflake in a large complex organisation is advantageous. Experience of establishing processes to identify and manage issues, in data or …
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
diverse environments, leveraging Azure and other modern technologies. Proven ability to orchestrate complex data workflows and manage Kubernetes clusters on AKS, utilizing tools like Airflow, Kubeflow, Argo, and Dagster. Familiarity with data ingestion tools such as Airbyte and Fivetran, accommodating a wide array of data sources. Mastery of large …
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies: AWS, GCP, Azure. Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong programming skills in software development. A cloud-agnostic mindset …
London, England, United Kingdom Hybrid / WFH Options
Parkopedia
of web development principles (HTTP, RESTful APIs) and data structures. Experience with data retrieval, transformation, and manipulation techniques using Python and tools such as Apache Airflow. Commercial experience with AWS and IaC (Terraform/CDK/CloudFormation). Applicable understanding of API security, common exploits and secure development practices …
data quality checks and monitoring processes. KEY SKILLS: Proficiency in SQL and data querying for validation and testing purposes. Hands-on experience with Snowflake, Airflow or DBT. Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases and data modelling concepts. Experience …
learning systems. Expert in Python. Knowledge of common ML libraries like Pandas, NumPy and Scikit-Learn. Experience with ML workflow orchestration tools (Kubeflow, MLflow, Airflow, etc.). Experience with Vertex AI. Understanding of CI/CD pipelines and DevOps processes. Ability to collaborate with data scientists and software engineers. …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
TMS (Tealium iQ, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data pipelines and modelling using SQL, DBT, Airflow, ETL, data warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development …
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
for advanced content analysis and indexing, and developing RAG services. Experience in managing data workflows and Kubernetes clusters on AKS, utilizing tools such as Airflow, Kubeflow, Argo, and Dagster. Familiarity using scripting languages and tools such as Bash, PowerShell, Azure CLI, Terraform, and Helm Charts. Additional Information. Location: This …
S3, Lambda, Aurora, Docker/Kubernetes. AWS Cloud Management and Optimisation: Manage and optimise cloud-based infrastructure on AWS, ensuring security, scalability, reliability, and cost-efficiency. MUST HAVE: Apache Airflow, AWS networking concepts such as VPN, VPC peering, subnets, security groups, NAT gateways, and how to implement a scalable and fault-tolerant network …
City of London, London, United Kingdom Hybrid / WFH Options
ECS Resource Group
role in shaping the technological landscape of our projects. Key Responsibilities: Architect end-to-end solutions, leveraging your expertise in Python, Java, Spark, Trino, Airflow, and Hadoop. Maintain a hands-on approach, ensuring that your architectural vision translates seamlessly into implementation. Collaborate with cross-functional teams to design and …
data manipulation and analysis. Experience with Kubernetes and Docker for scalable infrastructure deployment. Hands-on experience with CI/CD and orchestration tools, e.g. Airflow, Jenkins, Bamboo, etc. Experience with MongoDB is highly beneficial. Knowledge of data pipeline development and optimization. Familiarity with data science principles and workflows is …
NLP approaches like Word2Vec or BERT, including identifying the right KPIs and objective functions. Experience working with big data systems (Spark, EMR, Kafka, S3, Airflow) and programming languages (Java, Scala, or Python). Experience building in-production machine learning systems. Good understanding of system architecture, including experience with big …
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
LangChain, Semantic Kernel, and tools like MS tooling, Co-Pilot Studio, ML Studio, Prompt flow, Kedro, etc. Proficiency in pipeline orchestration with tools like Airflow, Kubeflow, and Argo. Exceptional communication skills, with the ability to articulate complex statistical concepts clearly. Personal Competencies: A results-oriented professional with a …
Role: Site Reliability Engineer (SRE DevOps Infrastructure AWS Python Java Go JavaScript Big Data Lake Data Mesh Kubernetes Terraform Finance Trading Glue Athena Dremio Iceberg Snowflake DBT Arrow gRPC protobuf Airflow Ignite Asset Manager Investment Management Financial Services Hedge Fund) required by our asset management client in London. You MUST have the following: Strong experience … in an enterprise-scale environment. The following is DESIRABLE, not essential: SRE for big data; Glue, Athena, Dremio, Iceberg, Snowflake, DBT, Arrow, gRPC, protobuf, Airflow, Ignite; Grafana, Prometheus. You will join a team of 6 data engineers who are responsible for core engineering of a big data environment on AWS. You …
4. Monitoring and Logging: Implement and maintain monitoring, logging, and alerting solutions. Key technologies: AWS, VPN, VPC Peering, EC2, S3, Lambda, Aurora, Docker/Kubernetes, Apache Airflow, AWS networking concepts such as VPN, VPC peering, subnets, security groups, NAT gateways, AWS CloudWatch or equivalent, Kafka or similar data streaming components. …
on with projects but no requirements; previous experience in this is essential. SKILLS AND EXPERIENCE NEEDED: Experience in Redshift database (AWS). Experience with DBT, Airflow and Fivetran. Working collaboratively with multiple teams across the business. Management/mentoring experience of a team. INTERVIEW PROCESS: 1st Stage - Initial Chat. 2nd …