pandas, numpy, pyspark. Good understanding of OOP, software design patterns, and SOLID principles. Good experience in Docker, Linux, and Airflow. Good knowledge of cloud architecture. Good experience in Terraform. Expert experience with database systems (Snowflake, SQL, Postgres, etc.). Experience of microservice development and …
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond®
ensuring best practices and quality in data transformation and modelling. Essential experience with tech including: GCP, SQL and DBT. Preferably working experience with: Kafka, Dataform, Airflow, Tableau, Power BI, Redshift, Snowflake, Terraform and BigQuery. This position does not offer visa sponsorship; please refrain from applying if you require sponsorship at any …
on with projects but no requirements; previous experience in this is essential. SKILLS AND EXPERIENCE NEEDED: Experience with the Redshift database (AWS) Experience with DBT, Airflow and Fivetran Working collaboratively with multiple teams across the business Management/mentoring experience of a team INTERVIEW PROCESS: 1st Stage: Initial Chat 2nd …
Key technical experience: Ability to operate in a fast-changing environment. Fluent in English. Previous cloud-based infrastructure experience, particularly with AWS. Experience using Airflow and dbt. Expert SQL knowledge. Solid understanding of dimensional data modelling. Experience with one or more of these programming languages: Python, Scala …/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical-path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with ambiguity and change. A self-starter who's able to work independently where necessary …
hosted data platform that would be used by the entire firm. Tech stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering, including: end-to-end ETL pipeline development … Implementing data curation, metadata management and data quality tooling. Requirements: Strong Python/Java software engineering skills Excellent AWS knowledge, ideally with exposure to Airflow, Glue, Iceberg and Snowflake Previous experience with Dremio, dbt, EMR or Dagster Good computer science fundamentals with strong knowledge of software and data …
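Roles like this centre on end-to-end ETL pipeline development. As a minimal, hypothetical sketch of what the extract/transform/load stages look like in plain Python — using the standard library only, with sqlite3 standing in for a warehouse such as Snowflake, and all table and field names invented for illustration:

```python
# Illustrative end-to-end ETL step: extract from CSV, transform in
# Python, load into SQLite (standing in for a warehouse). All names
# (trade_id, notional, trades) are hypothetical examples.
import csv
import io
import sqlite3

RAW_CSV = """trade_id,notional,currency
1,1000000,GBP
2,250000,USD
"""

def extract(raw: str) -> list[dict]:
    """Parse raw CSV into dicts (the 'extract' stage)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Cast types and filter out bad rows (the 'transform' stage)."""
    return [(int(r["trade_id"]), float(r["notional"]), r["currency"])
            for r in rows if float(r["notional"]) > 0]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Write curated rows to the target table (the 'load' stage)."""
    conn.execute("CREATE TABLE IF NOT EXISTS trades (id INT, notional REAL, ccy TEXT)")
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(notional) FROM trades").fetchone()[0]
print(total)  # 1250000.0
```

In production the same three stages would be expressed as tasks in an orchestrator (Airflow or Dagster) and the load target would be the warehouse rather than SQLite.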
Role: Graduate Data Engineer Type: 12 months fixed-term Location: Peterborough Ready to utilise your skills to process and extract value from large datasets? Are you passionate about performing root cause analysis on various data? We have an exciting role …
Cambridge, England, United Kingdom Hybrid / WFH Options
Corriculo Recruitment
a MedTech company with a growing reputation in their field. The Data Architect will use their skills and experience of technologies including Kubernetes, Kafka, Airflow, Linux, AWS and Python to spearhead the evolution and enhancement of their existing data infrastructure as they scale up the adoption of their software … the Software as a Medical Device industry Experience of designing and implementing scalable and secure data infrastructure Hands-on experience with Kubernetes, Kafka, Airflow, Python, Ubuntu Linux, AWS (or similar cloud platforms) Strong database skills using SQL; any experience with PostgreSQL or MongoDB would be advantageous An understanding …
or Angular good but not necessary) Agile The following is DESIRABLE, not essential: AWS or GCP Buy-side data tools such as Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio Fixed Income performance, risk or attribution TypeScript and Node Role: Python Developer (Software Engineer Programmer Developer Python Fixed Income … work is largely down to you. It can be entirely back end. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with … the office 1-2 times a week. The tech environment is very new and will soon likely include exposure to the following: Glue, Athena, Airflow, Ignite, DBT, Arrow, Iceberg, Dremio. This is an environment that has been described as the only corporate environment with a start-up/fintech …
such as Fivetran or Stitch Experience using business intelligence tools such as Power BI, Looker or Tableau Experience with data pipeline tools such as DBT, Airflow or Luigi is a plus! Experience using cloud environments, e.g. Azure or AWS Understanding of the Agile delivery method Working Conditions: Permanent, London Chiswick …
range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark, Hive and a lot more. You’ll need to come from a strong academic background with some commercial experience in a data-heavy …
algorithms Expertise in popular data science platforms such as Alteryx and Python, including libraries and frameworks like NumPy, SciPy, Pandas, NLTK, TensorFlow, PyTorch, and Airflow Strong understanding of statistical analysis, encompassing distributions, statistical testing, regression, and other techniques Experience handling unstructured data sets Familiarity with software engineering principles and …
tooling: containers (e.g., Docker); container orchestration (Kubernetes/K8s); CI/CD experience; version control (Git, GitHub, GitLab); orchestration/DAG tools (e.g., Argo, Airflow, Kubeflow); Infrastructure as Code (Terraform, etc.) HOW TO APPLY Please register your interest by sending your CV to niall.wharton@xcede.com or click the Apply …
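The orchestration/DAG tools named above (Argo, Airflow, Kubeflow) all share one core idea: tasks form a directed acyclic graph and run in dependency order. A minimal sketch of that idea using only the Python standard library (`graphlib`, 3.9+) — the task names here are invented for illustration:

```python
# Core idea behind DAG orchestrators: topologically order tasks by
# their upstream dependencies. Task names are hypothetical examples.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'quality_check', 'load']
```

Real orchestrators add scheduling, retries, and distributed execution on top, but the dependency-ordering contract is the same.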
London, England, United Kingdom Hybrid / WFH Options
Aventum Group
Profisee), Snowflake Data Integration, Azure Service Bus, Delta Lake, BigQuery, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI, Azure ML is a plus. Experience with …
of the company's data infrastructure. You will work with some of the most innovative tools in the market, including Snowflake, AWS (Glue, S3), Apache Spark, Apache Airflow and DBT! The role is hybrid, with 2 days in the office in central London, and the company is … Experience developing and maintaining data pipelines from scratch Data modelling, data integration and transformation experience Hands-on work with tools such as Snowflake, AWS, Airflow, and DBT Proficiency in data manipulation, scripting and automation with Python Desirable: Experience leading teams Version control systems such as Git or Bitbucket Agile …
Degree in Computer Science, Engineering, Management Information Systems, Mathematics, a related field, or equivalent work experience (3+ years) Experience in: Database orchestration technologies, specifically Airflow and/or DBT Experience with streaming data architectures, specifically Kafka Knowledge of semi-structured data: Parquet, Avro, JSON. A deep understanding of AWS Cloud …
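"Semi-structured data" in roles like this means nested records (JSON, Avro) that must be reshaped before landing in columnar formats such as Parquet. A small stdlib-only sketch of that reshaping — flattening nested JSON into dot-delimited columns; the record and field names are invented for illustration:

```python
# Hedged sketch: flatten a semi-structured JSON record into flat,
# columnar-friendly key/value pairs, the kind of shaping done before
# writing formats like Parquet. Field names are hypothetical.
import json

record = json.loads('{"user": {"id": 7, "geo": {"country": "UK"}}, "events": 3}')

def flatten(obj: dict, prefix: str = "") -> dict:
    """Recursively flatten nested dicts into dot-delimited keys."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

flat = flatten(record)
print(flat)  # {'user.id': 7, 'user.geo.country': 'UK', 'events': 3}
```

In practice libraries like PyArrow or Spark handle this schema work, but the underlying transformation is the one shown.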
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
in SQL, NoSQL, Blob, Delta Lake, and other enterprise-scale data stores. Data Orchestration: enterprise-scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling: Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure …
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java or C# Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark or Hive Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation …