Engineer, ETL/ELT, Azure, Azure SQL, Azure Data Factory, Azure Data Storage, Azure Synapse, Azure Data Lake, DevOps, CI/CD, Apache Spark, PySpark, SQL and Data Engineer) We have several fantastic new roles for Data Engineer (ETL/ELT, Azure, Azure SQL, Azure Data Factory, Azure … Data Storage, Azure Synapse, Azure Data Lake, DevOps, CI/CD, Apache Spark, PySpark, SQL) to join an ambitious and award-winning Microsoft consultancy. They specialise in delivering innovative technology-based business solutions to investment banks, financial services companies, prestigious music/media labels and many more. They … following: ETL/ELT, Azure, Azure SQL, Azure Data Factory, Azure Data Storage, Azure Synapse, Azure Data Lake, DevOps, CI/CD, Apache Spark, PySpark and SQL. Knowledge of software development methodologies is of interest (Agile, Scrum). My client will provide ongoing training. Successful candidates are likely more »
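To give a flavour of the PySpark/ETL work a role like this typically involves, here is a minimal batch ETL sketch: read raw files from a data lake path, apply a simple transformation, and write curated Parquet back out. The paths, column names and threshold are illustrative assumptions rather than details from the listing.

```python
# A minimal, illustrative PySpark ETL job. Paths, column names and the
# threshold are hypothetical placeholders, not details from the listing.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw CSV files from a (hypothetical) data lake location.
raw = spark.read.option("header", True).csv("/mnt/datalake/raw/orders/")

# Transform: cast types, drop incomplete rows, derive a flag column.
cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropna(subset=["order_id", "amount"])
       .withColumn("is_large_order", F.col("amount") > 1000)
)

# Load: write curated data back out as partitioned Parquet.
cleaned.write.mode("overwrite").partitionBy("is_large_order").parquet(
    "/mnt/datalake/curated/orders/"
)
```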
Cloud experience (we primarily use GCP) Working with container technologies Skills we’d like to hear about Data processing frameworks (such as Apache Spark, Apache Beam) Data orchestration and monitoring systems (such as Airflow) NoSQL databases for document storage (such as Elasticsearch) Message brokers, caching, and queuing … such as Kafka, Redis) Infrastructure-as-Code (IaC) using Terraform Management of data security, privacy, and lineage (from row-level security to systems like Apache Ranger/Apache Atlas) Statically typed languages (such as Scala) data build tool (dbt) Process For technical roles, we ask applicants to submit more »
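For the data orchestration side mentioned above, a minimal Airflow DAG might be sketched as follows (assuming Airflow 2.4 or later); the task names and the work they perform are hypothetical placeholders.

```python
# A minimal, illustrative Airflow DAG (assumes Airflow 2.4+). Task names
# and the work they do are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")

def transform():
    print("clean and reshape the extracted data")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run the transform step only after extraction has finished.
    extract_task >> transform_task
```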
and experience with tools such as compilers, IDEs and systems for version control (e.g., git); Experience with data systems: analytics (e.g., Apache Hadoop, Spark, Flink, Hive), streaming (e.g., Apache Flink, Spark, Kafka, Pulsar, RabbitMQ), security and governance (e.g., Apache Ranger, Sentry, Knox, Atlas), catalogs (e.g. … Collibra, Talend, Alation, DataHub), data and table formats (e.g., Apache Parquet, ORC, Avro, Iceberg, Delta Lake), quality (e.g., Apache Griffin), distributed query engines and databases (e.g., Presto, Dremio, Apache Pinot, Apache Druid, Oracle, Snowflake, AWS Redshift, Google BigQuery), DevOps. Google Anthos, AWS Outposts and Azure Arc. more »
similar role. Strong proficiency in SQL and relational database systems (e.g., MySQL, PostgreSQL). Expertise in building ETL pipelines using tools like Apache Spark, Apache Beam, or similar frameworks. Solid understanding of data warehousing and data modelling concepts. Proficient in programming languages such as Python, Scala, or more »
education and relevant experience Programming and querying languages e.g. SQL, R, Hive, Python, Java, Scala Data modelling, ETL data pipeline tools, e.g. Apache Spark, Apache Kafka Experience demonstrating proficiency as a Data Engineer or similar role, delivering data products and services such as data modelling, data schemas more »
of the following; Distributed event streaming platforms, such as Apache Kafka or similar products, Data processing and analytics platforms, such as Apache Spark, Apache Beam or their derivatives, Data warehouses in the cloud, such as AWS Redshift, GCP BigQuery and Azure Synapse, etc. We encourage you more »
Riseholme, LN2, Nettleham, Lincolnshire, Lincoln, UK Hybrid / WFH Options
Spinwell Global Limited
education and relevant experience Programming and querying languages e.g. SQL, R, Hive, Python, Java, Scala Data modelling, ETL data pipeline tools, e.g. Apache Spark, Apache Kafka Experience demonstrating proficiency as a Data Engineer or similar role, delivering data products and services such as data modelling, data schemas more »
Data Scientist - Glasgow - Python, Pandas, NumPy, SciPy, Scikit-learn, TensorFlow, PyTorch, Apache Spark, Hadoop, ML Ops, NLP, AWS, GCP. £60,000 to £90,000 We are looking for a highly motivated and experienced Data Scientist to join our client's team in Glasgow. The successful candidate will be part … Learning, Predictive Modelling, Data Visualisation, Data Mining, Statistical Analysis, Data Engineering Data Scientist - Glasgow - Python, Pandas, NumPy, SciPy, Scikit-learn, TensorFlow, PyTorch, Apache Spark, Hadoop, ML Ops, NLP, AWS, GCP: £60,000 to £90,000 For more information about Shift F5 and the opportunities we have to offer more » (a brief illustrative sketch of this Python stack appears after this listing)
Employment Type: Permanent
Salary: £50000 - £90000/annum Mileage, Pension, Health Scheme
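As flagged in the listing above, here is a minimal sketch of the Python/Pandas/Scikit-learn side of that stack; the dataset, columns and model choice are illustrative assumptions only.

```python
# A minimal, illustrative scikit-learn training sketch. The CSV file and
# its columns are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical tabular dataset with a binary target column.
df = pd.read_csv("customers.csv")
X = df[["age", "tenure_months", "monthly_spend"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Scale the features, then fit a simple baseline classifier.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```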
Exposure to running/maintaining ML models and MLOps is preferred. Experience with Big Data technologies and Distributed Systems such as Hadoop, HDFS, HIVE, Spark, Databricks, Cloudera. Experience with data lake formation and data warehousing principles. Experience with data modelling, schema design and using semi-structured and structured data. Understanding … Parquet, ORC, Avro and the pros and cons of each type. Experience developing near-real-time event streaming pipelines with tools such as Kafka, Spark Streaming, Apache Flink & Apache Beam. Strong experience in SQL and building data analytics. Good understanding of the differences and trade-offs between more »
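For the near-real-time event streaming pipelines mentioned above (Kafka plus Spark Streaming), a minimal Spark Structured Streaming sketch could look like the following; the broker address, topic and message schema are assumptions for illustration, and the Kafka connector package would need to be supplied.

```python
# A minimal, illustrative Spark Structured Streaming job reading from Kafka.
# Broker address, topic name and message schema are hypothetical; the job
# also assumes the spark-sql-kafka connector package is on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read raw events from Kafka and parse the JSON payload in each message value.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "payments")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Write parsed events to the console; a real pipeline would target a sink
# such as Delta Lake or a warehouse table instead.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```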
such as Python, Java, or Scala. Strong knowledge of SQL and experience with data warehousing. Experience with data processing frameworks such as Apache Spark or Apache Beam. Familiarity with cloud-based data platforms such as AWS, GCP, or Azure. Excellent communication skills and ability to work in more »
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Myles Roberts
and tools such as Kubernetes, Jenkins & Terraform. Strong understanding of software engineering best practices. Experience with big data processing technologies such as Apache Spark or Apache Flink would be nice to have. more »
London, England, United Kingdom Hybrid / WFH Options
Blue Wolf Digital
such as Tableau, QlikView, Power BI, Adobe Analytics, Google Analytics, Data Studio, Looker and any other similar applications is desirable Big data stack such as Spark, Hadoop, Flink or similar would be ideal but not mandatory CI/CD and modern engineering toolset desirable. Good engineering practices. Team Well versed more »
South East London, England, United Kingdom Hybrid / WFH Options
Blue Wolf Digital
such as Tableau, QlikView, Power BI, Adobe Analytics, Google Analytics, Data Studio, Looker and any other similar applications is desirable Big data stack such as Spark, Hadoop, Flink or similar would be ideal but not mandatory CI/CD and modern engineering toolset desirable. Good engineering practices. Team Well versed more »
to transform and cleanse data for analytical purposes, while adhering to best practices. Proficient in data engineering tools and technologies such as Apache Spark, Hadoop, AWS, Azure, GCP, and ETL frameworks. Strong knowledge of SQL and NoSQL databases, data modeling, and database design. Programming skills in languages like more »
Services but flexible Proficient in data processing and manipulation, preferably using Python Extensive experience within SQL/NoSQL databases and big data tools (Hadoop, Spark) a benefit as this will be something you'll be involved with long term. Cloud-based data solutions (e.g., AWS, Google Cloud, Azure). Data more »
Services but flexible Proficient in data processing and manipulation, preferably using Python Extensive experience within SQL/NoSQL databases and big data tools (Hadoop, Spark) a benefit as this will be something you'll be involved with long term. Cloud-based data solutions (e.g., AWS, Google Cloud, Azure). more »
development methods; Experience working with cloud platforms or accreditation (Azure, AWS, GCP); Experience of working with big data technologies such as Hadoop, Presto, Redshift, Spark or similar; Experience with AI techniques such as NLP or Neural Networks; Experience with deep learning tools such as Keras, TensorFlow, PyTorch, MXNet is more »
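As a small illustration of the deep learning tooling named above, here is a minimal PyTorch sketch of a feed-forward network and one training step; the network size and the random data are placeholders, not anything specified by the role.

```python
# A minimal, illustrative PyTorch sketch: a tiny feed-forward network and a
# single training step on random data. Sizes and data are placeholders.
import torch
from torch import nn

# Small network for a hypothetical binary classification task.
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on a random batch, just to show the shape of the loop.
x = torch.randn(8, 16)
y = torch.randint(0, 2, (8, 1)).float()

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print("loss:", loss.item())
```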
Tests) Desirable experience/skills: Exposure to the asset management business and/or financial markets Familiarity with AWS, Kubernetes, Docker, Terraform Experience with Spark, Databricks, Airflow more »
South East London, England, United Kingdom Hybrid / WFH Options
McCabe & Barton
Tests) Desirable experience/skills: Exposure to the asset management business and/or financial markets Familiarity with AWS, Kubernetes, Docker, Terraform Experience with Spark, Databricks, Airflow more »
Kubeflow. Experience in Java/Scala and proficiency in cloud compute platforms (AWS, GCP, Azure, etc.). Hands-on experience with big data technologies (Spark, Hadoop or Hive, etc.). If you're captivated by the realms of ML Ops and eager to shape the future of cybersecurity, we more »
programming languages (Python etc.) Proficiency with cloud services (ideally AWS) Proficiency with large-scale streaming data (Kafka, Kinesis, etc.) Proficiency with Big Data systems (Spark, Hadoop, Pig, etc.) Proficiency with structured or custom ETL management (Airflow, Luigi, etc.) Proficiency with SQL databases (Relational, Snowflake, etc.) Excellent communication and leadership more »
City of London, London, United Kingdom Hybrid / WFH Options
Vanquis Banking Group
compliance. Nice to have: Familiarity with Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation). Experience of data-frame processing (e.g. Pandas in Python, Spark in Scala/Python or Snowpark in Snowflake). Experience with Node.js or Java. Experience in a regulated/FinTech environment. Working Conditions - Norwich/ more »
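As a small illustration of the data-frame processing mentioned above (Pandas in Python), here is a minimal cleaning and aggregation sketch; the file name and columns are illustrative assumptions.

```python
# A minimal, illustrative Pandas data-frame processing sketch. The file name
# and columns are hypothetical placeholders.
import pandas as pd

# Load raw transactions and normalise a couple of fields.
df = pd.read_csv("transactions.csv", parse_dates=["posted_at"])
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df = df.dropna(subset=["account_id", "amount"])

# Aggregate to a simple per-account monthly summary.
summary = (
    df.assign(month=df["posted_at"].dt.to_period("M"))
      .groupby(["account_id", "month"], as_index=False)["amount"]
      .sum()
)
print(summary.head())
```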
Bradford, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Vanquis Banking Group
compliance. Nice to have: Familiarity with Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation). Experience of data-frame processing (e.g. Pandas in Python, Spark in Scala/Python or Snowpark in Snowflake). Experience with Node.js or Java. Experience in a regulated/FinTech environment. Working Conditions - Norwich/ more »
Norwich, Norfolk, East Anglia, United Kingdom Hybrid / WFH Options
Vanquis Banking Group
compliance. Nice to have: Familiarity with Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation). Experience of data-frame processing (e.g. Pandas in Python, Spark in Scala/Python or Snowpark in Snowflake). Experience with Node.js or Java. Experience in a regulated/FinTech environment. Working Conditions - Norwich/ more »
Petersfield, Hampshire, South East, United Kingdom Hybrid / WFH Options
Vanquis Banking Group
compliance. Nice to have: Familiarity with Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation). Experience of data-frame processing (e.g. Pandas in Python, Spark in Scala/Python or Snowpark in Snowflake). Experience with Node.js or Java. Experience in a regulated/FinTech environment. Working Conditions - Norwich/ more »