CMake proficiency in developing cross-platform SDKs for Windows, macOS, Linux, WebAssembly and embedded platforms. Knowledge of machine learning frameworks such as ONNX Runtime or Apache TVM. Experience deploying and optimising real-time embedded audio algorithms. Familiarity with audio codecs, audio formats and audio streaming protocols is preferred. Willingness to …
Leeds, England, United Kingdom Hybrid / WFH Options
Harvey Nash
websites and web apps using HTML, PHP, JavaScript. Full-stack development, Bootstrap, SQL. Best-practice PHP with an emphasis on secure development practices. Linux, Apache/Nginx, PostgreSQL/MySQL, Bootstrap stack. Creating scalable, clean and resilient solutions through code. Version control through Git to manage the codebase efficiently. …
need to showcase a good understanding of modern AWS/Azure Well-Architected Frameworks, along with demonstrable experience with SQL, Linux and web servers (Apache, Nginx), as well as knowledge of containerisation and serverless paradigms. The right candidate will have great attention to detail and strong analytical skills, with the …
Azure Data Lake, Azure Databricks or GCP Cloud Dataproc. Familiarity with big data technologies and distributed computing frameworks such as Hadoop, Spark, or Apache Flink. Experience scaling an "API ecosystem", designing and implementing "API-first" integration patterns. Experience working with authentication and authorisation protocols/patterns. Other information …
Kubernetes/Docker or other container technologies. Scripting skills including Python and Bash. Good understanding of Linux, web server and database technologies such as Ubuntu, Apache, PHP, MySQL, PostgreSQL, Nginx, Mercurial and Git.
development (ideally AWS) and container technologies. Strong communication and interpersonal skills. Experience managing projects and working with external third-party teams. Ideally experience with Apache Spark or Apache Flink (but not essential). Please note, this role is unable to provide sponsorship. If this role sounds of interest and …
London, England, United Kingdom Hybrid / WFH Options
Version 1
Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, Azure Stream Analytics. Direct experience in building data pipelines using Azure Data Factory and Apache Spark (preferably Databricks). Experience building data warehouse solutions using ETL/ELT tools such as SQL Server Integration Services (SSIS), Oracle Data Integrator (ODI), Talend, and WhereScape RED. Experience with Azure Event Hubs, IoT Hub, Apache Kafka, NiFi for use with streaming data/event-based data. Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala). Experience with open-source non-relational/NoSQL data repositories (incl. …
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to have: experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure. What will be your key responsibilities? A technical expert and leader on the Petcare …
science or other related engineering fields. Plusses: • Experience with React • Experience with MongoDB • Experience working on streaming technologies like Kafka and distributed technologies like Apache Ignite • Experience working on AWS, GCP, Kubernetes, IaC • Experience working with C# • Financial industry experience • Experience with cloud like AWS, GCP, Azure (ideally GCP …
Strong understanding of AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with data stack technologies such as Apache Iceberg and Spark. Exposure to Apache Airflow, Prefect, Dagster, dbt. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation …
DBs. Assisting in the development of high-performing teams. Demonstrable problem-solving and ownership skills. Nice to have: understanding/knowledge/exposure to Apache, Tomcat, container tools, SSO technologies, and monitoring tools, but certainly not critical to this position. The above is a wish list …
Monitoring, tuning, housekeeping. Experience working in an ITIL environment (desired). UNIX, Linux, AIX, Solaris (desired). Nice to have: understanding/knowledge/exposure to Apache, Tomcat, container tools, SSO technologies, and monitoring tools, but certainly not critical to this position. The above is a wish list …
with data science libraries such as pandas, NumPy, SciPy, etc. An understanding of machine learning. UI skills for interacting with dashboards constructed using Grafana, Apache Superset, etc. Experience with databases (e.g., SQL). Deep data and statistical analysis knowledge. Maintaining high-quality code documentation. Insight into what assurance metrics …
designing and building robust, scalable, distributed data systems and pipelines, using open-source and public cloud technologies. Strong experience with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments, e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices, e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, git/version control, containers etc. Strong …
AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with Dremio is a nice to have. Experience with data stack technologies such as Apache Iceberg and Spark. Exposure to Apache Airflow, Prefect, Dagster, dbt. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation …
that incorporate various data backends, query languages and ORM frameworks. Experience designing and building ETL pipelines built around libraries and frameworks like pandas and Apache Spark. Strong API design skills and familiarity with building web applications. A proponent of great testing, first-class observability and automating everything. Familiarity …
and transformation. 4. Develop and maintain ETL workflows, scripts, and data processing jobs using programming languages (e.g., Python, Java, Scala) and ETL tools (e.g., Apache Spark, Apache Airflow). 5. Identify and address data quality issues and implement data cleansing, validation, and enrichment processes. 6. Collaborate with software …
technologies, languages, and techniques in the rapidly evolving world of high-volume data processing. Technologies We Use: Development languages/frameworks: Java/Scala, Apache Spark, Kafka, Vertica, JavaScript (React/Redux), MicroStrategy. Amazon: EMR, Step Functions, SQS, Lambda and AWS cloud-native architectures. DevOps tools: Terraform or Cloud …