integration architecture and design skills. Good communication skills. Desirable Skills:
- JavaScript (React/Vue even better)
- Docker/Kubernetes
- Linux: basic sysadmin (Apache, Nginx)
- SQL/Oracle/PostgreSQL/MongoDB/DynamoDB
- Message queues: RabbitMQ or similar
- AWS or GCP
This is an office-based position.
have a valid visa, as we are not able to sponsor. Technical Stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow. Skills: years of experience in Python scripting; experience developing applications in Python; exposure to Python-oriented algorithm libraries such as NumPy, pandas…
Modelling. Experience with one or more of these programming languages: Python, Scala/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical-path approaches and how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with…
Tech:
- AWS (S3, Glue, EMR, Athena, Lambda)
- Snowflake, Redshift
- dbt (Data Build Tool)
- Programming: Python, Scala, Spark, PySpark or Ab Initio
- Data pipeline orchestration (Apache Airflow)
- Knowledge of SQL
This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday and…
East London, London, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work). Have public cloud experience with AWS or other cloud providers. Have an understanding of Apache products such as Kafka and Flake. Good knowledge of development using CI/CD. Bonus points if you have knowledge of: web products, financial markets…
offs explicit and understandable to others. REQUIREMENTS: 7+ years' coding experience, including 3 years in a dedicated ML Engineering role; 2+ years' experience with Apache Spark; experience working with GB+-scale data; experience with deployed ML services; experience deploying multiple ML projects across different environments; productionisation experience in at…
Flask, Tornado or Django; Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies such as Apache Spark, Hadoop, Kafka, etc. Data acquisition, development of data sets and improving data quality. Preparing data for predictive and prescriptive modelling. Hands-on…
with impressive visualization (Power BI)
- Experience in building large-scale DW/BI systems for B2B SaaS companies
- Experience with open-source tools like Apache Flink and AWS tools like S3, Redshift, EMR and RDS
- Experience with AI/machine learning and predictive analytics
- Experience in developing global products…
Manchester Area, United Kingdom Hybrid / WFH Options
Adria Solutions Ltd
data tasks. Knowledge of CI/CD approaches for Data Platforms using Bitbucket and Bitbucket Pipelines. Knowledge of AWS data lake approaches using Athena & Apache Iceberg tables. Exposure to visualisation development using Power BI. Knowledge of MS SQL Server, SSIS, Visual Studio, and SSDT projects. Experience in a relevant…
compliance with specifications. Should understand the banking domain. Should have core banking knowledge. Familiarity with databases (e.g. MySQL, MongoDB), web servers (e.g. Apache) and UI/UX design. Excellent communication and teamwork skills. Great attention to detail. Organizational skills. An analytical mind.
Skills & Experience: At least 10 years' experience working with JavaScript or Python/Java. Previous experience deploying software into the cloud (EKS, Docker, Kubernetes). Apache Spark or NiFi. Microservice architecture experience. Experience with AI/ML systems.
with AWS and services like S3. Experienced with Kafka for data streaming. Familiarity with BI reporting tools. Good working experience with Airflow, PySpark and Apache Beam. Worked with Java for building data applications – advantageous. Worked within the commodities space – advantageous. Not quite right for you? Refer a Data Engineer…
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. Ideal candidate…
experience in data engineering. Experienced in building ETL data pipelines. Relational database experience with PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with object-oriented programming. A desire to work in the commodities/trading sector. Permanent/full-time employment. Hybrid…
GCP) is highly preferred (experience with other cloud platforms like AWS or Azure is also considered). Familiarity with data pipeline scheduling tools like Apache Airflow. Ability to design, build, and maintain data pipelines for efficient data flow and processing. Understanding of data warehousing best practices and experience in…
the following:
- Experience of working in an Agile product delivery framework
- Experience with PHP 8+ and the Laravel framework
- Experience with Linux, NGINX (or Apache), MySQL server; LEMP/LAMP stack
- Experience of writing unit tests with test frameworks (PHPUnit, Codeception, etc.)
Although we have a dedicated QA Test…
Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create…
Drupal, Magento, BigCommerce, Laravel. Proficient in setting up development and staging environments. Proficient in using and altering MySQL databases. Familiarity with server structures, specifically Apache and Nginx. Familiarity with domain management and DNS records. Familiarity with Agile and Waterfall work environments. Nice-to-haves: familiarity with digital marketing/…
EC2N, Broad Street, Greater London, United Kingdom
James Joseph Associates
and team effectiveness. Participating in Client Service and Technology meetings. Handling production incidents and related problem management activities. TECH STACK: Linux, Apache, MySQL, OO Perl; AWS CloudWatch; monitoring and alerting systems; ticketing/workflow systems (e.g. Jira). KEY SKILLS/EXPERIENCE REQUIRED: Good degree in Computer…
Linux environments. Knowledge of data modeling, database design, and query optimization techniques. Experience with real-time data processing, streaming, and analytics technologies (e.g., Kafka, Apache Flink). Familiarity with financial markets, trading systems, and quantitative analysis is a plus. Excellent problem-solving, analytical, and communication skills, with the ability…
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes.
and IAM. Experience with containerization and orchestration tools, particularly Kubernetes. Proficiency in infrastructure-as-code tools such as Terraform, Ansible, or CloudFormation. Experience with Apache Airflow, AWS Backup and S3 versioning. Solid understanding of CI/CD concepts and experience implementing CI/CD pipelines using tools like Jenkins, GitLab…