and AI models. Data Engineer Required Experience Data engineering experience (2+ years) Cloud platform proficiency (e.g., AWS, Azure, GCP) Data pipeline development (e.g., Airflow, Apache Spark) SQL proficiency, database design Visualization tools knowledge (e.g., Tableau, Power BI, Looker) Data Engineer Application Process This is a 1-year contract requirement with more »
East London, London, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work) Have public cloud experience with AWS or other cloud providers Have an understanding of Apache products such as Kafka and Flink Good knowledge of development using CI/CD Bonus points if you have knowledge of: Web products Financial markets more »
Greater London, England, United Kingdom Hybrid / WFH Options
Understanding Recruitment
use Java (for a very small amount of scripting work) Have public cloud experience with AWS or other cloud providers Have an understanding of Apache products such as Kafka and Flink Good knowledge of development using CI/CD Bonus points if you have knowledge of: Web products Financial markets more »
the following: - Experience of working in an Agile product delivery framework - Experience with PHP 8+ and the Laravel framework - Experience with Linux, NGINX (or Apache), MySQL server. LEMP/LAMP stack. - Experience of writing unit tests with test frameworks (PHPUnit, Codeception, etc.) Although we have a dedicated QA Test more »
Employment Type: Permanent
Salary: £55000 - £65000/annum Up to £65K Basic + Benefits package
Databricks Must Have Hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/ more »
the following: - Experience of working in an Agile product delivery framework - Experience with PHP 8+ and the Laravel framework - Experience with Linux, NGINX (or Apache), MySQL server. LEMP/LAMP stack. - Experience of writing unit tests with test frameworks (PHPUnit, Codeception, etc.) Although we have a dedicated QA Test more »
Employment Type: Permanent
Salary: £70000 - £80000/annum Up to £80K Basic + Benefits package
EC2N, Broad Street, Greater London, United Kingdom
James Joseph Associates
and team effectiveness Participating in Client Service and Technology meetings Handling production incidents and problem management activities related to those incidents TECH STACK: Linux, Apache, MySQL, OO Perl AWS CloudWatch Monitoring and Alerting Systems Ticketing/workflow Systems (e.g. Jira) KEY SKILLS/EXPERIENCE REQUIRED: Good degree in Computer more »
experience in data engineering. Experienced in building ETL data pipelines. Relational database experience w/PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming A desire to work in the commodities/trading sector. Permanent/Full-Time Employment. Hybrid more »
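The ETL pipeline experience called for above can be sketched minimally. This is an illustrative extract-transform-load step only: SQLite stands in for the PostgreSQL mentioned in the ad, and the `trades` table and CSV columns are invented for the example.

```python
# Minimal ETL sketch: extract rows from CSV text, transform them, and
# load them into a relational table. SQLite stands in for PostgreSQL;
# the table and column names are illustrative, not from any real system.
import csv
import io
import sqlite3

def run_pipeline(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract -> transform -> load; returns the number of rows loaded."""
    conn.execute("CREATE TABLE IF NOT EXISTS trades (symbol TEXT, price REAL)")
    rows = csv.DictReader(io.StringIO(csv_text))          # extract
    cleaned = [(r["symbol"].upper(), float(r["price"]))   # transform
               for r in rows if r["price"]]
    conn.executemany("INSERT INTO trades VALUES (?, ?)", cleaned)  # load
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
n = run_pipeline("symbol,price\naapl,189.5\nmsft,411.2\n", conn)
```

In a production pipeline the same extract/transform/load shape would typically be expressed as an Apache Beam transform or a Kafka consumer, with the database connection managed by the orchestrator.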
London, Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
Glue). Hands-on experience with Databricks for data processing and analytics. Proficient in Python programming for data manipulation and automation. Solid understanding of Apache Spark for big data processing. Strong SQL skills for data querying, transformation, and analysis. Excellent problem-solving abilities and attention to detail. Ability to more »
libraries. Knowledge of Azure or other cloud services and ability to implement solutions utilizing them. Familiarity with databases (e.g., MySQL, MongoDB), web servers (e.g. Apache) and UI/UX design. Building and scaling infrastructure services using Microsoft Azure Experience of using core cloud application infrastructure services including identity platforms more »
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to Have Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure What will be your key responsibilities? A technical expert and leader on the Petcare more »
as Linux, Bash and PowerShell with a good grounding in Linux OS and IP networking. Application and web server knowledge of NGINX and Apache, and virtualisation knowledge of Docker and K8s. The majority of the applications are written in Java and Terraform cloud infrastructure provisioning and APM tool more »
master and metadata management Experience with Azure SQL Database, Azure Data Factory, Azure Storage, Azure IaaS/PaaS related database implementations. Experience with Apache Spark and the new Fabric framework would be a plus. more »
in Computer Science, Engineering (or other related STEM subject) 5+ years' experience in data engineering 2+ years in a leadership role. Experience working with Apache Spark, Azure Data Factory and other data pipeline tools. Strong programming skills. Impeccable communication skills. Precise attention to detail. Pioneering attitude. If you are more »
preferably GCP). BSc/MSc in computer science, maths, physics or STEM subject. Basic knowledge of statistics and machine learning. Experience with Spark, Apache services, ETL tools, Data visualization and dashboards. Experience with streamed data processing, parallel compute, and/or event based architectures. Experience with web-scraping more »
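The web-scraping experience mentioned above can be illustrated with a minimal standard-library sketch: extracting link targets from HTML. The HTML snippet and class name below are made up for the example; a real scraper would fetch live pages and respect robots.txt.

```python
# Tiny web-scraping sketch using only the standard library: collect the
# href targets of <a> tags from an HTML string. Illustrative only.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Accumulates href values from anchor tags as the parser feeds."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page = '<html><body><a href="/a">A</a> <a href="/b">B</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
```

In practice, libraries such as BeautifulSoup or Scrapy replace the hand-rolled parser, but the event-driven structure is the same.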
Job Description Must have excellent Liferay and Java skills, Spring MVC, Apache CXF, Dozer (a fast and flexible framework for mapping back and forth between JavaBeans) & XML: A minimum of 5 years' work experience in Software Development Experience with implementing service-oriented architecture (SOA) Designs and develops Enterprise more »
the Midlands. Ideal Candidate Profile: We are seeking an individual who has the following attributes: Proven expertise as a Data Engineer, demonstrating proficiency in Apache Spark and cloud-based technologies, particularly Microsoft Azure and Databricks. Strong programming skills, with a focus on Python, along with proficiency in ETL frameworks more »
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes. more »
and IAM. Experience with containerization and orchestration tools, particularly Kubernetes. Proficiency in infrastructure as code tools such as Terraform, Ansible, or CloudFormation. Experience in Apache Airflow, AWS Backup & S3 versioning Solid understanding of CI/CD concepts and experience implementing CI/CD pipelines using tools like Jenkins, GitLab more »
City of London, London, United Kingdom Hybrid / WFH Options
TECHNOLOGY RECWORKS LIMITED
sponsors) Knowledge and experience of the following would be advantageous: Knowledge of Enterprise Architecture Frameworks Good knowledge of Azure DevOps Pipelines Strong experience with the Apache Spark framework Previous experience in designing and delivering data warehouse and business intelligence solutions using the on-premises Microsoft stack (SSIS, SSRS, SSAS) Knowledge of more »
in alignment with strategic business priorities. Workflow Development: gathering requirements and designing solutions alongside stakeholders to build workflows which enhance user experience, using the Apache Groovy programming language or Python. Develop and implement integrations between Collibra and different platforms and systems within the Society Build, design and upload Microsoft more »
of the day to day role): Virtualisation with KVM/QEMU Scripting skills (PowerShell, Python, Bash, Ruby, Perl, etc.) LAMP (or components of - Linux, Apache, MySQL, PHP) OpenStack (or similar cloud software platform) VMware ESXi Hypervisor Data Modelling tools (XML, NETCONF, YANG, JSON) Must have the ability to break more »
independently or as part of a team and operate to tight deadlines Sense of humour please Advantageous But Not a Must: Great knowledge of Apache, specifically mod_rewrite Good working knowledge of Linux, including command line Server administration experience Ecommerce experience Smarty Competitive Salary Career Growth and Financial Stability more »
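For context, mod_rewrite is Apache's URL-rewriting module. A typical use, common on ecommerce sites like the one described above, is mapping a clean URL onto a query-string script; the paths and script name below are illustrative, not from any real configuration.

```apache
# Illustrative .htaccess fragment: serve /product/123 via product.php?id=123.
RewriteEngine On
RewriteRule ^product/([0-9]+)$ product.php?id=$1 [L]
```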