Databricks • Must have hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure) and specifically with big data processing services (Apache Spark, Apache Beam or equivalent). • In-depth knowledge of key technologies such as BigQuery/Redshift/Synapse/Pub Sub/Kinesis/ …
/Azure DevOps and Visual Studio 2019/2022. Knowledge of jQuery, Windows Forms, MySQL, and MS Power BI reporting. Familiarity with managing a Linux/Apache web stack and a Bootstrap front end. Sector experience in technology and/or education. What We're Looking For: Passion for user experience design …
Manchester Area, United Kingdom Hybrid / WFH Options
Adria Solutions Ltd
data tasks. Knowledge of CI/CD approaches for Data Platforms using Bitbucket and Bitbucket Pipelines. Knowledge of AWS data lake approaches using Athena & Apache Iceberg tables. Exposure to visualisation development using Power BI. Knowledge of MS SQL Server, SSIS, Visual Studio, and SSDT projects. Experience in a relevant …
compliance with specifications. Should understand the banking domain. Should have Core Banking knowledge. Familiarity with databases (e.g. MySQL, MongoDB), web servers (e.g. Apache), and UI/UX design. Excellent communication and teamwork skills. Great attention to detail. Organizational skills. An analytical mind. …
Luton, England, United Kingdom Hybrid / WFH Options
Ventula Consulting
models and following best practices. The ability to develop pipelines using SageMaker, MLflow or similar frameworks. Strong experience with data programming frameworks such as Apache Spark. Understanding of common Data Science and Machine Learning models, libraries and frameworks. This role provides a competitive salary plus an excellent benefits package. In …
experience in data engineering. Experienced in building ETL data pipelines. Relational database experience with PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with object-oriented programming. A desire to work in the commodities/trading sector. Permanent/Full-Time Employment. Hybrid …
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques. Nice to Have: Experience working with projects in Apache Spark, Databricks or similar. Expert cloud platform knowledge, e.g. Azure. What will be your key responsibilities? A technical expert and leader on the Petcare …
Strong background with C++. Security-Enhanced Linux. Strong knowledge of networking fundamentals. Scripting languages, e.g. Ruby, Python, Bash. Experience of modern libraries including STL and Apache libraries (NiFi etc. …
Proven experience in MLOps and deploying machine learning models on Kubernetes. Proficiency in cloud technologies: AWS, GCP, Azure. Experience with data orchestration tools (e.g., Apache Airflow). Familiarity with Terraform for CI/CD and infrastructure as code. Strong software development and programming skills. A cloud-agnostic mindset, with …
master and metadata management. Experience with Azure SQL Database, Azure Data Factory, Azure Storage, and Azure IaaS/PaaS-related database implementations. Experience with Apache Spark and the new Fabric framework would be a plus.
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes.
and IAM. Experience with containerization and orchestration tools, particularly Kubernetes. Proficiency in infrastructure-as-code tools such as Terraform, Ansible, or CloudFormation. Experience with Apache Airflow, AWS Backup, and S3 versioning. Solid understanding of CI/CD concepts and experience implementing CI/CD pipelines using tools like Jenkins, GitLab …
environment. Understanding of cloud-native computing concepts and experience with hybrid or private cloud platforms is a plus. Technical experience with Microsoft, Red Hat, and Apache software products. Team-oriented with a passion for engineering excellence and the ability to lead and inspire a team of skilled engineers. Awareness of …
DevOps/Agile. Experience of managing environments using IaC (Terraform APIs). Experience of designing robust, secure, and compliant platform capabilities. Strong understanding of Apache Spark, including its architecture, components, and how to create, monitor, optimize, and scale Spark jobs.
Microsoft Azure is recommended, as Microsoft Fabric is integrated with Azure services. Experience of designing robust, secure and compliant platform capabilities. Strong understanding of Apache Spark, including its architecture, components, and how to create, monitor, optimize, and scale Spark jobs. Experience of working in a DevOps/Agile team …
HAProxy traffic balancing (layer 4/layer 7). Encryption technologies. Ansible or other automated deployment/configuration tools. Desirable Skills: Open-source web stacks (PostgreSQL/Apache/NGINX/etcd), Git & CI/CD, Docker Swarm. The company: Our client is an international software company trusted by the world's …
teams to support the orchestration of our ETL pipelines using Airflow, and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and quickly resolving production issues. Contribute to …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
MBN Solutions
technologies, techniques, and architectures to build and maintain robust data pipelines. Technological Proficiency: Experience with technologies such as Azure Data Factory, Pentaho Data Integrator, Apache Hop, etc. ETL/ELT Practices: Strong understanding of modern ETL/ELT practices, frameworks, tooling, and execution environments. Data Delivery: Knowledge of data …
environment. Familiarity with cloud-native computing concepts and experience with hybrid or private cloud platforms is advantageous. Technical expertise in a Microsoft, Red Hat, and Apache data and software engineering environment. A team-oriented individual with a dedication to engineering excellence and the ability to lead and inspire a team …
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Our Commitment to Diversity and Inclusion: At Databricks, we are committed …
data engineering or a similar role. > Proficiency in programming languages such as Python, Java, or Scala. > Strong experience with data processing frameworks such as Apache Spark, Apache Flink, or Hadoop. > Hands-on experience with cloud platforms such as AWS, Google Cloud, or Azure. > Experience with data warehousing technologies …
City of London, England, United Kingdom Hybrid / WFH Options
Penguin Recruitment
market-leading software required to communicate and exchange technical information. Demonstrable academic qualifications appropriate to the role. Be a NABERS Assessor. Have experience with Apache HVAC (preferred). Have a genuine interest and enthusiasm in the built environment and progress toward Net Zero Carbon. Demonstrable familiarity with key industry agendas …
Data Scientists and Service Engineering teams. Experience with design, development, and operations that leverages deep knowledge in the use of services like Amazon Kinesis, Apache Kafka, Apache Spark, Amazon SageMaker, Amazon EMR, NoSQL technologies, and other third parties. Develop and define key business questions and build data …
Elasticsearch, BigQuery, PostgreSQL (FullCircl Lead Data Engineer, 04.24) · Kubernetes, Docker, Airflow. KEY RESPONSIBILITIES: · Designing and implementing scalable data pipelines using tools such as Apache Spark, Google Pub/Sub, etc. · Optimizing data storage and retrieval systems for maximum performance using both relational and NoSQL databases. · Continuously monitoring and improving the … Data Infrastructure projects, as well as designing and building data-intensive applications and services. · Experience with data processing and distributed computing frameworks such as Apache Spark. · Expert knowledge in one or more of the following languages: Python, Scala, Java, Kotlin. · Deep knowledge of data modelling, data access, and data …
London, England, United Kingdom Hybrid / WFH Options
Pioneer Search
Senior Scala Developer - Apache Spark - Urgent Requirement. Contract Length: 6 Months. IR35 Status: Inside. Location: London - Hybrid working. A Senior Scala Developer with experience in Apache Spark is needed for a British consultancy organisation. You will be an integral member of the team, providing technical expertise to the …