all solutions and processes. Technical Skills Programming: Proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: Hands-on experience with big data tools (e.g., Databricks, Apache Spark, Hadoop). Cloud Platforms: Familiarity with AWS, Azure, GCP, or other cloud ecosystems for data engineering tasks. Databases: Expertise in relational databases (e.g., PostgreSQL, SQL Server). Data Integration Tools: Knowledge of …
native environments. · Familiarity with containerization (Docker, Kubernetes) and DevOps pipelines. · Exposure to Security Operations Center (SOC) tools and SIEM platforms. · Experience working with big data platforms such as Spark, Hadoop, or the Elastic Stack.
e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Experience in Computer Science, Engineering, Mathematics, or a related field, and expertise in technology disciplines. Exposure to big data frameworks (e.g., Spark, Hadoop) used for scalable distributed processing. Ability to collaborate effectively with Data Scientists to translate analytical insights into technical solutions. Preferred Qualifications, Capabilities, and Skills: Familiarity with NoSQL …
the following architectural frameworks (TOGAF, Zachman, FEAF). Cloud experience: AWS or GCP preferred, particularly around migrations and cloud architecture. Good technical knowledge and understanding of big data frameworks such as Hadoop and Cloudera. Deep technical knowledge of database development, design, and migration. Experience of cloud deployment using Terraform or CloudFormation. Automation or scripting experience using languages such as Python … monitoring of hybrid on-premise and cloud data solutions. Working with a variety of enterprise-level organisations to understand and analyse existing on-prem environments such as Oracle, Teradata, and Hadoop, and to design and plan migrations to AWS or GCP. Deep understanding of high- and low-level designs and architecture solutions. Developing database scripts to migrate …
/mathematical software (e.g., R, SAS, or MATLAB) - Experience with statistical models (e.g., multinomial logistic regression) - Experience in data applications using large-scale distributed systems (e.g., EMR, Spark, Elasticsearch, Hadoop, Pig, and Hive) - Experience working collaboratively with data engineers and business intelligence engineers - Demonstrated expertise in a wide range of ML techniques PREFERRED QUALIFICATIONS - Experience as a leader and …
and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile). Preferred qualifications, capabilities, and skills: Knowledge of AWS. Knowledge of Databricks and PySpark. Understanding of Cloudera Hadoop, Spark, HDFS, HBase, and Hive.
applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile). Preferred qualifications, capabilities, and skills: Knowledge of AWS. Knowledge of Databricks. Understanding of Cloudera Hadoop, Spark, HDFS, HBase, and Hive. Please note that if you are not a passport holder of the country of the vacancy, you may need a work permit.
Livingston, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
team! You will develop scalable data pipelines, ensure data quality, and support business decision-making with high-quality datasets. - Work across the technology stack: SQL, Python, ETL, BigQuery, Spark, Hadoop, Git, Apache Airflow, Data Architecture, Data Warehousing - Design and develop scalable ETL pipelines to automate data processes and optimize delivery - Implement and manage data warehousing solutions, ensuring data integrity …
skills Formal training or certification in software engineering concepts and applied experience. Experience dealing with large amounts of data; data engineering skills are desired. Proven experience with Spark, Hadoop, Databricks, and Snowflake. Hands-on practical experience delivering system design, application development, testing, and operational stability. Advanced in one or more programming languages (e.g., Java or Python). Advanced …
expertise and technical acumen to ensure successful delivery of complex data projects on time and within budget. Key Responsibilities: Project Management: Lead and manage legacy data platform migrations (Teradata, Hadoop), data lake builds, and data analytics projects from initiation to completion. Develop comprehensive project plans, including scope, timelines, resource allocation, and budgets. Monitor project progress, identify risks, and implement …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
excellent project management skills and technical experience to ensure successful delivery of complex data projects on time and within budget. Responsibilities: Lead and manage legacy data platform migrations (Teradata, Hadoop), data lake builds, and data analytics projects from initiation to completion. Develop comprehensive project plans, including scope, timelines, resource allocation, and budgets. Monitor project progress, identify risks, and implement …
solutions to ensure design constraints are met by the software team. Ability to initiate and implement ideas to solve business problems. Preferred qualifications, capabilities, and skills: Knowledge of HDFS, Hadoop, and Databricks. Knowledge of Airflow and Control-M. Familiarity with containers and container orchestration such as ECS, Kubernetes, and Docker. Familiarity with troubleshooting common networking technologies and issues. About Us: J.P. …
enterprise-level systems; Excellent object-oriented design skills, including OOA/OOD; Experience with multi-tier architectures and service-oriented architecture; Exposure to and understanding of RDBMS, NoSQL, and Hadoop is desirable; Knowledge of the software development lifecycle and agile practices, including TDD/BDD; Strategic thinking, collaboration, and consensus-building skills. Please note: Familiarity with DevOps is important …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
backend focus. Proficiency in Java (required) and at least one other server-side language. Solid hands-on experience with microservices and distributed systems. Familiarity with data technologies such as MySQL, Hadoop, or Cassandra. Experience working with AWS services (RDS, EC2, Step Functions, Kinesis) is a plus. Strong background in testing, KPIs/SLOs, and performance optimization. Prior exposure to compliance …