data visualization tools. Strong analytical and problem-solving skills. Knowledge of social media analytics and user behavior. Familiarity with big data technologies like Hadoop, Spark, and Kafka. Knowledge of AWS machine learning services like SageMaker and Comprehend. Understanding of data governance and security in AWS. Excellent communication skills and attention to detail.
analytics on AWS platforms. Experience in writing efficient SQL and implementing complex ETL transformations on big data platforms. Experience with Big Data technologies (Spark, Impala, Hive, Redshift, Kafka, etc.). Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues. Experience with …
machine learning, mobile, etc.) Experience in Computer Science, Engineering, Mathematics, or a related field and expertise in technology disciplines. Exposure to big data frameworks (Spark, Hadoop, etc.) used for scalable distributed processing. Ability to collaborate effectively with Data Scientists to translate analytical insights into technical solutions. Preferred Qualifications, Capabilities, and Skills …
containerization (Docker, Kubernetes) and DevOps pipelines. · Exposure to security operations center (SOC) tools and SIEM platforms. · Experience working with big data platforms such as Spark, Hadoop, or Elastic Stack.
e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred qualifications, capabilities, and skills: Knowledge of AWS. Knowledge of Databricks, PySpark. Understanding of Cloudera Hadoop, Spark, HDFS, HBase, Hive. Understanding of Maven or Gradle. About the Team: J.P. Morgan is a global leader in financial services, providing strategic advice and …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
ML code. LLMOps & AI Model Management: Experience with tools like MLflow, LangChain, Hugging Face, Kubeflow, or similar platforms. Data Processing: Proficient with Databricks/Spark for large-scale AI data processing. SQL: Strong capabilities in data querying and preparation. Data Architectures: Understanding of modern data infrastructure (lakehouses, data lakes, …)
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Waracle
etc.) Familiarity with Python for AI/ML development is a significant advantage. Experience with data engineering pipelines or big data technologies (e.g., Kafka, Spark) is a plus. The recruitment process you can expect for this role is an initial call with your dedicated Talent Acquisition Partner who will …
certification on software engineering concepts and applied experience. Experience in dealing with large amounts of data; Data Engineering skills are desired. Proven experience in Spark, Hadoop, Databricks, and Snowflake. Hands-on practical experience delivering system design, application development, testing, and operational stability. Advanced in one or more programming languages …
these issues. You will be required to leverage the company's extensive data assets from both internal and external sources using tools like Python, Spark, and AWS. Additionally, your role will encompass extracting business insights from technical results and effectively communicating them to a non-technical audience. Job Responsibilities …
ML applications or models for these challenges. You will leverage the firm's extensive data resources from both internal and external sources using Python, Spark, and AWS, among other systems. You are expected to extract business insights from technical results and effectively communicate them to a non-technical audience.
data architecture, including data modeling, warehousing, real-time and batch processing, and big data frameworks. Proficiency with modern data tools and technologies such as Spark, Databricks, Kafka, or Snowflake (bonus). Knowledge of cloud security, networking, and cost optimization as it relates to data platforms. Experience in total cost …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
software practices (SCRUM/Agile, microservices, containerization like Docker/Kubernetes). We'd also encourage you to apply if you possess: Experience with Spark/Databricks. Experience deploying ML models via APIs (e.g., Flask, Keras). Startup experience or familiarity with geospatial and financial data. The Interview Process …
of agile methodologies, including CI/CD, application resiliency, and security. Additional Qualifications, Capabilities, and Skills: Experience with big data technologies such as Hadoop, Spark, or Kafka. Familiarity with tools like GitHub Copilot or Codeium. Knowledge or practical experience with cloud technologies. Understanding of orchestration technologies like Prefect, Airflow …
Skills: Experience as a full stack developer, including proficiency in front-end technologies such as React. Proficiency in big data technologies such as Hadoop, Spark, or Kafka for handling large-scale data processing. Experience using tools like GitHub Copilot or Codeium. In-depth knowledge of the financial services industry …
end ownership • Python or similar (Ruby or Node) or another functional language • JavaScript and associated frameworks, preferably Vue, or similar • Cloud technologies • SQL (advantageous) • Spark (advantageous) • Docker/Kubernetes (advantageous) • MongoDB, SQL, Postgres & Snowflake (advantageous) • Developing online, cloud-based SaaS products • Leading and building scalable architectures and distributed systems …
Functional Relevance of Work Experience: 15+ years of experience as an Enterprise Architect. 5+ years of industry exposure to the Government Sector. Certification across Spark and TOGAF. Other Desired Skills: Skilled in generating business cases and canned presentation material that can be reused across multiple customers. Strong leadership abilities …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Canonical
solutions such as OpenStack, MicroCloud, and Ceph, and solutions that could be deployed either on-premises or in public clouds, such as Kubernetes, Kubeflow, Spark, PostgreSQL, etc. The team works hands-on with the technologies by deploying, testing, and handing over the solution to our support or managed services …
could miss this opportunity! Get in touch by contacting me at j.shaw-bollands@tenthrevolution.com or on 0191 338 6641! Keywords: Big Data, Hadoop, Scala, Spark, AWS, Migration, Data Engineer, Consultancy, Banking, Finance
Job Description: We are looking for a skilled ETL Developer with hands-on experience in Talend, Python, and Spark to join our data engineering team. The ideal candidate will be responsible for designing, building, and maintaining ETL pipelines that support data extraction, transformation, and loading from various sources into target … proactively resolve data inconsistencies. · Participate in troubleshooting and performance tuning of ETL jobs and workflows. Required Skills & Qualifications: · Proven experience with Talend, Python, and Apache Spark. · Strong understanding of relational databases and Big Data ecosystems (Hive, Impala, HDFS). · Solid experience in data warehousing and data modelling techniques. · Familiarity …
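To give a sense of the extract-transform-load work this last posting describes, below is a minimal PySpark sketch of such a pipeline. It is not taken from any of the listings above; the table names, paths, and columns are hypothetical, and a production Talend/Spark job would add orchestration, error handling, and data-quality checks on top of this.

# Minimal illustrative ETL sketch in PySpark (hypothetical tables and columns).
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("example-etl")   # hypothetical job name
    .enableHiveSupport()      # read/write Hive tables, as in a Hive/Impala/HDFS ecosystem
    .getOrCreate()
)

# Extract: read a raw staging table (name is hypothetical).
raw = spark.table("staging.orders_raw")

# Transform: basic cleansing plus an aggregate, standing in for more complex transformations.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_amount").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

daily_totals = (
    clean.groupBy("order_date", "customer_id")
         .agg(F.sum("order_amount").alias("total_amount"),
              F.count("order_id").alias("order_count"))
)

# Load: write the result to a target table, partitioned by date.
(daily_totals.write
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.daily_order_totals"))

spark.stop()

The same extract, transform, and load stages could equally be generated as Talend jobs or scheduled through an orchestrator; the sketch only illustrates the shape of the pipeline work mentioned in the advert.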