Founding Data Engineer – Central London, 3 days a week, hybrid. NB – we are looking for someone with specific experience in fintech, finance, banking, eCommerce transactions, insurance, payments, fraud, consumer scams or related areas. Please only apply if you have specific …
with structured and unstructured data, and curate data to provide real-time contextualized insights. Manage the full data lifecycle; experience using Microsoft Azure, SQL Server, the Hadoop ecosystem, Spark, and Kafka, and building capabilities to host a wider set of technologies. When the team expands, mentoring of new data team members. Adopt … Skills & Experience you will have: Experience working successfully within a start-up or scale-up business. Previous work with big data tech such as Hadoop, Spark, Kafka. Good working knowledge of the use of containers, including Docker and Kubernetes, and experience working on the Microsoft Azure platform. Coding/…
Company Description Petrolink is a global independent and neutral wellsite data solutions company that provides services in major oil and gas regions worldwide. Our specialties include visualization, data analytics, and data interoperability. Our technologies and services drive down the cost …
Data and Artificial Intelligence, Senior Vice President We are searching for a Senior Vice President of Data and Artificial Intelligence – someone with hands-on experience designing AI solutions to solve complex business problems. Your new role is a leadership position …
infrastructure and/or use case scaling, particularly in a production environment. Familiarity with DevOps principles and practices. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms is of benefit. The ability to communicate in English with business fluency; German language skills are a plus. …
and analytical abilities. Preferred Skills: Experience with cloud databases (e.g., AWS, Azure, Google Cloud Platform). Knowledge of big data tools and frameworks (e.g., Hadoop, Spark). Certification in database management (e.g., Microsoft Certified: Azure Data Engineer Associate). What We Offer: Competitive salary and comprehensive benefits package. Opportunity …
CD tooling Scripting experience (Python, Perl, Bash, etc.) ELK (Elastic stack) JavaScript Cypress Linux experience Search engine technology (e.g., Elasticsearch) Big Data Technology experience (Hadoop, Spark, Kafka, etc.) Microservice and cloud native architecture Desirable Skills Able to demonstrate experience of troubleshooting and diagnosis of technical issues. Able to demonstrate …
CLI and IAM etc. (required) Experience with distributed message brokers using Kafka (required) Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink etc. (required) Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm …
AWS SageMaker, or Azure Machine Learning for model development and deployment. Data Analytics and Big Data Technologies: Proficient in big data technologies such as Hadoop, Spark, and Kafka for handling large datasets. Experience with data visualization tools like Tableau, Power BI, or Qlik for deriving actionable insights from data. …
Scala Data Engineer (Cloudera, Hadoop and CI/CD) – Banking Client – Brussels. Duration: 1 year freelance contract. Rate: €500 – €800 per day. Hybrid working. INSIDE OF IR35. You will join the AIR (Analytics, Insight and Reporting) tribe in the GDC division. You will join our dedicated in-house team … and frameworks. You ideally have knowledge of English; knowledge of other European languages is a plus. Nice to haves: You know that Cloudera, Hadoop and CI/CD aren't popular video games. You have heard of tools like Apache Spark, Impala and/or Kafka. You have …
problems and solutions. Advanced experience developing business deliverables that leverage business intelligence platforms, data management platforms, or SQL-based languages (Tableau, Business Objects, Snowflake, Hadoop, Netezza, NoSQL, ANSI SQL, or related). Proven, expert ability to use or build business knowledge through meaningful partnerships at the individual contributor, leadership …
responsibility (e.g. P&C, Bank, Finance, etc.) Working knowledge of business intelligence platforms, data management platforms, or SQL-based languages (Tableau, Business Objects, Snowflake, Hadoop, Netezza, NoSQL, ANSI SQL, or related). Ability to build business knowledge through meaningful partnerships at the individual contributor and leadership levels. Proven critical …
business problems and solutions. Experience developing business deliverables that leverage business intelligence platforms, data management platforms, or SQL-based languages (Tableau, Business Objects, Snowflake, Hadoop, Netezza, NoSQL, ANSI SQL, or related). Proven ability to build business knowledge through meaningful partnerships at the individual contributor, leadership, and EMG levels. …
Senior Data Engineer - Python/Hadoop/Spark - sought by leading investment bank based in London - Hybrid - contract *inside IR35 - umbrella* Key Responsibilities: Design and implement scalable data pipelines that extract, transform and load data from various sources into the data lakehouse. Help teams push the boundaries of analytical … ETL processes, and data warehousing. Significant exposure to and hands-on experience with at least 2 of the programming languages Python, Java, Scala, GoLang. Significant experience with Hadoop, Spark and other distributed processing platforms and frameworks. Experience working with open table/storage formats like Delta Lake, Apache Iceberg or Apache Hudi. …
would be an advantage. Data visualization – tools like Tableau. Master data management (MDM) – concepts and expertise in tools like Informatica & Talend MDM. Big data – Hadoop ecosystem, distributions like Cloudera/Hortonworks, Pig and Hive. Data processing frameworks – Spark & Spark Streaming. Hands-on experience with multiple databases like PostgreSQL …