Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such as Spark, Hadoop, or Kafka, is a plus. Strong problem-solving skills and the ability to work in a collaborative team environment. Excellent verbal and written communication …
data modeling. Experience with relational and NoSQL databases such as Oracle, Sybase, PostgreSQL, SQL Server, and MongoDB. Familiarity with big data platforms (e.g., Hadoop, Snowflake). Prior experience with ETL tools or as a SQL developer. Proficiency in Python for data engineering and Tableau for reporting and …
AWS Certified Data Engineer, AWS Certified Data Analytics, or AWS Certified Solutions Architect. Experience with big data tools and technologies like Apache Spark, Hadoop, and Kafka. Knowledge of CI/CD pipelines and automation tools such as Jenkins or GitLab CI. About Adastra: For more than 25 years …
roles, with 5+ years in leadership positions. Expertise in modern data platforms (e.g., Azure, AWS, Google Cloud) and big data technologies (e.g., Spark, Kafka, Hadoop). Strong knowledge of data governance frameworks, regulatory compliance (e.g., GDPR, CCPA), and data security best practices. Proven experience in enterprise-level architecture design …
Data Platforms: Warehouses: Snowflake, Google BigQuery, or Amazon Redshift. Analytics: Tableau, Power BI, or Looker for client reporting. Big Data: Apache Spark or Hadoop for large-scale processing. AI/ML: TensorFlow or Databricks for predictive analytics. Integration Technologies: API Management: Apigee, AWS API Gateway, or MuleSoft. Middleware …
experience in data engineering, including working with AWS services. Proficiency in AWS services like S3, Glue, Redshift, Lambda, and EMR. Knowledge of Cloudera-based Hadoop is a plus. Strong ETL development skills and experience with data integration tools. Knowledge of data modeling, data warehousing, and data transformation techniques. Familiarity …
Service Catalogue, CloudFormation, Lake Formation, SNS, SQS, EventBridge. Language & Scripting: Python and Spark. ETL: DBT. Good to Have: Airflow, Snowflake, Big Data (Hadoop), and Teradata. Responsibilities: Serve as the primary point of contact for all AWS-related data initiatives and projects. Responsible for leading a team of …
Experience in commodities markets or broader financial markets. Knowledge of quantitative modeling, risk management, or algorithmic trading. Familiarity with big data technologies like Kafka, Hadoop, Spark, or similar. Why Work With Us? Impactful Work: Directly influence the profitability of the business by building technology that drives trading decisions. Innovative …
contributions to the delivery process, manage tasks, and update teams on progress. Skills & Experience: Proven experience as a Data Engineer with expertise in Databricks, Hadoop/Spark. Strong programming skills in Python, Scala, or SQL, with knowledge of CI/CD platforms. Proficiency with distributed computing frameworks and cloud …
London, South East England, United Kingdom Hybrid / WFH Options
Randstad Digital UK
AWS Databases: MSSQL, PostgreSQL, MySQL, NoSQL. Cloud: AWS (preferred), with working knowledge of cloud-based data solutions. Nice to Have: Experience with graph databases, Hadoop/Spark, or enterprise data lake environments. What You’ll Bring: Strong foundation in computer science principles (data structures, algorithms, etc.). Experience building enterprise …
Richmond, North Yorkshire, Yorkshire, United Kingdom
Datix Limited
programming languages, specifically Python and SQL. Expertise in data management, data architecture, and data visualization techniques. Experience with data processing frameworks like Apache Spark, Hadoop, or Flink. Strong understanding of database systems (SQL and NoSQL) and data warehousing technologies. Familiarity with cloud computing platforms (AWS, Azure) and data security …
or Amazon QuickSight. Programming Languages: Familiarity with Python or R for data manipulation and analysis. Big Data Technologies: Experience with big data technologies like Hadoop or Spark. Data Governance: Understanding of data governance and data quality management. A Bit About Us: When it comes to appliances and electricals, we …
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Yelp USA
recommending scalable, creative solutions. Exposure to some of the following technologies: Python, AWS Redshift, AWS Athena/Apache Presto, Big Data technologies (e.g., S3, Hadoop, Hive, Spark, Flink, Kafka, etc.), NoSQL systems like Cassandra; DBT is nice to have. What you'll get: Full responsibility for projects from day …
frameworks like TensorFlow, Keras, or PyTorch. Knowledge of data analysis and visualization tools (e.g., Pandas, NumPy, Matplotlib). Familiarity with big data technologies (e.g., Hadoop, Spark). Excellent problem-solving skills and attention to detail. Ability to work independently and as part of a team. Preferred Qualifications: Experience with …
London, South East England, United Kingdom Hybrid / WFH Options
JSS Search
and the ability to work in a fast-paced, collaborative environment. Strong communication and interpersonal skills. Preferred Skills: Experience with big data technologies (e.g., Hadoop, Spark). Knowledge of machine learning and AI integration with data architectures. Certification in cloud platforms or data management.
Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools: Experience with big data tools: Hadoop, Spark, Kafka, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools: Azkaban, Luigi …
Knowledge of cloud platforms (e.g., Azure). Familiarity with containerization is a plus (e.g., Docker, Kubernetes). Knowledge of big data technologies (e.g., Hadoop, Spark). Knowledge of data lifecycle management. Strong problem-solving skills and attention to detail. Ability to work in an agile development environment. Excellent …
background in agile delivery and effort estimation. Familiarity with Python and data libraries. Excellent communication and problem-solving skills. Nice to Have: Experience with Hadoop, Jenkins. Cloud certifications (Azure or AWS). Basic knowledge of Java. This is a 6-month rolling contract with a daily rate of up to £500. Sponsorship …
Engineering, Mathematics, or related field. - Proven experience (5+ years) in developing and deploying data engineering pipelines and products - Strong proficiency in Python - Experienced in Hadoop, Kafka, or Spark - Experience leading/mentoring junior team members - Strong communication and interpersonal skills, with the ability to effectively communicate complex technical concepts …
information with attention to detail and accuracy. Adept at queries, report writing, and presenting findings. Experience working with large datasets and distributed computing tools (Hadoop, Spark, etc.). Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests, etc.). Experience with data profiling tools and processes.
with source control tools (e.g., Git) and CI/CD pipelines. Desirable Skills: Familiarity with big data or NoSQL technologies (e.g., MongoDB, Cosmos DB, Hadoop). Exposure to data analytics tools (Power BI, Tableau) or machine learning workflows. Knowledge of data governance, GDPR, and data compliance practices. Why Join …
data governance. Cloud Computing: AWS, Azure, Google Cloud for scalable data solutions. API Strategy: Robust APIs for seamless data integration. Data Architecture: Finbourne LUSID, Hadoop, Spark, Snowflake for managing large volumes of investment data. Cybersecurity: Strong data security measures, including encryption and IAM. AI and Machine Learning: Predictive analytics …
architecture, etc. Cloud Computing: AWS, Azure, Google Cloud for scalable data solutions. API Strategy: Robust APIs for seamless data integration. Data Architecture: Finbourne LUSID, Hadoop, Spark, Snowflake for managing large volumes of investment data. Cybersecurity: Strong data security measures, including encryption and IAM. AI and Machine Learning: Predictive analytics …
technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.). Preferred qualifications, capabilities, and skills: Knowledge of AWS. Knowledge of Databricks. Understanding of Cloudera Hadoop, Spark, HDFS, HBase, and Hive. Understanding of Maven or Gradle. About the Team: J.P. Morgan is a global leader in financial services, providing strategic advice …