large-scale data migration efforts. 5. (Mandatory) Demonstrated experience with database architecture, performance design methodologies, and system-tuning recommendations. Preference for familiarity with Glue, Hive, and Iceberg or similar 6. (Mandatory) Demonstrated experience with Python, Bash, and Terraform 7. (Mandatory) Demonstrated experience with DevSecOps solutions and tools 8. (Mandatory … Governance concepts and experience. 11. (Desired) Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache Nifi and other ETL tools. 12. (Desired) Demonstrated experience with Apache Spark
London, England, United Kingdom Hybrid / WFH Options
Timely Find
languages and carrying out data analysis and hypothesis testing - Advanced SQL OR Python. Experience with big data technologies and data platforms - we use BigQuery, Apache Ibis, SQLGlot, DBT. You might have experience with Hadoop, Hive, Redshift, Snowflake, Spark or similar. Experience with Version control/CI/CD
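The "data analysis and hypothesis testing" skill this listing asks for can be illustrated with a minimal, stdlib-only sketch: a two-sample permutation test on the difference of means. The dataset and variable names are invented for illustration; real work would typically use SciPy or a similar library.

```python
import random
import statistics

def permutation_test(a, b, n_resamples=10_000, seed=0):
    """Approximate two-sided p-value for the difference of means,
    estimated by randomly reshuffling group labels (pure stdlib sketch)."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)
        perm_a, perm_b = pooled[:len(a)], pooled[len(a):]
        if abs(statistics.mean(perm_a) - statistics.mean(perm_b)) >= observed:
            hits += 1
    return hits / n_resamples

# Invented measurements for two clearly separated groups.
control = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]
variant = [12.9, 13.1, 12.7, 13.0, 12.8, 13.2]
p = permutation_test(control, variant)
print(p)  # expected to be well below 0.05 for groups this separated
```

The same question could be answered in SQL by computing group means, but the resampling step is where a general-purpose language earns its keep.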
London, England, United Kingdom Hybrid / WFH Options
Merantix
Preferred Qualifications Hands-on experience with: Distributed computing frameworks, such as Ray Data and Spark. Databases and/or data warehousing technologies, such as Apache Hive. Data transformation via SQL and DBT. Orchestration platforms, such as Apache Airflow. Data catalogs and metadata management tools. Vector data stores.
London, England, United Kingdom Hybrid / WFH Options
Autodesk
code, architectures, and experiments Relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra) Frameworks such as Ray Data, Metaflow, Hadoop, Spark, or Hive Vector data stores Preferred Qualifications Experience with computational geometry such as mesh or boundary representation data processing. Experience with CAD model search and retrieval, in PLM systems …
London, England, United Kingdom Hybrid / WFH Options
Autodesk
Software Development Engineer, Data, London. Client: Autodesk. Location: London, United Kingdom. Job Category: Other. EU work permit required: Yes. Job Reference: 61079ee17325
Business Research Analyst - II, RBS Returns Reduction As a Research Analyst, you'll collaborate with experts to develop ML models leveraging big data solutions and Large Language Models (LLMs) for business needs. You'll drive product pilots, demonstrating innovative thinking
knowledge of warehousing and ETLs. Extensive knowledge of popular database providers such as SQL Server, PostgreSQL, Teradata and others. • Proficiency in technologies in the Apache Hadoop ecosystem, especially Hive, Impala and Ranger • Experience working with open file and table formats such as Parquet, AVRO, ORC, Iceberg and Delta Lake
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
eFinancialCareers
as ITRS Geneos, AppDynamics. Good experience with log aggregation tools such as ELK, Splunk, Grafana (GEM) is preferred. Experience working with Oracle Database, Hadoop, Apache Spark, Hive, Starburst. Experience with middleware solutions such as Tibco EMS, Kafka. Good written and verbal communication skills. What we can offer you
large-scale data migration efforts. 5. (Mandatory) Demonstrated experience with database architecture, performance design methodologies, and system-tuning recommendations. Preference for familiarity with Glue, Hive, and Iceberg or similar 6. (Mandatory) Demonstrated experience with Python, Bash, and Terraform 7. (Mandatory) Demonstrated experience with DevSecOps solutions and tools 8. (Mandatory … Governance concepts and experience. 11. (Desired) Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache Nifi and other ETL tools. 12. (Desired) Demonstrated experience with Apache Spark. MUST be a US Citizen with a U.S. Government clearance - Intel
and verbal communication skills High level of competence in Python, Spark, and Unix/Linux scripts Demonstrable experience using distributed systems (for example, Hadoop, Hive, Impala) Extensive experience with SAS/SQL/Hive for extracting and aggregating data Experience with time series modelling problems Deep learning experience
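The "extracting and aggregating data" requirement above is the everyday shape of SQL/Hive work. A minimal sketch using Python's built-in sqlite3 as a stand-in for a Hive table (the schema and rows are invented; the same GROUP BY shape would run, with minor dialect changes, in HiveQL):

```python
import sqlite3

# In-memory database standing in for a Hive/Impala table; data is invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL, day TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", 10.0, "2024-01-01"), ("u1", 5.0, "2024-01-02"),
     ("u2", 7.5, "2024-01-01"), ("u2", 2.5, "2024-01-01")],
)

# Extract-and-aggregate: event counts and spend totals per user.
rows = conn.execute(
    """
    SELECT user_id, COUNT(*) AS n_events, SUM(amount) AS total
    FROM events
    GROUP BY user_id
    ORDER BY user_id
    """
).fetchall()
print(rows)  # [('u1', 2, 15.0), ('u2', 2, 10.0)]
```

On Hive the table would live in HDFS and the query would fan out across the cluster, but the analyst-facing SQL is essentially identical.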
large-scale data migration efforts. 5. (Mandatory) Demonstrated experience with database architecture, performance design methodologies, and system-tuning recommendations. Preference for familiarity with Glue, Hive, and Iceberg or similar 6. (Mandatory) Demonstrated experience with Python, Bash, and Terraform 7. (Mandatory) Demonstrated experience with DevSecOps solutions and tools 8. (Mandatory … Governance concepts and experience. 11. (Desired) Demonstrated experience maintaining, supporting, and improving the ETL process through the implementation and standardization of data flows with Apache Nifi and other ETL tools. 12. (Desired) Demonstrated experience with Apache Spark. B4CORP Company Information B4Corp is a small defense contracting company that
learning, and AI workloads using tools like Jupyter, Spacy, Transformers, and NLTK. Big Data Platforms: Utilize big data NoSQL engines and platforms such as Hive, Impala, and Elasticsearch for data storage and processing. BI and Visualization: Implement and support business intelligence and visualization tools like Tableau, Kibana, and PowerBI … data science, machine learning, and AI tools such as Jupyter, Spacy, Transformers, and NLTK. Experience with big data NoSQL engines/platforms such as Hive, Impala, and Elasticsearch. Proficiency with business intelligence and visualization tools like Tableau, Kibana, and PowerBI. Excellent communication and collaboration skills. Preferred Qualifications: Certification in
Skills and Experience Work experience in data science, machine learning, and business analytics. Practical experience in coding languages such as Python, R, Scala, etc. (Python preferred). Proficiency in database technologies such as SQL, ETL, NoSQL, Data Warehousing, and
Business Research Analyst - II, RBS Tech As a Research Analyst, you'll collaborate with experts to develop cutting-edge ML solutions for business needs. You'll drive product pilots, demonstrating innovative thinking and customer focus. You'll build scalable solutions
and Solution Architect teams to design the overall solution architecture for end-to-end data flows. Utilize big data technologies such as Cloudera, Hue, Hive, HDFS, and Spark for data processing and storage. Ensure smooth data management for marketing consent and master data management (MDM) systems. Key Skills and … delivery for streamlined development workflows. Azure Data Factory/Databricks: Experience with these services is a plus for handling complex data processes. Cloudera (Hue, Hive, HDFS, Spark): Experience with these big data tools is highly desirable for data processing. Azure DevOps, Vault: Core skills for working in Azure cloud
and-Spoke), security implementations (IAM, Secret Manager, firewalls, Identity-Aware Proxy), DNS configuration, VPN, and Load Balancing. Data Processing & Transformation: Utilize Hadoop cluster with Hive for querying data, and PySpark for data transformations. Implement job orchestration using Airflow. Core GCP Services Management: Work extensively with services like Google Kubernetes …/CD for automation Deep knowledge of network architectures, security implementations, and management of core GCP services Proficiency in employing data processing tools like Hive, PySpark, and data orchestration tools like Airflow Familiarity with managing and integrating diverse data sources Certified GCP Cloud Architect and Data Engineer Additional Information
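The "job orchestration using Airflow" duty above boils down to running tasks in dependency order over a DAG. A stdlib-only sketch of that core idea using `graphlib` (this is not Airflow itself, and the task names are invented; an Airflow DAG adds scheduling, retries, and operators on top of the same structure):

```python
from graphlib import TopologicalSorter

# Invented pipeline: each task maps to the set of tasks it depends on,
# mirroring the Hive-query -> PySpark-transform flow described above.
dag = {
    "extract_hive": set(),
    "transform_pyspark": {"extract_hive"},
    "load_warehouse": {"transform_pyspark"},
    "publish_report": {"load_warehouse"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
print(order)
# ['extract_hive', 'transform_pyspark', 'load_warehouse', 'publish_report']
```

An orchestrator like Airflow persists this ordering, schedules it, and retries failed nodes, but the topological sort is the invariant underneath.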
Amazon Selection Monitoring Team Job Description Amazon’s Selection Monitoring team is responsible for expanding the largest catalog on the planet. Our systems process billions of products to algorithmically identify products not yet sold on Amazon and programmatically add them
We are seeking a specialist Kotlin Developer with experience working on Big Data projects in a high-performance environment. We're working with banks and other major financial institutions on projects where microseconds count. Essential functions You will build and
London, England, United Kingdom Hybrid / WFH Options
Nuvance Health
member of an agile feature team Helping maintain code quality via code reviews Skill Requirements Proficiency in administering Big Data technologies (Hadoop, HDFS, Spark, Hive, Yarn, Oozie, Kafka, HBase, Apache stack) Proficiency in defining highly scalable Platform Architecture. Architectural Design Patterns, Highly optimized, Low latency and Massively Scalable
random forests, clustering), statistical programming languages (SAS, R, Python, Matlab) and big data tools (Hadoop, Hive, etc.). - Strong ability to learn quickly. - Able to integrate easily into multidisciplinary teams. - Solid academic record. - Strong computer skills. What we offer We offer you the possibility to join a firm … Knowledge of modelling techniques (logit, GLM, time series, random forests, clustering), statistical programming languages (SAS, R, Python, Matlab) and big data tools (Hadoop, Hive, etc.) is desirable. - Solid academic record. - Postgraduate and/or specialization courses are a plus, especially in Data Science, Quantitative Finance
We’re looking for a Senior Security Specialist to join us in Reading. Reporting into our Senior CERT Manager, you will help continue to mature the way in which the organisation