Trout Lake, Washington, United States Hybrid / WFH Options
Centene
… tools development
- Data warehouse and data mart design and development
- ETL (Extract, Transform, Load) processes
- Data governance and compliance
- Proficiency in SQL and Python
- Knowledge of Big Data technologies (Hadoop, Spark)
- Cloud computing (AWS, Azure, GCP)
- Data modeling and architecture
- Advanced analytics and predictive modeling
- Knowledge of data privacy laws and regulations
- Proficiency in BI tools (Tableau, Power BI)
- Strong …
The same Centene role, with identical requirements, is also listed for these Washington, United States locations (Hybrid / WFH Options): Palisades, Auburn, Quincy, Ephrata, George, Lincoln, Rock Island, and White Salmon.
… real-time data pipelines for processing large-scale data
- Experience with ETL processes for data ingestion and processing
- Proficiency in Python and SQL
- Experience with big data technologies like Apache Hadoop and Apache Spark
- Familiarity with real-time data processing frameworks such as Apache Kafka or Flink
- MLOps & Deployment: experience deploying and maintaining large-scale ML inference pipelines into production …
… Snowflake
- Understanding of cloud platform infrastructure and its impact on data architecture
- Data Technology Skills: a solid understanding of big data technologies such as Apache Spark, and knowledge of Hadoop ecosystems
- Knowledge of programming languages such as Python, R, or Java is beneficial
- Exposure to ETL/ELT processes, SQL, and NoSQL databases is a nice-to-have, providing a …
… tools and libraries (e.g. Power BI)
- Background in database administration or performance tuning
- Familiarity with data orchestration tools, such as Apache Airflow
- Previous exposure to big data technologies (e.g. Hadoop, Spark) for large data processing
- Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions
- Strong communication …
… 7+ years in data architecture and solution design, and a history of large-scale data solution implementation
- Technical Expertise: deep knowledge of data architecture principles, big data technologies (e.g. Hadoop, Spark), and cloud platforms like AWS, Azure, or GCP
- Data Management Skills: advanced proficiency in data modelling, SQL/NoSQL databases, ETL processes, and data integration techniques
- Programming & Tools: …
… familiarity with Data Mesh, Data Fabric, and product-led data strategies
- Expertise in cloud platforms (AWS, Azure, GCP, Snowflake)
- Technical Skills: proficiency in big data tools (Apache Spark, Hadoop)
- Programming knowledge (Python, R, Java) is a plus
- Understanding of ETL/ELT, SQL, NoSQL, and data visualisation tools
- Awareness of ML/AI integration into data architectures …
… Docker
- Experience with NLP and/or computer vision
- Exposure to cloud technologies (e.g. AWS and Azure)
- Exposure to Big Data technologies
- Exposure to Apache products, e.g. Hive, Spark, Hadoop, NiFi
- Programming experience in other languages
This is not an exhaustive list, and we are keen to hear from you even if you don't tick every box. The …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
… clear communicator, able to translate complex data concepts to cross-functional teams
Bonus points for experience with:
- DevOps tools like Docker, Kubernetes, CI/CD
- Big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams
- Genomic data formats and tools
- Cold and hot storage management, ZFS/RAID systems, or tape storage
- AI/LLM tools to …
The same ADLIB Recruitment role, with identical requirements, is also listed for Bristol, Avon, South West, United Kingdom (Hybrid / WFH Options).
… and experience
What We're Looking For
- Strong hands-on experience with Python, Java, or Scala
- Proficiency in cloud environments (AWS, Azure, or GCP) and big data tech (Spark, Hadoop, Airflow)
- Solid understanding of SQL, ETL/ELT approaches, and data modelling techniques
- Experience building CI/CD pipelines with tools like Jenkins or CircleCI
- Knowledge of data security …
… flow diagrams, and process documentation.
MINIMUM QUALIFICATIONS/SKILLS
- Proficiency in Python and SQL
- Experience with cloud platforms like AWS, GCP, or Azure, and big data technologies such as Hadoop or Spark
- Experience working with relational and NoSQL databases
- Strong knowledge of data structures, data modeling, and database schema design
- Experience supporting data science workloads with structured and unstructured …
… projects
Skills & Experience:
- Proven experience as a Lead Data Solution Architect in consulting environments
- Expertise in cloud platforms (AWS, Azure, GCP, Snowflake)
- Strong knowledge of big data technologies (Spark, Hadoop), ETL/ELT, and data modelling
- Familiarity with Python, R, Java, SQL, NoSQL, and data visualisation tools
- Understanding of machine learning and AI integration in data architecture
- Experience with …
… Infrastructure as Code (IaC) and deploying infrastructure across environments
- Managing cloud infrastructure with a DevOps approach
- Handling and transforming various data types (JSON, CSV, etc.) using Apache Spark, Databricks, or Hadoop
- Understanding modern data system architectures (Data Warehouses, Data Lakes, Data Meshes) and their use cases
- Creating data pipelines on cloud platforms with error handling and reusable libraries
- Documenting and …
The same Made Tech Limited role, with identical requirements, is also listed for these United Kingdom locations (Hybrid / WFH Options): Wales, Yorkshire; Manchester, Lancashire; and Bristol, Gloucestershire.
… design of data architectures that will be deployed
- You have experience in database technologies, including writing complex queries against relational and non-relational data stores (e.g. Postgres, Apache Hadoop, Elasticsearch, graph databases) and designing the database schemas to support those queries
- You have a good understanding of coding best practices & design patterns, and experience with code & data versioning …