Trout Lake, Auburn, Lincoln, Quincy, Ephrata, Palisades, George, White Salmon, and Rock Island, Washington, United States Hybrid / WFH Options
Centene
…tools development. Data warehouse and data mart design and development. ETL (Extract, Transform, Load) processes. Data governance and compliance. Proficiency in SQL and Python. Knowledge of Big Data technologies (Hadoop, Spark). Cloud computing (AWS, Azure, GCP). Data modeling and architecture. Advanced analytics and predictive modeling. Knowledge of data privacy laws and regulations. Proficiency in BI tools (Tableau, Power BI). Strong …
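None of these snippets show what the ETL work they name actually looks like, so here is a minimal illustrative sketch (not taken from any posting) of an extract-transform-load flow in Python with SQL, using only the standard library; the file name members.csv and its columns are hypothetical:

```python
# Minimal ETL sketch: extract rows from a CSV file, apply a simple
# transformation, and load the result into a SQLite table via SQL.
# The file name "members.csv" and its columns are invented for illustration.
import csv
import sqlite3

def extract(path):
    """Read raw records from a CSV file, one dict per row."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Normalise state codes and drop records with no member id."""
    for row in rows:
        if row.get("member_id"):
            row["state"] = row["state"].strip().upper()
            yield row

def load(rows, db_path="warehouse.db"):
    """Write transformed records into a data-mart style table."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS members (member_id TEXT PRIMARY KEY, state TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO members (member_id, state) VALUES (:member_id, :state)",
        rows,
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    load(transform(extract("members.csv")))
```

A production pipeline would add logging, schema validation, and incremental loads, but the extract/transform/load separation is the core pattern the listings name.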
…real-time data pipelines for processing large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Flink. MLOps & Deployment: Experience deploying and maintaining large-scale ML inference pipelines into production …
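To make the real-time ingestion requirement concrete, here is a minimal consumer sketch assuming the kafka-python client and a locally reachable broker; the topic name "events" and the broker address are placeholders, not taken from the listing:

```python
# Illustrative real-time ingestion sketch using the kafka-python client
# (one of several Kafka clients for Python). The broker address and the
# topic name "events" are placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In a real pipeline this is where validation, enrichment, or a
    # write to downstream storage would happen.
    print(event)
```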
…Snowflake. Understanding of cloud platform infrastructure and its impact on data architecture. Data Technology Skills: A solid understanding of big data technologies such as Apache Spark, and knowledge of the Hadoop ecosystem. Knowledge of programming languages such as Python, R, or Java is beneficial. Exposure to ETL/ELT processes and SQL and NoSQL databases is a nice-to-have, providing a …
…tools and libraries (e.g. Power BI). Background in database administration or performance tuning. Familiarity with data orchestration tools such as Apache Airflow. Previous exposure to big data technologies (e.g. Hadoop, Spark) for large-scale data processing. Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions. Strong communication …
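For the Airflow orchestration experience mentioned here, a minimal DAG sketch, assuming Apache Airflow 2.4 or later (required for the schedule argument); the dag id and task bodies are invented stubs:

```python
# Sketch of a daily orchestration DAG. Assumes Apache Airflow 2.4+;
# the dag id and the two task callables are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")

def load():
    print("write data to target")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    # The >> operator declares the dependency: extract runs before load.
    extract_task >> load_task
```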
…7+ years in data architecture and solution design, and a history of large-scale data solution implementation. Technical Expertise: Deep knowledge of data architecture principles, big data technologies (e.g., Hadoop, Spark), and cloud platforms like AWS, Azure, or GCP. Data Management Skills: Advanced proficiency in data modelling, SQL/NoSQL databases, ETL processes, and data integration techniques. Programming & Tools …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
…Clear communicator, able to translate complex data concepts to cross-functional teams. Bonus points for experience with: DevOps tools like Docker, Kubernetes, and CI/CD; big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams; genomic data formats and tools; cold and hot storage management, ZFS/RAID systems, or tape storage; AI/LLM tools to …
Familiarity with Data Mesh, Data Fabric, and product-led data strategies. Expertise in cloud platforms (AWS, Azure, GCP, Snowflake). Technical Skills: Proficiency in big data tools (Apache Spark, Hadoop). Programming knowledge (Python, R, Java) is a plus. Understanding of ETL/ELT, SQL, NoSQL, and data visualisation tools. Awareness of ML/AI integration into data architectures. …
…Docker. Experience with NLP and/or computer vision. Exposure to cloud technologies (e.g. AWS and Azure). Exposure to big data technologies. Exposure to Apache products, e.g. Hive, Spark, Hadoop, NiFi. Programming experience in other languages. This is not an exhaustive list, and we are keen to hear from you even if you don't tick every box. The …
…and experience. What We're Looking For: Strong hands-on experience with Python, Java, or Scala. Proficiency in cloud environments (AWS, Azure, or GCP) and big data tech (Spark, Hadoop, Airflow). Solid understanding of SQL, ETL/ELT approaches, and data modelling techniques. Experience building CI/CD pipelines with tools like Jenkins or CircleCI. Knowledge of data security …
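As a sketch of the Spark-based ETL/ELT work these requirements point at, assuming a local PySpark installation; the input path and column names are invented:

```python
# Illustrative PySpark batch job: read raw CSV data, aggregate it, and
# write the result as Parquet. File paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

orders = spark.read.csv("orders.csv", header=True, inferSchema=True)

# Simple transformation: total order value per customer.
totals = orders.groupBy("customer_id").agg(
    F.sum("amount").alias("total_amount")
)

totals.write.mode("overwrite").parquet("output/customer_totals")
spark.stop()
```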
…flow diagrams, and process documentation. MINIMUM QUALIFICATIONS/SKILLS: Proficiency in Python and SQL. Experience with cloud platforms like AWS, GCP, or Azure, and big data technologies such as Hadoop or Spark. Experience working with relational and NoSQL databases. Strong knowledge of data structures, data modeling, and database schema design. Experience supporting data science workloads with structured and unstructured …
…projects. Skills & Experience: Proven experience as a Lead Data Solution Architect in consulting environments. Expertise in cloud platforms (AWS, Azure, GCP, Snowflake). Strong knowledge of big data technologies (Spark, Hadoop), ETL/ELT, and data modelling. Familiarity with Python, R, Java, SQL, NoSQL, and data visualisation tools. Understanding of machine learning and AI integration in data architecture. Experience with …
…DevOps tools (Git, Maven, Jenkins, CI/CD). Excellent communication and teamwork skills. Bonus Points For: Experience in payments, financial services, or fraud detection. Familiarity with Big Data tools (Hadoop, Spark, Kafka). Exposure to cloud-native architecture (AWS, GCP, Azure). Understanding of TDD/BDD and modern testing frameworks. What's On Offer: Hybrid Working - flexibility with 2 remote …
…design of data architectures that will be deployed. You have experience in database technologies, including writing complex queries against their (relational and non-relational) data stores (e.g. Postgres, Apache Hadoop, Elasticsearch, graph databases), and designing the database schemas to support those queries. You have a good understanding of coding best practices & design patterns and experience with code & data versioning …
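The point about designing schemas to support known queries is easy to show concretely. Here is a minimal sketch using the Python stdlib sqlite3 module (the same idea carries over to Postgres with minor syntax changes); all table and column names are invented:

```python
# Schema designed around a known query: filter on user_id, then a time
# range on ts. The composite index supports both without a sort step.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (
    id      INTEGER PRIMARY KEY,
    user_id INTEGER NOT NULL,
    kind    TEXT    NOT NULL,
    ts      TEXT    NOT NULL
);
-- Composite index chosen for the query below: equality on user_id
-- first, then a range scan on ts.
CREATE INDEX idx_events_user_ts ON events (user_id, ts);
""")

rows = conn.execute(
    """
    SELECT kind, COUNT(*) AS n
    FROM events
    WHERE user_id = ? AND ts >= ?
    GROUP BY kind
    ORDER BY n DESC
    """,
    (42, "2024-01-01"),
).fetchall()
print(rows)
```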
Bethesda, Maryland, United States Hybrid / WFH Options
Gridiron IT Solutions
…Python, Scala, and/or UNIX shell scripting. Expertise in machine learning techniques and statistical analysis. Proficiency in SQL and NoSQL databases. Experience with big data platforms such as Hadoop, Spark, and Kafka. Cloud computing expertise across AWS, Azure, and others. Experience in designing and implementing real-time data processing solutions. Strong understanding of AI/ML applications in …
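A real-time processing solution combining two of the named platforms might look like the following Structured Streaming sketch; it assumes PySpark with the spark-sql-kafka connector available, and the broker address and topic name are placeholders:

```python
# Sketch of a streaming job: read events from Kafka with Spark
# Structured Streaming and count messages per key in 1-minute windows.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
)

counts = (
    stream.selectExpr("CAST(key AS STRING) AS key", "timestamp")
    .groupBy(F.window("timestamp", "1 minute"), "key")
    .count()
)

# Writing to the console is for demonstration; a real job would target
# a sink such as Kafka, a warehouse table, or object storage.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```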
Baginton, Warwickshire, United Kingdom Hybrid / WFH Options
Arden University
…or more programming languages such as Python, R, and SQL. An understanding of database theory and design. Experience with SQL and NoSQL databases. Familiarity with big data technologies and ecosystems such as Microsoft Synapse/Fabric, Hadoop, Spark, Kafka, and others. Skills in data modelling and data warehousing solutions. Experience of dimensional modelling (Kimball). Proven experience in designing and developing ETL/ELT processes. Knowledge of data …
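Since dimensional modelling (Kimball) is called out, here is a minimal star-schema sketch, one fact table plus two dimensions, created in SQLite purely for illustration; all table and column names are invented:

```python
# Kimball-style star schema: a fact table at a declared grain joined to
# dimension tables via surrogate keys. Names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date TEXT    NOT NULL,
    month     INTEGER NOT NULL,
    year      INTEGER NOT NULL
);

CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL,
    category     TEXT NOT NULL
);

-- Grain: one row per product per day.
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product (product_key),
    units_sold  INTEGER NOT NULL,
    revenue     REAL    NOT NULL
);
""")
print("star schema created")
```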
…near-real-time streaming patterns. Strong background in Data Management, Data Governance, and Transformation initiatives preferred. Preferred experience/familiarity with one or more of these tools: Big data platforms - Hadoop, Apache Kafka. Relational SQL, NoSQL, and cloud-native databases - Postgres, Cassandra, Snowflake. Experience with data pipeline and orchestration tools - Azkaban, Luigi, or Airflow. Experience with stream-processing engines - Apache …