San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
systems, network devices, and cybersecurity devices. An understanding of data cleansing and profiling. An understanding of data schema, model, and ontology concepts. Experience with Jupyter notebooks or similar AI/ML frameworks. Good communication skills. Experience with Zoom/Teams/Meet-style team meetings. …
Solihull, West Midlands, United Kingdom Hybrid / WFH Options
E.ON Gruppe
methods and tools like SQL, Python, or R. Experience in the energy sector is preferred. Familiarity with statistical data evaluation tools such as SAS, Jupyter, etc. Here's what you need to know. Award-Winning Benefits: Our market-leading benefits package includes 26 days of holiday plus bank holidays, a …
metrics. Experience using BI tools (e.g., Power BI, Tableau, Looker). Background in the airline, logistics, or operations domains. Exposure to Python, SQL, or Jupyter Notebooks for basic data validation or exploration. The ideal candidate is a team player who will be responsible for working with company data in various …
or more years relevant experience. Proficiency with containerization tools such as Podman and Docker. Experience with applications such as DataHub, Elasticsearch, Neo4j, Druid, JupyterHub, Label Studio, Spark, Kafka, Logstash, Kibana, MailServer, BusyBox, MongoDB, Cassandra, and GitLab CE. Experience with Red Hat Enterprise Linux (RHEL) environments. Strong understanding of Helm for …
support a high-performing team that truly makes a difference, then come join us! Job Description: Perform SIGINT analytic development, utilizing Python in a Jupyter Notebook environment. Provide expertise in signals analysis and contribute to front-end and/or back-end development. The Level 2 Cryptanalytic Computer Scientist …
will have insurance P&C experience. You will also be proficient in various coding languages (e.g. R, Python) and development environments (e.g. R Studio, Jupyter, VS Code). Alongside this, you will be experienced in data visualization and in presenting insights to a non-technical audience. In …
Bitbucket). Experience with on-premise deployments of repository managers (e.g., Artifactory, JFrog, Nexus). Experience with on-premise deployments of developer platforms (e.g., JupyterHub, GitPod). Experience with advanced software engineering concepts and API development. Experience with build and release systems, including publication, replication, distribution, and lifecycle management of …
skills. Experience in support roles, incident triage, and handling (SLAs). Linux system administration basics, Bash scripting, environment variables. Experience with browser-based IDEs like Jupyter Notebooks. Familiarity with Agile methodologies (SAFe, Scrum, Jira). Languages and Frameworks: JSON, YAML, Python (advanced proficiency; Pydantic a bonus), SQL, PySpark, Delta Lake, Bash, Git, Markdown …
proficiency in deploying cloud-based machine learning pipelines, particularly on Google Cloud Platform (GCP) and Vertex AI. Extensive experience in Python programming and using Jupyter notebooks. Deep expertise in deep learning frameworks such as Keras, PyTorch, and CoreML. Strong knowledge of image processing techniques and algorithms. Experience in data wrangling …
enterprise-scale relational databases. Experience with building queries in SAP HANA Studio using BW/4HANA objects. Experience with reporting tools: Microsoft Power BI, Tableau, Jupyter Notebooks. Experience with developing SAP S/4HANA CDS views. Experience with large data volumes and performance tuning. Experience with ABAP programming and SQL. …
research setting, with experience identifying biomarkers or therapeutic targets. Familiarity with best practices in version control, reproducible research, and collaborative development environments (e.g., Git, Jupyter notebooks, code review). Northreach is an equal opportunity employer, and we do not discriminate against any employee or applicant for employment based on race …
to traveling to Octopus offices across Europe and the US. Our Data Stack: SQL-based pipelines built with dbt on Databricks; analysis via Python Jupyter notebooks; PySpark in Databricks workflows for heavy lifting; Streamlit and Python for dashboarding; Airflow DAGs with Python for ETL, running on Kubernetes and Docker; Django …
clearance desired. Additional Required Skills & Experience: Proficiency with containerization tools such as Podman and Docker. Experience with applications such as DataHub, Elasticsearch, Neo4j, Druid, JupyterHub, Label Studio, Spark, Kafka, Logstash, Kibana, MailServer, BusyBox, MongoDB, Cassandra, and GitLab CE. Experience with Red Hat Enterprise Linux (RHEL) environments. Strong understanding of Helm …
about delivering value to the client. Excellent stakeholder engagement skills, ideally within a consulting role. Python experience is essential. Other technologies such as Cloud, Jupyter, Computer Vision, and Natural Language Processing applications will all be looked on favourably. Ability to convey complex and highly technical issues to less technical audiences. Benefits …
structures, collections, itertools, NumPy, Pandas, SciPy. Creating and working with virtual environments and installing required packages using pip. Regular expressions. Anaconda. Working with Jupyter Notebook on AEN. Working with REST APIs. Exception handling. Multithreading, multiprocessing, asynchronous programming. Lambda functionality. Strong problem-solving skills. A deep understanding and …
Engineering, Cybersecurity, or related discipline). • Proficiency with containerization tools such as Podman and Docker. • Experience with applications such as DataHub, Elasticsearch, Neo4j, Druid, JupyterHub, Label Studio, Spark, Kafka, Logstash, Kibana, MailServer, BusyBox, MongoDB, Cassandra, and GitLab CE. • Experience with Red Hat Enterprise Linux (RHEL) environments. • Strong understanding of Helm …
data product company. Experience building or maintaining third-party or in-house data quality and cataloguing solutions. Experience with documentation of system architecture. Pandas, Jupyter, Plotly, dbt, Kafka. BI tools such as Tableau, Metabase, and Superset. The current tech stack: Airflow, ClickHouse, dbt, Python, MongoDB, PostgreSQL, MariaDB, Kafka, K8s, AWS …
each problem to provide state-of-the-art techniques, tools, and approaches. Knowledge of working with Big Data, dataflows, and analytics in GME, Jupyter notebooks, and Spark is preferred. Minimum Required Qualifications: Due to the nature of this position and the information employees will be required to access, U.S. Citizenship is …
support a high-performing team that truly makes a difference, then come join us! Job Description: Perform SIGINT analytic development, utilizing Python in a Jupyter Notebook environment. Provide expertise in signals analysis and contribute to front-end and/or back-end development. Qualifications: Doctoral degree plus 4 years …