ML libraries (TensorFlow, PyTorch, scikit-learn, transformers, XGBoost, ResNet), geospatial libraries (Shapely, GeoPandas, Rasterio), CV libraries (scikit-image, OpenCV, YOLO, Detectron2). AWS, Postgres, Apache Airflow, Apache Kafka, Apache Spark. Mandatory requirements: You have at least 5 years of experience in a DS role, deploying models into …
experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with cloud providers; familiarity with AWS S3, ECS and EC2/Fargate would be considered particularly beneficial. Extensive experience with database …
implement the systems that require the highest data throughput in Java. We implement most of our long-running services and analytics in C#. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, ELK for logs, Grafana, Prometheus & InfluxDB for metrics, Docker …
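As a purely illustrative aside, a minimal sketch of the kind of workflow Airflow manages in a stack like the one above, assuming Airflow 2.4+ (for the TaskFlow API and the `schedule` argument); the DAG and task names are invented:

```python
# Hypothetical DAG: a minimal sketch, assuming Airflow 2.4+.
# All names here are invented for illustration.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_metrics_pipeline():
    @task
    def extract() -> list[int]:
        # Stand-in extract step; a real task might consume from Kafka.
        return [1, 2, 3]

    @task
    def load(values: list[int]) -> None:
        # Stand-in load step; a real task might write metrics to InfluxDB.
        print(sum(values))

    load(extract())


example_metrics_pipeline()
```

Passing the output of `extract()` to `load()` is enough for Airflow to infer the task dependency; the scheduler then runs the DAG once per day.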
Manager. Leadership experience: you must have led small teams on the delivery of projects. AWS (EC2, ECS, EKS, Glue), Java, SQL, Spark, MWAA/Airflow, Agile. The following is DESIRABLE, not essential: dbt; Trading, Front Office finance. You will have line-management responsibility for 3-5 senior engineers, although …
years of professional experience in a computer science/computational role. Experience working in a technical environment with DevOps functions (Google Cloud, Airflow, InfluxDB, Grafana). Design and implementation of front-office systems for quant trading. Highly valued relevant experience: knowledge of machine learning and statistical techniques and related libraries …
to understand and create software architectures that span multiple technologies/platforms. A flavour of our technology stack: Python, Flask, JavaScript, BigQuery, Redis, Elasticsearch, Airflow, Google Cloud Platform, Kubernetes, Docker. Benefits: Voted "Best Places to Work," our culture is driven by self-starters, team players, and visionaries. Headquartered in …
have a valid visa, as we are not able to sponsor. Technical stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow. Skills: … years of experience in Python scripting; … in developing applications in Python; … to Python-oriented algorithm libraries such as NumPy …
You are someone who should have good expertise in data on cloud (AWS preferable), and someone who can demonstrate experience around distributed frameworks like Apache Spark & modern OLAP platforms like Snowflake. You will be looking after a team of 4, so experience of being able to motivate, manage and … experience in modern OLAP technologies like Snowflake & lakehouse implementation with AWS integrations. Good to have experience in the streaming side of technology like Apache Kafka, Kinesis, and NoSQL databases like DynamoDB; demonstrable experience in Python with frameworks like Spark & a good understanding of Snowpark & SnowSQL. Experience around … scheduling tools like Apache Airflow or AWS-managed MWAA is a plus. Proven leadership and management experience; previously worked within the financial services sector (preferred, not essential). What you'll get for this role: Our purpose - with you today, for a better tomorrow - is a promise we make …
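For illustration only, a minimal Snowpark sketch of the pushdown-style querying such a role tends to involve, assuming the snowflake-snowpark-python package; the connection parameters and the TRADES table are placeholders:

```python
# A sketch only: connection parameters and the TRADES table are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

connection_parameters = {
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# The filter and aggregation are pushed down and executed inside Snowflake,
# rather than pulling raw rows into the Python process.
daily_buys = (
    session.table("TRADES")            # hypothetical table
    .filter(col("SIDE") == "BUY")
    .group_by(col("TRADE_DATE"))
    .count()
)
daily_buys.show()
```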
end development in TypeScript (Node) and Python. Have worked with a cloud service, such as AWS. Have some experience with data engineering using Databricks, Airflow, Dagster etc. Be passionate about building products for customers, ensuring they are stable, scalable, secure, observable, and performant. Enjoy collaborating closely with colleagues in …
Redshift, Lambda, Step Functions, DynamoDB, AWS Glue, RDS, Athena, Kinesis, QuickSight. We also widely use other tech such as Snowflake, dbt, Databricks, Informatica, Matillion, Airflow, Tableau, Power BI etc. The Lead Data Architect will liaise with clients to define requirements, refine solutions and ultimately hand over to our own …
the paradigm of statistical significance testing. Desirable experience: Familiarity with energy data, smart grids, demand response, or related fields is a plus. Experience with Airflow and/or Airbyte. 5+ years of experience as a data scientist, leading a small team. Expertise in Python (including asyncio) as a software …
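Purely as an illustration of the significance-testing paradigm mentioned above, a worked two-sample test in Python, assuming SciPy; the data is synthetic and the demand-response framing is invented:

```python
# Synthetic data; illustrates a Welch two-sample t-test with SciPy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(loc=1.00, scale=0.2, size=500)  # e.g. baseline demand
treated = rng.normal(loc=1.05, scale=0.2, size=500)  # e.g. after a demand-response event

t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # small p => reject H0 of equal means
```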
simultaneously. Proactive and curious mindset, with accountability as a core personal value. Desirable skills/experience: Prior experience with data management tools such as Airflow or Dagster. Life at TradingHub is a rewarding journey within a fast-growing company that thrives on innovation and collaboration. By combining the best …
Athena, EMR, SQS, Data Pipelines, Glue etc., shell scripting, SQL, and NoSQL. Experience in utilizing both open-source and proprietary cloud data tools like Airflow, Glue, AWS Data Pipelines, MuleSoft, Redshift, BigQuery, and data visualization tools like Looker, Diver, Redash, Tableau, Power BI etc. Expertise in implementing data quality …
rapid release cycles. Knowledge of Java, Cucumber, DynamoDB, Redis and Redshift. Cloud: AWS (S3, EC2, Lambda, AWS Glue/Spark, IAM, CloudWatch, MSK, Managed Airflow, Athena, Kinesis). Experience of writing and taking responsibility for technical documentation. Knowledge and experience of working with Python and Scala. Experience: A degree in …
Large Language Models (fine-tuning, RAG, agents). Our technology stack: Python; PySpark for processing big data; AWS: EMR, ECS, Athena, etc.; DevOps: Terraform, Docker, Airflow, MLflow. Additional information: Why should you jump on board? We pay special attention to learning and development and organise quarterly company learning days, as …
machine learning or more general statistical analysis. Strong software development skills with proficiency in Python or C++. Experience with analytics frameworks such as Pandas, Apache Spark, Dask, or Flink. Experience with machine learning frameworks such as TensorFlow, JAX, PyTorch, Spark MLlib, Keras, or scikit-learn. Experience in cloud-based … infrastructures such as AWS or GCP. Exposure to orchestration platforms such as Apache Airflow or Kubeflow. Proven attention to detail, critical thinking, and the ability to work independently within a cross-functional team. What benefits do we offer? Condé Nast Learning Hub where you'll find … you'll …
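For a flavour of the analytics-plus-ML workflow the above frameworks imply, a minimal pandas and scikit-learn sketch; the columns and toy data are invented for illustration:

```python
# Invented toy data; shows the pandas -> scikit-learn handoff end to end.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.DataFrame({
    "feature_a": [0.1, 0.4, 0.35, 0.8, 0.9, 0.2, 0.6, 0.7],
    "feature_b": [1.0, 0.9, 0.4, 0.2, 0.1, 0.8, 0.3, 0.5],
    "label":     [0, 0, 0, 1, 1, 0, 1, 1],
})

X_train, X_test, y_train, y_test = train_test_split(
    df[["feature_a", "feature_b"]], df["label"], test_size=0.25, random_state=0
)
model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```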
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and dbt. Must-haves: A team player, happy to work with several teams; this is key, as you will be reporting directly to the …
Preston, Lancashire, United Kingdom Hybrid / WFH Options
Uniting Ambition
and deep knowledge in core processing and orchestration products such as BigQuery, Dataflow, Data Fusion, Datastream, Cloud Functions, Dataproc and Airflow/Composer. You will have held a leading role in a Data Engineering function, with responsibility for directing the efforts of other data …