Location: Remote-first (UK-based) 💰 Rate: Up to £550 p/d 📆 Contract: 6 - 12 months (Outside IR35) 🛠 Tech Stack: Python, FastAPI, GCP, BigQuery, Apache Spark, Apache Beam, Google Cloud Dataflow We're working with a forward-thinking consultancy that helps top companies build and scale high … You'll Be Doing: 🔹 Building data pipelines and ETL workflows that process huge datasets 🔹 Designing, optimizing, and maintaining high-throughput reporting solutions 🔹 Working with Apache Spark for large-scale data processing 🔹 Using Apache Beam and Google Cloud Dataflow to manage complex data workflows 🔹 Developing and improving backend … writing clean, efficient, and scalable code ✔ Experience with BigQuery, PostgreSQL, and Elasticsearch ✔ Hands-on experience with Google Cloud, Kubernetes, and Terraform ✔ Deep understanding of Apache Spark for large-scale data processing ✔ Knowledge of Apache Beam & Google Cloud Dataflow for data pipeline orchestration ✔ A team-first mindset with …
we're looking for great people, not just those who simply check off all the boxes. What you'll do: Work with technologies like Apache Lucene, Apache Flink, Apache Beam, and Kubernetes to build core components of Yelp's search infrastructure. Design, build, and maintain scalable … and complexity analysis. Comprehensive understanding of systems and application design, including operational and reliability trade-offs. Experience with distributed data processing frameworks such as Apache Flink or Apache Beam. Familiarity with search technologies like Apache Lucene or Elasticsearch is a plus. Experience working with containerized environments and …
have experience architecting data pipelines and are self-sufficient in getting the data you need to build and evaluate models, using tools like Dataflow, Apache Beam, or Spark. You care about agile software processes, data-driven development, reliability, and disciplined experimentation. You have experience and passion for fostering … Platform is a plus. Experience with building data pipelines and getting the data you need to build and evaluate your models, using tools like Apache Beam/Spark is a plus. Where You'll Be: This role is based in London (UK). We offer you the flexibility …
Lexington, Massachusetts, United States Hybrid / WFH Options
Equiliem
Computer Science. Recent graduates or candidates without a Bachelor's degree considered with clear evidence of significant outside-of-classroom experience. • Experience with the Apache Maven or Gradle build system. • Ability to understand front-end source code written in React or similar frameworks. Provide guidance to less experienced front … and environments, such as Pandas, TensorFlow, and Jupyter Notebook. • Broad knowledge of the general features, capabilities, and trade-offs of common data warehouse (e.g. Apache Hadoop); workflow orchestration (e.g. Apache Beam); data extract, transform, and load (ETL); and stream processing (e.g. Kafka) technologies. Hands-on experience with …
or all of the services below would put you at the top of our list: Google Cloud Storage. Google Data Transfer Service. Google Dataflow (Apache Beam). Google Pub/Sub. Google Cloud Run. BigQuery or any RDBMS. Python. Debezium/Kafka. dbt (data build tool). Interview process: Interviewing is …