Microsoft Fabric, including Lakehouse (Delta format), OneLake, Pipelines & Dataflows Gen2, Notebooks (PySpark), Power BI & Semantic Models. Possess a solid understanding of data integration patterns, ETL/ELT, and modern data architectures. Be familiar with CI/CD practices in a data engineering context. Have excellent SQL and Spark (PySpark) skills.
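To make the Fabric/PySpark expectation concrete, here is a minimal sketch of an ELT step in a Lakehouse notebook. The file path, table name, and column names are hypothetical placeholders, not taken from the posting; in a Fabric notebook the `spark` session is already provided.

```python
# Minimal sketch: read raw data from the Lakehouse Files area, clean it,
# and write a Delta table that a Power BI semantic model can consume.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created as `spark` in Fabric notebooks

# Extract: raw CSV landed in OneLake (path is illustrative)
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("Files/raw/orders.csv"))

# Transform: light typing and filtering
clean = (raw
         .withColumn("order_date", F.to_date("order_date"))
         .filter(F.col("amount") > 0))

# Load: managed Delta table in the Lakehouse Tables area
clean.write.format("delta").mode("overwrite").saveAsTable("orders_clean")
```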
business requirements. Experience with Big Data technologies, Data Lakes, Data Warehouses, Lakehouses. Proficiency in Databricks and Python, including concurrency and error handling. Experience with ETL tools and data visualization tools. Preferred qualifications, capabilities, and skills: experience with AWS services such as Lambda, and with infrastructure-as-code tools such as Terraform. Knowledge of Java and front-end development.
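The "concurrency and error handling" requirement is the kind of thing a short Python sketch can illustrate. The `fetch_table` function and table names below are hypothetical stand-ins for real extract calls (JDBC reads, API requests, and so on).

```python
# Minimal sketch: run several extracts concurrently and record failures
# per task instead of aborting the whole run.
from concurrent.futures import ThreadPoolExecutor, as_completed

def fetch_table(name: str) -> str:
    # Placeholder for a real extract step
    if not name:
        raise ValueError("empty table name")
    return f"{name}: ok"

tables = ["customers", "orders", "payments"]
results, failures = [], []

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(fetch_table, t): t for t in tables}
    for future in as_completed(futures):
        table = futures[future]
        try:
            results.append(future.result())
        except Exception as exc:  # isolate the failing table
            failures.append((table, exc))

print(results, failures)
```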
it's properly understood, organized and governed. Knowledge of some Data Management technologies such as Relational and Columnar Databases, and/or Data Integration (ETL) or API development. Knowledge of some Data Formats such as JSON, XML and binary formats such as Avro or Google Protocol Buffers. Experience collaborating with …
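For the data-formats point, here is a minimal sketch that round-trips the same records through JSON (text, self-describing) and Avro (binary, schema-on-write). The schema and records are hypothetical, and the third-party fastavro package is assumed to be installed.

```python
# Minimal sketch: compare a JSON serialization with an Avro container file.
import io, json
from fastavro import parse_schema, writer, reader

schema = parse_schema({
    "type": "record",
    "name": "Event",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "payload", "type": "string"},
    ],
})
records = [{"id": 1, "payload": "created"}, {"id": 2, "payload": "updated"}]

# Text format: verbose but human-readable
as_json = json.dumps(records)

# Binary format: schema stored once, rows encoded compactly
buf = io.BytesIO()
writer(buf, schema, records)
buf.seek(0)
round_tripped = list(reader(buf))

print(len(as_json), buf.getbuffer().nbytes, round_tripped)
```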