An end-to-end data pipeline that extracts Divvy bikeshare data from the web, loads it into a data lake and a data warehouse, transforms it with dbt, and presents it in a Looker Studio dashboard; the whole pipeline is orchestrated with Prefect.
An end-to-end ELT project built on data from Zomato, an Indian multinational restaurant aggregator and food delivery company. The project extracts a Kaggle dataset, loads it into Snowflake tables, then transforms and models the data in dbt Labs.
A command-line application built as an ad-hoc solution for a client who needed a way to extract data from their own website and write it to a spreadsheet.
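The extract-and-print flow of such a console tool can be sketched with the standard library alone: parse table cells out of HTML and emit CSV rows (a spreadsheet-friendly format). The sample HTML and column names below are hypothetical stand-ins for the client's site; a real run would fetch the page over HTTP first.

```python
import csv
import io
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collects the text of <td> cells, one list per <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = None
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td" and self._row is not None:
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

# Hypothetical page content standing in for the client's website.
SAMPLE_HTML = """
<table>
  <tr><td>Widget</td><td>9.99</td></tr>
  <tr><td>Gadget</td><td>4.50</td></tr>
</table>
"""

def scrape_to_csv(html: str) -> str:
    """Extract table rows from HTML and render them as CSV text."""
    parser = TableExtractor()
    parser.feed(html)
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["name", "price"])  # assumed column names
    writer.writerows(parser.rows)
    return buf.getvalue()

print(scrape_to_csv(SAMPLE_HTML))
```

Writing the returned string to a `.csv` file gives something any spreadsheet program can open directly.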
This project showcases an ELT pipeline that extracts JSON data, loads it into a PostgreSQL database, applies transformations using Python scripts, saves the transformed data in a CSV file, and shares it through a FastAPI endpoint.
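The load-transform-export stages of that ELT pipeline can be sketched in a few functions. This is a minimal sketch under stated assumptions: the JSON payload and column names are invented for illustration, and an in-memory SQLite database stands in for PostgreSQL so the example stays self-contained.

```python
import csv
import io
import json
import sqlite3

# Hypothetical sample payload; the real pipeline extracts JSON from its source.
RAW_JSON = json.dumps([
    {"id": 1, "name": "alice", "score": "81"},
    {"id": 2, "name": "bob", "score": "64"},
])

def load(conn, raw):
    """Load step: land raw JSON records in a table (SQLite stands in for PostgreSQL)."""
    conn.execute("CREATE TABLE IF NOT EXISTS raw_events (id INTEGER, name TEXT, score TEXT)")
    rows = [(r["id"], r["name"], r["score"]) for r in json.loads(raw)]
    conn.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", rows)

def transform(conn):
    """Transform step: cast text scores to integers and title-case names."""
    cur = conn.execute("SELECT id, name, score FROM raw_events ORDER BY id")
    return [(i, n.title(), int(s)) for i, n, s in cur]

def to_csv(rows):
    """Export step: render the transformed rows as CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["id", "name", "score"])
    writer.writerows(rows)
    return buf.getvalue()

conn = sqlite3.connect(":memory:")
load(conn, RAW_JSON)
print(to_csv(transform(conn)))
```

In the full project the CSV would be written to disk and served from a FastAPI route, for example one returning a `fastapi.responses.FileResponse` for the generated file.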
A data pipeline built with dbt and Snowflake that loads sample TPC-H data and demonstrates basic data modeling techniques, including data marts, fact tables, macros, and tests.
Enterprise ELT framework using Airbyte, dbt, Prefect, and Power BI for seamless data extraction, transformation, and visualization. This project showcases a scalable pipeline integrating SQL Server, GCP, and Power BI tabular models for real-time analytics and business intelligence, aimed at data engineers and analysts who need an efficient ETL/ELT workflow.
Built an ETL pipeline using Python, pandas, dictionary methods, and regular expressions. It extracts data from multiple sources, cleans and transforms it in a Jupyter Notebook using the pandas, NumPy, and datetime packages, and loads the cleaned data into a relational database via pgAdmin.
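The cleaning style that description names — dictionary methods plus regular expressions — can be sketched on a couple of hypothetical records (the field names, sample movies, and money formats below are assumptions for illustration, not the project's actual schema):

```python
import re

# Hypothetical raw records; the real pipeline pulls from multiple sources.
raw_records = [
    {"Title": " The Matrix (1999) ", "box_office": "$463.5 million"},
    {"Title": "Inception (2010)", "box_office": "$836,836,967"},
]

YEAR_RE = re.compile(r"\((\d{4})\)")
MONEY_RE = re.compile(r"\$([\d,.]+)\s*(million|billion)?", re.IGNORECASE)

def parse_dollars(text):
    """Normalize strings like '$463.5 million' or '$836,836,967' to a float."""
    m = MONEY_RE.search(text)
    if not m:
        return None
    value = float(m.group(1).replace(",", ""))
    scale = {"million": 1e6, "billion": 1e9}.get((m.group(2) or "").lower(), 1)
    return value * scale

def clean(record):
    """Clean one record: normalize keys, split out the year, parse money."""
    cleaned = {k.lower().strip(): v for k, v in record.items()}  # dict methods
    title = cleaned.pop("title").strip()
    year = YEAR_RE.search(title)
    cleaned["title"] = YEAR_RE.sub("", title).strip()
    cleaned["year"] = int(year.group(1)) if year else None
    cleaned["box_office"] = parse_dollars(cleaned["box_office"])
    return cleaned

print([clean(r) for r in raw_records])
```

In the notebook version, the same logic would typically be applied column-wise with pandas (`Series.str.extract`, `DataFrame.apply`) before writing the frame to the relational database.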