Lab Example: An e-commerce store

In today's data-driven world, understanding customer behavior is essential for optimizing the online shopping experience. This lab focuses on a simplified e-commerce store scenario that demonstrates how user interactions can be captured and analyzed using key data entities.

You'll explore a data model consisting of products, product categories, and event tracking—specifically impressions and clicks. These interactions help uncover insights into customer preferences and product performance.

The lab provides examples and real-world context to make the data more relatable and practical. By the end, you will gain hands-on experience in how to build a Fabric Real-Time end-to-end solution to analyze how customers engage with an online storefront—knowledge that is invaluable for making data-informed business decisions.

The e-commerce store database entities are:

  • Product: the product catalogue.
  • ProductCategory: the product categories.
  • events: click and impression events.

An impression event is logged when a product appears in the search results.

Impressions example

A click event is logged when a customer clicks a product to view its details.

Clicks example
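To make the event entities concrete, here is a hedged sketch of a single synthetic event record. The field names (eventType, eventDate, productId, ip_address, page_loading_seconds) are inferred from the dashboard queries used later in this lab, not from an official schema, and the value ranges are illustrative:

```python
import json
import random
import uuid
from datetime import datetime, timezone

# Field names inferred from the KQL queries later in this lab;
# value ranges are made up for illustration.
def make_event(event_type: str) -> dict:
    """Build one synthetic impression or click event."""
    return {
        "eventId": str(uuid.uuid4()),
        "eventType": event_type,          # "IMPRESSION" or "CLICK"
        "eventDate": datetime.now(timezone.utc).isoformat(),
        "productId": random.randint(1, 100),
        "ip_address": f"10.0.{random.randint(0, 255)}.{random.randint(1, 254)}",
        "page_loading_seconds": round(random.uniform(0.2, 5.0), 2),
    }

print(json.dumps(make_event("IMPRESSION"), indent=2))
```

The data generator notebook used in Task 6 produces events of roughly this shape and streams them into the Eventstream you will build.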

Photo credits: Himiway Bikes, HEAD Accessories, Jan Kopřiva on Unsplash.

1. Log in to the Lab Environment

Note

Do not use an InPrivate browser window. Use a regular (personal) browser window to run this lab successfully.

  1. Open app.fabric.microsoft.com in your browser.
Fabric URL
  2. Sign in with the provided credentials if a trial Fabric tenant was set up previously (see Prerequisites). You may also run the lab in your own Fabric tenant if you already have one.
  3. Click Real-Time Intelligence.
Real-Time Intelligence

2. Fabric Workspace

  1. A Fabric workspace is already assigned to your user. Please use it during the workshop.

3. Create a new Eventhouse

  1. Click + New Item.
New Item
  2. In New item, select Eventhouse.
Select Eventhouse
  3. In New Eventhouse, enter WebEvents_EH as the name and click Create. After creation, it opens automatically.
Create Eventhouse

Note

The Eventhouse is designed to handle real-time data streams efficiently, enabling timely insights. Data is indexed and partitioned by ingestion time.

4. Enable OneLake Availability

Also called one logical copy, this feature makes KQL Database tables accessible from a Lakehouse, Notebooks, and other services in Delta Parquet format via OneLake.

OneLake availability

Setup

  1. Open the KQL Database WebEvents_EH under KQL Databases.
Open KQL Database
  2. Toggle Availability to activate OneLake availability.
Activate OneLake availability

Note

Newly created tables inherit the Database-level OneLake availability setting.

  3. Confirm Apply to existing tables and click Turn on.
Turn on OneLake availability

5. Create a new Eventstream

We will stream notebook-generated impressions and clicks into an Eventstream, consumed by the Eventhouse KQL Database.

Eventstream overview
  1. Open your Workspace (e.g., WorkSpace.Z.XXX) → + New Item → Eventstream.
Create Eventstream
  2. Name it WebEventsStream_ES, select Enhanced Capabilities, and click Create.
Eventstream name
  3. On Design a flow..., click Use Custom Endpoint to create an Event Hub.
Use Custom Endpoint
  4. Source name: WebEventsCustomSource → Add.
Add source
  5. Click Publish to create the Event Hub.
Publish Eventstream
  6. Select the source WebEventsCustomSource → Keys → copy the Event hub name.
Copy Event hub name

Note

Copy values into Notepad for later reference.

  7. Reveal and copy the connection string (primary or secondary).
Copy connection string

Note

The connection string must be revealed before it can be copied. Kafka endpoints are also provided.
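The data generator notebook handles sending for you, but it can help to see what the copied values are used for. A hedged sketch of producing events to the Event Hub with the azure-eventhub SDK (the helper names are hypothetical; `send_events` requires `pip install azure-eventhub` plus the connection string and hub name you copied above):

```python
import json

def to_payloads(events: list) -> list:
    """Serialize event dicts to the JSON strings sent to the Event Hub."""
    return [json.dumps(e) for e in events]

def send_events(conn_str: str, hub_name: str, events: list) -> None:
    """Send one batch of events; requires `pip install azure-eventhub`."""
    from azure.eventhub import EventHubProducerClient, EventData

    producer = EventHubProducerClient.from_connection_string(
        conn_str, eventhub_name=hub_name)
    with producer:
        batch = producer.create_batch()
        for payload in to_payloads(events):
            batch.add(EventData(payload))
        producer.send_batch(batch)
```

Here `conn_str` is the connection string and `hub_name` the Event hub name from the steps above.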

6. Import Data Generator Notebook

Use the Python notebook Generate_synthetic_web_events.ipynb.

  1. Open the notebook in GitHub → Download raw file.
Download notebook

Import it to your Fabric Workspace:

  1. Go to your workspace (e.g., RTI Tutorial) → Import → Notebook → From this computer.
Import notebook
  2. In Import status, select Upload.
Upload notebook
  3. Browse to and open the downloaded notebook. Fabric confirms a successful import.
Notebook imported
Import success

7. Run the notebook

Note

Do not use an InPrivate browser window. Use a regular (personal) window.

  1. Open Generate_synthetic_web_events in your workspace.
Open notebook
  2. Paste the Event hub name and connection string from Task 5 into the eventHubName and eventHubConnString variables.
Paste values
  3. Click Run all.
Run all

Note

Errors in cell 1 caused by already-installed libraries can be ignored.

Notebook warnings
  4. Scroll to the last code cell and confirm that JSON events appear in the output.
Generated events

8. Define Eventstream topology

  1. Open the Eventstream WebEventsStream_ES → Edit.
Open Eventstream
Edit Eventstream
  2. Add a Filter named ClickEventsFilter that keeps events where eventType == CLICK.
Add Filter
Edit Filter
Filter values

Note

CLICK is in ALL CAPS. The filter may show an error until a destination is added.

  3. Connect a Stream named ClickEventsStream (JSON) → destination Eventhouse → create table BronzeClicks.
Add Stream
Choose Stream
Stream settings
Eventhouse destination
  4. Add a second Filter, ImpressionEventsFilter, for eventType == IMPRESSION → Stream ImpressionsEventsStream (JSON) → Eventhouse table BronzeImpressions.
Add second filter
Choose filter
Impression filter values
Impressions stream
Eventhouse dest 2
Eventhouse settings
  5. Click Publish. After a few minutes, the destinations switch to Streaming mode.
Publish topology
Streaming state
Final topology
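Conceptually, the two filters split one input stream by eventType and route each branch to its own Bronze table. A minimal Python sketch of the equivalent routing logic (table and filter names as used in this lab; the function itself is illustrative, not Eventstream code):

```python
def route(events):
    """Partition events the way the Eventstream filters do:
    CLICK -> BronzeClicks, IMPRESSION -> BronzeImpressions."""
    tables = {"BronzeClicks": [], "BronzeImpressions": []}
    for e in events:
        if e.get("eventType") == "CLICK":          # ClickEventsFilter
            tables["BronzeClicks"].append(e)
        elif e.get("eventType") == "IMPRESSION":   # ImpressionEventsFilter
            tables["BronzeImpressions"].append(e)
        # events matching neither filter are dropped
    return tables

routed = route([{"eventType": "CLICK"}, {"eventType": "IMPRESSION"},
                {"eventType": "CLICK"}])
print(len(routed["BronzeClicks"]), len(routed["BronzeImpressions"]))  # 2 1
```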

9. Setting up the Lakehouse

Download products.csv and productcategory.csv from the repo and create a Lakehouse named WebSalesData_LH (with schemas disabled).

Download ref data
Open workspace
New Lakehouse
Lakehouse name

10. Uploading reference data files and creating delta tables

  1. Click Get data → Upload files → select both CSV files → Upload.
Get data - Upload
Select files
Upload
Files uploaded
  2. For each file, use Load to tables → New table → retain the defaults → Load.
Load to tables
New table
Load

Note

Repeat for productcategory.csv and products.csv.

Delta tables created

11. Accessing Eventhouse data from the Lakehouse

Create OneLake shortcuts for BronzeClicks and BronzeImpressions.

  1. Get data → New shortcut (or New table shortcut under a schema).
New shortcut
  2. Select Microsoft OneLake.
Select OneLake
  3. Choose the KQL Database WebEvents_EH → Next.
Select KQL DB
  4. Check the tables BronzeClicks and BronzeImpressions → Next → Create.
Select tables
Create
Shortcuts created

Note

Shortcuts have a different icon than regular delta tables.

12. Build the KQL DB schema

Create silver tables, functions, and update policies. Add OneLake shortcuts for products and productcategory from the lakehouse.

KQL DB schema
  1. Open the KQL Database WebEvents_EH → + New → OneLake shortcut.
Open KQL DB
New shortcut

Note

Data has already streamed into your KQL Database; see the overview dashboard.

Overview dashboard
  2. Select Microsoft OneLake → lakehouse WebSalesData_LH → Next.
Select OneLake
Select Lakehouse
  3. Add a shortcut to the table products (repeat for productcategory).
Create products shortcut

Note

Repeat the steps for productcategory.

Shortcuts verified
  4. Click Explore your Data.
Explore your data
Explore window
  5. Open createAll.kql from GitHub and copy its contents.
Copy KQL
  6. Open the queryset WebEvents_EH_queryset, paste the contents, and click Run. Rename the tab to createAll.
Run KQL
Completed status
Rename tab

Note

You can add more tabs in the KQL Queryset for additional queries.

  7. Expand the folders to verify all tables and functions created by the script.
Objects created

Note

Explore additional Real-Time Intelligence samples via the Get data menu.

Samples menu
Sample gallery

13. Real-Time Dashboard

Build a dashboard to visualize streaming data and set auto-refresh to 30 seconds (or Continuous). Optionally import a pre-built JSON definition.

Note

The Proctor Guide covers this process.

Real-Time Dashboard
  1. Workspace → + New Item → Real-Time Dashboard → name it Web Events Dashboard.
Open workspace
New Real-Time Dashboard
Dashboard name
  2. + Add tile → + Data source → select the Eventhouse WebEvents_EH → Add.
Add tile
Add data source
Select Eventhouse
Confirm source

Proceed to paste each query below, add a visual, and apply changes. Optionally, use the script file dashboard-RTA.kql.

Note

We demo the first visual; repeat steps for the rest.

Clicks by hour

//Clicks by hour
SilverClicks
| where eventDate between (_startTime.._endTime)
| summarize date_count = count() by bin(eventDate, 1h)
| top 30 by date_count
| render timechart
  1. Set the time range to Last 7 days → Run → + Add Visual.
  2. Title: Click by hour. Visual type: Area chart → Apply changes.

Note

After Apply, the range may reset to 1 hour; we fix via Parameters.

  3. Manage → Parameters → edit the Time range parameter's default to Last 7 Days.
Manage
Edit parameter
Set default
Close parameters
New tile
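The Clicks by hour query buckets events into hourly bins with `bin(eventDate, 1h)`. The same binning can be sanity-checked locally; a small Python sketch (sample timestamps are made up):

```python
from collections import Counter
from datetime import datetime

def bin_hourly(timestamps):
    """Mimic KQL's `summarize count() by bin(eventDate, 1h)` by
    truncating each timestamp to the start of its hour."""
    return Counter(ts.replace(minute=0, second=0, microsecond=0)
                   for ts in timestamps)

clicks = [datetime(2024, 5, 1, 9, 15), datetime(2024, 5, 1, 9, 47),
          datetime(2024, 5, 1, 10, 5)]
counts = bin_hourly(clicks)
print(counts[datetime(2024, 5, 1, 9)])   # 2
print(counts[datetime(2024, 5, 1, 10)])  # 1
```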

Impressions by hour

//Impressions by hour
SilverImpressions
| where eventDate between (_startTime.._endTime)
| summarize date_count = count() by bin(eventDate, 1h)
| render timechart
| top 30 by date_count
Impressions by hour

Impressions by location

//Impressions by location
SilverImpressions
| where eventDate between (_startTime.._endTime)
| join external_table('products') on $left.productId == $right.ProductID
| project lon = toreal(geo_info_from_ip_address(ip_address).longitude), lat = toreal(geo_info_from_ip_address(ip_address).latitude), Name
| render scatterchart with (kind = map)
Impressions map
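The map query resolves each client IP to coordinates with `geo_info_from_ip_address` and joins to the products table for the name. A hedged Python analogue with stub lookup tables (the `GEO` and `PRODUCTS` dicts are hypothetical stand-ins; a real implementation would query a GeoIP database):

```python
# Stub tables standing in for geo_info_from_ip_address and the
# products shortcut; values here are made-up examples.
GEO = {"203.0.113.7": (151.21, -33.87)}   # ip -> (lon, lat)
PRODUCTS = {42: "Mountain Bike"}           # ProductID -> Name

def impressions_to_points(impressions):
    """Join impressions to product names and geolocate the client IP,
    mirroring the map query's (lon, lat, Name) projection."""
    points = []
    for e in impressions:
        name = PRODUCTS.get(e["productId"])
        coords = GEO.get(e["ip_address"])
        if name and coords:                # inner join + successful lookup
            lon, lat = coords
            points.append({"lon": lon, "lat": lat, "Name": name})
    return points

print(impressions_to_points(
    [{"productId": 42, "ip_address": "203.0.113.7"}]))
```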

Average Page Load time

//Average Page Load time
SilverImpressions
| where eventDate between (_startTime.._endTime)
| make-series average_loadtime = avg(page_loading_seconds) on eventDate from _startTime to _endTime+4h step 1h
| extend forecast = series_decompose_forecast(average_loadtime, 4)
| render timechart
Avg load time

Impressions, Clicks & CTR

//Clicks, Impressions, CTR
let imp = SilverImpressions
| where eventDate between (_startTime.._endTime)
| extend dateOnly = substring(tostring(todatetime(eventDate)), 0, 10)
| summarize imp_count = count() by dateOnly;
let clck = SilverClicks
| where eventDate between (_startTime.._endTime)
| extend dateOnly = substring(tostring(todatetime(eventDate)), 0, 10)
| summarize clck_count = count() by dateOnly;
imp
| join clck on $left.dateOnly == $right.dateOnly
| project selected_date = dateOnly, impressions = imp_count, clicks = clck_count, CTR = clck_count * 100.0 / imp_count
Stat tiles
Duplicate tiles
Clicks tile
CTR tile

Average Page Load Time Anomalies

//Avg Page Load Time Anomalies
SilverImpressions
| where eventDate between (_startTime.._endTime)
| make-series average_loadtime = avg(page_loading_seconds) on eventDate from _startTime to _endTime+4h step 1h
| extend anomalies = series_decompose_anomalies(average_loadtime)
| render anomalychart
Anomaly chart

Strong Anomalies

//Strong Anomalies
SilverImpressions
| where eventDate between (_startTime.._endTime)
| make-series average_loadtime = avg(page_loading_seconds) on eventDate from _startTime to _endTime+4h step 1h
| extend anomalies = series_decompose_anomalies(average_loadtime,2.5)
| mv-expand eventDate, average_loadtime, anomalies
| where anomalies <> 0
| project-away anomalies
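`series_decompose_anomalies` first removes trend and seasonality, then flags points whose residual score exceeds the threshold (2.5 above). A simplified, hedged local analogue that skips the decomposition and uses a plain z-score over the raw series:

```python
from statistics import mean, stdev

def zscore_anomalies(series, threshold=2.5):
    """Return indices whose z-score magnitude exceeds the threshold.
    A simplified stand-in for KQL's series_decompose_anomalies, which
    additionally removes trend/seasonality before scoring."""
    mu, sigma = mean(series), stdev(series)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(series)
            if abs((v - mu) / sigma) > threshold]

# 19 normal page-load times plus one spike (made-up values)
load_times = [1.0] * 19 + [9.5]
print(zscore_anomalies(load_times))  # [19]
```

Raising the threshold (as the Strong Anomalies query does with 2.5) keeps only the most extreme deviations.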

Logo (Markdown Text Tile)

![AdventureWorks](https://vikasrajput.github.io/resources/PBIRptDev/AdventureWorksLogo.jpg "AdventureWorks")
Markdown text tile

Arrange visuals; the dashboard should look similar to:

Final dashboard

Auto-refresh

  1. Manage → Auto refresh → Enabled → Default refresh rate: Continuous → Apply.
  2. Home → Save.
Auto refresh menu
Enable auto refresh
Save dashboard

14. Data Activator

Create a Reflex Alert to send a Teams message when a value meets a threshold.

  1. On the tile Click by hour → Set alert.
Set alert
  2. Set the values: On each event grouped by → event_date; When: date_count; Condition: Becomes greater than → value 250; Action: Message me in Teams.
Alert settings
  3. Select Workspace → Create a new item → name it My Reflex → Create.
Create Reflex

Note

The Reflex item appears in your workspace and can trigger multiple actions.

When you are done, stop the notebook.

  1. Open Generate_synthetic_web_events → Cancel all.
Cancel all