Advent of 2023, Day 6 – SQL Analytics endpoint
This article is originally published at https://tomaztsql.wordpress.com
In this Microsoft Fabric series:
- Dec 01: What is Microsoft Fabric?
- Dec 02: Getting started with Microsoft Fabric
- Dec 03: What is lakehouse in Fabric?
- Dec 04: Delta lake and delta tables in Microsoft Fabric
- Dec 05: Getting data into lakehouse
We have learned about the lakehouse and delta tables, and there is also an SQL analytics endpoint, which we want to dig into!
The SQL analytics endpoint in the lakehouse is a SQL-based experience for the lakehouse delta tables. Using standard T-SQL, you can write queries to analyze data in delta tables, create functions, procedures and views, and even apply security over the objects. Some functionalities of standard T-SQL are missing, but otherwise the experience is the same.
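As a minimal sketch of that T-SQL experience: the table `iris_data` appears later in this post, but the view name and the role name below are hypothetical, purely for illustration:

```sql
-- Hypothetical example of standard T-SQL objects in the SQL analytics endpoint.
-- iris_data is the article's table; the view and role names are made up.
CREATE VIEW dbo.vw_iris_setosa
AS
SELECT
     CAST([Sepal.Length] AS DECIMAL(5,1)) AS sepal_length
    ,CAST([Sepal.Width]  AS DECIMAL(5,1)) AS sepal_width
FROM dbo.iris_data
WHERE [Species] = 'setosa';
GO

-- Object-level security works the usual T-SQL way:
GRANT SELECT ON OBJECT::dbo.vw_iris_setosa TO [analyst_role];
```

Note that the endpoint is read-only over the delta tables themselves; objects like views, functions and permissions live on top of them.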
Besides the SQL experience, you can always use the corresponding items in the workspace view of the Lakehouse explorer, use SQL in notebooks, or simply use the SQL analytics endpoint directly.
The SQL analytics endpoint points to the lakehouse delta table storage. Once you create a delta table in the lakehouse – with pipelines, copy commands or notebooks – the table is immediately available for querying through the SQL analytics endpoint. Furthermore, you can also create a warehouse within this workspace. Since the SQL analytics endpoint is created automatically together with the lakehouse, you will also see its icon in the workspace.
Once you are in the SQL analytics endpoint, all the delta tables are there for you, you can store your SQL queries, and you get a SQL Server-like overview of the objects.
I have prepared an additional table “iris_species”, which I have created using a notebook in the lakehouse itself. This is another way to create a delta table.
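A notebook cell along these lines creates such a delta table with Spark SQL (a sketch: the table name follows the article, but the columns and species values are my assumptions, using the standard iris species):

```sql
-- Spark SQL in a Fabric notebook cell (%%sql magic).
-- Column names and values are assumptions for illustration.
CREATE TABLE IF NOT EXISTS iris_species (
    species_id INT,
    species    STRING
) USING DELTA;

INSERT INTO iris_species VALUES
    (1, 'setosa'),
    (2, 'versicolor'),
    (3, 'virginica');
```

Because the endpoint reads the same delta storage, the table shows up there for T-SQL querying right after the cell runs.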
In the SQL analytics endpoint, I have stored the query as “iris datasets total”, so it will always be at your disposal. I can also move it to the folder “Shared queries”:
```sql
WITH subset1 AS
(
    SELECT
         ROW_NUMBER() OVER (ORDER BY (SELECT 1)) AS rn
        ,CAST([Sepal.Length] AS DECIMAL(5,1)) AS sl
        ,CAST([Sepal.Width]  AS DECIMAL(5,1)) AS sw
        ,CAST([Petal.Length] AS DECIMAL(5,1)) AS pl
        ,CAST([Petal.Width]  AS DECIMAL(5,1)) AS pw
    FROM iris_data
    WHERE [Species] = 'setosa'
)
SELECT
     CAST(rn AS VARCHAR(15)) AS rn
    ,sl
    ,sw
    ,pl
    ,pw
FROM subset1
UNION ALL
SELECT
     '--AVG: --'
    ,AVG(sl)
    ,AVG(sw)
    ,AVG(pl)
    ,AVG(pw)
FROM subset1;
```
Looking at the schemas, you will find a little gem there! Besides the “usual suspects”, there is also a schema called queryinsights, which harnesses logs and query-store-like information. Exploring its views reveals the logs and statistics of query runs (long-running queries, frequently run queries, and the history of completed requests).
This is a super great gem and it will be useful for anyone trying to troubleshoot, optimize and maintain all the queries in the SQL analytics endpoint!
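A quick way to peek into it might look like the query below. The view and column names are my best recollection of the queryinsights views in Fabric, so verify them against your own endpoint before relying on them:

```sql
-- Longest-running completed statements from the queryinsights schema.
-- View and column names are assumptions; check your endpoint's
-- queryinsights views for the exact names.
SELECT TOP 10
     distributed_statement_id
    ,submit_time
    ,total_elapsed_time_ms
    ,command
FROM queryinsights.exec_requests_history
ORDER BY total_elapsed_time_ms DESC;
```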
At the bottom of the workspace, you will also find the model tab!
The model tab gives you the possibility to build a data model – it feels like Power BI or Synapse. By drag and drop, you can easily start building relationships between the tables. Connecting the “iris_data” and “iris_species” tables opens a dialog window for defining the relationship between the two objects.
With the visual query, you can start building SQL-like queries visually,
and we can merge the two tables using the visual editor.
You can always check the results, view the SQL query, and visualize the intermediate or final results.
And you again get a familiar Power BI view for exploring the data using visuals.
Tomorrow we will continue exploring the SQL commands in the SQL analytics endpoint!
The complete set of code, documents, notebooks, and all of the materials will be available at the GitHub repository: https://github.com/tomaztk/Microsoft-Fabric
Happy Advent of 2023!