The Future of Data Analytics: An Introduction to Microsoft Fabric

Microsoft Fabric, announced in public preview at the Microsoft Build event in May 2023, is an end-to-end data and analytics platform that combines Microsoft's OneLake data lake, Power BI, Azure Synapse, and Azure Data Factory into a unified software-as-a-service (SaaS) offering. It is a one-stop solution designed to serve a range of data professionals, including data engineers, data warehousing professionals, data scientists, data analysts, and business users, enabling them to collaborate within the platform and foster a healthy data culture across their organizations.

What are Microsoft Fabric's key features?

Data Factory – Data Factory in Microsoft Fabric combines the simplicity of Power Query with the scale of Azure Data Factory. It provides more than 200 native connectors for bringing in data from on-premises and cloud sources, and it enables the scheduling and orchestration of notebooks and Spark jobs.

Data Engineering – Leveraging the extensive capabilities of Spark, data engineering in Microsoft Fabric provides premier authoring experiences and facilitates large-scale data transformations. It plays a crucial role in democratizing data through the lakehouse model. Moreover, integration with Data Factory allows notebooks and Spark jobs to be scheduled and orchestrated.
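
To make the lakehouse idea concrete, here is a minimal PySpark sketch, assuming it runs in a Fabric notebook attached to a lakehouse; the file path and table name are illustrative placeholders, not anything prescribed by Fabric.

```python
# Minimal sketch: clean a raw CSV and save it as a Delta table in the attached lakehouse.
# "Files/raw/sales.csv" and "sales_clean" are hypothetical names.
from pyspark.sql import functions as F

raw = (
    spark.read                      # `spark` is the session pre-created in the notebook
    .option("header", "true")
    .csv("Files/raw/sales.csv")     # relative path inside the attached lakehouse
)

clean = (
    raw.dropDuplicates()
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)

# Delta is the open storage format used natively across Fabric.
clean.write.format("delta").mode("overwrite").saveAsTable("sales_clean")
```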

Data Science – The data science capability in Microsoft Fabric helps you build, deploy, and operationalize machine learning models within the Fabric framework. It integrates with Azure Machine Learning for built-in experiment tracking and a model registry, empowering data scientists to enrich organizational data with predictions that business analysts can incorporate into their BI reports, thereby moving from descriptive to predictive insights.
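
As a hedged illustration of experiment tracking, here is a minimal sketch using the open-source MLflow tracking API from a notebook; the experiment name, dataset, and model are arbitrary examples, and the exact tracking configuration depends on your environment.

```python
# Minimal sketch of experiment tracking with MLflow from a notebook.
import mlflow
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-diabetes")          # hypothetical experiment name

with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    mae = mean_absolute_error(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("mae", mae)
    mlflow.sklearn.log_model(model, "model")     # stores the trained model with the run
```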

Data Warehouse – The data warehousing component of Microsoft Fabric offers top-tier SQL performance and scalability. It fully separates compute from storage so that both can scale independently, and it stores data natively in the open Delta Lake format.

Real-Time Analytics – Observational data, acquired from diverse sources like apps, IoT devices, human interactions, and more, represents the fastest-growing data category. This semi-structured, high-volume data, often in JSON or Text format with varying schemas, presents challenges for conventional data warehousing platforms. However, Microsoft Fabric’s Real-Time Analytics offers a superior solution for analyzing such data.
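
Real-Time Analytics stores this kind of event data in KQL databases that you query with the Kusto Query Language. As a hedged sketch, the snippet below queries such a database from Python with the azure-kusto-data package; the cluster URI, database, table, and column names are placeholders you would copy from your own KQL database.

```python
# Hedged sketch: querying a KQL database from Python with the azure-kusto-data package.
# The cluster URI, database, and table/column names are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

cluster = "https://<your-cluster>.kusto.fabric.microsoft.com"   # placeholder query URI
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster)
client = KustoClient(kcsb)

# Count recent IoT events per device over the last hour (schema is assumed).
query = """
DeviceTelemetry
| where Timestamp > ago(1h)
| summarize events = count() by DeviceId
| top 10 by events
"""
response = client.execute("SampleDatabase", query)
for row in response.primary_results[0]:
    print(row["DeviceId"], row["events"])
```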

Power BI – Recognized as a leading business intelligence platform worldwide, Power BI in Microsoft Fabric enables business owners to access all Fabric data swiftly and intuitively for data-driven decision-making.

What are the Advantages of Microsoft Fabric?

Unified Platform: Microsoft Fabric provides a unified platform for different data analytics workloads such as data integration, engineering, warehousing, data science, real-time analytics, and business intelligence. This can foster a well-functioning data culture across the organization, as data engineers, warehousing professionals, data scientists, data analysts, and business users can collaborate within Fabric.

Multi-cloud Support: Fabric is designed with a multi-cloud approach in mind, with support for data in Amazon S3 and (soon) Google Cloud Platform. This means that users are not restricted to data from Microsoft's ecosystem, providing flexibility.

Accessibility: Microsoft Fabric is currently available in public preview, and anyone can try the service without providing credit card information. Starting July 1, 2023, Fabric will be enabled for all Power BI tenants.

AI Integration: The private preview of Copilot in Power BI will combine advanced generative AI with data, enabling users to simply describe the insights they need or ask a question about their data, and Copilot will analyze and pull the correct data into a report, turning data into actionable insights instantly.

Microsoft Fabric – Licensing and Pricing

Microsoft Fabric capacities are available for purchase in the Azure portal. These capacities provide the compute resources for every experience in Fabric, from Data Factory for ingestion and transformation, through Data Engineering, Data Science, Data Warehouse, and Real-Time Analytics, all the way to Power BI for data visualization. A single capacity can power all workloads concurrently and does not need to be pre-allocated across them. Moreover, a single capacity can be shared among multiple users and projects, without any limitation on the number of workspaces or creators that can use it.

To gain access to Microsoft Fabric, you have three options:

  1. Leverage your existing Power BI Premium subscription by turning on the Fabric preview switch. All Power BI Premium capacities can instantly power every Fabric workload with no additional action required, so your existing Premium resources can run the full range of data and analytics tasks that Microsoft Fabric supports.
  2. Start a Fabric trial if your tenant (an instance of Azure Active Directory) supports trials. A trial lets you test the service before deciding to purchase: during the trial period you can explore the full capabilities of Microsoft Fabric, such as data ingestion and transformation, data engineering, data science, data warehouse operations, real-time analytics, and data visualization with Power BI.
  3. Purchase a Fabric pay-as-you-go capacity from the Azure portal. If Microsoft Fabric suits your needs and you don't have a Power BI Premium subscription, you can purchase a Fabric capacity directly on a pay-as-you-go basis from the Azure portal. The pay-as-you-go model is flexible because you pay only for the compute and storage you use. Microsoft Fabric capacities come in different sizes, from F2 to F2048, representing 2 to 2,048 Capacity Units (CU). Your bill is determined by the amount of compute you provision (i.e., the size of the capacity you choose) and the amount of storage you use in OneLake, the data lake built into Microsoft Fabric. You can also scale your capacities up and down to adjust their computing power, and even pause them when not in use to reduce your bill.

Microsoft Fabric is a unified product for all your data and analytics workloads. Rather than provisioning and managing separate compute for each workload, with Fabric, your bill is determined by two variables: the amount of compute you provision and the amount of storage you use.
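
To make the billing model concrete, here is a back-of-the-envelope sketch of how those two variables combine; the rates in it are made-up placeholders, not Microsoft's prices, so substitute the current rates for your region from the Azure pricing page.

```python
# Rough sketch of the Fabric billing model: provisioned compute plus OneLake storage.
# The rates below are PLACEHOLDERS, not real prices.
def estimate_monthly_bill(capacity_units: int,
                          hours_active: float,
                          storage_gb: float,
                          rate_per_cu_hour: float,
                          rate_per_gb_month: float) -> float:
    """Bill = compute you provision (CU-hours) + storage you use (GB-months)."""
    compute_cost = capacity_units * hours_active * rate_per_cu_hour
    storage_cost = storage_gb * rate_per_gb_month
    return compute_cost + storage_cost

# Example: an F64 capacity kept active ~300 hours in the month (paused otherwise)
# plus 2 TB of OneLake storage, with purely illustrative rates.
print(estimate_monthly_bill(capacity_units=64, hours_active=300, storage_gb=2048,
                            rate_per_cu_hour=0.20,     # placeholder rate
                            rate_per_gb_month=0.025))  # placeholder rate
```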

You can see the full range of capacities available for purchase in the Azure portal.

Check out this video from Guy in a Cube, which breaks down the details of pricing and licensing.

How do you activate the Microsoft Fabric trial?

Step 1

Log in to the Power BI service with your developer account.

You will notice that, aside from the OneLake icon at the top left, everything looks familiar if you already know the Power BI service.

Step 2

Enable Microsoft Fabric for your tenant.

Your screen will look like this:

So far, we have only enabled Microsoft Fabric at the tenant level. This does not give full access to Fabric resources, as can be seen in the illustration below.

So let's upgrade the Power BI license to the Microsoft Fabric trial.

For a smoother experience, create a new workspace and assign the Microsoft Fabric trial license to it, as shown below.

As you can see, while creating a new workspace you can now assign the Fabric trial license to it. Once the workspace is created, you can take full advantage of Microsoft Fabric.
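
If you prefer to script workspace creation, here is a hedged sketch using the Power BI REST API's Groups endpoint from Python; the access token and workspace name are placeholders, and assigning the Fabric trial license to the workspace is still done in the workspace settings UI.

```python
# Hedged sketch: creating a new workspace with the Power BI REST API.
# Assumes you already have an Azure AD access token with the Power BI scope.
import requests

ACCESS_TOKEN = "<aad-access-token>"   # placeholder -- acquire via MSAL or Azure CLI

resp = requests.post(
    "https://api.powerbi.com/v1.0/myorg/groups",
    params={"workspaceV2": "True"},                  # create a new-style workspace
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"name": "Fabric Trial Sandbox"},           # hypothetical workspace name
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["id"])   # the new workspace (group) id
```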

This video by Guy in a Cube explains the steps for getting the Microsoft Fabric trial.

Conclusion

Microsoft Fabric is currently in preview but already represents a significant advancement in data and analytics, offering a unified platform that brings together a wide range of tools and services. It enables a smooth, collaborative experience for many kinds of data professionals, fostering a data-driven culture within organizations. Let's wait for Microsoft's next steps.

That’s it for today!

Implementing Data Governance in Power BI: A Step-by-Step Guide

As data plays a crucial role in decision-making and data-driven insights, organizations require a robust data governance framework to manage and monitor their data assets. Power BI offers various features and tools that aid in implementing data governance and ensuring data accuracy, reliability, and security.

As data becomes increasingly critical to organizations of all sizes and industries, managing this data effectively and securely becomes just as important. A crucial aspect of data management is data governance: the practice of defining and enforcing policies, procedures, and standards for managing data. This article explores data governance basics, how to implement it in Power BI, and the advantages of using Power BI Premium.

What is Data Governance?

Data governance is the set of processes, policies, and standards organizations use to manage their data effectively. It encompasses everything from data quality and security to data privacy and retention. Effective data governance is crucial for organizations to ensure that their data is accurate, secure, and accessible. In addition, it helps organizations make informed decisions, reduce risks associated with poor data quality, and maintain compliance with legal and regulatory requirements.

How to Implement Data Governance in Power BI

Power BI provides various features and tools that help implement data governance. These include Dataflows, Datamarts, Sensitivity labels, Endorsement, Discovery, and Row-Level Security (RLS). Dataflows allow organizations to connect, clean, and transform data, while Datamarts provide a centralized data repository. Sensitivity labels help classify and protect sensitive data, while Endorsement allows organizations to enforce data quality standards. Discovery helps organizations manage, monitor, and understand their data assets, and Row-Level Security restricts which data each user can see. Let's look at each of them.

Dataflows

A dataflow is a collection of tables created and managed in workspaces in the Power BI service. A table is a set of columns used to store data, much like a table within a database. You can add and edit tables in your dataflow and manage data refresh schedules directly from the workspace in which your dataflow was created.

As data volume grows, so does the challenge of wrangling that data into well-formed, actionable information. We want data ready for analytics to populate visuals, reports, and dashboards, so we can quickly turn our volumes of data into actionable insights. With self-service data prep for big data in Power BI, you can go from data to Power BI insights with just a few actions.

When to use dataflows

Dataflows are designed to support the following scenarios:

Create reusable transformation logic that many datasets and reports inside Power BI can share. Dataflows promote the reusability of the underlying data elements, preventing the need to create separate connections with your cloud or on-premises data sources.

Expose the data in your Azure Data Lake Gen 2 storage, enabling you to connect other Azure services to the raw underlying data.

Create a single source of truth by forcing analysts to connect to the dataflows rather than connecting to the underlying systems. This single source gives you control over which data is accessed and how data is exposed to report creators. You can also map the data to industry standard definitions, enabling you to create tidy curated views, which can work with other services and products in the Power Platform.

If you want to work with large data volumes and perform ETL at scale, dataflows with Power BI Premium scale more efficiently and give you more flexibility. Dataflows support a wide range of cloud and on-premises sources.

Prevent analysts from having direct access to the underlying data source. Since report creators can build on top of dataflows, it might be more convenient for you to allow access to underlying data sources only to a few individuals and then provide access to the dataflows for analysts to build on. This approach reduces the load to the underlying systems and gives administrators finer control of when the systems get loaded from refreshes.

You can use Power BI Desktop and the Power BI service with dataflows to create datasets, reports, dashboards, and apps that use the Common Data Model. You can gain deep insights into your business activities from these resources. Dataflow refresh scheduling is managed directly from the workspace in which your dataflow was created, just like your datasets.

Click here to learn how to create a Dataflow in Power BI.
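
As a small, hedged example of how dataflows are managed per workspace, the sketch below lists the dataflows in a workspace with the Power BI REST API; the workspace id and access token are placeholders.

```python
# Hedged sketch: listing the dataflows in a workspace via the Power BI REST API.
import requests

ACCESS_TOKEN = "<aad-access-token>"   # placeholder
WORKSPACE_ID = "<workspace-guid>"     # placeholder

resp = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/dataflows",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

for df in resp.json()["value"]:
    print(df["objectId"], df["name"])   # each dataflow's id and display name
```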

Datamarts

Datamarts are self-service analytics solutions that enable users to store and explore data in a fully managed database.

When to use Datamarts

Datamarts are targeted toward interactive data workloads for self-service scenarios. For example, suppose you're working in accounting or finance. In that case, you can build your data models and collections, which you can then use to self-serve business questions and answers through T-SQL and visual query experiences. In addition, you can still use those data collections for more traditional Power BI reporting experiences. Datamarts are recommended for customers who need domain-oriented, decentralized data ownership and architecture, such as users who need data as a product or a self-service data platform.

Datamarts are designed to support the following scenarios:

Departmental self-service data: Centralize small to moderate data volumes (approximately 100 GB) in a self-service, fully managed SQL database. Datamarts enable you to designate a single store for self-service departmental downstream reporting needs (such as Excel, Power BI reports, and others), thereby reducing the infrastructure in self-service solutions.

Relational database analytics with Power BI: Access a datamart's data using external SQL clients; Azure Synapse and other services and tools that use T-SQL can also work with datamarts in Power BI (see the connection sketch after this list).

End-to-end semantic models: Enable Power BI creators to build end-to-end solutions without dependencies on other tooling or IT teams. Datamarts eliminate the need to manage orchestration between dataflows and datasets through auto-generated datasets, while providing visual experiences for querying data and ad-hoc analysis, all backed by Azure SQL DB.
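
Here is a minimal, hedged sketch of that second scenario: querying a datamart's auto-generated SQL endpoint with T-SQL from Python via pyodbc. The server name, database name, and schema/table are placeholders you would copy from the datamart's settings in the Power BI service.

```python
# Hedged sketch: querying a datamart's SQL endpoint with T-SQL via pyodbc.
# Server, database, and table names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<datamart-sql-endpoint>;"            # placeholder -- copy from datamart settings
    "Database=<datamart-name>;"                  # placeholder
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

cursor = conn.cursor()
cursor.execute("SELECT TOP 10 * FROM model.Sales")   # hypothetical schema/table
for row in cursor.fetchall():
    print(row)
conn.close()
```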

Click here if you want to know how to create a Datamart.

Sensitivity labels

A sensitivity label is a classification tag that users can apply in Power BI Desktop or the Power BI service. Labels are essentially digital stamps applied to content to classify it and to protect sensitive data when it is shared outside Power BI.

Click here if you want more information about implementing sensitivity labels.

Endorsement

Power BI provides two ways to endorse your valuable, high-quality content to increase its visibility: promotion and certification.

Promotion: Promotion is a way to highlight content you think is valuable and worthwhile for others to use. It encourages the collaborative use and spread of content within an organization. Any content owner, or any member with write permissions on the workspace where the content is located, can promote the content when they think it's good enough for sharing.

Certification: Certification means that the content meets the organization's quality standards and can be regarded as reliable, authoritative, and ready for use. Only authorized reviewers (defined by the Power BI administrator) can certify content. Content owners who wish to see their content certified but are not authorized to certify it themselves must follow their organization's guidelines for getting content certified.

Click here to learn how to endorse your content in Power BI.

Dataset Discovery

The Power BI dataset discovery hub empowers Power BI and Microsoft Teams users to discover and re-use organizational and curated datasets and answer their business questions in Power BI or Excel. The hub will empower data owners to manage their assets in a central location.

Click here to learn more about dataset discovery.

Row-Level Security (RLS)

Row-level security (RLS) in Power BI can be used to restrict data access for given users. Filters restrict data access at the row level, and you can define filters within roles. Note that in the Power BI service, members of a workspace have access to the datasets in that workspace, and RLS doesn't restrict this data access.

Click here to learn more about Row-level security.

What Is Self-Service in Power BI?

Self-service business intelligence (BI) is a data analytics approach that allows business users (e.g., business analysts, managers, and executives) to access and explore datasets without experience in BI, data mining, or statistical analysis. Users can run queries and customize data visualizations, dashboards, and reports to support real-time, data-driven decision-making.

Power BI offers robust self-service capabilities. You can tap into data from on-premises and cloud-based data sources (e.g., Dynamics 365, Salesforce, Azure SQL Data Warehouse, Excel, SharePoint), then filter, sort, analyze, and visualize the information without the help of a BI or IT team.

Using the Power Query experience, business analysts can directly ingest, transform, integrate, and enrich big data in the Power BI service. The ingested data can then be shared across Power BI models, reports, and dashboards.

How vital is Self-Service in Power BI?

In many businesses, productivity and agility suffer due to a lengthy process for BI-related data requests. For example, when Alice asks Bob a question, Bob has to wait for the BI/IT team to pull the data. This can take several weeks and multiple meetings, slowing the decision-making process.

But with Power BI self-service, Bob can quickly retrieve real-time data, and Alice can immediately drill down into relevant datasets during the first meeting. This results in a more efficient discussion and a potential solution that can be implemented immediately.

The significance of Power BI self-service goes beyond just real-time insights, collaboration, and data reuse. It helps business users develop the habit of relying on data when making decisions. Without easy access to data analytics, they may rely on instincts or experience, leading to suboptimal outcomes. But with real-time data at their fingertips, users can make data-driven decisions, establishing a pattern of data-informed decision-making.

Implementing Effective Data Governance in a Power BI Self-Service Environment

Data governance is critical when implementing a self-service culture in Power BI, as it provides a framework for defining, maintaining, and enforcing data management policies. The following are critical components of a data governance plan in Power BI:

1. Data Quality: Define data quality and accuracy standards to ensure that the data used is reliable and trustworthy.
2. Data Security: Implement security measures to ensure that sensitive data is protected and only accessible by authorized users.
3. Data Lineage: Define the lineage of the data sources used in Power BI to ensure that the data can be traced back to its source.
4. Data Ownership: Assign ownership of data sources and ensure that data owners are responsible for maintaining the accuracy of their data.
5. Data Stewardship: Designate data stewards responsible for maintaining data quality and ensuring compliance with data management policies.
6. Data Access Control: Implement access controls to ensure that only authorized users can access sensitive data.
7. Data Auditing: Implement auditing and monitoring processes to track changes to the data and ensure compliance with data management policies (a small auditing sketch follows below).

By implementing these key components, organizations can establish a strong foundation for a self-service culture in Power BI while ensuring that the data is secure, accurate, and trustworthy.
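
As a hedged example of the auditing component above, the sketch below pulls one day of Power BI activity events with the admin REST API (this requires Power BI admin permissions); the access token is a placeholder and the date range is illustrative.

```python
# Hedged sketch for the Data Auditing component: pull one UTC day of Power BI
# activity events with the admin REST API, following continuation pages.
import requests

ACCESS_TOKEN = "<aad-access-token-with-admin-scope>"   # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

url = (
    "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
    "?startDateTime='2023-06-01T00:00:00Z'&endDateTime='2023-06-01T23:59:59Z'"
)

events = []
while url:
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    body = resp.json()
    events.extend(body.get("activityEventEntities", []))
    # Follow the continuation link until the service reports no more pages.
    url = body.get("continuationUri") if body.get("continuationToken") else None

print(f"{len(events)} audit events retrieved")
```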

Maximizing Your Data Governance with Power BI Premium

From scalability to security, Power BI Premium offers a range of features that can help organizations manage their data more effectively. With dedicated capacity, IT departments can ensure consistent performance for their teams, and advanced security features help guarantee data privacy and protection. Here are ten advantages of implementing data governance with Power BI Premium:

1. Scalability: Power BI Premium can handle large amounts of data and high concurrent usage.
2. Dedicated Capacity: Dedicated resources for Power BI Premium ensure consistent performance.
3. IT Governance: IT departments can centrally manage and govern Power BI deployments.
4. Data Privacy & Security: Advanced security features ensure data privacy and protection.
5. Shared Workspaces: Teams can collaborate on data and reports in a secure environment.
6. Unrestricted Data Sources: Power BI Premium supports a broader range of data sources than Power BI Pro.
7. Dynamic Row-Level Security: Secure access to sensitive data can be managed dynamically.
8. On-Premises Data Connectivity: Power BI Premium supports connectivity to on-premises data sources.
9. Long-Term Data Retention: Power BI Premium enables organizations to retain data for extended periods.
10. Lower TCO: Power BI Premium can provide a lower total cost of ownership than purchasing individual Power BI Pro licenses.

10 Effective Strategies for Implementing Data Governance in Power BI

1. Creating Dataflows for cleaning and transforming data.
2. Implementing Sensitivity labels to classify and protect sensitive data.
3. Using Datamarts to centralize data and improve data management.
4. Enforcing data quality standards with Endorsement.
5. Monitoring data assets with Discovery.
6. Implementing data privacy and security with Power BI Premium.
7. Improving report refresh times and performance with Power BI Premium.
8. Sharing reports and dashboards with a larger audience with Power BI Premium.
9. Utilizing Power BI Premium's increased capacity for large datasets.
10. Improving collaboration and data sharing with Power BI Premium's multi-user authoring feature.

Watch this video about building a data governance plan for your Power BI environment.

Conclusion

Data governance is an essential aspect of data management, helping organizations ensure that their data is accurate, secure, and accessible. Power BI provides several features to help organizations implement data governance, including Power BI Premium, dataflows, and Datamarts. With these features, organizations can automate the collection and transformation of data, reduce the risk of manual errors, and maintain compliance with legal and regulatory requirements. Whether you're just starting to explore Power BI or are already using it to manage your data, implementing data governance is a crucial step toward effective data management.

It's also worth looking at the Power BI adoption roadmap.

Matthew Roche's blog at Microsoft is a great reference on data culture and governance, and he explains everything about dataflows here.

If you have any questions about the topics discussed in this post or need help, feel free to contact me at this link.

That's it for today!