
  • The Delta Difference: Streamlining Modern Data Management

By McClain Reese

In today’s data-driven world, where businesses are constantly exploring new frontiers, Delta Tables offer a revolutionary approach to data management, combining flexibility, consistency, and performance. Think of Delta Tables as intelligent data hubs that “remember” changes, making it easier to track updates without needing complex rework. For businesses and DataOps engineers, they’re a foundational tool for real-time data insights that keep your data strategy orbiting smoothly.

What are Delta Tables?
Delta Tables are designed to store data in a structured format optimized for analytics and updates. Built on open-source formats like Parquet, they add powerful features for tracking changes, managing transactions, and handling incremental data loads—all essential for large-scale data environments. Imagine Delta Tables as your mission control center, allowing for precise data adjustments that keep your data reliable without needing to recreate entire datasets.

Why Delta Tables Matter for Data Management
Delta Tables provide three major advantages in data management:
Data Reliability: Delta Tables prevent data loss, support rollbacks, and keep a history of changes, ensuring data integrity and accuracy.
Scalability: Capable of handling massive datasets, Delta Tables allow concurrent operations, enabling incremental data updates without conflicts—a bit like a well-coordinated space crew, always in sync.
Performance: Optimized for faster read/write operations, Delta Tables make querying large datasets quicker and more efficient, helping your team access insights light-years faster.

Delta Tables vs. Traditional Data Tables
One of the biggest differences between Delta Tables and traditional tables is their capability to track and handle data changes autonomously. Traditional tables are static, requiring manual intervention to keep data up to date. Delta Tables, on the other hand, support ACID transactions (Atomicity, Consistency, Isolation, Durability), which maintain secure, reliable updates without compromising data integrity. Delta Tables also feature time-travel queries, allowing you to view data at any point in the past. This is especially valuable in dynamic projects, like machine learning and real-time analytics, where data is constantly evolving, much like stars and planets in orbit.

Why Businesses Should Consider Delta Tables
For companies relying on real-time insights, Delta Tables simplify the complexity of managing data history, updates, and deletions. They keep data consistent, making them ideal for applications that require rapid, accurate data updates, whether for machine learning or analytics that power new discoveries.

Delta Tables’ Impact on Data Engineering
For DataOps engineers, Delta Tables simplify workflows by automating tasks that previously required complex coding, such as incremental loads and change merging. Delta Tables free up engineers to focus on discovering insights rather than maintaining complex processes. Plus, they support schema evolution, allowing data structures to adapt without disrupting workflows—like adjusting course while keeping the mission on track. In large-scale data processing, Delta Tables are like a spacecraft’s advanced storage modules, optimizing space with compression and partitioning. This setup allows DataOps teams to handle petabytes of data efficiently, reducing overhead and minimizing bottlenecks.

Real-World Scenario: Fixing Data Errors with Delta Tables
Here’s a real-world example of Delta Tables in action. A client mistakenly uploaded an incorrect CSV file into an automated sync process, causing inaccurate data updates across their system. Normally, this would require a manual rollback or complex data reprocessing. With Delta Tables, however, the team used the time-travel feature to quickly revert to a previous state (see the code sketch at the end of this post). The error was resolved in minutes, showcasing Delta Tables’ capability to instantly restore data accuracy and keep operations running smoothly.

The Wrap Up: Why Delta Tables Make a Difference
Delta Tables empower businesses to manage high-volume, dynamic data efficiently. Their ability to ensure data reliability, support large-scale operations, and enhance performance makes them essential for any modern data environment. For DataOps engineers, Delta Tables streamline workflows, allowing them to focus on what truly matters—turning data into insights and helping the business explore new horizons.

Get Looped In
Looking to launch your data strategy with Delta Tables? Get looped in with one of our experts today to explore how Delta Tables can elevate your data management and drive better business outcomes.
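As a concrete illustration of the time-travel rollback described in the scenario above, here is a minimal PySpark sketch. It assumes a delta-spark environment such as a Fabric or Databricks notebook; the table path and version number are hypothetical stand-ins for whatever your transaction log actually shows.

```python
# Minimal time-travel sketch using delta-spark; the table path and
# version numbers are hypothetical.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()
table_path = "Tables/orders"  # hypothetical lakehouse table path

# Inspect the transaction log to find the last good version
# before the bad CSV sync landed.
delta_table = DeltaTable.forPath(spark, table_path)
delta_table.history().select("version", "timestamp", "operation").show()

# Option 1: read the table as it looked at an earlier version...
good_df = (
    spark.read.format("delta")
    .option("versionAsOf", 5)  # illustrative version number
    .load(table_path)
)

# Option 2: ...or restore the live table to that version in place.
delta_table.restoreToVersion(5)
```

A restore like this writes a new commit that points back at the earlier files, so the bad load remains visible in the table history while the live table reads as if it never happened.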

  • Boosting Your Data Strategy with Orbit Architecture: A Unified Approach to Seamless Solutions

In the rapidly advancing world of data operations, the ability to manage complex, outcome-driven data solutions is key. Interloop’s Orbit Architecture offers a flexible, resilient framework that helps DataOps engineers structure data to meet specific business needs without causing unwanted disruptions. Designed to work within the gold layer of the medallion architecture, Orbit Architecture enables each solution to stay on track, reducing risk and enhancing security.

What is Orbit Architecture?
Orbit Architecture is Interloop’s unique data design pattern, crafted to organize data within the "gold" layer of the medallion architecture. This layer serves as the trusted source of curated data that drives outcome-focused solutions. Much like the “separation of concerns” concept in computer science, Orbit Architecture establishes independent data structures and models within the gold layer, tailored to each specific business outcome. Each "orbit" within the gold layer becomes a specialized mission module, delivering consumption-ready, outcome-specific data to keep systems aligned and resilient.

How It Works: A Use Case
Imagine a scenario where your team manages two key solutions: a customer health dashboard and an operational Copilot. Both rely on data tables from the gold lakehouse. Now, your engineering team receives a request to adjust an ERP data table to better support the Copilot, and once the update is completed, the Copilot runs seamlessly. However, this update causes unintended issues on the customer health dashboard, which also uses the ERP table to link customer order data to sales opportunities.

With Orbit Architecture, these disruptions are prevented. Each solution is contained within its own orbit, all connected to the gold data but insulated from one another. This structure allows each outcome to use the data model that best suits its needs—such as a dimensional model for Power BI or a graph model for an AI tool—without interfering with other data solutions.

Key Benefits of Orbit Architecture
🚀 Flexibility: Each orbit can be tailored to specific business requirements, giving DataOps engineers the freedom to use the data model (graph, dimensional, relational) that best supports each outcome, rather than forcing every solution into one mold.
🚀 Mitigated Downstream Effects: Isolated data structures mean updates in one orbit don’t impact others, reducing the risk of unintended disruptions and data inconsistencies.
🚀 Supports OneLake Ideology: Aligned with the OneLake approach, Orbit Architecture maximizes the value of a single golden version of data, allowing for analysis without duplication or data movement.
🚀 Metadata Management: Orbit Architecture supports comprehensive metadata for data lineage, quality, and governance, helping to keep all systems organized and compliant.
🚀 Security and Access Control: Robust security and access controls are available at each orbit level, allowing sensitive data to be protected without sacrificing accessibility for authorized users.

The Wrap Up: Why Orbit Architecture Matters
Orbit Architecture equips DataOps engineers with a powerful, structured framework for building flexible, outcome-specific solutions that are efficient and secure. By providing isolated data structures within the gold layer, Orbit Architecture allows engineers to mitigate downstream effects, ensure seamless collaboration, and make informed, data-driven decisions without duplication or interference.

Get Looped In
Looking to achieve more with your data? Get looped in with one of our data experts today to explore how Orbit Architecture can streamline your data systems and elevate your outcomes.
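To make the isolation idea concrete, here is a minimal sketch, assuming Spark SQL in a Fabric notebook, of what two orbits over a single gold table could look like. The schema, table, and column names are all hypothetical, and a real orbit would usually hold a richer model than a single view.

```python
# Hypothetical sketch: two outcome-specific "orbits" built as separate
# schemas over the same curated gold table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# One orbit (schema) per business outcome.
spark.sql("CREATE SCHEMA IF NOT EXISTS orbit_dashboard")
spark.sql("CREATE SCHEMA IF NOT EXISTS orbit_copilot")

# Dimensional-style shape for the customer health dashboard.
spark.sql("""
    CREATE OR REPLACE VIEW orbit_dashboard.fact_customer_orders AS
    SELECT customer_id, order_id, order_date, order_amount
    FROM gold.erp_orders
""")

# A differently shaped, text-friendly view for the operational Copilot.
# Changing this view cannot break the dashboard's orbit, because each
# outcome reads the gold data through its own structures.
spark.sql("""
    CREATE OR REPLACE VIEW orbit_copilot.order_context AS
    SELECT customer_id,
           CONCAT_WS(' | ', order_id, order_status, order_amount) AS context_text
    FROM gold.erp_orders
""")
```

Because both orbits are views over the same gold table, the data itself is never duplicated or moved, which matches the OneLake-aligned behavior described above.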

  • Interloop® Unveils Mission Control in Private Preview, Offering Speed to Solution with Data Operations Platform for Microsoft Fabric

In anticipation of Microsoft Ignite, Interloop® is thrilled to announce the private preview launch of Mission Control, the Data Operations Platform built specifically for Microsoft Fabric. Mission Control was developed for growth-minded data teams that want to take their Microsoft Fabric solutions to the next level. The platform reduces the engineering burden of routine data tasks, such as connecting to APIs, monitoring pipelines, and building data models—freeing up time to focus on the unique challenges of their organization. Put simply, Mission Control empowers teams to deliver production-ready Data Analytics & Artificial Intelligence solutions, faster—saving time & resourcing costs.

Microsoft Fabric + Mission Control = The Complete Intelligence Solution
Interloop’s mission has always been to help organizations make the most of their data, delivering quicker, more impactful insights that drive business value. With Microsoft Fabric, companies are able to create a foundation for their data strategy with an open, scalable data operating system. While this foundational toolkit is very powerful, it can also be quite complex to implement. Mission Control addresses this challenge directly, simplifying the complexities of data operations and allowing technical teams to easily consume data from operational tools, manage data artifacts, resolve issues proactively, and make insights more accessible. With this strategic partnership, we’re able to help teams accelerate their Microsoft Fabric journey so they can deliver value to the business faster. This means teams of all sizes can go from raw data to data-driven insights in a matter of weeks.

Enabling Data Teams to Outpace Business Requests
Business needs change. Fast. Whether that means yet another ad hoc request or a new report, you can never predict what requests are coming next. Data teams often get encumbered setting up and maintaining routine data infrastructure, rather than being able to focus on the things that will create a positive impact on the organization. Mission Control alleviates these challenges by providing a holistic DataOps platform that allows data teams to adapt to the constantly changing needs of the business.

Mission Control’s Key Features
Designed to support growth-minded companies, Mission Control integrates seamlessly with Microsoft Fabric, enabling teams to:
Connect - Synchronize data across 500+ external sources quickly, eliminating data silos and ensuring unified data access.
Monitor - Proactively manage data issues, minimizing disruptions and maximizing productivity.
Manage - Receive & respond to feedback from end users while documenting the strategic decisions that are made when building solutions.
Explore - Enable self-service insights for non-technical teams, putting key data insights directly on their radar.
Share - Easily share datasets and dashboards, fostering a collaborative data environment within the organization.

Built for the Future of Data-Driven Growth
With the data management market set to grow rapidly and Gartner forecasting that DataOps tools will make up 20%-40% of the space, Mission Control is poised to meet the needs of mid-market organizations looking to advance their analytics and artificial intelligence capabilities. By leveraging Mission Control, growth-minded companies get the rocket fuel needed to reduce the technical load on their teams and deliver value faster, with tools designed to grow with them.

Ready to Achieve More With Your Data?
Interloop is now offering a private preview of Mission Control. Mid-market organizations ready to strengthen their data operations can join the waitlist and be among the first to explore its potential. Join the Private Preview Waitlist to get looped in and sign up.

About Interloop
A Microsoft ISV partner, Interloop empowers teams to build and deliver data analytics & AI solutions, faster. Backed by our team of experts, we’re the premier Microsoft Fabric partner for organizations looking to achieve more. With Interloop, consider it mission accomplished.

  • Microsoft Fabric October 2024: Key Updates For Data Professionals

Here are the top 10 items from the Microsoft Fabric October 2024 Monthly Update that data professionals should be aware of:

1. API for GraphQL Support for Service Principal Names (SPNs): This new feature enhances security and simplifies authentication for data operations.
2. Lakehouse Enhancements: New sorting, filtering, and searching capabilities have been introduced, making data management more efficient.
3. KQL Queryset Update: A significant addition to KQL Queryset will revolutionize how users interact with their data, improving query efficiency and usability.
4. Free Certification Opportunity: Microsoft is offering 5,000 free DP-600 exam vouchers for the Microsoft Certified: Fabric Analytics Engineer Associate certification, available until the end of the year.
5. New Data Engineer Certification: The Microsoft Certified: Fabric Data Engineer Associate certification is now available, focusing on data ingestion, transformation, administration, monitoring, and performance optimization.
6. Copilot Enhancements: Public preview of AI-enhanced Power BI report creation with Copilot, including improved clarity and contextual awareness for building valuable reports.
7. Visual Calculations Update: Combo charts now support visual calculations, and field parameters can be used with visual calculations for more dynamic data visualization.
8. Azure Map Update: Data bound reference layers now allow for dynamic integration with business data, enhancing the interactivity and flexibility of Azure Maps.
9. Notebook Improvements: New features include automatic code generation in API for GraphQL, Git integration, deployment pipeline GA, and enhanced filtering, sorting, and searching in Lakehouse objects.
10. Real-Time Intelligence Enhancements: Integration with GitHub for real-time dashboards, and the ability to save queries to dashboards, improving real-time data visualization and management.

These updates bring significant improvements to data management, visualization, and operational efficiency in Microsoft Fabric.

  • Interloop Announces New Leadership as Company Readies Launch of Mission Control for Microsoft Fabric

Jordan Berry assumes role of CEO, Tony Berry transitions to President and Chairman as company prepares for 2025 product launch.

Charleston, SC - For Immediate Release

Interloop®, a leader in Data Operations and Data Engineering solutions, announces the appointment of Jordan Berry as Chief Executive Officer and Tony Berry as President and Chairman. This leadership transition comes at a pivotal moment as the company prepares to launch Mission Control, a groundbreaking Data Operations platform for Microsoft Fabric, in Q1 2025.

Jordan Berry, who co-founded Interloop, brings a wealth of experience and vision to the CEO role. Reflecting on his new role, Jordan shared, “I am honored and excited to step into the role of CEO at Interloop. This transition marks a significant milestone for our company as we launch Mission Control, our new Data Operations platform for Microsoft Fabric. Tony will transition to President and Chairman, focusing on enhancing our operations and delivery. I look forward to leading our team in this exciting new phase and building on the strong foundation laid by Tony. Together, we will continue to drive innovation and growth for Interloop.”

Tony Berry, who has served as CEO since Interloop’s inception, will now take on the role of President and Chairman. As Interloop looks forward to its next chapter, Tony emphasized the strategic importance of this transition: “I'm thrilled to announce my transition from CEO to President/Chairman at Interloop. Jordan’s assumption of the CEO role marks a pivotal moment for our company. This change is designed to support the launch and growth of our new Mission Control platform that is fueled by Microsoft Fabric. With Jordan at the helm, I am excited for the promising future ahead and the continued success of Interloop.”

This leadership shift positions Interloop to expand its impact in the DataOps space as Mission Control is set to redefine data management for enterprises leveraging Microsoft Fabric. Both Tony and Jordan remain committed to fostering a culture of innovation, delivering transformative solutions for clients, and driving the company’s ambitious goals forward.

As Interloop enters this next era, the company’s mission remains steadfast: empowering businesses to harness the full potential of data through advanced operations and engineering solutions. With Jordan’s vision and Tony’s strategic leadership, Interloop is poised to capitalize on the opportunities ahead and continue its trajectory of helping mid-market companies across the country achieve more with their data.

About Interloop
Serving mid-market companies nationwide, Interloop is the premier data engineering firm and Microsoft ISV partner dedicated to taking emerging entities from data-dark to insight and impact. With expertise in data engineering and DataOps solutions that integrate seamlessly with industry-leading platforms like Microsoft Fabric, Interloop empowers organizations to streamline data processes and harness actionable insights that drive business success.

Get Looped In
Ready to achieve more with your data? Contact Interloop today to schedule your consult and learn more about the upcoming release of Mission Control, projected for Q1 2025.

  • Ignite Your Data Pipelines: New Updates to Microsoft Fabric's Invoke Pipeline Activity

Author: McClain Reese

What is the Invoke Pipeline Activity?
The Invoke Pipeline activity in Microsoft Fabric is your central command for automating and orchestrating complex data workflows by "calling" one pipeline from another. Think of it as the control hub that helps you build modular and reusable pipelines, keeping your data processes orbiting smoothly across your organization.

🆕 What’s New in This Update?
With the latest update, the Invoke Pipeline activity has expanded its universe of possibilities. You can now call pipelines across multiple services, including Fabric, Azure Data Factory, and Synapse Analytics. This integration allows you to manage and streamline your data workflows across platforms like never before. Whether you’re transforming data on Azure, analyzing it in Synapse, or managing it in Fabric, this update makes your data processes faster than light.

Why Is This Update a Game-Changer?
Galactic Flexibility: By integrating services, you can now break down barriers and orchestrate workflows across the data galaxy. No matter which platform your data resides on, this update gives you the flexibility to unify it all.
Simplified Command: Forget building complicated workarounds—this update brings everything under one roof. Managing data workflows across platforms is now streamlined, reducing friction and ensuring smoother operations.
Modular Workflow Design: Create reusable, space-efficient pipeline components that can be deployed across multiple missions (or projects). This saves time, boosts efficiency, and keeps your data journey on course.

🔍 Seen the "Legacy" Label? No Cause for Cosmic Concern
If you’ve spotted the "Legacy" label on your existing Invoke Pipeline activities, don’t panic—your pipelines are still fully operational. This just means they’re part of the previous version, but your workflows won’t skip a beat. You won’t need to take any immediate action, and your legacy pipelines will continue to function as expected, without crashing into any black holes.

What’s Next?
With the new Invoke Pipeline activity, you can now explore new frontiers in data automation. By integrating services across Fabric, Azure Data Factory, and Synapse, you’ve got the tools to keep your workflows in orbit and push the boundaries of what’s possible. Ready to learn more? Check out the official documentation for all the details: Learn more about the Invoke Pipeline activity update here.

Looking to achieve more with your data? Get looped in with one of our data experts today.
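The Invoke Pipeline activity itself is configured on the pipeline canvas, but the same "call a pipeline" pattern can also be scripted from outside the canvas. The sketch below is a rough, hedged illustration using the Fabric REST API's on-demand job endpoint rather than the activity itself; the workspace ID, pipeline item ID, and token are placeholders, and the endpoint shape should be verified against current Fabric documentation.

```python
# Hedged sketch: triggering a Fabric data pipeline run via the REST API.
# IDs and the access token are placeholders; acquire a real token with
# MSAL or azure-identity in practice.
import requests

WORKSPACE_ID = "<workspace-guid>"      # hypothetical
PIPELINE_ITEM_ID = "<pipeline-guid>"   # hypothetical
TOKEN = "<aad-access-token>"           # placeholder

url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ITEM_ID}/jobs/instances?jobType=Pipeline"
)

resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"}, json={})
resp.raise_for_status()

# A 202 response means the run was accepted; the Location header points
# at the job instance that can be polled for status.
print(resp.status_code, resp.headers.get("Location"))
```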

  • Using Microsoft Power Automate to Export Power BI Tables

Author: Anna Cameron

Microsoft Power Automate, part of the Power Platform, is a robust tool designed to automate tasks across various systems and applications. Its low-code interface simplifies the automation of repetitive tasks, enhancing efficiency by integrating processes between platforms. With Power Automate, users can build custom workflows that trigger actions based on specific conditions, streamlining complex business processes. The platform supports a wide range of connectors, including Slack, Adobe, Calendly, and Google applications like Google Drive and Calendar.

In this guide, we’ll walk you through creating a daily workflow using Power Automate that queries a Power BI dataset, extracts the data, and saves it to OneDrive as a CSV file.

1. Sign in to Power Automate: Start by logging into Power Automate.
2. Create a New Flow: Go to "My flows" and choose "New flow". For this example, we'll set up a "Scheduled cloud flow" that runs automatically at a specified time. However, selecting an “Automated cloud flow” will also take you to a blank flow where you can add the “Recurrence” trigger as well.
3. Name and Schedule Your Flow: Give your flow a name and set when you want it to run. You can also set the schedule in the following step. Click "Create" to start building your flow.
4. Set the Flow's Trigger: The flow begins with a "Recurrence" trigger, which schedules when the flow's actions will occur. You can always return to this step and edit it to adjust the run schedule, if necessary.
5. Add an Action: Click the "+" button to add a new action step. Since we want to query data from Power BI, search for "Power BI" in the “Add an action” pane and then click "See more".
6. Run a Query in Power BI: Choose the action to "Run a query against a dataset".
7. Select the Workspace and Dataset: You'll then choose the workspace and dataset where you want to run the query. First, you’ll be prompted to log in to the Microsoft Power BI tenant where the report is hosted. Once connected, you will see your workspaces and datasets.
8. Write the Query: Next, you’ll need to write a query in DAX. It’s a good idea to first create this query in Power BI, so you can confirm the results before using it in the flow. Once you’re happy with the query, copy it into the “Query Text” parameter. Be sure the query can execute on its own: either add an EVALUATE statement to the beginning of the table expression, or assign the expression to a variable and EVALUATE that variable at the end of the query (a minimal example of the query shape follows at the end of this post). Hint: If the data you want to export is already in a table visual in a Power BI report, you can export the DAX for that specific visual using the Performance Analyzer in Power BI Desktop.
9. Format the Data: Add a "Select" action to format the data properly. Open the dynamic content list by selecting the lightning bolt icon. Set the “From” parameter to use the First table rows from the previous “Run a query against a dataset” step. The “Select” action in this flow is used to make sure that the column headers appear correctly in the CSV file export. Specify the column names by adding them as Keys in the “Map” parameter. Then, insert an expression by clicking the fx icon instead of the lightning bolt icon. Use the expression item()?['table_name[column_name]'] to transform the column headers pulled from the Power BI dataset. The expression will change the column header from something like “DIM_Date[Week_Start_Date]” to “Week Start Date” in the exported CSV file.
10. Create a CSV File: Add another action to "Create CSV table" and set it up to use the dynamic content Output from the previous “Select” step.
11. Save the File to OneDrive: The final step is to add an action to save the CSV file to OneDrive. You’ll be prompted to log in to OneDrive to choose the file path where the file will be saved. Set the file name and include a timestamp by using the expression utcNow(). A timestamp will be useful in managing and organizing the files generated by the recurring Power Automate flow.
12. Save and Test: Save your flow and run a test to confirm that the flow works. Thankfully, Power Automate provides a 28-day run history for each flow to track whether a run was successful or not. Clicking on one of the failed runs will open the flow and show exactly which step caused the failure. In addition, Microsoft’s new Copilot feature can aid in troubleshooting by explaining the error and providing suggestions to solve it.

Wrap Up
Microsoft Power Automate is a powerful tool that simplifies task automation, from sending emails to managing complex, multi-step workflows. Whether you’re extracting data from Power BI, saving it to SharePoint, or sending reports directly to clients, Power Automate helps streamline and automate these processes with ease. Its seamless integration across platforms not only boosts productivity but also helps businesses save valuable time, making it an essential asset for optimizing workflows.

Looking for clear, practical solutions to your data challenges? Let’s loop you in. Book your intro call with our data experts today.
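To test the query shape from step 8 outside the flow, here is a hedged Python sketch that runs the same kind of DAX against the Power BI executeQueries REST endpoint. The dataset ID, token, and table/column names are placeholders; in the flow itself, only the DAX text goes into the “Query Text” parameter.

```python
# Hedged sketch: running a step-8 style DAX query via the Power BI
# executeQueries REST endpoint. IDs, token, and names are placeholders.
import requests

DATASET_ID = "<dataset-guid>"   # hypothetical
TOKEN = "<aad-access-token>"    # placeholder; acquire via MSAL in practice

# Step 8's pattern: assign the table expression to a variable,
# then EVALUATE that variable. Table/column names are illustrative.
dax_query = """
DEFINE
    VAR _export =
        SUMMARIZECOLUMNS (
            DIM_Date[Week_Start_Date],
            "Total Sales", SUM ( FACT_Sales[Amount] )
        )
EVALUATE
    _export
"""

url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries"
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"queries": [{"query": dax_query}]},
)
resp.raise_for_status()

# Rows come back keyed like "DIM_Date[Week_Start_Date]", the same
# headers the flow's "Select" step re-maps to friendly names.
print(resp.json()["results"][0]["tables"][0]["rows"][:3])
```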

  • Navigating the AI Frontier: Copilots Chart the Course for Business Excellence

Author: Interloop Team

In the vast expanse of tech advancement, we find ourselves in a new era—one where AI copilots are not just a futuristic concept but an essential part of modern business operations. Since the launch of OpenAI's ChatGPT, the dream of having a virtual assistant capable of handling myriad tasks—responding to emails, reading documents, answering questions, and even participating in virtual meetings—has transitioned from fantasy to reality. Today, copilots are everywhere, guiding businesses through the complexities of day-to-day operations with the precision of a seasoned astronaut navigating the stars.

The scope of AI copilots reaches far and wide, encompassing tasks from summarizing emails and reviewing contracts to crafting press releases and generating code. However, while the enthusiasm for these capabilities sometimes outpaces the technology’s current limits, the undeniable truth is that Generative AI, paired with natural language processing, has revolutionized the way organizations function. For instance, sales teams leveraging generative AI have reported saving up to 10 hours per week by automating routine administrative tasks, freeing up valuable time for strategic selling and client engagement.

Yet, as business leaders navigate this new frontier, they grapple with the critical challenges of governance and control. The convenience of AI copilots can sometimes lead to inadvertent lapses in security, with team members potentially inputting sensitive client data into systems like ChatGPT. Ensuring responsible and secure AI usage is not just advisable but imperative to safeguarding both proprietary and client information.

As we explore the Generative AI landscape, three distinct categories of business copilots have emerged:
Application Copilots: These are AI-powered extensions integrated into existing tools like Office 365, Zendesk, and HubSpot, enhancing productivity within familiar platforms.
Functional Copilots: These specialized copilots serve as virtual analysts for specific business functions—whether it’s financial research, marketing, or sales. However, without access to your organization’s data, their insights may lack the depth needed for actionable recommendations.
Contextual Copilots: The next evolution in AI assistance, these custom-built copilots are designed to understand your organization's unique culture, strategy, and data. Hosted privately and tailored to your specific needs, they offer unparalleled insights and are poised to become the most valuable copilots on your journey through the AI cosmos.

As businesses continue to expand into this new AI-powered frontier, contextual copilots represent a significant opportunity. Organizations that invest in these advanced copilots will find themselves better equipped to navigate the complexities of their industry, making informed decisions with the confidence of a captain who knows every star in the galaxy.

Where do we land?
At Interloop, we think Contextual Copilots are the next evolution of copilots and where organizations stand to gain the most. We’re already seeing many ventures (and working with several) that are building their own contextual Copilots that understand the nuances of their business.

Looking for clear, practical solutions to your data challenges? Let’s loop you in. Book your intro call with our data experts today.

  • Microsoft Fabric August 2024: Key Updates For Data Professionals

The August 2024 updates for Microsoft Fabric put Copilot and Power BI at the forefront, with some additional helpful enhancements for Data Engineers and Data Scientists. Here are the top 10 updates we think you should know about:

Top 10 Microsoft Fabric Updates for August 2024
1. Copilot and AI Enhancements in Power BI: Users can now ask Copilot questions against their semantic model, improving data interaction and insights generation.
2. V-Order Behavior in Fabric Warehouses: New feature allowing management of V-Order behavior at the warehouse level, enhancing data organization and retrieval efficiency.
3. Monitor ML Experiments from the Monitor Hub: Integration of experiment items into the Monitoring Hub, facilitating easier tracking and management of machine learning experiments.
4. Modern Data Experience in Data Pipeline: Enhanced connectivity to Azure resources, streamlining data integration and pipeline management.
5. Reporting Enhancements in Power BI: Introduction of visual level format strings (preview) and dynamic per recipient subscriptions (generally available), enhancing report customization and distribution.
6. Improved Dataflow Performance: Optimizations to dataflow performance, resulting in faster data processing and reduced latency.
7. Enhanced Security Features: Introduction of new security features to protect data integrity and ensure compliance with industry standards.
8. Expanded Data Connectors: Addition of new data connectors, broadening the range of data sources that can be integrated into the Fabric platform.
9. Data Integration Improvements in Power BI: Updated Save and Upload to OneDrive Flow in Power BI, streamlining data management and sharing.
10. Embedded Analytics in Power BI: Narrative visual with Copilot is now available in SaaS embed, enhancing storytelling and data presentation capabilities.

Conclusion
The August 2024 updates will help users supercharge their dashboards and more effectively monitor Machine Learning and Copilot based solutions from within the Fabric portal. For the full updates, check out this video from the Microsoft Team:

  • Microsoft Fabric July 2024 Update: Key Highlights for Data Professionals

The July 2024 update for Microsoft Fabric brings a host of exciting enhancements and new features, aimed at empowering data professionals with more robust tools and capabilities. Here’s a concise summary of the key updates we think you should be aware of:

Enhanced Git Integration
One of the standout features in this update is the improved Git integration. Data professionals can now create and manage Git branches and connected workspaces more efficiently. This enhancement simplifies version control and collaboration, making it easier to track changes and maintain consistency across projects.

Restore-in-Place for Warehouses
The update introduces the ability to perform restore-in-place of a warehouse through the Microsoft Fabric Warehouse Editor. This feature is a game-changer for data recovery and management, allowing users to restore data without the need for complex procedures or downtime.

Real-Time Dashboards
Real-time intelligence takes a leap forward with the introduction of ultra-low refresh rates for dashboards. Dashboards can now refresh at intervals as low as 1 and 10 seconds, ensuring that data professionals always have the most current and accurate data at their fingertips. This is particularly beneficial for scenarios requiring immediate data insights and decision-making.

Browser Update Reminder for Power BI Users
A crucial reminder for Power BI users: ensure your web browser is up to date. Starting August 31, 2024, accessing Power BI on browsers older than Chrome 94, Edge 94, Safari 16.4, or Firefox 93 may result in limited functionality. Keeping your browser updated ensures you can fully leverage the latest features and improvements in Power BI.

Conclusion
The July 2024 update for Microsoft Fabric is packed with enhancements that significantly boost the capabilities of data professionals. From improved Git integration and real-time dashboards to new learning opportunities and community events, these updates are designed to streamline workflows, enhance data management, and foster professional growth. Stay tuned for more exciting developments as Microsoft continues to innovate and expand the Fabric platform. For the full update, check out this video from the Microsoft Fabric Team:


