Role assignments


In Reporting Services, role assignments determine access to stored items and to the report server itself. A role assignment has the following parts:

A securable item for which you want to control access. Examples of securable items include folders, reports, and resources.

A user or group account that Windows security or another authentication mechanism can authenticate.

A role definition that names a set of permissible tasks. Predefined role definitions include:

  • Content Manager
  • Report Builder
  • System Administrator
  • System User

Role assignments are inherited within the folder hierarchy. A role assignment defined on a folder is automatically inherited by the items that folder contains, such as reports, shared data sources, and other resources.

You can override inherited security by defining role assignments for individual items. At least one role assignment must secure all parts of the folder hierarchy. You can't create an unsecured item or manipulate settings in a way that produces an unsecured item.

The following diagram illustrates a role assignment that maps a group and a specific user to the Publisher role for Folder B.

System-level and item-level role assignments

Role-based security in Reporting Services is organized into the following levels:

Item-level role assignments control access to items in the report server folder hierarchy such as:

  • report models
  • shared data sources
  • other resources

Item-level role assignments are defined when you create a role assignment on a specific item, or on the Home folder.

System role assignments authorize operations that are scoped to the server as a whole. For example, the ability to manage jobs is a system-level operation. A system role assignment isn't the equivalent of a system administrator. It doesn't confer advanced permissions that grant full control of a report server.

A system role assignment doesn't authorize access to items in the folder hierarchy. System and item security are mutually exclusive. Sometimes, you might need to create both a system-level and item-level role assignment to provide sufficient access for a user or group.

Users and groups in role assignments

The users or group accounts that you specify in role assignments are domain accounts. The report server references, but doesn't create or manage, users and groups from a Microsoft Windows domain (or another security model if you're using a custom security extension).

Of all the role assignments that apply to any given item, no two can specify the same user or group. If a user account is also a member of a group account, and you have role assignments for both, the combined set of tasks for both role assignments is available to the user.

When you add a user to a group that already has a role assignment, you must reset Internet Information Services (IIS) for the new role assignment to take effect.

Predefined role assignments

By default, predefined role assignments are implemented that allow local administrators to manage the report server. You can add other role assignments to grant access to other users.

For more information about the predefined role assignments that provide default security, see Predefined roles.

Related content

  • Create, delete, or modify a role (Management Studio)
  • Modify or delete a role assignment (SSRS web portal)
  • Set permissions for report server items on a SharePoint site (Reporting Services in SharePoint integrated mode)
  • Grant permissions on a native mode report server


SQL Server Reporting Services Report Manager Site Permissions Error After Installation

By: Dallas Snider   |   Comments (12)   |   Related: > Reporting Services Security

After a successful installation of SQL Server Reporting Services 2012 on a personal computer such as a laptop, a permissions error is received when accessing the Report Manager site for the first time. The message typically states the following: "User DOMAIN_NAME\userID does not have required permissions. Verify that sufficient permissions have been granted and Windows User Account Control (UAC) restrictions have been addressed." This error can occur even though you, as the user, installer, and database administrator, have full administrative rights on your device.


Depending on your situation, there are two methods that will typically resolve this problem. We will start with the easier solution before moving to the more complicated one. The goal of both solutions is to get the Site Settings link to display in the top right corner of Report Manager; it is not displayed when the error message appears.

1. The first method involves starting Internet Explorer by using the Run as administrator option.


2. If Site Settings is not displayed, please skip to step 8. If Site Settings is displayed in the top right corner, click on Site Settings. Click on the Security tab on the left. Next, click on New Role Assignment.


3. On the New System Role Assignment page, enter your group or user name and then select the System Administrator checkbox. Click on OK.


4. After clicking on OK, the system role assignments will be shown.


5. Next, click on Home in the top right and then click on Folder Settings.


6. Click on New Role Assignment. Add your group or user name and then choose the desired roles to be assigned. Click on OK when finished.


7. If you have made it this far, then the problem should be resolved. Exit from Internet Explorer and reopen Internet Explorer normally without running as administrator and navigate to your local Report Manager site.
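For admins who prefer scripting, steps 2 through 4 can also be performed against the report server's SOAP API instead of the Report Manager UI. The sketch below is an untested outline, not the article's method: the endpoint URL and DOMAIN\userID are placeholders, and it assumes a native-mode SSRS 2012 instance exposing the ReportService2010 endpoint. Run it from an elevated PowerShell prompt.

```powershell
# Sketch only: grant a user the System Administrator system role without the UI.
# Placeholders: endpoint URL and DOMAIN\userID. Assumes ReportService2010.asmx.
$rs = New-WebServiceProxy -Uri "http://localhost/ReportServer/ReportService2010.asmx?wsdl" `
                          -UseDefaultCredential
$ns = $rs.GetType().Namespace   # proxy types live in an auto-generated namespace

# Build a new system policy for the user with the System Administrator role
$policy = New-Object ("$ns.Policy")
$policy.GroupUserName = "DOMAIN\userID"
$role = New-Object ("$ns.Role")
$role.Name = "System Administrator"
$policy.Roles = @($role)

# Append it to the existing system policies and write them back
$existing = @($rs.GetSystemPolicies())
$rs.SetSystemPolicies($existing + $policy)
```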


8. The second method involves temporarily changing the User Account Control (UAC) settings.


9. Change the setting from the default level to "Never notify." After clicking OK and rebooting your computer, attempt steps 1 through 7 above. Once these changes succeed, make sure to set the User Account Control settings back to the default level and reboot.


  • Read through our Reporting Services Tutorial.
  • Read through our Report Builder Tutorial.




Craig Porteous

Managing SSRS security and using PowerShell automation scripts

So much has changed with Reporting Services 2016, but in terms of security it's the same under the hood, and that's not necessarily a bad thing. SSRS has long had a robust folder- and item-level security model with the ability to inherit permissions from parent folders, much like SharePoint and Windows in general.

Managing this security model, however, can become difficult as the use of SSRS expands over years and even versions. Five folders and 40 reports quickly become 30 folders, 200 reports, and many different business units, or even clients, in the same environment. Once you introduce processes to move databases down to non-production environments, it quickly becomes a difficult task to maintain security, never mind implement any changes or improvements. I want to outline some tips that have helped me over the years and some PowerShell scripts that will save you hours of clicking!

Best Practices & tips

AD Groups reduce maintenance

It might be an obvious one, but it’s a basic rule in my opinion. Wherever possible, grant security in SSRS (& your database too) to AD groups and fill those groups with the relevant users. This gives you a single place to add/remove people, whether that’s a quarterly task or once every decade. Using AD groups may give you one more step to check “who has access to what” but it makes finding & maintaining those users significantly easier.

Keep permissions to a minimum

Reporting Services has several "out of the box" roles to choose from. If none of those fit the bill, or a user needs slightly more than Browser access (i.e. viewing data sources), don't just bump them up to full Content Manager.


By connecting to your Reporting Services instance via Management Studio (SSMS), you can view the built-in security roles. From there you can add permissions to an existing role, or create a whole new role if only a subset of users need the extra permissions.


Some further reading on SSRS roles: Role Definitions – Predefined Roles

Clean up default Permissions

You may have noticed that, by default, BUILTIN\Administrators is added as a Content Manager on the Home folder (and every folder that inherits from it!). This is great for initial setup: it allows the server admin(s) to access Report Manager and get started without any security prerequisites.


Beyond “Day 1” setup this should be removed. In the vast majority of implementations, the server admin will not be the Reporting Services admin, or there will always be people in one group who shouldn’t be in the other.

If you leave this in place, you are giving everyone who has administrator rights on the SSRS server full Content Manager access. It is best to remove this at first implementation, before your instance grows, folders get unique permissions, and it's no longer a single-click fix (though I'll give you a fast way to fix it later!).

Plan your Security Model

When implementing Reporting Services from scratch, or any new technology/app, it can be too easy to just use a select few “service accounts” for multiple functions & tasks. Usually it’s a case of “whatever gets this fixed/online the fastest”. Using a single AD account for each function within SSRS is good practice & minimises security risk.

An example of accounts used in a production environment:

  • Domain\DataAccess for stored credentials in data sources. This account doesn't need any access in SSRS or any server permissions. It may be granted db_datareader or more on the data sources it needs to access.
  • Domain\Deploy would be used to deploy content to Reporting Services. This would only need the Publisher role in SSRS. It could also be a group of senior developers or a dev manager.
  • Domain\Service is the account Reporting Services would run under. This would need the RSExecRole on the ReportServer DB (this is granted during configuration/install). This account would have no data access or Reporting Services access.
  • Domain\rsAdmins is an AD group with the admins who manage content & permissions. Generally, this group would not need data access.


Now, this level of separation isn’t always possible and in some smaller organizations a single person covers most of these functions so don’t take the above as a hard requirement. Using these separate domain accounts reduces a single point of failure caused by password lockouts & resets or compromised accounts.

PowerShell Automation

There’s a great deal of automation that can be achieved with PowerShell in Reporting Services. I’ve detailed a few scripts below specific to this security topic, but there’s an abundance of content out there for many tasks, such as deploying reports, folders, data sources, etc. Although I’ve focused primarily on native mode Reporting Services, there are also scripts that work with SharePoint integrated mode too.

Development environment security

Unlike your production environment, you may want to simplify your dev environment’s security to make it easier for developers to deploy and test without running into permissions issues. This is a good place to utilize Reporting Services’ inheritance functionality. Setting all folders to “Revert to Parent Security” makes it easy to add or remove permissions for the whole environment from the top-level folder.

If you ever need to copy down your production database, this can be a mammoth task to update. This is where PowerShell comes in handy. The following simple script will revert all subfolders in an SSRS environment to “Revert to Parent Security”.

You may need to adjust the .asmx file for different versions of SSRS though this should work just fine in 2012 onwards.
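The script itself did not survive in this copy of the article; a minimal sketch of the idea, assuming the ReportService2010 endpoint and a placeholder URL, might look like this:

```powershell
# Sketch: revert every folder below the root to "Revert to Parent Security".
# Placeholder URL; adjust the .asmx endpoint for your SSRS version.
$rs = New-WebServiceProxy -Uri "http://localhost/ReportServer/ReportService2010.asmx?wsdl" `
                          -UseDefaultCredential

# ListChildren("/", $true) walks the whole catalog recursively
$folders = $rs.ListChildren("/", $true) | Where-Object { $_.TypeName -eq "Folder" }
foreach ($folder in $folders) {
    Write-Host "Reverting $($folder.Path) to parent security"
    $rs.InheritParentSecurity($folder.Path)
}
```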

Security auditing

If you’re inheriting an existing environment, or even want to overhaul or audit your current security, the following PowerShell script will allow you to quickly output every folder’s security to CSV, allowing you to analyse erroneous permissions without searching through folders in Report Manager.
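That audit script is also missing from this copy; a hedged sketch of a CSV security dump (same placeholder endpoint as above) could be:

```powershell
# Sketch: export every folder's role assignments to CSV for auditing.
$rs = New-WebServiceProxy -Uri "http://localhost/ReportServer/ReportService2010.asmx?wsdl" `
                          -UseDefaultCredential
$inherit = $null   # receives the "inherits from parent?" flag from GetPolicies

$rows = foreach ($folder in $rs.ListChildren("/", $true) | Where-Object { $_.TypeName -eq "Folder" }) {
    foreach ($policy in $rs.GetPolicies($folder.Path, [ref]$inherit)) {
        foreach ($role in $policy.Roles) {
            [pscustomobject]@{
                Folder    = $folder.Path
                GroupUser = $policy.GroupUserName
                Role      = $role.Name
                Inherited = $inherit
            }
        }
    }
}
$rows | Export-Csv -Path "SsrsSecurityAudit.csv" -NoTypeInformation
```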

Targeted Changes

Following a security review, you may want to add or remove a single AD account/group across every folder in your environment. There may be many occasions that call for such a blanket change. Again, this would normally be a laboriously manual task without PowerShell. These little snippets show how it can be done & you can always edit these to target a specific folder (& all its sub-folders).
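The snippets themselves are absent from this copy; one plausible shape for the "add" case (group name, role, and URL are placeholders) is:

```powershell
# Sketch: grant DOMAIN\NewGroup the Browser role on every folder below the root.
$rs = New-WebServiceProxy -Uri "http://localhost/ReportServer/ReportService2010.asmx?wsdl" `
                          -UseDefaultCredential
$ns = $rs.GetType().Namespace
$inherit = $null

foreach ($folder in $rs.ListChildren("/", $true) | Where-Object { $_.TypeName -eq "Folder" }) {
    $policies = @($rs.GetPolicies($folder.Path, [ref]$inherit))
    if ($policies.GroupUserName -notcontains "DOMAIN\NewGroup") {
        $policy = New-Object ("$ns.Policy")
        $policy.GroupUserName = "DOMAIN\NewGroup"
        $role = New-Object ("$ns.Role")
        $role.Name = "Browser"
        $policy.Roles = @($role)
        # Note: SetPolicies breaks inheritance on folders that still inherit
        $rs.SetPolicies($folder.Path, $policies + $policy)
    }
}
```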

You can then use the following script to remove a user/group or reverse the change made in the last script.
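Again, the original snippet is missing here; the removal case might be sketched as follows (placeholders as before):

```powershell
# Sketch: strip DOMAIN\OldGroup from every folder's role assignments.
$rs = New-WebServiceProxy -Uri "http://localhost/ReportServer/ReportService2010.asmx?wsdl" `
                          -UseDefaultCredential
$inherit = $null

foreach ($folder in $rs.ListChildren("/", $true) | Where-Object { $_.TypeName -eq "Folder" }) {
    $policies = @($rs.GetPolicies($folder.Path, [ref]$inherit))
    $kept = @($policies | Where-Object { $_.GroupUserName -ne "DOMAIN\OldGroup" })
    if ($kept.Count -lt $policies.Count) {
        $rs.SetPolicies($folder.Path, $kept)
    }
}
```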

NOTE: The above targeted scripts won’t add or remove users or groups from the top level folder. This process can be easily added, though I’ve omitted it to reduce the risk of removing an admin user/group from the entire site and in the case of adding users, I’ve worked with RS instances where multiple clients share a single instance and only admin accounts have access to the top level “home”.

Love PowerShell!

I hope I’ve provided a few examples of security practices in SSRS and some basic PowerShell scripts to automate administration of security in Reporting Services. You can build upon these scripts to do more advanced tasks such as setting instance-wide security from an input file (good for refreshing other environments from production backups).

I know there is a lot of good work going into PowerShell for DBA tasks over at dbatools.io that shows PowerShell is something you want on your tool belt!

Microsoft also put together a bunch of PowerShell scripts for Reporting Services late last year. You can find the article: Community contributions to the PowerShell scripts for Reporting Services & the scripts are on GitHub here: ReportingServicesTools

For SSRS documentation, consider ApexSQL Doc , a tool that documents reports (*.rdl), shared datasets (*.rsd), shared data sources (*.rds) and projects (*.rptproj) from the file system and web services (native and SharePoint) in different output formats.

  • SSRS Roles: Role Definitions – Predefined Roles
  • DBATools page
  • Microsoft SSRS PowerShell Tools: Community contributions to the PowerShell scripts for Reporting Services
  • GitHub link: Reporting Services Powershell Tools

Microsoft Dynamics GP | Managing SSRS Security Roles


Article by: Eduardo Haro – Business Analyst

When people use SSRS for their reporting needs, we often get requests on how to set up or manage security roles. We have put together this quick guide detailing the steps to edit the security role of either an entire folder or a single report.

1. Within SSRS, click the Site Settings icon located in the top right-hand corner of the page.

2. Navigate to the Security tab and click New Role Assignment.

3. Enter the Group Name or User name in the format "Domain\username or group name." Then click OK.

4. Navigate to the SSRS report or folder whose security role needs to be changed.

5. Hover over the report with the cursor and click the yellow arrow to the right of the report name. If you would like to apply the same security settings to all reports within a folder, click the folder settings icon instead and skip to step 7.

6. Click Manage.

7. Navigate to the Security tab and click Edit Item Security.

8. Enter the Group Name or User name, in the format "Domain\username or group name," of whoever should be granted access. Also, make sure to select the desired security role from the option set below. Then click OK.

SSRS Security Configuration

  • Report Manager General Overview
  • Create User Groups
  • Reporting Server Predefined Roles
  • Create New Role
  • Update Site Settings
  • Change Security for Preinstalled Folders in Report Manager

  • Create and Edit Folders in Report Manager

Report Manager can be used to perform the following tasks

  • View, search, and subscribe to reports
  • Create and manage folders, linked reports, report history, schedules, data source connections, and subscriptions
  • Set properties and report parameters
  • Manage role definitions and assignments that control user access to reports and folders

You can access items that are stored in a report server by navigating the folder hierarchy and clicking on items that you want to view or update, provided that you have permissions to do that. The ability to perform a task in Report Manager depends on the user role assignment. A user who is assigned to a role that has full permissions, such as a report server administrator, has access to the complete set of application menus and pages. A user assigned to a role that has permissions to view and run reports, on the other hand, sees only the menus and pages that support those activities.

Users can be assigned to multiple roles. Each user can have different role assignments for different report servers, or even for the various reports and folders that are stored on a single server. For more information, refer to the Microsoft Report Manager help >>.

Provided that the suggested security settings are applied and the recommended path to access reports is used, an end user, i.e., a report viewer, will access Published Reports and Dashboards from IFS EE and not from Report Manager. Thus a report viewer will not have access to these folders in Report Manager, so any change in these folders will only affect the Report Publisher user group.

Typically there are three user groups that use the External Report Integration within IFS Business Reporting & Analysis. These are Report Administrators, Report Publishers and Report Viewers. A Report Administrator has the overall permission and can administrate and manage everything related to Reporting Services. Report Publishers are power users that create reports that are viewed by other users. This user group stores reports in the Report Manager Published Reports and Dashboards folders. Report Publishers also have permissions to configure and manage these folders. Report Viewers are end users that have access to view published reports and dashboards. This user category can also work with ad hoc reporting in Report Builder, create their own reports and save them in their My Report personal folders to which only they have access. A Report Viewer can manage his own My Report folder but has no permission to administrate any of the other report folders in Report Manager.

In Report Manager you define which user group should have access to respective folder. Access can be given to a single user or to a user group with multiple users.


Note: Windows User groups cannot be used when using IFS provided SQL Server Reporting Services Extensions . Rights for Reporting Server contents should be granted for each user individually. (Reporting Server user groups can be granted to a user).

It is advisable to give access to Report Manager folders to user groups instead of single users, since that will facilitate Report Manager folder administration. The user group must be a valid domain account on the network, and it is recommended to create user groups for Report Administrators, Report Publishers and Report Viewers. For information on how to create user groups, check your Windows security documentation. The user groups will then be connected to the Report Manager folders, as further described in the Configuration of Report Services >>.

Add domain users to the user groups according to the functionality that they need to access.

Reporting Services installs with predefined roles that you can use to grant access to report server operations. Each predefined role describes a collection of related tasks. Use SQL Server Management Studio to view the set of tasks that each role supports. We recommend that the predefined roles and their related tasks are kept unchanged. Refer to the Microsoft documentation for more information on the pre-defined roles in Reporting Services >>.

The predefined roles that will be used in IFS Reporting are: Content Manager, Publisher, Browser and Report Builder. In addition, we need a new role with permissions only to view reports. This is described in the next section.

For IFS Reporting we need one additional role on the Reporting Server. This new role should only have access to view reports and should be attached to the Report Viewer user group for the Dashboards and Published Reports folders. This is further described on the Configuration Report Manager page.

To create a new role:

  • Open SQL Server Management Studio and connect to your server.
  • Expand the Report Server node.
  • Expand the Security folder.
  • Right-click on Roles and then click New Role.
  • Type the name Viewer for the role.
  • Type a description, e.g., May view reports.
  • Select only the task View Reports.
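As a non-authoritative aside, the same Viewer role could likely be created programmatically through the ReportService2010 SOAP endpoint rather than the SSMS dialog. The sketch below assumes ListTasks accepts an "All" scope and that the task is named "View reports" on your server; the URL is a placeholder:

```powershell
# Sketch: create the Viewer role via the SOAP API (URL is a placeholder).
$rs = New-WebServiceProxy -Uri "http://localhost/ReportServer/ReportService2010.asmx?wsdl" `
                          -UseDefaultCredential

# Look up the task ID for "View reports"; assumption: ListTasks("All") is valid here
$taskIds = @($rs.ListTasks("All") |
             Where-Object { $_.Name -eq "View reports" } |
             ForEach-Object { $_.TaskID })

$rs.CreateRole("Viewer", "May view reports", $taskIds)
```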

The site setting security page controls access to the report server site. System role assignments exist outside of the scope of the report server namespace or folder hierarchy. Operations that are supported through system role assignments include creating and using shared schedules, using Report Builder, and setting default values for some server features.

A default system role assignment is created when the report server is installed. This system role assignment grants local system administrators permissions to manage the report server environment. All other users who require access to Report Builder must also be assigned to a system role. This implies that all user groups, Report Administrators, Report Publishers and Report Viewers, must be assigned to a system role to be able to use Report Builder. A system administrator must execute the steps below.

Edit site security settings for user groups

  • Click the Site Settings link and select Security.
  • Click New Role Assignment.
  • Enter the group ReportAdministrator and select the System Administrator role.
  • Repeat the last two steps for ReportPublisher and ReportViewer, but select the System User role for these user groups.

The contents page in Report Manager shows the items that you have permission to view. Depending on the permissions you have, you may also be able to move, delete, and add items.

The security settings that are set on the root folder, Home, are inherited by subfolders. During Reporting Services installation, the Administrator is given the Content Manager role.

If you have created the user groups according to the security description, you should update the security settings on the Home folder as follows:

To add a New Role Assignment

  • In the root folder, click Folder Settings.
  • Click New Role Assignment to open the New Role Assignment page, which is used to create additional role assignments for the current folder.
  • Type the name of the group account for which the role assignment is being created, e.g., Report Viewer. The group must be a valid Windows domain account, entered in the format <domain>\<account>.
  • Select the role(s) that the respective user group should have, e.g., for Report Publisher select Publisher, Browser and Report Builder.
  • Repeat the steps above, from New Role Assignment onward, for each user group.

Edit Folders in Report Manager

The role assignments for the newly created folders will be inherited from the parent level, which implies that Report Publishers will be connected to the Publisher, Browser and Report Builder roles and Report Viewers will be connected to the Browser role. However, the role assignment for Report Viewers should be changed from Browser to Viewer. The reason for this is that Report Viewers should access Published Reports and Dashboards from IFS Applications and not from Report Manager.

Edit Role Assignment for the New Folders

  • In the Dashboard folder, click Folder Settings and then open the Security page.
  • Click Edit Item Security. You will get a message saying that the item security is inherited from a parent item, and a confirmation asking whether you want to apply different security settings. Click OK.
  • Click the Edit link for the Report Viewer user group.
  • Deselect the Browser role and select the Viewer role instead.
  • Click Apply.
  • Repeat these steps for the Published Reports folder.


Microsoft Fabric Updates Blog

Microsoft Fabric May 2024 Update

  • Monthly Update


Welcome to the May 2024 update.  

Here are a few select highlights of the many updates we have for Fabric. You can now ask Copilot questions about data in your model, Model Explorer and authoring calculation groups in Power BI Desktop are now generally available, and Real-Time Intelligence provides a complete end-to-end solution for ingesting, processing, analyzing, visualizing, monitoring, and acting on events.

There is much more to explore, please continue to read on. 

Microsoft Build Announcements

At Microsoft Build 2024, we are thrilled to announce a huge array of innovations coming to the Microsoft Fabric platform that will make Microsoft Fabric’s capabilities even more robust and even customizable to meet the unique needs of each organization. To learn more about these changes, read the “ Unlock real-time insights with AI-powered analytics in Microsoft Fabric ” announcement blog by Arun Ulag.

Fabric Roadmap Update

Last October at the Microsoft Power Platform Community Conference we  announced the release of the Microsoft Fabric Roadmap . Today we have updated that roadmap to include the next semester of Fabric innovations. As promised, we have merged Power BI into this roadmap to give you a single, unified road map for all of Microsoft Fabric. You can find the Fabric Roadmap at  https://aka.ms/FabricRoadmap .

We will continue updating the roadmap over the coming year and would love to hear your recommendations on ways that we can make this experience better for you. Please submit suggestions at https://aka.ms/FabricIdeas.

Earn a discount on your Microsoft Fabric certification exam!  

We’d like to thank the thousands of you who completed the Fabric AI Skills Challenge and earned a free voucher for Exam DP-600 which leads to the Fabric Analytics Engineer Associate certification.   

If you earned a free voucher, you can find redemption instructions in your email. We recommend that you schedule your exam now, before your discount voucher expires on June 24th. All exams must be scheduled and completed by this date.

If you need a little more help with exam prep, visit the Fabric Career Hub which has expert-led training, exam crams, practice tests and more.  

Missed the Fabric AI Skills Challenge? We have you covered. For a limited time, you can earn a 50% exam discount by taking the Fabric 30 Days to Learn It Challenge.

  • Modern Tooltip now on by Default
  • Matrix layouts
  • Line updates
  • On-object interaction updates
  • Publish to folders in public preview
  • You can now ask Copilot questions about data in your model (preview)
  • Announcing general availability of DAX query view
  • Copilot to write and explain DAX queries in DAX query view public preview updates
  • New Manage relationships dialog
  • Refreshing calculated columns and calculated tables referencing DirectQuery sources with single sign-on
  • Announcing general availability of Model Explorer and authoring calculation groups in Power BI Desktop
  • Microsoft Entra ID SSO support for Oracle database
  • Certified connector updates
  • View reports in OneDrive and SharePoint with live connected semantic models
  • Storytelling in PowerPoint – Image mode in the Power BI add-in for PowerPoint
  • Storytelling in PowerPoint – Data updated notification
  • Git integration support for Direct Lake semantic models

  • Editor’s pick of the quarter 
  • New visuals in AppSource 
  • Financial Reporting Matrix by Profitbase 
  • Horizon Chart by Powerviz 
  • Milestone Trend Analysis Chart by Nova Silva 
  • Sunburst Chart by Powerviz 
  • Stacked Bar Chart with Line by JTA

Fabric Automation

  • Streamlining Fabric Admin APIs 
  • Microsoft Fabric Workload Development Kit 
  • External data sharing 
  • APIs for OneLake data access roles 
  • Shortcuts to on-premises and network-restricted data 
  • Copilot for Data Warehouse

  • Unlocking Insights through Time: Time travel in Data warehouse 
  • Copy Into enhancements

  • Faster workspace resource assignment powered by just-in-time database attachment 
  • Runtime 1.3 (Apache Spark 3.5, Delta Lake 3.1, R 4.3.3, Python 3.11) – public preview 
  • Native execution engine for Fabric Runtime 1.2 (Apache Spark 3.4) – public preview 
  • Spark run series analysis 
  • Comment @tagging in notebook 
  • Notebook ribbon upgrade 
  • Notebook metadata update notification 
  • Environment is GA now 
  • REST API support for workspace data engineering/science settings 
  • Fabric user data functions (private preview) 
  • Introducing API for GraphQL in Microsoft Fabric (preview) 
  • Copilot will be enabled by default 
  • The AI and Copilot setting will be automatically delegated to capacity admins 
  • Abuse monitoring no longer stores your data 
  • Real-Time hub 
  • Source from Real-Time hub in enhanced Eventstream 
  • Use Real-Time hub to get data in KQL database in Eventhouse 
  • Get data from Real-Time hub within Reflexes 
  • Eventstream edit and live modes 
  • Default and derived streams 
  • Route streams based on content in enhanced Eventstream 
  • Eventhouse is now generally available 
  • Eventhouse OneLake availability is now generally available 
  • Create a database shortcut to another KQL database 
  • Support for AI Anomaly Detector 
  • Copilot for Real-Time Intelligence 
  • Eventhouse tenant-level private endpoint support 
  • Visualize data with Real-Time Dashboards 
  • New experience for data exploration 
  • Create triggers from Real-Time hub 
  • Set alert on Real-Time Dashboards 
  • Taking action through Fabric items 
  • General availability of the Power Query SDK for VS Code 
  • Refresh the refresh history dialog 
  • Introducing data workflows in Data Factory 
  • Introducing Trusted Workspace Access in Fabric data pipelines

  • Introducing Blob Storage Event Triggers for Data Pipelines
  • Parent/child pipeline pattern monitoring improvements

  • Fabric Spark job definition activity now available 
  • HDInsight activity now available 
  • Modern Get Data experience in Data Pipeline

Power BI tooltips are embarking on an evolution to enhance their functionality. To lay the groundwork, we are introducing the modern tooltip as the new default, a feature that many users may already recognize from its previous preview status. This change is more than just an upgrade; it’s the first step in a series of improvements that promise to revolutionize tooltip management and customization. As we prepare for the general availability of the modern tooltip, this is an excellent opportunity for users to become familiar with its features and capabilities. 


Discover the full potential of the new tooltip feature by visiting our dedicated blog . Dive into the details and explore the comprehensive vision we’ve crafted for tooltips, designed to enhance your Power BI experience. 

We’ve listened to our community’s feedback on improving our tabular visuals (Table and Matrix), and we’re excited to initiate their transformation. Drawing inspiration from the familiar PivotTable in Excel, we aim to build new features and capabilities upon a stronger foundation. In our May update, we’re introducing ‘Layouts for Matrix.’ Now, you can select from compact, outline, or tabular layouts to alter the arrangement of components in a manner akin to Excel. 


As an extension of the new layout options, report creators can now craft custom layout patterns by repeating row headers. This powerful control, inspired by Excel’s PivotTable layout, enables the creation of a matrix that closely resembles the look and feel of a table. This enhancement not only provides greater flexibility but also brings a touch of Excel’s intuitive design to Power BI’s matrix visuals. This option is only available for the Outline and Tabular layouts.


To further align with Excel’s functionality, report creators now have the option to insert blank rows within the matrix. This feature allows for the separation of higher-level row header categories, significantly enhancing the readability of the report. It’s a thoughtful addition that brings a new level of clarity and organization to Power BI’s matrix visuals and opens a path for future enhancements for totals/subtotals and rows/column headers. 


We understand your eagerness to delve deeper into the matrix layouts and grasp how these enhancements fulfill the highly requested features by our community. Find out more and join the conversation in our dedicated blog , where we unravel the details and share the community-driven vision behind these improvements. 

Following last month’s introduction of the initial line enhancements, May brings a groundbreaking set of line capabilities that are set to transform your Power BI experience: 

  • Hide/Show lines: Gain control over the visibility of your lines for a cleaner, more focused report. 
  • Customized line pattern: Tailor the pattern of your lines to match the style and context of your data. 
  • Auto-scaled line pattern: Ensure your line patterns scale perfectly with your data, maintaining consistency and clarity. 
  • Line dash cap: Customize the end caps of your customized dashed lines for a polished, professional look. 
  • Line upgrades across other line types: Experience improvements in reference lines, forecast lines, leader lines, small multiple gridlines, and the new card’s divider line. 

These enhancements are not to be missed. We recommend visiting our dedicated blog for an in-depth exploration of all the new capabilities added to lines, keeping you informed and up to date. 

This May release, we’re excited to introduce on-object formatting support for Small multiples , Waterfall , and Matrix visuals. This new feature allows users to interact directly with these visuals for a more intuitive and efficient formatting experience. By double-clicking on any of these visuals, users can now right-click on the specific visual component they wish to format, bringing up a convenient mini-toolbar. This streamlined approach not only saves time but also enhances the user’s ability to customize and refine their reports with ease. 


We’re also thrilled to announce a significant enhancement to the mobile reporting experience with the introduction of the pane manager for the mobile layout view. This innovative feature empowers users to effortlessly open and close panels via a dedicated menu, streamlining the design process of mobile reports. 


We recently announced a public preview for folders in workspaces, allowing you to create a hierarchical structure for organizing and managing your items. In the latest Desktop release, you can now publish your reports to specific folders in your workspace.  

When you publish a report, you can choose the specific workspace and folder for your report. The interface is simple and easy to understand, making organizing your Power BI content from Desktop better than ever. 


To publish reports to specific folders in the service, make sure the “Publish dialogs support folder selection” setting is enabled in the Preview features tab in the Options menu. 


Learn more about folders in workspaces.   

We’re excited to preview a new capability for Power BI Copilot allowing you to ask questions about the data in your model! You could already ask questions about the data present in the visuals on your report pages – and now you can go deeper by getting answers directly from the underlying model. Just ask questions about your data, and if the answer isn’t already on your report, Copilot will then query your model for the data instead and return the answer to your question in the form of a visual! 


We’re starting this capability off in both Edit and View modes in Power BI Service. Because this is a preview feature, you’ll need to enable it via the preview toggle in the Copilot pane. You can learn more about all the details of the feature in our announcement post here! (will link to announcement post)  

We are excited to announce the general availability of DAX query view. DAX query view is the fourth view in Power BI Desktop, and it lets you run DAX queries on your semantic model.  

DAX query view comes with several ways to help you be as productive as possible with DAX queries. 

  • Quick queries. Have the DAX query written for you from the context menu of tables, columns, or measures in the Data pane of DAX query view. Get the top 100 rows of a table, statistics of a column, or DAX formula of a measure to edit and validate in just a couple clicks! 
  • DirectQuery model authors can also use DAX query view. View the data in your tables whenever you want! 
  • Create and edit measures. Edit one or multiple measures at once. Make changes and see the change in action in a DAX query. Then update the model when you are ready. All in DAX query view! 
  • See the DAX query of visuals. Investigate a visual’s DAX query in DAX query view. Go to the Performance Analyzer pane and choose “Run in DAX query view”. 
  • Write DAX queries. You can create DAX queries with Intellisense, formatting, commenting/uncommenting, and syntax highlighting. And additional professional code editing experiences such as “Change all occurrences” and block folding to expand and collapse sections. Even expanded find and replace options with regex. 
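
To make the “quick queries” idea above concrete, here is a minimal Python sketch of the kind of DAX that DAX query view generates for you. The helper names are hypothetical (not part of any Power BI API); only the generated DAX syntax (`EVALUATE`, `TOPN`, `DEFINE MEASURE`) is standard.

```python
# Hypothetical helpers illustrating the DAX that "quick queries" produce.
def top_n_rows_query(table: str, n: int = 100) -> str:
    """DAX query returning the top n rows of a table (as from the context menu)."""
    return f"EVALUATE TOPN({n}, '{table}')"

def define_measure_query(table: str, measure: str, expression: str) -> str:
    """DAX query that defines a measure inline and evaluates it, mirroring how
    DAX query view lets you edit and validate a measure before updating the model."""
    return (
        f"DEFINE MEASURE '{table}'[{measure}] = {expression}\n"
        f"EVALUATE ROW(\"{measure}\", [{measure}])"
    )

print(top_n_rows_query("Sales"))
print(define_measure_query("Sales", "Total Sales", "SUM(Sales[Amount])"))
```

Running the generated text in DAX query view (or against the model) returns the table preview or the evaluated measure, without touching the model until you choose to update it.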

Learn more about DAX query view with these resources: 

  • Deep dive blog: https://powerbi.microsoft.com/blog/deep-dive-into-dax-query-view-and-writing-dax-queries/  
  • Learn more: https://learn.microsoft.com/power-bi/transform-model/dax-query-view  
  • Video: https://youtu.be/oPGGYLKhTOA?si=YKUp1j8GoHHsqdZo  

DAX query view includes an inline Fabric Copilot to write and explain DAX queries, which remains in public preview. This month we have made the following updates. 

1. Run the DAX query before you keep it. Previously, the Run button was disabled until the generated DAX query was accepted or Copilot was closed. Now you can Run the DAX query, then decide to Keep or Discard it. 


2. Conversationally build the DAX query. Previously, additional prompts did not take the generated DAX query into account; you had to keep the DAX query, select it again, and use Copilot again to adjust it. Now you can adjust it simply by typing additional user prompts.   


3. Syntax checks on the generated DAX query. Previously, there was no syntax check before the generated DAX query was returned. Now the syntax is checked, and the prompt is automatically retried once. If the retry is also invalid, the generated DAX query is returned with a note that there is an issue, giving you the option to rephrase your request or fix the generated DAX query. 
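
The generate-check-retry flow described above can be sketched in a few lines. This is purely illustrative (not the actual Copilot implementation); `generate` and `is_valid` stand in for the model call and the syntax checker.

```python
# Illustrative sketch of "check syntax, retry the prompt once, then flag":
from typing import Callable, Tuple

def generate_with_syntax_check(
    generate: Callable[[str], str],   # prompt -> candidate DAX query (stand-in)
    is_valid: Callable[[str], bool],  # stand-in syntax checker
    prompt: str,
) -> Tuple[str, bool]:
    """Return (query, ok). The prompt is retried once before giving up."""
    for _attempt in range(2):         # initial try + one automatic retry
        query = generate(prompt)
        if is_valid(query):
            return query, True
    # Both attempts invalid: return the last query flagged as having an issue,
    # so the user can rephrase the request or fix the query by hand.
    return query, False

# Stub demo: the first generation is invalid, the retry succeeds.
attempts = iter(["EVALUTE Sales", "EVALUATE 'Sales'"])
query, ok = generate_with_syntax_check(
    generate=lambda p: next(attempts),
    is_valid=lambda q: q.startswith("EVALUATE"),
    prompt="show me the Sales table",
)
print(query, ok)
```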


4. Inspire buttons to get you started with Copilot. Previously nothing happened until a prompt was entered. Now click any of these buttons to quickly see what you can do with Copilot! 


Learn more about DAX queries with Copilot with these resources: 

  • Deep dive blog: https://powerbi.microsoft.com/en-us/blog/deep-dive-into-dax-query-view-with-copilot/  
  • Learn more: https://learn.microsoft.com/en-us/dax/dax-copilot  
  • Video: https://www.youtube.com/watch?v=0kE3TE34oLM  

We are excited to introduce you to the redesigned ‘Manage relationships’ dialog in Power BI Desktop! To open this dialog simply select the ‘Manage relationships’ button in the modeling ribbon.


Once opened, you’ll find a comprehensive view of all your relationships, along with their key properties, all in one convenient location. From here you can create new relationships or edit an existing one.


Additionally, you have the option to filter and focus on specific relationships in your model based on cardinality and cross filter direction. 


Learn more about creating and managing relationships in Power BI Desktop in our documentation . 

Ever since we released composite models on Power BI semantic models and Analysis Services , you have been asking us to support the refresh of calculated columns and tables in the Service. This month, we have enabled the refresh of calculated columns and tables in Service for any DirectQuery source that uses single sign-on authentication. This includes the sources you use when working with composite models on Power BI semantic models and Analysis Services.  

Previously, the refresh of a semantic model that uses a DirectQuery source with single-sign-on authentication failed with one of the following error messages: “Refresh is not supported for datasets with a calculated table or calculated column that depends on a table which references Analysis Services using DirectQuery.” or “Refresh over a dataset with a calculated table or a calculated column which references a Direct Query data source is not supported.” 

Starting today, you can successfully refresh calculated tables and calculated columns in a semantic model in the Service using specific credentials, as long as: 

  • You used a shareable cloud connection and assigned it to the data source, and/or 
  • You enabled granular access control for all data connection types.

Here’s how to do this: 

  • Create and publish your semantic model that uses a single sign-on DirectQuery source. This can be a composite model but doesn’t have to be. 
  • In the semantic model settings, under Gateway and cloud connections , map each single sign-on DirectQuery connection to a specific connection. If you don’t have a specific connection yet, select ‘Create a connection’ to create it: 


  • If you are creating a new connection, fill out the connection details and click Create, making sure to select ‘Use SSO via Azure AD for DirectQuery queries’: 


  • Finally, select the connection for each single sign-on DirectQuery source and select Apply : 


Finally, either refresh the semantic model manually or set up a scheduled refresh to confirm that the refresh now works successfully. Congratulations, you have successfully set up refresh for semantic models with a single sign-on DirectQuery connection that uses calculated columns or calculated tables!
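
If you prefer to trigger that confirmation refresh programmatically, the Power BI REST API exposes a refresh endpoint (`POST .../datasets/{datasetId}/refreshes`). The sketch below only builds the request pieces; actually sending it requires an Azure AD access token with dataset write permissions, and the workspace/dataset IDs here are placeholders.

```python
# Hedged sketch: assemble a manual-refresh request for the Power BI REST API.
def build_refresh_request(workspace_id: str, dataset_id: str, token: str):
    url = (
        "https://api.powerbi.com/v1.0/myorg/"
        f"groups/{workspace_id}/datasets/{dataset_id}/refreshes"
    )
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    # notifyOption is optional; MailOnFailure requests an email if the refresh fails.
    body = {"notifyOption": "MailOnFailure"}
    return url, headers, body

url, headers, body = build_refresh_request("ws-123", "ds-456", "<token>")
print(url)
```

You would then POST this with any HTTP client and poll the same endpoint to check the refresh status.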

We are excited to announce the general availability of Model Explorer in the Model view of Power BI, including the authoring of calculation groups. Semantic modeling is even easier with an at-a-glance tree view with item counts, search, and in-context paths to edit the semantic model items with Model Explorer. Top-level semantic model properties are also available, as well as the option to quickly create relationships in the properties pane. Additionally, the styling for the Data pane is updated to Fluent UI, the design system also used in Office and Teams.  

A popular community request from the Ideas forum, authoring calculation groups is also included in Model Explorer. Calculation groups significantly reduce the number of redundant measures by allowing you to define DAX formulas as calculation items that can be applied to existing measures. For example, define a year-over-year, prior-month, conversion, or any other calculation your report needs once as a calculation item and reuse it with existing measures. This reduces the number of measures you need to create and makes the business logic simpler to maintain.  

Available in both Power BI Desktop and when editing a semantic model in the workspace, take your semantic model authoring to the next level today!  


Learn more about Model Explorer and authoring calculation groups with these resources: 

  • Use Model explorer in Power BI (preview) – Power BI | Microsoft Learn  
  • Create calculation groups in Power BI (preview) – Power BI | Microsoft Learn  

Data connectivity  

We’re happy to announce that the Oracle database connector has been enhanced this month with the addition of Single Sign-On support in the Power BI service with Microsoft Entra ID authentication.  

Microsoft Entra ID SSO enables single sign-on to access data sources that rely on Microsoft Entra ID based authentication. When you configure Microsoft Entra SSO for an applicable data source, queries run under the Microsoft Entra identity of the user that interacts with the Power BI report. 


We’re pleased to announce the new and updated connectors in this release:   

  • [New] OneStream : The OneStream Power BI Connector enables you to seamlessly connect Power BI to your OneStream applications by simply logging in with your OneStream credentials. The connector uses your OneStream security, allowing you to access only the data you have based on your permissions within the OneStream application. Use the connector to pull cube and relational data along with metadata members, including all their properties. Visit OneStream Power BI Connector to learn more. Find this connector in the Other category. 
  • [New] Zendesk Data : A new connector developed by the Zendesk team that aims to go beyond the functionality of the existing Zendesk legacy connector created by Microsoft. Learn more about what this new connector brings. 
  • [New] CCH Tagetik 
  • [Update] Azure Databricks  

Are you interested in creating your own connector and publishing it for your customers? Learn more about the Power Query SDK and the Connector Certification program .   

Last May, we announced the integration between Power BI and OneDrive and SharePoint. Previously, this capability was limited to only reports with data in import mode. We’re excited to announce that you can now seamlessly view Power BI reports with live connected data directly in OneDrive and SharePoint! 

When working on Power BI Desktop with a report live connected to a semantic model in the service, you can easily share a link to collaborate with others on your team and allow them to quickly view the report in their browser. We’ve made it easier than ever to access the latest data updates without ever leaving your familiar OneDrive and SharePoint environments. This integration streamlines your workflows and allows you to access reports within the platforms you already use. With collaboration at the heart of this improvement, teams can work together more effectively to make informed decisions by leveraging live connected semantic models without being limited to data only in import mode.  

Utilizing OneDrive and SharePoint allows you to take advantage of built-in version control, always have your files available in the cloud, and use familiar, simple sharing.  


While you told us that you appreciate the ability to limit the image view to only those who have permission to view the report, you asked for changes to the “Public snapshot” mode.   

To address some of the feedback we got from you, we have made a few more changes in this area.  

  • Add-ins saved as “Public snapshot” can now be printed, without requiring you to go through all the slides and load each add-in for a permission check before the public image is made visible. 
  • You can use the “Show as saved image” option on add-ins saved as “Public snapshot”. This replaces the entire add-in with an image representation of it, so load time might be faster when you are presenting. 

Many of us keep presentations open for a long time, which might cause the data in the presentation to become outdated.  

To make sure your slides have the data you need, we added a new notification that tells you when more up-to-date data exists in Power BI and offers you the option to refresh and get the latest data from Power BI. 

Developers 

Direct Lake semantic models are now supported in Fabric Git Integration , enabling streamlined version control, enhanced collaboration among developers, and the establishment of CI/CD pipelines for your semantic models using Direct Lake. 


Learn more about version control, testing, and deployment of Power BI content in our Power BI implementation planning documentation: https://learn.microsoft.com/power-bi/guidance/powerbi-implementation-planning-content-lifecycle-management-overview  

Visualizations 

Editor’s pick of the quarter 

  • Animator for Power BI 
  • Innofalls Charts 
  • SuperTables 
  • Sankey Diagram for Power BI by ChartExpo 
  • Dynamic KPI Card by Sereviso 
  • Shielded HTML Viewer 
  • Text search slicer

New visuals in AppSource 

  • Mapa Polski – Województwa, Powiaty, Gminy 
  • Workstream 
  • Income Statement Table 
  • Gas Detection Chart 
  • Seasonality Chart 
  • PlanIn BI – Data Refresh Service 
  • Chart Flare 
  • PictoBar 
  • ProgBar 
  • Counter Calendar 
  • Donut Chart image

Financial Reporting Matrix by Profitbase 

Making financial statements with a proper layout has just become easier with the latest version of the Financial Reporting Matrix. 

Users are now able to specify which rows should be classified as cost rows, which makes it easier to get the conditional formatting of variances correct: 


Selecting a row and ticking “is cost” will tag the row as a cost row. This can be used in conditional formatting to make sure that positive variances on expenses are bad for the result, while a positive variance on an income row is good for the result. 

The new version also includes more flexibility in measure placement and column subtotals. 

Measures can be placed either: 

  • Default (below column headers) 
  • Above column headers 


  • Conditionally hide columns 
  • And much more 

Highlighted new features:  

  • Measure placement – In rows  
  • Select Column Subtotals  
  • New Format Pane design 
  • Row Options  

Get the visual from AppSource and find more videos here ! 

Horizon Chart by Powerviz  

A Horizon Chart is an advanced visual for time-series data, revealing trends and anomalies. It displays stacked data layers, allowing users to compare multiple categories while maintaining data clarity. Horizon Charts are particularly useful for monitoring and analyzing complex data over time, making this a valuable visual for data analysis and decision-making. 

Key Features:  

  • Horizon Styles: Choose Natural, Linear, or Step with adjustable scaling. 
  • Layer: Layer data by range or custom criteria. Display positive and negative values together or separately on top. 
  • Reference Line : Highlight patterns with X-axis lines and labels. 
  • Colors: Apply 30+ color palettes and use FX rules for dynamic coloring. 
  • Ranking: Filter Top/Bottom N values, with “Others”. 
  • Gridline: Add gridlines to the X and Y axis.  
  • Custom Tooltip: Add highest, lowest, mean, and median points without additional DAX. 
  • Themes: Save designs and share seamlessly with JSON files. 

Other features included are ranking, annotation, grid view, show condition, and accessibility support.  

Business Use Cases: Time-Series Data Comparison, Environmental Monitoring, Anomaly Detection 

🔗 Try Horizon Chart for FREE from AppSource  

📊 Check out all features of the visual: Demo file  

📃 Step-by-step instructions: Documentation  

💡 YouTube Video: Video Link  

📍 Learn more about visuals: https://powerviz.ai/  

✅ Follow Powerviz : https://lnkd.in/gN_9Sa6U  

Milestone Trend Analysis Chart by Nova Silva 

Exciting news! Thanks to your valuable feedback, we’ve enhanced our Milestone Trend Analysis Chart even further. We’re thrilled to announce that you can now switch between horizontal and vertical orientations, catering to your preferred visualization style.

The Milestone Trend Analysis (MTA) Chart remains your go-to tool for swiftly identifying deadline trends, empowering you to take timely corrective actions. With this update, we aim to enhance deadline awareness among project participants and stakeholders alike. 


In our latest version, you can seamlessly navigate between horizontal and vertical views within the familiar Power BI interface. No need to adapt to a new user interface – enjoy the same ease of use with added flexibility. Plus, it benefits from supported features like themes, interactive selection, and tooltips. 

What’s more, ours is the only Microsoft Certified Milestone Trend Analysis Chart for Power BI, ensuring reliability and compatibility with the platform. 

Ready to experience the enhanced Milestone Trend Analysis Chart? Download it from AppSource today and explore its capabilities with your own data – try for free!  

We welcome any questions or feedback at our website: https://visuals.novasilva.com/ . Try it out and elevate your project management insights now! 

Sunburst Chart by Powerviz  

Powerviz’s Sunburst Chart is an interactive tool for hierarchical data visualization. With this chart, you can easily visualize multiple columns in a hierarchy and uncover valuable insights. The concentric circle design helps in displaying part-to-whole relationships. 

  • Arc Customization: Customize shapes and patterns. 
  • Color Scheme: Accessible palettes with 30+ options. 
  • Centre Circle: Design an inner circle with layers. Add text, measure, icons, and images. 
  • Conditional Formatting: Easily identify outliers based on measure or category rules. 
  • Labels: Smart data labels for readability. 
  • Image Labels: Add an image as an outer label. 
  • Interactivity: Zoom, drill down, cross-filtering, and tooltip features. 

Other features included are annotation, grid view, show condition, and accessibility support.  

Business Use Cases:   

  • Sales and Marketing: Market share analysis and customer segmentation. 
  • Finance : Department budgets and expenditures distribution. 
  • Operations : Supply chain management. 
  • Education : Course structure, curriculum creation. 
  • Human Resources : Organization structure, employee demographics.

🔗 Try Sunburst Chart for FREE from AppSource  


Stacked Bar Chart with Line by JTA  

Clustered bar chart with the possibility to stack one of the bars  

Stacked Bar Chart with Line by JTA seamlessly merges the simplicity of a traditional bar chart with the versatility of a stacked bar, revolutionizing the way you showcase multiple datasets in a single, cohesive display. 

Unlocking a new dimension of insight, our visual features a dynamic line that provides a snapshot of data trends at a glance. Navigate through your data effortlessly with multiple configurations, gaining a swift and comprehensive understanding of your information. 

Tailor your visual experience with an array of functionalities and customization options, enabling you to effortlessly compare a primary metric with the performance of an entire set. The flexibility to customize the visual according to your unique preferences empowers you to harness the full potential of your data. 

Features of Stacked Bar Chart with Line:  

  • Stack the second bar 
  • Format the Axis and Gridlines 
  • Add a legend 
  • Format the colors and text 
  • Add a line chart 
  • Format the line 
  • Add marks to the line 
  • Format the labels for bars and line 

If you liked what you saw, you can try it for yourself and find more information here . Also, if you want to download it, you can find the visual package on the AppSource . 


We have added an exciting new feature to our Combo PRO, Combo Bar PRO, and Timeline PRO visuals – Legend field support . The Legend field makes it easy to visually split series values into smaller segments, without the need to use measures or create separate series. Simply add a column with category names that are adjacent to the series values, and the visual will do the following:  

  • Display separate segments as a stack or cluster, showing how each segment contributed to the total Series value. 
  • Create legend items for each segment to quickly show/hide them without filtering.  
  • Apply custom fill colors to each segment.  
  • Show each segment value in the tooltip 

Read more about the Legend field on our blog article  

Drill Down Combo PRO is made for creators who want to build visually stunning and user-friendly reports. Cross-chart filtering and intuitive drill down interactions make data exploration easy and fun for any user. Furthermore, you can choose between three chart types – columns, lines, or areas; and feature up to 25 different series in the same visual and configure each series independently.  

📊 Get Drill Down Combo PRO on AppSource  

🌐 Visit Drill Down Combo PRO product page  

Documentation | ZoomCharts Website | Follow ZoomCharts on LinkedIn  

We are thrilled to announce that Fabric Core REST APIs are now generally available! This marks a significant milestone in the evolution of Microsoft Fabric, a platform that has been meticulously designed to empower developers and businesses alike with a comprehensive suite of tools and services. 

The Core REST APIs are the backbone of Microsoft Fabric, providing the essential building blocks for a myriad of functionalities within the platform. They are designed to improve efficiency, reduce manual effort, increase accuracy, and lead to faster processing times. These APIs help you scale operations more easily and efficiently as the volume of work grows, automate repeatable processes consistently, and integrate with other systems and applications, providing a streamlined and efficient data pipeline. 

The Microsoft Fabric Core APIs encompass a range of functionalities, including: 

  • Workspace management: APIs to manage workspaces, including permissions.  
  • Item management: APIs for creating, reading, updating, and deleting items, with partial support for data source discovery and granular permissions management planned for the near future. 
  • Job and tenant management: APIs to manage jobs, tenants, and users within the platform. 
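
Fabric list endpoints (for example, listing the items in a workspace) return results in pages, with a continuation token when more pages remain. As a rough sketch, here is how a client might page through such an endpoint; `fetch_page` stands in for the authenticated HTTP call, so the paging logic can be shown without real credentials.

```python
# Hedged sketch: follow continuation tokens across pages of a Fabric list API.
from typing import Callable, Iterator, Optional

def iter_all(fetch_page: Callable[[Optional[str]], dict]) -> Iterator[dict]:
    """Yield every item across pages, following continuation tokens."""
    token: Optional[str] = None
    while True:
        page = fetch_page(token)          # e.g. GET /v1/workspaces/{id}/items
        yield from page.get("value", [])
        token = page.get("continuationToken")
        if not token:
            break

# Stub demo: two pages of items keyed by continuation token.
pages = {
    None: {"value": [{"id": "a"}, {"id": "b"}], "continuationToken": "t1"},
    "t1": {"value": [{"id": "c"}]},
}
items = list(iter_all(lambda tok: pages[tok]))
print([i["id"] for i in items])   # -> ['a', 'b', 'c']
```

In a real client, `fetch_page` would issue the GET request with a bearer token and pass the continuation token as a query parameter.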

These APIs adhere to industry standards and best practices, ensuring a unified developer experience that is both coherent and easy to use. 

For developers looking to dive into the details of the Microsoft Fabric Core APIs, comprehensive documentation is available. This includes guidelines on API usage, examples, and articles managed in a centralized repository for ease of access and discoverability. The documentation is continuously updated to reflect the latest features and improvements, ensuring that developers have the most current information at their fingertips. See Microsoft Fabric REST API documentation  

We’re excited to share an important update we made to the Fabric Admin APIs. This enhancement is designed to simplify your automation experience. Now, you can manage both Power BI and the new Fabric items (previously referred to as artifacts) using the same set of APIs. Before this enhancement, you had to navigate using two different APIs—one for Power BI items and another for new Fabric items. That’s no longer the case. 

The APIs we’ve updated include GetItem , ListItems , GetItemAccessDetails , and GetAccessEntities . These enhancements mean you can now query and manage all your items through a single API call, regardless of whether they’re Fabric types or Power BI types. We hope this update makes your work more straightforward and helps you accomplish your tasks more efficiently. 
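
Because a single call can now return both kinds of items, client code no longer needs to merge two API responses; grouping or filtering by item type becomes plain post-processing. A small sketch (the field names below are illustrative sample data, not the exact API contract):

```python
# Illustrative sketch: one ListItems-style response can mix Power BI item types
# (reports, semantic models) and Fabric item types (lakehouses, notebooks).
from collections import defaultdict
from typing import Dict, List

def group_items_by_type(items: List[dict]) -> Dict[str, List[str]]:
    """Group item display names by their item type."""
    grouped: Dict[str, List[str]] = defaultdict(list)
    for item in items:
        grouped[item["type"]].append(item["displayName"])
    return dict(grouped)

# Hypothetical sample of a unified response's items.
items = [
    {"type": "Report", "displayName": "Sales Overview"},
    {"type": "Lakehouse", "displayName": "Bronze"},
    {"type": "Report", "displayName": "Finance"},
]
print(group_items_by_type(items))
```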

We’re thrilled to announce the public preview of the Microsoft Fabric workload development kit. This feature now extends to additional workloads and offers a robust developer toolkit for designing, developing, and interoperating with Microsoft Fabric using frontend SDKs and backend REST APIs. Introducing the Microsoft Fabric Workload Development Kit . 

The Microsoft Fabric platform now provides a mechanism for ISVs and developers to integrate their new and existing applications natively into Fabric’s workload hub. This integration provides the ability to add net-new capabilities to Fabric in a consistent experience without leaving the Fabric workspace, thereby accelerating data-driven outcomes from Microsoft Fabric.

By downloading and leveraging the development kit, ISVs and software developers can build and scale existing and new applications on Microsoft Fabric and offer them via the Azure Marketplace without ever leaving the Fabric environment.

The development kit provides a comprehensive guide and sample code for creating custom item types that can be added to the Fabric workspace. These item types can leverage the Fabric frontend SDKs and backend REST APIs to interact with other Fabric capabilities, such as data ingestion, transformation, orchestration, visualization, and collaboration. You can also embed your own data application into the Fabric item editor using the Fabric native experience components, such as the header, toolbar, navigation pane, and status bar. This way, you can offer consistent and seamless user experience across different Fabric workloads. 

This is a call to action for ISVs, software developers, and system integrators. Let’s leverage this opportunity to create more integrated and seamless experiences for our users. 

We’re excited about this journey and look forward to seeing the innovative workloads from our developer community. 

We are proud to announce the public preview of external data sharing. Sharing data across organizations has become a standard part of day-to-day business for many of our customers. External data sharing, built on top of OneLake shortcuts, enables seamless, in-place sharing of data, allowing you to maintain a single copy of data even when sharing it across tenant boundaries. Whether you’re sharing data with customers, manufacturers, suppliers, consultants, or partners, the applications are endless.

How external data sharing works  

Sharing data across tenants is as simple as any other share operation in Fabric. To share data, navigate to the item to be shared, click on the context menu, and then click External data share. Select the folder or table you want to share and click Save and continue. Enter the email address and an optional message, and then click Send.

The data consumer will receive an email containing a share link. They can click on the link to accept the share and access the data within their own tenant. 

Click here for more details about external data sharing.

Following the release of OneLake data access roles in public preview, the OneLake team is excited to announce the availability of APIs for managing data access roles. These APIs can be used to programmatically manage granular data access for your lakehouses. Manage all aspects of roles, such as creating new ones, editing existing ones, or changing memberships, in a programmatic way.
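
As a hedged sketch of the route shape such calls might use (the path below follows the pattern described for OneLake data access roles; confirm it, and the request body schema, in the published API reference before relying on it):

```python
FABRIC_API = "https://api.fabric.microsoft.com/v1"

def data_access_roles_url(workspace_id: str, item_id: str) -> str:
    # Route shape for listing or creating OneLake data access roles on an
    # item; the request/response body schema lives in the API reference.
    return f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}/dataAccessRoles"

print(data_access_roles_url("<workspace-guid>", "<lakehouse-guid>"))
```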

Do you have data stored on-premises or behind a firewall that you want to access and analyze with Microsoft Fabric? With OneLake shortcuts, you can bring on-premises or network-restricted data into OneLake, without any data movement or duplication. Simply install the Fabric on-premises data gateway and create a shortcut to your S3-compatible, Amazon S3, or Google Cloud Storage data source. Then use any of Fabric’s powerful analytics engines and OneLake open APIs to explore, transform, and visualize your data in the cloud.
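
To make the shortcut shape concrete, here is a hedged sketch of what a create-shortcut request body for an S3-compatible source might look like; every value is a placeholder, and the field names are assumptions to check against the OneLake shortcuts API reference:

```python
import json

def s3_compatible_shortcut_body(name: str, connection_id: str,
                                location: str, subpath: str) -> dict:
    # Sketch of a Create Shortcut body for an S3-compatible source reached
    # through the on-premises data gateway; connection_id identifies the
    # gateway-bound connection.
    return {
        "name": name,
        "path": "Files",  # create the shortcut under the lakehouse Files folder
        "target": {
            "s3Compatible": {
                "connectionId": connection_id,
                "location": location,  # e.g. the S3-compatible endpoint URL
                "subpath": subpath,    # bucket (and optional prefix) to expose
            }
        },
    }

body = s3_compatible_shortcut_body("sales-raw", "<connection-guid>",
                                   "https://s3.example.local", "sales-bucket")
print(json.dumps(body, indent=2))
```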

Try it out today and unlock the full potential of your data with OneLake shortcuts! 

Data Warehouse 

We are excited to announce Copilot for Data Warehouse in public preview! Copilot for Data Warehouse is an AI assistant that helps developers generate insights through T-SQL exploratory analysis. Copilot is contextualized to your warehouse’s schema. With this feature, data engineers and data analysts can use Copilot to:

  • Generate T-SQL queries for data analysis.  
  • Explain and add in-line code comments for existing T-SQL queries. 
  • Fix broken T-SQL code. 
  • Receive answers regarding general data warehousing tasks and operations. 

There are 3 areas where Copilot is surfaced in the Data Warehouse SQL Query Editor: 

  • Code completions when writing a T-SQL query. 
  • Chat panel to interact with the Copilot in natural language. 
  • Quick action buttons to fix and explain T-SQL queries. 

Learn more about Copilot for Data Warehouse: aka.ms/data-warehouse-copilot-docs. Copilot for Data Warehouse is currently only available in the Warehouse. Copilot in the SQL analytics endpoint is coming soon. 

Unlocking Insights through Time: Time travel in Data warehouse (public preview)

As data volumes continue to grow in today’s rapidly evolving world of Artificial Intelligence, it is crucial to be able to reflect on historical data, which empowers businesses to derive valuable insights that aid in making well-informed decisions for the future. Until now, preserving multiple historical versions of data not only incurred significant costs but also presented challenges in upholding data integrity, with a notable impact on query performance. So, we are thrilled to announce the ability to query historical data through time travel at the T-SQL statement level, helping you unlock the evolution of data over time.

The Fabric warehouse retains historical versions of tables for seven calendar days. This retention allows you to query tables as they existed at any point within that window. A time travel clause can be included in any top-level SELECT statement. For complex queries that involve multiple tables, joins, stored procedures, or views, the timestamp is applied just once for the entire query instead of being specified for each table. This ensures the entire query executes with reference to the specified timestamp, maintaining the data’s uniformity and integrity throughout query execution.
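
A minimal sketch of the statement-level clause, built here as a T-SQL string in Python; the `OPTION (FOR TIMESTAMP AS OF ...)` syntax and the sample tables are assumptions to verify against the time travel documentation:

```python
def with_time_travel(select_sql: str, timestamp_utc: str) -> str:
    # Append the statement-level time-travel hint; the timestamp applies
    # once to the whole statement, even across joined tables.
    return (f"{select_sql.rstrip().rstrip(';')}\n"
            f"OPTION (FOR TIMESTAMP AS OF '{timestamp_utc}');")

sql = with_time_travel(
    "SELECT o.OrderID, c.Name\n"
    "FROM dbo.Orders AS o\n"
    "JOIN dbo.Customers AS c ON o.CustID = c.CustID",
    "2024-05-01T08:00:00",
)
print(sql)
```

Note that the hint appears once at the end of the statement rather than once per table, matching the behavior described above.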

From historical trend analysis and forecasting to compliance management, stable reporting and real-time decision support, the benefits of time travel extend across multiple business operations. Embrace the capability of time travel to navigate the data-driven landscape and gain a competitive edge in today’s fast-paced world of Artificial Intelligence. 

We are excited to announce not one but two new enhancements to the Copy Into feature for Fabric Warehouse: Copy Into with Entra ID Authentication and Copy Into for Firewall-Enabled Storage!

Entra ID Authentication  

When authenticating storage accounts in your environment, the executing user’s Entra ID will now be used by default. This ensures that you can leverage Access Control Lists (ACLs) and Role-Based Access Control (RBAC) to authenticate to your storage accounts when using Copy Into. Currently, only organizational accounts are supported.

How to Use Entra ID Authentication  

  • Ensure your Entra ID organizational account has access to the underlying storage and can execute the Copy Into statement on your Fabric Warehouse.  
  • Run your Copy Into statement without specifying any credentials; the Entra ID organizational account will be used as the default authentication mechanism.  
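
Here is a sketch of what such a statement might look like; the table and storage URL are placeholders, and the point is simply that no CREDENTIAL clause appears when the executing user's Entra ID is the default:

```python
def copy_into(table: str, source_url: str, file_type: str = "PARQUET") -> str:
    # No CREDENTIAL clause: the executing user's Entra ID is the default
    # authentication mechanism for the storage account.
    return (f"COPY INTO {table}\n"
            f"FROM '{source_url}'\n"
            f"WITH (FILE_TYPE = '{file_type}');")

stmt = copy_into("dbo.Sales",
                 "https://contoso.blob.core.windows.net/data/sales/*.parquet")
print(stmt)
```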

Copy into firewall-enabled storage

Copy Into for firewall-enabled storage leverages the trusted workspace access functionality (see Trusted workspace access in Microsoft Fabric (preview) on Microsoft Learn) to establish a secure and seamless connection between Fabric and your storage accounts. Secure access can be enabled for both blob and ADLS Gen2 storage accounts. Secure access with Copy Into is available for warehouses in workspaces with Fabric capacities (F64 or higher).

To learn more about Copy Into, refer to COPY INTO (Transact-SQL) – Azure Synapse Analytics and Microsoft Fabric on Microsoft Learn.

We are excited to announce the launch of our new feature, Just-in-Time Database Attachment, which significantly improves your first experience, such as when connecting to the Data Warehouse or SQL analytics endpoint, or simply opening an item. These actions trigger the workspace resource assignment process, during which, among other actions, we attach all the necessary metadata for your items (Data Warehouses and SQL analytics endpoints), which can take a long time, particularly for workspaces with a high number of items.

This feature is designed to attach your desired database during the activation process of your workspace, allowing you to execute queries immediately and avoid unnecessary delays. All other databases are attached asynchronously in the background while you execute queries, ensuring a smooth and efficient experience.

Data Engineering 

We are advancing Fabric Runtime 1.3 from an Experimental Public Preview to a full Public Preview. Our Apache Spark-based big data execution engine, optimized for both data engineering and science workflows, has been updated and fully integrated into the Fabric platform. 

The enhancements in Fabric Runtime 1.3 include the incorporation of Delta Lake 3.1, compatibility with Python 3.11, support for Starter Pools, integration with Environment and library management capabilities. Additionally, Fabric Runtime now enriches the data science experience by supporting the R language and integrating Copilot. 

We are pleased to share that the Native Execution Engine for Fabric Runtime 1.2 is now available in public preview. The Native Execution Engine can greatly enhance the performance of your Spark jobs and queries. The engine has been rewritten in C++, operates in columnar mode, and uses vectorized processing. The Native Execution Engine offers superior query performance across data processing, ETL, data science, and interactive queries, all directly on your data lake. Overall, Fabric Spark delivers a 4x speed-up on the total execution time of all 99 queries in the TPC-DS 1TB benchmark when compared against Apache Spark. This engine is fully compatible with Apache Spark™ APIs (including Spark SQL API).

It is seamless to use and requires no code changes: activate it and go. Enable it in your environment for your notebooks and Spark Job Definitions (SJDs).

This feature is in public preview; at this stage of the preview, there is no additional cost associated with using it.

We are excited to announce the Spark Monitoring Run Series Analysis features, which allow you to analyze the run duration trend and performance comparison for Pipeline Spark activity recurring run instances and repetitive Spark run activities from the same Notebook or Spark Job Definition.   

  • Run Series Comparison: Users can compare the duration of a Notebook run with that of previous runs and evaluate the input and output data to understand the reasons behind prolonged run durations.  
  • Outlier Detection and Analysis: The system can detect outliers in the run series and analyze them to pinpoint potential contributing factors. 
  • Detailed Run Instance Analysis: Clicking on a specific run instance provides detailed information on time distribution, which can be used to identify performance enhancement opportunities. 
  • Configuration Insights: Users can view the Spark configuration used for each run, including auto-tuned configurations for Spark SQL queries in auto-tune enabled Notebook runs.

You can access the new feature from the item’s recent runs panel and Spark application monitoring page. 

We are excited to announce that Notebook now supports the ability to tag others in comments, just like the familiar functionality of using Office products!   

When you select a section of code in a cell, you can add a comment with your insights and tag one or more teammates to collaborate or brainstorm on the specifics. This intuitive enhancement is designed to amplify collaboration in your daily development work. 

Moreover, when you tag someone who doesn’t have permission, you can easily configure the permissions to make sure your code assets are well managed.

We are thrilled to unveil a significant enhancement to the Fabric notebook ribbon, designed to elevate your data science and engineering workflows. 

In the new version, you will find the new Session connect control on the Home tab, and now you can start a standard session without needing to run a code cell. 

You can also easily spin up a High concurrency session and share the session across multiple notebooks to improve the compute resource utilization. And you can easily attach/leave a high concurrency session with a single click. 

The “View session information” control opens the session information dialog, where you can find a lot of useful detail and configure the session timeout. The diagnostics info is especially helpful when you need support for notebook issues.

Now you can easily access the powerful Data Wrangler on the Home tab with the new ribbon! You can explore your data with Data Wrangler’s low-code experience; both pandas DataFrames and Spark DataFrames are supported.

We recently made some changes to the Fabric notebook metadata to ensure compliance and consistency: 

Notebook file content: 

  • The keyword “trident” has been replaced with “dependencies” in the notebook content. This adjustment ensures consistency and compliance.

Notebook Git format:

  • The preface of the notebook has been changed from “# Synapse Analytics notebook source” to “# Fabric notebook source”.
  • The keyword “synapse” has been updated to “dependencies” in the Git repo.

These changes will be marked as ‘uncommitted’ once if your workspace is connected to Git. No action is needed on your part, and nothing will break within the Fabric platform. If you have any questions, feel free to share them with us.

We are thrilled to announce that the environment is now a generally available item in Microsoft Fabric. Leading up to GA, we shipped a few new environment features.

  • Git support  

The environment now supports Git. You can check the environment into your Git repo and manipulate it locally through its YAML representation and custom library files. After syncing the changes from local to the Fabric portal, you can publish them manually or through the REST API.

  • Deployment pipeline  

Deploying environments from one workspace to another is supported.  Now, you can deploy the code items and their dependent environments together from development to test and even production. 

With the REST APIs, you get a code-first experience with the same abilities as the Fabric portal. We provide a set of powerful APIs to help you manage your environments efficiently. You can create new environments, update libraries and Spark compute, publish the changes, delete an environment, attach an environment to a notebook, and more, all from the tools of your choice. The article Best practice of managing environments with REST API can help you get started with several real-world scenarios.
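
As a hedged illustration of that code-first experience (the route and body below are assumptions patterned on the Fabric REST conventions; check the Environment API reference before relying on them):

```python
import json

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def create_environment_request(workspace_id: str, display_name: str):
    # (url, json_body) pair for creating an environment item in a workspace;
    # subsequent calls would update libraries, Spark compute, and publish.
    url = f"{FABRIC_API}/workspaces/{workspace_id}/environments"
    return url, json.dumps({"displayName": display_name})

url, body = create_environment_request("<workspace-guid>", "spark-prod-env")
print(url)
print(body)
```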

  • Resources folder   

The resources folder enables managing small files during the development cycle. Files uploaded to the environment can be accessed from notebooks once they’re attached to the same environment. Manipulation of the files and folders happens in real time, which can be especially powerful when you are collaborating with others.

Sharing your environment with others is also available. We provide several sharing options. By default, the view permission is shared. If you want the recipient to have access to view and use the contents of the environment, sharing without permission customization is the best option. Furthermore, you can grant editing permission to allow recipients to update this environment or grant share permission to allow recipients to reshare this environment with their existing permissions. 

We are excited to announce REST API support for Fabric Data Engineering/Science workspace settings. Data Engineering/Science settings allow users to create and manage their Spark compute, select the default runtime and default environment, and enable or disable high concurrency mode and ML autologging.

Now with REST API support for the Data Engineering/Science settings, you can:

  • Choose the default pool for a Fabric workspace.
  • Configure the max nodes for Starter pools.
  • Create, update, or delete custom pools, including Autoscale and Dynamic allocation properties.
  • Choose the workspace default runtime and environment:
      • Select a default runtime.
      • Select the default environment for the Fabric workspace.
  • Enable or disable High Concurrency Mode.
  • Enable or disable ML autologging.

Learn more about the Workspace Spark Settings API in our API documentation: Workspace Settings – REST API (Spark) | Microsoft Learn.
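
For illustration, here is a hedged sketch of the settings route and an update body; the route and field names are assumptions to verify against the linked reference:

```python
import json

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def spark_settings_url(workspace_id: str) -> str:
    # Route for reading (GET) or updating (PATCH) workspace-level
    # Spark settings.
    return f"{FABRIC_API}/workspaces/{workspace_id}/spark/settings"

# Hypothetical PATCH body toggling high concurrency for notebooks; the
# field names should be checked against the Workspace Settings reference.
patch_body = json.dumps(
    {"highConcurrency": {"notebookInteractiveRunEnabled": True}})

print(spark_settings_url("<workspace-guid>"))
print(patch_body)
```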

We are excited to give you a sneak peek at the preview of User Data Functions in Microsoft Fabric. User Data Functions give developers and data engineers the ability to easily write and run applications that integrate with resources in the Fabric platform. Data engineering often presents challenges with data quality or complex data analytics processing in data pipelines, and ETL tools may offer limited flexibility and customization. This is where User Data Functions can be used to run data transformation tasks and perform complex business logic by connecting to your data sources and other workloads in Fabric.

During preview, you will be able to use the following features:  

  • Use the Fabric portal to create new User Data Functions, view and test them.  
  • Write your functions using C#.   
  • Use the Visual Studio Code extension to create and edit your functions.  
  • Connect to the following Fabric-native data sources: Data Warehouse, Lakehouse and Mirrored Databases.   

You can now create a fully managed GraphQL API in Fabric to interact with your data in a simple, flexible, and powerful way. We’re excited to announce the public preview of API for GraphQL, a data access layer that lets you query multiple data sources quickly and efficiently in Fabric by leveraging a widely adopted and familiar API technology that returns more data with fewer client requests. With the new API for GraphQL in Fabric, data engineers and scientists can create data APIs to connect to different data sources, use the APIs in their workflows, or share the API endpoints with app development teams to speed up and streamline data analytics application development in your business.

You can get started with the API for GraphQL in Fabric by creating an API, attaching a supported data source, and then selecting the data sets you want to expose through the API. Fabric builds the GraphQL schema automatically based on your data; you can test and prototype queries directly in our graphical in-browser GraphQL development environment (the API editor), and applications are ready to connect in minutes.
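
Once the API is created, any standard GraphQL client can call it. Below is a minimal sketch of the request body a client would POST; the `orders` field and its columns are hypothetical and depend entirely on the data you chose to expose:

```python
import json

# The endpoint URL is copied from the API editor after the API is created;
# the `orders` field below is hypothetical and schema-dependent.
QUERY = """
query {
  orders(first: 5) {
    items { OrderID Amount }
  }
}
"""

def graphql_payload(query: str, variables=None) -> str:
    # Standard GraphQL-over-HTTP POST body: a query plus optional variables.
    return json.dumps({"query": query, "variables": variables or {}})

print(graphql_payload(QUERY))
```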

Currently, the following supported data sources can be exposed through the Fabric API for GraphQL: 

  • Microsoft Fabric Data Warehouse 
  • Microsoft Fabric Lakehouse via SQL Analytics Endpoint 
  • Microsoft Fabric Mirrored Databases via SQL Analytics Endpoint 

Click here to learn more about how to get started. 

Data Science 

As you may know, Copilot in Microsoft Fabric requires your tenant administrator to enable the feature from the admin portal. Starting May 20th, 2024, Copilot in Microsoft Fabric will be enabled by default for all tenants. This update is part of our continuous efforts to enhance user experience and productivity within Microsoft Fabric. This new default activation means that AI features like Copilot will be automatically enabled for tenants who have not yet enabled the setting.  

We are introducing a new capability to enable Copilot at the capacity level in Fabric. A new option is being introduced in the tenant admin portal to delegate the enablement of AI and Copilot features to capacity administrators. This AI and Copilot setting will be automatically delegated to capacity administrators, and tenant administrators won’t be able to turn off the delegation.

We also have a cross-geo setting for customers who want to use Copilot and AI features while their capacity is in a different geographic region than the EU data boundary or the US. By default, the cross-geo setting will stay off and will not be delegated to capacity administrators automatically.  Tenant administrators can choose whether to delegate this to capacity administrators or not. 

Figure 1.  Copilot in Microsoft Fabric will be auto enabled and auto delegated to capacity administrators. 

Capacity administrators will see the “Copilot and Azure OpenAI Service (preview)” settings under Capacity settings/ Fabric Capacity / <Capacity name> / Delegated tenant settings. By default, the capacity setting will inherit tenant level settings. Capacity administrators can decide whether to override the tenant administrator’s selection. This means that even if Copilot is not enabled on a tenant level, a capacity administrator can choose to enable Copilot for their capacity. With this level of control, we make it easier to control which Fabric workspaces can utilize AI features like Copilot in Microsoft Fabric. 

To enhance privacy and trust, we’ve updated our approach to abuse monitoring: previously, we retained data from Copilot in Fabric, including prompt inputs and outputs, for up to 30 days to check for misuse. Following customer feedback, we’ve eliminated this 30-day retention. Now, we no longer store prompt related data, demonstrating our unwavering commitment to your privacy and security. We value your input and take your concerns seriously. 

Real-Time Intelligence 

This month includes the announcement of Real-Time Intelligence, the next evolution of Real-Time Analytics and Data Activator. With Real-Time Intelligence, Fabric extends to the world of streaming and high-granularity data, enabling all users in your organization to collect, analyze, and act on this data in a timely manner, making faster and more informed business decisions. Read the full announcement from Build 2024.

Real-Time Intelligence includes a wide range of capabilities across ingestion, processing, analysis, transformation, visualization and taking action. All of this is supported by the Real-Time hub, the central place to discover and manage streaming data and start all related tasks.  

Read on for more information on each capability and stay tuned for a series of blogs describing the features in more detail. All features are in Public Preview unless otherwise specified. Feedback on any of the features can be submitted at https://aka.ms/rtiidea    

Ingest & Process  

  • Introducing the Real-Time hub 
  • Get Events with new sources of streaming and event data 
  • Source from Real-Time Hub in Enhanced Eventstream  
  • Use Real-Time hub to Get Data in KQL Database in Eventhouse 
  • Get data from Real-Time Hub within Reflexes 
  • Eventstream Edit and Live modes 
  • Default and derived streams 
  • Route data streams based on content 

Analyze & Transform  

  • Eventhouse GA 
  • Eventhouse OneLake availability GA 
  • Create a database shortcut to another KQL Database 
  • Support for AI Anomaly Detector  
  • Copilot for Real-Time Intelligence 
  • Tenant-level private endpoints for Eventhouse 

Visualize & Act  

  • Visualize data with Real-Time Dashboards  
  • New experience for data exploration 
  • Create triggers from Real-Time Hub 
  • Set alert on Real-time Dashboards 
  • Taking action through Fabric Items 

Ingest & Process 

Real-Time hub is the single place for all data-in-motion across your entire organization. Several key features are offered in Real-Time hub: 

1. Single place for data-in-motion for the entire organization  

Real-Time hub enables users to easily discover, ingest, manage, and consume data-in-motion from a wide variety of sources. It lists all the streams and KQL tables that customers can directly act on. 

2. Real-Time hub is never empty  

All data streams in Fabric automatically show up in the hub. Also, users can subscribe to events in Fabric gaining insights into the health and performance of their data ecosystem. 

3. Numerous connectors to simplify data ingestion from anywhere to Real-Time hub  

Real-Time hub makes it easy for you to ingest data into Fabric from a wide variety of sources like AWS Kinesis, Kafka clusters, Microsoft streaming sources, sample data and Fabric events using the Get Events experience.  

There are 3 tabs in the hub:  

  • Data streams: This tab contains all streams that are actively running in Fabric that the user has access to, including all streams from Eventstreams and all tables from KQL Databases.
  • Microsoft sources: This tab contains Microsoft sources (that the user has access to) that can be connected to Fabric.
  • Fabric events : Fabric now has event-driven capabilities to support real-time notifications and data processing. Users can monitor and react to events including Fabric Workspace Item events and Azure Blob Storage events. These events can be used to trigger other actions or workflows, such as invoking a data pipeline or sending a notification via email. Users can also send these events to other destinations via Event Streams. 

Learn More  

You can now connect to data from both inside and outside of Fabric in just a few steps. Whether data is coming from new or existing sources, streams, or available events, the Get Events experience allows users to connect to a wide range of sources directly from Real-Time hub, Eventstreams, Eventhouse, and Data Activator.

This enhanced capability allows you to easily connect external data streams to Fabric with an out-of-the-box experience, giving you more options and helping you get real-time insights from various sources. This includes Camel Kafka connectors powered by Kafka Connect to access popular data platforms, as well as Debezium connectors for fetching Change Data Capture (CDC) streams.

Using Get Events, bring streaming data from Microsoft sources directly into Fabric with a first-class experience. Connectivity to notification sources and discrete events is also included; this enables access to notification events from Azure and other cloud solutions, including AWS and GCP. The full set of currently supported sources is:

  • Microsoft sources : Azure Event Hubs, Azure IoT hub 
  • External sources : Google Cloud Pub/Sub, Amazon Kinesis Data Streams, Confluent Cloud Kafka 
  • Change data capture databases : Azure SQL DB (CDC), PostgreSQL DB (CDC), Azure Cosmos DB (CDC), MySQL DB (CDC)  
  • Fabric events : Fabric Workspace Item events, Azure Blob Storage events  

Learn More   

With enhanced Eventstream, you can now stream data not only from Microsoft sources but also from other platforms like Google Cloud, Amazon Kinesis, and database change data capture streams using our new messaging connectors. The new Eventstream also lets you acquire and route real-time data not only from stream sources but also from discrete event sources, such as Azure Blob Storage events and Fabric Workspace Item events.

To use these new sources in Eventstream, simply create an eventstream and choose “Enhanced Capabilities (preview)”.

You will see the new Eventstream homepage that gives you some choices to begin with. By clicking “Add external source”, you will find these sources in the Get Events wizard, which helps you set up the source in a few steps. After you add the source to your eventstream, you can publish it to stream the data into your eventstream.

Use Eventstream with discrete sources to turn events into streams for further analysis. You can send the streams to different Fabric data destinations, like Lakehouse and KQL Database. After the events are converted, a default stream will appear in Real-Time hub. To convert events into streams, click Edit on the ribbon, select “Stream events” on the source node, and publish your eventstream.

To transform the stream data or route it to different Fabric destinations based on its content, click Edit on the ribbon to enter Edit mode, where you can add event processing operators and destinations.

With Real-Time hub embedded in KQL Database experience, each user in the tenant can view and add streams which they have access to and directly ingest it to a KQL Database table in Eventhouse.  

This integration provides each user in the tenant with the ability to access and view the data streams they are permitted to see, and to ingest those streams directly into a KQL Database table in Eventhouse. This simplifies data discovery and ingestion by allowing users to interact with the streams directly. Users can filter streams based on Owner, Parent, and Location, and the hub provides additional information such as Endorsement and Sensitivity.

You can access this by clicking on the Get Data button from the Database ribbon in Eventhouse. 

This will open the Get Data wizard with Real-Time hub embedded. 

You can use events from Real-Time hub directly in reflex items as well. From within the main reflex UI, click ‘Get data’ in the toolbar: 

This will open a wizard that allows you to connect to new event sources or browse Real-Time Hub to use existing streams or system events. 

Search new stream sources to connect to or select existing streams and tables to be ingested directly by Reflex. 

You then have access to the full reflex modeling experience to build properties and triggers over any events from Real-Time hub.  

Eventstream offers two distinct modes, Edit and Live, to provide flexibility and control over the development process of your eventstream. If you create a new eventstream with Enhanced Capabilities enabled, you can modify it in Edit mode. Here, you can design stream processing operations for your data streams using a no-code editor. Once you complete the editing, you can publish your eventstream and visualize how it starts streaming and processing data in Live mode.

In Edit mode, you can:   

  • Make changes to an Eventstream without implementing them until you publish the Eventstream. This gives you full control over the development process.  
  • Avoid test data being streamed to your Eventstream. This mode is designed to provide a secure environment for testing without affecting your actual data streams. 

In Live mode, you can:

  • Visualize how your Eventstream streams, transforms, and routes your data streams to various destinations after publishing the changes.  
  • Pause the flow of data on selected sources and destinations, providing you with more control over your data streams being streamed into your Eventstream.  

When you create a new Eventstream with Enhanced Capabilities enabled, you can now create and manage multiple data streams within Eventstream, which can then be displayed in the Real-Time hub for others to consume and perform further analysis.  

There are two types of streams:   

  • Default stream: Automatically generated when a streaming source is added to Eventstream. The default stream captures raw event data directly from the source, ready for transformation or analysis. 
  • Derived stream: A specialized stream that users can create as a destination within Eventstream. A derived stream can be created after a series of operations such as filtering and aggregating, and is then ready for further consumption or analysis by other users in the organization through the Real-Time hub. 

The following example shows that when you create a new Eventstream, a default stream alex-es1-stream is automatically generated. Subsequently, a derived stream dstream1 is added after an Aggregate operation within the Eventstream. Both the default and derived streams can be found in the Real-Time hub. 


Customers can now perform stream operations directly within Eventstream's Edit mode, instead of embedding them in a destination. This enhancement allows you to design stream processing logic and route data streams on the top-level canvas. Custom processing and routing can be applied to individual destinations using built-in operations, allowing routing to distinct destinations within the Eventstream based on different stream content. 

These operations include:  

  • Aggregate: Perform calculations such as SUM, AVG, MIN, and MAX on a column of values and return a single result. 
  • Expand: Expand array values and create a new row for each element within the array. 
  • Filter: Select or filter specific rows from the data stream based on a condition. 
  • Group by: Aggregate event data within a certain time window, with the option to group by one or more columns. 
  • Manage Fields: Customize your data streams by adding, removing, or changing the data type of a column. 
  • Union: Merge two or more data streams with shared fields (same name and data type) into a unified data stream. 
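To make the row-level semantics of these operations concrete, here is a minimal sketch in plain Python. This is not Eventstream's engine, and the field names ("sensor", "temp", "window") are invented for the example:

```python
# Illustrative sketch only: the row-level semantics of four Eventstream
# operations, expressed over plain Python records.
from collections import defaultdict

events_a = [
    {"sensor": "s1", "temp": 21.5, "window": 0},
    {"sensor": "s1", "temp": 40.0, "window": 0},
    {"sensor": "s2", "temp": 19.0, "window": 1},
]
events_b = [{"sensor": "s3", "temp": 22.0, "window": 1}]

# Filter: keep only rows that satisfy a condition.
hot = [e for e in events_a if e["temp"] > 20]

# Union: merge streams that share the same fields and types.
combined = events_a + events_b

# Group by: aggregate event data within a time window (the "window" key).
groups = defaultdict(list)
for e in combined:
    groups[e["window"]].append(e["temp"])
avg_by_window = {w: sum(v) / len(v) for w, v in groups.items()}

# Aggregate: reduce a column of values to a single result.
max_temp = max(e["temp"] for e in combined)
```

In Eventstream itself these operations are composed visually in the no-code editor; the sketch only mirrors what each one does to the rows of a stream.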

Analyze & Transform 

Eventhouse, a cutting-edge database workspace meticulously crafted to manage and store event-based data, is now officially available for general use. Optimized for high granularity, velocity, and low latency streaming data, it incorporates indexing and partitioning for structured, semi-structured, and free text data. With Eventhouse, users can perform high-performance analysis of big data and real-time data querying, processing billions of events within seconds. The platform allows users to organize data into compartments (databases) within one logical item, facilitating efficient data management.  

Additionally, Eventhouse enables the sharing of compute and cache resources across databases, maximizing resource utilization. It also supports high-performance queries across databases and allows users to apply common policies seamlessly. Eventhouse offers content-based routing to multiple databases, full view lineage, and high granularity permission control, ensuring data security and compliance. Moreover, it provides a simple migration path from Azure Synapse Data Explorer and Azure Data Explorer, making adoption seamless for existing users. 


Engineered to handle data in motion, Eventhouse seamlessly integrates indexing and partitioning into its storing process, accommodating various data formats. This sophisticated design empowers high-performance analysis with minimal latency, facilitating lightning-fast ingestion and querying within seconds. Eventhouse is purpose-built to deliver exceptional performance and efficiency for managing event-based data across diverse applications and industries. Its intuitive features and seamless integration with existing Azure services make it an ideal choice for organizations looking to leverage real-time analytics for actionable insights. Whether it’s analyzing telemetry and log data, time series and IoT data, or financial records, Eventhouse provides the tools and capabilities needed to unlock the full potential of event-based data. 

We’re excited to announce that OneLake availability of Eventhouse in Delta Lake format is Generally Available. 

Delta Lake is the unified data lake table format chosen to achieve seamless data access across all compute engines in Microsoft Fabric. 

The data streamed into Eventhouse is stored in an optimized columnar storage format with full text indexing and supports complex analytical queries at low latency on structured, semi-structured, and free text data. 

Enabling data availability of Eventhouse in OneLake means that customers can enjoy the best of both worlds: they can query the data with high performance and low latency in their Eventhouse, and query the same data in Delta Lake format via any other Fabric engine, such as Power BI Direct Lake mode, Warehouse, Lakehouse, Notebooks, and more. 

To learn more, please visit https://learn.microsoft.com/en-gb/fabric/real-time-analytics/one-logical-copy 

A database shortcut in Eventhouse is an embedded reference to a source database. The source database can be one of the following: 

  • (Now Available) A KQL Database in Real-Time Intelligence  
  • An Azure Data Explorer database  

The behavior exhibited by the database shortcut is similar to that of a follower database. 

The owner of the source database, the data provider, shares the database with the creator of the shortcut in Real-Time Intelligence, the data consumer. The owner and the creator can be the same person. The database shortcut is attached in read-only mode, making it possible to view and run queries on the data that was ingested into the source KQL Database without ingesting it.  

This helps with data sharing scenarios where you can share data in-place either within teams, or even with external customers.  

AI Anomaly Detector is an Azure service for high-quality detection of multivariate and univariate anomalies in time series. While the standalone service is being retired in October 2026, Microsoft has open sourced the core anomaly detection algorithms, and they are now supported in Microsoft Fabric. Users can leverage these capabilities in the Data Science and Real-Time Intelligence workloads: AI Anomaly Detector models can be trained in Spark Python notebooks in the Data Science workload, while real-time scoring can be done with KQL with inline Python in Real-Time Intelligence. 
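To illustrate the train-in-a-notebook, score-in-real-time split described above, here is a minimal univariate sketch using a simple z-score threshold. This is not the open-sourced AI Anomaly Detector algorithm; the function names, data, and threshold are invented for the example:

```python
# Minimal univariate anomaly detection sketch using a z-score threshold.
# NOTE: this is NOT the AI Anomaly Detector algorithm; it only illustrates
# the "train offline, score new points in real time" shape.
import statistics

def fit(series):
    """'Training': estimate the normal behavior of the series."""
    return statistics.mean(series), statistics.stdev(series)

def score(point, model, threshold=3.0):
    """'Scoring': flag a point whose z-score exceeds the threshold."""
    mean, stdev = model
    z = abs(point - mean) / stdev
    return z > threshold, z

history = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]
model = fit(history)
is_anomaly, z = score(25.0, model)  # 25.0 is far outside the learned band
```

In Fabric, the training step would run in a Spark Python notebook and the scoring step inline in KQL; the sketch only shows the division of labor.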

We are excited to announce the Public Preview of Copilot for Real-Time Intelligence. This initial version includes a new capability that translates your natural language questions about your data to KQL queries that you can run and get insights.  

Your starting point is a KQL Queryset connected to a KQL Database or to a standalone Kusto database: 


Simply type the natural language question about what you want to accomplish, and Copilot will automatically translate it to a KQL query you can execute. This is extremely powerful for users who may be less familiar with writing KQL queries but still want to get the most from their time-series data stored in Eventhouse. 


Stay tuned for more capabilities from Copilot for Real-Time Intelligence!   

Customers can increase their network security by limiting access to Eventhouse at a tenant-level, from one or more virtual networks (VNets) via private links. This will prevent unauthorized access from public networks and only permit data plane operations from specific VNets.  

Visualize & Act 

Real-Time Dashboards have a user-friendly interface, allowing users to quickly explore and analyze their data without the need for extensive technical knowledge. They offer a high refresh frequency, support a range of customization options, and are designed to handle big data.  

The following visual types are supported, and can be customized with the dashboard’s user-friendly interface: 


You can also define conditional formatting rules to format the visual data points by their values using colors, tags, and icons. Conditional formatting can be applied to a specific set of cells in a predetermined column or to entire rows, and lets you easily identify interesting data points. 

Beyond the supported visuals, Real-Time Dashboards provide several capabilities that let you interact with your data, performing slice-and-dice operations for deeper analysis and for gaining different viewpoints. 

  • Parameters are used as building blocks for dashboard filters and can be added to queries to filter the data presented by visuals. Parameters can be used to slice and dice dashboard visuals either directly by selecting parameter values in the filter bar or by using cross-filters. 
  • Cross filters allow you to select a value in one visual and filter all other visuals on that dashboard based on the selected data point. 
  • Drillthrough capability allows you to select a value in a visual and use it to filter the visuals in a target page in the same dashboard. When the target page opens, the value is pushed to the relevant filters.    

Real-Time Dashboards can be shared broadly and allow multiple stakeholders to view dynamic, real time, fresh data while easily interacting with it to gain desired insights. 

Directly from a real-time dashboard, users can refine their exploration using a user-friendly, form-like interface. This intuitive and dynamic experience is tailored for insights explorers craving insights based on real-time data. Add filters, create aggregations, and switch visualization types without writing queries to easily uncover insights.  

With this new feature, insights explorers are no longer bound by the limitations of pre-defined dashboards. As independent explorers, they have the freedom for ad-hoc exploration, leveraging existing tiles to kickstart their journey. Moreover, they can selectively remove query segments, and expand their view of the data landscape.  


Dive deep, extract meaningful insights, and chart actionable paths forward, all with ease and efficiency, and without having to write complex KQL queries.  

Data Activator allows you to monitor streams of data for various conditions and to set up actions to be taken in response. These triggers are available directly within the Real-Time hub and in other workloads in Fabric. When the condition is detected, an action is automatically kicked off, such as sending alerts via email or Teams, or starting jobs in Fabric items. 

When you browse the Real-Time Hub, you’ll see options to set triggers in the detail pages for streams. 


Selecting this will open a side panel where you can configure the events you want to monitor, the conditions you want to look for in the events, and the action you want to take while in the Real-Time hub experience. 


Completing this pane creates a new reflex item with a trigger that monitors the selected events and condition for you. Reflexes need to be created in a workspace supported by a Fabric or Power BI Premium capacity – this can be a trial capacity so you can get started with it today! 


Data Activator has been able to monitor Power BI report data since it was launched, and we now support monitoring of Real-Time Dashboard visuals in the same way.

From real-time dashboard tiles, you can click the ellipsis (…) button and select “Set alert”.


This opens the embedded trigger pane, where you can specify the conditions you are looking for. You can choose whether to send email or Teams messages as the alert when these conditions are met.

When creating a new reflex trigger, from Real-Time hub or within the reflex item itself, you’ll notice a new ‘Run a Fabric item’ option in the Action section. This will create a trigger that starts a new Fabric job whenever its condition is met, kicking off a pipeline or notebook computation in response to Fabric events. A common scenario would be monitoring Azure Blob Storage events via Real-Time hub and running data pipeline jobs when Blob Created events are detected. 

This capability is extremely powerful and moves Fabric from a schedule-driven platform to an event-driven platform. 


Pipelines, spark jobs, and notebooks are just the first Fabric items we’ll support here, and we’re keen to hear your feedback to help prioritize what else we support. Please leave ideas and votes on https://aka.ms/rtiidea and let us know! 

Real-Time Intelligence, along with the Real-Time hub, revolutionizes what’s possible with real-time streaming and event data within Microsoft Fabric.  

Learn more and try it today https://aka.ms/realtimeintelligence   

Data Factory 

Dataflow Gen2

We are thrilled to announce that the Power Query SDK is now generally available in Visual Studio Code! This marks a significant milestone in our commitment to providing developers with powerful tools to enhance data connectivity and transformation. 

The Power Query SDK is a set of tools that allow you as the developer to create new connectors for Power Query experiences available in products such as Power BI Desktop, Semantic Models, Power BI Datamarts, Power BI Dataflows, Fabric Dataflow Gen2 and more. 

This new SDK has been in public preview since November of 2022, and we’ve been hard at work improving this experience which goes beyond what the previous Power Query SDK in Visual Studio had to offer.  

The biggest of these recent improvements was the introduction of the Test Framework in March of 2024, which solidifies the developer experience you can have within Visual Studio Code and the Power Query SDK for creating a Power Query connector. 

The Power Query SDK extension for Visual Studio will be deprecated by June 30, 2024, so we encourage you to give the new Power Query SDK in Visual Studio Code a try today if you haven’t already. 


To get started with the Power Query SDK in Visual Studio Code, simply install it from the Visual Studio Code Marketplace. Our comprehensive documentation and tutorials are available to help you harness the full potential of your data. 

Join our vibrant community of developers to share insights, ask questions, and collaborate on exciting projects. Our dedicated support team is always ready to assist you with any queries. 

We look forward to seeing the innovative solutions you’ll create with the Power Query SDK in Visual Studio Code. Happy coding! 

Introducing a convenient enhancement to the Dataflows Gen2 Refresh History experience! Now, alongside the familiar “X” button in the Refresh History screen, you’ll find a shiny new Refresh Button. This small but mighty addition lets you refresh the status of your dataflow’s refresh history without the hassle of exiting and reopening the screen. Simply click the Refresh Button, and voilà! Your dataflow’s refresh history status is updated, keeping you in the loop with minimal effort. Say goodbye to unnecessary clicks and hello to streamlined monitoring! 


  • [New] OneStream: The OneStream Power Query Connector enables you to seamlessly connect Data Factory to your OneStream applications by simply logging in with your OneStream credentials. The connector uses your OneStream security, allowing you to access only the data you have access to based on your permissions within the OneStream application. Use the connector to pull cube and relational data along with metadata members, including all their properties. Visit OneStream Power BI Connector to learn more. Find this connector in the Other category. 

Data workflows  

We are excited to announce the preview of ‘Data workflows’, a new feature within Data Factory that revolutionizes the way you build and manage your code-based data pipelines. Powered by Apache Airflow, Data workflows offer a seamless authoring, scheduling, and monitoring experience for Python-based data processes defined as Directed Acyclic Graphs (DAGs). This feature brings a SaaS-like experience to running DAGs in a fully managed Apache Airflow environment, with support for autoscaling, auto-pause, and rapid cluster resumption to enhance cost-efficiency and performance. 

It also includes native cloud-based authoring capabilities and comprehensive support for Apache Airflow plugins and libraries. 

To begin using this feature: 

1. Access the Microsoft Fabric Admin Portal and navigate to Tenant Settings. Under Microsoft Fabric, locate and expand the ‘Users can create and use Data workflows (preview)’ section. Note: This action is necessary only during the preview phase of Data workflows. 


2. Create a new Data workflow within an existing or new workspace. 


3. Add a new Directed Acyclic Graph (DAG) file via the user interface. 
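For reference, a DAG file is ordinary Apache Airflow Python code. Below is a minimal sketch of the kind of file you might add in this step; the DAG id, schedule, and task logic are illustrative assumptions, not a Fabric requirement:

```python
# A minimal Apache Airflow DAG of the kind added in this step.
# All names here (dag_id, task ids, callables) are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")

def transform():
    print("clean and reshape the data")

with DAG(
    dag_id="sample_etl",
    start_date=datetime(2024, 6, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run extract before transform
```

Once saved, the DAG can be observed from the Apache Airflow monitoring tools described in step 5.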


4. Save your DAG(s). 


5. Use Apache Airflow monitoring tools to observe your DAG executions. In the ribbon, click on Monitor in Apache Airflow. 


For additional information, please consult the product documentation. If you’re not already using Fabric capacity, consider signing up for the Microsoft Fabric free trial to evaluate this feature. 

Data Pipelines 

We are excited to announce a new feature in Fabric that enables you to create data pipelines to access your firewall-enabled Azure Data Lake Storage Gen2 (ADLS Gen2) accounts. This feature leverages the workspace identity to establish a secure and seamless connection between Fabric and your storage accounts. 

With trusted workspace access, you can create data pipelines to your storage accounts with just a few clicks. Then you can copy data into Fabric Lakehouse and start analyzing your data with Spark, SQL, and Power BI. Trusted workspace access is available for workspaces in Fabric capacities (F64 or higher). It supports organizational accounts or service principal authentication for storage accounts. 

How to use trusted workspace access in data pipelines  

Create a workspace identity for your Fabric workspace. You can follow the guidelines provided in Workspace identity in Fabric. 

Configure resource instance rules for the storage account that you want to access from your Fabric workspace. Resource instance rules for Fabric workspaces can only be created through ARM templates. Follow the guidelines for configuring resource instance rules for Fabric workspaces here. 

Create a data pipeline to copy data from the firewall-enabled ADLS Gen2 account to a Fabric Lakehouse. 

To learn more about how to use trusted workspace access in data pipelines, please refer to Trusted workspace access in Fabric . 

We hope you enjoy this new feature for your data integration and analytics scenarios. Please share your feedback and suggestions with us by leaving a comment here. 

Introducing Blob Storage Event Triggers for Data Pipelines 

A very common use case among data pipeline users in a cloud analytics solution is to trigger your pipeline when a file arrives or is deleted. We have introduced Azure Blob storage event triggers as a public preview feature in Fabric Data Factory Data Pipelines. This utilizes the Fabric Reflex alerts capability that also leverages Event Streams in Fabric to create event subscriptions to your Azure storage accounts. 


Parent/Child pipeline pattern monitoring improvements

Today, in Fabric Data Factory data pipelines, when you call another pipeline using the Invoke Pipeline activity, the child pipeline is not visible in the monitoring view. We have updated the Invoke Pipeline activity so that you can view your child pipeline runs. This requires an upgrade to any existing Fabric pipelines that use the current Invoke Pipeline activity: you will be prompted to upgrade when you edit your pipeline, and then to provide a connection to your workspace to authenticate. Another new feature that lights up with this update is the ability to invoke pipelines across workspaces in Fabric. 


We are excited to announce the availability of the Fabric Spark job definition activity for data pipelines. With this new activity, you will be able to run a Fabric Spark Job definition directly in your pipeline. Detailed monitoring capabilities of your Spark Job definition will be coming soon!  


To learn more about this activity, read https://aka.ms/SparkJobDefinitionActivity  

We are excited to announce the availability of the Azure HDInsight activity for data pipelines. The Azure HDInsight activity allows you to execute Hive queries, invoke a MapReduce program, execute Pig queries, execute a Spark program, or run a Hadoop Streaming program. Any of these five operations can be invoked from a single Azure HDInsight activity, and you can run the activity on your own or an on-demand HDInsight cluster. 

To learn more about this activity, read https://aka.ms/HDInsightsActivity  


We are thrilled to share the new Modern Get Data experience in Data Pipeline, which empowers users to intuitively and efficiently discover the right data, connection info, and credentials. 


In the data destination, users can easily set the destination by creating a new Fabric item, creating another destination, or selecting an existing Fabric item from the OneLake data hub. 


In the source tab of the Copy activity, users can conveniently choose recently used connections from the dropdown, or create a new connection using the “More” option to interact with the Modern Get Data experience. 


Related blog posts

Microsoft Fabric April 2024 Update

Welcome to the April 2024 update! This month, you’ll find many great new updates, previews, and improvements. From Shortcuts to Google Cloud Storage and S3 compatible data sources in preview, Optimistic Job Admission for Fabric Spark, and New KQL Queryset Command Bar, that’s just a glimpse into this month’s update. There’s much more to explore! … Continue reading “Microsoft Fabric April 2024 Update”

Microsoft Fabric March 2024 Update

Welcome to the March 2024 update. We have a lot of great features this month including OneLake File Explorer, Autotune Query Tuning, Test Framework for Power Query SDK in VS Code, and many more! Earn a free Microsoft Fabric certification exam!  We are thrilled to announce the general availability of Exam DP-600, which leads to … Continue reading “Microsoft Fabric March 2024 Update”

June 2024 Update


This document will continue to evolve as existing sections change and new information is added. All updates appear in the following table:


We’re here and we’re listening. If you have a suggestion on how to make our cloud services even better, then go ahead and tell us. There are several ways to submit your ideas, for example, through the Ideas Lab on Oracle Customer Connect. Wherever you see this icon after the feature name, it means we delivered one of your ideas.

GIVE US FEEDBACK

We welcome your comments and suggestions to improve the content. Please send us your feedback at [email protected] .

The information contained in this document may include statements about Oracle’s product development plans. Many factors can materially affect Oracle’s product development plans and the nature and timing of future product releases. Accordingly, this Information is provided to you solely for information only, is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described remains at the sole discretion of Oracle.

This information may not be incorporated into any contractual agreement with Oracle or its subsidiaries or affiliates. Oracle specifically disclaims any liability with respect to this information. Refer to the Legal Notices and Terms of Use for further information.

Column Definitions:

Features Delivered Enabled

Report = New or modified, Oracle-delivered, ready to run reports.

UI or Process-Based: Small Scale = These UI or process-based features are typically comprised of minor field, validation, or program changes. Therefore, the potential impact to users is minimal.

UI or Process-Based: Larger Scale* = These UI or process-based features have more complex designs. Therefore, the potential impact to users is higher.

Features Delivered Disabled = Action is needed BEFORE these features can be used by END USERS. These features are delivered disabled and you choose if and when to enable them. For example, a) new or expanded BI subject areas need to first be incorporated into reports, b) Integration is required to utilize new web services, or c) features must be assigned to user roles before they can be accessed.

Be sure to review the details of significant feature(s) being introduced.

In Enterprise Data Management, approval policies can be configured with the Management Hierarchy approval method to invite approvers based on a hierarchy of users defined in a Users type application.

In Planning, you can now collaborate with your colleagues more effectively and leverage the power of automated analysis with IPM Insights using tags.

Additionally, in the July (24.07) update of EPM Cloud, Oracle will automatically update all EPM Cloud environments that use a non-Redwood theme to the Redwood Experience. As a result, themes outside of the Redwood Experience will no longer be supported.

Business Benefit: Spotlight features are important enhancements that can significantly impact the EPM Cloud user experience.

Key Resources

  • Management Hierarchy Approvals in this What’s New document

  • Collaborating with IPM Insights Using Tags in this What’s New document

  • About Redwood Experience
  • Configuring EPM Cloud Appearance

Test Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, June 7, 2024.

Production Environments: Oracle will apply this monthly update during the first daily maintenance that occurs at or after 22:00 UTC on Friday, June 21, 2024.

NOTE: The monthly update will not be applied to any environment for which this monthly update is requested to be skipped using the EPM Automate skipUpdate command or service request to Oracle. The monthly update will also not be applied to any environment on a one-off patch where the fix for the underlying issue is not in this monthly update.

NOTE:  Backing up the daily maintenance snapshot and restoring the environment as needed are self-service operations. Oracle strongly recommends that you download the maintenance snapshot every day to a local server.

The Oracle Help Center provides access to updated documentation. The updates will be available in the Help Center on Friday, June 7, 2024.

NOTE: Some of the links to new feature documentation included in this readiness document will not work until after the Oracle Help Center update is complete.

Documentation Updates Available One Week After Readiness Documents

Updated documentation is published on the Oracle Help Center on the first Friday of each month, coinciding with the monthly updates to Test environments. Because there is a one week lag between the publishing of the readiness documents (What's New and New Feature Summary) and Oracle Help Center updates, some links included in the readiness documents will not work until the Oracle Help Center update is complete.

https://docs.oracle.com/en/cloud/saas/epm-cloud/index.html

Fixed Issues and Considerations

Software issues addressed each month and considerations are posted to a knowledge article on My Oracle Support. Click here to review. You must have a My Oracle Support login to access the article.

NOTE: Fixed issues for EPM Cloud Platform components (Smart View for Office, EPM Automate, REST API, Migration, Access Control, Data Management/Data Integration, Reports, Financial Reporting, and Calculation Manager) are available in a separate document on the My Oracle Support “Release Highlights” page.

This provides visibility into EPM Cloud release contents.

Give Us Documentation Feedback

We welcome your comments and suggestions to improve the content of the What's New document and the product documentation.

Please send us your feedback at [email protected]. In the body or title of the email, state whether you are inquiring or providing feedback, and indicate the EPM Cloud service and monthly update cycle your message applies to.

Create and Run an EPM Center of Excellence

A best practice for EPM is to create a Center of Excellence (CoE). An EPM CoE is a unified effort to ensure adoption and best practices.

Business Benefit: A CoE drives transformation in business processes related to performance management and the use of technology-enabled solutions.


Learn more:

  • Visit the EPM Center of Excellence web page to learn about the benefits.
  • Watch the webinars on Cloud Customer Connect: Creating and Running a Center of Excellence (CoE) for EPM Cloud and Planning Your Success with EPM Center of Excellence.
  • Watch the videos: Overview: EPM Center of Excellence and Creating a Center of Excellence.
  • Get best practices, guidance, and strategies for your own EPM CoE: Creating and Running an EPM Center of Excellence.

Join Oracle Cloud Customer Connect

Please take a moment to join Cloud Customer Connect and its EPM Cloud forums. Customer Connect is a community gathering place for members to interact and collaborate on common goals and objectives. It's where you will find the latest release information, discussion forums, upcoming events, and answers to use-case questions. Joining takes just a few minutes. Join now!

To join, go to https://community.oracle.com/customerconnect/ and select Register in the upper right.

After you have joined and logged in, to access the forums (Categories), from the Cloud Customer Connect home page, select Categories , then Enterprise Resource Planning , and then make your selection under Enterprise Performance Management .

To ensure that you are always in the know, confirm that you have your notification preferences set for EPM Announcements as well as each Category you're following.

  • To set notification preferences for EPM Announcements, go to Categories , then Announcements , and then Enterprise Performance Management . Next, select the Notification preferences drop down.
  • To set notification preferences for each Category, navigate to the Category page and select the Notification preferences drop down. You must go to each category page separately and select the Notification preferences drop down.

NOTE: The Settings and Actions menu contains a link to Cloud Customer Connect. To access this link, from the Home page, click the down arrow next to the user name (in the upper right-hand corner of the screen), and select Cloud Customer Connect.

TIP: Bookmark the Enterprise Performance Management Resource Center to quickly find useful information on all things EPM. As a community member, you have access to important product announcements, best practices, known issues, training highlights, and more.

Follow Us on Social Media

Follow EPM Cloud on YouTube , Twitter , Facebook , and LinkedIn .

These are great resources for the latest updates and information about EPM Cloud.

Business Benefit: These resources can help you optimize your EPM implementation and user experience by providing valuable information and user assistance.

Use the Readiness App available on the Oracle Cloud Application Update Readiness site to review information about features released for Oracle Cloud. The app provides an .xlsx file listing all features released for the Cloud products, modules, and updates that you designate. Beginning with this update, the EPM Cloud Features tool will no longer be updated.

From the Readiness site, click the red Try Our Readiness App! button in the upper right, or use this URL to access the app:

https://www.oracle.com/webfolder/technetwork/tutorials/tutorial/readiness/app/index.html

NOTE: To ensure a complete feature listing, select EPM Common in addition to the business process(es) you wish to view. To determine whether an EPM Common feature applies to your business process, review the Applies To information in the Short Description column of the .xlsx file.

NOTE: The Readiness App includes features from Oct 2021 and later. The EPM Cloud Features tool includes EPM features from March 2018 through June 2, 2023 only.

Business Benefit: The Readiness App is an interactive tool that allows you to view a comprehensive listing of all features that have been released for one or more product(s), module(s) and update(s) that you designate.

EPM Cloud Platform

NOTE: In the Applies To lists, Planning refers to all Planning application types (Custom, FreeForm, Modules, Cash Forecasting, Sales Planning, Strategic Workforce Planning) unless otherwise stated.

One of Oracle's latest advancements, Oracle Cloud Infrastructure (OCI) is the foundation of Oracle's second-generation cloud. OCI, a purpose-built, best-in-class platform for running enterprise applications, is engineered from the ground up to run mission-critical databases, workloads, and applications while providing end-to-end security. Oracle's data centers around the globe are standardizing on the new OCI architecture, which will deliver even greater performance and reliability. Several EPM Cloud features are available only in OCI. See Features Available only in OCI EPM Cloud Environments in Getting Started with Oracle Enterprise Performance Management Cloud for Administrators.

Oracle will migrate all Classic UK Government environments to OCI in Wave 9. Oracle has sent notifications for Oracle-managed migrations specifying the migration schedule. Your OCI environments were made available in May 2024. Complete the optional activities documented at Oracle-Managed Migration before July 2024.

In July 2024, while applying the monthly update 24.07 during the daily maintenance window, Oracle will automatically clone your OCI environments from Classic environments and change the DNS configuration so that the existing Classic service URLs are routed to the OCI environments. Cloning of test environments will be done on the first Friday of July 2024. Cloning of production environments will take place on the third Friday of July 2024. Oracle will terminate your Classic environments by August 31, 2024.

Applies to:  Account Reconciliation, Enterprise Data Management, Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Narrative Reporting, Planning, Profitability and Cost Management, Tax Reporting

Business Benefit: OCI provides you a purpose-built, best-in-class platform that is engineered from the ground up to run mission-critical databases, workloads, and business processes while providing end-to-end security. This new architecture delivers greater performance and reliability, and a number of EPM Cloud features that are not available in Classic EPM Cloud.

Steps to Enable

Review and follow the instructions in  EPM Cloud Classic to Oracle Cloud Infrastructure (OCI) Migration  in Oracle Enterprise Performance Management Cloud Operations Guide.

  • EPM Cloud Classic to Oracle Cloud Infrastructure (OCI) Migration in Oracle Enterprise Performance Management Cloud Operations Guide
  • Oracle-Managed Migration in Oracle Enterprise Performance Management Cloud Operations Guide
  • Features Available only in OCI EPM Cloud Environments in Getting Started with Oracle Enterprise Performance Management Cloud for Administrators

The daily backups of OCI (Gen 2) test environments are now retained for an extended period of 60 days. Previously, the backups of production environments were retained for 60 days, while those of test environments were retained for 30 days only.

Applies to:  Account Reconciliation, Enterprise Data Management, Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Narrative Reporting, Planning, Profitability and Cost Management, Tax Reporting

Business Benefit: Extended retention of backups of OCI (Gen 2) test environments addresses customers' audit and security requirements.

  • Backup Data Residency and Retention on OCI (Gen 2)  in  Getting Started with Oracle Enterprise Performance Management Cloud for Administrators

To satisfy the requirement for distinct password restrictions across different users, groups, and roles, you have the option to create multiple password policies and assign them to various IDCS groups. Users belonging to a specific IDCS group adhere to the password policy designated for that group. This information is now available in  Getting Started with Oracle Enterprise Performance Management Cloud for Administrators .

Business Benefit:  By implementing custom password policies across user roles and group memberships, customers can enforce stringent security protocols and operational efficiency.

  • See  Setting Password Policies  in  Getting Started with Oracle Enterprise Performance Management Cloud for Administrators

EPM Cloud users can now be assigned a new Access Control - Manage application role. Users with this role can manage groups and assign application roles to both groups and individual users. They can also generate reports on user security.

Applies to:  Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, Profitability and Cost Management, Tax Reporting

Business Benefit:  The Access Control - Manage role provides segregation of duties within Access Control; you don't have to provide the Service Administrator role to a user who is responsible for Access Control only.

Administering Access Control for Oracle Enterprise Performance Management Cloud:

  • Enterprise Profitability and Cost Management
  • Financial Consolidation and Close
  • Profitability and Cost Management
  • Tax Reporting

The simulateConcurrentUsage EPM Automate command has been enhanced to use existing users in the identity domain while running concurrent operations on an environment. To use existing users in the identity domain with this command, use mode 4, which uses the users defined in the users.csv file included in the input ZIP file. In this mode, the command does not create simulated users.

Applies to: Financial Consolidation and Close, FreeForm, Planning, Tax Reporting

Business Benefit: This change helps you validate the response time using existing users.

  • Installing EPM Automate
  • simulateConcurrentUsage
  • Creating the users.csv file to Run the simulateConcurrentUsage Command
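
The shape of a mode 4 input can be sketched as follows. The single "User Login" column used here is an assumption for illustration only; see Creating the users.csv file to Run the simulateConcurrentUsage Command for the exact required layout.

```python
import csv
import io

def build_users_csv(logins):
    """Build users.csv content listing existing identity-domain users.

    The one-column "User Login" header is a hypothetical layout, not the
    documented format -- consult the EPM Automate documentation before use.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["User Login"])  # hypothetical header
    for login in logins:
        writer.writerow([login])
    return buf.getvalue()

csv_text = build_users_csv(["jdoe", "asmith"])
print(csv_text)
```

The resulting file would then be packaged into the input ZIP file that the command consumes.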

The Simulate Concurrent Usage REST API has been enhanced to use existing users in the identity domain while running concurrent operations on an environment. To use existing users in the identity domain with this REST API, you use test mode 4, which includes users defined in the users.csv file included in the input ZIP file. In this mode, the REST API does not create the simulated users.

Review the REST service definition in the REST API guides, available from the Oracle Help Center (your app's service area of interest > APIs & Schema). If you are new to Oracle's REST services, you may want to begin with the Quick Start section.

Tips And Considerations

Use these Implementation Best Practices to ensure success with your REST API projects.

  • Simulate Concurrent Usage  in REST API for Enterprise Performance Management Cloud

Access Requirements

  • Service Administrators

A new Role Assignment Report for Users (v2) REST API is available that generates a Role Assignment Report of users in the environment. The report lists the roles assigned to users. It identifies each user's login name, first name, last name, email address, and assigned roles. The report can be created for a specific user, a role, or a combination of users and roles. The report includes:

  • Predefined roles (such as Service Administrator)
  • Application roles (such as Approvals - Assign Ownership, Approvals - Supervise, Approvals - Administer, and Approvals - Design Process)

Applies to: Account Reconciliation, Enterprise Data Management, Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Narrative Reporting, Planning, Profitability and Cost Management, Tax Reporting

Business Benefit: Unlike the v1 version of the Role Assignment Report REST API, this API is synchronous and returns the report content in the response. It also lets you filter the report for a particular user or group.

  • Role Assignment Report for Users (v2) in REST API for Oracle Enterprise Performance Management Cloud
  • Service Administrator or Access Control Manager
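
Because the v2 API is synchronous, its JSON response can be post-processed directly. A minimal sketch, assuming an illustrative response shape (the "items", "login", and "roles" field names below are not the documented schema; check the REST API guide for the real payload):

```python
def users_with_role(report, role):
    """Return login names of users holding the given role.

    The report structure used here is illustrative sample data, not the
    documented Role Assignment Report (v2) response schema.
    """
    return [u["login"] for u in report["items"] if role in u["roles"]]

sample_report = {
    "items": [
        {"login": "jdoe", "roles": ["Service Administrator"]},
        {"login": "asmith", "roles": ["Power User", "Approvals - Supervise"]},
    ]
}

print(users_with_role(sample_report, "Service Administrator"))  # -> ['jdoe']
```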

A new Role Assignment Report for Groups (v2) REST API is available that generates a Role Assignment Report of groups in the environment. The report lists the roles assigned to groups. It identifies each group's name, description, type, and assigned roles. The report can be created for a specific group, a role, or a combination of groups and roles.

Business Benefit: This REST API provides a way to find out the roles assigned to groups, rather than to individual users, and is helpful if you have a lot of groups defined in your environment.

  • Role Assignment Report for Groups (v2) in REST API for Oracle Enterprise Performance Management Cloud

You can now use the Get Applications REST API to view the current theme for the application.

Applies to: Enterprise Profitability and Cost Management, Financial Consolidation and Close, Planning, Tax Reporting

Business Benefit: This allows you to automate the process of getting application themes.

  • Get Applications
  • About the REST APIs for EPM Cloud
  • EPM Cloud REST API Compatibility
  • Quick Reference Table - REST API Resource View
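
As a sketch, reading the theme out of a Get Applications response might look like the following. The "items" and "theme" field names are assumptions for illustration; the REST API guide documents the actual payload:

```python
def current_theme(response, app_name):
    """Return the theme of the named application, or None if not found.

    The response dict below is illustrative sample data, not the
    documented Get Applications schema.
    """
    for app in response.get("items", []):
        if app.get("name") == app_name:
            return app.get("theme")
    return None

sample = {"items": [{"name": "Vision", "theme": "Redwood"}]}
print(current_theme(sample, "Vision"))  # -> Redwood
```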

The non-standard access keys for navigating the Member Selector dialog box of the Edit Member Properties page in Forms 2.0 and Dashboard 2.0 are now documented in the Oracle Enterprise Performance Management Cloud Accessibility Guide.

Applies to: Account Reconciliation, Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Narrative Reporting, Planning, Tax Reporting

Business Benefit: The non-standard accessibility options aid in efficient navigation of the EPM Cloud environment, business processes, and common platform components. This ensures inclusivity and compliance with accessibility standards, enhancing the user experience and facilitating seamless interaction with the platform for all users.

  • See  Access Keys for EPM Cloud Platform Tasks  in  Oracle Enterprise Performance Management Cloud Accessibility Guide

An updated EPM Books extension for Oracle Smart View for Office is now available to download and install. This update includes general improvements and defect fixes.

Applies to: Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Narrative Reporting, Planning, Tax Reporting

Business Benefit: Downloading and installing the latest EPM Books extension for Smart View gives you access to the latest features, improvements, and defect fixes.

To take advantage of the features, improvements, and defect fixes in the EPM Books extension:

  • From within Smart View, in the Options dialog, Extensions tab, click the Check for Updates, New Installs, and Uninstalls link.
  • Select your business process instance, then follow the prompts.
  • Installing the EPM Books Extension

A new application setting, Smart View Add-on, with the option Google Sheets, is now available to enable the upcoming support for the Smart View extension in Google Sheets.

Applies to: Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, Tax Reporting

When Oracle Smart View for Google Workspace becomes available, users of Google Workspace will be able to access EPM Cloud data, work on forms, and perform ad hoc analysis in Google Sheets. Service Administrators must select the Google Sheets option in application settings to enable the Smart View extension in Google Sheets. This option is not selected by default.

Business Benefit: Users will be able to work on forms and ad hoc grids and use various Smart View features in Google Sheets, when Oracle Smart View for Google Workspace becomes available.

To enable the Smart View add-on extension for Google Sheets:

  • Open the web application for your business process.
  • Click Applications, and then click Settings.
  • In the Smart View Add-on section, select the Google Sheets check box.
  • Click Save.
  • Application Settings
  • Specifying Application Settings
  • What Application and System Settings Can I Specify?

Account Reconciliation customers can now verify that the correct bank statement is being loaded using the Bank Statement Verification feature in Data Integration. When enabled, the system compares the current period expected ending balance to the ending balance provided in the BAI file being loaded.

To calculate the current period expected ending balance, the system adds the net total of the current-day transactions to the previous period ending balance for the Account ID. If a difference is detected or any error is encountered, the transactions remain in the Workbench; however, they are not exported to transaction matching.

Applies to : Account Reconciliation

Business Benefit: Bank Statement Verification enables you to identify:

  • duplicate transactions and duplicate data loads
  • missing balances or transactions
  • statements loaded in the incorrect order (that is, non-chronological order)
  • incorrect currency codes
  • Verifying BAI Bank Statement Balances in Administering Data Integration for Oracle Enterprise Performance Management Cloud
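
The verification arithmetic described above can be sketched directly; this is an illustration of the check, not Oracle's implementation:

```python
from decimal import Decimal

def verify_statement(prev_ending, transactions, reported_ending):
    """Compare the expected ending balance against the BAI file's balance.

    Expected ending balance = previous period ending balance
                              + net total of current-day transactions.
    """
    expected = prev_ending + sum(transactions, Decimal("0"))
    return expected == reported_ending

ok = verify_statement(Decimal("1000.00"),
                      [Decimal("250.00"), Decimal("-100.00")],
                      Decimal("1150.00"))
print(ok)  # -> True
```

When the comparison fails (for example, because a statement was loaded out of order), the feature leaves the transactions in the Workbench rather than exporting them.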

Forms 2.0 is now the default Forms Version option for newly created and recreated applications.

When you create a new application using the application creation wizard or using a Smart View template, the Forms Version will default to Forms 2.0. If you upgrade an existing application, the current Forms Version setting is retained. If you perform an import using Migration to create a new application, the Forms Version setting in the Migration file is retained. Forms 1.0 will continue to be an available option in the Settings page.

To change the Forms Version setting, click Application , and then click Settings . Under Other Options , find the Forms Version setting.

Business Benefit: Defaulting the Forms Version setting to Forms 2.0 ensures that you immediately get the default user experience and can take advantage of features available only in the Redwood Experience.

  • About Forms Versions

Bursting now allows you to collate multiple attachments in a zip file format and send it to a single recipient.

Applies to: Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, and Tax Reporting

Previously, separate emails were sent for each generated report or PDF. Now, you can optionally select Consolidate Attachments, which consolidates all bursting output email attachments for a single user into one email. For newly created bursting definitions, Consolidate Attachments is enabled by default. You can also consolidate bursting output email attachments into a ZIP file by selecting ZIP Attachments, and then specifying a ZIP File Folder Path to create folders within the ZIP file.

Consolidate Attachments and Zip Attachments Options in Bursting

Business Benefit: This enhancement provides customers the ability to receive multiple attachments in a single email in a convenient and hassle-free manner.

Designing with Reports for Oracle Enterprise Performance Management Cloud

  • About Bursting
  • Prerequisites
  • Simple Steps for Creating a Bursting Definition
  • Configuring the Email Channel
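
The ZIP Attachments idea can be illustrated with the standard zipfile module: collect a recipient's generated reports into one archive, placing each under a folder path inside it (mirroring the ZIP File Folder Path option). The file names and folder layout here are illustrative only:

```python
import io
import zipfile

def zip_attachments(reports, folder="Reports"):
    """Bundle report name -> bytes pairs into a single in-memory ZIP.

    Each entry is stored under `folder/` inside the archive, analogous
    to the ZIP File Folder Path setting described above.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, content in reports.items():
            zf.writestr(f"{folder}/{name}", content)
    return buf.getvalue()

archive = zip_attachments({"Q1.pdf": b"...", "Q2.pdf": b"..."})
```

The single archive would then be attached to one email instead of sending a separate email per report.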

You can now update Redwood Experience themes (Oracle, Custom Dark, and Custom Light) with a custom logo.

Applies to:  Account Reconciliation, Enterprise Data Management, Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, Profitability and Cost Management, Tax Reporting

To add a custom logo, click Tools , click Appearance , and then select Enable Redwood Experience (if it's not already selected). To replace the default Oracle logo with your own logo image, select Upload File to choose a locally stored custom image file, or select Provide URL to choose a custom image URL.

Business Benefit: This enhancement enables you to further customize your environment while using the Redwood Experience.

  • Configuring EPM Cloud Appearance in Getting Started with Oracle Enterprise Performance Management Cloud for Administrators
  • Account Reconciliation

A new How Do I … Help Center library page for Account Reconciliation is now available. The page has a simplified left navigation pane and a revamped design incorporating Redwood graphics. It lists easy, common, and necessary tasks for setting up EPM Cloud and your business process.

The topics are organized with:

  • A list of tasks, starting with the most common, initial tasks displayed first (from the top left column to the bottom right column) and organized for Administrators and Users.
  • Links to relevant documentation in multiple guides.
  • Relevant documentation, videos, and tutorials displayed together.
  • Additional troubleshooting and best practice content.

For example, to answer the question How do I configure security?, from the How Do I … page, select the Set Up and Configure graphic, then select Configure Security and Access from the drop-down menu, which links to several guides plus relevant videos and troubleshooting topics.

Business Benefit: The How Do I... Help Center tab enables you to quickly access the information you need to use Account Reconciliation.

Oracle Account Reconciliation How Do I... Tab

Account Reconciliation customers can now verify that the correct BAI bank statement is being loaded using the Bank Statement Verification feature in Data Integration. When enabled, the system compares the current period expected ending balance to the ending balance provided in the BAI file being loaded. To calculate the current period expected ending balance, the system adds the net total of the current-day transactions to the previous period ending balance for the account. If a difference is detected or any error is encountered in one or more of the accounts, an error log is produced and the transactions remain in the Workbench; however, they are not exported to transaction matching.

Business Benefit: Bank Statement Verification enables you to identify duplicate transactions and duplicate data loads, missing balances or transactions, statements loaded in the incorrect order (that is, non-chronological order), and incorrect currency codes.

  • Enterprise Data Management

An application specific validation is available to verify whether certain nodes in a viewpoint also exist in one or more other related viewpoints. The Equivalent Nodes With Filter validation is disabled by default but can be enabled for any dimension where the same nodes must be in multiple viewpoints for data integrity reasons. The Related Viewpoints and Equivalent Nodes Filter parameters need to be configured for viewpoints where the validation will be enforced.

Business Benefit: This validation ensures completeness of nodes across multiple related viewpoints in a dimension for reconciliation purposes.

Configuring Related Viewpoints in Administering and Working with Oracle Enterprise Data Management Cloud

Names can be calculated for new nodes that are added to a viewpoint using a Copy or Model After action. The calculated name can be based on the properties and relationships that are copied, or on other information about the node. Node names can be calculated when performing a Copy or Model After operation interactively, during a request file load, or using a subscription.

Business Benefit: Node names can be calculated for new nodes which are copied or modeled after existing nodes in a viewpoint.

  • Calculating and Storing the Name of a Node in Administering and Working with Oracle Enterprise Data Management Cloud

New nodes can be created based on a Copy or Model After action during a request file load or via a subscription. The request file load format includes two optional columns (Copy Node and Copy Action) which are used to perform the Copy or Model After operation when loading a request file. Subscriptions can be optionally configured with Copy Action and Properties To Match parameters to perform a Copy or Model After operation in a subscription request. The subscription request will copy or model after the node specified in the original request or can automatically choose a sibling node to copy/model after based on certain matching properties.

Business Benefit: New nodes added by a request file load can copy property values and hierarchy relationships from other existing nodes in the same viewpoint. Performing a Copy or Model After via a subscription enables nodes added to target viewpoints to copy property values and hierarchy relationships of existing nodes in those viewpoints.

  • Request Load File Format in Administering and Working with Oracle Enterprise Data Management Cloud
  • Performing Copy and Model After Operations in Subscriptions in Administering and Working with Oracle Enterprise Data Management Cloud
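
A request load file using the two optional columns might be sketched as below. Only the Copy Node and Copy Action column names come from the feature description; the remaining columns and values are hypothetical, so see Request Load File Format for the real layout:

```python
import csv
import io

# Illustrative request rows: add a node modeled after an existing sibling.
# All column names other than "Copy Node" and "Copy Action" are assumptions.
rows = [
    {"Action": "Add", "Node": "Dept-450", "Parent": "Depts",
     "Copy Node": "Dept-400", "Copy Action": "Model After"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```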

The Subscriptions report has been enhanced to include additional parameters that can be configured in a subscription. A Filters column has been added to cover all subscription filters (including the Inclusion Property filter) and the individual Actions, Top Nodes, and Node Expression columns have been removed from the report. A Copy Nodes column has been added to identify subscriptions which use the Copy Action and Properties to Match parameters. These changes are also available in the Subscriptions report download.

Business Benefit: Users can easily determine which subscriptions are configured with the Copy Nodes parameters or Inclusion Property filter.

  • Subscriptions Report in Administering and Working with Oracle Enterprise Data Management Cloud

Events for the creation and deletion of additional metadata objects are tracked and can be reviewed from the Audit System Events screen in the user interface. Audit results can be filtered by a selected object type in the Metadata event category.

Create and delete events are now recorded for the following metadata objects:

  • Bindings and global connections
  • Extracts and extract packages
  • Queries and compare profiles
  • Subscriptions and time labels

Business Benefit: Application owners and Service Administrators can identify when these metadata objects are created and deleted along with who was responsible.

  • Auditing System Events in Administering and Working with Oracle Enterprise Data Management Cloud

An escalation indicator is displayed on the Workflow tab of the Request inspector for invitees who were invited to approve or commit because the request was escalated to them for one or more workflow policies. This allows users to determine when a request has been escalated and to whom.

Business Benefit: Users of a request can easily identify approvers and committers who were invited when a request is escalated.

  • Inspecting Requests in Administering and Working with Oracle Enterprise Data Management Cloud

Subscriptions are now automatically ignored during request processing after a dimension or viewpoint is archived. When a dimension or viewpoint is unarchived, subscriptions will be reactivated to continue generating subscription requests. In previous releases, subscriptions for an archived dimension needed to be manually disabled in order to prevent them from being run and when a viewpoint was unarchived, subscriptions were not automatically re-enabled.

Business Benefit: Owners and metadata managers no longer need to manually disable and enable subscriptions when dimensions and viewpoints are archived and unarchived.

  • Subscribing to Viewpoints in Administering and Working with Oracle Enterprise Data Management Cloud

Approval policies can be configured with the Management Hierarchy approval method to invite approvers based on a hierarchy of users defined in a Users type application. Users who are ancestors of the request owner in the management hierarchy are individually invited to approve their requests in a sequential manner. The approval policy can be set up with a fixed or variable number of approvals before the policy is fulfilled.

Business Benefit: Requests submitted by different users can have different approvers based on an organizational hierarchy that is maintained in Enterprise Data Management or loaded from a source system.

  • Understanding Management Hierarchy Approvals in Administering and Working with Oracle Enterprise Data Management Cloud
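
Conceptually, inviting approvers from a management hierarchy amounts to walking up the request owner's chain of ancestors until the required number of approvals is reached. A minimal sketch with an illustrative parent map (not Enterprise Data Management's implementation):

```python
def invite_approvers(manager_of, owner, required_approvals):
    """Collect the owner's ancestors, in order, up to the approval count.

    manager_of maps each user to their manager; the org data below is
    illustrative only.
    """
    approvers = []
    user = manager_of.get(owner)
    while user is not None and len(approvers) < required_approvals:
        approvers.append(user)
        user = manager_of.get(user)
    return approvers

org = {"analyst": "manager", "manager": "director", "director": "vp"}
print(invite_approvers(org, "analyst", 2))  # -> ['manager', 'director']
```

A variable number of approvals would simply change the stopping condition on the walk.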

A new video is available:

Registering Enterprise Profitability and Cost Management Applications

This tutorial video shows you how to register an Enterprise Profitability and Cost Management application in Enterprise Data Management. Use the registration wizard to manage connections, cubes, dimensions, and attributes for an external Enterprise Profitability and Cost Management application.

An updated video is also available:

Understanding the Information Model

This overview video introduces you to the information model in Enterprise Data Management. Users browse dimension data by using viewpoints. Learn how dimensions, viewpoints, node sets, hierarchy sets, and node types interact to display dimension data in viewpoints.

Business Benefit: Videos provide 3-5 minute overviews and step-by step instructions to perform tasks and achieve an outcome.

The extractPackage EPM Automate command is now available. This command runs an extract package consisting of multiple extracts for an application using a single operation. The results of each extract in the package are added to a ZIP file, which can be written to the staging area or to a global connection.

Business Benefit: This command helps you extract packages to a file or destination so that they can be propagated across environments.

  • extractPackage command

The node clipboard enables users to select multiple nodes from a viewpoint and insert them together into one or more other viewpoints. Nodes can be added to the clipboard from any viewpoint and then inserted under different parents in the same viewpoint, or into other viewpoints in the same view, using a drag-and-drop operation. Nodes that are no longer required on the clipboard can be removed as necessary. Nodes remain on the clipboard until the view is closed or reloaded.

Business Benefit: Alternate hierarchies and viewpoints consisting of many nodes can be quickly constructed and aligned using the node clipboard. Users can easily insert multiple nodes into multiple viewpoints increasing productivity and reducing the manual effort to perform those changes.

  • Working with the Node Clipboard in Administering and Working with Oracle Enterprise Data Management Cloud

The Oracle EDM Cloud – Oracle Database | Sync Enterprise Data recipe is now available for Oracle Integration. This recipe enables you to synchronize enterprise data bidirectionally between Oracle Enterprise Data Management Cloud and Oracle Database. The recipe contains two integrations. Use the EDM DB Extract Sync integration to extract enterprise data from a viewpoint in Oracle Enterprise Data Management and load it into a database table. Use the DB EDM Extract Sync integration to extract data from a database table and load it into a viewpoint in Oracle Enterprise Data Management.

NOTE: You must create an agent group and install the on-premises connectivity agent before you download the recipe. This allows messages to be exchanged between Oracle Integration and your locally installed Oracle Database resource.

Business Benefit: This recipe accelerates the integration of Oracle Enterprise Data Management with Oracle Database by loading a viewpoint in Enterprise Data Management from a database table, or by hydrating a database table with a viewpoint extract from Enterprise Data Management.

Sync Oracle Enterprise Data Management Cloud with Oracle Database

A system setting is available to prevent the deletion of submitted requests that were recalled or pushed back to the request owner. The setting can be configured by a service administrator and applies to submitted requests for any application and view in the system. When a user attempts to delete a recalled or pushed-back request, the request is transitioned to the Closed stage instead of being deleted.

Business Benefit: This setting retains obsolete requests that were recalled or pushed back so that they can be audited.

  • Configuring System Settings in Administering and Working with Oracle Enterprise Data Management Cloud

The Reject workflow action for a request can optionally be restricted for approvers and committers of certain types of requests. A system setting is available to enable or disable this restriction and to select the request types for which rejection is prevented. When rejection is not permitted, a request that has not been approved can only be pushed back or recalled.

Business Benefit: Restricting the Reject action prevents requests from being unintentionally closed by approvers or committers. These requests remain active so they can be modified as needed and successfully completed, rather than requiring a new request to be created and submitted.

A priority can be set for enterprise data requests to identify the urgency of the changes included in the requests. The request owner can set or modify the priority during the Submit stage. The Request Activity page provides a Priority column and filter to find requests based on their priority. Workflow policies can be filtered by request priority to enable different workflows for requests with different priorities.

Business Benefit: Users can specify the priority of their requests to describe their importance and route the requests through a conditional workflow process to ensure they are approved and committed in a timely fashion.

  • Setting and Viewing Request Priority in Administering and Working with Oracle Enterprise Data Management Cloud

Custom validations can be configured with a Node or Parent scope to control which node is evaluated during request item validation. Node scope validations evaluate the node for which request actions have been performed. Parent scope validations evaluate the parent of the node that has request actions. Validations with the Parent scope run only for changes made to hierarchy viewpoints. When validating a viewpoint outside of a request context, Parent scope validations run for every node in the viewpoint.

Business Benefit: Parent nodes in hierarchy viewpoints can be validated when request actions such as Insert, Move, and Remove are performed on their child nodes. For example, a parent node might be required to have at least one child node of a particular type, or prohibited from having multiple children with certain property values.

  • Working with Custom Validations in Administering and Working with Oracle Enterprise Data Management Cloud
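The Node vs. Parent scope distinction above can be illustrated with a small model. This is not an Enterprise Data Management API — the function, hierarchy, and node names are made up to show which node each scope evaluates:

```python
def node_to_validate(changed_node, scope, get_parent):
    """Return the node a custom validation evaluates, based on its scope.

    Illustrative only: models the Node vs. Parent scope behavior
    described above, not an actual Enterprise Data Management API.
    """
    if scope == "Node":
        return changed_node              # evaluate the node with request actions
    if scope == "Parent":
        return get_parent(changed_node)  # evaluate that node's parent
    raise ValueError(f"unknown scope: {scope}")


# Hypothetical hierarchy: Cash is a child of Assets.
parents = {"Cash": "Assets"}

# Inserting Cash under Assets: a Parent scope validation checks Assets,
# e.g. "a parent must have at least one child of a particular type".
checked = node_to_validate("Cash", "Parent", parents.get)
```

A Node scope validation on the same request action would evaluate `Cash` itself rather than its parent.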

The Connection parameter specified when running extracts and extract packages now provides the option to use a global connection other than the one that is preconfigured. This capability is available when running extracts and extract packages via the REST API or EPM Automate. In previous releases, the Connection parameter supported only the global connection that is configured with the extract definition or extract package.

Business Benefit: Output files for an extract and extract package can be written to multiple global connections by specifying a different connection at run time.

  • Extract a dimension viewpoint in REST API for Oracle Enterprise Data Management Cloud Service
  • Run an extract package in REST API for Oracle Enterprise Data Management Cloud Service
  • extractDimension in Working with EPM Automate for Oracle Enterprise Performance Management Cloud
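As a rough sketch of the run-time Connection override above, the following builds a REST job payload. The endpoint is not shown and every field name except the idea of a connection override is an assumption — see the "Extract a dimension viewpoint" REST API reference for the documented request shape:

```python
import json

# Hypothetical payload for running an extract via REST with a run-time
# connection override; the parameter names are illustrative, not the
# documented API contract.
payload = {
    "jobType": "Extract_Dimension",      # assumed job type name
    "extractName": "AccountExtract",     # assumed extract identifier
    # Override the global connection configured on the extract definition:
    "connection": "ArchiveConnection",
}
body = json.dumps(payload)  # serialized request body for the REST call
```

Omitting the override would fall back to the connection configured with the extract definition, matching the pre-24.06 behavior.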

Dimension exports for Financials Cloud General Ledger applications include an additional column in the GLSegmentHierInterface file to identify the order of sibling nodes under parents in hierarchies. This column is used by the Import Segment Values and Hierarchies process in Oracle ERP Cloud to control the order of tree nodes that are created or updated.

Business Benefit: General ledger hierarchies in Oracle ERP Cloud use the same order of nodes as defined in Enterprise Data Management hierarchy viewpoints.

  • Exporting Oracle Financials Cloud General Ledger Dimensions in Administering and Working with Oracle Enterprise Data Management Cloud

A Sort option can be configured for one or multiple columns of a viewpoint extract to control the order of rows that are written to the output file. Column-based sorting is available for Full extracts only. A viewpoint extract can have one primary and one secondary sort column. Rows in the output file can be sorted in ascending or descending order.

Business Benefit: Rows in an extract output file can be ordered differently than the order in which the nodes exist in the viewpoint for the extract.

  • Selecting Extract Columns in Administering and Working with Oracle Enterprise Data Management Cloud
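The primary/secondary column sort described above behaves like an ordinary two-key sort. A minimal sketch, with hypothetical column names and extract rows:

```python
# Hypothetical Full-extract rows keyed by two columns.
rows = [
    {"Parent": "Assets", "Name": "Cash"},
    {"Parent": "Assets", "Name": "AR"},
    {"Parent": "Liabilities", "Name": "AP"},
]

# Python's sort is stable, so sorting by the secondary column first and
# the primary column second yields the combined ordering; descending
# order on a column maps to reverse=True.
rows.sort(key=lambda r: r["Name"])                   # secondary: ascending
rows.sort(key=lambda r: r["Parent"], reverse=True)   # primary: descending
```

After both passes the rows are ordered by Parent descending, then Name ascending within each parent, independent of the order in which the nodes exist in the viewpoint.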

A new application type is available in Enterprise Data Management to manage dimensions and mappings for the Tax Reporting business process in Oracle EPM Cloud. Tax Reporting applications can be registered on the Applications page, may include standard and attribute dimensions, and provide support for a variety of tax reporting options. Mapping bindings and viewpoints can also be created for Tax Reporting applications after registration. The application type includes predefined properties and validations for managing data, as well as bindings and connections for importing and exporting data.

Business Benefit: Tax Reporting application dimensions and mappings can be managed in a dedicated application type in Enterprise Data Management.

  • Working with Tax Reporting Applications in Administering and Working with Oracle Enterprise Data Management Cloud

Identity Cloud Service (IDCS) groups can now be directly granted permissions as well as directly assigned to policies, subscriptions, and views. Users in the IDCS groups inherit the grants and assignments for the IDCS group that they belong to. In previous releases, IDCS groups needed to belong to an EPM group in order to be granted permissions for these Enterprise Data Management artifacts.

Business Benefit: IDCS groups can be directly granted permissions and assigned to policies, subscriptions, and views.

The ExtendMovementScope substitution variable is now enabled by default. This allows you to include Movement dimension members that are created outside of the FCCS_Movements hierarchy in translation and consolidation calculations. To disable this behavior, set this substitution variable to False.

Business Benefit: The ExtendMovementScope feature enables you to include more Movement dimension members in translation and consolidation, including default translations, Translation Override entries and rules, Configurable Calculation rules, On-Demand rules, and Configurable Consolidation rules.

Translating Data in Administering Financial Consolidation and Close

There are no new features in this update except for the applicable features listed in the EPM Cloud Platform section.

  • Narrative Reporting

An updated Narrative Reporting extension for Oracle Smart View for Office is now available to download and install. Along with general improvements and defect fixes, this update includes support for the 24.06 feature, New Search Field in Embed Contents Dialog in Smart View.

Business Benefit: Downloading and installing the latest Narrative Reporting extension for Smart View gives you access to the latest features, improvements, and defect fixes.

To take advantage of the features, improvements, and defect fixes in the Narrative Reporting extension, choose an option to install the extension update:

  • From within Smart View, click the Check for Updates, New Installs, and Uninstalls link in the Options dialog, Extensions tab, select your Narrative Reporting instance, then follow the prompts.
  • From the Narrative Reporting web interface, Downloads page, download the "Smart View Extension for EPM Cloud Narrative Reporting" SVEXT file, then double-click the file to install.
  • Installing the Narrative Reporting Extension in Working with Oracle Smart View for Office
  • Downloading and Installing Clients in Getting Started with Oracle Enterprise Performance Management Cloud for Administrators
  • Downloading and Installing Clients in Getting Started with Oracle Enterprise Performance Management Cloud for Users

When working with Narrative Reporting in Smart View, a new Search field in the Embed Contents dialog helps you locate content when the list of available content is long.

This feature requires the 24.06 Narrative Reporting extension update.

To use the new Search field, open a report package, open and check out a doclet, and click the Embed button in the Narrative Reporting ribbon. The Embed Contents dialog is displayed:

Embed Contents Dialog Showing Full List of Available Content

Enter a term to search on. In the following example, "sum" is entered. There is no need to enter wildcards; Narrative Reporting searches for and returns all available content that contains "sum" anywhere in the content name.

Embed Contents Dialog Showing Filtered List After Performing Search

You can refine the entire list of available content or the search results by selecting an option from the Filter drop-down list next to the Search field:

Embed Contents Dialog Drop-down List for Filtering Search Results

The default selection is All, to show all available content. You can choose Current Doclet, to display only available content in the current doclet; or Reference Doclets, to display available content from all reference doclets in the report package. You can also choose Hide Embedded, to hide available content that has already been embedded in the current doclet.

Optionally, click the X icon in the Search field to clear the search term and return the list of available content to its original state.

Once you've located the available content to embed, you can use the Show Preview button to preview the content, click Embed to embed the content in your doclet, or click Cancel to leave the dialog.

Business Benefit: The new Search field in the Embed Contents dialog allows you to quickly locate available content.

This feature requires the 24.06 Narrative Reporting extension. To install the extension, see Updated Narrative Reporting Extension for Smart View.

  • Installing the Narrative Reporting Extension
  • Embedding Content in a Doclet
  • Adding and Embedding Available Content from Reference Files
  • Smart View and Narrative Reporting

A new How Do I... Help Center library page for Planning is now available. The page has a simplified left navigation pane and a revamped design incorporating Redwood graphics. It lists easy, common, and necessary tasks for setting up EPM Cloud and your business process.

Business Benefit: The How Do I... Help Center tab enables you to quickly access the information you need to use Planning.

Oracle Planning How Do I... Tab

You can now create and work with multiple versions of models. This feature is available in Planning and in Strategic Modeling Extension for Smart View.

Business Benefit: You no longer need to create a new entity archive at every check-in.

  • Opening, Checking Out, and Deleting Models  in Working with Planning
  • Opening a Model in  Working with Strategic Modeling in Smart View

You can now collaborate with your colleagues more effectively and leverage the power of automated analysis with IPM Insights using tags.

  • Use # to tag insights with a custom label. For example, you can tag a set of insights for team review by adding #TeamReview.
  • Use @ to tag a user or group.

Then, filter your Insights dashboard to see tagged insights.

You add tags in the Comments box for an insight.

To add tags to an insight:

  • From the Home page, click IPM and then click Insights .
  • From the Actions menu for an insight, click Add Comment. Alternatively, from the Actions menu, click View Details and then, on the Insights Details page, click the Comments icon. You can also click the link in the Details column on the Insights dashboard and then, on the Insights Details page, click the Comments icon.
  • In the Comments box, enter # followed by your custom label. Or, to tag a user or group, enter @ followed by the user or group name.
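Since tags can appear anywhere in the comment text, pulling them out is a simple pattern match. A sketch with a made-up comment; the `\w+` tag pattern is an assumption, since the exact characters allowed in a tag name are not specified here:

```python
import re


def extract_tags(comment):
    """Split out #labels and @user-or-group tags from a comment.

    The \\w+ pattern is an assumption; the product may allow a
    different character set in tag names.
    """
    labels = re.findall(r"#(\w+)", comment)    # custom labels, e.g. #TeamReview
    mentions = re.findall(r"@(\w+)", comment)  # tagged users or groups
    return labels, mentions


labels, mentions = extract_tags("Please review #TeamReview #ActionItem @jsmith")
```

Filtering a list of insights by tag then reduces to checking membership in the extracted label list.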

Entering Tags in the Comments Box from the Insights Dashboard

Entering Tags in the Comments Box from the Insights Details Page

Notes about tags:

  • To review insights tagged with your user name or group name, in the Insights dashboard, click My Insights in the Filter area.
  • The most frequently used tag is highlighted with a chip in the Filter area. Select it to see all insights associated with that tag.
  • To search for tagged insights, enter # or @ followed by the tag text in the Search box.

For example, the Insights dashboard is filtered to show My Insights and insights tagged with #ActionItem :

Insights Dashboard Filtered for Tags

Administrators can manage tags, for example, delete tags that are no longer used or add new tags.

To manage tags:

  • From the Home page, click IPM and then click Tags .
  • To add a new tag, click Add Tag and then enter a tag name.
  • To delete a tag, click the Actions menu next to a tag and then click Delete .

Business Benefit: Users can collaborate more effectively using tags.

  • Tags are case sensitive.
  • Tags can appear anywhere in the Comments text.
  • You can include any number of tags in the Comments.
  • Collaborating with Insights Using Tags
  • Managing Tags

A new parameter is available in the "Update Dimensions as a Job" REST API and in the EPM Automate loadDimData command. The optional acceptableDecreasePercentage parameter specifies the maximum percentage decrease in member count that is allowed for the operation.

Business Benefit: This parameter provides a mechanism to safeguard against data loss that can happen during a subsequent cube deployment if one or more dimensions did not get fully updated due to missing data in the input file.

  • Update Dimensions as a Job  in REST API for Oracle Enterprise Performance Management Cloud
  • loadDimData in Working with EPM Automate for Oracle Enterprise Performance Management Cloud
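A rough sketch of how the safeguard might appear in an "Update Dimensions as a Job" REST request body. Every field name except acceptableDecreasePercentage is illustrative — see the REST API reference for the documented request shape:

```python
import json

# Hypothetical job payload; only acceptableDecreasePercentage comes from
# the release note above, the other fields are assumed for illustration.
payload = {
    "jobType": "IMPORT_METADATA",
    "jobName": "LoadAccounts",
    "parameters": {
        # Fail the load if the member count would drop by more than 10%,
        # guarding against a truncated or incomplete input file.
        "acceptableDecreasePercentage": 10,
    },
}
body = json.dumps(payload)  # serialized request body for the REST call
```

With the threshold set, a dimension load from a file missing a large block of members is rejected before a later cube deployment can drop those members.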

You can now apply calculation segmentation to custom calculation rules. If the Calculation Segmentation option is selected when defining a custom calculation rule target, rules are segmented into smaller rules to execute separately.

NOTE: Use calculation segmentation only when your target range is extremely large and sparsely populated. As a best practice, contact Oracle Support before proceeding with this option.

Business Benefit: Calculation segmentation enables the calculation to process rules with a very large data range, which would otherwise be too large for Essbase to process.

  • Defining a Custom Calculation Rule Target  in Administering and Working with Enterprise Profitability and Cost Management

The Properties pane on the Calculation Control page now shows whether all rules were included when a model was calculated. In addition, you can now generate a Model Snapshot Documentation report from the Actions menu on this page.

Business Benefit: Having the additional calculation information and quick access to the Model Documentation report on the Calculation Control page eliminates the need to go to the Calculation Analysis page to get this information.

  • Using the Calculation Control Page in Administering and Working with Enterprise Profitability and Cost Management

When Migration Export runs in full backup mode, in addition to exporting the details of the latest calculation run for each POV, any failed calculation runs from the last seven days are now exported as well. That is, any calculation runs from the last seven days that end in a status other than "Completed" or "Completed With Warnings" are included in the export.

Business Benefit: Details of recent failed calculation runs are useful for Oracle Support when analyzing customer issues with calculation runs. Keeping them in the export also gives customers better context when their snapshots are restored.

  • Calculating Models  in Administering and Working with Enterprise Profitability and Cost Management

You can now specify the TRCS_Domicile_Input Jurisdiction member when importing data via data import, Data Exchange, or Supplemental Data Management in Tax Reporting. Using TRCS_Domicile_Input allows you to import data without specifying the correct national Jurisdiction (domicile) for each entity. After data import, the system automatically identifies the national Jurisdiction (level 0 descendant of All National) for each Entity based on the Entity's Domicile attribute configuration. The system then routes the data to the appropriate national Jurisdiction.

Business Benefit: With this new capability, you do not need to specify each Entity's national jurisdiction when importing data, reducing time, effort, and the risk of data becoming trapped in the wrong Jurisdiction because of incorrect mappings.

Administering Tax Reporting

  • Automatic Mapping of Jurisdiction with Entity
  • Importing Data Using Data Management
  • Form Template Sections: Mapping Tab

One-Week Lag Between Readiness Documents Live and Help Center Live

Because Oracle readiness documents (What's New and New Feature Summary) go live one week before the monthly update is applied to Test environments, some documentation links included in the readiness documents will not work until the Oracle Help Center update is complete.

In addition to the applicable Important Actions and Considerations discussed in the EPM Cloud Platform section below, this update includes Important Actions and Considerations specific to:

EPM CLOUD PLATFORM

Install the Newest Version of EPM Automate

Oracle strongly recommends that you install and use the latest version of EPM Automate. The best practice is to update EPM Automate every month so that you can take advantage of the latest new features, bug fixes, and stability, security, and reliability improvements.

  • Upgrading EPM Automate

Upgrade to the Redwood Experience and the Removal of Non-Redwood Themes

In the July update (24.07) of EPM Cloud, Oracle will automatically update all EPM Cloud environments that use a non-Redwood theme to the Redwood Experience. As a result, themes outside of the Redwood Experience will no longer be supported.

Customers who are currently using non-Redwood themes are encouraged to switch to the Redwood Experience as soon as possible to take advantage of its numerous features, including compact navigation and theme background choices with support for custom logos and backgrounds. Oracle recommends a maximum Windows display scale setting of 125% when using the Redwood Experience.

For more information on the Redwood Experience, see these information sources in Getting Started with Oracle Enterprise Performance Management Cloud for Administrators:

NOTE: Starting with this update (24.06), non-Redwood themes are no longer supported for Enterprise Data Management.

Applies to: Account Reconciliation, Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Narrative Reporting, Planning, Profitability and Cost Management, Tax Reporting

Allow access to static.oracle.com

Customers are reminded to allow access to static.oracle.com, which provides image files, JavaScript, and other static content required for the Redwood Experience to work.

See About Redwood Experience in Getting Started with Oracle Enterprise Performance Management Cloud for Administrators.

Fixed Issues in Forms 2.0 and Dashboard 2.0

Many of the gaps and issues that customers have raised for Forms 2.0 and Dashboard 2.0 through Cloud Customer Connect forums have been addressed. We encourage you to use Forms 2.0 and Dashboard 2.0 to ensure that your use cases are being met.

The issues that were addressed are posted in a knowledge article on My Oracle Support. You must have a My Oracle Support login to access the article.

Plan to Upgrade All Environments from Non-Hybrid Essbase Version to Hybrid Essbase Version

In the May (24.05) update, Oracle upgraded all Profitability and Cost Management environments on a non-hybrid Essbase version to a hybrid Essbase version. In the June (24.06) update, Oracle is planning to upgrade all Narrative Reporting environments on a non-hybrid Essbase version to a hybrid Essbase version. Oracle plans to upgrade all other environments on a non-hybrid Essbase version to a hybrid Essbase version in future updates, starting with the July (24.07) update. If your environments are selected for Essbase upgrade in the July (24.07) update, you will get a notification by mid-June 2024.

This change affects the Essbase version used in EPM Cloud; it does not affect the cube configuration in your application. For example, if the application is configured to use non-hybrid cubes, this update will not change it to use hybrid cubes. For detailed information on the use of Essbase in EPM Cloud, see About Essbase in EPM Cloud in Getting Started with Oracle Enterprise Performance Management Cloud for Administrators .

To ensure that your application is compatible with the hybrid Essbase version, we've implemented a new utility that verifies your member formulas and provides a report so you can fix any issues. The Essbase Outline Validation menu option is in the Application Overview Actions menu. Select the Pre validate Outline option to validate your application, and then select Outline Pre-validation Report to view a list of member formulas that must be fixed to be compatible with the hybrid Essbase version. After you have fixed the member formulas, use the Pre validate Outline option again to make sure that all issues are resolved.

To verify member formulas:

  • From the Home page, click Application , and then click Overview .
  • Click Actions , select Essbase Outline Validation , then select Pre validate Outline .
  • To view the validation report, select Outline Pre-validation Report .

If your environment is on a non-hybrid Essbase version, use the Essbase Outline Validation option to resolve all validation errors as soon as possible, so that you avoid problems after your environment is upgraded to the hybrid Essbase version. If your environment is already on a hybrid Essbase version, you don't need to take any action.

To check whether your environment is on a hybrid or non-hybrid Essbase version, look at the value of Essbase Version supports Hybrid Block Storage Option in the Activity Report. If the value is Yes, your environment is on a hybrid Essbase version; if the value is No, it is on a non-hybrid Essbase version.

For more information about this option:

  • In Administering Financial Consolidation and Close, see Validating the Essbase Outline.
  • In Administering FreeForm, see Validating the Essbase Outline.
  • In Administering Planning, see Validating the Essbase Outline.
  • In Administering Tax Reporting, see Validating the Essbase Outline.
  • In Getting Started with Oracle Enterprise Performance Management Cloud for Administrators, see About Essbase in EPM Cloud.

Plan to Remove Users Administration and Groups Administration from Audit Reports

Currently, you can get information on Users Administration and Groups Administration from the audit reports available in the Identity Console. You can also get this information using EPM Automate and EPM Cloud REST APIs. In an upcoming monthly release, information on Users Administration and Groups Administration will no longer be included in audit reports.

In the Enable Audit dialog box, the Users Administration and Groups Administration options will remain available for some time so that customers who already have these records can view existing user and group provisioning records. However, after this change, no new records will be displayed in the audit reports.

  • Enterprise Profitability and Cost Management: Auditing Overview in Administering and Working with Enterprise Profitability and Cost Management
  • Financial Consolidation and Close: Auditing Information Overview in Administering Financial Consolidation and Close
  • FreeForm: Auditing Overview in Administering FreeForm
  • Planning: Auditing Overview in Administering Planning
  • Tax Reporting: Auditing Information Overview in Administering Tax Reporting
  • roleAssignmentAuditReport
  • groupAssignmentAuditReport
  • Role Assignment Report
  • Role Assignment Audit Report for OCI Gen 2
  • Group Assignment Audit Report

Applies to : Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, Tax Reporting

Additional Execution Sequencing Routines for Task Manager Jobs

In a future update, Task Manager Jobs will include additional execution sequencing routines to ensure application and data integrity. When potential conflicting jobs are executed simultaneously, the service will sequence the jobs to ensure application integrity. While the individual job’s execution times will not be impacted, the job’s start time may be delayed due to the optimization sequencing.

Applies to: Financial Consolidation and Close, Planning, Tax Reporting

Change in the Default Format When Exporting to Microsoft Excel

In a future update, the default format for exporting table data to Microsoft Excel from Task Manager, Supplemental Data Manager, and Enterprise Journals will be changed to .XLSX instead of .XLS.

Plan to Discontinue Support for Infolets

Oracle plans to stop supporting the creation of infolets later this year. Applications using the Redwood Experience will no longer see the option to create infolets on the listing pages. In applications using non-Redwood themes, the Create button on the Infolets page will be disabled. There will be no future enhancements or support for infolets.

Future Deprecation of Native Mode Option for Smart View Ad Hoc Behavior Application Setting

Native Smart View Ad Hoc Behavior mode is currently still available for EPM Cloud.

Standard is the ad hoc mode on which all EPM Cloud enhancements are being delivered.

Oracle plans to stop supporting Native mode by the end of 2024. Customers on Native mode are advised to convert the Smart View Ad Hoc Behavior application setting for their environments to Standard mode and test their use cases. Any gaps found should be logged as enhancement requests to Standard mode on the Customer Connect EPM Platform Idea Lab by the end of April 2024 for Oracle to review.

Perform the following steps to begin working with Standard mode in your test environment:

  • In Application Settings , change Smart View Ad Hoc Behavior to Standard .
  • Open worksheets from Smart View and reconnect.
  • Refresh the sheets.

The expectation is that existing Native-mode worksheets will work "as is" when the setting is changed to Standard. New ad hoc sheets will be created only in Standard mode, using Standard features.

Smart Forms are not supported in Standard mode and there is no plan to support them in Standard mode.

Announcement: Data Management Feature Migration to Data Integration

The user interface pages listed in the table below are no longer available in Data Management, but are available in Data Integration. Data Integration is available from the Data Exchange card on the home screen in the Cloud EPM business process, and users can access these features in the current Cloud EPM update. Data Management is not going away; for now, we are moving only the features that have 100% parity with Data Integration. Profitability and Cost Management customers are not affected by the migration and do not see a change in their Data Management user interface. REST APIs are not impacted by this change.

Data Integration is the next generation of the Data Management user interface, enabling users to easily build and manage Cloud EPM integrations. As feature parity between Data Integration and Data Management becomes complete, Data Management features will be turned off, and users will use the new Data Integration user interface instead.

This transition is gradual; future What's New documents will include information about the first set of planned changes as well as updates about additional changes planned for the future.

All Data Integration features discussed in this document are currently generally available in the Cloud EPM business processes.

Integration definitions built with Data Management are also visible in Data Integration, which enables an easy transition. (Data Integration is a new user interface on the Data Management data model and does not require migration of content from Data Management to Data Integration).

Please note that additional new integration features will only be included in Data Integration, and will not be back-ported to Data Management. Critical bug fixes and security fixes will still be made to Data Management until all features are fully migrated. In addition, all features from Data Management will be migrated to Data Integration with the exception of the following:

  • The batch feature will be replaced by the new Pipeline feature. The Pipeline feature was available in the June (23.06) update.
  • The Report Definition feature will not be migrated, only the Report Execution feature. Please note that Account Reconciliation, Financial Consolidation and Close, Planning and Tax Reporting provide a feature to report against the Data Integration relational tables using a custom SQL and a BI Publisher report template via Task Manager.
  • The ability to create new Custom Applications in Data Management is no longer available, and customers should use the "Data Export to File" application type instead. (Existing integrations using a custom application will not be impacted.) See Custom Application in Data Management under EPM Cloud Platform in the Important Actions and Considerations section of this document.

For reference, please see the Data Integration guide available from the documentation library for your specific EPM business process. Select the desired business process, then Books, and then scroll down to the Administering Data Integration for Oracle Enterprise Performance Management Cloud documentation link.

Applies to: Account Reconciliation, Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, Tax Reporting

See Administering Data Integration for Oracle Enterprise Performance Management Cloud for more information.

Removal of Support for Custom Application in Data Management

Since the September (23.09) update, custom target applications can no longer be added in Data Management. (Existing integrations that use a custom target application will not be impacted, and will still run without any changes.) This type of application was used to extract data from the EPM Cloud, and then the data was pushed into a flat file instead of being loaded to an EPM Cloud application. The custom application was superseded by the Data Export to File feature in an earlier update. The Data Export to File feature has enhanced functions and capabilities.

If you still have a custom target application, it is recommended that you use the Upgrade Custom Applications option to migrate your existing custom target application to a data export to file application. For more information, see Upgrade Custom Applications at: https://docs.oracle.com/en/cloud/saas/enterprise-performance-management-common/erpia/upgrade_custom_applications_100x438d5119.html.

The migration converts existing file formats from the custom target application to the file formats used in the data export to file option and retains all the existing setup. When the custom target application has been converted, you can run the same integration as before. Data export file applications are available both in Data Management and Data Integration.

Applies to: Account Reconciliation, Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, Tax Reporting

Calculation Manager Errors and Considerations Enforcement

In a future update, Calculation Manager will enforce the execution of the Errors & Considerations diagnostic tool for any rule being launched, validated, or deployed. This enhancement is scheduled to coincide with the enhancement to the Rules listing page. Administrators will see the Rule Status indicating if there are Errors and/or Considerations in the listing page in addition to the Errors & Considerations tab in Calculation Manager.

Rules will continue to deploy and execute even with Errors or Considerations. In all cases, existing rules that are already deployed will continue to run as-is without any change in behavior. However, Oracle Support will require all Errors to be cleared before reviewing any issues submitted for such rules. Any remaining Considerations will require justification for why they are not cleared before Oracle Support will review any such rules.

Change in Behavior for Grids Created from Groovy and REST APIs for ASO Cubes

In a future update, the suppression behavior for grids created via Groovy DataGridDefinitionBuilder and the exportdataslice REST API for ASO cubes will be made consistent with that of the run time data grids created using the Form Designer. Previously, if a grid was built for an ASO cube in Groovy or REST API, then the system used the NON EMPTY MDX clause to suppress missing rows when the suppressMissingBlocks flag was true. Now, when a grid is built for an ASO cube, it will use the NON EMPTY MDX clause when suppressMissingRows is true, and suppressMissingBlocks will be ignored. This means that ASO grids with suppressMissingRows true and suppressMissingBlocks false will now start using MDX. Likewise, ASO grids with suppressMissingRows false and suppressMissingBlocks true will no longer use MDX. Grids where the suppressMissingRows and suppressMissingBlocks flags had the same value will not be impacted. These changes may result in a change in behavior or performance for some ASO grids. If this happens and is undesirable, consider toggling the value of suppressMissingRows from the builder or the JSON payload in case of the REST API.
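The new rule for ASO cubes can be summarized in a short sketch. This is an illustrative Python model, not Oracle code: the helper names (`aso_uses_non_empty`, `build_grid_payload`) are hypothetical, and the payload field names mirror the `suppressMissingRows` / `suppressMissingBlocks` flags described above but should be verified against the REST API reference for your release.

```python
def aso_uses_non_empty(suppress_missing_rows, suppress_missing_blocks,
                       new_behavior=True):
    """Return True if an ASO grid query would include the NON EMPTY MDX clause.

    New behavior: only suppressMissingRows is honored for ASO cubes;
    suppressMissingBlocks is ignored.
    Old behavior: suppressMissingBlocks drove the NON EMPTY clause.
    """
    if new_behavior:
        return suppress_missing_rows
    return suppress_missing_blocks


def build_grid_payload(cube, suppress_missing_rows, suppress_missing_blocks):
    """Assemble a minimal exportdataslice-style request body (illustrative
    shape only; consult the REST API documentation for the exact fields)."""
    return {
        "cube": cube,
        "gridDefinition": {
            "suppressMissingRows": suppress_missing_rows,
            "suppressMissingBlocks": suppress_missing_blocks,
        },
    }
```

Note that grids where both flags carry the same value behave identically under either rule, which is why only mixed-flag grids are affected by the change.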

Report Import/Export to Excel: Font Installation Requirements for Text Boxes in Excel

Due to performance and rendering quality enhancements to Report text boxes when exported to Excel, text boxes may appear distorted or with overlapping text when a report is imported from or exported to Excel if the fonts used within the text boxes are not installed on the client machine. To fix this, install the missing fonts on the client machine.

Applies to: Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Narrative Reporting, Planning, Tax Reporting

Oracle Financial Reporting Statement of Direction

Please refer to the announcement on Cloud Customer Connect in the EPM Cloud Platform category: Announcing Updated Guidance on Cloud EPM Financial Reports Deprecation

Please refer to the Statement of Direction for Oracle Financial Reporting:

Oracle Support Document 2910806.1 (EPM Cloud Financial Reporting Statement of Direction)

Please note that the Oracle Financial Reporting tentative deprecation dates have been moved to mid-to-late CY25.

For more information:

  • See Appendix B in the Designing with Reports Guide: Migrating Reports from Financial Reporting.
  • On June 8, Oracle Cloud Customer Connect hosted the event Migrating Your Financial Reporting to Reports. The presentation, event recording, and an FAQ sheet are available; you must log in to Cloud Customer Connect to view them.

ACCOUNT RECONCILIATION

1:1 Currency Rates Are No Longer Required

Updates have been made throughout the application to automatically convert between currencies at a 1:1 rate when the currency codes are the same. For example, when the amount in the Entered bucket is USD and needs to be converted to USD in the Functional bucket, a 1:1 rate is automatically assumed, and therefore does not need to be loaded into currency rates. This simplifies configuration for customers with either single or multi-currency requirements.

In a future update, the default format for exporting the table data to Microsoft Excel will be changed to .XLSX instead of .XLS.

ENTERPRISE DATA MANAGEMENT

Checklists in Announcements No Longer Supported

Starting with the June (24.06) update, checklists in announcements are no longer supported. Announcements that already contain checklists will continue to be displayed, but you will no longer be able to add new checklists to announcements.

Removal of Non-Redwood Themes and Upgrade to a Redwood Theme

Starting with the June (24.06) update, themes outside of the Redwood experience are no longer supported in Enterprise Data Management and have been removed from the Appearance page. Environments that are configured to use a non-Redwood theme will be automatically upgraded to a Redwood theme. The following table identifies the upgrade path for each non-Redwood theme:

FINANCIAL CONSOLIDATION AND CLOSE

Enterprise Journals Preparer Column

Starting in this (24.06) update, the Enterprise Journal Preparer column will be populated with the Preparer name only for Ad-hoc journals and, once claimed, for recurring journals as well. Journals created prior to this update will retain their current values.

Intercompany Matching Report Entity Name Format

If you specify any Consolidation member within the FCCS_Entity Total hierarchy, the report displays the Entity Name only, without the Parent member reference, because both primary entity and shared entity instances are the same.

If you specify a node-level Consolidation member (any member above FCCS_Entity Total), the report displays the member name in the Entity column using the [Parent].[Child] format, because the node-level data could be different for each Parent.Child pair.

Enterprise Journal Dimension Persistence

In a future update, all Enterprise Journal dimension attributes will persist their values once they are set in the Journal.

Future Updates to Ownership Management Locking / Unlocking

Currently there is a single Process Management Approval Unit for all instances of a shared entity. As a result, when data for one instance is Locked in Process Management, all instances are Locked. The same Locking logic will apply to Ownership Management. When data is locked for an Approval Unit POV, Ownership Management settings are also locked for the same Approval Unit POV. As noted above, this link from Process Management Locking / Unlocking to Ownership Management settings can be disabled using a Substitution Variable. Note that this ability to disable the link will be temporary.

When future updates to improve Organization-by-Period functionality are made available, there will be one Process Management Approval Unit for each instance of a shared entity rather than one Approval Unit for all instances. When this is first implemented, the same Locking / Unlocking logic will apply. When the Process Management Approval Unit is locked, then both data and Ownership Management settings will be Locked, but on an instance-by-instance basis for shared entities. At this point, the link between Approval Unit Locking / Unlocking and Ownership Management settings Locking / Unlocking will no longer be able to be disabled. This is to ensure that a change in Ownership settings cannot cause Locked data to be impacted.

A further update will then be implemented to provide a little more flexibility in Locking / Unlocking Ownership Management settings. These settings will be able to be Locked before the Process Management Approval Unit for the same POV is Locked. Ownership Management settings will only be able to be Unlocked if the matching Process Management Approval Unit is unlocked. This will allow users to work on updating data without worrying about ownership settings affecting their results. Again, the link and dependency between Locking / Unlocking data and Ownership Management settings will not be able to be disabled.

Substitution Variables

Substitution variables allow you to enable or disable new Financial Consolidation and Close features as needed. The following substitution variables are available:

YTDFXRevised

When enabled, the Year-to-Date FX calculation logic on historical accounts is changed in the consolidation script for Movements. If the account is Historical, then no adjustment is made for the prior period FX on Opening Balance and FX on Movements for all individual accounts, OBFXCTA, OBFXCICTA, and R/E Prior. YTDFX will be enabled for the first year, so in the first year (except for the first period), the system will use YTD FX calculations.

Starting in the July (24.07) update, the default value will be set to True, and this feature will be enabled by default. To disable the feature, you can change the value of the substitution variable to False.

Performance Substitution Variables

The OptimizeConcurrency, OptimizePelimCalculation, EnableYearlyConsol, ParallelCustomDimDSO and ParallelCustomDimTranslation substitution variables can be enabled to improve performance.

  • OptimizeConcurrency = True

This substitution variable improves concurrency of the consolidation process by executing some of the calculations at the very beginning or at the end. The degree of improvement depends on the entity structure of a given customer. Customers with deeper entity hierarchies will benefit the most.

  • OptimizePelimCalculation = True

This substitution variable improves Partner Eliminations (PElim) performance. If there is a consolidation performance degradation when deploying a user-created “Partner Eliminations Configurable Consolidation Rule” that has an account re-direction, adding this variable can provide significant performance improvement.

  • EnableYearlyConsol = True

The EnableYearlyConsol substitution variable can be enabled to improve performance for multi-period consolidations in applications that use the Dense/Sparse Optimization option (where Period and Movement are the Dense dimensions).

This substitution variable is applicable if the application meets all of these conditions:

  • The application uses the Dense/Sparse Optimization option (Period and Movement are Dense dimensions)
  • You have two or more dirty periods, and two or more hierarchy levels
  • The dirty entities are identical across periods
  • Equity Pickup sequence is not enabled

  • ParallelCustomDimDSO = True

This substitution variable improves the performance of the consolidation process in applications with Dense/Sparse Optimization, where the Period and Movement dimensions are Dense dimensions. To see performance improvements, you should set this substitution variable to True.

  • ParallelCustomDimTranslation = True

This substitution variable can improve the performance of the consolidation process in applications with Dense/Sparse Optimization, where the Period and Movement dimensions are Dense dimensions. To see performance improvements, you should set this substitution variable to True. If any degradation is noticed, set the variable to False or delete it.

NOTE: The degree of performance improvement varies widely across different applications as it is purely driven by the application design and data distribution.
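The substitution variables above are ordinary name/value pairs set at application scope, so they can be managed programmatically. Below is an illustrative sketch of assembling the request body for the EPM Cloud REST API substitution-variables endpoint. This is an assumption-laden example, not a confirmed call: the `PERF_VARIABLES` dictionary, the helper name, and the `planType` value are illustrative, and the payload shape (`items` / `name` / `value` / `planType`) should be checked against the REST API documentation for your release before use.

```python
# Hypothetical performance-variable set drawn from the list above;
# values are strings because substitution variables are stored as text.
PERF_VARIABLES = {
    "OptimizeConcurrency": "true",
    "OptimizePelimCalculation": "true",
    "EnableYearlyConsol": "true",
    "ParallelCustomDimDSO": "true",
    "ParallelCustomDimTranslation": "true",
}


def build_subvar_payload(variables, plan_type="ALL"):
    """Build a request body that sets each substitution variable at
    application scope (field names are illustrative; verify against the
    REST API reference for your release)."""
    return {
        "items": [
            {"name": name, "value": value, "planType": plan_type}
            for name, value in variables.items()
        ]
    }
```

Building the body separately from sending it makes the variable set easy to review (or diff against the current values) before applying it to a production environment.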

NARRATIVE REPORTING

Bursting Definition Backward Compatibility in 24.06

Due to the bursting enhancements implemented in this update (24.06), bursting definitions created in this update are not backward compatible with prior updates. Exporting a bursting definition created in 24.06 and migrating it to a prior update is not supported.

PROFITABILITY AND COST MANAGEMENT

Oracle Profitability and Cost Management Application Type Statement of Direction

The Oracle Profitability and Cost Management Application Type Statement of Direction provides an overview of the strategic plans and future direction of Oracle Profitability and Cost Management Cloud Service and the Oracle Enterprise Performance Management Enterprise Cloud Service Profitability and Cost Management business process. It includes information about the shift in development focus to the newer Enterprise Profitability and Cost Management application type, available exclusively with Enterprise Performance Management Enterprise Cloud Service.

Oracle Support Document 2955235.1 (Oracle Profitability and Cost Management Application Type Statement of Direction)

ENTERPRISE PROFITABILITY AND COST MANAGEMENT

Preferences Related to Calculation Performance

Enterprise Profitability and Cost Management provides three calculation-related preferences that can be used to alter calculation performance characteristics and behavior in certain special cases. Each preference can be set or changed by creating a substitution variable. When creating the substitution variable, be sure to create it for “All Cubes”.

NOTE: The default values shown in the table do not require the creation of a substitution variable.
