SSRS 2008 Problem: Message: Invalid URI: The Authority/Host could not be parsed

You could face this issue while connecting to Reporting Services from configuration manager.

The error below typically appears while accessing the Web Service URL property in Reporting Services Configuration Manager, and the main reason behind it is a broken/invalid <UrlString> tag.

SSRS_CONFIG_Error

SSRS_URL_TAG

To confirm this, go to the location mentioned below (or the equivalent path for your instance) and open the rsreportserver.config file in Notepad: \Program Files\Microsoft SQL Server\MSRS10.SQL2008_PROD\Reporting Services\ReportServer

Resolution: Correct the <UrlString></UrlString> value accordingly and save the file; Reporting Services Configuration Manager should now connect.
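For reference, a healthy URL reservation entry in rsreportserver.config looks roughly like the fragment below. This is only a sketch assuming the default port 80 reservation and the Network Service account; the exact port, account and surrounding elements will differ on your server.

<URL>
  <UrlString>http://+:80</UrlString>
  <AccountName>NT AUTHORITY\NetworkService</AccountName>
</URL>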


Report server has detected a possible denial of service attack

We encountered a strange issue where Reporting Services stopped responding for some users. When we investigated the issue and checked the Event Viewer, we found the warnings below.

Warning: The report server has detected a possible denial of service attack. The report server is dropping requests for service from the xxx.xxx.xxx.xx

Warning 2: Exception information:
    Exception type: HttpException
    Exception message: Server Too Busy

ssrs_conn_11

SSRS_Conn_2

The reason is that when the number of connections from the same user exceeds the maximum allowed number of concurrent connections per user, Reporting Services will not handle new requests from that user and will wait until existing requests have terminated.

By default Reporting Services allows a maximum of 20 connections from one user, as shown below, so to resolve this issue you need to increase this value as per your requirement.

Resolution: Go to the location mentioned below, or wherever the rsreportserver.config file is located for your installation.

\Program Files\Microsoft SQL Server\MSRS10.MSSQLSERVER\Reporting Services\ReportServer

  • Open the file in Notepad, find "MaxActiveReqForOneUser" and change its value as per your need, for example from 20 to 50.

<Add Key="MaxActiveReqForOneUser" Value="20"/>
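For example, to allow up to 50 concurrent requests per user, the same line would be changed to:

<Add Key="MaxActiveReqForOneUser" Value="50"/>

Save the file; restarting the Reporting Services service afterwards is the safest way to make sure the new value is picked up.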

Hope this will resolve your issue.

SSRS ERROR: Managed server timed out. Timeout=150 sec. GC extension timeout=30 sec.

We had an issue in which Reporting Services 2008 running on Windows Server 2008 suddenly stopped responding. When we connected to Reporting Services it did not respond, and even Reporting Services Configuration Manager failed to load the Reporting Services configuration.

We explored the Reporting Services logs, which are generally available at the location below, and found something interesting.

%ProgramFiles%\Microsoft SQL Server\MSSQL.x\Reporting Services\LogFiles

ERROR: Managed server timed out. Timeout=150 sec. GC extension timeout=30 sec.

SSRS_LOG1

SSRSLOG

During the investigation it looked like a bug in Reporting Services; it revealed that the major cause of Reporting Services becoming unresponsive or crashing was slow GC (garbage collection, which manages the allocation and release of memory for the application). This can be seen clearly in the logs: "Granting additional timeout of 30 sec." indicates that the managed server timed out while a GC was in progress, so slow GC was the likely reason.

Slow GC usually occurs when the box has a lot of RAM and the report server process is using a lot of it. The problem is that for managed applications a garbage collection (GC) occurs periodically, and during a GC the process is suspended temporarily.

Resolution:

Microsoft suggests that killing long-running queries/processes related to SSRS from a query window should resolve this issue (see the link and the sketch below).

http://connect.microsoft.com/SQLServer/feedback/details/519612/reporting-services-2008-crashing
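A minimal sketch of how such sessions could be located before killing them from a query window; the database names below are the default SSRS catalog databases, so adjust them to your installation.

-- Find long-running requests against the SSRS catalog databases
SELECT r.session_id,
       r.start_time,
       r.status,
       r.command,
       DB_NAME(r.database_id) AS database_name
FROM sys.dm_exec_requests AS r
WHERE DB_NAME(r.database_id) IN ('ReportServer', 'ReportServerTempDB')
  AND r.session_id <> @@SPID
ORDER BY r.start_time;

-- Then terminate a specific offending session, for example:
-- KILL 78;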

Resolution which worked for us

We tried several things, but in the end the resolution that worked for us was to apply the latest service pack (SP3) for SQL Server 2008; after applying it, Reporting Services returned to its normal, consistent state.

 

Unable to add new Article in Replication

I encountered an interesting issue with transactional replication where I was unable to add a newly created table to some subscribers. My replication setup consists of 1 Publisher and 7 Subscribers.

I followed the steps below to add a new article to my existing publication, which worked fine in my Development environment, which is almost identical to Production.

  1. To avoid a complete snapshot of the Publisher database, I set allow_anonymous and immediate_sync to False (by default both are set to True):
    EXEC sp_changepublication @publication = 'PubDB',
    @property = 'allow_anonymous',
    @value = 'false'
    GO
    EXEC sp_changepublication @publication = 'PubDB',
    @property = 'immediate_sync',
    @value = 'false'
    GO
  2. Added the new table to the existing publication properties (a T-SQL sketch of steps 2-4 follows this list).
  3. Started the Snapshot Agent.
  4. The Snapshot Agent created a snapshot for the one new article, and the generated scripts can be seen in the repl data folder (screenshot: Snapshot_generation).
  5. Started the Distribution Agent (ignore this step if it is already running) to deliver the snapshot to the subscriber; it should look like the screenshot below (SnapshotDelivered1).
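For reference, steps 2-4 can also be done in T-SQL. The sketch below assumes a new table named dbo.NewTable (a placeholder) and the publication PubDB from step 1; adjust both to your setup.

-- Run in the publication database on the Publisher
-- Add the new table as an article to the existing publication
EXEC sp_addarticle
     @publication   = N'PubDB',
     @article       = N'NewTable',
     @source_owner  = N'dbo',
     @source_object = N'NewTable',
     @type          = N'logbased';
GO

-- Make the existing subscriptions aware of the new article
-- (depending on whether your subscriptions are push or pull, you may instead
--  need sp_addsubscription for the new article on each subscriber)
EXEC sp_refreshsubscriptions @publication = N'PubDB';
GO

-- Generate a snapshot covering only the newly added article
EXEC sp_startpublication_snapshot @publication = N'PubDB';
GO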

Here I noticed that on 3 of the 7 sites the Distribution Agent did not apply the snapshot on the subscriber, and after some time the folder containing the schema, index and BCP files for this snapshot was deleted.

This is because my distribution agent cleanup job was configured to run every 5 minutes to control the size of the distribution database and the shared drive containing the snapshots.

EXEC dbo.sp_MSdistribution_cleanup  @min_distretention = 0, @max_distretention = 72

Initially I focused on the property immediate_sync = TRUE, which I believed could be responsible for deleting the snapshot before it was applied on the subscriber.

But after looking into my distribution agent cleanup job properties and referring to the article, my assumption started to fall apart: the difference is that in my case @min_distretention = 0, i.e. do not wait, delete the transactions as soon as they have been delivered to the subscriber.

If immediate_sync = TRUE, keep transactions EVEN IF they have been replicated to the subscriber, but delete any transaction older than 72 hours. Metadata that is younger than 72 hours but older than 4 hours will be kept. This is conservative.

If immediate_sync = FALSE, keep transactions for at least 4 hours, but if a transaction has already been replicated (distributed) to the subscriber, go ahead and delete it EVEN IF it is not older than 72 hours. Metadata that is younger than 72 hours but older than 4 hours may be deleted quite aggressively.

Later on I focused on the replication script and noticed a small change in the subscription properties of these 3 subscribers, as shown below, but the impact was big.

@sync_type = N'replication support only'

subprop

This property allows you to manually synchronize your data and schema on the publisher and the subscriber, while the support objects needed to make replication work are still pushed automatically to the subscriber. The benefit is that you can still take advantage of the replication tools to set up replication, but you also have more control over getting your data and schema in sync. This is great if you already have a database in place and you only want to set up replication: instead of having to drop and recreate the tables, this option will just push out the objects needed for replication and not affect your existing schema or data.

But you will not be able to add a new article to the publication once you configure a subscriber with this option, and this is by design for "Replication Support Only".

Property

Workaround:-

There is no direct way to fix this, as this property is not available in the GUI and cannot be modified, so you have to drop and recreate the subscription with @sync_type = N'automatic'. But in a mission-critical application this is not always possible, so one approach is to apply the snapshot script manually on the subscriber: if you have the script copied from the repl data folder, execute it on all the subscribers that have this problem.

legal

Important:- While you are manually adding this article, ensure that no transactions are made against this table, otherwise you will encounter errors like "Object Not Found" or "Row Not Found".

Resolution:-

  • Create a new database with the same name on another SQL instance and change web.config to point to this database; meanwhile you can reconfigure your subscriber using the re-initialization method with @sync_type = N'automatic' (a sketch follows this list).
  • If your application accesses the subscriber database using synonyms from another database, then create a database with the same name (using restoration) on the same SQL instance and point the synonyms to it; meanwhile you can reconfigure your subscriber using the re-initialization method with @sync_type = N'automatic'.
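A rough sketch of dropping and re-adding one push subscription with the default sync type; the subscriber name and destination database below are placeholders, and PubDB is the publication name used earlier.

-- Run in the publication database on the Publisher
EXEC sp_dropsubscription
     @publication    = N'PubDB',
     @article        = N'all',
     @subscriber     = N'SUBSCRIBER1',
     @destination_db = N'SubDB';
GO

EXEC sp_addsubscription
     @publication       = N'PubDB',
     @subscriber        = N'SUBSCRIBER1',
     @destination_db    = N'SubDB',
     @subscription_type = N'push',
     @sync_type         = N'automatic';
GO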

When was this option enabled on my subscription, and why?

Well, there is no direct way to identify when this configuration was done, so you have to rely on the distribution agent job creation date or check the subscriber database restoration date. These might give you some idea whether, at some point in the past, replication was synchronized with this option to avoid any downtime.

 

Who Detached a Database on SQL Server

Recently we received a complaint from a customer that one of their databases gets detached automatically once a week, and they have no idea why this is happening.

The other problem is that when they try to re-attach it, SQL Server says the file is already in use, even though the data file is not attached to any database.

Technically speaking, a SQL Server database can be detached only by a user with sufficient rights (sysadmin), and only with manual intervention, i.e. either by script (a job, a .NET application, PowerShell, etc.) or through SSMS.

We can track this event from the default trace as shown below.

DECLARE @path NVARCHAR(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;   -- default trace file

-- Detach operations are logged in the default trace as DBCC DETACHDB
SELECT TE.name AS EventName, DT.TextData, DT.HostName, DT.StartTime,
       DT.ApplicationName, DT.LoginName
FROM dbo.fn_trace_gettable(@path, DEFAULT) DT
INNER JOIN sys.trace_events TE
        ON DT.EventClass = TE.trace_event_id
WHERE DT.TextData LIKE 'DBCC DETACHDB%';

DetachDB

It has been observed that when you detach a database, the SQL Server service account can lose its permissions on the data and log files; because of this you may get an error that the file is already in use even though it is not. So I would suggest taking the database offline first and then detaching it, which does not reset the permissions on the data/log files; otherwise you will have to remove the unnecessary permissions and add the SQL Server service account permission back on the files.
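A minimal sketch of that sequence, using a placeholder database name:

-- Take the database offline first so the detach does not reset NTFS permissions on the files
ALTER DATABASE [MyUserDB] SET OFFLINE WITH ROLLBACK IMMEDIATE;
GO

-- Then detach it
EXEC master.dbo.sp_detach_db @dbname = N'MyUserDB';
GO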

MDFPer

 

SQL Server is using more memory than assigned: is this normal?

First of all, make sure the minimum memory of SQL Server is set to 0 MB, because this will ensure that even in a worst-case memory crunch the SQL Server service will come up. Keep in mind that by default SQL Server will consume as much memory as you allow it, and apart from the database engine there are other components to consider, such as SSIS, SSRS and SSAS, because they consume memory outside of SQL Server's max memory setting. So if you are running these components on the server, you have to ensure you leave enough headroom (roughly 4 GB for the OS plus 2 GB for these components, over and above SQL Server's max memory).

In your case SQL Server is assigned 13 GB out of 16 GB total, so SQL Server will try to capture all 13 GB by design. The most important and interesting point is that when you assign the max memory limit you are ONLY defining the buffer pool size. So if you are a DBA and the Windows team is bugging you that your SQL Server is using more memory than assigned, go deaf to the Windows team for a while and investigate once they transfer the ticket to you.

MinMAx

PErf

As I said above, you are only defining the MAX buffer pool size, not the complete memory for SQL Server, so there are other memory allocations required apart from the buffer pool, for SQLOS and for the internal operations of SQL Server, and these allow it to allocate more than max server memory. Logically, in SQL Server 2008 R2 and all earlier versions, max memory just controls the size of the buffer pool; you can say you are defining an upper limit for the buffer pool, where data pages and the procedure cache are stored, but there are other memory clerks within SQL Server that also use memory. Hence it is sometimes normal to see SQL Server using more memory than assigned.

Now the question is what action a DBA should take when he sees that server memory usage is at 98%.

Don't jump to the conclusion that there is memory pressure on the server; first we need to consider some performance counters.

  • Target vs Total SQL Memory

Most blogs will point to the Target vs Total memory counters of SQL Server and say that Target Server Memory > Total Server Memory indicates memory pressure; however, I would say this is not a reliable indicator on its own.

ToalTarget
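Both counters can also be read directly from the DMVs; a quick sketch (the counter values are reported in KB):

-- Target vs Total server memory
SELECT counter_name,
       cntr_value / 1024 AS value_mb
FROM sys.dm_os_performance_counters
WHERE counter_name IN (N'Target Server Memory (KB)', N'Total Server Memory (KB)');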

  • Page Life Expectancy

It is the expected time, in seconds, that a data file page read into the buffer pool (the in-memory cache of data file pages) will remain in memory before being pushed out to make room for a different data file page. Another way to think of PLE (Page Life Expectancy) is as an instantaneous measure of the pressure on the buffer pool to make free space for pages being read from disk. For both of these definitions, a higher number is better.

However, it is trickier when you have a NUMA configuration: the buffer pool is split up into buffer nodes, with one buffer node per NUMA node that SQL Server can see. Each buffer node tracks PLE separately, and the Buffer Manager: Page life expectancy counter is just the average of the buffer node PLEs.

As a rough rule of thumb, a value of 300 seconds (5 minutes) or more is fine.
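On a NUMA box you can compare the overall value with the per-node values; a sketch using the performance-counter DMV:

-- Overall PLE (Buffer Manager object) and per-NUMA-node PLE (Buffer Node object)
SELECT [object_name],
       instance_name,
       cntr_value AS ple_seconds
FROM sys.dm_os_performance_counters
WHERE counter_name = N'Page life expectancy'
ORDER BY [object_name], instance_name;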

  • Memory Grant Pending

Every query needs memory to execute, and a query memory grant is the part of server memory used to store temporary row data while sorting and joining rows. It is called a "grant" because the server requires queries to reserve that memory before actually using it. This reservation improves query reliability under server load, because a query with reserved memory is less likely to hit an out-of-memory condition while running, and the server prevents one query from dominating the entire server memory.

If Memory Grants Pending is averaging over 0, queries can’t even start because they can’t get enough memory. You may have queries that have incorrect memory grants, or you may just desperately need more RAM.

In the graph below you can see the yellow line is always 0, which means sufficient memory was available the whole time; but if you look closely, my page life expectancy varies from 1 minute to 200 minutes, which indicates that something is happening on my server.

PLE_MGP

  • Buffer Cache Hit Ratio

It gives the ratio of data page requests satisfied from the SQL Server buffer cache to all data page requests. Pages that are not found in the buffer cache are read from disk, which is significantly slower and affects performance. This value should always be close to 100.

BC
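Both Memory Grants Pending and the buffer cache hit ratio can also be pulled from sys.dm_os_performance_counters; the hit ratio has to be computed from its base counter, roughly like this:

-- Outstanding memory grants (should normally be 0)
SELECT cntr_value AS memory_grants_pending
FROM sys.dm_os_performance_counters
WHERE counter_name = N'Memory Grants Pending';

-- Buffer cache hit ratio = ratio counter / base counter * 100
SELECT 100.0 * r.cntr_value / NULLIF(b.cntr_value, 0) AS buffer_cache_hit_ratio_pct
FROM sys.dm_os_performance_counters AS r
JOIN sys.dm_os_performance_counters AS b
     ON b.[object_name] = r.[object_name]
WHERE r.counter_name = N'Buffer cache hit ratio'
  AND b.counter_name = N'Buffer cache hit ratio base';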

I have covered some important performance counters in one of my other blog posts; below is the link where you can get more information.

https://coresql.com/2013/10/23/how-to-handle-memory-pages-per-second-is-too-high/

What is Integration Services Catalog or SSISDB in SQL 2012

Recently one of my friends, who is a BI expert, gave me the idea to use the Integration Services Catalog to manage SSIS packages in a more efficient way. He showed me a glimpse of this new feature, and I have to say

it is a really useful and nice concept. I tried it and implemented it in one of my environments; I also tried migrating a complete Integration Services catalog with all environment variables from one server to another, and implemented a mirroring solution on the Integration Services Catalog database (SSISDB) to increase the availability of SSISDB and of the jobs/SSIS packages.

Prior to SQL Server 2012 and SSISDB, all packages were stored either in MSDB or on the file system; there is also a third option, the SSIS Package Store. The SSIS Package Store is nothing but a combination of SQL Server and file system deployment, as you can see when you connect to SSIS through SSMS: it looks like a store which has categorized its contents (packages) into different categories based on its manager's (which is you, as the package developer) taste. So don't take it as something different from the two types of package deployment (MSDB and file system).

SSIS3

Because SSIS packages are really just XML files, many organizations have used the "copy and configure" approach to deployments. In such cases, the packages are typically copied manually, or via batch scripts, to shared locations or the MSDB database, where they are executed. Configuration information is then stored in configuration files or in special database tables meant only for storing SSIS configuration values.

The "copy and configure" approach can lead to problems. For example, on one project all the SSIS configuration parameters, including connection strings, were stored in a database table. As a result, whenever we restored a copy of the production database into the test environment, all the SSIS packages in the test environment would point to the production database.

In an attempt to solve these types of problems, SSIS 2012 provides some new package deployment features, including the new SSISDB catalog and database. SSIS 2012 stores all the packages, projects, parameters, permissions, server properties, and operational history in the SSISDB database, bringing together all the “moving parts” for any SSIS deployment. You access the SSISDB database in SQL Server Management Studio (SSMS) by expanding the Databases node in Object Explorer.

What is Integration Services Catalog?

The SSIS Catalog is a new repository model for Integration Services, introduced in SQL Server 2012, and this new repository model comes with a lot of features for developers and database administrators.

The SSISDB catalog is the central point for working with Integration Services (SSIS) projects that you’ve deployed to the Integration Services server. For example, you set project and package parameters, configure environments to specify runtime values for packages, execute and troubleshoot packages, and manage Integration Services server operations. The objects that are stored in the SSISDB catalog include projects, packages, parameters, environments, and operational history.

SSISCataog2

The project deployment model of SSIS 2012 no longer stores packages in MSDB or on the file system. Instead, once you configure the Integration Services Catalog on a server, it creates a separate SSISDB database on the database server. The project deployment model also allows you to define different environments (DEV, CONS and PROD) to assign the correct values for the parameters used in your packages according to the environment against which they are executed.

Packages can be deployed to SSISDB, and you can consider this a best practice. This feature keeps a deployment history of packages (like a very basic version control), so you can even roll back some revisions of your package. You no longer need XML files or dedicated SQL tables to save your configurations. Managing security is also very easy, because now everything can be handled via SQL Server security instead of DCOM.

Advantages of Integration Services Catalog (SSISDB)?

  • Packages can be deployed directly to SSISDB using SQL Server Data Tools (BI studio). Packages keep a deployment history, so you can even roll back some revisions of your package.
  • You can manage security through SQL Server, because now everything can be handled via SQL Server security.
  • The Integration Services Catalog (SSISDB) comes with a new feature called the Integration Services Dashboard, a report automatically built with a Reporting Services template. Just open the Integration Services Catalog and right-click your packages to view "All Executions".
  • You can see very detailed information about your packages, including execution time.

SSISCataog1

  • You can parameterize your Connection Manager or just parts of it.

Example: you can parameterize the server name, and in your SSISDB you can create two (or more) environments called "CONS" and "PROD". Then you can add variables to both of them and map them to the input parameters of your package. The main advantage is that you can deploy a package to SSISDB, link it to an environment, and you don't have to handle the connection strings yourself (a T-SQL sketch of running a package against such an environment reference follows this list).

  • The main advantage of the new model is the configuration. You don't need XML files or dedicated SQL tables to save your configurations; you can use input parameters and map them to environments defined on the SQL Server.
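As an illustration, once a project is deployed you can start a package from T-SQL against a given environment reference; the folder, project and package names below are placeholders, and the reference id would be the id of your CONS or PROD environment reference.

-- Create an execution for a deployed package (names are placeholders)
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name     = N'MyFolder',
     @project_name    = N'MyProject',
     @package_name    = N'LoadData.dtsx',
     @reference_id    = NULL,          -- or the id of the CONS / PROD environment reference
     @use32bitruntime = 0,
     @execution_id    = @execution_id OUTPUT;

-- Start the execution
EXEC SSISDB.catalog.start_execution @execution_id;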

How to configure Integration Services Catalog (SSISDB)?

How packages are stored in various version of SQL Server?

  • 2005 – stored in msdb.dbo.sysdtspackages90
  • 2008 – stored in msdb.dbo.sysssispackages (I seem to recall 2008 RTM using a different table, or reused the 90 table but that got patched out)
  • 2008 R2 – stored in msdb.dbo.sysssispackages
  • 2012 (package deployment model) – stored in msdb.dbo.sysssispackages
  • 2012 (project deployment model) – stored in SSISDB.catalog.packages*
  • 2014 (package deployment model) – stored in msdb.dbo.sysssispackages
  • 2014 (project deployment model) – stored in SSISDB.catalog.packages*

How to back up, restore or move SSISDB from one server to another?