A quicker way to know when DBCC CHECKDB last ran in your database.

Problem : Surely you have scheduled jobs that run DBCC CHECKDB across your SQL Servers to check database integrity. But sometimes you miss the reports on these scheduled jobs and don't notice whether they ran at all, or ran but failed. So how do you monitor when DBCC CHECKDB last ran against your database?

Solution : There are many ways to do this. There is also a very good, in-depth post from Paul Randal about DBCC CHECKDB which was a great help to me.

Generally, to check the last run of DBCC CHECKDB you can always use the command

DBCC DBINFO (yourdatabasenamehere) WITH TABLERESULTS

and look for the dbi_dbccLastKnownGood field, which contains the date and time stamp of the last DBCC CHECKDB. But this is only useful if the DBCC actually ran against the database itself and not against a snapshot of the database. Where mirrored databases are involved you would usually run DBCC CHECKDB against a snapshot of the mirrored database.
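If you want just that one field instead of reading through the whole TABLERESULTS output, you can capture the DBCC output into a temp table and filter it. Below is a minimal sketch, assuming the usual ParentObject, Object, Field and VALUE columns returned by DBCC DBINFO WITH TABLERESULTS; replace yourdatabasenamehere with your own database name.

-- Capture DBCC DBINFO output and return only the dbi_dbccLastKnownGood field.
-- Column names assume the standard TABLERESULTS shape (ParentObject, Object, Field, VALUE).
CREATE TABLE #DbInfo (
[ParentObject] VARCHAR(255),
[Object] VARCHAR(255),
[Field] VARCHAR(255),
[Value] VARCHAR(255)
)

INSERT INTO #DbInfo
EXECUTE ('DBCC DBINFO (''yourdatabasenamehere'') WITH TABLERESULTS')

SELECT [Value] AS [LastKnownGoodCheckDb]
FROM #DbInfo
WHERE [Field] = 'dbi_dbccLastKnownGood'

DROP TABLE #DbInfo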

To check the last DBCC CHECKDB run against these mirrored databases, you may use a query I run frequently across all the SQL Servers I monitor.

-- Temp table to hold the error log entries returned by xp_ReadErrorLog
CREATE TABLE #DBCheckInfo (
[LogDate] VARCHAR(25),
[ProcessInfo] VARCHAR(10),
[Text] VARCHAR(1000)
)

-- Read the current SQL Server error log (0) and filter for 'dbcc checkdb' entries
INSERT INTO #DBCheckInfo EXECUTE xp_ReadErrorLog 0, 1, 'dbcc checkdb'
SELECT * FROM #DBCheckInfo ORDER BY LogDate DESC
DROP TABLE #DBCheckInfo

In this method I check the SQL Server error logs for any DBCC CHECKDB activity. This way I get the correct date whether the DBCC ran against the database itself or against a snapshot of it, as long as the DBCC CHECKDB command was actually invoked.

Next Step : You can add this query as part of your daily health check monitoring reports.

 


Setting Up Your Custom Data Collector Set

Problem:

From my previous post, we learned how to set up Data Collection in SQL Server. This is most suitable for monitoring purposes and reporting on disk usage and server activities. But what if you want to monitor other areas of your database server aside from the default data collection sets?

Solution:

Aside from the default data collection sets, you can also set up your own customised data collection sets. Suppose, for example, that your requirement is to monitor all failed SQL Agent jobs on a daily basis across all your SQL database servers. Let's use this as an example.

Creating a customised data collection set is composed of three parts:

1. Defining the data collection container – this contains header parameters such as the name of the data collection set, description, logging level and the schedule to run. You will need to use the stored procedure sp_syscollector_create_collection_set. See the example below:

-- Output variables to hold the new collection set's ID and UID (reused in step 3)
DECLARE @collection_set_id_1 INT
DECLARE @collection_set_uid_2 UNIQUEIDENTIFIER

EXEC [msdb].[dbo].[sp_syscollector_create_collection_set]
    @name=N'Failed SQL Jobs',
    @collection_mode=1,
    @description=N'Collects data about failed jobs for all servers.',
    @logging_level=0,
    @days_until_expiration=180,
    @schedule_name=N'CollectorSchedule_Every_6h',
    @collection_set_id=@collection_set_id_1 OUTPUT,
    @collection_set_uid=@collection_set_uid_2 OUTPUT

2. Defining the data collector type – this is where you specify the data collector type for your custom data collection. There are predefined data collector types already set up for use. For most cases we will use the Generic T-SQL Query Collector Type. To retrieve its UID you may use the statement below:

DECLARE @collector_type_uid_3 UNIQUEIDENTIFIER

SELECT @collector_type_uid_3 = collector_type_uid
FROM [msdb].[dbo].[syscollector_collector_types]
WHERE name = N'Generic T-SQL Query Collector Type';

3. Defining the data collection item – this is where you define the parameters for your data collection set. It also contains the actual query that retrieves all failed SQL Agent jobs. As an example, see the code below:

-- @collection_set_id_1 and @collector_type_uid_3 come from steps 1 and 2 above
DECLARE @collection_item_id_4 INT

EXEC [msdb].[dbo].[sp_syscollector_create_collection_item]
    @name=N'Failed SQL Jobs Item',
    @parameters=N'<ns:TSQLQueryCollector xmlns:ns="DataCollectorType"><Query><Value>
        SELECT  @@ServerName AS [ServerName],
                [sJOB].[name] AS [JobName],
                [sJOBH].[run_date] AS [LastRunDateTime],
                CAST([sJOBH].[run_duration] AS VARCHAR(6)) AS [LastRunDuration (HH:MM:SS)],
                [sJOBH].[message] AS [LastRunStatusMessage],
                CAST([sJOBSCH].[NextRunDate] AS CHAR(8)),
                [sJOBSCH].[NextRunDate] AS [NextRunDateTime]
        FROM [msdb].[dbo].[sysjobs] AS [sJOB]
        LEFT JOIN (SELECT [job_id],
                          MIN([next_run_date]) AS [NextRunDate],
                          MIN([next_run_time]) AS [NextRunTime]
                   FROM [msdb].[dbo].[sysjobschedules]
                   GROUP BY [job_id]) AS [sJOBSCH]
               ON [sJOB].[job_id] = [sJOBSCH].[job_id]
        LEFT JOIN (SELECT [job_id], [run_date], [run_time], [run_status], [run_duration], [message],
                          ROW_NUMBER() OVER (PARTITION BY [job_id] ORDER BY [run_date] DESC, [run_time] DESC) AS RowNumber
                   FROM [msdb].[dbo].[sysjobhistory]
                   WHERE [step_id] = 0) AS [sJOBH]
               ON [sJOB].[job_id] = [sJOBH].[job_id] AND [sJOBH].[RowNumber] = 1
        WHERE [sJOBH].[run_status] = ''0''
        ORDER BY [LastRunDateTime] DESC
    </Value><OutputTable>FailedJobs</OutputTable></Query><Databases UseSystemDatabases="true" UseUserDatabases="true" /></ns:TSQLQueryCollector>',
    @collection_item_id=@collection_item_id_4 OUTPUT,
    @frequency=60,
    @collection_set_id=@collection_set_id_1,
    @collector_type_uid=@collector_type_uid_3;

The <OutputTable> tag contains the destination table for your query results. This table will be created under the default schema custom_snapshots inside your Management Data Warehouse database.

After successfully creating your custom data collection set, you must start the data collection manually. You can do this either via the GUI from the Data Collection menu or by executing the statement below against the msdb database.

EXEC sp_syscollector_start_collection_set @collection_set_id = <yourcollectionsetid>;
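If you do not know the collection set ID off-hand, you can look it up by name in msdb. A small sketch, assuming the set was created with the name used in step 1 above ('Failed SQL Jobs'):

-- Look up the collection set ID by the name given in step 1, then start it
DECLARE @failed_jobs_set_id INT

SELECT @failed_jobs_set_id = collection_set_id
FROM [msdb].[dbo].[syscollector_collection_sets]
WHERE name = N'Failed SQL Jobs'

EXEC [msdb].[dbo].[sp_syscollector_start_collection_set] @collection_set_id = @failed_jobs_set_id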

The output table for your data collection set is created only after the initial upload of data. Now that you have data for all failed SQL Agent jobs across your database servers on a daily basis, you can create a report for it via Reporting Services and add it to your monitoring reports.
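Once data has been uploaded you can also query the results directly from the repository. A rough sketch, assuming your MDW database is named MDW and the output table was created as custom_snapshots.FailedJobs; the collector adds its own bookkeeping columns, so check the actual table definition on your server:

-- Query the collected failed-job rows from the MDW repository.
-- The database name (MDW) is an assumption; adjust to your setup. The collector also adds
-- bookkeeping columns (e.g. a collection time) you can order by once you see the table definition.
USE [MDW]

SELECT TOP (100) *
FROM [custom_snapshots].[FailedJobs]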

There you have it, your first customised data collection set. From here on you can create more monitoring reports for your dashboard and impress your team lead or boss. :)

 


Developer’s Life – Every Developer is a Spiderman

Originally posted on Journey to SQL Authority with Pinal Dave:

I have to admit, Spiderman is my favorite superhero.  The most recent movie was just released in theaters, so it has been at the front of my mind for some time.

Spiderman was my favorite superhero even before the latest movie came out, but of course I took my whole family to see the movie as soon as I could!  Every one of us loved it, including my daughter.  We all left the movie thinking how great it would be to be Spiderman.  So, with that in mind, I started thinking about how we are like Spiderman in our everyday lives, especially developers.

Let me list some of the reasons why I think every developer is a Spiderman.

  • We have special powers, just like a superhero.  There is a reason that when there are problems or emergencies, we get called in, just like a superhero!  Our powers might not…



Data Collection in SQL Server

Problem : As a DBA, one of your primary job duties is the management and administration of anywhere from one to 100+ SQL Server databases. One of the top skills required for this is being proactive in your management tasks and being able to fix something before it even happens. For this you will need a tool, and the good thing is that SQL Server already has one built in for you to use. It is easy to set up, and aside from its default reporting you can add your own custom data collection sets.

Solution : There is a built-in tool for this in SQL Server called Data Collection. The idea is that you have a data warehouse repository that collects data from all your SQL Servers and generates reports on their current state.

If you are a Formula 1 Grand Prix fanatic like me, imagine your engineering team collecting all kinds of data from your car and presenting it in a dashboard format, so that they know how to fix the car and you, the driver, can come out on top of the race. It is a similar principle here for SQL Server.

Setting up your Data Collector

To configure your data collector, you need to set up a server that will act as the data collector and also generate the management data warehouse reports. You will also need a service account for data collection and grant this account the data collection roles.

Configure the Management Data Warehouse

On the SQL Server that will act as the data collector, you need to set up the Management Data Warehouse database.

From SSMS, go to the Management node and right-click on Data Collection.

DataCollectionMenu

Select Configure Management Data Warehouse. This will launch the Configure Management Data Warehouse wizard. Click on the Create or upgrade a management data warehouse option. Click Next.

ConfigureMDWWizard

Enter the server name that will act as your data collector, i.e. the SQL Server on which you are running this wizard. Enter the name of the database that will act as the repository from which your reports will gather data. The cache directory is optional; it is for cases when you want the data collector to collect data locally but not yet upload it to the MDW server.

ConfigureMDWStorage

That’s all that is needed and you can click Finish. Click Close.

ConfigureMDWProgress

Refresh your Data Collection Menu. You should be able to see the following default data collection sets configured.

ConfigureAfter

Each data collection set automatically creates a job in SQL Server Agent. You will see these jobs with names starting with 'collection_set'.
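To see which jobs the collector created, you can list them from msdb, for example:

-- List the SQL Agent jobs created by the data collector
SELECT [name], [enabled], [date_created]
FROM [msdb].[dbo].[sysjobs]
WHERE [name] LIKE 'collection_set%'
ORDER BY [name]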

At this point, you have completed the setup of your Data Collection in SQL Server. Now to configure the properties of your data collection sets, you right click on each data collection set and select Properties.

DataCollectionSetProperties

From the Properties dialog window, you can configure the schedule of the data collection and the retention period of the collected data.

Data Collector Permissions and Roles

There are three database roles within the MDW database. The service accounts of the database servers that have the client data collector configured must be added as logins and mapped to one of the following database roles.

  • mdw_admin – members of this role have rights to manage the data collection set properties. Members of this role also have read and write access to the MDW database.
  • mdw_reader – members of this role have read access to the MDW database.
  • mdw_writer – members of this role have write access to the MDW database.

For a client database server that is sending data to the data collection sets, grant the least privilege required: add the service account of the client database server as a login on the MDW server and make it a member of the mdw_writer database role.
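As a rough sketch of that grant, to be run on the MDW server (the login and the MDW database name below are placeholders for your own environment):

-- Run on the MDW / data collector server.
-- DOMAIN\SQLServiceAccount and the MDW database name are placeholders; adjust to your environment.
USE [master]
CREATE LOGIN [DOMAIN\SQLServiceAccount] FROM WINDOWS

USE [MDW]
CREATE USER [DOMAIN\SQLServiceAccount] FOR LOGIN [DOMAIN\SQLServiceAccount]
EXEC sp_addrolemember N'mdw_writer', N'DOMAIN\SQLServiceAccount'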

Data Collector SQL Server Clients

For every SQL Server that you want to collect data from, you will need to run the same configuration wizard. You also need to add the SQL Server service account of that client server to your MDW server and make it a member of mdw_writer in the MDW database.

From your client SQL Server, select the option 'Set up Data Collection'. Click Next.

ConfigureMDWWizard

In the below wizard dialog box, enter the name of your data collector server and the name of your repository database.

ConfigureMDWStorage

Click Finish and you are set. You will need to do this for all the SQL Servers that you want to monitor.

Built in Reports and Dashboard

You may have a hard time finding the built-in reports for MDW because they must be used once before they automatically appear in the SSMS menu.

To get started:

  • Open SSMS and go to one of the MDW databases.
  • Right-click and select the Reports menu item.
  • Select Management Data Warehouse menu item.
  • Select Management Data Warehouse Overview menu item.

You should now see the main reporting dashboard shown below. The dashboard contains the server or instance name for all your client SQL Servers and the last data collection date for the three default data collector sets.

MDWDefault

 

Click on each column to view the corresponding dashboard for each default data collection set.

I have only been using Data Collection recently, but I find it useful for SQL Server monitoring, especially the Disk Usage report and dashboard. It really helps with forecasting and disk usage trending across all the servers that I am monitoring. I hope you find it useful in your case too.

Aside from the default data collection sets that SQL Server automatically creates, you can create your own custom data collection set. There may be a future requirement for a report on all failed SQL Agent jobs in your environment, or something like an auditing report on all the logins across your SQL Server production environment. This can be created too, and I will tackle it in my next post!

For more information on data collection, you may visit this link.

 

 


That “Cannot Generate SSPI Context” headache

Problem : You have completed a SQL Server setup on a new instance. The setup is successful, you can connect locally and everything is fine. No headaches. Now, to test your SQL Server, you try to connect from a client PC and you get that "Cannot Generate SSPI Context" error. Now what?

Most likely you are in the situation below:

  • You are using Integrated Security.
  • Kerberos authentication is used to perform the security delegation. You can verify this with your domain admin.
  • You are not using the Local System account for your SQL Server service, which is good practice.

Solution:

What SSPI?

Security Support Provider Interface (SSPI) is a set of Windows APIs that allows for delegation and mutual authentication over TCP/IP sockets. Therefore, SSPI allows for a computer that is running a Windows operating system to securely delegate a user security token from one computer to another over any transport layer that can transmit raw bytes of data.

What SPN?

Another term that we have to define here is SPN, which stands for Service Principal Name. Kerberos authentication uses this identifier. Think of an SPN as a domain- or forest-unique identifier of an instance of a server resource. You can have an SPN for a web service, for a SQL service, or for an SMTP service. You can have multiple SPNs if you have multiple instances sitting on your database server.

Setting SPN

If you are using a domain user account for your SQL Server service, you need to manually register the SPN for this instance in your Active Directory. The SPN for SQL Server has the following elements:

  • ServiceClass: identifies the general class of service. This is always MSSQLSvc for SQL Server.
  • Host: the fully qualified DNS name of the computer hosting SQL Server.
  • Port: the port number that the service is listening on. Unless you changed the default port during setup, this value is 1433.

To register the SPN for the domain user account in your Active Directory, you may run the following command in a cmd window:

SETSPN -A MSSQLSvc/<SQLServerName>:1433 domain\useraccount

The trick here is that you need to issue this command twice: once for the NetBIOS name, which is the command above, and once for the FQDN (Fully Qualified Domain Name), in order for the setup to succeed.

So as your next step you will also have to run the command below.

SETSPN -A MSSQLSvc/<SQLServerName.domain.com>:1433 domain\useraccount

For multiple instances, you will need to run it twice for each instance that you have. At times you will need the help of your friendly domain admin to run the setspn command, because your own permissions are not enough. Also, to make sure this succeeds, check that the SQL Browser service is running during the setup, because the process will be looking for it (especially with multiple instances) to validate the registration.

To validate that the setspn command was successful, run the following in a cmd window:

SETSPN -L domain\useraccount

This should list all the SPNs registered for the account, including one for each instance on your server.

One less headache… :)

 

