As a DBA, it is perfectly normal to forget a password from time to time, even the 'sa' password. No need to hold your breath or panic: in an emergency you can recover it through the back door. These steps should get you going so you can move on with your administration tasks.
- From SQL Server Configuration Manager, stop the SQL Server services.
- Open a cmd window as an administrator or an account with elevated rights.
- From the command prompt, locate the folder where your SQL Server binary files are installed. Go to that path and type: sqlservr.exe -m
- Once the SQL Server service successfully starts in single-user mode, open another cmd window as an administrator or an account with elevated rights.
- In the new command prompt, type: sqlcmd -S <servername>, or sqlcmd -S (local) if SQL Server is on the same machine.
- Then you can reset the 'sa' password and also enable or unlock it at the same time:
ALTER LOGIN [sa] WITH PASSWORD = N'P@ssword';
ALTER LOGIN [sa] ENABLE;
There you have it. You have successfully taken control of the 'sa' account. From this point, close all the previous command windows, go back to SQL Server Configuration Manager, and start the SQL Server services. Try to connect to your SQL Server and start to use the 'sa' account. :)
Problem: Surely you have scheduled jobs that run DBCC CHECKDB across your SQL Servers to check database integrity. But sometimes you miss checking the reports on these scheduled jobs, so you do not know whether they ran, or whether they ran but failed. So how do you monitor when DBCC CHECKDB last ran against your database?
Solution: There are many ways to do this. There is also a very good in-depth post from Paul Randal about DBCC CHECKDB which was a great help for me.
Generally, to check the last run of DBCC CHECKDB you can always use the command
DBCC DBINFO ('yourdatabasenamehere') WITH TABLERESULTS
and look for the dbi_dbccLastKnownGood field, which contains the date-time stamp of the last successful DBCC CHECKDB. But this is good only if DBCC actually ran against the database itself and not against a snapshot of it. When mirrored databases are involved, you would usually run DBCC CHECKDB against a snapshot of the mirrored database.
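If you want that value without eyeballing the full DBCC output, the TABLERESULTS rows can be captured into a temp table and filtered. A minimal sketch, assuming the standard four-column TABLERESULTS output (the temp table name and column sizes are my own):

```sql
-- Capture DBCC DBINFO output and pull out dbi_dbccLastKnownGood.
CREATE TABLE #dbinfo (
    ParentObject VARCHAR(255),
    [Object]     VARCHAR(255),
    Field        VARCHAR(255),
    [Value]      VARCHAR(255)
);

INSERT INTO #dbinfo
EXEC ('DBCC DBINFO (''yourdatabasenamehere'') WITH TABLERESULTS');

SELECT [Value] AS LastKnownGoodCheckDb
FROM #dbinfo
WHERE Field = 'dbi_dbccLastKnownGood';

DROP TABLE #dbinfo;
```

Wrapping the DBCC call in EXEC lets you build the database name dynamically if you want to loop over every database on the instance.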
To check the last DBCC CHECKDB run against these mirrored databases, you may use a query I run frequently across all the SQL Servers I monitor.
CREATE TABLE #DBCheckInfo (LogDate DATETIME, ProcessInfo VARCHAR(64), LogText VARCHAR(MAX))
INSERT INTO #DBCheckInfo EXECUTE xp_readerrorlog 0, 1, 'dbcc checkdb'
SELECT * FROM #DBCheckInfo ORDER BY LogDate DESC
DROP TABLE #DBCheckInfo
In the method above, I check the SQL Server logs for any DBCC CHECKDB activity. This way I get the correct date whether DBCC ran against the database itself or against a snapshot of it, as long as the DBCC CHECKDB command was actually invoked.
Next step: You can add this query as part of your daily health check monitoring reports.
In my previous post, we learned how to set up Data Collection in SQL Server. It is well suited to monitoring and reporting on disk usage and server activity. But what if you want to monitor other areas of your database server beyond the default data collection sets?
Aside from the default data collection sets, you can also set up your own customised data collection sets. For example, suppose your requirement is to monitor all failed SQL Agent jobs on a daily basis across all your SQL database servers. Let's use this as an example.
Creating a customised data collection set is composed of three parts:
1. Defining the data collection container – this contains header parameters such as the name of the data collection set, its description, logging level and run schedule. You will need the stored procedure sp_syscollector_create_collection_set. See the example below:
@name=N'Failed SQL Jobs',
@description=N'Collects data about failed jobs for all servers.',
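The two parameters above are only a fragment of the full call. Pieced together, a sketch of a complete call might look like this, run in msdb. The collection mode, retention period and schedule name are my own choices, so adjust them to your environment:

```sql
USE msdb;
GO
DECLARE @collection_set_id int;

-- Create the container for the custom collection set.
-- @collection_mode 1 = non-cached: collect and upload on the same schedule.
EXEC dbo.sp_syscollector_create_collection_set
    @name = N'Failed SQL Jobs',
    @description = N'Collects data about failed jobs for all servers.',
    @collection_mode = 1,
    @days_until_expiration = 14,                        -- retention in the MDW (assumption)
    @schedule_name = N'CollectorSchedule_Every_30min',  -- one of the preset collector schedules
    @collection_set_id = @collection_set_id OUTPUT;

-- Keep this id: the collection item and the start command both need it.
SELECT @collection_set_id AS CollectionSetId;
```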
2. Defining the data collector type – this is where you specify the data collector type for your custom data collection. There are predefined data collector types already set up for use. For most common cases we will use the Generic T-SQL Query Collector Type. To retrieve its UID you may use the statement below:
Declare @collector_type_uid_3 uniqueidentifier
Select @collector_type_uid_3 = collector_type_uid
From [msdb].[dbo].[syscollector_collector_types]
Where name = N'Generic T-SQL Query Collector Type';
3. Defining the data collection item – this is where you define the parameters for your data collection set. It also contains the actual query that retrieves all failed SQL jobs. As an example, see the code below:
Declare @collection_item_id_4 int
@name=N'Failed SQL Jobs Item',
SELECT @@SERVERNAME AS [ServerName],
    [sJOB].[name] AS [JobName],
    [sJOBH].[run_date] AS [LastRunDateTime],
    CAST([sJOBH].[run_duration] AS VARCHAR(6)) AS [LastRunDuration (HH:MM:SS)],
    [sJOBH].[message] AS [LastRunStatusMessage],
    [sJOBSCH].[NextRunDate] AS [NextRunDateTime]
FROM [msdb].[dbo].[sysjobs] AS [sJOB]
LEFT JOIN (SELECT [job_id],
        MIN([next_run_date]) AS [NextRunDate],
        MIN([next_run_time]) AS [NextRunTime]
    FROM [msdb].[dbo].[sysjobschedules]
    GROUP BY [job_id]) AS [sJOBSCH]
    ON [sJOB].[job_id] = [sJOBSCH].[job_id]
LEFT JOIN (SELECT [job_id], [run_date], [run_time], [run_status], [run_duration], [message],
        ROW_NUMBER() OVER (PARTITION BY [job_id] ORDER BY [run_date] DESC, [run_time] DESC) AS RowNumber
    FROM [msdb].[dbo].[sysjobhistory]
    WHERE [step_id] = 0) AS [sJOBH]
    ON [sJOB].[job_id] = [sJOBH].[job_id] AND [sJOBH].[RowNumber] = 1
WHERE [sJOBH].[run_status] = 0
ORDER BY [LastRunDateTime] DESC
</Value><OutputTable>FailedJobs</OutputTable></Query><Databases UseSystemDatabases="true" UseUserDatabases="true" /></ns:TSQLQueryCollector>',
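Putting part 3 together: the query is embedded in an XML @parameters document and passed to sp_syscollector_create_collection_item. A minimal sketch is below, with a trivial query standing in for the full failed-jobs statement. The frequency value and the placeholder collection set id are my own assumptions; also note that any single quotes inside the <Value> element must be doubled when the XML is written as a string literal:

```sql
USE msdb;
GO
DECLARE @collection_set_id_1 int = 1;  -- placeholder: use the id returned in part 1
DECLARE @collector_type_uid_3 uniqueidentifier;
DECLARE @collection_item_id int;
DECLARE @parameters xml;

-- Look up the Generic T-SQL Query Collector Type, as in part 2.
SELECT @collector_type_uid_3 = collector_type_uid
FROM dbo.syscollector_collector_types
WHERE name = N'Generic T-SQL Query Collector Type';

-- XML shape expected by this collector type: <Value> holds the query,
-- <OutputTable> names the destination table created under the
-- custom_snapshots schema in the Management Data Warehouse.
SET @parameters = N'
<ns:TSQLQueryCollector xmlns:ns="DataCollectorType">
  <Query>
    <Value>SELECT @@SERVERNAME AS ServerName</Value>
    <OutputTable>FailedJobs</OutputTable>
  </Query>
  <Databases UseSystemDatabases="true" UseUserDatabases="false" />
</ns:TSQLQueryCollector>';

EXEC dbo.sp_syscollector_create_collection_item
    @collection_set_id = @collection_set_id_1,
    @collector_type_uid = @collector_type_uid_3,
    @name = N'Failed SQL Jobs Item',
    @frequency = 86400,                  -- seconds; once a day (assumption)
    @parameters = @parameters,
    @collection_item_id = @collection_item_id OUTPUT;
```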
The <OutputTable> tag contains the destination table for your query results. This table will be created under the default custom_snapshots schema inside your Management Data Warehouse database.
After successfully creating your custom data collection set, you must manually start the data collection. You can do this either via the GUI from the Data Collection menu or by executing the statement below in the msdb database.
EXEC sp_syscollector_start_collection_set @collection_set_id = <yourcollectionsetid>;
The output table for your data collection set is created only after the initial upload of data. Now that you have data for all failed SQL Agent jobs across your database servers on a daily basis, you can create a report for it via Reporting Services and add it to your monitoring reports.
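Once that first upload has happened, a quick way to sanity-check the collected data is to query the output table directly in your Management Data Warehouse database. The column list assumes the failed-jobs query above; the collector also adds its own bookkeeping columns such as collection_time:

```sql
-- Run against your Management Data Warehouse database.
SELECT ServerName, JobName, LastRunDateTime, LastRunStatusMessage
FROM custom_snapshots.FailedJobs
ORDER BY LastRunDateTime DESC;
```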
There you have it, your first customised data collection set. From here on you can create more monitoring reports for your dashboard and impress your team lead or boss. :)
Originally posted on Journey to SQL Authority with Pinal Dave:
I have to admit, Spiderman is my favorite superhero. The most recent movie was just released in theaters, so it has been at the front of my mind for some time.
Spiderman was my favorite superhero even before the latest movie came out, but of course I took my whole family to see the movie as soon as I could! Every one of us loved it, including my daughter. We all left the movie thinking how great it would be to be Spiderman. So, with that in mind, I started thinking about how we are like Spiderman in our everyday lives, especially developers.
Let me list some of the reasons why I think every developer is a Spiderman.
- We have special powers, just like a superhero. There is a reason that when there are problems or emergencies, we get called in, just like a superhero! Our powers might not…