SQL Server proxy account and local policy

Originally posted on Peter's blog.

Taking control of the ‘sa’ account for SQL Server 2012

As a DBA, it is perfectly normal to forget a password from time to time, including the 'sa' password. No need to hold your breath or panic: you can regain access through the back door. These steps should get you going again so you can move on with your administration tasks.

  • From SQL Server Configuration Manager, stop the SQL Server services.
  • Open a command prompt as an administrator, or with an account that has elevated rights.
  • Locate the folder that holds the SQL Server binaries, change to that path, and run sqlservr.exe -m (example commands follow this list).
  • Once the SQL Server service starts successfully in single-user mode, open another elevated command prompt.
  • From that prompt, connect with sqlcmd -S <servername>, or (local) if SQL Server is on the same machine.
  • Then you can reset the 'sa' password and enable or unlock it at the same time, using the script further below.
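
The command-prompt side of those steps might look like this. The folder shown is only an assumption for a default SQL Server 2012 instance, so adjust the path and the server name for your environment.

REM Window 1 (run as administrator): start SQL Server in single-user mode
cd /d "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Binn"
sqlservr.exe -m

REM Window 2 (run as administrator): connect to the single-user instance
sqlcmd -S (local)

With sqlcmd connected while the instance is in single-user mode, run the reset script: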

USE [master]
GO
ALTER LOGIN [sa] WITH PASSWORD = N'P@ssword';
GO
ALTER LOGIN [sa] ENABLE;
GO
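
If you want to confirm the change before leaving single-user mode, a quick check against the sys.server_principals catalog view should now show is_disabled = 0 for the login:

-- Verify that the sa login is enabled after the reset
SELECT [name], [is_disabled]
FROM sys.server_principals
WHERE [name] = N'sa';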

 
There you have it. You have successfully taken control of the ‘sa’ account. From this point, close all the previous command windows and go back to SQL Configuration Manager. Start the SQL Server services. Try to connect to your SQL Server and start to use ‘sa’ account. :)

Now breathe.


A quicker way to know when DBCC CHECKDB last ran on your database

Problem: You most likely have scheduled jobs that run DBCC CHECKDB across your SQL Servers to verify database integrity. But sometimes you miss the reports that tell you whether those jobs actually ran, or whether they ran but failed. So how do you monitor when DBCC CHECKDB last ran against a database?

Solution: There are many ways to do this. There is also a very good in-depth post from Paul Randal about DBCC CHECKDB that was a great help to me.

Generally, to check the last run of DBCC CHECKDB you can always use the command

DBCC DBINFO ('yourdatabasenamehere') WITH TABLERESULTS

and look for the dbi_dbccLastKnownGood field, which holds the date and time of the last clean DBCC CHECKDB run. But this is only reliable if DBCC CHECKDB actually ran against the database itself and not against a separately created snapshot of it. Where database mirroring is involved, you would usually run DBCC CHECKDB against a snapshot of the mirrored database, in which case the field on the mirrored database itself is not updated.
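
For the straightforward, non-mirrored case, if you just want that single value rather than scanning the whole TABLERESULTS output, a small wrapper like the one below works. DBCC DBINFO is undocumented, so the four-column layout (ParentObject, Object, Field, Value) is an assumption based on its usual TABLERESULTS output.

-- Capture DBCC DBINFO output and pull out dbi_dbccLastKnownGood
CREATE TABLE #DbInfo (
    [ParentObject] VARCHAR(255),
    [Object]       VARCHAR(255),
    [Field]        VARCHAR(255),
    [Value]        VARCHAR(255)
)

INSERT INTO #DbInfo
EXEC ('DBCC DBINFO (''yourdatabasenamehere'') WITH TABLERESULTS')

SELECT [Value] AS [LastKnownGoodDbccDate]
FROM #DbInfo
WHERE [Field] = 'dbi_dbccLastKnownGood'

DROP TABLE #DbInfo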

To check the last DBCC CHECKDB run for these mirrored databases, you can use a query I run frequently across all the SQL Servers I monitor.

CREATE TABLE #DBCheckInfo (
    [LogDate] DATETIME,
    [ProcessInfo] VARCHAR(10),
    [Text] VARCHAR(1000)
)

INSERT INTO #DBCheckInfo EXECUTE xp_readerrorlog 0, 1, N'dbcc checkdb'
SELECT * FROM #DBCheckInfo ORDER BY LogDate DESC
DROP TABLE #DBCheckInfo

In this method I check the SQL Server error log for any DBCC CHECKDB activity. That way I get the correct date whether DBCC CHECKDB ran against the database itself or against a snapshot of it, as long as the command was actually invoked.
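
One refinement worth noting: the first argument to xp_readerrorlog is the error log file number (0 = current, 1 = the most recent archive, and so on) and the second is the log type (1 = SQL Server error log). If your error logs cycle frequently, the entry you are looking for may already be in an archive, so a sketch like the following searches a few archives as well. How many archives to scan is an assumption; adjust it to your log retention.

-- Search the current error log plus the first few archives for DBCC CHECKDB entries
CREATE TABLE #DBCheckHistory (
    [LogDate] DATETIME,
    [ProcessInfo] VARCHAR(10),
    [Text] VARCHAR(1000)
)

DECLARE @LogNumber INT = 0
WHILE @LogNumber <= 3   -- current log plus three archives (assumption)
BEGIN
    INSERT INTO #DBCheckHistory
    EXECUTE xp_readerrorlog @LogNumber, 1, N'dbcc checkdb'
    SET @LogNumber += 1
END

SELECT * FROM #DBCheckHistory ORDER BY LogDate DESC
DROP TABLE #DBCheckHistory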

Next step: You can add this query to your daily health-check monitoring reports.

 


Setting Up Your Custom Data Collector Set

Problem:

From my previous post, we learned how to set up Data Collection in SQL Server. This is most suitable for monitoring purposes and reporting on disk usage and server activities. But what if you want to monitor other areas of your database server aside from the default data collection sets?

Solution:

Aside from the default data collection sets, you can also set up your own customised data collection sets. Suppose, for example, that your requirement is to monitor all failed SQL Agent jobs on a daily basis across all your SQL database servers. Let's use that as the example.

Creating a customised data collection set consists of three parts:

1. Define the data collection set container. This holds header parameters such as the name of the data collection set, its description, the logging level and the schedule it runs on. You will need the stored procedure sp_syscollector_create_collection_set; see the example below.

DECLARE @collection_set_id_1 INT
DECLARE @collection_set_uid_2 UNIQUEIDENTIFIER

EXEC [msdb].[dbo].[sp_syscollector_create_collection_set]
    @name=N'Failed SQL Jobs',
    @collection_mode=1,
    @description=N'Collects data about failed jobs for all servers.',
    @logging_level=0,
    @days_until_expiration=180,
    @schedule_name=N'CollectorSchedule_Every_6h',
    @collection_set_id=@collection_set_id_1 OUTPUT,
    @collection_set_uid=@collection_set_uid_2 OUTPUT

2. Define the data collector type. This is where you specify which collector type your custom data collection uses. There are predefined data collector types already set up for use; for most cases you will use the Generic T-SQL Query Collector Type. To retrieve its UID you can use the statement below:

DECLARE @collector_type_uid_3 UNIQUEIDENTIFIER

SELECT @collector_type_uid_3 = collector_type_uid
FROM [msdb].[dbo].[syscollector_collector_types]
WHERE name = N'Generic T-SQL Query Collector Type';

3. Define the data collection item. This is where you set the parameters for the collection item, including the actual query that retrieves the failed SQL Agent jobs. See the code below for an example:

DECLARE @collection_item_id_4 INT

EXEC [msdb].[dbo].[sp_syscollector_create_collection_item]
    @name=N'Failed SQL Jobs Item',
    @parameters=N'<ns:TSQLQueryCollector xmlns:ns="DataCollectorType"><Query><Value>
    SELECT @@ServerName AS [ServerName],
           [sJOB].[name] AS [JobName],
           [sJOBH].[run_date] AS [LastRunDateTime],
           CAST([sJOBH].[run_duration] AS VARCHAR(6)) AS [LastRunDuration (HH:MM:SS)],
           [sJOBH].[message] AS [LastRunStatusMessage],
           CAST([sJOBSCH].[NextRunDate] AS CHAR(8)),
           [sJOBSCH].[NextRunDate] AS [NextRunDateTime]
    FROM [msdb].[dbo].[sysjobs] AS [sJOB]
    LEFT JOIN (SELECT [job_id],
                      MIN([next_run_date]) AS [NextRunDate],
                      MIN([next_run_time]) AS [NextRunTime]
               FROM [msdb].[dbo].[sysjobschedules]
               GROUP BY [job_id]) AS [sJOBSCH]
        ON [sJOB].[job_id] = [sJOBSCH].[job_id]
    LEFT JOIN (SELECT [job_id], [run_date], [run_time], [run_status], [run_duration], [message],
                      ROW_NUMBER() OVER (PARTITION BY [job_id] ORDER BY [run_date] DESC, [run_time] DESC) AS RowNumber
               FROM [msdb].[dbo].[sysjobhistory]
               WHERE [step_id] = 0) AS [sJOBH]
        ON [sJOB].[job_id] = [sJOBH].[job_id] AND [sJOBH].[RowNumber] = 1
    WHERE [sJOBH].[run_status] = ''0''
    ORDER BY [LastRunDateTime] DESC
    </Value><OutputTable>FailedJobs</OutputTable></Query><Databases UseSystemDatabases="true" UseUserDatabases="true" /></ns:TSQLQueryCollector>',
    @collection_item_id=@collection_item_id_4 OUTPUT,
    @frequency=60,
    @collection_set_id=@collection_set_id_1,
    @collector_type_uid=@collector_type_uid_3;

The <OutputTable> tag names the destination table for your query results. The table is created under the default schema custom_snapshots inside your Management Data Warehouse database.

After successfully creating your custom data collection set, you must start the data collection manually. You can do this either through the GUI from the Data Collection menu or by executing the statement below in the msdb database.

EXEC sp_syscollector_start_collection_set @collection_set_id = <yourcollectionsetid>;

The output table for your data collection set is created after the initial upload of data. Now that you have data on all failed SQL Agent jobs across your database servers arriving daily, you can build a report on it in Reporting Services and add it to your monitoring reports.
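
If you want to sanity-check what has been collected, you can also query the output table directly in the warehouse. A minimal sketch, assuming the Management Data Warehouse database is named MDW (substitute your own) and using the FailedJobs table and column aliases from the collector query above:

USE [MDW]   -- assumption: the name of your Management Data Warehouse database
GO
-- Most recent failed-job rows gathered by the custom collection set
SELECT TOP (100) *
FROM [custom_snapshots].[FailedJobs]
ORDER BY [LastRunDateTime] DESC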

There you have it, your first customised data collection set. From here on you can create more monitoring reports for your dashboard and impress your team lead or boss. :)

 


Developer’s Life – Every Developer is a Spiderman

Originally posted on Journey to SQL Authority with Pinal Dave:

I have to admit, Spiderman is my favorite superhero. The most recent movie was recently released in theaters, so it has been at the front of my mind for some time.

Spiderman was my favorite superhero even before the latest movie came out, but of course I took my whole family to see the movie as soon as I could!  Every one of us loved it, including my daughter.  We all left the movie thinking how great it would be to be Spiderman.  So, with that in mind, I started thinking about how we are like Spiderman in our everyday lives, especially developers.

Let me list some of the reasons why I think every developer is a Spiderman.

  • We have special powers, just like a superhero.  There is a reason that when there are problems or emergencies, we get called in, just like a superhero!  Our powers might not…



