Thursday, March 21, 2013

Maintenance Plan fails with error: Alter failed for Server 'servername'.




Hi, this morning I saw that my maintenance plan job for differential backups had failed with the error
Alter failed for Server 'servername'. Oops, I had never seen this error before, so I tried googling and found some help.

What happened was that my Windows team had changed the CPU/IO affinity settings, which pushed that configuration change into SQL Server.

So to get your maintenance plan running again, go ahead and run:
    sp_configure 'allow updates', 0

    reconfigure with override


Then execute the maintenance job again; it will run successfully.
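If you want to confirm that the deprecated 'allow updates' option is really the culprit before changing anything, a quick sketch (run from any query window):

```sql
-- Check the current value of the deprecated 'allow updates' option.
-- A run_value of 1 is what makes SSMS configuration changes fail
-- with "Alter failed for Server".
EXEC sp_configure 'allow updates';

-- Reset it to 0 and apply the change immediately.
EXEC sp_configure 'allow updates', 0;
RECONFIGURE WITH OVERRIDE;
```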

Thanks
03/21/2013

Thursday, March 14, 2013

Differential Backup Job Fails

Hi,

I have 4 databases, all in the FULL recovery model: DB1, DB2, DB3, DB4.

I have taken full backups of DB1, DB3 and DB4, but I did not take a full backup of DB2. Now when I try to take
differential backups of DB1, DB2, DB3 and DB4, the job fails saying DB2 doesn't have a full backup.
What happens to the DB1, DB3 and DB4 differential backups?

The differential backups for DB1, DB3 and DB4 will still be successful. The SQL Server job takes the DB1 differential backup, then looks for a DB2 full backup; since none is found it moves on to DB3 and then DB4 and takes their differential backups, and finally reports the job as failed with the error that DB2 doesn't have a full backup.
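A minimal sketch of that behavior (database names and the backup path are placeholders): the job steps through the databases one by one, so only the database without a full backup fails.

```sql
-- Assuming full backups exist for DB1, DB3 and DB4 but not for DB2,
-- these three succeed and only DB2 fails, with error 3035
-- ("Cannot perform a differential backup ... a current database
--  backup does not exist").
BACKUP DATABASE DB1 TO DISK = 'D:\Backup\DB1_diff.bak' WITH DIFFERENTIAL;
BACKUP DATABASE DB2 TO DISK = 'D:\Backup\DB2_diff.bak' WITH DIFFERENTIAL; -- fails
BACKUP DATABASE DB3 TO DISK = 'D:\Backup\DB3_diff.bak' WITH DIFFERENTIAL;
BACKUP DATABASE DB4 TO DISK = 'D:\Backup\DB4_diff.bak' WITH DIFFERENTIAL;
```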


Thanks







Trigger: logs to a table and emails whenever a database is created or dropped, or an object is created, altered or dropped


The script below, with steps included, creates the databases, tables and triggers under the server and a test database.
The trigger AuditNewDatabaseCreation1 emails and logs details in a table whenever a database is created or dropped, and the trigger AuditObjects1, created in the test database, logs information and emails whenever an object is created, altered or dropped.


-----Step1: Create Database: AuditDatabase
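Step 1 is a single statement (default file locations assumed):

```sql
-- Create the database that will hold the audit tables.
CREATE DATABASE AuditDatabase;
GO
```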

-----Step2:Create table as below

USE [AuditDatabase]
 CREATE TABLE [dbo].[DatabaseCreateDropTracking](
    [TrackingId] [int] IDENTITY(1,1) NOT NULL,
    [TrackingDate] [datetime] NOT NULL,
    [DatabaseName] [varchar](100) NULL,
    [EventType] [varchar](100) NULL,
    [LoginName] [varchar](100) NULL,
    [InstanceName] [varchar](100) NULL,
    [SQLCommand] [nvarchar](max) NULL
) ON [PRIMARY]

ALTER TABLE [dbo].[DatabaseCreateDropTracking] ADD  CONSTRAINT [DF_DatabaseCreateDropTracking_TrackingDate]  DEFAULT (getdate()) FOR [TrackingDate]
GO

-----Step3:Create the object tracking table
USE [AuditDatabase]
CREATE TABLE [dbo].[ObjectTracking](
    [TrackingId] [bigint] IDENTITY(1,1) NOT NULL,
    [TrackingDate] [datetime] NOT NULL,
    [DatabaseName] [varchar](500) NULL,
    [EventType] [varchar](500) NULL,
    [ObjectName] [varchar](500) NULL,
    [ObjectType] [varchar](500) NULL,
    [LoginName] [varchar](500) NULL,
    [HostName] [varchar](500) NULL,
    [SqlCommand] [nvarchar](max) NULL
) ON [PRIMARY]

ALTER TABLE [dbo].[ObjectTracking] ADD  DEFAULT (getdate()) FOR [TrackingDate]
GO

------Step4: The trigger below sends an email when a database is created or dropped and keeps track in the tables in the AuditDatabase database

/****** Object:  DdlTrigger [AuditNewDatabaseCreation1]    Script Date: 03/14/2013 09:01:15 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO


CREATE TRIGGER [AuditNewDatabaseCreation1]
ON ALL SERVER
FOR
-- DDL_DATABASE_LEVEL_EVENTS
-- Defining all the DDL database events on which we need track
CREATE_DATABASE, DROP_DATABASE
AS
BEGIN
--DECLARE @dbname varchar(500)
DECLARE @data XML
DECLARE @InstanceName varchar(100)
Declare @tsql varchar(max)
DECLARE @bd varchar(max)

--SET @dbname = DB_NAME()
SET @data = EVENTDATA()
SET @InstanceName = HOST_NAME()

INSERT INTO AuditDatabase.dbo.DatabaseCreateDropTracking(DatabaseName,EventType,LoginName,InstanceName,SqlCommand)
VALUES(
--@dbname,
@data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'varchar(100)'),
@data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(100)'),
@data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(100)'),
@InstanceName,
@data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)')
)
 
Set @tsql = EVENTDATA().value
        ('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','varchar(max)')
SET @bd = 'UserName: ' + UPPER(SUSER_NAME()) + '

           ServerName: ' + @@SERVERNAME + '

           Time: '    + CONVERT(varchar(25),Getdate()) + '

           HostName: ' + HOST_NAME() + '

           Database: ' + db_name() + '

           T-SQL: ' +  @tsql
           
IF @bd LIKE '%drop%'

EXEC msdb.dbo.sp_send_dbmail @profile_name = 'Database Mail',
                             @recipients = 'xyz@abc.com',                      
                             @subject = 'A new database has been dropped ',
                             @body_format = 'HTML',
                             @importance = 'High',
                             @body = @bd
               

ELSE
EXEC msdb.dbo.sp_send_dbmail @profile_name = 'Database Mail',
                             @recipients = 'xyz@abc.com',                      
                             @subject = 'A new database has been created ',
                             @body_format = 'HTML',
                             @importance = 'High',
                             @body = @bd
               

END
GO

SET ANSI_NULLS OFF
GO

SET QUOTED_IDENTIFIER OFF
GO

ENABLE TRIGGER [AuditNewDatabaseCreation1] ON ALL SERVER
GO

----STEP5:
CREATE DATABASE TEST
GO


USE [test]
GO

 CREATE  TRIGGER [AuditObjects1]
ON DATABASE
FOR
-- DDL_DATABASE_LEVEL_EVENTS
-- Defining all the DDL database events on which we need track
CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE,
CREATE_TABLE, ALTER_TABLE, DROP_TABLE,
CREATE_FUNCTION, ALTER_FUNCTION, DROP_FUNCTION,
CREATE_VIEW,ALTER_VIEW,DROP_VIEW,
CREATE_INDEX,ALTER_INDEX,DROP_INDEX
AS

BEGIN



DECLARE @data XML
DECLARE @HostName varchar(500)
SET @data = EVENTDATA()
SET @HostName = HOST_NAME()
Declare @tsql varchar(max)
DECLARE @bd varchar(max)



INSERT INTO AuditDatabase.dbo.ObjectTracking(DatabaseName,EventType,ObjectName,ObjectType,LoginName,HostName,SqlCommand)
VALUES(
@data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'varchar(500)'),
@data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(500)'),
@data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'varchar(500)'),
@data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'varchar(500)'),
@data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(500)'),
@HostName,
@data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)')
)

Set @tsql = EVENTDATA().value
        ('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','varchar(max)')
       
SET @bd = 'UserName: ' + UPPER(SUSER_NAME()) + '

           ServerName: ' + @@SERVERNAME + '

           Time: '    + CONVERT(varchar(25),Getdate()) + '

           HostName: ' + HOST_NAME() + '

           Database: ' + db_name() + '

           T-SQL: ' +  @tsql
           

--PRINT 'Database has been created or dropped '
           
IF @bd LIKE '%CREATE%'
EXEC msdb.dbo.sp_send_dbmail @profile_name = 'Database Mail',
                             @recipients = 'xyz@abc.com',
                             @subject = 'An Object is created',
                             @body_format = 'HTML',
                             @importance = 'High',
                             @body = @bd
                           
   IF @bd LIKE '%ALTER%'
   EXEC msdb.dbo.sp_send_dbmail @profile_name = 'Database Mail',
                             @recipients = 'xyz@abc.com',
                             @subject = 'An Object is altered  ',
                             @body_format = 'HTML',
                             @importance = 'High',
                             @body = @bd
                           
 
 
   IF @bd LIKE '%DROP%'    
      EXEC msdb.dbo.sp_send_dbmail @profile_name = 'Database Mail',
                             @recipients = 'xyz@abc.com',
                             @subject = 'An Object is Dropped ',
                             @body_format = 'HTML',
                             @importance = 'High',
                             @body = @bd      
END


GO



ENABLE TRIGGER [AuditObjects1] ON DATABASE
GO
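To sanity-check the setup, create and drop a throwaway table in TEST and then read the audit table (a quick smoke test; it will also send two emails if Database Mail is configured):

```sql
USE [test];
GO
-- Fire the DDL trigger twice: one CREATE_TABLE and one DROP_TABLE event.
CREATE TABLE dbo.TriggerSmokeTest (Id int);
DROP TABLE dbo.TriggerSmokeTest;
GO
-- Both events should now appear in the audit table.
SELECT TOP (10) TrackingDate, EventType, ObjectName, LoginName
FROM AuditDatabase.dbo.ObjectTracking
ORDER BY TrackingId DESC;
```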


Friday, March 1, 2013

Add IDENTITY to an existing table column


EX: I have a TABLE

CREATE TABLE tempdb
( recordid INT NOT NULL, NAME VARCHAR(10) NULL)
INSERT INTO tempdb VALUES ( 1, 'NICK')

ALTER TABLE dbo.tempdb
ADD recordId INT IDENTITY(1, 1)
GO
----Column names in each table must be unique. Column name 'recordId' in table 'dbo.tempdb' is specified more than once.

You cannot add the IDENTITY property to an existing column.

However, you can add a new IDENTITY column to the table with a similar name, recordId1:

ALTER TABLE dbo.tempdb
ADD recordId1 INT IDENTITY(1, 1)
GO
--THEN DROP  the old COLUMN recordId
ALTER TABLE tempdb
DROP COLUMN recordId


Then rename the new column recordId1 to recordId:
sp_RENAME 'tempdb.recordid1', 'recordid' , 'COLUMN'
GO

Refresh the table, right-click on it, choose Design, and change the physical location of the column so it is the first column.
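If you prefer to avoid the table designer for the final column reordering, the whole change can also be scripted: build a replacement table with the IDENTITY column in first position, copy the data, then swap the names (a sketch; adjust the names to your own table):

```sql
-- New table with the IDENTITY column first.
CREATE TABLE dbo.tempdb_new
( recordid INT IDENTITY(1,1) NOT NULL, NAME VARCHAR(10) NULL );

-- Copy existing rows, preserving the old recordid values.
SET IDENTITY_INSERT dbo.tempdb_new ON;
INSERT INTO dbo.tempdb_new (recordid, NAME)
SELECT recordid, NAME FROM dbo.tempdb;
SET IDENTITY_INSERT dbo.tempdb_new OFF;

-- Swap the tables.
DROP TABLE dbo.tempdb;
EXEC sp_rename 'dbo.tempdb_new', 'tempdb';
```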

Thanks

Daily Checklist





I always keep this checklist with me 

 1. Check OS Event Logs and SQL Server Logs for all or specific events.
 2. Verify that all scheduled jobs have run successfully; if any failed, try to resolve them depending on the type of job and its priority.
 3. Check that backups have been made and successfully saved to a secure location.
 4. Monitor disk space to ensure your SQL Servers won’t run out of disk space. For best performance, all disks should have 15% or more free space. I keep a log of the mdf and ldf sizes of tempdb and all user databases in Excel.
 5. Regularly monitor and identify blocking issues (I use sp_who2 and SELECT * FROM sys.sysprocesses).
 6. Create SQL Server alerts to notify you of potential problems like space alerts on all drives , and have them e-mailed to you.
 7. Regularly restore backups to a test server in order to verify that you can restore them. You don’t need to restore all backups every day, but do so often to ensure that you are confident you have good backups.
 8. Check and monitor index fragmentation percentage.
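For item 8, a query along these lines (run against the current database) reports fragmentation per index:

```sql
-- Fragmentation of all indexes in the current database; values above
-- roughly 30% are usually rebuild candidates, 5-30% reorganize candidates.
SELECT OBJECT_NAME(ips.object_id) AS TableName,
       i.name AS IndexName,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 5
ORDER BY ips.avg_fragmentation_in_percent DESC;
```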
  

Disabling vs. Dropping Indexes


In order to improve your applications and your databases, they will need to change over time. The structure of the database changes, the structure of the tables change, the data in the tables change, the application changes, the queries against the data change. Indexes that once aided performance now sit unused while new indexes are added.
When index needs change and you want to test how removing an index will affect performance, you have two options – you can disable or drop the index. Which is better?
Disabling an Index
To disable an index, issue an ALTER INDEX command.
ALTER INDEX IX_IndexName ON Schema.TableName DISABLE
What happens when you do this? The metadata about the index is retained, as are the usage statistics in sys.dm_db_index_usage_stats. The index is not usable by the query optimizer, however. If you have an index hint in a query specifying the index, the query will fail. Also, foreign key constraints are disabled.
Disabling a nonclustered index will delete the index pages – the space is freed in the database.
Disabling a clustered index has additional effects. The data in the table still exists, but will be inaccessible for anything other than a drop or rebuild operation. All related nonclustered indexes and views are also unavailable. Foreign key constraints that reference the table are disabled. Queries against the table will fail.
If you want to re-enable the index, you need to rebuild it – there is no ALTER INDEX…ENABLE command.
ALTER INDEX IX_IndexName ON Schema.TableName REBUILD
When you rebuild an index, the usage stats will be reset in sys.dm_db_index_usage_stats.
Dropping an Index
To drop a clustered or nonclustered index, issue a DROP INDEX command.
DROP INDEX IndexName ON Schema.TableName
When you do this, the metadata, statistics, and index pages are removed. If you drop a clustered index, the table will become a heap.
Once an index has been dropped, it can’t be rebuilt – it must be created again. Thus, if you are going to drop an existing index, make sure you script the CREATE statement also. If it is determined that you need it again in the future, it will be much easier to open the script than try to remember how you had it set up previously!
Do you want to maintain statistics? 
The biggest difference between disabling and dropping an index is whether the metadata and statistics are persisted. If disabled, they are. If dropped, they are not. Make sure you carefully weigh your options before performing either action, and always have a way to recreate the index available.
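As a quick check, you can list which indexes are currently disabled through the catalog view sys.indexes:

```sql
-- List disabled indexes in the current database.
SELECT OBJECT_NAME(object_id) AS TableName,
       name AS IndexName
FROM sys.indexes
WHERE is_disabled = 1;
```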

Exclusive access could not be obtained because the database is in use, when restoring a database

I was trying to restore a database from production to our development environment today and got the error:

exclusive access could not be obtained as database is in use. This error occurs during a restore when some user, job or other process is using the database.
To restore the database successfully, follow these steps:

USE [master]
GO

ALTER DATABASE mydb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
GO

Then RESTORE the DATABASE
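Note: if the restore is cancelled or fails, remember to put the database back into multi-user mode, otherwise users stay locked out:

```sql
-- Undo the SINGLE_USER setting if the restore did not go through.
ALTER DATABASE mydb SET MULTI_USER;
GO
```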


If you still get the error that exclusive access could not be obtained, even though the database is in SINGLE
USER mode,

check sp_who2 and sys.sysprocesses,

close the remaining connections, close SSMS and

open SSMS again,
then restore the database; it will be restored successfully this time.
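Instead of closing SSMS, you can usually find and kill the leftover connections directly (a sketch; on SQL Server 2012 and later, where sys.dm_exec_sessions exposes database_id, while on older versions sys.sysprocesses with the dbid column serves the same purpose):

```sql
-- Sessions currently connected to the database being restored.
SELECT session_id, login_name, host_name, program_name
FROM sys.dm_exec_sessions
WHERE database_id = DB_ID('mydb');

-- Then, for each session_id returned (other than your current one):
-- KILL <session_id>;
```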





Thanks





Tempdb gets filled up

Hi, I want to share a strange, or maybe normal, performance issue I faced today.
One of my developers called me today and said the F drive was out of space. I went through the Windows and SQL logs, sys.sysprocesses and the ldf files of all databases and found nothing useful for an hour of my time, until I found the tempdb mdf file was 100 GB. About all I could do was shrink the tempdb ldf and then the mdf, which brought the space back.

Then, after my developer ran his insert statement, tempdb filled up again, so I asked him to pass me the insert query he was running. It was doing heavy UNION ALL, HAVING and ORDER BY clauses while extracting only 2000 rows. I went through the index fragmentation and statistics info and everything looked OK, but tempdb was still 100 GB. So I used Display Estimated Execution Plan with the query my developer was using, and it suggested adding a nonclustered index on particular columns. I created the nonclustered index, shrank the tempdb mdf and ldf, and the drive came back to 100 GB free.
Then I asked my developer to run the query again; it ran very well and tempdb did not fill up.

Always use Display Estimated Execution Plan for the queries you are about to run; it costs nothing.


So tempdb gets filled for many reasons:
Tempdb fills very quickly when we run queries with large result sets involving UNION, ORDER BY and GROUP BY clauses.

Another reason tempdb can grow is a query returning heavy data.

Worktables that hold intermediate results created during query processing and sorting.
Materialized static cursors.
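Next time, before shrinking, it is worth checking what is actually consuming tempdb; a sketch using the tempdb space DMV:

```sql
USE tempdb;
GO
-- Breakdown of tempdb space by consumer type: user objects (temp tables),
-- internal objects (worktables, sorts), the version store, and free space.
SELECT SUM(user_object_reserved_page_count) * 8     AS user_objects_kb,
       SUM(internal_object_reserved_page_count) * 8 AS internal_objects_kb,
       SUM(version_store_reserved_page_count) * 8   AS version_store_kb,
       SUM(unallocated_extent_page_count) * 8       AS free_space_kb
FROM sys.dm_db_file_space_usage;
```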



Thanks
