Thursday, August 27, 2015

Tuning areas where we need to look

PAGEIOLATCH_* - disk I/O pressure
CXPACKET - big parallel queries
SOS_SCHEDULER_YIELD - CPU pressure, often from queries missing indexes
LATCH_EX - parallelism or missing indexes
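A quick way to see which of these waits dominate on a given server is to query sys.dm_os_wait_stats; this is only a sketch, and the excluded "benign" wait list is illustrative, not exhaustive:

```sql
-- Top waits since the last restart (or since the stats were last cleared)
SELECT TOP (10)
    wait_type ,
    wait_time_ms / 1000.0 AS wait_time_sec ,
    waiting_tasks_count ,
    CAST(100.0 * wait_time_ms / SUM(wait_time_ms) OVER () AS DECIMAL(5, 2)) AS pct_of_total
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ( 'SLEEP_TASK', 'LAZYWRITER_SLEEP',            -- idle/background waits;
                         'SQLTRACE_BUFFER_FLUSH', 'BROKER_TASK_STOP' ) -- extend this list as needed
ORDER BY wait_time_ms DESC;
```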

Wednesday, August 26, 2015

Get SQL Server info in detail


SELECT  @@SERVERNAME AS ServerName ,
        'SQL Server Year '
        + CASE SUBSTRING(CONVERT(VARCHAR(50), SERVERPROPERTY('productversion')),
                         1, 2)
            WHEN '8.' THEN '2000'
            WHEN '9.' THEN '2005'
            WHEN '10' THEN '2008'   -- 10.0 = 2008, 10.5 = 2008 R2
            WHEN '11' THEN '2012'
            WHEN '12' THEN '2014'
          END AS SQLServerReleaseName ,
        SERVERPROPERTY('ProductVersion') AS ProductVersion ,
        SERVERPROPERTY('ProductLevel') AS ProductLevel ,
        SERVERPROPERTY('Edition') AS Edition;
GO

Tuesday, August 25, 2015

Remove special characters from a string in SQL Server

DECLARE @str VARCHAR(400) = '(remove) ~special~ *characters. from string in sql!'
-- Characters inside [...] are matched literally, so no comma separators are needed
DECLARE @expres VARCHAR(50) = '%[~@#$%&*().!]%'

WHILE PATINDEX(@expres, @str) > 0
    -- Strip the first offending character, and also turn any '-' into a space
    SET @str = REPLACE(REPLACE(@str, SUBSTRING(@str, PATINDEX(@expres, @str), 1), ''), '-', ' ')

SELECT @str

Friday, August 21, 2015

Find out expired credit cards stored in MM/YY format

I have a table that stores credit card expiry dates:
ExpiryDate VARCHAR(10) --datatype
The dates look like this:
'08/10'
'09/11'
'08/16'
'10/17'
How can I find out whether a card's date has expired?

ANS:

Concatenate the two-digit year and month into a sortable 'YYMM' string and compare it against today's date. A card is valid through the end of its expiry month, so anything strictly below the current 'YYMM' has expired:

SELECT *
FROM   [Table]
WHERE  isactive = 1
  AND  RIGHT(ExpiryDate, 2) + LEFT(ExpiryDate, 2)
       < RIGHT(CAST(YEAR(GETDATE()) AS CHAR(4)), 2)
         + RIGHT('0' + CAST(MONTH(GETDATE()) AS VARCHAR(2)), 2)

(Comparing the month and year columns separately, e.g. LEFT(ExpiryDate, 2) >= MONTH(GETDATE()) AND RIGHT(ExpiryDate, 2) >= RIGHT(YEAR(GETDATE()), 2), gives wrong answers: a card expiring 01/16 would be treated as expired in August 2015.)

Read Permissions on Production Server

What is the harm in giving read-only access to developers?


Two scenarios:


1. On a low-traffic OLTP server with a few databases and only a few applications, there is little harm in giving developers read-only access.

but

2. What about a high-traffic OLTP server, where a developer tries to read a million rows
(SELECT * FROM Table --> 1 million rows)
or runs a badly written query: multiple table joins, not optimized, not using proper indexes?

This can lock tables, max out CPU, drive disk usage up, and bring down the production applications connected to the database.

An alternative is to replicate the databases/tables to a different server (a subscriber) and ask developers to read the data from the subscriber.


Conclusion:
Be careful about granting read access to developers, and decide whether to give it based on the two scenarios above.
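For reference, read-only access is usually granted by adding the developer's login to the db_datareader role; the login and database names below are placeholders:

```sql
USE YourDatabase;  -- placeholder database name
GO
CREATE USER [DOMAIN\Developer] FOR LOGIN [DOMAIN\Developer];  -- placeholder login
GO
-- SQL Server 2012 and later:
ALTER ROLE db_datareader ADD MEMBER [DOMAIN\Developer];
-- On SQL Server 2008 R2 and earlier, use:
-- EXEC sp_addrolemember 'db_datareader', 'DOMAIN\Developer';
```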


Tuesday, August 18, 2015

Outage with Network in SQL Server

We had a small outage today and looked at:
1. SQL Server logs
2. Event Viewer
3. Any jobs running?
4. Any blocking or deadlocks?

The likely cause: I was moving 50-70 GB of .trc files from the production F drive to a local computer / shared drive. That ate up a lot of network bandwidth, slowed the network, and a few customers saw an outage for a quick minute.

So, be careful when moving files from production to shared/slow drives outside the network.



SQL Server Profiler trace with T-SQL vs GUI

I created a trace to capture:
ApplicationName,
DatabaseName,
HostName,
LoginName,
NTDomainName,
NTUserName,
ServerName,
SessionLoginName,
StartTime,
TextData
Filter per database (if required)

And created a template with these events:
SQL:StmtStarting ---this gives all SQL statements (DDL and DML)
RPC:Completed (remote procedure calls)
SP:Completed (stored procedures)


I used the GUI first, and my C drive was filling up fast: when a trace runs through the GUI, Profiler buffers the output through temp files, which causes paging and quickly eats space on the C drive.

If we script out what we choose and make small modifications in T-SQL, monitoring becomes much easier.




USE below:
/****************************************************/
/* Created by: SQL Server 2008 R2 Profiler          */
/* Date: 08/18/2015  10:27:12 AM         */
/****************************************************/


-- Create a Queue
declare @rc int
declare @TraceID int
declare @maxfilesize bigint
declare @DateTime datetime

set @DateTime = '2015-08-18 11:00:05.000' -- stop time
set @maxfilesize = 2048 -- 2 GB max file size

-- Please replace the text InsertFileNameHere, with an appropriate
-- filename prefixed by a path, e.g., c:\MyFolder\MyTrace. The .trc extension
-- will be appended to the filename automatically. If you are writing from
-- remote server to local drive, please use UNC path and make sure server has
-- write access to your network share

exec @rc = sp_trace_create
@TraceID output,
2, -- TRACE_FILE_ROLLOVER: roll over to a new file when max size is reached
N'E:\mssql\August18th',
@maxfilesize,
@DateTime,
@filecount = 5 -- max number of rollover files to keep (must be an int; 5 is just an example)
if (@rc != 0) goto error

-- Client side File and Table cannot be scripted

-- Set the events
declare @on bit
set @on = 1
exec sp_trace_setevent @TraceID, 10, 7, @on
exec sp_trace_setevent @TraceID, 10, 8, @on
exec sp_trace_setevent @TraceID, 10, 64, @on
exec sp_trace_setevent @TraceID, 10, 1, @on
exec sp_trace_setevent @TraceID, 10, 10, @on
exec sp_trace_setevent @TraceID, 10, 26, @on
exec sp_trace_setevent @TraceID, 10, 11, @on
exec sp_trace_setevent @TraceID, 10, 35, @on
exec sp_trace_setevent @TraceID, 10, 12, @on
exec sp_trace_setevent @TraceID, 10, 6, @on
exec sp_trace_setevent @TraceID, 10, 14, @on
exec sp_trace_setevent @TraceID, 43, 7, @on
exec sp_trace_setevent @TraceID, 43, 8, @on
exec sp_trace_setevent @TraceID, 43, 64, @on
exec sp_trace_setevent @TraceID, 43, 1, @on
exec sp_trace_setevent @TraceID, 43, 10, @on
exec sp_trace_setevent @TraceID, 43, 26, @on
exec sp_trace_setevent @TraceID, 43, 11, @on
exec sp_trace_setevent @TraceID, 43, 35, @on
exec sp_trace_setevent @TraceID, 43, 12, @on
exec sp_trace_setevent @TraceID, 43, 6, @on
exec sp_trace_setevent @TraceID, 43, 14, @on
exec sp_trace_setevent @TraceID, 41, 7, @on
exec sp_trace_setevent @TraceID, 41, 8, @on
exec sp_trace_setevent @TraceID, 41, 64, @on
exec sp_trace_setevent @TraceID, 41, 1, @on
exec sp_trace_setevent @TraceID, 41, 10, @on
exec sp_trace_setevent @TraceID, 41, 26, @on
exec sp_trace_setevent @TraceID, 41, 11, @on
exec sp_trace_setevent @TraceID, 41, 35, @on
exec sp_trace_setevent @TraceID, 41, 12, @on
exec sp_trace_setevent @TraceID, 41, 6, @on
exec sp_trace_setevent @TraceID, 41, 14, @on


-- Set the Filters
declare @intfilter int
declare @bigintfilter bigint

exec sp_trace_setfilter @TraceID, 35, 0, 6, N'DBNAME' -- column 35 = DatabaseName, 6 = LIKE; replace DBNAME with your database
exec sp_trace_setfilter @TraceID, 35, 0, 1, NULL
-- Set the trace status to start
exec sp_trace_setstatus @TraceID, 1

-- display trace id for future references
select TraceID=@TraceID
goto finish

error:
select ErrorCode=@rc

finish:
go


To check the status of running traces:

 SELECT * FROM ::fn_trace_getinfo(NULL)
--EXEC sp_trace_setstatus @traceid = 2, @status = 0--stop

--EXEC sp_trace_setstatus @traceid = 2, @status = 2--close 


Paging issue when using the GUI:
http://www.sqlservercentral.com/Forums/Topic4497-5-1.aspx
http://mssqlwiki.com/2011/07/16/my-c-drive-gets-full-when-i-open-the-profiler-trace/

SQL Server Profiler uses a tmp file to buffer the actual output trace files. Once SQL Server Profiler is closed, these files are deleted automatically.
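Rather than opening large .trc files in the Profiler GUI, the files can also be read server-side with fn_trace_gettable; the path below matches the example trace created above:

```sql
-- Read the trace file into a result set (DEFAULT also picks up its rollover files)
SELECT TextData, DatabaseName, HostName, LoginName, StartTime
FROM fn_trace_gettable(N'E:\mssql\August18th.trc', DEFAULT)
ORDER BY StartTime DESC;
```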

Friday, August 14, 2015

Use Extended Events to capture Deadlocked Queries


 -- Create a new event session (it is better to create a new session and not modify the system’s built-in session “system_health”):
USE MASTER
GO
CREATE EVENT SESSION [Deadlock_Monitor] ON SERVER
ADD EVENT sqlserver.xml_deadlock_report
ADD TARGET package0.asynchronous_file_target
(SET filename= N'C:\temp\deadlock.xel' )
WITH
(MAX_MEMORY=4096 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,MAX_DISPATCH_LATENCY=10 SECONDS,
MAX_EVENT_SIZE=0 KB,MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=OFF,STARTUP_STATE=ON)


-- Enable the session to start capturing events:
USE MASTER
GO
ALTER EVENT SESSION [Deadlock_Monitor] ON SERVER STATE = start;


-- To see how many deadlocks have been captured by the session since it started running, you can run this query:
select COUNT(*) from sys.fn_xe_file_target_read_file ('c:\temp\deadlock*.xel', 'c:\temp\deadlock*.xem', null, null)


-- To get a list of the captured deadlocks and their graphs you can execute this query:
select xml_data.value('(event[@name="xml_deadlock_report"]/@timestamp)[1]','datetime') Execution_Time,
xml_data.value('(event/data/value)[1]','varchar(max)') Query
from (select object_name as event, CONVERT(xml, event_data) as xml_data
from sys.fn_xe_file_target_read_file
('c:\temp\deadlock*.xel', 'c:\temp\deadlock*.xem', null, null)) v order by Execution_Time


-- If you want the session to stop capturing events (until you enable the session again), you can use this query:
ALTER EVENT SESSION [Deadlock_Monitor] ON SERVER STATE = stop;


-- If you want to completely remove (delete) the session from the server, you can use this query:
DROP EVENT SESSION [Deadlock_Monitor] ON SERVER


Source:
http://blogs.msdn.com/b/john_daskalakis/archive/2013/12/02/how-to-use-extended-events-to-proactively-monitor-your-sql-server-for-deadlock-issues.aspx

Minimum SQL Server Alerts that could help

Alerts: against all databases

017 -- insufficient resources
018 -- nonfatal internal error
019 -- fatal error in resource
020 -- fatal error in current process
021 -- fatal error in current database process
022 -- table integrity suspect
023 -- database integrity suspect
024 -- fatal error (hardware)

Also alert on error numbers 823, 824, 825, and on blocking and deadlocks.
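A sketch of creating these severity alerts in a loop with sp_add_alert; the operator name 'DBA Team' is a placeholder and must already exist in msdb:

```sql
DECLARE @sev INT = 17 ,
        @name SYSNAME;
WHILE @sev <= 24
BEGIN
    SET @name = N'Severity ' + CAST(@sev AS NVARCHAR(3));
    EXEC msdb.dbo.sp_add_alert @name = @name, @severity = @sev;
    EXEC msdb.dbo.sp_add_notification @alert_name = @name ,
        @operator_name = N'DBA Team' ,   -- placeholder: an existing operator
        @notification_method = 1;        -- 1 = e-mail
    SET @sev += 1;
END
```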

Thursday, August 6, 2015

DBCC TRACEON (3604) and DBCC TRACEON (3605)

DBCC TRACEON (3604, -1) --turn ON globally; sends DBCC output to the client


DBCC TRACEON (3605, -1) --turn ON globally; sends DBCC output to the error log




DBCC TRACEOFF (3604, -1) --turn OFF
GO

DBCC TRACEOFF (3605, -1) --turn OFF
GO

There are advantages and disadvantages to using the above.
Disadvantage: the error log file can get huge.

Advantages below:

http://sql-articles.com/articles/general/day-1-trace-flag-3604-3605/

Tip on recently run queries


We can find recent queries from cache by using below query:

SELECT
    deqs.last_execution_time AS [Time],
    dest.TEXT AS [Query]
 FROM
    sys.dm_exec_query_stats AS deqs
    CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest
ORDER BY
    deqs.last_execution_time DESC

Once SQL Server is restarted or the plan cache is cleared with DBCC FREEPROCCACHE, all of these statements are gone.


Create a View WITH CHECK OPTION

What does  WITH CHECK OPTION mean? 


CREATE VIEW authors_CA AS
    ( 
        SELECT * FROM Authors WHERE state='CA'
    )
If you now issue the following command:
UPDATE authors_CA SET state='NJ'
then all the records visible in the view are updated with a different State value, and therefore would disappear from the view itself. This is perfectly legal, but can create subtle programming errors. You can avoid this problem by adding the WITH CHECK OPTION predicate when you create the view:
CREATE VIEW authors_CA AS
    ( 
        SELECT * FROM Authors WHERE state='CA'
    )
    WITH CHECK OPTION
Now any insert or update operation that makes a record disappear from the view raises a trappable run time error.
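With the second definition in place, the same update now fails instead of silently moving rows out of the view (message text abbreviated):

```sql
UPDATE authors_CA SET state = 'NJ';
-- Msg 550: The attempted insert or update failed because the target view either
-- specifies WITH CHECK OPTION ... and one or more rows resulting from the
-- operation did not qualify under the CHECK OPTION constraint.
```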

I have used the above today, 
Source:Msdn

Wednesday, August 5, 2015

Send email when DeadlockHappens and track Queries

CAUTION: this process reads from the error log, so please make sure you test it in a test environment before implementing it in production.






/*Capture deadlocks

CREATE an alert: Deadlock, ERROR 1205 > SQL Server performance condition alert > Locks > Number of Deadlocks/sec > _Total > rises above 0
Response > create a job with the script below and notify the DBA > options: email


---To ensure the deadlock-related errors are logged
EXEC master.sys.sp_altermessage 1205, 'WITH_LOG', TRUE;
GO
EXEC master.sys.sp_altermessage 3928, 'WITH_LOG', TRUE;
GO

--DBCC TRACEON (3605,1204,1222,-1)
Description:
3605 sends the DBCC output to the error log.
1204 returns the resources and types of locks participating in the deadlock and the current command affected.
1222 returns the resources and types of locks participating in the deadlock and the current command affected, in an XML format that does not conform to any XSD schema (more detail than 1204; available from SQL 2005 onward).
-1 turns the specified trace flags on globally.
DBCC TRACEON (1204, -1)
DBCC TRACEON (1222, -1) ----this can add around 100 lines to the log file per deadlock


*/



--== This is for SQL 2005 and higher. ==--
--== We will create a temporary table to hold the error log detail. ==--
--== Before we create the temporary table, we make sure it does not already exist. ==--
 IF OBJECT_ID('tempdb.dbo.ErrorLog') IS Not Null
 BEGIN
 DROP TABLE tempdb.dbo.ErrorLog
 END
 --== We have checked for the existence of the temporary table and dropped it if it was there. ==--
 --== Now, we can create the table called tempdb.dbo.ErrorLog ==--
CREATE TABLE tempdb.dbo.ErrorLog (Id int IDENTITY (1, 1) NOT NULL,
logdate DATETIME, procInfo VARCHAR(10), ERRORLOG VARCHAR(MAX))
--== We create a table holding the three columns returned by sp_readerrorlog, plus an Id. ==--
--== Then we insert the actual data from the Error log into our newly created table. ==--
--== (The column list is required because of the IDENTITY column.) ==--
 INSERT INTO tempdb.dbo.ErrorLog (logdate, procInfo, ERRORLOG)
 EXEC master.dbo.sp_readerrorlog 0
--== With our table created and populated, we can now use the info inside of it. ==--
 BEGIN
--== Set a variable to get our instance name. ==--
--== We do this so the email we receive makes more sense. ==--
 declare @servername nvarchar(150)
 set @servername = @@servername
--== We set another variable to create a subject line for the email. ==--
 declare @mysubject nvarchar(200)
 set @mysubject = 'Deadlock event notification on server '+@servername+'.'
 --== Now we will prepare and send the email. Change the email address to suite your environment. ==--
 EXEC msdb.dbo.sp_send_dbmail
 @profile_name='Default',
 @recipients='abc@xyz.com',
 @subject = @mysubject,
 @body = 'Deadlock has occurred. View attachment to see the deadlock info',
 @query = 'select logdate, procInfo, ERRORLOG from tempdb.dbo.ErrorLog where Id >= (select TOP 1 Id from tempdb.dbo.ErrorLog WHERE ERRORLOG Like ''%Deadlock encountered%'' order by Id DESC)',
 @query_result_width = 600,
 @attach_query_result_as_file = 1
 END
 --== Clean up our process by dropping our temporary table. ==--
 DROP TABLE tempdb.dbo.ErrorLog



 --------------testing

-----------------------------------TESTING Below
 --DROP TABLE ##Employees
 --DROP TABLE ##Suppliers

USE AuditDB
 GO
 CREATE TABLE ##Employees ( EmpId INT IDENTITY, EmpName VARCHAR(16), Phone VARCHAR(16) )
 GO
 INSERT INTO ##Employees (EmpName, Phone)
 VALUES ('Martha', '800-555-1212'), ('Jimmy', '619-555-8080')
 GO
 
 CREATE TABLE ##Suppliers( SupplierId INT IDENTITY, SupplierName VARCHAR(64), Fax VARCHAR(16) )
 GO
 INSERT INTO ##Suppliers (SupplierName, Fax) VALUES ('Acme', '877-555-6060'), ('Rockwell', '800-257-1234')
 GO

 --TAB1
 BEGIN TRAN;
 UPDATE ##Suppliers
 SET Fax = N'676-1019'
 WHERE supplierid = 1


 ----IN the query tab 2
 --BEGIN TRAN;
 --UPDATE ##Employees
 --SET EmpName = 'Brian'
 --WHERE empid = 1



 --Now again in the query tab 1 EXECUTE
 UPDATE ##Employees
 SET phone = N'555-9999'
 WHERE empid = 1
 COMMIT


 ----And in the query tab 2  EXECUTE
 --UPDATE ##Suppliers
 --SET Fax = N'676-1019'
 --WHERE supplierid = 1
 --COMMIT 

Send email when Blocking Happens and Track Queries

 --This script will send an email to the operator if the blocked-processes counter rises above 0
USE [msdb]
GO
EXEC msdb.dbo.sp_update_alert @name=N'Blocking',
@message_id=0,
@severity=0,
@enabled=1,
@delay_between_responses=120,
@include_event_description_in=0,
@database_name=N'',
@notification_message=N'',
@event_description_keyword=N'',
@performance_condition=N'SQLServer:General Statistics|Processes blocked||>|0',
@wmi_namespace=N'',
@wmi_query=N'',
@job_id=N'401a9e4d-ae94-47ca-9ac4-deb4c244690a'
GO










-----------------------------JOB NAME "Blocking Job"
 --------------------when blocking happens to notify what script blocking , we can have the below script run to keep track of blocking transaction
--CREATE TABLE AuditBlocking
--    (
--      session_id INT ,
--      Databasename VARCHAR(100) ,
--      DatetimeEvent DATETIME NOT NULL
--                             DEFAULT ( GETDATE() ) ,
--      SQL_text VARCHAR(MAX) ,
--      blocking_session_id INT ,
--      wait_time INT ,
--      wait_type VARCHAR(100) ,
--      last_wait_type VARCHAR(100) ,
--      wait_resource VARCHAR(100) ,
--      transaction_isolation_level INT ,
--      lock_timeout INT
--    )
--GO


      WAITFOR DELAY '00:01:00';
      IF EXISTS ( SELECT    session_id ,
                            DB_NAME(database_id) AS 'Database_Name' ,
                            GETDATE() ,
                            sql_text.text AS 'Text' ,
                            blocking_session_id ,
                            wait_time , --milliseconds (1000 ms = 1 sec)
                            wait_type ,
                            last_wait_type ,
                            wait_resource ,
                            transaction_isolation_level ,
                            lock_timeout
                  FROM      sys.dm_exec_requests er
                            CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sql_text
                  WHERE     ( er.wait_time / 1000 ) > 60 --waiting longer than 60 seconds
)
        BEGIN

            INSERT  INTO AuditBlocking---i have created this table before
                    SELECT  session_id ,
                            DB_NAME(database_id) AS 'Database_Name' ,
                            GETDATE() ,
                            sql_text.text AS 'Text' ,
                            blocking_session_id ,
                            wait_time , --milliseconds (1000 ms = 1 sec)
                            wait_type ,
                            last_wait_type ,
                            wait_resource ,
                            transaction_isolation_level ,
                            lock_timeout
                    FROM    sys.dm_exec_requests er
                            CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sql_text
                    WHERE   ( er.wait_time / 1000 ) > 60;  --waiting longer than 60 seconds


            DECLARE @AlertMessage VARCHAR(MAX);
            SET @AlertMessage = 'There is Blocking happening on' + ' '
                + @@SERVERNAME + ' '
                + 'Server. Please see AuditDB.dbo.AuditBlocking Table for more Information';
            EXEC msdb.dbo.sp_send_dbmail @profile_name = 'Default',
                @recipients = 'abc@xyz.com', @body = @AlertMessage,
                @importance = 'Low', @subject = 'Blocking Alert';

        END;















-----------------------------TESTING
USE AuditDB
GO
CREATE TABLE ##Employees
    (
      EmpId INT IDENTITY ,
      EmpName VARCHAR(16) ,
      Phone VARCHAR(16)
    )
GO
INSERT  INTO ##Employees
        (EmpName,Phone )
VALUES  ('Martha','800-555-1212'),
        ('Jimmy','619-555-8080')
GO
 
CREATE TABLE ##Suppliers
    (
      SupplierId INT IDENTITY ,
      SupplierName VARCHAR(64) ,
      Fax VARCHAR(16)
    )
GO
INSERT  INTO ##Suppliers
        ( SupplierName, Fax )
VALUES  ('Acme', '877-555-6060' ),
        ('Rockwell', '800-257-1234' )
GO

USE AuditDB
GO
BEGIN TRANSACTION
SELECT  *
FROM    ##Employees WITH ( TABLOCKX, HOLDLOCK );
WAITFOR DELAY '00:02:00'
 ---Hold the lock for two minutes!
ROLLBACK TRANSACTION
--Release the lock



 ----another query window
 SELECT *
 FROM   ##Employees





 --------------------------check results


 SELECT * FROM dbo.AuditBlocking
 ORDER BY DatetimeEvent asc




SELECT * FROM AuditBlocking
--WHERE   blocking_session_id>=50


/*testing
 SELECT * FROM AuditBlocking
 SP_who2 'Active'

--SELECT * FROM sys.messages
--WHERE message_id IN (823,824,825) AND language_id=1033

--SELECT * FROM sys.messages
--WHERE severity IN (016,017,018,019,020,021,022,023,024,025) AND language_id=1033
*/

How to make views run faster in SQL Server

I created a view containing a bunch of SELECT queries with multiple table joins. Running the view took 45 seconds to return 100,000 rows. Even after I added a clustered index on the view, it took the same time, and the execution plan showed the index was not being used at all.

My manager told me to add WITH (NOEXPAND):
SELECT * from View WITH (NOEXPAND)
--it took 2 seconds to return 100,000 rows
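For context, this is the indexed-view pattern: the view must be created WITH SCHEMABINDING before a clustered index can be added, and on editions other than Enterprise the optimizer only uses the view's index when NOEXPAND is specified. A minimal sketch (table and column names are illustrative):

```sql
CREATE VIEW dbo.vSalesTotals WITH SCHEMABINDING
AS
SELECT s.ProductId ,
       SUM(s.Qty)   AS TotalQty ,
       COUNT_BIG(*) AS RowCnt    -- required in an indexed view with GROUP BY
FROM dbo.Sales AS s
GROUP BY s.ProductId;
GO
CREATE UNIQUE CLUSTERED INDEX IX_vSalesTotals ON dbo.vSalesTotals (ProductId);
GO
SELECT * FROM dbo.vSalesTotals WITH (NOEXPAND);  -- force use of the view's index
```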
