My company has just gone through its annual review process, and I have finally convinced them that it is time to find a better solution to manage our SQL schema/scripts. Cur
Try "SQL Effects Clarify" which is pretty good tool to compare most of the objects including row counts for FREE. Also there are tools that compares data too.
I'm in the "script it yourself" camp, as third-party products will only get you so far at managing database code. I don't have one script per object, because objects change over time, and nine times out of ten merely updating my "create table" script to have three new columns would be inadequate.
Creating databases is, by and large, trivial. Set up a bunch of CREATE scripts, order them properly (create the database before schemas, schemas before tables, tables before procedures, called procedures before calling procedures, etc.), and you're done. Managing database change is not nearly as simple:
Essentially, what I have is a CREATE script for each database, followed by a series of ALTER scripts as our code base changes over time. Every script checks whether it can be run: is this the right "kind" of database? Have the necessary prerequisite scripts been run? Has this script already been run? Only when these checks pass does the script perform its changes.
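To illustrate (a made-up sketch, not my actual scripts: the dbo.AppliedScripts table, the script names, and the dbo.Customer change are all invented), a guarded ALTER script might look like this:

-- Tracking table for applied scripts (created once)
IF OBJECT_ID(N'dbo.AppliedScripts') IS NULL
CREATE TABLE dbo.AppliedScripts (
ScriptName nvarchar(200) NOT NULL PRIMARY KEY
,AppliedBy sysname NOT NULL
,AppliedAt datetime NOT NULL
)
GO
IF DB_NAME() <> N'MyAppDb'
-- wrong "kind" of database
RAISERROR (N'This script does not apply to this database.', 16, 1)
ELSE IF NOT EXISTS (SELECT 1 FROM dbo.AppliedScripts WHERE ScriptName = N'0041_create_customer')
-- prerequisite script has not been run
RAISERROR (N'Prerequisite script 0041_create_customer is missing.', 16, 1)
ELSE IF EXISTS (SELECT 1 FROM dbo.AppliedScripts WHERE ScriptName = N'0042_customer_add_email')
-- this script has already been run
PRINT N'0042_customer_add_email already applied; skipping.'
ELSE
BEGIN
-- all checks passed: perform the change and record who applied it, and when
ALTER TABLE dbo.Customer ADD Email nvarchar(256) NULL
INSERT INTO dbo.AppliedScripts (ScriptName, AppliedBy, AppliedAt)
VALUES (N'0042_customer_add_email', SUSER_SNAME(), GETDATE())
END
GO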
Tool-wise, we use SourceGear Fortress for basic source control, Redgate SQL Compare for general support and troubleshooting, and a number of home-grown scripts based on SQLCMD for "bulk" deployment of the alter scripts to multiple servers and databases, and to track who applied which scripts to which databases at what time. End result: all our databases are consistent and stable, and we can readily prove what version any one is or was at any point in time.
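The SQLCMD wrapper idea, roughly (file names invented; saved as deploy.sql and run with something like sqlcmd -S <server> -d <database> -i deploy.sql): it simply applies the scripts in dependency order and lets each script's own checks decide whether there is anything to do.

-- deploy.sql (SQLCMD mode): apply the scripts in dependency order
:on error exit
:r .\0001_create_schemas.sql
:r .\0002_create_tables.sql
:r .\0041_create_customer.sql
:r .\0042_customer_add_email.sql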
I have an open-source (LGPL-licensed) toolset project which tries to address the issues of proper DB schema versioning for SQL Server (2005/2008/Azure) and more: the bsn ModuleStore. The whole process is very close to the concept explained in Philip Kelley's post here.
Basically, the standalone part of the toolset scripts the SQL Server DB objects of a DB schema into files with a standard formatting applied, so that the file contents only change if the object really did change (very much in contrast to the scripting done by VS, which also scripts a scripting date and the like, marking all objects as changed even when they are in fact identical).
But the toolset goes beyond that if you use .NET: it allows you to embed the SQL scripts into the library or application (as embedded resources) and then have it compare the embedded scripts with the current state in the database. Non-table-related changes (those that are not "destructive changes" as per Martin Fowler's definition) can be applied automatically or on request, e.g. creating and removing objects such as views, functions, stored procedures, types, and indexes. Change scripts (which do need to be written manually) can be applied in the same process as well, and new tables are created too, optionally along with their setup data. After the update, the DB schema is compared against the scripts once more in order to ensure a successful DB upgrade before the changes are committed.
Note that the whole scripting and comparison code works without SMO, so that you don't have the painful SMO dependency when using the bsn ModuleStore in applications.
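To illustrate the comparison idea only (this is not how bsn ModuleStore is implemented, and dbo.GetCustomer is an invented name), the text of a versioned script can be checked against what is currently in the database:

-- Conceptual sketch: compare a versioned script's text with the live definition
DECLARE @scriptText nvarchar(max)
DECLARE @dbText nvarchar(max)
-- @scriptText would normally come from the embedded resource / script file
SET @scriptText = N'CREATE PROCEDURE dbo.GetCustomer ...'
SET @dbText = OBJECT_DEFINITION(OBJECT_ID(N'dbo.GetCustomer'))
IF @dbText IS NULL
PRINT N'Object is missing in the database.'
ELSE IF @dbText <> @scriptText
PRINT N'Definition in the database differs from the scripted version.'
ELSE
PRINT N'Definitions match.'
-- A real comparison has to normalize formatting first, which is exactly why the
-- toolset scripts objects with a standard formatting applied.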
Depending on how you want to access the database, the toolset offers even more: it implements some ORM capabilities and offers a very nice and useful interface-based approach to invoking stored procedures, including transparent support for XML with native .NET XML classes and for TVPs (table-valued parameters) as IEnumerable<PocoClass>.
Here is my script to track stored procedures, UDFs, and triggers in a table:

1. Create a table to hold the existing stored procedure source code.
2. Insert the existing procedure, view, and trigger definitions into the table.
3. Create a DDL trigger to monitor changes to them.
/****** Object: Table [dbo].[sysupdatelog] Script Date: 9/17/2014 11:36:54 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[sysupdatelog] (
[id] [bigint] IDENTITY(1, 1) NOT NULL
,[UpdateUsername] [nvarchar](1000) NULL
,[actionname] [nvarchar](1000) NULL
,[objectname] [nvarchar](1000) NULL
,[type] [nvarchar](1000) NULL
,[createDate] [datetime] NULL
,[content] [nvarchar](max) NULL -- nvarchar(max) instead of the deprecated ntext type
,CONSTRAINT [PK_sysupdatelog] PRIMARY KEY CLUSTERED ([id] ASC) WITH (
PAD_INDEX = OFF
,STATISTICS_NORECOMPUTE = OFF
,IGNORE_DUP_KEY = OFF
,ALLOW_ROW_LOCKS = ON
,ALLOW_PAGE_LOCKS = ON
) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[sysupdatelog] ADD CONSTRAINT [DF_sysupdatelog_content] DEFAULT('')
FOR [content]
GO
INSERT INTO [dbo].[sysupdatelog] (
[UpdateUsername]
,[actionname]
,[objectname]
,[type]
,[createDate]
,[content]
)
SELECT 'sa'
,'loginitialdata'
,r.ROUTINE_NAME
,r.ROUTINE_TYPE
,GETDATE()
-- ROUTINE_DEFINITION is truncated at 4000 characters, so fetch the full text instead
,OBJECT_DEFINITION(OBJECT_ID(QUOTENAME(r.ROUTINE_SCHEMA) + '.' + QUOTENAME(r.ROUTINE_NAME)))
FROM INFORMATION_SCHEMA.ROUTINES r
UNION ALL
SELECT 'sa'
,'loginitialdata'
,v.TABLE_NAME
,'view'
,GETDATE()
-- VIEW_DEFINITION is likewise truncated at 4000 characters
,OBJECT_DEFINITION(OBJECT_ID(QUOTENAME(v.TABLE_SCHEMA) + '.' + QUOTENAME(v.TABLE_NAME)))
FROM INFORMATION_SCHEMA.VIEWS v
UNION ALL
SELECT 'sa'
,'loginitialdata'
,o.NAME
,'trigger'
,GETDATE()
,m.DEFINITION
FROM sys.objects o
JOIN sys.sql_modules m ON o.object_id = m.object_id
WHERE o.type = 'TR'
GO
CREATE TRIGGER [SCHEMA_AUDIT] ON DATABASE
FOR CREATE_PROCEDURE
,ALTER_PROCEDURE
,DROP_PROCEDURE
,CREATE_INDEX
,ALTER_INDEX
,DROP_INDEX
,CREATE_TRIGGER
,ALTER_TRIGGER
,DROP_TRIGGER
,ALTER_TABLE
,ALTER_VIEW
,CREATE_VIEW
,DROP_VIEW AS
BEGIN
SET NOCOUNT ON
DECLARE @data XML
-- EVENTDATA() returns an XML document describing the DDL event that fired this trigger
SET @data = EVENTDATA()
-- Name the target columns explicitly so the insert does not depend on column order
INSERT INTO [dbo].[sysupdatelog] (
[UpdateUsername]
,[actionname]
,[objectname]
,[type]
,[createDate]
,[content]
)
VALUES (
@data.value('(/EVENT_INSTANCE/LoginName)[1]', 'nvarchar(255)')
,@data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(255)')
,@data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(255)')
,@data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'nvarchar(255)')
,GETDATE()
,@data.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)')
)
SET NOCOUNT OFF
END
GO
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
ENABLE TRIGGER [SCHEMA_AUDIT] ON DATABASE
GO
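Once the trigger is in place, the log can be queried to see who changed which object and when, for example:

-- Most recent schema changes captured by the SCHEMA_AUDIT trigger
SELECT TOP (20) [UpdateUsername]
,[actionname]
,[objectname]
,[type]
,[createDate]
FROM [dbo].[sysupdatelog]
ORDER BY [createDate] DESC
,[id] DESC
GO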