3/01/2012

Scott Hanselman - Visual Studio 11 Beta in Context


Today Visual Studio 11 Beta is released and available for download. Don't want to read a big blog post? Phooey on you then! ;)
Made it this far? OK, cool. I wanted to do a post that would not only point you to a bunch of other resources, but more generally answer the obvious questions. The questions that I asked before I went to work for Microsoft four years ago. I always ask: What's changed, and why should I care?

"One ASP.NET"

One of the things that the fellows and I are working on that will be more obvious  after the beta and even a little after the final release is this idea of One ASP.NET. We're sweeping through the whole framework, samples, templates and NuGet to make sure things work together cohesively. You'll hear more about this as we firm it up.
Some guiding principles for ASP.NET are these:
  • Every reasonable combination of subsystems just works
  • The model is easily extended by community projects (not just us!)
  • Every subsystem or application type includes a *.Sample that works together with others
Here's the boxes diagram I've been playing with.

These principles are some of the things that drove (and continue to drive) ASP.NET this development cycle. We are trying to give you great mobile options, great HTML5, CSS3 and JavaScript support and also including open source libraries like Modernizr, jQuery, jQuery UI and Knockout.
We are working towards a more pluggable, more friendly but still powerful ASP.NET. Again, more on this soon and some surprises. We'll see more interesting uses of NuGet, more pluggability, more examples, and more systems working together.
You want to mix and match ASP.NET Web API, serialization with JSON.NET, use MongoDB, run Web Pages and Web Forms side by side, add some SignalR and a few WCF enterprise web services? Add in ELMAH, Glimpse, Image Resizer and your favorite NuGet packages? Totally. It's encouraged. It's all One ASP.NET.

.NET Framework 4.5 Beta

For the most part in my experience, .NET 4.5 is a very compatible release. .NET 4.5 upgrades .NET 4 as .NET 3.5 upgraded .NET 3 (and 2, although we're trying to play by the versioning rules now, thank goodness.) The vast majority of .NET 4 apps should work fine on .NET 4.5 unless you are doing something exotic. I haven't had any problems myself, but I've heard of some unusual edge cases with folks doing ILMerge and a few other things.
There are a number of new improvements. Some of my personal favorites and features I'm most interested in are (these are somewhat obscure, but nice fixes, IMHO):
  • Ability to limit how long the regular expression engine will attempt to resolve a regular expression before it times out (see the sketch after this list)
  • Zip compression improvements to reduce the size of a compressed file.
  • Better performance when retrieving resources.
  • Updates to MEF to better support generics.
  • New asynchronous methods in I/O classes for asynchronous file operations
  • Support for Internationalized Domain Name parsing
  • WPF Ribbon Control
  • WCF HTTPS protocol mapping
  • WCF Asynchronous streaming support
  • WCF contract-first development as well as ?singleWSDL for service URLs
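The regular expression timeout is worth a quick illustration. Here's a minimal sketch (the pattern, input and timeout value are made up purely to force a timeout):

using System;
using System.Text.RegularExpressions;

class RegexTimeoutSketch
{
    static void Main()
    {
        // New in .NET 4.5: a Regex can be given a timeout so a pathological
        // pattern/input combination can't spin the engine indefinitely.
        var pattern = new Regex(@"^(a+)+$", RegexOptions.None, TimeSpan.FromMilliseconds(250));

        try
        {
            Console.WriteLine(pattern.IsMatch("aaaaaaaaaaaaaaaaaaaaaaaaaaaa!"));
        }
        catch (RegexMatchTimeoutException)
        {
            // The engine gave up instead of hanging.
            Console.WriteLine("Regex timed out.");
        }
    }
}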
Test your apps and PLEASE tell us if you have trouble. This is a beta and there is still time to fix things.
Please don’t hesitate to post a comment on team blogs, or at one of the forums that are actively monitored: Connect (report bugs), UserVoice (request features) and MSDN Forums (ask for help). I know that folks have issues with Connect sometimes, but folks are actively monitoring all these places and are trying to get back to you with clear answers.

ASP.NET Core Framework

Here's a detailed release notes document about what's new in ASP.NET 4.5 and Visual Studio "Web Developer" 11 Beta. The core ASP.NET framework has a lot of new support around asynchrony. Asynchrony has been a theme throughout the whole Visual Studio 11 process and ASP.NET is full of improvements around this area.
There's support for the await keyword, and Task-based modules and handlers.
private async Task ScrapeHtmlPage(object caller, EventArgs e)
{
    WebClient wc = new WebClient();
    var result = await wc.DownloadStringTaskAsync("http://www.microsoft.com");
    // Do something with the result
}
Even IHttpAsyncHandler (a classic, and a difficult thing to get right) has a friend now:
public class MyAsyncHandler : HttpTaskAsyncHandler
{
    // ...
    // ASP.NET automatically takes care of integrating the Task based override
    // with the ASP.NET pipeline.
    public override async Task ProcessRequestAsync(HttpContext context)
    {
        WebClient wc = new WebClient();
        var result = await wc.DownloadStringTaskAsync("http://www.microsoft.com");
        // Do something with the result
    }
}
There are security improvements with the inclusion of core encoding routines from the popular AntiXSS library, and you can plug in your own encoder.
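As a rough sketch of that plug-in point, a custom encoder that routes HTML encoding through the AntiXSS routines might look something like this (the class name, namespace and the web.config registration shown in the comment are hypothetical, not from the original post):

using System.IO;
using System.Web.Security.AntiXss;
using System.Web.Util;

namespace MySite
{
    // A sketch of a custom encoder that delegates HTML encoding to the AntiXSS routines.
    // It would be registered in web.config, e.g.:
    //   <httpRuntime encoderType="MySite.MyAntiXssEncoder, MySite" />
    public class MyAntiXssEncoder : HttpEncoder
    {
        protected override void HtmlEncode(string value, TextWriter output)
        {
            // Use named entities (e.g. &eacute;) where available.
            output.Write(AntiXssEncoder.HtmlEncode(value, useNamedEntities: true));
        }
    }
}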
ASP.NET also has WebSockets support when running on Windows 8:
public async Task MyWebSocket(AspNetWebSocketContext context)
{
    WebSocket socket = context.WebSocket;
    while (true)
    {
        ...
    }
}
Bundling and minification are built in and are also pluggable, so you can swap in your own techniques or your favorite open source alternative.
There are lots of performance improvements, including features for dense workloads that can get up to a 35% reduction in startup time and memory footprint with .NET 4.5 and Windows 8.
ASP.NET 4.5 also supports multi-core JIT compilation for faster startup and more support for tuning the GC for your server's specific needs.

ASP.NET Web Forms

There are lots of refinements and improvements in Web Forms. Some favorites are the strongly-typed data controls. I blogged about this before in my Elegant Web Forms post. There's two-way data binding in controls like the FormView now, instead of using Eval(), and you'll also get IntelliSense in things like Repeaters with a strongly typed ModelType.
Web Forms also gets Model Binding (all part of the One ASP.NET strategy) which is familiar to ASP.NET MVC users. Note the GetCategories call below that will bind to a View with IQueryable.
public partial class Categories : System.Web.UI.Page
{
    private readonly DemoWhateverDataContext _db = new DemoWhateverDataContext();

    public void Page_Load()
    {
        if (!IsPostBack)
        {
            // Set default sort expression
            categoriesGrid.Sort("Name", SortDirection.Ascending);
        }
    }

    public IQueryable<Category> GetCategories()
    {
        return _db.Categories;
    }
}
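The markup side of that page might look something like the sketch below. The control name matches the code-behind above, but the attribute values are illustrative; in this beta the strongly-typed property is named ModelType, and it typically takes a fully qualified type name.

<%-- Illustrative markup only: ModelType typically takes the fully qualified type name. --%>
<asp:GridView ID="categoriesGrid" runat="server"
    ModelType="Category"
    SelectMethod="GetCategories"
    AllowSorting="true" />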
In this example, rather than digging around in the Request.QueryString, we get our keyword parameter this way:
public IQueryable<Product> GetProducts([QueryString] string keyword)
{
    IQueryable<Product> query = _db.Products;
    if (!String.IsNullOrWhiteSpace(keyword))
    {
        query = query.Where(p => p.ProductName.Contains(keyword));
    }
    return query;
}
Web Forms also gets unobtrusive validation, HTML5 updates and elements, and those of you who like jQuery but also like Web Forms controls (as well as Ajax Control Toolkit fans) will be thrilled to check out the JuiceUI project. It's an open-source collection of ASP.NET Web Forms components that makes jQuery UI available in a familiar way for Web Forms people.

ASP.NET MVC and Web API

Last week I blogged about Making JSON Web APIs with ASP.NET MVC 4 Beta and ASP.NET Web API. ASP.NET MVC 4 includes these new features (and a few more) and is included in Visual Studio 11 Beta.
  • ASP.NET Web API
  • Refreshed and modernized default project templates
  • New mobile project template
  • Many new features to support mobile apps
  • Recipes to customize code generation
  • Enhanced support for asynchronous methods
  • Read the full feature list in the release notes
Matt Milner has a great post on where ASP.NET Web API and WCF proper meet and diverge, and why you'd use one over the other. I'll be doing a more detailed post on this also, but I like Matt's quote:
WCF remains the framework for building services where you care about transport flexibility. WebAPI is the framework for building services where you care about HTTP.

ASP.NET Web Pages 2

New features include the following:
  • New and updated site templates.
  • Adding server-side and client-side validation using the Validation helper (see the sketch after this list).
  • The ability to register scripts using an assets manager.
  • Enabling logins from Facebook and other sites using OAuth and OpenID.
  • Adding maps using the Maps helper.
  • Running Web Pages applications side-by-side.
  • Rendering pages for mobile devices.
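As a rough sketch of the Validation helper mentioned above (the field name and form are made up for illustration):

@{
    // Declare the rule; Validation.IsValid() checks it on postback.
    Validation.RequireField("email", "An email address is required.");

    if (IsPost && Validation.IsValid())
    {
        var email = Request.Form["email"];
        // Save the registration here...
    }
}
<form method="post">
    <input type="text" name="email" />
    @Html.ValidationMessage("email")
    <input type="submit" value="Register" />
</form>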
There's lots more to talk about in Razor and Web Pages 2 that I will cover when WebMatrix 2 comes out.

Visual Studio - for Web Developers

Lots of new web features - hello, Opera users! There's an extensive list of features and fixes on the Web Developer Tools team blog. Here are my favorites.
The HTML Editor is smart about HTML5 and you can develop smart HTML5 sites with any ASP.NET technique.
The CSS Editor has a new formatter, color picker, better indentation, smart snippets and vendor-specific IntelliSense. That's Webkit and Opera in the screenshot over there.
The JavaScript Editor has IE10's JavaScript engine and supports JavaScript as a first-class citizen with all the support you get in other languages, like Go To Definition, brace matching, and more.
Page Inspector is all new and lets you see which elements in the source files (including server-side code) have produced the HTML markup that is rendered to the browser. IIS Express is now the default web application host.

All the Links

General Info

Download

Secondary Downloads (for the IT folks)
Got Visual Studio issues? Complain (kindly) and vote up features and concerns at their UserVoice site.
Got ASP.NET issues? Complain to me (kindly) and vote up features and concerns at our UserVoice site or ask questions in the ASP.NET forums. There will also be new videos, tutorials and information at http://asp.net/vnext and we are continuing to update the site with fresh content.
Hope you enjoy the Beta. Please do take a moment to install it, try it out, and offer feedback. There is time for us to make changes, fixes and improvements but only if you give feedback.

Sponsor: My thanks to DevExpress for sponsoring this week's feed. There is no better time to discover DevExpress. Visual Studio 11 beta is here and DevExpress tools are ready! Experience next generation tools, today.


© 2012 Scott Hanselman. All rights reserved.


2/23/2012

Recovering a SQL Server Database from Suspect Mode

A couple of days back I got a call from my support team informing me that one of our databases located on the Production Server went into Suspect Mode. The version used was SQL Server 2005 Service Pack 3. Being a Production Database server, it was a Priority 1 incident and the expected time of resolution was 4 hours.
Solution:
The first step was to identify why this incident occurred, and after investigation it was found that it was due to corruption of the transactional log file of the database.
I connected to SSMS using the sa login credentials and located the SUSPECT database:

I then reset the status of the SUSPECT Database by executing the below T-SQL query against the master database.
EXEC sp_resetstatus 'test_dr';
sp_resetstatus turns off the suspect flag on a database. This procedure updates the mode and status columns of the named database in sys.databases. Also note that only logins having sysadmin privileges can perform this:

As you can see in the above screen capture, the T-SQL query gave the warning message upon execution:
You must recover this database prior to access
The next step was to set the SUSPECT database into an EMERGENCY mode. This was done by executing the below SQL query against the master database.
ALTER DATABASE test_dr SET EMERGENCY
Once the database is set to EMERGENCY mode it becomes a READ_ONLY copy and only members of the sysadmin fixed server role have privileges to access it. The basic purpose of this is to facilitate troubleshooting; I did not want other users updating the database while it was being worked on.

As you can see from the above screen capture, once the T-SQL query got executed successfully the state of the database changed from SUSPECT to EMERGENCY.
Once the database state was changed to EMERGENCY, I performed a consistency check by executing the below T-SQL query against the master database.
DBCC checkdb('test_dr')
Which resulted in the below output:

As seen from the above screen capture there is no issue with respect to consistency of the test_dr database. Also, this confirmed that the logical and physical integrity of the database was intact.
The next step was to set the database to SINGLE USER mode with ROLLBACK IMMEDIATE. To do this the below SQL query was executed against the master database.
ALTER DATABASE
test_dr SET SINGLE_USER
WITH ROLLBACK IMMEDIATE
The above query will roll back any open transactions in the test_dr database and bring the database into Single User mode.
Please refer to the screen capture below:

The next step was to perform a DBCC Checkdb along with Repair with Data Loss by executing the below T-SQL query against the master database.
DBCC CheckDB ('test_dr', REPAIR_ALLOW_DATA_LOSS)
This query will attempt to repair all reported errors. These repairs can cause some data loss.
Once the DBCC CheckDB with the REPAIR_ALLOW_DATA_LOSS option was executed, the database went into Single User mode as shown below:

After performing the above step the database was brought ONLINE and Multiple Users access was enabled by executing the below T-SQL query against the master database.
ALTER DATABASE test_dr SET MULTI_USER
Please refer to the screen capture below.

As you can see from the above screen capture the database named test_dr is back ONLINE. I am even able to view its objects as shown below:

As a final step for safety, I again checked the consistency of the database which was just repaired and brought ONLINE (i.e. the test_dr database) by executing the below T-SQL query against the master database.
 DBCC CheckDB ('test_dr')

After performing the above steps I ensured that all the required logins had access to the database with proper privileges. The application started working fine and the business was back on track. It took just 38 minutes to bring the SUSPECT database back ONLINE.

Pareto Charts in SSRS

The purpose of a Pareto chart is to highlight the most important amongst a set of factors. For example, in quality control for a manufacturer, a Pareto chart can highlight the most common sources of defects and the highest occurring type of defect.
The Pareto principle is also known as the 80-20 rule: for quality control in a manufacturing environment, 80% of defects may be expected to come from 20% of the manufacturing issues.

Let us say we need to display the below data in a Pareto chart.
After creating the SSRS report, drag and drop the chart and configure it as a bar chart. Then drag and drop the Model Name to the x-axis and Sales Amount to the data region.
Select the graph area or chart series (note that you need to select the bars of the graph) and then press F4 (properties). In the custom attributes, select Pareto for ShowColumnAs as shown below:

You will then be able to generate your Pareto Chart.

In this graph, the individual sales items are shown as the bars, and the line is the cumulative total, which shows that 80% of the sales are generated from the five best-selling models.

Knockout Session from South Florida Code Camp

by JohnPapa.net 

I had a great time at the South Florida Code Camp last weekend presenting a whirlwind tour of Knockout and JavaScript patterns. The rooms were small and way overpacked, but I'll take that as a sign that the topic is popular. :)
The Knockout session is a whirlwind tour of KnockoutJS's features. If you like it and want to see more in-depth material on Knockout, you can check out my full course at Pluralsight titled Building HTML5 and JavaScript Apps with MVVM and KnockoutJS.
Here are the slides and sample code from the presentation at code camp. Thanks for attending!

2/15/2012

Adding Custom Code to Reporting Services 2008 R2


There are numerous circumstances when we wish to add a custom function to an SSRS report in order to cater for needs of the customer that exceed the capability of the built-in functions in SSRS. In these scenarios we will have to write our own functions. In this article I will demonstrate how to add custom code to SSRS.

Using SQL Server 2008 R2 Business Intelligence Studio

Using Custom Code inside a report: In this article we will consider a scenario where we want to design a KPI using custom code inside a report, via the Code tab of the report properties. We will write the code below in VB, as VB is the only language currently supported for custom code in SSRS. Create a new SSRS project in BIDS and add a new report item from the templates.
1. Create a dataset using AdventureWorks as the data source with the following query:

SELECT  top 1   Name,  StandardCost, ListPrice, ListPrice - StandardCost AS ProductProfit
FROM    Production.Product
WHERE   (StandardCost >= 1000)
2. Select Report-> Report Properties from the menu.

3. Select Code tab from the report property window

4. At present only VB is supported for writing custom code inside the report. Copy and paste the below code into the code window and click OK:

 Public Shared Function Test(ByVal profit As Decimal) As String
        Dim st As String
        If profit >= 1000 Then
            st = "High Profit"
        ElseIf profit >= 500 Then
            st = "Moderate Profit"
        Else
            st = "Average Profit"
        End If
        Return st
   End Function
5. Right-click on the textbox and go to expressions.

6. To call the function written in the custom code window, you will have to enter Code.FunctionName. In our example the expression would be something like this:
=Code.Test(Fields!ProductProfit.Value)
7. The final output should be something like this.

 Using Custom Assemblies:
Custom assemblies can be created using a class library project to provide more advanced functionality for your reporting solution. The reference to the assembly containing the function to be used in Reporting Services is given in the References section of the report properties. To do this, create a class library project called TestClass and add the below code to it:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace TestClass
{
    public class Class1
    {
        public static int Cal(int a, int b)
        {
            return a + b;
        }
    }
} 
 
Compile the code to generate the TestClass.dll file. To access the code inside this file we need to place this dll in the default location of the Report Designer, which is \Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies.
To add a reference to the file inside the report perform the following steps:
1. Select Report-> Report Properties from the menu.

2. Select the References tab from the report property window

3. Add a reference to TestClass.dll by browsing to the default Report Designer location, i.e. \Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies

4. Add the following code in the expression of the textbox of the tablix.

=TestClass.Class1.Cal(3,4)
This expression calls the function defined in the referenced assembly. Here you don't need to use the Code keyword to access a function defined in a custom assembly. You can also create an instance of the class from the custom assembly, but this is optional, as you can access the function using namespace.classname.functionname inside the expression as shown above.
5. The output should be as follows

For the custom code to work after deploying the report, you will need to paste the TestClass.dll file into the bin folder of Reporting Services, whose default location is:
\Program Files\Microsoft SQL Server\Instance_Name\Reporting Services\ReportServer\bin
Finally, just right-click on the solution and deploy it to the report server to view it in the production environment.
 

Design A Hybrid Report in SSRS


Hybrid reports are reports that combine two chart types using different axes within the same graph.
For example:

In the above report, Product cost is on the right axis while Sales Amount Percentage is shown on the left axis, so that viewers can easily compare Product Cost vs Sales Amount Percentage.
Below is the query which is required for the above graph.
WITH TotalSales ( SumTotal)
AS(
SELECT  SUM(S.SalesAmount)
FROM dbo.FactInternetSales S
)
SELECT T.CalendarYear,SUM(S.TotalProductCost) TotalProductCost,
100 * SUM(S.SalesAmount)/ MAX(TotalSales.SumTotal)  SalesAmountPercentage
FROM dbo.FactInternetSales S
INNER JOIN DimTime T ON S.OrderDateKey = T.TimeKey
CROSS JOIN TotalSales
GROUP BY T.CalendarYear
ORDER BY T.CalendarYear
You can use the AdventureWorksDW sample database to get the above data.
To start the report, create an SSRS project and add a data source and a dataset with the above query.
Next, add a chart control to the report. So your report should look as below:

Then drag the Calendar Year to the X axis and TotalProductCost and SalesAmountPercentage to the graph area, and your report should look as below:

Then select the SalesAmountPercentage series as shown in the above graph and change the graph type to Line by right-clicking and selecting Change Chart Type.
You will see the following configuration; if you look closely you will see that the chart type has changed in the chart data.

Right-click and select Series Properties and you will be taken to the following dialog:

From here you can select the Secondary option from the Axes and Chart Area.
Finally, you can change the marker to complete the process.

Implementing Transactions in SQL Server – Part II


In Implementing Transactions Part I I briefly described the role of Transactions in SQL Server and outlined a very basic implementation. In this second part, I will explain how a DBA can best implement Transactions in scripts that are to be deployed on production databases.
One of the regular tasks of a DBA is to generate database schema change scripts, and then deploy the scripts to SQL databases. If an organization is not using a third-party tool, as is common, then Database Professional (part of Visual Studio from Microsoft) is normally used to accomplish this task. In many companies, the process of generating schema change scripts is a daily routine: database creation scripts are generated from TFS, a schema compare is performed between the previous and the latest build, and the schema update script is generated. The only issue is that these scripts do not run as a single transaction, because the generated scripts do not have explicit transactions defined. Unfortunately it is not simply a matter of defining a transaction using BEGIN TRANSACTION and, based on the @@TRANCOUNT variable at the end of the script, performing either a rollback or a commit. This is because after every DDL statement DBPro inserts a GO statement, causing each statement to run as a separate batch, and explicit transactions do not span multiple batches. A sample script generated by DBPro would be similar to this:
PRINT N'Creating [dbo].[test1]'
GO
CREATE TABLE [dbo].[Test1]
(
[Col1] [bigint] NOT NULL IDENTITY(1, 1),
[Col2] [int] NOT NULL,
[Col3] [varchar] (50) NOT NULL,
[Col4] [varchar] (50) NOT NULL
)
GO
PRINT N'Creating primary key [PK_Test1] on [dbo].[Test1]'
GO
ALTER TABLE [dbo].[Test1] ADD CONSTRAINT [PK_Test1] PRIMARY KEY CLUSTERED  ([Col1])
GO
PRINT N'Creating [dbo].[usp_SP1]'
GO
CREATE PROCEDURE [dbo].[usp_SP1]
AS
…..
ALTER TABLE [dbo].[Test2] ADD [Col1] VarChar(100)NULL
GO
If there are no syntax errors in any of the DDL statements, then all the statements will run successfully. If, however, any of the statements results in an error then we have issues. Let’s say, the sample script above fails at the following statement because the table dbo.Test2 doesn’t yet exist on the database:
ALTER TABLE [dbo].[Test2] ADD [Col1] VarChar(100) NULL
GO
In this case, all the statements before this one in the script would have run successfully and committed their schema changes to the database, while the remaining statements, including this one, will not update the database, leaving it in an unstable state. One solution is to fix this statement and run only the remainder of the script. This is a manual step, and is fine if scripts are deployed manually to the database. But what if there is a series of scripts being deployed at a time using an automated process?
There are several options to address this:
Option 1: Remove all the GO statements from the scripts and then wrap the entire script within a single explicit transaction. This involves manually editing the files, which may not be feasible if the files are very large with numerous GO statements.
Option 2: Implement transactions. Didn't I mention earlier that it is not possible because of the GO statements? Actually it is possible with a little bit of tweaking, and the use of a SET option in the script. In SQL Server, there is a SET option called SET XACT_ABORT. This option specifies whether SQL Server automatically terminates and rolls back a transaction if a T-SQL statement raises a runtime error. The default is OFF, but if it is set to ON, the entire transaction is terminated and rolled back.
To make use of this, the above sample script can be rewritten as below:
:On Error Exit
SET XACT_ABORT ON
GO
Begin Transaction
      PRINT N'Creating [dbo].[test1]'
      GO
      CREATE TABLE [dbo].[Test1]
      (
      [Col1] [bigint] NOT NULL IDENTITY(1, 1),
      [Col2] [int] NOT NULL,
      [Col3] [varchar] (50) NOT NULL,
      [Col4] [varchar] (50) NOT NULL
      )
      GO
      PRINT N'Creating primary key [PK_Test1] on [dbo].[Test1]'
      GO
      ALTER TABLE [dbo].[Test1] ADD CONSTRAINT [PK_Test1] PRIMARY KEY CLUSTERED  ([Col1])
      GO
      PRINT N'Creating [dbo].[usp_SP1]'
      GO
      CREATE PROCEDURE [dbo].[usp_SP1]
      AS
      …..
      ALTER TABLE [dbo].[Test2] ADD [Col1] VarChar(100)NULL
      GO
If Xact_State()=1
Begin
      Print 'Committing Transaction...'
      Commit tran
End
Else If Xact_State()=-1
Begin
      Print 'Rolling Back Transaction...'
      RollBack Tran
End
Please note the first four lines and the last ten lines in the script.

:On Error Exit
This command causes sqlcmd to exit the sql script upon encountering an error.

SET XACT_ABORT ON
With this statement, if a Transact-SQL statement raises a run-time error, the entire transaction is terminated and rolled back.

Begin Transaction
This statement defines an explicit transaction for the entire sql script.

If Xact_State()=1 
This statement checks for any committable transactions at the end of the script. If there are any, then Xact_State() will be 1, and the transaction will be committed in the IF block.
If Xact_State()=-1 
This statement checks if the transaction introduced in the script is in an ‘uncommittable’ state. By uncommittable, I mean if any error was encountered during the execution of the script then the script cannot continue execution, and all changes introduced in the database up to the point of failure in the script need to be rolled back. This rollback is done in this IF block.
Please note that scripts in which we introduce transactions in this manner can only be run from the command prompt via SQL Server's command-line utility, sqlcmd. If this script is run from Management Studio, it is possible that it will run only partially and commit only those changes where there are no errors in the script. This is because there is a difference in how SQL scripts are executed from SQL Server Management Studio and from the command prompt: SSMS uses the .NET Framework SqlClient for execution in regular and SQLCMD mode in the Query Editor, whereas sqlcmd run from the command line uses the OLE DB provider. Since different default options may apply, the same query can behave differently when executed in SSMS in SQLCMD Mode and when executed using the sqlcmd utility.
This solution is good for implementing transactions in individual scripts when they are called from sqlcmd. But if there is a requirement to run a series of such scripts which should all either commit or all roll back, it is not a good solution.
We have to tweak this approach a little to satisfy this requirement. For this purpose we do not define the transactions in the individual scripts as explained earlier; instead we call these scripts from another SQL script, which I call Wrapper.sql, define the transaction in this wrapper, and call the wrapper from sqlcmd. The script would look like this:
:On Error Exit
SET XACT_ABORT ON
GO
Begin Transaction
:r script_1.sql
:r script_2.sql
:r script_3.sql
If Xact_State()=1
Begin
       Print 'Committing Transaction...'
              Commit tran
End
Else If Xact_State()=-1
Begin
       Print 'Rolling Back Transaction...'
              RollBack Tran
End
Throughout this example the concept has remained the same. The only difference is how and where to define the transaction, and how to call the scripts.
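To deploy, the wrapper is then invoked from the command prompt with sqlcmd, along these lines (the server and database names here are placeholders, and -E uses Windows authentication):

sqlcmd -S MyServer -d MyTargetDatabase -E -i Wrapper.sql

Run it from the folder containing the scripts so that the relative :r paths resolve.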