Jamie Laflen
Barclay Hill
SQL Server Developer Tools, or SSDT (code-named “Juneau”), represents
Microsoft’s continued commitment to providing integrated tools for
developers targeting Microsoft SQL Server. Those familiar with the
previous versions of the Database Project in Visual Studio will find
that Juneau is an evolution of those tools for SQL Server, plus many new
capabilities and improvements. Juneau provides a unified toolset that
combines the projects found in SQL Server Business Intelligence Design
Studio (BIDS)—including Reporting, Analysis and Integration Services
projects—with the SQL Server Database Project. While you can install Juneau independently and have everything you need to develop databases for SQL Server, it’s also an integral part of Visual Studio, meaning you can now perform your database development in the same environment as your application development. Juneau is available as part of the SQL Server 2011 release (code-named “Denali”), but will also be available with future versions of Visual Studio.
This article focuses on the Juneau Database Project. This project system and its related features provide the tools to edit, compile, refactor and publish databases to specific versions of SQL Server and SQL Azure. In Juneau, there’s one database project for all versions of SQL Server, and it can include both Transact-SQL (T-SQL) scripts and code that defines SQL CLR objects. Let’s start by taking a look at setting up a database project.
The Database Project
The Database Project is a Visual Studio project that enables offline development of SQL Server databases. A database development team moves to project-based development for the benefits it offers over developing against a shared live database, including:
- Isolation of developer changes
- Rich T-SQL editing support
- Verification of source prior to deployment and enforcement of team coding standards through code analysis rules
- Automated migration-script generation
Setting Up a Project
When you first create a database project, it appears empty, like most other projects created by Visual Studio. This means you can add source code and then build and debug prior to checking it into source code control. Unlike most other projects where you must start from source code, a database project can be created from an existing SQL Server database. There are several ways to populate a project with source code, depending on the level of control you want:
- Import a complete database
- Import scripts
- Import a database package (.dacpac)
- Write updates for specific objects from a comparison of a project with a database
- Drag and drop from Server Explorer
- Convert an existing Visual Studio 2010 Project (Database/SQL CLR) into a SQL Server Database Project
To introduce the database project, let’s start with a new empty database project and walk through its various features. There are several ways to create a database project. Because we’ll start with an empty project, just click File | New | Project and then select the SQL Server Database Project, as shown in Figure 1.
Figure 1 Creating a New SQL Server Database Project
Adding Source to Your Project
Most databases have at least one table, so let’s add one now. The SQL Server Database Project provides default T-SQL templates for many of the commonly used SQL objects. To create a new object, right-click the project node and select Add | Table, then specify the name of the table in the dialog; we’ll use “Customer.” Figure 2 shows what a new table looks like in the new table designer.
Figure 2 The New Table Designer
Like the HTML designer, the table designer defaults to a split-pane view. The designer window contains a graphical representation of the table as well as the table’s script definition. No matter which view you choose to work in, your changes will be synchronized with the other view.
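The script pane for a brand-new table starts from a simple template along these lines (a sketch based on the default template; the Id column and the exact formatting are assumptions, and you can edit either pane freely):
create table dbo.Customer
(
	Id int not null primary key
)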
Often, direct access to a table is restricted and you’ll want to write a stored procedure to give an application programmatic access to the table’s data. With this in mind, we’ll add a stored procedure called “CreateCustomer” the same way we added the Customer table. The editor opens and you’re presented with a default procedure skeleton to populate. This might seem a little daunting because many developers write their stored procedures against a live database so they can execute and validate the code as they write it. But don’t worry; the database project creates a new debug database to make project-based development much more productive. Within the editor you can select some text and right-click to get the context menu shown in Figure 3.
Figure 3 Executing Project Source Code Against the Debug Database
How does Execute Query work when you haven’t configured a connection string? When any source code is opened, the editor is configured to execute against the debug database when asked to execute the script or selected text. In addition, the debug database is updated when you start debugging (for example, by hitting F5 or Ctrl+F5) to match the source of your database project. Juneau leverages a new feature in Denali called SQL Server Express LocalDB to automatically provide an instance and databases to develop and debug against. For more information, see “Introducing SQL Server Express LocalDB” below.
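To make the rest of the walkthrough concrete, here’s one way the CreateCustomer skeleton might be filled in. The parameter names and the assumption that Customer has only an Id column come from the default templates, not from the article’s actual code:
go
create procedure dbo.CreateCustomer
	@param1 int = 0,
	@param2 int
as
	-- minimal body: insert a new row into the Customer table created earlier
	insert into dbo.Customer (Id) values (@param1)
return 0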
Developing Your Source
Once you’ve looked at LocalDB and seen how you can develop stored procedures within your source code, you’ll want to know about some other cool features available in the Database Project. A typically unsung hero, IntelliSense inside the project system has been improved based on feedback about both SQL Server Management Studio (SSMS) and the Database Project. Many of the changes just smoothed out previous rough edges, but a few are significant:
- Preview mode: One of the biggest complaints about T-SQL IntelliSense centered on unintended completions when referencing yet-to-be-defined identifiers. For example, you may have wanted to type:
select t.id from Table1 as t
but as soon as you typed
select t
and pressed the period key, the completion list would commit its closest match and leave you with:
select Table1.
In preview mode, IntelliSense only highlights the closest match and doesn’t commit it until you explicitly accept it, so the alias you’re about to define is left alone.
- IntelliSense with unsaved files: In earlier database projects, you needed to save a file before the objects defined within were available to IntelliSense. For example, previously a newly added table would not appear in IntelliSense until the table’s file was saved. Now, the table will appear in IntelliSense even if the file is not saved.
As you know, refactoring databases is a lot harder than refactoring application code, because databases have data: a seemingly innocuous name change can ripple into many additional changes because of dependencies from other objects, or, worse still, cause data loss when the changes are deployed. Juneau records refactoring operations performed inside the project in a refactor log (MsdnDemo.refactorlog in our case) so that it can take those operations into account and generate the appropriate sp_rename or ALTER SCHEMA statements.
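For example, a column rename recorded in the refactor log is later deployed as a rename rather than a drop-and-recreate. For the Customer table’s Id column, the generated statement would be roughly equivalent to this simplified sketch (not the verbatim generated script):
exec sp_rename 'dbo.Customer.Id', 'CustomerId', 'COLUMN';
go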
In managed languages, building your project is the final step before running the new code. In database development, the analog would be creating all the database objects in your project within a running instance. To bring full database verification to a database project’s build, Juneau leverages another new feature in SQL Server Denali called SQL Server T-SQL Compiler Services. This component provides complete verification of your source code without the need to execute that source against a live SQL Server. To see this in action, add the following code after the CreateCustomer procedure:
go
create table dbo.ExtendedVerificationDemo
(
c1 int null,
c2 as c2 + 5
)
When the project is built, the compiler services detect the invalid computed-column definition and report the following error:
E:\projects\MsdnDemo\MsdnDemo\CreateCustomer.sql(12,1): Error: SQL01759: Computed column 'c2' in table 'ExtendedVerificationDemo' is not allowed to be used in another computed-column definition.
A couple of points to keep in mind about this extended verification:
- The verification engine is built on the SQL Server Denali engine and enforces that engine’s rules. If your project uses deprecated syntax that has been removed in SQL Server Denali, it will fail verification.
- The verification engine doesn’t resolve objects referenced by a fully qualified three- or four-part name. It will flag a warning or an error based on the SQL Server deferred name resolution rules, which you can read more about at bit.ly/pDStXE (see the sketch below for an example).
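As a hypothetical illustration (the ArchiveCustomers procedure, OtherDb database and CustomerArchive table exist only for this example), a procedure body that references another database by a three-part name builds with a warning rather than an error, because SQL Server defers resolving such names until the procedure runs:
go
create procedure dbo.ArchiveCustomers
as
	-- OtherDb isn't part of this project, so the reference can't be fully verified at build time
	insert into OtherDb.dbo.CustomerArchive (CustomerId)
	select Id from dbo.Customer
return 0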
After all the build-time warnings and errors have been addressed, it’s time to verify that your code actually works.
Verifying Your Source
Once you’ve built your project, you need to run (and possibly debug) your code before you check it into source code control. You have several options depending on where you are in the development cycle and what you want to do, as indicated in Figure 4.
Figure 4 Options for Verifying Your Code
- You’ve made several changes and want to debug.
- You have an application project (Web application or .exe) that uses stored procedures that you’ve changed (and may want to debug).
- You want to debug your database using an application that accesses your debug database.
- You’ve made several changes and you want to test (and possibly debug) the changes.
Migrating Changes
When your code reaches a stable point, it’s time to migrate it to an instance to be tested and eventually used by your customers. Juneau has an incremental update engine that will generate an update script based on the difference between your source and the target database. Although there’s one underlying engine, it’s exposed in three different ways within Juneau to support migration of your changes, as Figure 5 describes.
Figure 5 Options for Migrating Your Code
- Promote changes to debug database
- Publish changes to a separate database
- Compare schema and move changes between databases
The update engine’s behavior can be controlled through a set of options, summarized in Figure 6.
Figure 6 Options for Controlling Updating Behavior
Behavior | Option | Result
Data loss | Block incremental deployment if data loss might occur | The incremental update engine will halt execution if a table has data and the script later makes a destructive change, such as dropping a column or narrowing a type (bigint to int). (See the script sketch after this table.)
Data loss | Back up database before deployment | The engine will script a backup of the database prior to performing any changes.
Generating drops | Always recreate database | The update script will be written to detect whether the database exists and, if it does, to set it into single-user mode and drop it.
Generating drops | Drop objects in target but not in source | This option is a “hammer” that drops all objects in the target database that are not also in the source. This option overrides any data-loss options.
Generating drops | Drop constraints, indexes, DML triggers | Treats constraints, indexes and DML triggers as if they’re part of the table or view; thus, removing these objects from the project causes the corresponding object in the target to be dropped.
Generating drops | Drop permissions, extended properties | Similar to the previous option, treats these objects as if they’re part of their parent; thus, absence in the source causes target-only objects to be dropped.
Verifying new constraints | Check new constraints | When new foreign key and check constraints are created, they can “check” that existing data conforms to the constraint. The data check for new constraints is deferred to the end of the deployment script so that a constraint violation doesn’t cause an error in the middle of the deployment.
Transactions | Include transactional scripts | When this option is enabled, the engine attempts to wrap the generated script within a transaction. Because some objects can’t be managed within a transaction, it’s possible that a portion of the script will not be within a transaction.
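To give a sense of what the data-loss option produces, the generated update script wraps destructive changes with a guard similar to the following (a sketch of the pattern rather than the verbatim output; the ExtendedVerificationDemo table is just the example at hand):
if exists (select top 1 1 from dbo.ExtendedVerificationDemo)
	raiserror (N'Rows were detected. The schema update is terminating because data loss might occur.', 16, 127) with nowait
go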
To publish the project:
- Open the background activity window by clicking on View | Other Windows | Database Activity Monitor.
- Right-click on the project and select Publish.
- Populate the Publish dialog, as shown in Figure 7.
- Click Publish.
Figure 7 Publish Database Dialog
When publishing has completed, open the publish step and click on the View Plan link. A report like the one in Figure 8 will be displayed.
Figure 8 The Deployment Preview Report
As you can see, the report indicates that the two tables (we fixed the bug in the ExtendedVerificationDemo table) and procedure we wrote earlier will be created when the script is executed. The highlights section is empty, but you can see that the report will highlight actions that could cause a significant performance impact or data loss. If we had only generated a script rather than publishing, we could use this report to help verify the script prior to execution.
Connected Development
So far, we’ve talked about ways to use the Database Project to develop our code declaratively outside the context of a running database. All this technology is extremely useful when working in team environments, but sometimes you just want to interact with a live server! Microsoft knows that live servers are important to developers, and it has invested in a rich, developer-oriented, connected experience. To demonstrate, let’s open the database we just published to. To connect to that database:
- Click on the View menu.
- Select Server Explorer.
- Right-click on the SQL Server node.
- Select Add SQL Server.
- Populate the connection dialog with (localdb)\v11.0.
- Navigate to the just-published MsdnDemo database, as shown in Figure 9.
Figure 9 MsdnDemo Database in the New SQL Server Node in Server Explorer
The first thing you might notice is that the tree looks very similar to the tree in SSMS. It should; the team wanted to limit the amount of relearning necessary when moving to Juneau. You can open a T-SQL Query editor right from the tree and begin writing code as you normally would; this editor is the same enhanced editor we talked about earlier. Unlike some of the SSMS capabilities, the Juneau tree is expressly meant for developer-oriented actions. For instance, if you right-click the table Customer and select Delete, you’ll see a dialog that presents the same preview report you saw earlier. However, in this case there’s a warning that the procedure dbo.CreateCustomer will be broken if the drop is executed, as shown in Figure 10.
Figure 10 Update Database Preview Dialog
** Warnings
The object [dbo].[CreateCustomer] will be broken if the
change is executed.
** Highlights
Tables that will be rebuilt
None
Clustered indexes that will be dropped
None
Clustered indexes that will be created
None
Possible data issues
None
** User actions
Drop
[dbo].[Customer] (Table)
** Supporting actions
If you cancel the delete and then right-click on the Customer table and select View Designer, you’ll see the same table designer you became familiar with in the project system—except this time it’s hosted over the definition of the table retrieved from the server. As you might expect, the designer has a CREATE table statement using the same declarative programming model as the project system. In the designer, rename the Id column to CustomerId and add a second int column called Age. Now if you look in the error list, you’ll see a warning that CreateCustomer has been broken by the column rename, as shown in Figure 11.
Figure 11 Error Resulting from Column Rename
To fix this error you can either View Code on the CreateCustomer procedure or just double-click the warning and modify the insert statement to update the column names and supply @param2 as the value for the Age column. At this point, you have two windows (one source, one designer) with declarative object definitions retrieved from the database defining a set of changes to the objects on the server. If you click on the Update Database button, you’ll see the now-familiar report that will tell you that the column will be renamed and both the table and procedure will be altered. Execute the script by clicking on Update Database and then navigate to the Customer table in Server Explorer. You’ll see the tree update to include the CustomerId and Age columns.
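For reference, the script behind that Update Database click is roughly equivalent to the following sketch (simplified; the actual generated script includes additional checks and batching):
exec sp_rename 'dbo.Customer.Id', 'CustomerId', 'COLUMN';
go
alter table dbo.Customer add Age int;
go
alter procedure dbo.CreateCustomer
	@param1 int = 0,
	@param2 int
as
	insert into dbo.Customer (CustomerId, Age) values (@param1, @param2)
return 0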
The connected development experience in Server Explorer gives you a great deal of power by supporting the same declarative programming model and by enhancing online changes with real-time warnings and errors as you make them.
Getting the Bits
Juneau is available in the SQL Server Denali CTP3 release. It will also be available as a standalone Web download, and will integrate into existing installations of Visual Studio 2010 and the next version of Visual Studio. Moreover, Juneau has a separate command-line tools installer for publishing databases without requiring Visual Studio to be installed, and for Team Foundation Server automated-build scenarios.
Introducing SQL Server Express LocalDB
SQL Server Express LocalDB (or LocalDB for short) is basically the next generation of SQL Express User Instances, but without the need to explicitly manage a SQL Express instance on your desktop. LocalDB doesn’t have a background service hosting a named instance; rather, it provides a way for developers to define custom named instances and then interact with them. When a LocalDB instance is started, it runs as a process under the credentials of the user who started it. For example, if you start a LocalDB instance, in Task Manager you’ll see a sqlservr.exe process running under your own credentials. This by itself is cool because it means no setup to debug your T-SQL code! Once the instance is started, it’s just like a SQL Express instance. When the instance isn’t used for a period of time, it will shut down so that an idle SQL Server instance doesn’t consume resources on your machine.
A LocalDB installation comes with a command-line tool that can be used to manage the LocalDB instances you’ve created. For example, you can create a new LocalDB instance called LocalDBDemo by executing the following:
C:\Program Files\Microsoft SQL Server\110\Tools\Binn>SqlLocalDB.exe create LocalDBDemo
Local Database instance "LocalDBDemo" created with version 11.0.
Once the instance has been created, you can start it using the command-line tool or you can just attempt to connect to the instance through Juneau, SQL Server Management Studio (SSMS) or your application. Because LocalDB instances aren’t resident instances, they can’t be accessed using the (local) prefix used to address instances on the local machine. Instead, use (localdb); in this example you’d type (localdb)\LocalDBDemo into Juneau to connect to and manage the instance.
When you create a new instance, a new directory is created within your user profile and the four built-in databases (master, tempdb, msdb and model) are placed within it. Any new databases go in the instance’s directory as well, because it’s the default data path. In our example, the directory is:
C:\Users\user1\AppData\Local\Microsoft\Microsoft SQL Server Local DB\Instances\LocalDBDemo
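Once connected, a quick way to confirm that new databases land in that directory by default is to create one and inspect its files (MsdnScratch is just a hypothetical database name):
create database MsdnScratch;
go
select name, physical_name
from sys.master_files
where database_id = db_id('MsdnScratch');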
If you don’t want to create a custom instance, you can use the LocalDB built-in default instance named v11.0. To access the instance just register a connection to “(localdb)\v11.0” in Juneau, and the instance will be automatically created for you by LocalDB.
Because LocalDB is new, a patch to the Microsoft .NET Framework 4 is required to use the (localdb) prefix when accessing an instance through SSMS or managed application code. Installing Juneau includes this patch. Support for the prefix will be built into the next version of the .NET Framework.
Database developers face a problem similar to the one Web developers faced (and which IIS Express solved for them): how to develop and debug code that requires a server to execute, without running a full server product locally on the development machine. LocalDB provides that solution for database developers. Because LocalDB is a crucial part of database development, it’s installed on your desktop when you install Juneau. When a database project is added to a solution, Juneau creates a new LocalDB instance named after the solution and creates a database within that instance for each database project. Juneau creates the data and log file for each database project in a directory within the project’s directory. For more information about LocalDB, see go.microsoft.com/fwlink/?LinkId=221201.