Isthmus Blog Live

Pura Vida, Amigos!
We at Isthmus are pleased to present our Architecture blog. The idea is to provide more value to our clients by sharing technical information that can be useful for your current or future projects. We will share our experiences with the latest technologies, including the good, the bad, and the ugly, while of course preserving the confidentiality of each project and each client.
We invite you to challenge us with your thoughts, comments, and questions, so that together we grow the knowledge base and we all gain.
Let’s create synergy!
Thank you,

Adolfo Cruz
Delivery Director

Thursday, December 20, 2007

Takes Two to Tango

Since 2006, Sun's Java Web Services engineers have been working together with Microsoft's Windows Communication Foundation (WCF) engineers on integrating their WS-* enterprise features, such as reliable messaging, security, and atomic transactions. The main features targeted by the Tango project are:

* Bootstrapping communication
* Optimizing communication
* Enabling reliability
* Enabling atomic transactions
* Securing communication

Ever since web services appeared, they have been presented as the solution to system integration. That statement is only partly true: the range of languages and platforms led to a wide variety of implementations, each inheriting differences from the platform that gave birth to it. Still, whatever I might say about web services, from a business perspective they managed those differences and ended up doing the job of overcoming the technical challenge of system integration. That ability, however, came at a price: either some customization is needed for cross-platform web service integration, or only systems written on the same platform can be integrated, leaving us halfway to what was promised: full cross-platform system integration.

Customization to accommodate platform differences should never have been a problem, since anybody implementing a web service infrastructure should follow the de facto standards. Yet even when implementors adhere to the standards as closely as possible, the platform itself can introduce differences in the final product that make it incompatible at some level with other implementations.

We decided to try out the Tango implementation using NetBeans 6.0. The main advantage of using NetBeans as the IDE is that it generates a lot of boilerplate code, which speeds up the development of web services and clients in Java. First we create a new Web Project for a simple book bid web service; this service emulates a business application that searches for books by name and returns the price in dollars to the client. Let's see how easy it is to create the web service in the following images.

We first create the project and select a name; for this example I decided to call it BidBook.



Then I added a new Web Service, selected the name BidBook and the package, and clicked Finish.



After the web service has been created, a new operation has to be added in the design screen of the BidBook web service.


I added a new method named BidBook with a single parameter, bookName, and a return type of float to return the price in dollars. Then we just click the OK button and NetBeans creates all the XML descriptors and the correctly annotated Java classes and methods corresponding to the new web service.


At this point we are almost ready to use our newly created web service; however, before going any further, let's add a bit of business logic to it. We switch from Design view to Code view and customize the generated code, shown in


with something like what is shown in the following image.


If we return to the Design view, the newly created method, its comments, parameters, and so on will be displayed in the view.


All the steps so far are pretty much supporting tasks for the main one at hand, which is adding QoS to a web service. Again NetBeans lends us a hand with the job of adding the configuration for the features we want to enable for this particular web service. In its Design view we select the options Reliable Message Delivery and Secure Service; we select these two just for demo purposes and leave all advanced features at their default values.



What NetBeans does for us is add the corresponding information to the XML descriptor of the web service; later this information will be exposed, along with the methods, in the WSDL file used by consumers of the web service.


And we are done with our service: it now has reliable message delivery and security. Before moving away from the Java/J2EE world, let's test this web service. To do so, we first create a new Java Application project named ClientBidBook and add the corresponding package; then we add a new Web Service Client.



After NetBeans connects to the server and reads the WSDL contract for the selected service, a new proxy class is created for us to use. This class is concealed by NetBeans; what it shows instead is a tree view with all the available methods, and a drag and drop into the main method of the application is enough for NetBeans to create the necessary code to use the method.



Finally we start the application server and deploy our web service project (as easy as clicking the Run button in NetBeans), then we run our client application and get the following result:


Now let's consume this web service from a .NET 3.0 application. Using Visual Studio 2005 on a box with the .NET Framework 3.0, we create a simple console application named ClientBidBook, as seen in the following image.


Next we just add a Web Reference pointing to the URL of the running application server hosting our BidBook web service. After adding the reference, all the proxy classes needed to consume the service are created in our project.



The next step is simply using the proxy class in our project: we instantiate the BidBookService class and call the BidBook method with a string containing the name of the book.


For testing purposes, we write the result to the console and wait for a key press so we can see the output.
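
The resulting client code looks roughly like the sketch below. The namespace of the generated proxy depends on the name chosen for the Web Reference, and the book title passed to BidBook is just an illustrative value; only the BidBookService class and its BidBook method come from the service contract above.

using System;
// Namespace generated by Visual Studio for the Web Reference (assumed name)
using ClientBidBook.BidBookReference;

class Program
{
    static void Main(string[] args)
    {
        // Instantiate the proxy class generated from the WSDL
        BidBookService service = new BidBookService();

        // Call the remote BidBook operation with a book name (illustrative)
        float price = service.BidBook("Pura Vida Cookbook");

        // Write the result to the console and wait for a key press
        Console.WriteLine("Price in dollars: " + price);
        Console.ReadKey();
    }
}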



And now we are done. From the previous demo I wanted to outline the following remaining points:

* Integration is feasible; however, complex data types should be thoroughly tested to ensure seamless integration (we didn't test them in this short demonstration).
* Security and Reliability should be tested and confirmed.
* The business logic behind the web service should be part of a business module, and it can be as simple as exposing previously built applications instead of creating a whole new infrastructure.

A look into LINQ to SQL

In November 2007, Microsoft released the .NET Framework 3.5, and among the several additions incorporated in this release, probably the most significant is Language Integrated Query (LINQ), which provides several enhancements to the way applications can be developed.

In this post we will take a look into LINQ to SQL, which is the approach provided by LINQ to help developers accelerate the development of the data access layer.

Probably the three most critical aspects of the data access layer are transactions, concurrency, and efficiency. We will take a look at how LINQ to SQL supports them.

Getting started

In order to see how to use LINQ, we will create a simple application that allows us to create, edit, and delete books, and to assign categories to them. The database has four tables and one stored procedure.

The first step is to create a new Class Library Project (Our data access layer project) and add a new LINQ to SQL class called Library.dbml to it.


As a next step, we need to connect to the Microsoft SQL Server 2005 database using Server Explorer in Microsoft Visual Studio 2008.

This way we can see in our IDE all the tables and stored procedures available in our database.

We need to drag and drop all of these tables into the left side of the design surface of the Library.dbml file we created.

We also need to drag and drop the stored procedure into the right design surface of the Library.dbml file.

Once this is done, our Library.dbml file should look like this:


When we dragged and dropped all this information into our dbml file, what we did was to create a DataContext that maps all the information related to the database itself, by using attributes inside our .NET code. If we want to see the code generated by the designer, we only need to open the Library.designer.cs file nested under the Library.dbml file in our Solution Explorer. As you may imagine, there are many attributes that can be used and specified for our applications to customize the database access, but for our example the default values will work just fine.
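
To give an idea of what the designer generates, here is a heavily trimmed sketch of that kind of attribute mapping; the real Library.designer.cs file also contains storage fields, change tracking, and notification code, and the column names here are illustrative.

using System.Data.Linq;
using System.Data.Linq.Mapping;

// Simplified sketch of the DataContext generated from Library.dbml
public partial class LibraryDataContext : DataContext
{
    public LibraryDataContext() : base("<connection string>") { }

    // Each mapped table is exposed as a Table<TEntity> property
    public Table<Book> Books
    {
        get { return this.GetTable<Book>(); }
    }
}

// Each table is mapped to an entity class through attributes
[Table(Name = "dbo.Books")]
public partial class Book
{
    [Column(IsPrimaryKey = true, IsDbGenerated = true)]
    public int id { get; set; }

    [Column]
    public string name { get; set; }
}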

This data context will allow us to use simple entities to access the database without having to write SQL code, as we will see ahead. This shows us how easy it is to create a functional representation of our database, even without having to write code for our Data Access Layer.

Next we create a new Class Library Project (Our business layer project) and add a new C# class called BookBL.cs to it.

Here we will code all methods required by the presentation layer, in order to execute business operations and access the database layer.

A method to add a new Book will look like this:

public Book addBook(Book book)
{
    LibraryDataContext db = new LibraryDataContext();
    db.Books.InsertOnSubmit(book);
    db.SubmitChanges();
    return book;
}

Here we create a new instance of the data context file we just created (Library.dbml). We know it contains a Books entity because we mapped the Books table into our data context. LINQ to SQL provides the method InsertOnSubmit in order to automatically store the book information for us, so we just call this method to add the new book. At the end, we execute the method SubmitChanges, also provided by LINQ to SQL, in order to commit our changes into the database. As you may see, we haven’t written any SQL code, and the LINQ to SQL framework has taken care of most of the data access work.

Another interesting operation is to query the database. For this we need to create a method called findByCategory in our BookBL.cs class that looks like this:

public List<Book> findByCategory(int id)
{
    LibraryDataContext db = new LibraryDataContext();
    var books =
        from b in db.Books
        where b.Category.id == id
        select b;

    return new List<Book>(books);
}

Here we are using LINQ syntax to query the database, and we are storing the retrieved information in a variable declared with var. This isn't a Variant datatype as in the old days of VB 6; instead, the variable is strongly typed with the specific type resulting from the query (in our example, it will be typed as IQueryable<Book>). The compiler restricts what we can assign to this variable, and IntelliSense works for it as well.
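
For clarity, the same query could be declared with the explicit type instead of var; this is a minimal equivalent fragment for the body of findByCategory (IQueryable<T> lives in the System.Linq namespace):

// Equivalent declaration with the explicit type instead of var
IQueryable<Book> books =
    from b in db.Books
    where b.Category.id == id
    select b;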

Another interesting thing to mention is that LINQ to SQL will allow us to execute stored procedures using LINQ syntax. So for example if we wanted to rent a book using our rentBook stored procedure, we could invoke it this way:

public void rentBook(int id)
{
    LibraryDataContext db = new LibraryDataContext();
    db.rentBook("anonymous", id);
}

Transactions

LINQ to SQL also helps developers in this point. By default, when the SubmitChanges method is called, if a transaction is not already in scope, the SubmitChanges method will automatically create a new transaction. All database operations executed during a single SubmitChanges will be wrapped into a single transaction, and as such, if any error occurs, the whole operation is aborted.

If we need to handle transactions at a higher level, or maybe even nest several SubmitChanges calls into a single transaction, we can make use of System.Transactions, as in the following example:

public Book addBook(Book book)
{
    using (TransactionScope ts = new TransactionScope())
    {
        LibraryDataContext db = new LibraryDataContext();
        db.Books.InsertOnSubmit(book);
        db.SubmitChanges();

        // Mark the scope as complete; otherwise the transaction
        // rolls back when the TransactionScope is disposed
        ts.Complete();

        return book;
    }
}

Concurrency

It is possible to implement both optimistic and pessimistic concurrency for conflict resolution when using LINQ to SQL.

By default LINQ to SQL takes care of optimistic concurrency, providing two different options to handle it.

One option is to provide a timestamp column in the table where conflicts are expected. With this approach, the timestamp column is automatically updated with every insert or update committed to the database. When someone tries to update a record by executing the SubmitChanges method, LINQ to SQL checks whether the timestamp provided is still valid. If it differs from the current timestamp, it means someone else updated the record in the meantime and the data needs to be refreshed, so a ChangeConflictException is thrown.
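
As a reference, the mapping for such a column on the Book entity looks roughly like the sketch below; the Version property name is illustrative, and IsVersion = true is what tells LINQ to SQL to use the column for conflict detection.

[Table(Name = "dbo.Books")]
public partial class Book
{
    // Hypothetical rowversion/timestamp column used only for conflict detection
    [Column(IsVersion = true, IsDbGenerated = true, AutoSync = AutoSync.Always)]
    public System.Data.Linq.Binary Version { get; set; }
}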

So, a proper way to do an update operation is listed below, where we handle any possible ChangeConflictException:

public Book updateBook(Book book)
{
    LibraryDataContext db = new LibraryDataContext();
    db.Books.Attach(book, true);
    try
    {
        db.SubmitChanges();
    }
    catch (ChangeConflictException cce)
    {
        throw new DataOutOfSyncException();
    }

    return book;
}

The other option for optimistic concurrency is to specify which columns of a table should be checked for conflicts. If any of the listed columns differs when making an update, a ChangeConflictException will be thrown.
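
A sketch of that approach, with illustrative column names: each column mapping carries an UpdateCheck value that controls whether it participates in conflict detection.

[Table(Name = "dbo.Books")]
public partial class Book
{
    [Column(IsPrimaryKey = true, IsDbGenerated = true)]
    public int id { get; set; }

    // Compared against its original value on update; if it changed in the
    // database in the meantime, a ChangeConflictException is thrown
    [Column(UpdateCheck = UpdateCheck.Always)]
    public string name { get; set; }

    // Columns marked Never are ignored when checking for conflicts
    [Column(UpdateCheck = UpdateCheck.Never)]
    public string description { get; set; }
}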

To handle pessimistic concurrency, the only thing that needs to be done is to read the record and then update it, executing both operations inside a single System.Transactions transaction.
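
A minimal sketch of that idea, reusing the Book entity from before (the method and parameter names are illustrative): because the read and the update run inside the same TransactionScope, whose default isolation level is Serializable, the row stays locked until the scope completes.

public void renameBook(int id, string newName)
{
    using (TransactionScope ts = new TransactionScope())
    {
        LibraryDataContext db = new LibraryDataContext();

        // The SELECT runs inside the transaction, so the row remains
        // locked for other writers until the scope completes
        Book book = db.Books.Single(b => b.id == id);

        book.name = newName;
        db.SubmitChanges();

        // Commit; without Complete() the transaction rolls back on Dispose
        ts.Complete();
    }
}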

Efficiency

Object Relational Mapping solutions have always faced the challenge of generating efficient SQL statements. Since this technology is not new, great improvements have been made toward this goal. Even so, it is well known that for particular database operations that demand heavy optimization, the best solution is to manually create and tune the SQL statement.

LINQ to SQL provides a way to print out the SQL statements it generates, so the developer can decide whether the generated statements are good enough for the application or manual intervention is required.
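
For example, the DataContext exposes a Log property (a TextWriter) that echoes every generated SQL statement; a quick way to inspect them during development is sketched below, with illustrative method and property names.

public void printGeneratedSql()
{
    LibraryDataContext db = new LibraryDataContext();

    // Send every SQL statement LINQ to SQL generates to the console
    db.Log = Console.Out;

    var books =
        from b in db.Books
        select b;

    // The SELECT statement is printed when the query actually executes
    foreach (Book b in books)
        Console.WriteLine(b.name);
}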

Conclusions

LINQ to SQL is a very powerful solution that allows us to reduce the amount of code, and hence the amount of time invested in developing applications. The data access layer is considerably automated, and operations such as transactions and concurrency are easier to handle.

All this ease of development comes at the price of potentially unoptimized SQL statements being generated. When choosing LINQ to SQL as the core data access technology, we should study the generated SQL statements to identify whether or not we need to enhance them manually. Depending on the application, the generated SQL statements may be perfectly good for the job.

Object Relational Mapping (ORM) solutions are not new, but LINQ to SQL is. This means that it lacks many of the features that more mature frameworks such as Hibernate provide. To tackle this, Microsoft is developing the ADO.NET Entity Framework as a separate addition to the .NET framework. This new framework will provide more functionality to the mix, while making development a little bit more complex.

This kind of ORM solution has dominated the Java world for enterprise applications for several years now. It won't come as a surprise if something similar happens with .NET applications, so we should pay close attention to these kinds of solutions.