Microsoft’s Windows Azure development platform isn’t exactly new (it was in beta for almost two years), but many developers are still unsure about how to get a pilot started.

Developers should think hard about what they want to achieve in the cloud and whether they want to start from scratch or migrate applications, said Joannes Vermorel, founder of Lokad, a maker of sales forecasting software. Vermorel is a popular Azure blogger and developer.

The cloud is ideal for Web applications that require a high degree of uptime, Vermorel said. “‘Always on’ is nearly impossible in the classic client/server setup.” It is also good for organizations that require extra capacity to handle peak loads, he said.

“We [Lokad] rented grids before Azure. We rented half a dozen powerful Web servers from month to month, never had enough processing power at peak [demand], and most of the time processing power was unused,” Vermorel said.

Other use cases for the Azure platform are business intelligence processing in the cloud, and workloads that would normally be run as batch operations during off hours, Vermorel added. Any move to the cloud should be weighed against the heavy development investment it demands, given the special skills required to create cloud applications, he continued. “Unless there is a significant business shift, developers are better off with a smaller grid, staying on Windows 7.”

However, “People need to be aware that cloud computing is not easy. It took me one year to migrate a .NET application to Azure. It was a steep investment, but one of the best that we’ve ever made,” Vermorel said. Applications that are designed from scratch make the most of Windows Azure, he added; hosting storage in the cloud can “tremendously improve” the scalability of an application.

In the past, if you wanted scale, you bought a new machine, he explained. “The cloud takes a different approach to adding machines. The application needs to be designed to handle additional incremental machines, gradually increasing processing capability as you increase the [virtual] machines supporting the application.”
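The design Vermorel describes — capacity that grows as machines are added — typically means stateless workers pulling from a shared queue, so that starting another worker (another virtual machine) adds throughput without any code change. A minimal sketch of that pattern in Python, using an in-process queue as a stand-in for a cloud queue service (all names here are illustrative, not Azure APIs):

```python
import queue
import threading

# A shared work queue stands in for a cloud queue service. Workers are
# stateless: adding another worker -- another virtual machine in the
# cloud -- increases throughput without redesigning the application.
work_queue = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    """Pull tasks until the queue is drained; any number of these can run."""
    while True:
        try:
            task = work_queue.get_nowait()
        except queue.Empty:
            return
        outcome = task * task  # placeholder for real processing
        with results_lock:
            results.append(outcome)
        work_queue.task_done()

for n in range(10):
    work_queue.put(n)

# Scaling out is just starting more workers; none of them holds state.
threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))  # squares of 0..9
```

Because no worker owns any task or state, four threads here could just as well be forty machines; the queue, not the application, absorbs the difference.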

A big learning curve
Developers who come from a “straight desktop application” background that distributes nothing over a network face the biggest learning curve, according to Vermorel. “The gap will be huge.”

The average .NET developer will have no problem “picking up” basic Azure programming within a day, said Roger Jennings, OakLeaf Systems developer and author of over 30 books on Windows development topics. OakLeaf Systems is a consultancy that specializes in the Azure platform.

Some of Azure’s more sophisticated functions, such as the enterprise service bus in AppFabric, will be sticking points that will take “a bit of getting used to,” he added. “Differentiating between Web roles and worker roles will take conceptual exercise on the part of the developer.”

Jennings compared the difference in programming conventions between the cloud and on-premises servers to an ASP.NET developer having to learn the ASP.NET Model-View-Controller framework.

“There’s a different approach to coding techniques, but a developer will have a good idea of what problems there are after a few sample applications,” Jennings said.

“People tend to underestimate how deeply you need to rethink the way that you design applications when you want to leverage what Azure has to offer,” Vermorel said. When a developer uses multiple virtual machines in the cloud, he or she gives up synchronization, he explained. “People in classical programming expect it every single time that they run an application.”

Every time a cloud application runs, it executes differently, even if its behavior is the same, Vermorel noted. Developers must learn to write idempotent functions, so that applying an operation more than once does not change the result, he said.
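Idempotency matters because cloud infrastructure retries messages after timeouts, and a retried operation must not change the outcome. A small illustrative sketch (hypothetical function and variable names) contrasts a naive update with an idempotent one keyed by an operation ID:

```python
# Two ways to record a credit. The naive version changes state on every
# call, so a retry after a timeout double-applies it. The idempotent
# version keys each operation by a unique ID, so replaying it is safe --
# which is what cloud-style at-least-once delivery requires.

balances = {"acct": 0}
applied_ops = set()

def credit_naive(account, amount):
    balances[account] += amount  # replaying this duplicates the credit

def credit_idempotent(account, amount, op_id):
    if op_id in applied_ops:     # already applied: the retry is a no-op
        return
    balances[account] += amount
    applied_ops.add(op_id)

credit_idempotent("acct", 100, op_id="txn-42")
credit_idempotent("acct", 100, op_id="txn-42")  # retried message
print(balances["acct"])  # 100, not 200
```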

“People are used to client/server-style [programming patterns], which you can find being taught at any university,” Vermorel said. “It’s not as clear for the cloud, and textbooks are limited. The field has not been polished by academics with theories about what to do and what not to do.”

Dealing with latency
However, the Azure services platform might be a boon to some developers. “Hardcore” .NET developers who are familiar with grid computing will even find the Azure platform simpler and more straightforward for building distributed applications, Vermorel said. But in the cloud, “latency is everywhere,” he added.

Running a pilot is the best way for a programmer to get a feel for Azure’s response times, according to Jennings. The response times are slower than on premises, and there is “much more latency,” he said.

However, Jennings said that the increase in latency is “not too bad. If you look at the response time of my sample application, it’s pretty snappy. It uses local storage, so there’s no penalty on the back end.” He added that he experienced slight differences when testing cloud-based storage for the same application.

Due to latency, the cloud is not ready for online transaction processing applications, but it is “good for Web applications,” said Jennings.

Latency becomes a major issue with legacy desktop applications that are not designed for distributed or service-oriented environments, Vermorel said. “If [the application] is very monolithic and heavy, and relies on local setup on a machine…migrating to the cloud is very difficult in that situation, and it may be easier to write it from scratch.”

Helpdesk software maker River Road Systems experienced additional latency when the Azure platform hosted its database at a location separate from other application components, said Sebastian Holst, chief marketing officer of PreEmptive Solutions, which worked with River Road on that solution.

Microsoft hosts Azure in a series of “mega data centers” located around the world. PreEmptive Solutions makes software that instruments applications for analytics and performance evaluation. A limited set of PreEmptive’s instrumentation capabilities is built into Visual Studio 2010; it also sells a commercial product.

Instrumenting Azure applications can help developers benchmark against legacy patterns and practices as well as shine a light on any architectural and technical gaps, Holst claimed.

The cloud also changes how people interact with their software, Holst said. PreEmptive learned this by instrumenting .NET applications that were migrated to Azure to catalogue feature-adoption trends, patterns and practices in the cloud. “The skin changes or experience changes the user’s behavior,” he said.

Latency was negligible on classical systems that could access memory in nanoseconds, “but it’s everywhere in the cloud,” Vermorel said. The latency is not inescapable, but masking it requires a developer to run calls in parallel, potentially across thousands of virtual machines, he added.
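Running calls in parallel, as Vermorel suggests, hides per-call latency by overlapping the waits. A simple sketch (the `remote_call` stub is hypothetical, simulating a slow storage or service round trip):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def remote_call(i):
    """Stand-in for a storage or service call with ~50 ms of latency."""
    time.sleep(0.05)
    return i * 2

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(remote_call, range(20)))
parallel_elapsed = time.perf_counter() - start

# 20 sequential calls would take roughly a second; overlapped, the
# total time stays close to a single call's latency.
print(results[:3])
print(parallel_elapsed < 0.5)
```

The same principle scales up in the cloud: fan a batch of high-latency calls out across many workers or machines, and total wall-clock time approaches the slowest single call rather than the sum of all of them.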

Nonetheless, the performance impact might be enough to discourage end users from using the application, Vermorel said. “People expect a blazingly fast user application.” For every 100 milliseconds of latency, cloud applications drop 10% of their users, he said, citing a recent Google study.

Organizations might fare better migrating legacy green screen applications to the cloud instead of Windows applications, suggested Mark Haynie, CTO of application modernization for Micro Focus. Mainframe applications were designed for latency, and cloud computing parallels the way that sharing systems on mainframes worked from the 1960s to the 1990s, he said.

“The RESTful [representational state transfer] state of a Web service is close to pseudoconversational programming, which was the model for building mainframe applications in a bygone era… It’s just a matter of scale,” Haynie explained.

“Double-clickable desktop Windows applications with single process models are difficult to pull into a multi-tenant type cloud. Application information is not stored in session.”

Micro Focus is working on middleware for the Azure platform that would map a COBOL record I/O file to Windows Azure’s blob storage. “[Migrating] is a harder job than simply saying ‘This language runs in the cloud now’ and issuing APIs,” Haynie said. Micro Focus is working with Microsoft to migrate IBM’s CICS (Customer Information Control System) to .NET and Windows Azure, he said.

Optimizing for Azure
There are challenges in handling data in cloud applications even when an application is written from scratch, Vermorel said. Data storage requires logic about what to do if an attempt to get the data fails, he explained.

Vermorel recommended using an abstraction layer such as NHibernate between business logic and the network. “Otherwise, your business logic will get mixed up with network [failover] management, and that will get very, very ugly,” he said. “We lost a lot of time trying to design our first [.NET] migration, and threw out our first prototype.”
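The separation Vermorel recommends can be sketched as a thin storage gateway that owns all retry and failover logic, so business code never sees a transient network error. This is an illustrative pattern, not Lokad's actual design; the class and method names are hypothetical, and the flaky store merely simulates a dropped connection:

```python
import time

class TransientError(Exception):
    """Stand-in for a dropped connection or throttled storage request."""

class FlakyStore:
    """Fails the first two calls, then succeeds -- simulates the network."""
    def __init__(self):
        self.calls = 0
        self.data = {}
    def raw_get(self, key):
        self.calls += 1
        if self.calls <= 2:
            raise TransientError("connection reset")
        return self.data.get(key)

class StorageGateway:
    """Abstraction layer: retries with exponential backoff live here,
    not in the business logic that calls get()."""
    def __init__(self, store, retries=5, base_delay=0.01):
        self.store = store
        self.retries = retries
        self.base_delay = base_delay
    def get(self, key):
        for attempt in range(self.retries):
            try:
                return self.store.raw_get(key)
            except TransientError:
                time.sleep(self.base_delay * (2 ** attempt))  # back off
        raise TransientError(f"gave up after {self.retries} attempts")

store = FlakyStore()
store.data["forecast"] = [3, 1, 4]
gateway = StorageGateway(store)
print(gateway.get("forecast"))  # [3, 1, 4] -- the retries were invisible
```

The caller's code reads as if storage never fails; the “very, very ugly” failover management stays confined to one layer.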

Using an object-relational mapper also speeds up interactions with SQL Server or MySQL relational databases in the cloud, Vermorel added. Lokad oversees an open-source .NET O/C mapper (object to cloud) for Windows Azure called lokad-cloud, which Vermorel recommends to developers who are new to the platform.

Developers should also be aware that they should not lock an application to a specific virtual machine, because a virtual machine may not exist anymore after an application scales down, Vermorel said. He recommended that developers use a “lease” instead, which would expire and allow another machine to take over a process.
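The lease idea can be illustrated in a few lines: ownership of a task carries an expiry time, so if the owning machine vanishes, the lease simply lapses and another machine acquires it. A minimal single-process sketch (hypothetical names; a real cloud lease would live in shared storage, as with blob leases):

```python
import time

class Lease:
    """A process holds a lease on a task; if the process (or its VM)
    disappears, the lease expires and another machine can take over."""
    def __init__(self, duration=0.1):
        self.duration = duration
        self.holder = None
        self.expires_at = 0.0
    def try_acquire(self, machine_id):
        now = time.monotonic()
        if self.holder is None or now >= self.expires_at:
            self.holder = machine_id
            self.expires_at = now + self.duration
            return True
        return False  # someone else holds a live lease

lease = Lease(duration=0.05)
assert lease.try_acquire("vm-1")      # vm-1 owns the task
assert not lease.try_acquire("vm-2")  # vm-2 must wait
time.sleep(0.06)                      # vm-1 vanished; lease expired
assert lease.try_acquire("vm-2")      # vm-2 takes over
print(lease.holder)  # vm-2
```

Unlike a hard lock bound to one machine, nothing here needs the original holder to release anything: expiry alone frees the work.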

Microsoft’s ADO.NET Entity Framework serves the same object-mapping purpose, Jennings said. “Version 4.0 is a considerable improvement over the original Entity Framework that shipped with Visual Studio 2008.” He recommended that developers become acquainted with Windows Azure’s Table Service API, because it has “virtually unlimited storage” and “good performance.”

SQL Azure’s 10GB size limitation is an issue that developers must overcome by learning how to shard databases, Jennings said. “Sharding is a black art. Microsoft says that it will relax the limit, but it’s been saying that for months and hasn’t done it yet.”
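The core of sharding is a stable routing function: hash each key to one of N databases so every shard stays under the size ceiling, and reads always land on the shard that took the write. A toy sketch with dictionaries standing in for databases (names illustrative; the hard parts Jennings alludes to — cross-shard queries and rebalancing — are exactly what this simple version omits):

```python
import hashlib

# Hash-based sharding: a stable hash of each key routes rows to one of
# NUM_SHARDS databases, keeping each individual shard small.
NUM_SHARDS = 4
shards = [dict() for _ in range(NUM_SHARDS)]

def shard_for(key):
    """Deterministic routing: the same key always maps to the same shard."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % NUM_SHARDS

def put(key, value):
    shards[shard_for(key)][key] = value

def get(key):
    return shards[shard_for(key)].get(key)

for cust in ("alice", "bob", "carol", "dave"):
    put(cust, {"orders": 0})

print(get("alice"))  # read from the same shard the write went to
```

The “black art” starts where this sketch stops: a query spanning customers must now fan out to every shard, and changing NUM_SHARDS remaps nearly every key unless a scheme such as consistent hashing is used.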

Embarcadero is creating a solution that optimizes SQL databases for SQL Azure, said Scott Walz, director of product management at Embarcadero. It will enable developers to tune queries and stored procedures locally, and then push them out to Azure, he said.

“Old-school DBAs like having control over everything. [Losing control] is what scares them about going to Azure. We allow them to work with local instances of a SQL Server 2005 database and to migrate that to Azure,” Walz said.

A look inside Windows Azure
The Windows Azure platform launched on Feb. 1, and it comprises Windows Azure, an operating system as a Web service that provides blob data storage; Windows Azure AppFabric, a solution for developing composite applications; and SQL Azure, an online relational database.

The finished product has diverged slightly from what Microsoft envisioned two years ago, with its .NET services being scaled down, but with more features added to SQL Azure.

AppFabric presently includes Microsoft’s Velocity caching technology and the Dublin management technology for Windows Workflow Foundation and Windows Communication Foundation applications. Microsoft removed .NET Services’ Workflow Service, a derivative of Windows Workflow Foundation, until further notice.

Live Framework and Services, which would have given developers access to Microsoft’s Windows Live offerings, did not make it into the final service offering; neither did the Live Mesh online synchronization and collaboration service.

Like other major cloud providers, the Azure platform follows a consumption pricing model that charges Windows Azure customers for bandwidth, CPU hours, storage and transactions. SQL Azure customers pay a monthly fee for either a US$9.99 1GB or a $99.99 10GB database, as well as the associated bandwidth costs.

Microsoft is offering special introductory pricing with a monthly “free” usage threshold for customers that subscribe to Azure’s core services. MSDN Premium subscribers receive a set amount of free usage every month. The company will announce volume-licensing packages later this year, a spokesperson said.

Developers should be using Visual Studio 2010 because it provides up-to-date templates for Azure projects, said OakLeaf Systems developer Roger Jennings, author of over 30 books on Microsoft development topics. Jennings also recommends that developers update their MSDN subscriptions to MSDN Premium or higher to receive a “substantial benefit in terms of free usage quotas” for the remainder of the year.

There are three MSDN subscription options for Visual Studio 2010: Professional, Premium and Ultimate. The Professional and Premium tiers offer special incentives for developers to test-drive Azure services.

The Azure billing system gives developers a chance to see what the costs of a real-life installation are as soon as the next day, said Jennings. “They can run multiple clients against specific numbers of [Azure] instances to support workload, but will have to pay if they go over the MSDN limit,” he added.

Another tip is to take the project out of staging and deployment, which goes beyond simply “turning it off,” or Microsoft will continue to charge, Jennings said. Developers will find the most current information about the Azure platform on Microsoft’s and third-party blogs, he said. “Documentation does not always keep up with the changes and other new discoveries.”