Sunday, November 16, 2008

Oslo == 42

In his book The Hitchhiker's Guide to the Galaxy, Douglas Adams tells how a supercomputer named Deep Thought worked to calculate the answer to Life, the Universe, and Everything.  After 7½ million years, the supercomputer came up with the answer: 42.  Unfortunately, no one could understand the answer because they didn't know the question.

Some might argue that "Oslo" is Microsoft's answer to "life, the universe, and everything" IT-related, but now that it has been revealed, many developers are confused and left wondering what question it actually solves.

One of the posts on the "Oslo" forums queries "Why do we need Oslo?"

Doug Purdy gave his answer on his blog this week:  Oslo v1 is about developer productivity.  If you watch his PDC talk, A Lap around "Oslo", he explains that Oslo is about capturing the essence of the code without the ceremony.

The bold purpose of "Oslo" is to enable capturing the intent of the business in terms that are equally clear to humans and to computers.

How does the "Oslo" modeling platform attempt to meet this objective? 

In a nutshell, "Oslo" attempts to provide developers and analysts with the tools to abstract their intent as data and to store that data so that it can be executed later.

A Simple Example

While at my previous employer, I was responsible for an application framework for a line-of-business application.  One of the features we supported was the ability to create forms to display entity fields to the user.

To do this, we separated the implementation of rendering the forms and controls from the business entities.   We had an XML file to store which fields should be displayed for each form.  It looked vaguely like this:

<form entity="Defect">
  <field name="Defect Number" />
  <field name="Found By" />
  <field name="Title" />
  <field name="Description" />
  <field name="Repro Scenario" />
</form>

We then had a forms engine that could use this XML file to render a web form for the user, choosing the appropriate web controls based upon the data type of each field, similar to the Dynamic Data Controls in ASP.NET 3.5 SP1.
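To make that concrete, here is a minimal sketch of how such an engine might work (the class, method, and metadata-lookup names are hypothetical, not our actual framework code): it walks a form model like the XML above and picks a web control for each field based on that field's data type.

using System;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Xml.Linq;

public static class FormRenderer
{
    // fieldTypeLookup is assumed to come from the entity metadata:
    // it maps a field name (e.g. "Found By") to its CLR type.
    public static void RenderForm(string formModelPath, Control container,
                                  Func<string, Type> fieldTypeLookup)
    {
        XDocument model = XDocument.Load(formModelPath);

        foreach (XElement field in model.Root.Elements("field"))
        {
            string name = (string)field.Attribute("name");

            // Label the field, then add a control chosen by data type.
            container.Controls.Add(new Label { Text = name });
            container.Controls.Add(CreateControlFor(fieldTypeLookup(name)));
        }
    }

    private static Control CreateControlFor(Type fieldType)
    {
        if (fieldType == typeof(bool))     return new CheckBox();
        if (fieldType == typeof(DateTime)) return new Calendar();
        return new TextBox(); // default for strings, numbers, etc.
    }
}

The point is that the decision about which fields appear lives entirely in the XML model, while the rendering policy lives in code like this.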

This XML file was effectively a custom Domain Specific Language, and each XML document was a form model.  Each of these form models was stored either on the file system or in the database; that storage was our repository.  The application framework served as the execution engine for our form models.  We even had a form designer for laying out the various components of the form visually.

So what did this give us?

  • Developers could implement a new form-based UI more quickly than by wiring up the ASP.NET controls manually.
  • Customers and Business Analysts could design their own custom forms and add them to the system.
  • We were looking at the possibility of transitioning to use Silverlight or a 3rd party web control library for a richer user experience.  If we did, we would only need to update the application framework and the form models could remain untouched.

This approach allowed us to have a better separation of concerns.  The form rendering logic was baked into our application framework, while the business logic about which fields should be displayed was kept separate and could evolve independently.

why "Oslo"?

If you have already been separating your "what" from your "how", as we did in our forms engine, what benefit do you get from "Oslo"?

  • Better tools to do what you're already doing
  • Less isolation and fewer silos of information
  • Better transparency into your models

Our XML model was fairly simple, yet with a large form, it still became cumbersome because of all of the XML noise.  Using "M", we could have written a terser DSL that captured the essence of our intent, without all the ceremony of the XML angle brackets.

For example, here's a model more friendly to developers:

form Defect
{
    Defect Number,
    Found By,
    Title,
    Description,
    Repro Scenario
};

Or, if you want a more human-readable grammar for your customers and business analysts:

Defect has a form with the following fields: Defect Number, Found By, Title, Description, and Repro Scenario.

If this were stored in the "Oslo" Repository, we might have been able to build a richer designer more easily with Quadrant, and we could have integrated with our entity designer, too, to see which fields were available for each entity type.

Finally, if the model were stored in the Repository, other tools might be able to better discover information about these forms. 

For example, if we renamed "Repro Scenario" to just "Scenario", or removed "Found By" from our entity model, we could immediately detect which forms use those fields and update our form models appropriately.
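Even with our old XML models, having the form definitions as data makes that kind of impact analysis a small query rather than a hunt through ASPX pages.  Here is a rough sketch (the folder path is hypothetical; the element and attribute names are the ones from the form model above):

using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class FindFieldUsage
{
    static void Main()
    {
        // Scan every form model in the folder that acts as our "repository"
        // and report which forms still reference the renamed field.
        var formsUsingField =
            from file in Directory.GetFiles(@"C:\FormModels", "*.xml")
            let form = XDocument.Load(file)
            where form.Root.Elements("field")
                      .Any(f => (string)f.Attribute("name") == "Repro Scenario")
            select (string)form.Root.Attribute("entity");

        foreach (string entity in formsUsingField)
            Console.WriteLine("The {0} form still references 'Repro Scenario'.", entity);
    }
}

With the "Oslo" Repository, the same kind of question could be asked across all of the models in an organization, not just the ones our own framework happens to know about.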

If you're already doing declarative, model-driven design like this, you can probably already imagine many more useful scenarios that "Oslo" enables.

What if you have never looked at your code this way before? 

Well, this might seem like a lot of unnecessary overhead.  Is it really that hard to write good ASPX code, and how often do you really need to swap out your rendering engine?

Shawn Wildermuth gave his answer to the broader question, "Why do we need DSL's?", and I think his post explains why there is value in taking this approach.

Today, customers talk to business analysts, who capture what they think they heard in some UML model somewhere, maybe in Visio.  They might mock up screenshots in Photoshop of how the user experience should work.  They then pass this on to the architect or developer, who implements what they understand the customer's intent to be.

This is like an elaborate game of telephone, though, and there is bound to be misunderstanding somewhere along the way.

If you have a good team, your developer will get this right most of the time.  If you are using Agile, you will probably get it right more often because your developers will be talking to your customers directly, and you'll get feedback more quickly when it is wrong, but you will still have waste.

Model-driven development offers the customer the opportunity to express their intent directly and to see the impact of their decisions immediately, without that intent being translated through many different layers.

If "Oslo" can succeed in their goals, there can be much more rapid development with much less waste, which should matter to anyone who cares about business value.

Should I start building my code with "Oslo" today?

That depends on you and your needs.  

The "Oslo" Team has a large vision, which cannot be fully accomplished in v1.   They will have plenty of work to do in future releases.  Even this is still early and "nascent" according to Don Box and Ray Ozzie.  "Oslo" is still raw and will probably change before v1.  If you need your tools to be fully baked, then it is too early for you to build on "Oslo".

If you are just now trying to understand how model-driven development and DSL's can help you, it might be useful to download the tools and play with them so that you can begin to grasp how this approach differs from, or resembles, how you do things today.  Try thinking about the problems in your current project that might benefit from this mindset.

If you have already been using this approach, try out the tools from Microsoft to see if they will help you do what you already do more easily.  Discover the gaps, if any, that would prevent you from using "Oslo".

When you are done, give your feedback to Microsoft on the "Oslo" forum so that they can make the changes needed for v1 to support as many scenarios as possible and lay a solid foundation for future releases.

Sunday, November 9, 2008

NYC Connected Systems User Group

Keith Pijanowski will be speaking about Windows Azure and Software + Services tomorrow night at the NYC Connected Systems User Group.

I expect that his presentation will be similar to the one he gave a few weeks ago to the IASA New York group.  If you're interested in Microsoft's Azure Services Platform, you might want to sign up to attend.

Fairfield / Westchester Code Camp

I attended code camp yesterday in Stamford, CT.  I'm not used to getting up at 5:00 am to catch the train out there, so I was a little tired but I enjoyed the day.  While on the train to and from Manhattan, I used the time to dig into the "M" Language specification.

If you're not familiar with what a Code Camp is, it is a conference by developers, for developers.  I found the speakers were very good and excited about their topics, and the conversations with the other developers were interesting.

I recommend attending and getting involved with your own local code camp.  I've already volunteered to help with the NYC Code Camp in January.

These are the sessions I attended yesterday:

Building Applications with Microsoft Cloud Services

Bill Zack presented his version of A Lap Around the Azure Services Platform.  This was essentially the same presentation that Bill gave to the IASA New York group a couple of weeks ago.  In case you missed both of these, Bill will be presenting on this topic again next month at the NYC .NET Developers User Group on December 18th.

Next Generation User Interfaces using Microsoft Expression Studio

David Isbitski covered how developers can leverage Expression Blend to create a simple but cool Silverlight interface.  It was a good introduction to Silverlight and to using Expression Blend, which was probably perfect for the audience.

I was hoping for a few more examples of interesting effects with XAML.  I'm a design-challenged developer looking for tips & tricks on how to leverage Expression to build a compelling user experience.  The reflection example was good, but it left me eager to learn more.

Applying and Leveraging LINQ:  When you should, when you shouldn't, and Why, Part 1

Richard Hale Shaw was the celebrity presenter at this code camp.  I remember first seeing Richard at the 1992 Software Developer Conference in Santa Clara, CA.

I was already familiar with the LINQ material that he presented during this session, yet it was still fun to watch him present.  I wish I'd been able to see the second part, which would probably have gone deeper into areas of LINQ I wasn't as familiar with, but I was just glad to see him again after all these years.

Developing Applications using Live Framework

Dmitry Lyalin presented the Live Services Framework and how it fits into the Azure Services Platform.  I was particularly interested in Dmitry's presentation because I knew that he was going to be presenting some of the interfaces of Live Mesh that aren't available to everyone yet, including an early look at the Silverlight integration.

As usual, Dmitry did a great job with his presentation.

WCF + Silverlight

Alan da Costa Pinto did a great job presenting WCF and explaining why you want to use WCF instead of ASMX for your .NET web services.  He then did a quick introduction to the Entity Framework so that he could show how to rapidly generate a service using WCF and ADO.NET Data Services (aka "Astoria").  Finally, he hooked it up to a simple Silverlight application.  While many of Alan's examples were rough, I think the cool part was that he was able to demonstrate how little effort it takes to get something basic in place using these technologies.

Parallel Extensions to the .NET Framework

Louis Hendricks covered the Parallel Extensions that are being rolled into .NET 4.0.  This was the first chance I've had to take a look at these extensions, so the session was particularly informative for me.

If you're interested in learning more, it's too late to attend Louis's session, but there is more information at these PDC Sessions:

PDC Content all in one place

If you want to be able to find all the PDC sessions and Power Point slides all in one place, check out Mike Swanson's blog:

If you'd like to download them all to your local machine, check out this post by Luciano Evaristo Guerche:

You can find the Firefox addon here:

Thursday, November 6, 2008

Visual Studio Training Kits

I've been posting a lot about the announcements from the Microsoft PDC 2008, including Windows Azure, the Live Services Platform, and Oslo.

What if your skills are still back on an earlier version of .NET?  Is there a good way to catch up quickly?

Microsoft has previously created Quick Starts to cover the core concepts in .NET 1.x and 2.0.  More recently, they have published various training kits.  Here's where you can find them online:

Microsoft also has another site, MSDN Ramp Up, which is designed to quickly get developers up to speed on the latest version of the .NET Framework.

What is Microsoft "Oslo"?

I wanted to post about what Microsoft code name "Oslo" is all about, but I just read a post by Aaron Skonnard, who has already described it more clearly and eloquently than I could, so I'll just point you to his post:

For a fun introduction, here's a short video from Models Remixed:

Wednesday, November 5, 2008

Liberation Day - 6 November 2008


If you still have a thirst for more information about Windows Azure and the Live Services Platform and don't have an early bedtime, you might want to stay up to hear Steve Ballmer, Tim Sneath, and Gianpaolo Carraro tomorrow night at 11:30pm Eastern US Time.

The three of them will be presenting a keynote for the Liberation Day developer conference tomorrow in Sydney, Australia (hence the late hour--it is actually 3:30pm Eastern Australian Time).

Sunday, November 2, 2008

PDC 2008 - "Oslo" Sessions

Tonight I finished viewing the "Oslo" sessions from the PDC last week.

So what is "Oslo"?  As pre-announced by Doug Purdy and Don Box, "Oslo" is

  • A textual language, "M", for authoring models and DSL's.
  • "Quadrant", which is a visual design surface for visualizing your models.
  • A repository for storing and sharing your models.


If you're interested, you can watch them online.

After you're done watching, if you're interested in learning more you can download the "Oslo" SDK October CTP from the MSDN "Oslo" Developer Center.

IMPORTANT NOTES:

A lot of people think that this is all cool stuff, but are having a difficult time grokking why it is important. I'll try to explain why I'm excited about this new technology in another post later this week.