
.NET: Separating the Hype From Reality

The .NET Framework is the latest in application frameworks and tools from Microsoft, but it will not be the last.


Windows application frameworks have been evolving continuously over the past 10 to 15 years. First, there was the Windows application programming interface (API): functions, called from C, that hooked into parts of the operating system and let you do different things with Windows. Then came Object Linking and Embedding (OLE), which allowed an application, in both Visual Basic and C/C++, to incorporate objects from other applications or link to them. Later iterations of OLE became known as ActiveX and COM (the Component Object Model). With the increasing use of the Internet, another extension of COM and OLE, the Distributed Component Object Model (DCOM), allowed object linking and embedding over a distributed network computing architecture.


In many ways, these application interfaces have been built upon one another: OLE is built upon the Windows API, COM is an extension of OLE, and DCOM is an extension of COM over the network. These interfaces evolved because developers wanted components that could be used by different applications-- specifically, dynamic link libraries (DLLs) that could be shared by many applications at the same time. Before Win32, to use a DLL's functions, an application either had to have a copy of the DLL in its path, or had to make an API call if the function came from a system DLL. With Win32 (Windows 95 or later, Windows NT 4.0 or later), the location, name, and interfaces of DLLs could be recorded in the Windows Registry. Every COM component must be recorded in the Windows Registry if it is to be called by another program.
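To make this concrete, a registered in-process COM server leaves entries in the Registry along the following lines. The GUID and path here are invented for illustration; the key structure (a ProgID key pointing at a CLSID, and the CLSID key recording the DLL's location) is the standard layout:

```ini
; Hypothetical COM registration -- the ProgID, GUID, and path are made up
[HKEY_CLASSES_ROOT\MyLib.Widget]
@="MyLib Widget Class"

[HKEY_CLASSES_ROOT\MyLib.Widget\CLSID]
@="{11111111-2222-3333-4444-555555555555}"

[HKEY_CLASSES_ROOT\CLSID\{11111111-2222-3333-4444-555555555555}\InprocServer32]
@="C:\\Components\\MyLib.dll"
```

A client that asks for "MyLib.Widget" by name gets whatever CLSID and DLL path these keys currently point to, which is exactly where the trouble described below begins.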


This new Win32 architecture created both wonderful opportunities and some major headaches for developers. The promise of Win32 COM was that you could create a DLL for another application to access and use it no matter where in the system it was; it no longer had to be in the path of the application. DLLs didn't have to be copied into the system directory; they could be placed anywhere in the system and referenced in your application code. All you had to do was register your component in the Windows Registry, and you could then refer to it by name. The drawback is that the Registry allows more than one component of the same name, and it refers to registered components by program name (known as a ProgID) instead of by the unique identifier (known as a GUID) generated whenever a component is registered. So which DLL an application ends up grabbing may depend on the order in which components were registered, and on whether they have different version numbers. As Franky Wong of Desaware explained it:

Microsoft may have provided a version resource capability, but that alone did not solve all of the problems with distributing applications. Even now, many dynamic link libraries are created without version resources, or their version resources are not updated correctly when the file is modified. Many applications still use installation programs that do not check the version information of existing DLLs, or installation programs that compare the wrong version information.

Still, as long as users had to deal with only a few shared dynamic link libraries, the problem was manageable. This all changed with the appearance of Microsoft's Visual Basic.

Visual Basic is the first product to take full advantage of a new software development philosophy called "Component-Solution" programming. Under this philosophy, programmers take advantage of "off the shelf" software components that implement specific functions. Visual Basic itself is the "glue" that binds these software components.

Under Visual Basic, software components consist of either dynamic link libraries or Visual Basic custom controls (VBXs, OCXs and ActiveXs).

This programming philosophy makes an enormous amount of sense. Why write your own communications function libraries when a single custom control can provide the same capability for a tiny fraction of the price? Visual Basic has literally thousands of different custom controls available, and they are a large part of the reason that it is such a highly effective programming environment. Visual Basic also established a precedent for increased use of reusable software components, which is becoming more popular with other languages as well...

The component-solution framework for programming has had one serious side effect concerning the distribution of Visual Basic applications. Now instead of a few DLLs that are shared by several applications, there are hundreds of DLLs, VBXs and OCXs that may be shared by literally thousands of applications.

And all it takes is a single DLL, VBX or OCX to be missing, or present in an older version (or even an incompatible newer version), for an application to fail. A poorly designed installation program, user error, registration error or change in the user's PATH environment variable are a few of the ways in which this problem can occur. Worse yet, there is no reliable way to identify the failure, since the symptoms of the failure can vary from a minor error to a General Protection Fault or memory exception.

But the problems do not end there. Some applications place software components in their own directory, meaning that you can have several versions of the same OCX, VBX or DLL on your system at the same time. The one that is used may depend upon the sequence in which two applications are run, or which component was last registered - leading to a whole list of "intermittent" problems that depend upon the interaction between unrelated applications.

It is not unheard of for technical support personnel to literally spend hours on the phone trying to track down elusive problems that turn out to be nothing more than the presence of an obsolete software component. With so many problems with so many causes, is it any wonder this headache was called DLL HELL?
(Article archived at Desaware, italics mine.)


Although Wong specifically mentions Visual Basic as a source of DLL Hell, this problem can also occur in applications written with Visual C++ or other programming tools. Visual Basic simply made it easy to incorporate, and later to write, a great many DLL components, which resulted in version and name conflicts with existing DLLs. The real problem was the nature of COM and the swamp known as the Windows Registry. Programming with VB using COM components just made DLL Hell worse, because VB tends to hide the details of how COM components are linked to or embedded. I think the VB developer who is not working with COM interfaces on a regular basis is often unaware of the problems that the creation and use of COM components may cause on a given system.
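The failure mode is easy to see in miniature. This is not real COM, just a toy model of name-based registration: because the Registry keys components by ProgID, the last registration of a given name silently wins, whatever version it carries.

```python
# A toy model of ProgID-based lookup: components are keyed by name alone,
# so the last registration of a given ProgID silently wins.
registry = {}

def register(progid, path, version):
    registry[progid] = (path, version)  # overwrites any earlier registration

register("MyLib.Widget", r"C:\AppA\MyLib.dll", "2.0")  # App A installs v2
register("MyLib.Widget", r"C:\AppB\MyLib.dll", "1.0")  # App B installs v1 later

# App A now resolves the ProgID and gets the obsolete copy:
path, version = registry["MyLib.Widget"]
print(version)  # 1.0 -- App A expected 2.0
```

Swap the installation order and App B is the one that breaks instead; that order dependence is what makes these failures so intermittent and so hard for support staff to reproduce.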


With Windows 2000, Microsoft made an attempt to resolve DLL Hell by creating a special system directory known as the Global Cache, where shared DLLs could be placed instead of being registered elsewhere in the system. That way, if a shared DLL already existed, the developer would be alerted and could take steps to ensure that components would not be overwritten and that applications would not break. From what I have heard, it must not be a widely used feature, because DLL Hell is still with us on Win32.


Finally, we come to the next framework for Microsoft Windows development-- .NET. .NET is Microsoft's answer to the pitfalls and complexity of Win32 and COM development, as well as a way to extend Windows programming tools to Internet services. The .NET development system depends on a Common Language Runtime (CLR) that interfaces with the operating system much as the Java Virtual Machine does. The difference with the .NET platform is that one can write applications in any of several programming languages that target the CLR-- VB.NET, C#, Visual C++.NET, J#, Python, COBOL, JScript, and others. C# and VB.NET have seen the most adoption since .NET was introduced in 2001. Whereas Java runs in what James Gosling would describe as "secure" mode all the time, .NET languages run as "managed" code almost all the time. Managed code runs with data types that conform to the CLR's Common Type System, so that the CLR knows what to expect from an application. However, Visual C++.NET, C#, and VB.NET allow running "unmanaged" or "unsafe" code, which is most commonly used to access lower-level Windows functionality that the .NET Framework doesn't wrap-- API calls, for example.


.NET is designed to make it easier to develop and deploy an application by providing framework classes that all the .NET languages can access and use. All of the code used to create a Windows Form, for example, is now visible to the VB developer in the editor, just as it is for a Windows Forms app in C#. All of the framework methods used in a VB.NET application are now visible too. For VB.NET developers it is a different paradigm, and VB.NET really is a different programming language from VB 6. For deployment, there is no more registering your DLLs (now known as .NET assemblies) with regsvr32; you just copy them into the directory on the system where you want them to run, and they just run. If you want your DLL to be a shared assembly that other applications can use, you copy it into the Global Assembly Cache, and it just works. The only caveat is that it must have a unique (strong) name on your system, but this is just keeping us honest! Each .NET executable or assembly carries metadata that identifies it by name, by GUID (automatically generated by the .NET Framework), and by a public key if it is a shared assembly. This metadata ensures that the right version of a shared assembly is loaded for each application that calls upon it.
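The contrast with the ProgID world is worth spelling out. Again as a toy model, not the real Fusion binder: because the Global Assembly Cache keys assemblies by their full identity rather than by name alone, two versions coexist, and each application binds to the version its own metadata names. The version numbers and public key token here are invented for illustration.

```python
# Toy model of the Global Assembly Cache: assemblies are keyed by full
# identity (name, version, public key token), so versions coexist and each
# application binds to the exact version recorded in its own metadata.
gac = {}

def install(name, version, token, path):
    gac[(name, version, token)] = path  # distinct versions get distinct keys

install("MyLib", "1.0.0.0", "0123456789abcdef", r"C:\GAC\MyLib_1.dll")
install("MyLib", "2.0.0.0", "0123456789abcdef", r"C:\GAC\MyLib_2.dll")

# App A's metadata names 2.0.0.0; App B's names 1.0.0.0 -- no conflict.
print(gac[("MyLib", "2.0.0.0", "0123456789abcdef")])
print(gac[("MyLib", "1.0.0.0", "0123456789abcdef")])
```

Installation order no longer matters, which is precisely the property the old Registry lookup lacked.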


How does this stack up to my requirements for a development framework? Let me recount the ways.


Highly Customizable User Interface. With .NET, there are more standard controls for Windows Forms than ever. Thuan Thai and Hoang Q. Lam, in their book .NET Framework Essentials, describe the Windows.Forms classes as a wrapper over the Windows API functions that gives you a simplified interface to them. What's more, they are implemented consistently across all the .NET languages. For VB.NET developers, this means the possibility of greater control over the UI at design time. Another feature to like about Windows Forms is that you can derive a form from another form class to retain the same look, feel, and set of controls.


Open APIs. What we have with .NET is a movement in the direction of Java, with an open and documented framework, but with an operating system that is for the most part akin to a car with the hood welded shut. .NET really has little to do with the core of Windows; one can think of it as a framework that wraps the API that is still the power of the operating system. And do we really know how much of the Windows API is documented? How much of the API is included in .NET? It's hard to know. Unless those APIs are documented, we can't know what is different about each new version of Windows.


Performance. The Shudo benchmarks, which I linked to in my last entry on Java, show that applications written in C# for the .NET Framework were slower than some implementations of Java but faster than C# applications on the Mono framework, an open-source implementation of the Common Language Infrastructure, the basis of the .NET Framework. All of the frameworks that rely on runtimes ran slower than native C/C++ code. Overall, the speed of .NET seems to be pretty good, and runtimes in general should become less of a performance concern as hardware gains more memory and faster processors.


Garbage Collection. .NET has a built-in garbage collector: objects are reclaimed by the runtime some time after all references to them are released (collection is nondeterministic, so destruction is not immediate). Visual Studio.NET generates destructors automatically, and the developer isn't really expected to touch them unless absolutely necessary.
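The "reclaimed once all references are released" idea can be demonstrated with Python standing in for the runtime. One caveat on the analogy: CPython's reference counting frees the object the instant the last reference goes away, whereas the .NET collector is a tracing collector that gets to it at some later collection.

```python
# A weak reference observes an object without keeping it alive, so it lets
# us watch the runtime reclaim the object once the last strong reference goes.
import weakref

class Resource:
    pass

obj = Resource()
probe = weakref.ref(obj)   # does not count as a reference for collection
print(probe() is None)     # False: obj still holds a strong reference

del obj                    # release the last strong reference
print(probe() is None)     # True: the runtime has reclaimed the object
```

In .NET the second check could still report the object as alive until the next collection runs, which is exactly why the framework discourages relying on destructors for timely cleanup.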


Flexibility for High- and Low-Level Tasks. As with Java, everything in .NET is now programmed within the context of classes. Some lower-level system functionality is wrapped within the System namespace. If necessary, developers can go a step further and write unmanaged code to access the underlying Windows API. Those who do will find that the API calls are now different-- in VB.NET, for example, all parameters are passed by value by default. I will have to look into this further to see how it affects API calls.
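.NET's route into unmanaged code (P/Invoke, via DllImport in C# or Declare in VB.NET) has a rough analogue in Python's ctypes, which makes a good neutral illustration of the idea: you declare the native function's signature, and the runtime marshals managed values across the boundary. This sketch calls the C library's strlen on a POSIX system rather than a Windows DLL:

```python
# In the spirit of P/Invoke: declare an unmanaged function's signature so the
# runtime knows how to marshal arguments and the return value.
import ctypes

libc = ctypes.CDLL(None)                  # POSIX: symbols of the running process
libc.strlen.argtypes = [ctypes.c_char_p]  # the unmanaged signature
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"unmanaged"))  # 9
```

The Windows equivalent would name a specific DLL (user32.dll, kernel32.dll) instead, and getting the declared signature right is the whole game-- a wrong parameter type corrupts the call just as it did with VB 6 Declare statements.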


Support/Extensibility. .NET is backed by Microsoft and its partners, and the deal with Microsoft is that in order to run their software, you must be running Windows on the client and perhaps the server, too. Assuming that this is OK with the customer, .NET is a logical choice for building server applications now. How soon enterprises will come around to installing the .NET framework on client machines dictates the pace of adoption of .NET as the framework for Windows client applications. As Microsoft's creation, .NET is dictated by the enhancements to the framework that come out of Redmond.


Marketability. What would Microsoft be without marketing hype? In the case of .NET, the hype has been steady for the past four years. And make no mistake about it: Redmond is right when they say that .NET really is a different way of developing applications for Windows machines. Some of this hype has had an impact, but much of it has not, so adoption of .NET has been slower than Microsoft expected-- though for a technology so different from its previous frameworks, perhaps the slowness is justified. Many developers, especially VB developers, have had to work to change the way they approach Windows development. Going forward, I hope that .NET does succeed, if only for its potential to make developers' lives a little easier.


And yet, .NET is not the last chapter in the Windows development saga. A new framework is coming on the scene with the next version of Windows, Windows Vista. It's called WinFX. Really, WinFX is an extension of the .NET Framework that provides enhanced user interfaces and more capability with web services. WinFX builds on .NET Framework 2.0, which is due to be released sometime later this year or early next year. The big additions are the new Windows presentation system, code-named Avalon, and the new communications system, code-named Indigo. Avalon relies on a dialect of XML created by Microsoft called the Extensible Application Markup Language (XAML), which controls much of the presentation of an application and is designed to be used with conventional Windows applications or with Web applications. Indigo is Microsoft's implementation of a service-oriented architecture that allows for the creation and use of network services.
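For a taste of what XAML looks like, here is a minimal window sketch. The element and attribute names are hypothetical placeholders in the Avalon style, and the namespace URIs shown are the ones the technology eventually shipped with; the exact schemas shifted between the Avalon betas:

```xml
<!-- A hypothetical Avalon/XAML window: the markup declares the UI tree,
     while event handlers live in an associated code-behind class. -->
<Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="Hello, Avalon">
  <StackPanel>
    <TextBlock>Declarative UI markup, separate from application logic</TextBlock>
    <Button x:Name="GoButton">Go</Button>
  </StackPanel>
</Window>
```

The design idea is the separation: a designer can work on this markup while a developer writes the handlers behind it, much as ASP.NET separates pages from code-behind today.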


My assessment of .NET is that it provides an integrated way of doing a lot of things on Windows-- creating desktop applications, ASP.NET applications, web services, and more. However, adoption of .NET in the enterprise is hampered by the legacy of previous frameworks and by the amount of legacy code out there. The ability to use legacy COM and Win32 components is built into .NET, so backward compatibility is not the problem. The problem is that if I want to seriously move forward in my career with .NET, I think I will still have to become familiar with COM and Win32 as well, and they leave a lot to be desired in the complexity of using and deploying objects and libraries. I also wonder whether learning .NET 2.0 will feel like having to completely retool all over again. The advent of XAML may make creating Windows and Web applications a little easier for developers, but it adds another level of complexity to the development process. I think developers working with Microsoft toolsets already have enough complexity to deal with.
