Note: This diatribe was written during the Visual Basic.NET beta in 2001. I never purchased the released product and have no idea which of the points I made during the beta still apply to the release version. I am leaving my rant as written for historical reasons. See the VB.NET Again page for a description of my brief adventures with VB.NET in 2008.
Back in 1997 when I first abandoned Visual Basic, I took a little trip to the other side to examine Delphi. I found a big difference in programming models between VB and Delphi, and to my surprise I discovered that I liked the VB model better. Later when I examined Java IDEs, I found the same difference. Java, Delphi, C++ Builder, Visual C++, and all the other object-oriented GUI-based languages all differed from VB in the same way.
Now two new languages with the same difference from Visual Basic are about to appear on the market. One is called C#. The other is called Visual Basic.NET (no relation to Visual Basic 1 through 6).
Like Java and Delphi, these new languages are based on a complex and powerful class library. The entire language is built on and integrally connected with this class library, and because the library is so complex, there is a certain level of complexity in even the simplest program. The IDEs associated with these languages attempt to hide the complexity, but cannot do so completely (and wouldn't want to).
Compare this to Visual Basic. While later versions of Visual Basic (4 through 6) had some object-oriented features, the language itself was not based on these features. It was based on magic. Something happened to make all those controls appear on forms and interact with your code. But you couldn't really tell what it was. And you certainly couldn't mess with it.
That was the appeal of the language. It just worked. You could produce amazingly powerful applications in a short time using techniques that felt just right, even though they didn't make sense if you looked at them too closely.
Now it's true that at some point you ran out of magic. The more complicated your application, the more likely you were to hit inconsistencies and incompleteness. Features added in the later versions of VB didn't quite fit the original plan.
By the end, Visual Basic looked like a Rube Goldberg construction with weird features fastened on with duct tape and mismatched fasteners. The foundation was solid, but it wasn't big enough to handle all the new rooms and added stories.
We all assumed that the structure would hold at least until they added our one new feature or fixed our one pet peeve, but I suppose in our hearts we knew it couldn't go on forever.
I asked for a class library, but I assumed they would put it on top of the existing system, not make it the foundation of a new one. I asked for inheritance, but I assumed I would be the one to use it, not them. I asked for procedure variables, but for my own use, not as the basis of a new event model. I asked for language changes, but not for a whole new language.
I was naïve, maybe even foolish. Now that I see Visual Basic.NET in the flesh, I realize it was a pipedream to believe we could have the old VB with a few new features. Things have gone too far. The original VB was one of the coolest hacks in history, but still a hack.
It turns out they couldn't tack all the things those other languages had onto the VB framework. They had to start over from scratch. And when they finished building a new language that had all the features to match Delphi and Java, strangely enough the new language looked and worked a lot like Delphi and Java.
So I accept that Visual Basic must be a completely new language to give me all the features I asked for. It has to look and work differently, and it can never be quite as easy and obvious. But like its competitors, it could be more consistent and powerful.
I have heard the screams of agony on the VB newsgroups. Many developers can't bear the thought of completely rewriting their apps in a foreign language. Some of these folks want Microsoft to maintain and improve VB 6 as a separate language. Others suggest Microsoft should sell VB 6 to another language vendor. I hear your pain, but for those who want VB 6.1 followed by VB 6.2 and so on through VB 6.9999, I have one thing to say.
Get over it.
Microsoft will do the least they can possibly get away with to maintain VB 6. They'll wait a decent interval, and then they'll kill it. It's a new world, and we're not going back. If you can't handle the change to VB.NET, it's better to get out early than late.
In short, I completely support the idea of redesigning Visual Basic from the ground up. I embrace the major structural changes. It was necessary to make them, and in fact I probably would have made more.
The problem isn't what they did. It's how they did it. They assigned low-level C-oriented designers to redesign the high-level Basic language. The results are, well, alien. The new language is filled with gratuitous insults and ignorant stupidities. VB.NET just doesn't have the Basic spirit. It's hard enough for Basic programmers to swallow the necessary changes without imposing ridiculous and silly ones for no reason.
It's clear from studying the new design that Visual Basic.NET was an afterthought. They didn't try to fix the most popular and lucrative language in history. No, the starting point appears to be Java.
A few years ago Java was a very popular language with Microsoft developers. Everybody could see the advantages of using a high-level object-oriented language for developing applications. C++ is a fine language for low-level work on operating systems and components. But when you're developing end-user applications, the efficiency advantages of a low-level pointer language aren't worth the extra development time and bugs. Microsoft had a high-level language called Visual Basic, but most Microsoft developers were used to C++. They were more comfortable moving to Java.
But then came legal problems with Sun, and eventually Microsoft's J++ faded away for reasons that had nothing to do with the merits of the language.
This fit in nicely with a long-term Microsoft dream of inventing a new language. Most Microsoft languages were purchased from other companies, but for the last few years there have been rumors floating around about a radical, innovative new language called COOL.
If somebody wants to write an exposé revealing the true story of COOL, I'd like to read the juicy tidbits. What I can say is that the language that eventually emerged is not COOL. It's not radical or innovative. It's a workmanlike rip-off of and slight improvement on Java. The name of this new language, C#, matches the content. It's kind of clever, but not very original.
I don't know the inside story on how Visual Basic.NET was developed, but it appears to be based more on C# than on Visual Basic. Apparently after C# was designed, somebody said, "Look, we could make a bunch of money by cloning this C# language into something that has a syntax sort of like Visual Basic."
This is like in the science fiction movies when aliens wearing masks try to pass as humans. They always give themselves away because they can't quite imitate the subtle aspects of earth culture. The C# people do their best to pass as VB designers, but ultimately they can't fool us.
I can imagine them discussing the job: "I don't get it. Why are we doing this anyway? What do these VB people really want? Why don't they just use C#? Well, OK. So VB has several million users and C# has zero. Maybe we do have to humor them. But really! What kind of programmers think True is equal to negative one?"
So let's talk about unBasic changes to Visual Basic.NET. Let's start with a lost feature that on a technical scale of 0 to 9 is probably a 3. Losing this feature isn't the end of the world, but it does illustrate the contempt the Microsoft language designers have for us and our annoying language.
Hardcore Visual Basic quotes the story of Basic arrays directly from the language authors, John Kemeny and Thomas Kurtz. They started out with one-based arrays because that's how ordinary people think, and Basic was designed for ordinary people. But mathematicians and scientists complained that their matrices start from zero, so Kemeny and Kurtz added the Option Base statement so that users could decide for themselves whether to start at 0 or 1. But later they noticed that civilized high-level languages such as Pascal allowed the user to specify the start and end of the array. So Kemeny and Kurtz adopted this feature for Basic, allowing the following:
Dim aiJoeAverage(1 To 10) As Integer
Dim aiMathematician(0 To 9) As Integer
Dim aiWeatherman(-50 To 125) As Integer
Dim aiNoneOfYourBusiness(iFirst To iLast) As Integer
My recommendation was that Visual Basic programmers should always specify the start and end of arrays. The following legal statement should be considered harmful:
Dim aiUnclearAndConfusing(10) As Integer
Originally Microsoft changed the meaning of this statement in VB.NET so that it would be confusing as in C++ rather than confusing as in VB. Recently they changed back to VB-style confusion and claimed that this was a major concession to VB compatibility. But they refused to restore user-defined array bases.
Notice that when designing the original Basic arrays, Kemeny and Kurtz never asked, "How do computers base arrays?" or "What design would be the easiest to implement?" or "What would be most efficient?" or "How can we be compatible with cruder languages?" or "How can we work with a crippled run-time library designed for the lowest common denominator of languages?" The fact that language compilers and interpreters consider arrays to be zero-based was completely irrelevant because it's a trivial matter for a language to adjust arrays to any base behind the scenes. And even if there were a loss of efficiency, it wouldn't matter because Basic is (or used to be) a high-level language designed for people, not for computers.
It's completely appropriate for low-level languages like C, C++, and Assembler to impose zero-based arrays on their users. Forcing programmers to adjust to zero-based arrays is the least of the logical problems in pointer languages. But how did zero-based arrays get into high-level language pretenders like Java and C#?
When Microsoft language designers developed a new language based on Java (although they deny it) and then changed Visual Basic to be as compatible with C# as possible, they had several choices on how to implement arrays:

1. Impose C-style zero-based arrays on every .NET language, including Visual Basic.
2. Make Basic-style arrays with user-defined bases the standard for all languages.
3. Make zero-based arrays the standard, but allow individual languages to support other bases as an extension.

They made the worst possible choice, 1, instead of the right thing, 2. Microsoft apologists may claim they couldn't have done 3 and still remain compatible with the Common Language Runtime (CLR), but I don't believe it. If Microsoft can't even make Basic be Basic, what credibility can they have when they try to persuade language vendors to port other languages to .NET?
To add insult to injury, the conversion tool that fails to convert VB6 code to VB.NET takes the following input:
Dim ai(1 To 10) As Integer
And produces this:
Public ai As Object = New VB6.Array(GetType(Short), 1, 10)
Apparently there is a compatibility class in the VB6 namespace that can fake a real Basic array, and the update utility assumed I would want to use this bizarre and crippled feature rather than cursing and redesigning my array to fit the new language. Not only are the array objects weird, but they don't work, at least not with Option Strict, because of the late binding.
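If I did curse and redesign, the result would be something like this minimal sketch, assuming a plain offset is acceptable (the squares are just filler data):

' A zero-based replacement for: Dim ai(1 To 10) As Integer
Dim ai(9) As Integer    ' ten elements, indexes 0 through 9
Dim i As Integer
For i = 1 To 10
    ai(i - 1) = i * i   ' adjust the base by hand at every access
Next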
Every cloud has a silver lining. The stupidity of eliminating user-defined array bases is more than offset by the good sense of allowing compile-time initialization of arrays. I can now write this:
Public asFruit() As String = { "Apple", "Orange", "Pear", "Banana" }
I just put the strings in the array, and the compiler automatically figures out how many of them there are just like in civilized languages. It took ten years, but we can finally initialize our variables.
Visual Basic.NET adds a long overdue feature called delegates, but like so many new features in .NET, the idea is better than the design. The delegate syntax doesn't quite fit the Basic philosophy.
Delegates are similar to what C calls function pointers, but what high-level languages call procedure variables or procedure parameters. FORTRAN and Pascal have had versions of this feature for many years. The first edition of Hardcore Visual Basic showed a couple of gross hacks to fake procedure variables, and the second edition showed a less gross (but still complicated) hack based on interfaces. My guess is that under the surface delegates encapsulate the interface hack I recommended under VB5 and VB6.
I'm not going to get into the details of how delegates work. I'll just say that the new .NET event model is based on them, and they are also used to launch threads. You can pass them as parameters, store them in arrays, and do all the other things people have done in other languages for years, plus a few things that other languages don't do.
Although delegates are cool, the syntax for using them sucks. Every time you use a delegate you are reminded of a simple fact: "I am using a high-level language designed by people who don't understand the concept of a high level language."
I'm referring to AddressOf in the syntax for assigning delegates:
' Assign the DestroyCars procedure having the DestroyProcs signature
' to the destroyer delegate variable
destroyer = New DestroyProcs(AddressOf DestroyCars)
Regardless of what the syntax implies, you are not assigning the address of DestroyCars to the destroyer delegate variable. You are assigning DestroyCars itself, with all its parameters and its return type if it is a function.
There is no such thing as an address in a high-level language. Addresses only exist in low-level languages such as assembler and C. That's why AddressOf was such an abomination in VB5/6. It really did introduce a typeless assembly language feature to a typed high-level language. You really were passing the address and you really would crash if you passed the wrong one. But the whole point of high-level languages is that you're not supposed to know or care that addresses exist.
The delegate feature is type safe. You can't assign a DestroyCars procedure to a delegate variable unless that variable has a compatible type such as DestroyProcs. So why give a safe new typed feature the name of a dangerous typeless feature from the past? If AddressOf is intended as a metaphor, it's a bad one. A random word such as TelephoneNumberOf or SocialSecurityNumberOf would make as much sense. If you wanted to use a good metaphor, why not call the operator SignatureOf? But if you wanted to be accurate and eliminate worthless noise, get rid of the extra keyword.
destroyer = New DestroyProcs(DestroyCars)
You're specifying the procedure itself, not the anything of the procedure. The language should know in context what this means without any operator.
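For what it's worth, here's roughly how the fragments above fit together into a complete program. The declaration of DestroyProcs doesn't appear above, so I'm assuming a Sub taking a single String:

Delegate Sub DestroyProcs(ByVal sTarget As String)

Module MDestroy
    Sub DestroyCars(ByVal sTarget As String)
        Console.WriteLine("Destroying " & sTarget)
    End Sub

    Sub Main()
        ' Assign the DestroyCars procedure to the delegate variable...
        Dim destroyer As DestroyProcs
        destroyer = New DestroyProcs(AddressOf DestroyCars)
        ' ...and invoke the procedure through the variable
        destroyer("DeLorean")
    End Sub
End Module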
Once again I'm off on a useless rant about symbolism and metaphors. The delegate feature itself works fine; so why rage about the syntax? Imagine you are a non-English-speaking Finnish programmer trying to learn this new language. None of the keywords make any sense to you anyway. You have to learn them by rote and apply meanings in your own language. What difference does it make that the keyword for declaring variables, Dim, seems to have been chosen by opening a dictionary at random? Senseless words like Sub and Multicast mean as much to you as sensible ones like If and Function.
Most people in this world don't speak English, so why make such a big deal about Visual Basic.NET using meaningless metaphors? It's culturally insensitive to design languages in English anyway. I don't know why they didn't design it in Latin so that no one would have an unfair advantage.
I could go on and on about the cool new features in Visual Basic.NET and the stupid and insulting way in which many of them were implemented. Why did Microsoft make so many risky changes in its most popular language? They state one of the reasons right up front:
"One of our major goals was to ensure Visual Basic code could fully interoperate with code written in other languages, such as Microsoft Visual C# or Microsoft Visual C++..."
Language interoperability was probably among Microsoft's top three priorities in designing the .NET languages. It might be among the top 300 priorities that Visual Basic programmers requested for the next version of their language. This difference in opinion causes a certain amount of tension.
Frankly, Microsoft can't get away with blaming all their petty changes on interoperability. Some changes seem almost like calculated insults. The designers of VB.NET, being C++ programmers, may simply be saying "#$!%& you, Basic scum. We're not bowing to your stupid prejudices just because there are ten times more of you than of us." Or they may be just breaking things that are unfamiliar to them because they don't understand or care about our language history and culture.
I've been accused of unfairly trying to read the minds of VB.NET designers. Maybe so, although I think you can interpret people's attitudes from what they produce. People certainly interpreted my personality from the text of my book, and many of them seemed to do so accurately. But if I'm wrong about the VB.NET designers, if they really are VB programmers who love Basic and have the spirit of the language designers, Kemeny and Kurtz, then so much the worse. These results could only be explained by incompetence and stupidity rather than ignorance and arrogance.
My generally negative attitude doesn't mean I don't have anything nice to say about VB.NET. I can be as stubborn and contrary as Microsoft, and I'm going to defend some changes that are driving other people crazy. The following list of changes and incompatibilities is taken from Karl Peterson's Visual Fred page.
I'm just going to hit a few low and high points to get them off my chest:
The syntax for properties has been randomly changed for reasons no Visual Basic programmer can imagine. Here's the old syntax:
Property Get Thing() As Integer
    Thing = iThing
End Property

Property Let Thing(iThingA As Integer)
    iThing = iThingA
End Property
Here's the new syntax:
Property Thing() As Integer
    Get
        Thing = iThing
    End Get
    Set
        iThing = Value
    End Set
End Property
What is the purpose other than to break existing code? The change is purely one of style. There is little technical advantage to either syntax. Some Microsoft apologists claim that this system is necessary for compatibility with the Common Language Specification (CLS), but this is obvious nonsense to anyone with an understanding of how language parsers work. Furthermore, Microsoft insiders have told me that the CLS designers deny any such necessity. If a syntax like this couldn't be processed, .NET would have little chance of ever working with non-Microsoft languages such as Cobol and Eiffel, which Microsoft claims will soon have versions compatible with the .NET framework.
Microsoft critics point out that the new syntax is actually less powerful than the old because visibility qualifiers such as Friend, Private, and Public are applied to the whole property, not to the Get or Set. It's impossible to make the Get Public and the Set Friend. While this may be true now, there's no reason Microsoft couldn't make minor adjustments to the new syntax to allow this functionality.
From a technical standpoint, the two syntaxes are equivalent. So why make such a capricious change? Programmers deal with so much substantive change in technology that they cling to stability in the minor features. Wise language designers save their surprises for important matters rather than wasting change quotas on trivialities.
For real programmers, the new property syntax is insulting and annoying, but otherwise causes few problems. But the new syntax causes real problems for sample programmers. The new property syntax takes an extra column of horizontal space. VB.NET already adds two extra levels of indentation. Functions and Subs that used to start at the leftmost column now start at the third level of indentation inside the Namespace and Class or Module blocks. The code for properties starts at the fourth indentation level. This makes it more likely that your code will either extend out of sight to the right, or that you'll have to wrap lines.
Book authors have to fit their sample code into less than 80 columns, and magazine authors have even less space to work with. I know that C++ and Java authors have been dealing with this sort of thing all along, but they have the advantage of working with languages that aren't line-based. They can wrap their code almost anywhere they like without worrying about the ugly VB line-break character. The old property syntax is much more compact where it matters.
To make matters even worse, the new syntax also adds two lines of vertical space. Vertical space isn't as precious as horizontal space, but authors still don't like to waste it.
One of the most in-your-face changes in VB.NET is renaming Integer to Short and Long to Integer. Criticism of this change has been so hot that whole Internet nodes have been fried crispy. The only non-Microsoft voice defending this change is mine. I hear the crackling sound of millions of lines of code being broken, but I don't care.
Most of that code would not be breaking if Microsoft had taken my advice during the beta of VB4. I told VB program managers then that it was crazy to name a 32-bit integer Long under a 32-bit operating system. I told them it was crazy to give the default name Integer to 16-bit integer types that careful programmers would almost never use. It gives me great satisfaction now to say, "I told you so."
This is as if I had discovered a new species of butterfly and decided to name it the Blue Angel. One of my colleagues points out a minor problem with the name:
"Why are you calling it the Blue Angel? This butterfly is red."
"I don't care. I discovered it, and I've always liked the name Blue Angel. Blue was my mother's favorite color."
"But it's red."
"I don't care. If Microsoft can name their 32-bit integer Long when it's not long, I can name my red butterfly the Blue Angel."
So get over it. Short is the name for 16-bit integers. Integer is the name for 32-bit integers. Long is the name for 64-bit integers. And gone are the names Variant and Currency. Of course removing these two types is going to break some code, including some of mine. But I'm not complaining. I used Variant occasionally for parameters that might be either String or Integer, but I could do the same thing more cleanly in VB.NET with function overloading. I never used Currency, except for gross hacks that VB.NET makes unnecessary. The underlying storage format of Date has also changed, but again I don't care.
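For the record, a quick sketch of the new sizes, plus the overloading idiom I'd use in place of a Variant parameter (the Show procedures are hypothetical):

Module IntegerDemo
    ' Overloading replaces a Variant parameter that might be
    ' either String or Integer
    Sub Show(ByVal v As String)
        Console.WriteLine(v)
    End Sub

    Sub Show(ByVal v As Integer)
        Console.WriteLine(CStr(v))
    End Sub

    Sub Main()
        Dim i16 As Short = 123             ' 16 bits (what VB6 called Integer)
        Dim i32 As Integer = 123456        ' 32 bits (what VB6 called Long)
        Dim i64 As Long = 123456789012     ' 64 bits (new)
        Show("a string")                   ' picks the String overload
        Show(i32)                          ' picks the Integer overload
    End Sub
End Module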
I know many programmers have good reason for being more concerned about these changes than I am. There will certainly be pain. But pain can be worthwhile if the new system is more flexible and makes more sense.
In VB.NET you always have to qualify Enum members with the name of the Enum -- a major syntactical change that appears to go unmentioned in the beta documentation. This breaks every line of old Enum code I have. More important, it removes most of my reasons for using Enums. In rewriting my code, I would not add the Enum qualification as the update utility suggests. I'd change the Enums to Consts. In short, this is a ridiculous change that adds nothing to the language except inconvenience. If they think this adds some sort of safety, they should have added an Option Anal and required the qualification only for programmers who choose it.
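To illustrate with a hypothetical Enum:

Enum Fruit
    Apple
    Orange
End Enum

Dim f As Fruit
f = Fruit.Apple   ' VB.NET insists on the qualifier
' f = Apple       ' the old unqualified form is now an error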
I try to picture a committee of language designers sitting around a conference table deciding how to cripple their language. But I lack imagination. I can't think of a reasonable scenario for removing static local variables. Why would anyone want to deliberately make our code less modular and readable? All I can come up with is that the Delphi people must have got to them with some sort of voodoo hex. I recently heard that the VB.NET designers have come to their senses and put static locals back in the language. If this is true, I welcome them back to their right minds.
Much blood has been spilt on VB user groups over Microsoft's initial decision to make VB.NET's logical operators compatible with C logical operators rather than with VB logical operators. Later more blood flowed when this decision was rescinded in favor of a return to VB compatibility. Before I risk life and limb by taking a position in this controversy, I want to call your attention to an equally important issue. I want to mourn the loss of two operators that were uniquely ours: Imp and Eqv.
Now I admit that I never used these operators and only encountered one person who did. To this day I have no idea what they meant, except that they were a symbol of our uniqueness. No other language had them. When other programmers tried to kick sand in our faces, we could point to the powerful Imp and Eqv operators. We would sniff in disgust rather than answer when they asked us what these operators did.
Now a bunch of foreigners with no understanding of our culture and traditions have removed unique features from our language merely because we didn't use them. And they didn't even have the grace to replace our symbols with something useful, such as the Shl and Shr operators provided by almost every other programming language.
Let's get back to the equally important issue of logical operators, starting with some background and history. All modern programming languages have two kinds of operators for evaluating logical operations--bitwise and logical. Bitwise operators work on the individual bits of their arguments. Logical operators work on whether the sum of the bits is zero or not. In simple conditional tests, both kinds of operators give the same results. Logical operators can be more efficient because they automatically skip the second test in compound expressions if the first expression settles the question (this is called short-circuiting). Historically some languages (FORTRAN, Pascal, Basic) use bitwise operators as the primary operators. For technical reasons that I won't get into here, these languages use -1 as the value of the True constant. In contrast, C-based languages use logical operators as the primary operators and use 1 as the value of the True constant.
By the definition above, Microsoft Basics, including VB, were not modern languages because they had bitwise operators, but not logical operators. This is a trivial problem to fix, and I saw proposals to fix it by adding logical operators in the spec for a prior version of Visual Basic (VB5, I think). Like most proposals for simple language enhancements, this proposal died before being implemented in a vain attempt to provide more development time for major features.
In the first beta version of VB.NET, Microsoft added logical operators, but gave them the old names And and Or formerly used by the old bitwise operators. The old operators were renamed to BitAnd and BitOr. The value of the constant True was changed to 1 for compatibility with C-based languages such as C#. Microsoft made the foolish claim that these changes were necessary for compatibility with the Common Language Runtime, although obviously any runtime library that depended on a particular value for True would be broken. This angered a lot of VB programmers who saw that the changes would cause bugs in old code that assumed And and Or to be bitwise.
Microsoft heard these loud complaints and for the second beta changed the meaning of And and Or back to bitwise. They added new logical operators with the peculiar names AndAlso and OrElse. This change was met by howls of protest of approximately the same volume as the earlier ones against the first change. Anyone who takes a position on these operators risks insult and injury from one side or the other, and if the position taken is equivocal, risks attack from both sides. Such danger has, of course, never stopped me from expressing an opinion.
From a technical standpoint this controversy is much ado about nothing. Under either system programmers will have access to both bitwise and logical operators. It makes no difference technically whether the True constant is -1, 1, or 13. When writing new code you'd probably use logical operators 80 percent of the time and bitwise 20 percent of the time. Your code would work the same regardless of the names of the operators. Warnings about the end of civilization from proponents of either side are hyperbole. In most cases it makes no difference.
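A minimal sketch of the Beta 2 arrangement, as I understand it:

Dim i As Integer = 6, j As Integer = 3
' And is bitwise again: it operates on individual bits
Console.WriteLine(i And j)   ' 2 (binary 110 And 011 = 010)
' AndAlso is logical: it short-circuits, skipping the second
' test when the first settles the question
If j <> 0 AndAlso i \ j > 1 Then Console.WriteLine("safe division")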
The case for the old system is practical and cultural. You could leave your old code unchanged without any loss of correctness. You might want to gradually go through and change some bitwise operators to logical operators in bottleneck code to maximize efficiency. By contrast in the system used in Beta 1, you'd have to go through all your code and change any expressions that work on bits to use the new bitwise operators. Bitwise expressions would produce garbage if they failed to use BitAnd and BitOr. You'd have to modify these expressions one at a time. There is no mechanical way to tell the difference between logical and bitwise expressions. The Beta 1 update utility proved this by trashing all questionable expressions, making them unintelligible and inefficient with compatibility operators.
Why risk breaking code by changing a perfectly workable system that has a long history in Basic culture? The only explanation is cultural imperialism. The people redesigning our language simply want to impose their own religion. This is the first step toward curly braces. People who have to fix unnecessary bugs caused by this change won't forgive it easily.
On the other hand, the case for a C-style system is esthetic. Many logical expressions work more efficiently with logical rather than bitwise operators. It's distasteful to write most of your code using AndAlso and OrElse. Perhaps these names make sense in some twisted way, but the resulting code is going to be ugly. Proponents of the new system posted a very amusing thread on one of the VB.NET newsgroups in which they proposed other fitting operators such as ThenAgain, MaybeNot, ButAlso, CouldBe, WhoKnows, and ForSure. I would have preferred different names for the new logical operators, but you can't criticize their names too strongly unless you have better ones. I don't.
My dumb opinion is that Microsoft made a dumb mistake in changing the operators in the first dumb beta. They tried to justify that dumb mistake by claiming the change was for compatibility when it actually had nothing to do with compatibility. Now they have rolled back those dumb mistakes, but to prevent anyone from thinking they really know what they're doing, they gave the new logical operators dumb new names. Microsoft now implies that they made major sacrifices in order to respond to criticism from their dumb user base. In fact the changes are minor and will have little effect on anything other than a few million lines of dumb old code.
Back in the old days of COM, you knew when an object was terminating and when it wasn't. Wait a minute! The old days? Wasn't it just yesterday we were talking about COM in religious terms? Weren't Microsoft COM evangelists telling us that their classes, once defined, would last until the mountains crumbled to the sea? And now they laugh at our shrines to IUnknown and ridicule the priests of reference counting.
Basically Microsoft has finally figured out what some of the rest of us suspected, but dared not say aloud: COM sucks. It's too complicated, too hard to master. Still, I never expected to see the whole architecture thrown to the dogs. But that is precisely what has happened to COM reference counting. It's gone. And with it goes being able to determine when an object is destroyed.
Rivers of bits have been expended discussing this fundamental change on the .NET beta forums. I'm not going to get into it except to say that the new system does not rely on reference counting and does not suffer from the circular references that sometimes drove COM programmers (particularly VB ones) crazy. The tradeoff in the new system is that the memory system uses unpredictable techniques to determine when an object is dead (unreferenced) and can be cleaned up.
You can define a Destruct method that cleans up the resources your object creates, but you can't reliably predict when the object will be destroyed and your destructor called by the system. Most of us are willing to let the system take care of its own resources whenever it wants, but we're a little bit nervous when we can't predict when our own resources (such as files and connections) will be released. Lots of VB programmers are really upset about this change. Its effect on VB6 code ported to VB.NET will be subtle and hard to predict.
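In the framework class library I've seen, this destructor hook shows up as the Finalize method; a minimal sketch of the kind of cleanup code in question:

Class CConnection
    ' Called by the garbage collector at some unpredictable
    ' time after the object becomes unreferenced
    Protected Overrides Sub Finalize()
        ' Release files, database connections, and so on
        MyBase.Finalize()
    End Sub
End Class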
When predictable destruction disappeared, it took with it one of the coolest tricks in Visual Basic--auto-reinstantiation of released objects. You could declare a variable like this:
Dim apple As New CFruit
The apple object would be magically created any time you used it, and if you destroyed it by setting it to Nothing, it would be magically created again if you used it again. But if there's no reliable way to destroy an object, there can be no reliable way to reinstantiate it. So the code above has a completely different meaning in VB.NET.
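A sketch of the difference, using the CFruit declaration above (the Ripen method is hypothetical):

Dim apple As New CFruit   ' VB.NET: creates exactly one CFruit, right here
apple = Nothing           ' release the reference...
' apple.Ripen()           ' ...VB6 would quietly create a new CFruit;
'                         ' VB.NET throws an exception instead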
It's a rare subject that I don't have an opinion on, but I'm afraid we'll have to wait and see from experience whether this radical change in the model for memory cleanup turns out to be a good idea.
I haven't used a While/Wend loop since BASICA, and I told readers of my book not to use them either. Do/Loop is much more flexible and elegant. Fortunately my readers are a stubborn bunch. I often got mail from people who praised my book to the ends of heaven and then showed me samples demonstrating that they hadn't read it (or had ignored my advice).
So I suppose I could say 'serves you right' to all those people whose code will be broken because Microsoft changed the end of loop syntax from Wend to End While.
But I'm still confused by the change. If While/Wend is evil and unnecessary (which I think it is), then get rid of it. Do While/Loop is equivalent. If eliminating the feature would break too much code, then keep it for compatibility.
But what's the point of keeping it with a new spelling? You infuriate the While/Wend fans and disgust the Do/Loop fans without satisfying either.
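For reference, the respelled loop:

Dim i As Integer
While i < 10
    i += 1
End While    ' VB6 spelled this line Wend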
The insulting language changes described here (and many others not described) are actually a fairly small part of VB.NET. Most of them could be changed for the second beta. Perhaps Microsoft has heard our cries and is fixing these design bugs as I write. But frankly, I'm not going to be using VB.NET no matter what they do to fix it. The reasons are based mostly on the politics of corporate competition.
First, if I were going to use a .NET language, it would be C#, not VB.NET. Unlike many VB programmers, I am not addicted to the Basic syntax. I actually like curly braces. I prefer free-form languages to line-oriented languages. I like the terseness of C-style operators. I don't mind the extra parentheses and semicolons.
The things I don't need to deal with in my day-to-day programming are pointers, memory management, message loops, and multiple levels of mental indirection. If a C-style language can give me simple programming, I have no objection.
Now Microsoft has leveled the playing field by throwing away VB's unfair advantages. No more edit-and-continue. No more p-code. No more immediate window in design mode. No more magic. VB.NET is just another compiled language with wizards to automate the difficult parts. I don't see any significant reason to pick VB.NET over C#.
If you've got to learn a new language, why not start with a really new language rather than one with just enough similarities to be confusing? You could probably write a VB to C# translator that would be more accurate and less confusing than the VB to VB.NET translator.
Besides, this beta has made it clear beyond any doubt that Microsoft's heart (if it has one) is in C#, not in VB.NET. Maybe the release version will have samples written in VB.NET, but they'll be the C# samples translated, not real Basic samples written by Basic programmers who like Basic.
So if I were going to program for .NET, I'd pick C#. But of course I'm not. C# and VB.NET are not general-purpose languages. They are, for the present, Microsoft languages. A bet on one of these languages is a bet on Microsoft and its ability to pressure or entice other companies into accepting its .NET vision. It's like betting that the flies will fall in love with the spider. Maybe they will, but if so, I don't want to be involved. It's too kinky for me.
If I were going to adopt the vision of a large company, it would be Sun rather than Microsoft. I don't see any significant difference between Java with its associated technologies and the emerging .NET vision. I don't want to argue about the advantages or disadvantages of writing for the .NET CLR over writing for Java byte codes and the Java virtual machine. Sure, the details are different, but they're both trying to get to the same place. And Sun has a big head start, and less of a reputation for stabbing its competitors. Some Microsoft proponents argue this point, but it's kind of like debating whether Hitler was worse than Stalin.
What I really want is platform industry standards independent of big companies. Perhaps I'm indulging in weak-minded nostalgia, but I long for the old days when languages were created by small companies (like Microsoft in 1984). I remember when your choice of language was based on technical criteria rather than political ones. I remember when everybody compiled to native code and you chose the language based on its features, not on its output format. I remember when vendors tried to sell customers their languages, not their entire worldviews.
Well, those simple days are gone. We must all make difficult language choices. For some, a .NET language will be an attractive and sensible choice. For others there won't be any real choice for political or economic reasons. Some will reasonably choose to go elsewhere. Some will choose VB.NET for the exact same reasons I reject it. All I can suggest is that this is the time to rethink everything. The obvious upgrade to Visual Basic.NET is by no means obvious.
Bruce McKinney