Sunday, October 18, 2009

TIPS: Fastest way to concatenate multiple strings in VB.NET

I wondered why my code ran so much slower than my colleague's. Finally I found the cause: it was the way I concatenated multiple strings. I had always used the "&=" operator. While that is a bit faster than the "+=" operator, it can still be on the order of 1000 times slower than using the StringBuilder class!

Others have benchmarked this comparison, and it's really true!

Always use StringBuilder when concatenating multiple strings, especially when the text is not small. Remember this!
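A minimal sketch of the difference (the loop count and variable names here are illustrative, not from an actual benchmark):

```vbnet
Imports System.Text

Module ConcatDemo
    Sub Main()
        ' Slow: "&=" allocates a brand-new string on every pass,
        ' so total work grows roughly with the square of the length.
        Dim slow As String = ""
        For i As Integer = 1 To 10000
            slow &= "x"
        Next

        ' Fast: StringBuilder appends into a growable internal buffer
        ' and only materialises the final string once.
        Dim sb As New StringBuilder()
        For i As Integer = 1 To 10000
            sb.Append("x")
        Next
        Dim fast As String = sb.ToString()

        ' Both produce the same 10000-character result.
        Console.WriteLine(slow.Length = fast.Length)
    End Sub
End Module
```

The gap comes from .NET strings being immutable: each "&=" copies everything concatenated so far, while StringBuilder amortises the copying.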

Friday, August 7, 2009

MCTS 70-620 Exam

Alhamdulillah, I just came back from the exam. I passed it with quite a good score...

One good question: am I now really a Microsoft Certified Technology Specialist, specialising in Vista configuration? ...ermmm..

Wednesday, August 5, 2009

VISTA: Top 10 worst things (in my case)


Not all new technology is better than its predecessor. Anybody who has used XP and was then forced (or 'voluntarily' chose) to move to Vista will understand this very well. I had read from many sources and heard from many people about the worst things Vista has 'offered'. I didn't really care until I started using it myself, and now I experience it more and more deeply.

Not everything about Vista is bad; there are some good things. Unfortunately, those good things are really, really NOTHING compared to the bad things Vista can give you!

What are those bad things? Just keep reading... I'll start with the ones that affect my daily work the most.
1) Less backward compatibility
1 - You can't use Visual Studio 2003 on Vista to build applications. Ironically, you can still use Visual Basic 6... what the heck!

2 - Visual Studio 2005 is not stable on Vista. The application frequently terminates without any apparent cause (I don't know what triggers it). So save your work every 30 seconds so that you won't lose any of it. This is the No. 1 worst problem I'm facing on Vista. One example:

"Visual Studio has stopped working, now looking for a solution..." It never finds one anyway.

3 - How about Compatibility Mode? It's a good feature, but unfortunately 'good' doesn't mean it works: most of the time it doesn't work at all. I think maybe Microsoft just wanted to introduce the feature; whether it worked or not didn't matter, as long as people knew about it. (This might well be true, since Compatibility Mode in Windows 7 reportedly works much better.)

2) "Where has all my hard disk space gone..??"

The C: drive's space is shrinking extremely fast, and I can't even find the culprit. Worse, the usage reported in "Computer" looks completely different from what I get by selecting everything inside the C: drive (Ctrl+A, right-click, Properties). For example, "Computer" shows 95.8 GB used, but the Ctrl+A/right-click total inside C: is only 44.2 GB: a difference of more than double!

The list will be updated... believe me, there is a lot more to add. I just can't think any more right now because I feel like killing the Vista developers!!!!

Harggggghhhhhhhhhhhhhh!!!!!

Wednesday, July 22, 2009

XP, Vista and Windows 7

There have been so many interesting things happening these past few days, mostly about operating systems.

Vista
I've completed my Vista course (installing and configuring the Windows Vista client, for exam 70-620).
Interestingly, a day later I received a Dell Vostro 1520 laptop with Vista Business pre-loaded.

XP
Most people are now very sceptical about Vista. Yes, I know: there are bad things about it, and also a lot of good things. XP is still the comfortable and 'safest' choice for most people (which might include me).


Windows 7
Counting down ... :-)

Wednesday, April 29, 2009

The Humble Programmer
by
Edsger W. Dijkstra

As a result of a long sequence of coincidences I entered the programming profession officially on the first spring morning of 1952 and as far as I have been able to trace, I was the first Dutchman to do so in my country. In retrospect the most amazing thing was the slowness with which, at least in my part of the world, the programming profession emerged, a slowness which is now hard to believe. But I am grateful for two vivid recollections from that period that establish that slowness beyond any doubt.

After having programmed for some three years, I had a discussion with A. van Wijngaarden, who was then my boss at the Mathematical Centre in Amsterdam, a discussion for which I shall remain grateful to him as long as I live. The point was that I was supposed to study theoretical physics at the University of Leiden simultaneously, and as I found the two activities harder and harder to combine, I had to make up my mind, either to stop programming and become a real, respectable theoretical physicist, or to carry my study of physics to a formal completion only, with a minimum of effort, and to become....., yes what? A programmer? But was that a respectable profession? For after all, what was programming? Where was the sound body of knowledge that could support it as an intellectually respectable discipline? I remember quite vividly how I envied my hardware colleagues, who, when asked about their professional competence, could at least point out that they knew everything about vacuum tubes, amplifiers and the rest, whereas I felt that, when faced with that question, I would stand empty-handed. Full of misgivings I knocked on van Wijngaarden's office door, asking him whether I could "speak to him for a moment"; when I left his office a number of hours later, I was another person. For after having listened to my problems patiently, he agreed that up till that moment there was not much of a programming discipline, but then he went on to explain quietly that automatic computers were here to stay, that we were just at the beginning and could not I be one of the persons called to make programming a respectable discipline in the years to come? This was a turning point in my life and I completed my study of physics formally as quickly as I could. One moral of the above story is, of course, that we must be very careful when we give advice to younger people; sometimes they follow it!

Another two years later, in 1957, I married and Dutch marriage rites require you to state your profession and I stated that I was a programmer. But the municipal authorities of the town of Amsterdam did not accept it on the grounds that there was no such profession. And, believe it or not, but under the heading "profession" my marriage act shows the ridiculous entry "theoretical physicist"!

So much for the slowness with which I saw the programming profession emerge in my own country. Since then I have seen more of the world, and it is my general impression that in other countries, apart from a possible shift of dates, the growth pattern has been very much the same.

Let me try to capture the situation in those old days in a little bit more detail, in the hope of getting a better understanding of the situation today. While we pursue our analysis, we shall see how many common misunderstandings about the true nature of the programming task can be traced back to that now distant past.

The first automatic electronic computers were all unique, single-copy machines and they were all to be found in an environment with the exciting flavour of an experimental laboratory. Once the vision of the automatic computer was there, its realisation was a tremendous challenge to the electronic technology then available, and one thing is certain: we cannot deny the courage of the groups that decided to try and build such a fantastic piece of equipment. For fantastic pieces of equipment they were: in retrospect one can only wonder that those first machines worked at all, at least sometimes. The overwhelming problem was to get and keep the machine in working order. The preoccupation with the physical aspects of automatic computing is still reflected in the names of the older scientific societies in the field, such as the Association for Computing Machinery or the British Computer Society, names in which explicit reference is made to the physical equipment.

What about the poor programmer? Well, to tell the honest truth: he was hardly noticed. For one thing, the first machines were so bulky that you could hardly move them and besides that, they required such extensive maintenance that it was quite natural that the place where people tried to use the machine was the same laboratory where the machine had been developed. Secondly, his somewhat invisible work was without any glamour: you could show the machine to visitors and that was several orders of magnitude more spectacular than some sheets of coding. But most important of all, the programmer himself had a very modest view of his own work: his work derived all its significance from the existence of that wonderful machine. Because that was a unique machine, he knew only too well that his programs had only local significance and also, because it was patently obvious that this machine would have a limited lifetime, he knew that very little of his work would have a lasting value. Finally, there is yet another circumstance that had a profound influence on the programmer's attitude to his work: on the one hand, besides being unreliable, his machine was usually too slow and its memory was usually too small, i.e. he was faced with a pinching shoe, while on the other hand its usually somewhat queer order code would cater for the most unexpected constructions. And in those days many a clever programmer derived an immense intellectual satisfaction from the cunning tricks by means of which he contrived to squeeze the impossible into the constraints of his equipment.

Two opinions about programming date from those days. I mention them now; I shall return to them later. The one opinion was that a really competent programmer should be puzzle-minded and very fond of clever tricks; the other opinion was that programming was nothing more than optimizing the efficiency of the computational process, in one direction or the other.

The latter opinion was the result of the frequent circumstance that, indeed, the available equipment was a painfully pinching shoe, and in those days one often encountered the naive expectation that, once more powerful machines were available, programming would no longer be a problem, for then the struggle to push the machine to its limits would no longer be necessary and that was all what programming was about, wasn't it? But in the next decades something completely different happened: more powerful machines became available, not just an order of magnitude more powerful, even several orders of magnitude more powerful. But instead of finding ourselves in the state of eternal bliss of all programming problems solved, we found ourselves up to our necks in the software crisis! How come?

There is a minor cause: in one or two respects modern machinery is basically more difficult to handle than the old machinery. Firstly, we have got the I/O interrupts, occurring at unpredictable and irreproducible moments; compared with the old sequential machine that pretended to be a fully deterministic automaton, this has been a dramatic change and many a systems programmer's grey hair bears witness to the fact that we should not talk lightly about the logical problems created by that feature. Secondly, we have got machines equipped with multi-level stores, presenting us problems of management strategy that, in spite of the extensive literature on the subject, still remain rather elusive. So much for the added complication due to structural changes of the actual machines.

But I called this a minor cause; the major cause is... that the machines have become several orders of magnitude more powerful! To put it quite bluntly: as long as there were no machines, programming was no problem at all; when we had a few weak computers, programming became a mild problem, and now we have gigantic computers, programming has become an equally gigantic problem. In this sense the electronic industry has not solved a single problem, it has only created them, it has created the problem of using its products. To put it in another way: as the power of available machines grew by a factor of more than a thousand, society's ambition to apply these machines grew in proportion, and it was the poor programmer who found his job in this exploded field of tension between ends and means. The increased power of the hardware, together with the perhaps even more dramatic increase in its reliability, made solutions feasible that the programmer had not dared to dream about a few years before. And now, a few years later, he had to dream about them and, even worse, he had to transform such dreams into reality! Is it a wonder that we found ourselves in a software crisis? No, certainly not, and as you may guess, it was even predicted well in advance; but the trouble with minor prophets, of course, is that it is only five years later that you really know that they had been right.

Then, in the mid-sixties, something terrible happened: the computers of the so-called third generation made their appearance. The official literature tells us that their price/performance ratio has been one of the major design objectives. But if you take as "performance" the duty cycle of the machine's various components, little will prevent you from ending up with a design in which the major part of your performance goal is reached by internal housekeeping activities of doubtful necessity. And if your definition of price is the price to be paid for the hardware, little will prevent you from ending up with a design that is terribly hard to program for: for instance the order code might be such as to enforce, either upon the programmer or upon the system, early binding decisions presenting conflicts that really cannot be resolved. And to a large extent these unpleasant possibilities seem to have become reality.

When these machines were announced and their functional specifications became known, quite a few among us must have become quite miserable; at least I was. It was only reasonable to expect that such machines would flood the computing community, and it was therefore all the more important that their design should be as sound as possible. But the design embodied such serious flaws that I felt that with a single stroke the progress of computing science had been retarded by at least ten years: it was then that I had the blackest week in the whole of my professional life. Perhaps the most saddening thing now is that, even after all those years of frustrating experience, still so many people honestly believe that some law of nature tells us that machines have to be that way. They silence their doubts by observing how many of these machines have been sold, and derive from that observation the false sense of security that, after all, the design cannot have been that bad. But upon closer inspection, that line of defense has the same convincing strength as the argument that cigarette smoking must be healthy because so many people do it.

It is in this connection that I regret that it is not customary for scientific journals in the computing area to publish reviews of newly announced computers in much the same way as we review scientific publications: to review machines would be at least as important. And here I have a confession to make: in the early sixties I wrote such a review with the intention of submitting it to the CACM, but in spite of the fact that the few colleagues to whom the text was sent for their advice, urged me all to do so, I did not dare to do it, fearing that the difficulties either for myself or for the editorial board would prove to be too great. This suppression was an act of cowardice on my side for which I blame myself more and more. The difficulties I foresaw were a consequence of the absence of generally accepted criteria, and although I was convinced of the validity of the criteria I had chosen to apply, I feared that my review would be refused or discarded as "a matter of personal taste". I still think that such reviews would be extremely useful and I am longing to see them appear, for their accepted appearance would be a sure sign of maturity of the computing community.

The reason that I have paid the above attention to the hardware scene is because I have the feeling that one of the most important aspects of any computing tool is its influence on the thinking habits of those that try to use it, and because I have reasons to believe that that influence is many times stronger than is commonly assumed. Let us now switch our attention to the software scene.

Here the diversity has been so large that I must confine myself to a few stepping stones. I am painfully aware of the arbitrariness of my choice and I beg you not to draw any conclusions with regard to my appreciation of the many efforts that will remain unmentioned.

In the beginning there was the EDSAC in Cambridge, England, and I think it quite impressive that right from the start the notion of a subroutine library played a central role in the design of that machine and of the way in which it should be used. It is now nearly 25 years later and the computing scene has changed dramatically, but the notion of basic software is still with us, and the notion of the closed subroutine is still one of the key concepts in programming. We should recognise the closed subroutine as one of the greatest software inventions; it has survived three generations of computers and it will survive a few more, because it caters for the implementation of one of our basic patterns of abstraction. Regrettably enough, its importance has been underestimated in the design of the third generation computers, in which the great number of explicitly named registers of the arithmetic unit implies a large overhead on the subroutine mechanism. But even that did not kill the concept of the subroutine, and we can only pray that the mutation won't prove to be hereditary.

The second major development on the software scene that I would like to mention is the birth of FORTRAN. At that time this was a project of great temerity and the people responsible for it deserve our great admiration. It would be absolutely unfair to blame them for shortcomings that only became apparent after a decade or so of extensive usage: groups with a successful look-ahead of ten years are quite rare! In retrospect we must rate FORTRAN as a successful coding technique, but with very few effective aids to conception, aids which are now so urgently needed that time has come to consider it out of date. The sooner we can forget that FORTRAN has ever existed, the better, for as a vehicle of thought it is no longer adequate: it wastes our brainpower, is too risky and therefore too expensive to use. FORTRAN's tragic fate has been its wide acceptance, mentally chaining thousands and thousands of programmers to our past mistakes. I pray daily that more of my fellow-programmers may find the means of freeing themselves from the curse of compatibility.

The third project I would not like to leave unmentioned is LISP, a fascinating enterprise of a completely different nature. With a few very basic principles at its foundation, it has shown a remarkable stability. Besides that, LISP has been the carrier for a considerable number of in a sense our most sophisticated computer applications. LISP has jokingly been described as "the most intelligent way to misuse a computer". I think that description a great compliment because it transmits the full flavour of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.

The fourth project to be mentioned is ALGOL 60. While up to the present day FORTRAN programmers still tend to understand their programming language in terms of the specific implementation they are working with —hence the prevalence of octal and hexadecimal dumps—, while the definition of LISP is still a curious mixture of what the language means and how the mechanism works, the famous Report on the Algorithmic Language ALGOL 60 is the fruit of a genuine effort to carry abstraction a vital step further and to define a programming language in an implementation-independent way. One could argue that in this respect its authors have been so successful that they have created serious doubts as to whether it could be implemented at all! The report gloriously demonstrated the power of the formal method BNF, now fairly well known as Backus-Naur-Form, and the power of carefully phrased English, at least when used by someone as brilliant as Peter Naur. I think that it is fair to say that only very few documents as short as this have had an equally profound influence on the computing community. The ease with which in later years the names ALGOL and ALGOL-like have been used, as an unprotected trade mark, to lend some of its glory to a number of sometimes hardly related younger projects, is a somewhat shocking compliment to its standing. The strength of BNF as a defining device is responsible for what I regard as one of the weaknesses of the language: an over-elaborate and not too systematic syntax could now be crammed into the confines of very few pages. With a device as powerful as BNF, the Report on the Algorithmic Language ALGOL 60 should have been much shorter. Besides that I am getting very doubtful about ALGOL 60's parameter mechanism: it allows the programmer so much combinatorial freedom, that its confident use requires a strong discipline from the programmer. Besides being expensive to implement, it seems dangerous to use.

Finally, although the subject is not a pleasant one, I must mention PL/1, a programming language for which the defining documentation is of a frightening size and complexity. Using PL/1 must be like flying a plane with 7000 buttons, switches and handles to manipulate in the cockpit. I absolutely fail to see how we can keep our growing programs firmly within our intellectual grip when by its sheer baroqueness the programming language —our basic tool, mind you!— already escapes our intellectual control. And if I have to describe the influence PL/1 can have on its users, the closest metaphor that comes to my mind is that of a drug. I remember from a symposium on higher level programming language a lecture given in defense of PL/1 by a man who described himself as one of its devoted users. But within a one-hour lecture in praise of PL/1, he managed to ask for the addition of about fifty new "features", little supposing that the main source of his problems could very well be that it contained already far too many "features". The speaker displayed all the depressing symptoms of addiction, reduced as he was to the state of mental stagnation in which he could only ask for more, more, more... When FORTRAN has been called an infantile disorder, full PL/1, with its growth characteristics of a dangerous tumor, could turn out to be a fatal disease.

So much for the past. But there is no point in making mistakes unless thereafter we are able to learn from them. As a matter of fact, I think that we have learned so much, that within a few years programming can be an activity vastly different from what it has been up till now, so different that we had better prepare ourselves for the shock. Let me sketch for you one of the possible futures. At first sight, this vision of programming in perhaps already the near future may strike you as utterly fantastic. Let me therefore also add the considerations that might lead one to the conclusion that this vision could be a very real possibility.

The vision is that, well before the seventies have run to completion, we shall be able to design and implement the kind of systems that are now straining our programming ability, at the expense of only a few percent in man-years of what they cost us now, and that besides that, these systems will be virtually free of bugs. These two improvements go hand in hand. In the latter respect software seems to be different from many other products, where as a rule a higher quality implies a higher price. Those who want really reliable software will discover that they must find means of avoiding the majority of bugs to start with, and as a result the programming process will become cheaper. If you want more effective programmers, you will discover that they should not waste their time debugging, they should not introduce the bugs to start with. In other words: both goals point to the same change.

Such a drastic change in such a short period of time would be a revolution, and to all persons that base their expectations for the future on smooth extrapolation of the recent past —appealing to some unwritten laws of social and cultural inertia— the chance that this drastic change will take place must seem negligible. But we all know that sometimes revolutions do take place! And what are the chances for this one?

There seem to be three major conditions that must be fulfilled. The world at large must recognize the need for the change; secondly the economic need for it must be sufficiently strong; and, thirdly, the change must be technically feasible. Let me discuss these three conditions in the above order.

With respect to the recognition of the need for greater reliability of software, I expect no disagreement anymore. Only a few years ago this was different: to talk about a software crisis was blasphemy. The turning point was the Conference on Software Engineering in Garmisch, October 1968, a conference that created a sensation as there occurred the first open admission of the software crisis. And by now it is generally recognized that the design of any large sophisticated system is going to be a very difficult job, and whenever one meets people responsible for such undertakings, one finds them very much concerned about the reliability issue, and rightly so. In short, our first condition seems to be satisfied.

Now for the economic need. Nowadays one often encounters the opinion that in the sixties programming has been an overpaid profession, and that in the coming years programmer salaries may be expected to go down. Usually this opinion is expressed in connection with the recession, but it could be a symptom of something different and quite healthy, viz. that perhaps the programmers of the past decade have not done so good a job as they should have done. Society is getting dissatisfied with the performance of programmers and of their products. But there is another factor of much greater weight. In the present situation it is quite usual that for a specific system, the price to be paid for the development of the software is of the same order of magnitude as the price of the hardware needed, and society more or less accepts that. But hardware manufacturers tell us that in the next decade hardware prices can be expected to drop with a factor of ten. If software development were to continue to be the same clumsy and expensive process as it is now, things would get completely out of balance. You cannot expect society to accept this, and therefore we must learn to program an order of magnitude more effectively. To put it in another way: as long as machines were the largest item on the budget, the programming profession could get away with its clumsy techniques, but that umbrella will fold rapidly. In short, also our second condition seems to be satisfied.

And now the third condition: is it technically feasible? I think it might and I shall give you six arguments in support of that opinion.

A study of program structure has revealed that programs —even alternative programs for the same task and with the same mathematical content— can differ tremendously in their intellectual manageability. A number of rules have been discovered, violation of which will either seriously impair or totally destroy the intellectual manageability of the program. These rules are of two kinds. Those of the first kind are easily imposed mechanically, viz. by a suitably chosen programming language. Examples are the exclusion of goto-statements and of procedures with more than one output parameter. For those of the second kind I at least —but that may be due to lack of competence on my side— see no way of imposing them mechanically, as it seems to need some sort of automatic theorem prover for which I have no existence proof. Therefore, for the time being and perhaps forever, the rules of the second kind present themselves as elements of discipline required from the programmer. Some of the rules I have in mind are so clear that they can be taught and that there never needs to be an argument as to whether a given program violates them or not. Examples are the requirements that no loop should be written down without providing a proof for termination nor without stating the relation whose invariance will not be destroyed by the execution of the repeatable statement.
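As a concrete illustration of that last rule (my own hypothetical sketch, not from the lecture, written in VB.NET like the code earlier on this blog): a loop annotated with the invariant it preserves and the argument for why it terminates.

```vbnet
' Hypothetical illustration: sum the integers 1..n, with the loop's
' invariant and termination argument stated explicitly as comments.
Function SumUpTo(n As Integer) As Integer
    Dim total As Integer = 0
    Dim i As Integer = 0
    ' Invariant: total = 1 + 2 + ... + i  (trivially true when i = 0)
    While i < n
        i += 1
        total += i
        ' Invariant restored: total = 1 + 2 + ... + i
        ' Termination: n - i is a non-negative integer that strictly
        ' decreases on every pass, so the loop must end.
    End While
    ' On exit: i = n, hence total = 1 + 2 + ... + n
    Return total
End Function
```

Stating the invariant lets one argue about the loop once, rather than about every iteration separately.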

I now suggest that we confine ourselves to the design and implementation of intellectually manageable programs. If someone fears that this restriction is so severe that we cannot live with it, I can reassure him: the class of intellectually manageable programs is still sufficiently rich to contain many very realistic programs for any problem capable of algorithmic solution. We must not forget that it is not our business to make programs, it is our business to design classes of computations that will display a desired behaviour. The suggestion of confining ourselves to intellectually manageable programs is the basis for the first two of my announced six arguments.

Argument one is that, as the programmer only needs to consider intellectually manageable programs, the alternatives he is choosing between are much, much easier to cope with.

Argument two is that, as soon as we have decided to restrict ourselves to the subset of the intellectually manageable programs, we have achieved, once and for all, a drastic reduction of the solution space to be considered. And this argument is distinct from argument one.

Argument three is based on the constructive approach to the problem of program correctness. Today a usual technique is to make a program and then to test it. But: program testing can be a very effective way to show the presence of bugs, but is hopelessly inadequate for showing their absence. The only effective way to raise the confidence level of a program significantly is to give a convincing proof of its correctness. But one should not first make the program and then prove its correctness, because then the requirement of providing the proof would only increase the poor programmer's burden. On the contrary: the programmer should let correctness proof and program grow hand in hand. Argument three is essentially based on the following observation. If one first asks oneself what the structure of a convincing proof would be and, having found this, then constructs a program satisfying this proof's requirements, then these correctness concerns turn out to be a very effective heuristic guidance. By definition this approach is only applicable when we restrict ourselves to intellectually manageable programs, but it provides us with effective means for finding a satisfactory one among these.

Argument four has to do with the way in which the amount of intellectual effort needed to design a program depends on the program length. It has been suggested that there is some kind of law of nature telling us that the amount of intellectual effort needed grows with the square of program length. But, thank goodness, no one has been able to prove this law. And this is because it need not be true. We all know that the only mental tool by means of which a very finite piece of reasoning can cover a myriad cases is called "abstraction"; as a result the effective exploitation of his powers of abstraction must be regarded as one of the most vital activities of a competent programmer. In this connection it might be worth-while to point out that the purpose of abstracting is not to be vague, but to create a new semantic level in which one can be absolutely precise. Of course I have tried to find a fundamental cause that would prevent our abstraction mechanisms from being sufficiently effective. But no matter how hard I tried, I did not find such a cause. As a result I tend to the assumption —up till now not disproved by experience— that by suitable application of our powers of abstraction, the intellectual effort needed to conceive or to understand a program need not grow more than proportional to program length. But a by-product of these investigations may be of much greater practical significance, and is, in fact, the basis of my fourth argument. The by-product was the identification of a number of patterns of abstraction that play a vital role in the whole process of composing programs. Enough is now known about these patterns of abstraction that you could devote a lecture to each of them.
What the familiarity and conscious knowledge of these patterns of abstraction imply dawned upon me when I realized that, had they been common knowledge fifteen years ago, the step from BNF to syntax-directed compilers, for instance, could have taken a few minutes instead of a few years. Therefore I present our recent knowledge of vital abstraction patterns as the fourth argument.

Now for the fifth argument. It has to do with the influence of the tool we are trying to use upon our own thinking habits. I observe a cultural tradition, which in all probability has its roots in the Renaissance, to ignore this influence, to regard the human mind as the supreme and autonomous master of its artefacts. But if I start to analyse the thinking habits of myself and of my fellow human beings, I come, whether I like it or not, to a completely different conclusion, viz. that the tools we are trying to use and the language or notation we are using to express or record our thoughts, are the major factors determining what we can think or express at all! The analysis of the influence that programming languages have on the thinking habits of its users, and the recognition that, by now, brainpower is by far our scarcest resource, they together give us a new collection of yardsticks for comparing the relative merits of various programming languages. The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. In the case of a well-known conversational programming language I have been told from various sides that as soon as a programming community is equipped with a terminal for it, a specific phenomenon occurs that even has a well-established name: it is called "the one-liners". It takes one of two different forms: one programmer places a one-line program on the desk of another and either he proudly tells what it does and adds the question "Can you code this in less symbols?" —as if this were of any conceptual relevance!— or he just asks "Guess what it does!". From this observation we must conclude that this language as a tool is an open invitation for clever tricks; and while exactly this may be the explanation for some of its appeal, viz. 
to those who like to show how clever they are, I am sorry, but I must regard this as one of the most damning things that can be said about a programming language. Another lesson we should have learned from the recent past is that the development of "richer" or "more powerful" programming languages was a mistake in the sense that these baroque monstrosities, these conglomerations of idiosyncrasies, are really unmanageable, both mechanically and mentally. I see a great future for very systematic and very modest programming languages. When I say "modest", I mean that, for instance, not only ALGOL 60's "for clause", but even FORTRAN's "DO loop" may find themselves thrown out as being too baroque. I have run a little programming experiment with really experienced volunteers, but something quite unintended and quite unexpected turned up. None of my volunteers found the obvious and most elegant solution. Upon closer analysis this turned out to have a common source: their notion of repetition was so tightly connected to the idea of an associated controlled variable to be stepped up, that they were mentally blocked from seeing the obvious. Their solutions were less efficient, needlessly hard to understand, and it took them a very long time to find them. It was a revealing, but also shocking experience for me. Finally, in one respect one hopes that tomorrow's programming languages will differ greatly from what we are used to now: to a much greater extent than hitherto they should invite us to reflect in the structure of what we write down all abstractions needed to cope conceptually with the complexity of what we are designing. So much for the greater adequacy of our future tools, which was the basis of the fifth argument.

As an aside I would like to insert a warning to those who identify the difficulty of the programming task with the struggle against the inadequacies of our current tools, because they might conclude that, once our tools will be much more adequate, programming will no longer be a problem. Programming will remain very difficult, because once we have freed ourselves from the circumstantial cumbersomeness, we will find ourselves free to tackle the problems that are now well beyond our programming capacity.

You can quarrel with my sixth argument, for it is not so easy to collect experimental evidence for its support, a fact that will not prevent me from believing in its validity. Up till now I have not mentioned the word "hierarchy", but I think that it is fair to say that this is a key concept for all systems embodying a nicely factored solution. I could even go one step further and make an article of faith out of it, viz. that the only problems we can really solve in a satisfactory manner are those that finally admit a nicely factored solution. At first sight this view of human limitations may strike you as a rather depressing view of our predicament, but I don't feel it that way, on the contrary! The best way to learn to live with our limitations is to know them. By the time that we are sufficiently modest to try factored solutions only, because the other efforts escape our intellectual grip, we shall do our utmost best to avoid all those interfaces impairing our ability to factor the system in a helpful way. And I cannot but expect that this will repeatedly lead to the discovery that an initially untractable problem can be factored after all. Anyone who has seen how the majority of the troubles of the compiling phase called "code generation" can be tracked down to funny properties of the order code, will know a simple example of the kind of things I have in mind. The wider applicability of nicely factored solutions is my sixth and last argument for the technical feasibility of the revolution that might take place in the current decade.

In principle I leave it to you to decide for yourself how much weight you are going to give to my considerations, knowing only too well that I can force no one else to share my beliefs. As each serious revolution, it will provoke violent opposition and one can ask oneself where to expect the conservative forces trying to counteract such a development. I don't expect them primarily in big business, not even in the computer business; I expect them rather in the educational institutions that provide today's training and in those conservative groups of computer users that think their old programs so important that they don't think it worth-while to rewrite and improve them. In this connection it is sad to observe that on many a university campus the choice of the central computing facility has too often been determined by the demands of a few established but expensive applications with a disregard of the question how many thousands of "small users" that are willing to write their own programs were going to suffer from this choice. Too often, for instance, high-energy physics seems to have blackmailed the scientific community with the price of its remaining experimental equipment. The easiest answer, of course, is a flat denial of the technical feasibility, but I am afraid that you need pretty strong arguments for that. No reassurance, alas, can be obtained from the remark that the intellectual ceiling of today's average programmer will prevent the revolution from taking place: with others programming so much more effectively, he is liable to be edged out of the picture anyway.

There may also be political impediments. Even if we know how to educate tomorrow's professional programmer, it is not certain that the society we are living in will allow us to do so. The first effect of teaching a methodology —rather than disseminating knowledge— is that of enhancing the capacities of the already capable, thus magnifying the difference in intelligence. In a society in which the educational system is used as an instrument for the establishment of a homogenized culture, in which the cream is prevented from rising to the top, the education of competent programmers could be politically impalatable.

Let me conclude. Automatic computers have now been with us for a quarter of a century. They have had a great impact on our society in their capacity of tools, but in that capacity their influence will be but a ripple on the surface of our culture, compared with the much more profound influence they will have in their capacity of intellectual challenge without precedent in the cultural history of mankind. Hierarchical systems seem to have the property that something considered as an undivided entity on one level, is considered as a composite object on the next lower level of greater detail; as a result the natural grain of space or time that is applicable at each level decreases by an order of magnitude when we shift our attention from one level to the next lower one. We understand walls in terms of bricks, bricks in terms of crystals, crystals in terms of molecules etc. As a result the number of levels that can be distinguished meaningfully in a hierarchical system is kind of proportional to the logarithm of the ratio between the largest and the smallest grain, and therefore, unless this ratio is very large, we cannot expect many levels. In computer programming our basic building block has an associated time grain of less than a microsecond, but our program may take hours of computation time. I do not know of any other technology covering a ratio of 10^10 or more: the computer, by virtue of its fantastic speed, seems to be the first to provide us with an environment where highly hierarchical artefacts are both possible and necessary. This challenge, viz. the confrontation with the programming task, is so unique that this novel experience can teach us a lot about ourselves. It should deepen our understanding of the processes of design and creation, it should give us better control over the task of organizing our thoughts. If it did not do so, to my taste we should not deserve the computer at all!

It has already taught us a few lessons, and the one I have chosen to stress in this talk is the following. We shall do a much better programming job, provided that we approach the task with a full appreciation of its tremendous difficulty, provided that we stick to modest and elegant programming languages, provided that we respect the intrinsic limitations of the human mind and approach the task as Very Humble Programmers.

Sunday, April 26, 2009

XP in Windows 7

You don't need to buy two OS licenses to get your precious Windows XP running alongside Windows 7. Microsoft now has the solution: XP in Windows 7.

Conficker News Update - PLEASE READ!!

Conficker Virus Starts to Attack PCs, Experts Say

BOSTON - A malicious software program known as Conficker that many feared would wreak havoc on April 1 is slowly being activated, weeks after being dismissed as a false alarm, security experts said.

Conficker, also known as Downadup or Kido, is quietly turning an unknown number of personal computers into servers of e-mail spam, they added.

The worm started spreading late last year, infecting millions of computers and turning them into "slaves" that respond to commands sent from a remote server that effectively controls an army of computers known as a botnet.

Its unidentified creators started using those machines for criminal purposes in recent weeks by loading more malicious software onto a small percentage of computers under their control, said Vincent Weafer, a vice president with Symantec Security Response, the research arm of the world's largest security software maker, Symantec Corp.

Conficker installs a second virus, known as Waledac, that sends out e-mail spam without knowledge of the PC's owner, along with a fake anti-spyware program, Weafer said.

The Waledac virus recruits the PCs into a second botnet that has existed for several years and specializes in distributing e-mail spam.

Conficker also carries a third virus that warns users their PCs are infected and offers them a fake anti-virus program, Spyware Protect 2009 for $49.95, according to Russian-based security researcher Kaspersky Lab. If they buy it, their credit card information is stolen and the virus downloads even more malicious software.

Weafer said that while he believes the number of infected machines that have become active is relatively small, he expects a consistent stream of attacks to follow, with other types of malware distributed by Conficker's authors.

"Expect this to be long-term, slowly changing," he said of the worm. "It's not going to be fast, aggressive."

Researchers feared the network controlled by the Conficker worm might be deployed on April 1 for the first time since the worm surfaced last year, because it was programmed to increase communication attempts from that date.

The security industry formed a task force to fight the worm, bringing widespread attention that experts said probably scared off the criminals who command the slave computers.

That task force thwarted the worm partially by using the Internet's traffic control system to block access to servers that control the slave computers.

Viruses that turn PCs into slaves exploit weaknesses in Microsoft's Windows operating system. The Conficker worm is especially tricky because it can evade corporate firewalls by passing from an infected machine onto a USB memory stick, then onto another PC.

The Conficker botnet is one of many such networks controlled by syndicates that authorities believe are based in eastern Europe, southeast Asia, China and Latin America.

© Thomson Reuters 2008. All rights reserved.

Tuesday, April 14, 2009

XP: Mainstream Support Ends Today, Extended Support till 8 April 2014

Microsoft has ended mainstream support for XP and has now started extended support for the OS, which runs until 8 April 2014, about five years from now.
Mainstream support normally ends five years after a product's release, but that schedule didn't really work for XP: Microsoft finally took eight years to end it. Besides the late arrival of Vista, heavy reliance on XP by both corporate and personal users was the main factor behind the longer support. That reliance doesn't seem to be stopping even now that Windows 7 is coming: Microsoft will allow users who buy a PC with Windows 7 preinstalled to downgrade to XP.
See how big XP's influence is!
For those buying a new PC and still wondering whether to go for Vista or XP: again, it's up to you. If you still don't trust Vista, it's okay to go for XP. Remember, extended support is still there; once you install a genuine copy of XP, you can still get security updates from Microsoft (until April 2014, long enough for a PC). However, while security updates remain free, you now need to pay a certain amount for non-security hotfixes. That's just fine for me.
Support provided                                   Mainstream   Extended
------------------------------------------------------------------------
Paid support (per-incident, per-hour, and others)      x            x
Security update support                                x            x
Non-security hotfix support                            x            *
No-charge incident support                             x
Warranty claims                                        x
Design changes and feature requests                    x
Product-specific information available from the
  online Microsoft Knowledge Base                      x            x
Product-specific answers to technical questions
  via the Support site at Microsoft Help
  and Support                                          x            x

* Requires an extended hotfix agreement, purchased within 90 days of
  mainstream support ending.
P.S. - Whether or not MS extended the support, I now enjoy working with Vista despite all the bad perceptions I received earlier. :-)
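The "about five years" figure is easy to check with a quick date calculation (a small Python sketch, taking 14 April 2009, the date of this post, as the mainstream-support cutoff):

```python
from datetime import date

# Mainstream support cutoff (the date of this post) and the scheduled
# end of extended support.
mainstream_end = date(2009, 4, 14)
extended_end = date(2014, 4, 8)

days = (extended_end - mainstream_end).days
print(days, round(days / 365.25, 1))  # 1820 days, i.e. about 5.0 years
```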

Sunday, March 22, 2009

What's hot this month

iPhone 3.0, the Internet Explorer 8 beta release, Windows 7, IBM's deal with Sun, and recession-related topics are among the hot news items that most IT and computer-related magazines are covering.

1 - iPhone 3.0
I don't like to comment anything on the stuff I can only dream for at this moment..

2- Internet Explorer 8
In spite of Microsoft's popular claim that this browser is one of the best versions ever released, outsiders still doubt whether it can place itself on top, pushing out Mozilla Firefox 3.5, which will be released soon. Being hacked on the first day of Pwn2Own only shows that IE8 is as vulnerable as its previous versions were. Its counterpart, Mozilla Firefox, faced the same fate. On the other hand, it was surprising to learn that the 'lightweight contender' Google Chrome was the only browser to remain unhacked at the end of the first day. As a result, I'm now writing this blog in Chrome. Back to IE8: this version will be made available inside Windows 7, and those who hate IE will finally have the chance to remove it from the Add/Remove Programs list, unlike before.

3- Windows 7
This is the first operating system fully developed without Bill Gates's final say, and the first OS to be launched after Gates fully left Microsoft in July last year. According to CEO Steve Ballmer, this is really the Microsoft team's product, whereas most previous products were the result of Gates's thinking. The initial plan was to ship Windows 7 three years after the release of Vista (Vista was released in 2006). However, with the current economic situation, I suspect it will be postponed to the middle of next year.

4 - IBM's $7 billion deal with Sun
Whatever happens, there will be advantages and disadvantages if IBM's buyout of Sun is closed. Generally, both will benefit. Sun will have enough money to run the business without changing any of its traditional R&D culture, which has helped the technology industry evolve a lot. IBM, on the other hand, will have more to be proud of, gaining market share and owning quite a number of technologies that would otherwise take years, if not decades, to develop.

5 - Recession-related topics
There are a lot of articles, tips, and advice on how to survive the recession, aimed at everyone from those who have already been laid off to CIOs deciding what action to take during the downturn. I would advise everybody, regardless of how comfortable you are in your current job, to read some of this material. Who knows, one day your turn may come, if not as someone being laid off, then maybe as someone deciding who gets laid off. Either way, it's useful.

Wednesday, March 18, 2009

Bypassing Vista for Windows 7

Windows 7 has now been released in its beta version. I don't know exactly how good it is compared to Vista; I've never had time to get my hands dirty with it beyond information passed along by others who've tried it. Many of them admit that the coming version is far better than the current Vista, especially in handling networking and hardware, which according to some was the worst part of Vista.

Some people trying Windows 7 consider it the best Windows OS since Windows 3.1.

Again, I'm not sure how true that is. Wait and see. Some people might ask whether buying a PC with Vista is the right decision right now, or whether to wait until Windows 7 is released. Some suggest waiting. Others say it's okay to buy now, since Microsoft will provide an upgrade and you'll only have to pay the upgrade fee; some even hope Microsoft will give a free upgrade to those who buy within the twelve months before the release (a risky dream?). As for me, if you are not too desperate and can wait at least two more quarters of economic pain before it ships, why not?

It's always good to have solid information so that you won't make a costly decision.

Sunday, March 15, 2009

Backup Your Data: Looking for the best solution

Some of the information here is based on my working experience, and some comes from many others on the net. Since servers and networking are not my expertise (at least at the time this article was written), I always welcome comments and feedback from readers.

*******************************************************************

Data is important; private data even more so, whether personal or commercial. No dispute there.
When talking about data backup, THREE things always get the most attention:

1- Cost
How expensive the solution is to implement.

2 - Reliability and Security
How reliable and secure the data is once backed up. Are there any security loopholes that could expose it to malicious attacks, viruses, natural disasters, etc.?

3- Speed
How fast we can back up and restore the data.

Scenario to consider
------------------------
Now, suppose we have a large amount of sensitive data (200-300 GB) to be backed up periodically, on a monthly basis. A restore may be requested at any time. That means the media must be large enough, the restore process must be fast enough, and security must not be compromised. Let's consider the solutions below.

1) CD is out of consideration! Next, please...

2) DVD
Still limited in storage. With dual-layer DVDs normally below 10 GB and single-layer below 5 GB, you would still need a lot of discs to complete a monthly backup.
Result: Reject!
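To put a number on "a lot of discs", here is a quick back-of-the-envelope calculation (a Python sketch, assuming the nominal 4.7 GB single-layer and 8.5 GB dual-layer DVD capacities) for the 300 GB scenario above:

```python
import math

# Assumed nominal DVD capacities, in GB.
data_gb = 300
single_layer_gb = 4.7
dual_layer_gb = 8.5

# Round up: a partially filled disc is still a disc you have to burn.
single = math.ceil(data_gb / single_layer_gb)
dual = math.ceil(data_gb / dual_layer_gb)
print(single, dual)  # 64 single-layer or 36 dual-layer discs per month
```

Sixty-plus discs every month is clearly impractical, which is why this option is rejected.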


3) Tape Drive
Good storage size: we need only one or two tapes to back up 300 GB. However, restoring one tape's data takes at least half a day, sometimes almost a whole day. In terms of cost, we need about USD 1,000 (MYR 3,000 - MYR 4,000) for the tape drive and about USD 40 (MYR 200 - MYR 300) per tape. Not so good on speed, and expensive.
Result: Reject

4) External Hard Disk
The speed is much better than tape, though a bit slower than an internal disk. Not so costly for what it offers, and there are few security holes, since you won't expose it to the network. However, it doesn't allow multiple access: only a single client or server can use it at a time, so it doesn't suit server backups where the data is normally shared by multiple users. Fortunately, it already satisfies the requirements above.
Result: Just nice.
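For a single machine, the external-disk approach can be as simple as copying the data into a dated folder each month. A minimal sketch (in Python; the paths and folder-naming scheme are my own illustration, not a recommendation of any particular tool):

```python
import shutil
from datetime import date
from pathlib import Path

def monthly_backup(source: str, drive_root: str) -> Path:
    """Copy `source` into a dated folder on the external drive.

    `drive_root` should point at the mounted external disk
    (e.g. "E:\\" on Windows) - illustrative only.
    """
    dest = Path(drive_root) / f"backup-{date.today():%Y-%m}"
    # Each month gets its own folder; dirs_exist_ok=True lets a re-run
    # within the same month overwrite that month's copy.
    shutil.copytree(source, dest, dirs_exist_ok=True)
    return dest
```

In practice you would also verify the copy and rotate old folders once the disk fills up.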

5) Storage Area Network (SAN)
Unlike the previous options, this is more an architecture than a media technology. Implementing it requires a dedicated server plus a few storage nodes connected to the network. Because the technology is so advanced, it is seldom used by individuals or small companies. The main benefits a SAN brings a company are fast data transfer and the ability to connect large data networks spanning thousands of miles, with the devices on the network still communicating effectively. By organizing these devices on a dedicated, data-only network, faster transfers can occur. It is very helpful to a system administrator, since it can back up data from multiple sources at any time.
Do we really need this?

Result: Acceptable for multiple access and highly scalable solution.


6) Network Attached Storage Server (NAS)
A NAS server is essentially a server set aside strictly for serving files to other servers and client computers. Instead of doing any other processing, it acts only as storage for other servers or clients, reducing the risk of file corruption and most OS-related issues, and it is very good on speed. Standard servers typically run server-class operating systems, which can become corrupted or otherwise damaged; a NAS, however, stores its operating system on flash memory, which can only be overwritten in an upgrade. Because of that, it is normally more expensive. Moreover, it is not the end stage of backup, because a physical disaster could crash the whole storage and leave no data behind. That's why technologies such as clustering and RAID should be implemented as well. (Wikipedia: It should be noted that NAS is effectively a server in itself, with all major components of a typical PC - a CPU, motherboard, RAM, etc. - and its reliability is a function of how well it is designed internally. A NAS without redundant data access paths, redundant controllers, and redundant power supplies is probably less reliable than Direct Attached Storage (DAS).)

Finally, in most cases you'll still need to back up this server, which brings us back to the original question: where and how do we back up the data?

Result: Acceptable for data storage, not data backup. Full stop.

Besides these, there are other solutions we can implement.

7) Optical Jukebox.
From Wikipedia: Optical Jukebox is a robotic data storage device that can automatically load and unload optical discs, such as Compact Disc, DVD, Ultra Density Optical or Blu-ray disc and can provide terabytes of tertiary storage. Jukebox capacities have greatly increased with the release of the 50GB dual layer Blu-ray (BD) format, with a road-map to increase to eight layers and 200GB per disc. The current format allows 35TB of storage from a single 700 disc jukebox.

I may have missed some advantages or disadvantages of the data backup solutions discussed in this article. However, this overview should give you an introductory concept of the options, which might help you decide on the best solution, and most importantly the BEST AFFORDABLE SOLUTION, which is what most people care about.

Monday, January 12, 2009

FTP using VB.NET

It was not my initial intention to share how to code an FTP library. Initially, I thought .NET already supplied a complete library for FTP. However, I got a big "NOT!".

While you are able to download and upload files via FTP using System.Net.FtpWebRequest, creating, renaming, and deleting folders was not so straightforward. You still have to go back to basic FTP commands such as MKD, RNFR, RMD, etc.
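For reference, the raw control-channel command sequences behind those folder operations look like this (a minimal sketch in Python rather than VB.NET, since it only formats the protocol strings and contacts no server; note that a rename is a two-command exchange, RNFR followed by RNTO):

```python
def ftp_folder_commands(action, name, new_name=""):
    """Return the raw FTP command sequence for a folder operation."""
    if action == "create":
        return ["MKD " + name]              # make directory
    if action == "delete":
        return ["RMD " + name]              # remove directory
    if action == "rename":
        # rename-from, then rename-to: the server pairs the two commands
        return ["RNFR " + name, "RNTO " + new_name]
    raise ValueError("unknown action: " + action)

print(ftp_folder_commands("rename", "old", "new"))  # ['RNFR old', 'RNTO new']
```

Whatever library you use ultimately writes these same lines to the control socket.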

I finally came across a library that did this in C# code.

This library enables you to do a few important actions, such as:
1 - Connect to the FTP server (of course!)
2 - Upload a file
3 - Download a file
4 - Delete a remote file
5 - Create a remote folder
6 - Delete a remote folder
7 - Rename a remote folder
8 - List all the files in a remote path
9 - Get a remote file's size

All these functions should be enough to develop a powerful FTP client.

I spent a couple of hours translating this code into VB.NET and made some modifications so the library works for me; this is the result. (Quite a long piece of code.)

**********************************************************************************


Imports System
Imports System.Net
Imports System.IO
Imports System.Text
Imports System.Net.Sockets

Public Class FTPClass
Private remoteHost As String, remotePath As String, remoteUser As String, remotePass As String, mes As String
Private remotePort As Integer, bytes As Integer
Private clientSocket As Socket
Private retValue As Integer
Private debug As Boolean
Private logined As Boolean
Private reply As String
Private Shared BLOCK_SIZE As Integer = 512
Private buffer As Byte() = New Byte(BLOCK_SIZE - 1) {}
Private ASCII As Encoding = Encoding.ASCII
Public Sub New()
remoteHost = "192.168.X.XX" 'Please specify correct IP
remotePath = "."
remoteUser = "username"
remotePass =  "password"
remotePort = 21
debug = False
logined = False
End Sub
'''
''' Set the name of the FTP server to connect to.
'''
''' Server name
Public Sub setRemoteHost(ByVal remoteHost As String)
Me.remoteHost = remoteHost
End Sub
'''
''' Return the name of the current FTP server.
'''
''' Server name
Public Function getRemoteHost() As String
Return remoteHost
End Function
'''
''' Set the port number to use for FTP.
'''
''' Port number
Public Sub setRemotePort(ByVal remotePort As Integer)
Me.remotePort = remotePort
End Sub
'''
''' Return the current port number.
'''
''' Current port number
Public Function getRemotePort() As Integer
Return remotePort
End Function
'''
''' Set the remote directory path.
'''
''' The remote directory path
Public Sub setRemotePath(ByVal remotePath As String)
Me.remotePath = remotePath
End Sub
'''
''' Return the current remote directory path.
'''
''' The current remote directory path.
Public Function getRemotePath() As String
Return remotePath
End Function
'''
''' Set the user name to use for logging into the remote server.
'''
''' Username
Public Sub setRemoteUser(ByVal remoteUser As String)
Me.remoteUser = remoteUser
End Sub
'''
''' Set the password to user for logging into the remote server.
'''
''' Password
Public Sub setRemotePass(ByVal remotePass As String)
Me.remotePass = remotePass
End Sub
'''
''' Return a string array containing the remote directory's file list.
'''
'''
'''
Public Function getFileList(ByVal mask As String) As String()
If Not logined Then
login()
End If
Dim cSocket As Socket = createDataSocket()
sendCommand("NLST " & mask)
If Not (retValue = 150 OrElse retValue = 125) Then
Throw New IOException(reply.Substring(4))
End If
mes = ""
While True
Dim bytes As Integer = cSocket.Receive(buffer, buffer.Length, 0)
mes += ASCII.GetString(buffer, 0, bytes)
If bytes < buffer.Length Then
Exit While
End If
End While
Dim seperator As Char() = {ControlChars.Lf}
Dim mess As String() = mes.Split(seperator)
cSocket.Close()
readReply()
If retValue <> 226 Then
Throw New IOException(reply.Substring(4))
End If
Return mess
End Function
'''
''' Return the size of a file.
'''
'''
'''
Public Function getFileSize(ByVal fileName As String) As Long
If Not logined Then
login()
End If
sendCommand("SIZE " & fileName)
Dim size As Long = 0
If retValue = 213 Then
size = Int64.Parse(reply.Substring(4))
Else
Throw New IOException(reply.Substring(4))
End If
Return size
End Function
'''
''' Login to the remote server.
'''
Public Sub login()
clientSocket = New Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp)
Dim ep As New IPEndPoint(Dns.Resolve(remoteHost).AddressList(0), remotePort)
Try
clientSocket.Connect(ep)
Catch generatedExceptionName As Exception
Throw New IOException("Couldn't connect to remote server")
End Try
readReply()
If retValue <> 220 Then
close()
Throw New IOException(reply.Substring(4))
End If
If debug Then
Console.WriteLine("USER " & remoteUser)
End If
sendCommand("USER " & remoteUser)
If Not (retValue = 331 OrElse retValue = 230) Then
cleanup()
Throw New IOException(reply.Substring(4))
End If
If retValue <> 230 Then
If debug Then
Console.WriteLine("PASS xxx")
End If
sendCommand("PASS " & remotePass)
If Not (retValue = 230 OrElse retValue = 202) Then
cleanup()
Throw New IOException(reply.Substring(4))
End If
End If
logined = True
Console.WriteLine("Connected to " & remoteHost)
chdir(remotePath)
End Sub
'''
''' If the value of mode is true, set binary mode for downloads.
''' Else, set Ascii mode.
'''
'''
Public Sub setBinaryMode(ByVal mode As Boolean)
If mode Then
sendCommand("TYPE I")
Else
sendCommand("TYPE A")
End If
If retValue <> 200 Then
Throw New IOException(reply.Substring(4))
End If
End Sub
'''
''' Download a file to the Assembly's local directory,
''' keeping the same file name.
'''
'''
Public Sub download(ByVal remFileName As String)
download(remFileName, "", False)
End Sub
'''
''' Download a remote file to the Assembly's local directory,
''' keeping the same file name, and set the resume flag.
'''
'''
'''
Public Sub download(ByVal remFileName As String, ByVal [resume] As Boolean)
download(remFileName, "", [resume])
End Sub
''' <summary>
''' Download a remote file to a local file name which can include
''' a path. The local file name will be created or overwritten,
''' but the path must exist.
''' </summary>
''' <param name="remFileName">Name of the file on the remote server.</param>
''' <param name="locFileName">Local file name, which can include a path.</param>
Public Sub download(ByVal remFileName As String, ByVal locFileName As String)
download(remFileName, locFileName, False)
End Sub
''' <summary>
''' Download a remote file to a local file name which can include
''' a path, and set the resume flag. The local file name will be
''' created or overwritten, but the path must exist.
''' </summary>
''' <param name="remFileName">Name of the file on the remote server.</param>
''' <param name="locFileName">Local file name, which can include a path.</param>
''' <param name="resume">True to resume a partial download.</param>
Public Sub download(ByVal remFileName As String, ByVal locFileName As String, ByVal [resume] As Boolean)
If Not logined Then
login()
End If
setBinaryMode(True)
Console.WriteLine("Downloading file " & remFileName & " from " & remoteHost & "/" & remotePath)
If locFileName.Equals("") Then
locFileName = remFileName
End If
If Not File.Exists(locFileName) Then
Dim st As Stream = File.Create(locFileName)
st.Close()
End If
Dim output As New FileStream(locFileName, FileMode.Open)
Dim cSocket As Socket = createDataSocket()
Dim offset As Long = 0
If [resume] Then
offset = output.Length
If offset > 0 Then
sendCommand("REST " & offset)
If retValue <> 350 Then
'throw new IOException(reply.Substring(4));
'Some servers may not support resuming.
offset = 0
End If
End If
If offset > 0 Then
If debug Then
Console.WriteLine("seeking to " & offset)
End If
Dim npos As Long = output.Seek(offset, SeekOrigin.Begin)
Console.WriteLine("new pos=" & npos)
End If
End If
sendCommand("RETR " & remFileName)
If Not (retValue = 150 OrElse retValue = 125) Then
Throw New IOException(reply.Substring(4))
End If
While True
bytes = cSocket.Receive(buffer, buffer.Length, 0)
output.Write(buffer, 0, bytes)
If bytes <= 0 Then
Exit While
End If
End While
output.Close()
If cSocket.Connected Then
cSocket.Close()
End If
Console.WriteLine("")
readReply()
If Not (retValue = 226 OrElse retValue = 250) Then
Throw New IOException(reply.Substring(4))
End If
End Sub
''' <summary>
''' Upload a file.
''' </summary>
''' <param name="fileName">Name of the local file to upload.</param>
Public Sub upload(ByVal fileName As String)
upload(fileName, False)
End Sub
''' <summary>
''' Upload a file and set the resume flag.
''' </summary>
''' <param name="fileName">Name of the local file to upload.</param>
''' <param name="resume">True to resume a partial upload.</param>
Public Sub upload(ByVal fileName As String, ByVal [resume] As Boolean)
If Not logined Then
login()
End If
Dim cSocket As Socket = createDataSocket()
Dim offset As Long = 0
If [resume] Then
Try
setBinaryMode(True)
offset = getFileSize(fileName)
Catch generatedExceptionName As Exception
offset = 0
End Try
End If
If offset > 0 Then
sendCommand("REST " & offset)
If retValue <> 350 Then
'throw new IOException(reply.Substring(4));
'Remote server may not support resuming.
offset = 0
End If
End If
sendCommand("STOR " & Path.GetFileName(fileName))
If Not (retValue = 125 OrElse retValue = 150) Then
Throw New IOException(reply.Substring(4))
End If
' open input stream to read source file
Dim input As New FileStream(fileName, FileMode.Open)
If offset <> 0 Then
If debug Then
Console.WriteLine("seeking to " & offset)
End If
input.Seek(offset, SeekOrigin.Begin)
End If
Console.WriteLine("Uploading file " & fileName & " to " & remotePath)
While (InlineAssignHelper(bytes, input.Read(buffer, 0, buffer.Length))) > 0
cSocket.Send(buffer, bytes, 0)
End While
input.Close()
Console.WriteLine("")
If cSocket.Connected Then
cSocket.Close()
End If
readReply()
If Not (retValue = 226 OrElse retValue = 250) Then
Throw New IOException(reply.Substring(4))
End If
End Sub
''' <summary>
''' Delete a file from the remote FTP server.
''' </summary>
''' <param name="fileName">Name of the remote file to delete.</param>
Public Sub deleteRemoteFile(ByVal fileName As String)
If Not logined Then
login()
End If
sendCommand("DELE " & fileName)
If retValue <> 250 Then
Throw New IOException(reply.Substring(4))
End If
End Sub
''' <summary>
''' Rename a file on the remote FTP server.
''' </summary>
''' <param name="oldFileName">Current name of the remote file.</param>
''' <param name="newFileName">New name for the remote file.</param>
Public Sub renameRemoteFile(ByVal oldFileName As String, ByVal newFileName As String)
If Not logined Then
login()
End If
sendCommand("RNFR " & oldFileName)
If retValue <> 350 Then
Throw New IOException(reply.Substring(4))
End If
' known problem
' rnto will not take care of existing file.
' i.e. It will overwrite if newFileName exist
sendCommand("RNTO " & newFileName)
If retValue <> 250 Then
Throw New IOException(reply.Substring(4))
End If
End Sub
''' <summary>
''' Create a directory on the remote FTP server.
''' </summary>
''' <param name="dirName">Name of the directory to create.</param>
Public Sub mkdir(ByVal dirName As String)
If Not logined Then
login()
End If
sendCommand("MKD " & dirName)
If retValue > 400 Then
Throw New IOException(reply.Substring(4))
End If
End Sub
''' <summary>
''' Delete a directory on the remote FTP server.
''' </summary>
''' <param name="dirName">Name of the directory to delete.</param>
Public Sub rmdir(ByVal dirName As String)
If Not logined Then
login()
End If
sendCommand("RMD " & dirName)
If retValue <> 250 Then
Throw New IOException(reply.Substring(4))
End If
End Sub
''' <summary>
''' Change the current working directory on the remote FTP server.
''' </summary>
''' <param name="dirName">Name of the directory to change to.</param>
Public Sub chdir(ByVal dirName As String)
If dirName.Equals(".") Then
Exit Sub
End If
If Not logined Then
login()
End If
sendCommand("CWD " & dirName)
If retValue <> 250 Then
Throw New IOException(reply.Substring(4))
End If
Me.remotePath = dirName
Console.WriteLine("Current directory is " & remotePath)
End Sub
''' <summary>
''' Close the FTP connection.
''' </summary>
Public Sub close()
If clientSocket IsNot Nothing Then
sendCommand("QUIT")
End If
cleanup()
Console.WriteLine("Closing...")
End Sub
''' <summary>
''' Set debug mode.
''' </summary>
''' <param name="debug">True to print protocol traffic to the console.</param>
Public Sub setDebug(ByVal debug As Boolean)
Me.debug = debug
End Sub
Private Sub readReply()
mes = ""
reply = readLine()
retValue = Int32.Parse(reply.Substring(0, 3))
End Sub
Private Sub cleanup()
If clientSocket IsNot Nothing Then
clientSocket.Close()
clientSocket = Nothing
End If
logined = False
End Sub
Private Function readLine() As String
While True
bytes = clientSocket.Receive(buffer, buffer.Length, 0)
mes += ASCII.GetString(buffer, 0, bytes)
If bytes < buffer.Length Then
Exit While
End If
End While
Dim separator As Char() = {ControlChars.Lf}
Dim mess As String() = mes.Split(separator)
If mes.Length > 2 Then
mes = mess(mess.Length - 2)
Else
mes = mess(0)
End If
If Not mes.Substring(3, 1).Equals(" ") Then
Return readLine()
End If
If debug Then
For k As Integer = 0 To mess.Length - 2
Console.WriteLine(mess(k))
Next
End If
Return mes
End Function
Private Sub sendCommand(ByVal command As String)
Dim cmdBytes As Byte() = Encoding.ASCII.GetBytes((command & vbCr & vbLf).ToCharArray())
clientSocket.Send(cmdBytes, cmdBytes.Length, 0)
readReply()
End Sub
Private Function createDataSocket() As Socket
sendCommand("PASV")
If retValue <> 227 Then
Throw New IOException(reply.Substring(4))
End If
Dim index1 As Integer = reply.IndexOf("("c)
Dim index2 As Integer = reply.IndexOf(")"c)
Dim ipData As String = reply.Substring(index1 + 1, index2 - index1 - 1)
Dim parts As Integer() = New Integer(5) {}
Dim len As Integer = ipData.Length
Dim partCount As Integer = -1
Dim buf As String = ""
Dim i As Integer = 0
While i < len AndAlso partCount <= 6
Dim ch As Char = [Char].Parse(ipData.Substring(i, 1))
If [Char].IsDigit(ch) Then
buf += ch
ElseIf ch <> ","c Then
Throw New IOException("Malformed PASV reply: " & reply)
End If
If ch = ","c OrElse i + 1 = len Then
Try
partCount += 1
parts(partCount) = Int32.Parse(buf)
buf = ""
Catch generatedExceptionName As Exception
Throw New IOException("Malformed PASV reply: " & reply)
End Try
End If
i += 1
End While
Dim ipAddress As String = CStr(parts(0)) & "." & CStr(parts(1)) & "." & CStr(parts(2)) & "." & CStr(parts(3))
Dim port As Integer = (parts(4) << 8) + parts(5)
Dim s As New Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp)
Dim ep As New IPEndPoint(Dns.Resolve(ipAddress).AddressList(0), port)
Try
s.Connect(ep)
Catch generatedExceptionName As Exception
Throw New IOException("Can't connect to remote server")
End Try
Return s
End Function
Private Shared Function InlineAssignHelper(Of T)(ByRef target As T, ByVal value As T) As T
target = value
Return value
End Function
End Class

To use this class, you can do something like this:

Private Function UploadFile() As Boolean

Dim ft As New FTPClass
Try
ft.setDebug(True)
ft.setRemoteHost("192.168.1.87")
ft.setRemoteUser("user")
ft.setRemotePass("123456")
ft.setRemotePath("Root/")
ft.setRemotePort(21)

ft.login()
Try
ft.chdir("Test")
Catch ex As Exception
' "Test" doesn't exist yet: create it, then change into it.
ft.mkdir("Test")
ft.chdir("Test")
End Try

ft.setBinaryMode(True)
ft.upload("C:\test.txt")

ft.close()
Catch ex As Exception
ft.close()
MsgBox(ex.Message)
Return False
End Try
MsgBox("Upload success!")
Return True
End Function
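Downloading works the same way. Here is a minimal sketch, assuming the same FTPClass setter methods and the same hypothetical server details used in the upload example above:

```vbnet
Private Function DownloadFile() As Boolean
    Dim ft As New FTPClass
    Try
        ft.setDebug(True)
        ft.setRemoteHost("192.168.1.87")
        ft.setRemoteUser("user")
        ft.setRemotePass("123456")
        ft.setRemotePath("Root/Test")
        ft.setRemotePort(21)

        ft.login()
        ' TYPE I: transfer the file as raw bytes, not ASCII.
        ft.setBinaryMode(True)
        ' Download test.txt to a local path, resuming a partial file if one exists.
        ft.download("test.txt", "C:\test.txt", True)

        ft.close()
    Catch ex As Exception
        ft.close()
        MsgBox(ex.Message)
        Return False
    End Try
    MsgBox("Download success!")
    Return True
End Function
```

The explicit setBinaryMode(True) is optional here, since download() sets binary mode itself before issuing RETR.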



****
Monday, January 5, 2009

How to totally remove a virus from pendrive

Removing a virus from a pendrive normally requires two steps:

Step 1: Scan the pendrive using Antivirus software.
Step 2: Remove the hidden file autorun.inf

***********************

Step 1 should be handled by the antivirus software, so I won't go into detail on this. As suggested by some people on the net, you can refer to the links below.

1) SuperAntiSpyware
http://www.superantispyware.com/superant...
It will possibly detect and get rid of the Trojan. It gets rid of some of the toughest problems.

2) SmitfraudFix
http://siri.geekstogo.com/SmitfraudFix.p...
This tool should be launched in Safe Mode. To learn how to do that, look here:
http://www.pchell.com/support/safemode.s...
Run this tool and choose to clean. It will get rid of pop-ups trying to sell you fake things.

3) VundoFix
http://www.atribune.org/content/view/24/...
This tool gets rid of Vundo trojans and more.
The site shows how to use it.

4) ComboFix
http://download.bleepingcomputer.com/sUB...
This is a last resort.
ComboFix is a general tool that helps the helper clean up a HijackThis log.
It is able to remove some common infections and helps a user detect files that general scanners cannot find. It also lists registry keys such as the Run keys, the desktop keys, and other areas where malware hides. The tool has some rootkit detectors too, allowing a helper to see if a rootkit is present on the PC.

**********************

Step 2: You normally have to perform this step manually. The antivirus will not remove autorun.inf, because there is nothing wrong with the file itself; the bad thing is actually the script written into the file, not the file.

Now, follow these steps.

1 - Start the Windows command prompt. If you use XP, type "cmd" in the Run textbox (Start > Run) to start the program.

2 - In the command prompt, type your pendrive's drive letter, such as "E:", and press Enter.
3 - Type attrib -r -h -s autorun.inf to remove the read-only, hidden, and system attributes.
(Now, if you type dir you will be able to see the file. I normally open this file in Notepad to see what is actually written inside. The content here is the script that will automatically execute once you double click your pendrive.)
4 - Type del autorun.inf to delete the file permanently.
5 - Eject and plug your pendrive back in.
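For convenience, steps 2 to 4 above can be run as one command prompt session. This is just a sketch; it assumes the pendrive is drive E:, so change the letter to match yours:

```bat
E:
rem Make autorun.inf visible by removing its read-only, hidden, and system attributes
attrib -r -h -s autorun.inf
rem Optional: inspect the script the virus planted before deleting it
type autorun.inf
rem Delete the file permanently
del autorun.inf
```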

Your pendrive should now work fine.