If we were to describe the C# language and its associated environment, the .NET Framework, as the most important new technology for developers in many years, we would not be exaggerating. .NET is designed to provide an environment within which you can develop almost any application to run on Windows, and possibly in the future on other platforms too, while C# is a new programming language designed specifically to work with .NET. Using C# you can, for example, write a dynamic web page, a component of a distributed application, a database access component, or a classic Windows desktop application.
Don't be fooled by the .NET label. The ".NET" in the name is there to emphasize that Microsoft believes distributed applications, in which processing is shared between client and server, are the way forward, but C# is not just a language for writing Internet or network-aware applications. It provides a means for you to code almost any type of software or component that you might need to write for the Windows platform. Between them, C# and .NET are set to revolutionize the way you write programs and to make programming on Windows much easier than it has ever been.
That's quite a substantial claim, and it needs to be justified. After all, we all know how quickly computer technology changes. Every year Microsoft brings out new software, programming tools, or versions of Windows, with the claim that they will be hugely beneficial to developers. So what's different about .NET and C#?