Monday, December 4, 2006

partitioning & formatting

A brand-new hard drive cannot be used until it has been formatted. Formatting places magnetic markers on the drive surface to define the sectors in which data is stored. When you format a hard drive, you erase all of its files and return it to a blank, as-new state.
A hard drive consists of numerous metallic platters. These platters store data magnetically. Special read/write heads realign magnetic particles on the platters, much like a recording head records data onto magnetic recording tape.

Before data can be stored on any disk, including your system's hard drive, the drive must first be formatted; an unformatted drive cannot accept any data. When you format a hard drive, your computer prepares the surface of the drive to accept and store data magnetically.

When you buy a new PC, however, the hard disk is already formatted for you, and in most cases your operating system and most key programs are preinstalled as well.
If you decide to add a second hard drive, you can format it from within Windows Explorer. To format your C: drive, however, you need to boot from a bootable disk. The following describes how to format a hard drive using Windows 98/Me or DOS. Be sure to back up any files you want to keep. You may also need to partition your drive before you format it; see "How to Partition a Hard Drive" below.

Windows Me/98 comes with an updated Fdisk.exe, which you can use to partition your drives. In most cases, the computer manufacturer will have already set up disk partitions on your hard drive when you get a new computer or drive. You can use the updated Fdisk.exe to make changes to the partitions, but you will lose any files stored on them unless you back them up first.

Hard drive size is measured in gigabytes (GB); the bigger the drive, the higher the price. If you're running Windows 95 or earlier, your operating system will only support drives up to 32GB in size. If you're running Windows 98/Me/NT/2000/XP, you can use drives that exceed the 32GB limit.
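One more thing about drive sizes: the capacity printed on the box and the capacity Windows reports rarely match, because drive makers count a gigabyte as one billion bytes while the operating system counts 2^30 bytes. A quick sketch of the difference (the example sizes are just illustrative):

```python
# Drive makers count 1 GB as 10**9 bytes; operating systems
# traditionally count 1 GB as 2**30 bytes, so a "40 GB" drive
# shows up as about 37.3 GB in Windows.

DECIMAL_GB = 10**9   # "marketing" gigabyte
BINARY_GB = 2**30    # gigabyte as reported by the OS

def reported_gb(advertised_gb):
    """Capacity the OS reports for a drive of the advertised size."""
    return advertised_gb * DECIMAL_GB / BINARY_GB

for size in (20, 40, 80):
    print(f"{size} GB on the box is about {reported_gb(size):.1f} GB in Windows")
```

So a brand-new drive that looks "smaller" than advertised is usually fine; it's just two definitions of the same unit.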

"If you need to format your drive in Windows XP"

In Windows 98/Me, you have to partition and format your hard drive manually. It isn't that hard really; we'll start with the portioning process, which uses a DOS-based utility called FDISK. WebTechGeek.com
The Steps:

1. From within Windows, click the Start button and select Run. When the Run dialog box appears, enter FDISK into the Open box and click OK.

2. When FDISK starts (in a DOS window), you'll be asked if you want to enable large disk support. Answer Yes.

3. The FDISK menu is now displayed. Select the drive you want to partition, and then choose option 1 (Create DOS Partition or Logical DOS Drive).

4. When asked if you want to use the entire drive for your DOS partition, answer Yes. (If you're asked whether you want to enable FAT32 support, also answer Yes.)

5. After the partition is created, follow the onscreen prompts to exit FDISK and restart your computer.

6. After your new drive has been partitioned, you have to format it for use. To format a new hard drive manually, follow steps A through D below.

Hard disks can be partitioned to run separate operating systems on the same disk, or to break down a disk into manageable chunks for storage. Partitioning is performed on a new or reformatted disk. These instructions describe using FDISK for PCs using DOS 3 or later.

"Instructions "

STEP 1: Start the computer in DOS. The screen will show the C:\> prompt.

STEP 2: Type "FDISK" and press Enter. The partition window will appear with menu options.

STEP 3: Enter 5 if you're partitioning a second drive, and select the drive; otherwise, skip to the next step.

STEP 4: Enter 1 (Create DOS Partition or Logical DOS Drive).

STEP 5: Enter 2 (Create Extended DOS Partition).

STEP 6: Enter N when the program asks if you want to use the maximum available size.

STEP 7: Designate the amount of disk space to allocate to the second partition (the partition will be assigned the next drive letter).

STEP 8: Type a name for the new partition and press Enter. The partition menu will appear.

STEP 9: Repeat steps 5 through 8 to create additional partitions.

STEP 10: Press Esc to exit the partition command.

STEP 11: Format the newly created partitions (see "How to Format a Hard Drive").

Note: Don't run FDISK on an existing hard disk drive, unless you're really sure you want to return your hard drive to its original from-the-factory condition. Partitioning the drive will delete all data on the drive!

A. From within Windows, open My Computer.

B. Right-click the drive you want to format, and select Format from the pop-up menu.

C. When the Format Local Disk dialog box appears, select the File System you want to use (only select NTFS if you're running Windows XP; for older operating systems, choose FAT32), enter a label for the disk, select the Quick Format option, and then click Start.

D. After your new drive has been partitioned and formatted, it's ready to store whatever data you need to store.

First make a Startup Disk (boot disk): open Add/Remove Programs in the Windows Me/98 Control Panel, select the Startup Disk tab, and click Create Disk.
The Steps for Formatting The C Drive in DOS:
1. If you installed a new C drive and need to format it, you have to boot from a floppy boot disk and use DOS to format the C drive. Start your computer using a startup boot disk; if your new hard drive came with a utilities disk, you can use that to format the drive instead. Optionally, you can use Ultimate Boot Disk software.

2. Put your boot disk in the floppy drive and start the computer. The screen should show the A:\ prompt.

3. Now type FORMAT C: /S /V and press Enter. This command formats the hard disk, transfers the system files needed to make it a startup disk, and prompts you for a volume label. The /S switch tells DOS to copy two hidden system files and COMMAND.COM to the drive. The /V switch is optional; it lets you enter a volume label that is displayed at the top of the directory listing to help you identify the disk.

4. Enter the new disk name when prompted. The hard drive is now bootable. Eject the boot floppy disk and restart your computer. The hard drive is ready to use, and you can install a new OS such as Windows. Remember to come back to WebTechGeek.com for more How-to tips!
Note: It's important that the boot disk has the same version of DOS that's on your hard drive. Use the VER command at the DOS prompt to display the current version of DOS.
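About that volume label: FAT volume labels are limited to 11 characters, and certain punctuation isn't accepted. Here's a small sketch that checks a proposed label before you type it at the FORMAT prompt; the exact set of forbidden characters below is the commonly documented one, so treat it as an assumption:

```python
# DOS/FAT volume labels (the name FORMAT's /V switch asks for)
# are limited to 11 characters. The forbidden-character set here
# is the commonly documented one -- treat it as an assumption.
FORBIDDEN = set('*?/\\|.,;:+=<>[]"')

def valid_label(label):
    """Rough check of a proposed DOS volume label."""
    return (
        len(label) <= 11
        and not any(ch in FORBIDDEN for ch in label)
    )

print(valid_label("MYDISK"))        # True: short, plain name
print(valid_label("MY BIG DISK!"))  # False: 12 characters is too long
```

If FORMAT rejects your label, shortening it and sticking to letters, digits, and spaces almost always fixes it.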

"Windows - Second Drive Format Steps:"

1. Double-click the My Computer icon, then click the second drive's icon to select it.

2. Now open the File menu and select Format. The format window should display the hard disk capacity.

3. Select the format type and enter a disk label (name). If you want to boot from your second drive, check the "Copy system files" box.

4. Now click Start. Windows displays a warning box; read it, then click OK to begin formatting. Warning: the format procedure deletes all files on the hard disk, and this action is not reversible. Be sure to back up any files you want to keep.

In Windows, click Start > Programs > MS-DOS Prompt. Type fdisk at the DOS prompt and press Enter. When asked about large disk support, press Enter to answer Yes so the drive can later be formatted with the 32-bit FAT (FAT32) file system.
Now press 5 to change the current fixed disk drive, then press 2 to select your new second drive and press Enter. You are going to partition the second drive; make sure you don't select the first drive (the C: drive).
Press 1 and then Enter to create a DOS partition on drive 2.
Press 1 again, and then press Enter to create a primary DOS partition.
Now press Enter to use the maximum available size of the new hard disk for the primary DOS partition. Drives under 8GB will use 4K clusters for file storage, so you might as well format the whole drive as one primary partition unless you have criteria other than optimal storage efficiency for dividing it. Press Esc to exit FDISK.
Now reboot Windows. The new partition on the second drive will be assigned the next free drive letter, typically D:, and you can then format it as described above.
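The 4K-cluster remark above matters because files always occupy whole clusters, so small files waste the unused remainder of their last cluster ("slack"). A rough illustration, assuming a 4K cluster size as in the sub-8GB case:

```python
# Files on a FAT32 volume occupy whole clusters, so a file's
# on-disk footprint is its size rounded up to a cluster multiple.
# A 4K cluster is assumed, per the sub-8GB case described above.
CLUSTER = 4 * 1024  # 4 KB

def on_disk(size_bytes, cluster=CLUSTER):
    """Bytes actually allocated for a file of the given size."""
    if size_bytes == 0:
        return 0
    clusters = -(-size_bytes // cluster)  # ceiling division
    return clusters * cluster

# A 100-byte file still ties up a full 4K cluster:
print(on_disk(100))    # 4096
print(on_disk(4096))   # 4096
print(on_disk(4097))   # 8192
```

Larger partitions use larger clusters, which wastes more space on small files; that's the "storage efficiency" trade-off behind splitting a big drive into several partitions.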

Monday, November 27, 2006

history of unix

"...the number of UNIX installations has grown to 10, with more expected..."
- Dennis Ritchie and Ken Thompson, June 1972

"...When BTL withdrew from the project, they needed to rewrite an operating system (OS) in order to play space war on another smaller machine (a DEC PDP-7 [Programmed Data Processor] with 4K memory for user programs). The result was a system which a punning colleague called UNICS (UNiplexed Information and Computing Service)--an 'emasculated Multics'; no one recalls whose idea the change to UNIX was."

Since it began to escape from AT&T's Bell Laboratories in the early 1970's, the success of the UNIX operating system has led to many different versions: recipients of the (at that time free) UNIX system code all began developing their own different versions in their own, different, ways for use and sale. Universities, research institutes, government bodies and computer companies all began using the powerful UNIX system to develop many of the technologies which today are part of a UNIX system.
Computer aided design, manufacturing control systems, laboratory simulations, even the Internet itself, all began life with and because of UNIX systems. Today, without UNIX systems, the Internet would come to a screeching halt. Most telephone calls could not be made, electronic commerce would grind to a halt and there would have never been "Jurassic Park"!
By the late 1970's, a ripple effect had come into play. By now the under- and post-graduate students whose lab work had pioneered these new applications of technology were attaining management and decision-making positions inside the computer system suppliers and among its customers. And they wanted to continue using UNIX systems.
Soon all the large vendors, and many smaller ones, were marketing their own, diverging, versions of the UNIX system optimized for their own computer architectures and boasting many different strengths and features. Customers found that, although UNIX systems were available everywhere, they seldom were able to interwork or co-exist without significant investment of time and effort to make them work effectively. The trade mark UNIX was ubiquitous, but it was applied to a multitude of different, incompatible products.
In the early 1980's, the market for UNIX systems had grown enough to be noticed by industry analysts and researchers. Now the question was no longer "What is a UNIX system?" but "Is a UNIX system suitable for business and commerce?"
Throughout the early and mid-1980's, the debate about the strengths and weaknesses of UNIX systems raged, often fuelled by the utterances of the vendors themselves who sought to protect their profitable proprietary system sales by talking UNIX systems down. And, in an effort to further differentiate their competing UNIX system products, they kept developing and adding features of their own.
In 1984, another factor brought added attention to UNIX systems. A group of vendors concerned about the continuing encroachment into their markets and control of system interfaces by the larger companies, developed the concept of "open systems."
Open systems were those that would meet agreed specifications or standards. This resulted in the formation of X/Open Company Ltd whose remit was, and today in the guise of The Open Group remains, to define a comprehensive open systems environment. Open systems, they declared, would save on costs, attract a wider portfolio of applications and competition on equal terms. X/Open chose the UNIX system as the platform for the basis of open systems.
Although UNIX was still owned by AT&T, the company did little commercially with it until the mid-1980's. Then the spotlight of X/Open showed clearly that a single, standard version of the UNIX system would be in the wider interests of the industry and its customers. The question now was: which version?
In a move intended to unify the market in 1987, AT&T announced a pact with Sun Microsystems, the leading proponent of the Berkeley derived strain of UNIX. However, the rest of the industry viewed the development with considerable concern. Believing that their own markets were under threat they clubbed together to develop their own "new" open systems operating system. Their new organization was called the Open Software Foundation (OSF). In response to this, the AT&T/Sun faction formed UNIX International.
The ensuing "UNIX wars" divided the system vendors between these two camps clustered around the two dominant UNIX system technologies: AT&T's System V and the OSF system called OSF/1. In the meantime, X/Open Company held the center ground. It continued the process of standardizing the APIs necessary for an open operating system specification.
In addition, it looked at areas of the system beyond the operating system level where a standard approach would add value for supplier and customer alike, developing or adopting specifications for languages, database connectivity, networking and mainframe interworking. The results of this work were published in successive X/Open Portability Guides.
XPG 4 was released in October 1992. During this time, X/Open had put in place a brand program based on vendor guarantees and supported by testing. Since the publication of XPG4, X/Open has continued to broaden the scope of open systems specifications in line with market requirements. As the benefits of the X/Open brand became known and understood, many large organizations began using X/Open as the basis for system design and procurement. By 1993, over $7 billion had been spent on X/Open branded systems. By the start of 1997 that figure has risen to over $23 billion. To date, procurements referencing the Single UNIX Specification amount to over $5.2 billion.
In early 1993, AT&T sold its UNIX System Laboratories to Novell, which was looking for a heavyweight operating system to link to its NetWare product range. At the same time, the company recognized that vesting control of the definition (specification) and trademark with a vendor-neutral organization would further facilitate the value of UNIX as a foundation of open systems. So the constituent parts of the UNIX system, previously owned by a single entity, are now quite separate.
In 1995 SCO bought the UNIX Systems business from Novell, and UNIX system source code and technology continues to be developed by SCO.
In 1995 X/Open introduced the UNIX 95 brand for computer systems guaranteed to meet the Single UNIX Specification. The Single UNIX Specification brand program has now achieved critical mass: vendors whose products have met the demanding criteria now account for the majority of UNIX systems by value.
For over ten years, since the inception of X/Open, UNIX had been closely linked with open systems. X/Open, now part of The Open Group, continues to develop and evolve the Single UNIX Specification and associated brand program on behalf of the IT community. The freeing of the specification of the interfaces from the technology is allowing many systems to support the UNIX philosophy of small, often simple tools that can be combined in many ways to perform often complex tasks. The stability of the core interfaces preserves existing investment and is allowing development of a rich set of software tools. The Open Source movement is building on this stable foundation and is creating a resurgence of enthusiasm for the UNIX philosophy. In many ways, Open Source can be seen as the true delivery of Open Systems that will ensure it continues to go from strength to strength.
1969
The Beginning
The history of UNIX starts back in 1969, when Ken Thompson, Dennis Ritchie and others started working on the "little-used PDP-7 in a corner" at Bell Labs and what was to become UNIX.
1971
First Edition
It had an assembler for a PDP-11/20, a file system, fork(), roff and ed. It was used for text processing of patent documents.

1973
Fourth Edition
It was rewritten in C. This made it portable and changed the history of operating systems.

1975
Sixth Edition
UNIX leaves home. Also widely known as Version 6, this is the first version to be widely available outside of Bell Labs. The first BSD version (1.x) was derived from V6.

1979
Seventh Edition
It was a "improvement over all preceding and following Unices" [Bourne]. It had C, UUCP and the Bourne shell. It was ported to the VAX and the kernel was more than 40 Kilobytes (K).

1980
Xenix
Microsoft introduces Xenix. 32V and 4BSD introduced.

1982
System III
AT&T's UNIX System Group (USG) releases System III, the first public release outside Bell Laboratories. SunOS 1.0 ships. HP-UX introduced. Ultrix-11 introduced.

1983
System V
Computer Research Group (CRG), UNIX System Group (USG) and a third group merge to become UNIX System Development Lab. AT&T announces UNIX System V, the first supported release. Installed base 45,000.

1984
4.2BSD
University of California at Berkeley releases 4.2BSD, includes TCP/IP, new signals and much more. X/Open formed.
1984
SVR2
System V Release 2 introduced. At this time there are 100,000 UNIX installations around the world.

1986
4.3BSD
4.3BSD released, including internet name server. SVID introduced. NFS shipped. AIX announced. Installed base 250,000.

1987
SVR3
System V Release 3 including STREAMS, TLI, RFS. At this time there are 750,000 UNIX installations around the world. IRIX introduced.
1988
POSIX.1 published. Open Software Foundation (OSF) and UNIX International (UI) formed. Ultrix 4.2 ships.

1989
AT&T UNIX Software Operation formed in preparation for spinoff of USL. Motif 1.0 ships.
1989
SVR4
UNIX System V Release 4 ships, unifying System V, BSD and Xenix. Installed base 1.2 million.

1990
XPG3
X/Open launches XPG3 Brand. OSF/1 debuts. Plan 9 from Bell Labs ships.

1991
UNIX System Laboratories (USL) becomes a company - majority-owned by AT&T. Linus Torvalds commences Linux development. Solaris 1.0 debuts.

1992
SVR4.2
USL releases UNIX System V Release 4.2 (Destiny). October - XPG4 Brand launched by X/Open. December 22nd Novell announces intent to acquire USL. Solaris 2.0 ships.

1993
4.4BSD
4.4BSD, the final release from Berkeley. June 16: Novell acquires USL.
Late 1993
SVR4.2MP
Novell transfers rights to the "UNIX" trademark and the Single UNIX Specification to X/Open. COSE initiative delivers "Spec 1170" to X/Open for fasttrack. In December, Novell ships SVR4.2MP, the final USL OEM release of System V.

1994
Single UNIX Specification
BSD 4.4-Lite eliminated all code claimed to infringe on USL/Novell. As the new owner of the UNIX trademark, X/Open introduces the Single UNIX Specification (formerly Spec 1170), separating the UNIX trademark from any actual code stream.

1995
UNIX 95
X/Open introduces the UNIX 95 branding programme for implementations of the Single UNIX Specification. Novell sells UnixWare business line to SCO. Digital UNIX introduced. UnixWare 2.0 ships. OpenServer 5.0 debuts.

1996
The Open Group forms as a merger of OSF and X/Open.

1997
Single UNIX Specification, Version 2
The Open Group introduces Version 2 of the Single UNIX Specification, including support for realtime, threads and 64-bit and larger processors. The specification is made freely available on the web. IRIX 6.4, AIX 4.3 and HP-UX 11 ship.

1998
UNIX 98
The Open Group introduces the UNIX 98 family of brands, including Base, Workstation and Server. First UNIX 98 registered products shipped by Sun, IBM and NCR. The Open Source movement starts to take off with announcements from Netscape and IBM. UnixWare 7 and IRIX 6.5 ship.

1999
UNIX at 30
The UNIX system reaches its 30th anniversary. Linux 2.2 kernel released. The Open Group and the IEEE commence joint development of a revision to POSIX and the Single UNIX Specification. First LinuxWorld conferences. Dot com fever on the stock markets. Tru64 UNIX ships.

2001
Single UNIX Specification, Version 3
Version 3 of the Single UNIX Specification unites IEEE POSIX, The Open Group and the industry efforts. Linux 2.4 kernel released. IT stocks face a hard time at the markets. The value of procurements for the UNIX brand exceeds $25 billion. AIX 5L ships.

2003
ISO/IEC 9945:2003
The core volumes of Version 3 of the Single UNIX Specification are approved as an international standard. The "Westwood" test suite ships for the UNIX 03 brand. Solaris 9.0 E ships. Linux 2.6 kernel released.

Sunday, November 26, 2006

history of linux

a. In The Beginning

It was 1991, and the ruthless agonies of the cold war were gradually coming to an end. There was an air of peace and tranquility on the horizon. In the field of computing, a great future seemed to be in the offing, as powerful hardware pushed the limits of computers beyond what anyone had expected.

But still, something was missing.

And it was none other than the operating system, where a great void seemed to have appeared.

For one thing, DOS was still reigning supreme in its vast empire of personal computers. Bought by Bill Gates from a Seattle hacker for $50,000, the bare bones operating system had sneaked into every corner of the world by virtue of a clever marketing strategy. PC users had no other choice. Apple Macs were better, but with astronomical prices that nobody could afford, they remained a horizon away from the eager millions.

The other dedicated camp of computing was the Unixworld. But Unix itself was far more expensive. In quest of big money, the Unix vendors priced it high enough to ensure small PC users stayed away from it. The source code of Unix, once taught in universities courtesy of Bell Labs, was now cautiously guarded and not published publicly. To add to the frustration of PC users worldwide, the big players in the software market failed to provide an efficient solution to this problem.

A solution seemed to appear in the form of MINIX. It was written from scratch by Andrew S. Tanenbaum, a US-born Dutch professor who wanted to teach his students the inner workings of a real operating system. It was designed to run on the Intel 8086 microprocessors that had flooded the world market.

As an operating system, MINIX was not a superb one. But it had the advantage that its source code was available. Anyone who happened to get the book 'Operating Systems: Design and Implementation' by Tanenbaum could get hold of the 12,000 lines of code, written in C and assembly language. For the first time, an aspiring programmer or hacker could read the source code of an operating system, something software vendors had until then guarded vigorously. A superb author, Tanenbaum captivated the brightest minds of computer science with his elaborate and immaculately lively discussion of the art of creating a working operating system. Students of computer science all over the world pored over the book, reading through the code to understand the very system that ran their computers.

And one of them was Linus Torvalds.





b. New Baby in the Horizon

In 1991, Linus Benedict Torvalds was a second-year computer science student at the University of Helsinki and a self-taught hacker. The 21-year-old, sandy-haired, soft-spoken Finn loved to tinker with the power of computers and the limits to which a system could be pushed. But what was lacking was an operating system that could meet the demands of professionals. MINIX was good, but it was simply an operating system for students, designed as a teaching tool rather than an industrial-strength one.

At that time, programmers worldwide were greatly inspired by the GNU project by Richard Stallman, a software movement to provide free and quality software. Revered as a cult hero in the realm of computing, Stallman started his career in the famous Artificial Intelligence Laboratory at MIT, and during the mid and late seventies created the Emacs editor. In the early eighties, commercial software companies lured away many of the brilliant programmers of the AI lab, and negotiated stringent nondisclosure agreements to protect their secrets. But Stallman had a different vision. His idea was that, unlike other products, software should be free from restrictions against copying or modification, in order to make better and more efficient computer programs. With his famous 1983 manifesto declaring the beginnings of the GNU project, he started a movement to create and distribute software that conveyed his philosophy. (Incidentally, the name GNU is a recursive acronym which actually stands for 'GNU's Not Unix'.) But to achieve this dream of ultimately creating a free operating system, he needed to create the tools first. So, beginning in 1984, Stallman started writing the GNU C Compiler (GCC), an amazing feat for an individual programmer. With his legendary technical wizardry, he alone outclassed entire groups of programmers from commercial software vendors in creating GCC, considered one of the most efficient and robust compilers ever created.

Richard Stallman, father of the GNU Project

By 1991, the GNU project had created many of the tools. The much-awaited GNU C compiler was available by then, but there was still no operating system. Even MINIX had to be licensed. (Later, in April 2000, Tanenbaum released MINIX under the BSD License.) Work was underway on the GNU kernel, HURD, but it was not expected to come out for a few years.

That was too much of a delay for Linus.

On August 25, 1991, Linus sent this historic post to the MINIX newsgroup:


From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroups: comp.os.minix
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Message-ID: <1991aug25.205708.9541@klaava.helsinki.fi>
Date: 25 Aug 91 20:57:08 GMT
Organization: University of Helsinki

Hello everybody out there using minix -
I'm doing a (free) operating system (just a hobby, won't be big and
professional like gnu) for 386(486) AT clones. This has been brewing
since april, and is starting to get ready. I'd like any feedback on
things people like/dislike in minix, as my OS resembles it somewhat
(same physical layout of the file-system (due to practical reasons)
among other things). I've currently ported bash(1.08) and gcc(1.40), and
things seem to work. This implies that I'll get something practical within a
few months, and I'd like to know what features most people would want. Any
suggestions are welcome, but I won't promise I'll implement them :-)
Linus (torvalds@kruuna.helsinki.fi)
PS. Yes - it's free of any minix code, and it has a multi-threaded fs.
It is NOT protable (uses 386 task switching etc), and it probably never
will support anything other than AT-harddisks, as that's
all I have :-(.

As is apparent from the posting, Linus himself didn't believe that his creation was going to be big enough to change computing forever. Linux version 0.01 was released by mid-September 1991 and put on the net. Enthusiasm gathered around this new kid on the block, and code was downloaded, tested, tweaked, and returned to Linus. Version 0.02 came on October 5th, along with this famous declaration from Linus:


From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroups: comp.os.minix
Subject: Free minix-like kernel sources for 386-AT
Message-ID: <1991oct5.054106.4647@klaava.helsinki.fi>
Date: 5 Oct 91 05:41:06 GMT
Organization: University of Helsinki
Do you pine for the nice days of minix-1.1, when men were men and wrote their own device drivers?
Are you without a nice project and just dying to cut your teeth on a OS you can try to modify for your
needs? Are you finding it frustrating when everything works on minix? No more all-nighters to get a nifty program working? Then this post might be just for you :-)
As I mentioned a month(?)ago, I'm working on a free version of a minix-lookalike for AT-386 computers. It has
finally reached the stage where it's even usable (though may not be depending on
what you want), and I am willing to put out the sources for wider distribution. It is just version 0.02 (+1 (very
small) patch already), but I've successfully run bash/gcc/gnu-make/gnu-sed/compress etc under it.
Sources for this pet project of mine can be found at nic.funet.fi (128.214.6.100) in the directory /pub/OS/Linux.
The directory also contains some README-file and a couple of binaries to work under linux
(bash, update and gcc, what more can you ask for :-). Full kernel source is provided, as no minix code has been
used. Library sources are only partially free, so that cannot be distributed currently. The system is able to compile
"as-is" and has been known to work. Heh. Sources to the binaries (bash and gcc) can be found at the
same place in /pub/gnu.

Linux version 0.03 came a few weeks later. By December came version 0.10. Still, Linux was little more than skeletal: it supported only AT hard disks and had no login (it booted directly to bash). Version 0.11 was much better, with support for multilingual keyboards, floppy disk drivers, VGA, EGA, Hercules and more. The version numbers went directly from 0.12 to 0.95, 0.96, and so on. Soon the code went worldwide via FTP sites in Finland and elsewhere.




c. Confrontation & Development


Linus displays Linux running on a notebook pc

Soon Linus faced some confrontation from none other than Andrew Tanenbaum, the great teacher who wrote MINIX. In a post to Linus, Tanenbaum commented:


"I still maintain the point that designing a monolithic kernel in 1991 is a fundamental error. Be thankful you are not my student. You would not get a high grade for such a design :-)"
(Andrew Tanenbaum to Linus Torvalds)

Linus later admitted that it was the worst point in his development of Linux. Tanenbaum was certainly a famous professor, and anything he said mattered. But he was wrong about Linux, for Linus was one stubborn guy who wouldn't admit defeat.

Tanenbaum also remarked that : "Linux is obsolete".

Now it was the turn of the new Linux generation. Backed by the strong Linux community, Linus gave a reply to Tanenbaum which seems most fitting:


Your job is being a professor and researcher: That's one hell of a good excuse for some of the brain-damages of minix.
(Linus Torvalds to Andrew Tanenbaum)

And work went on. Soon more than a hundred people joined the Linux camp; then thousands; then hundreds of thousands. This was no longer a hacker's toy. Powered by a plethora of programs from the GNU project, Linux was ready for the actual showdown. It was licensed under the GNU General Public License, ensuring that the source code would be free for all to copy, study and change. Students and computer programmers grabbed it.

Soon, commercial vendors moved in. Linux itself was, and is, free. What the vendors did was compile various software and gather it into a distributable format, more like the other operating systems people were familiar with. Red Hat, Caldera, and some other companies gained a substantial response from users worldwide. While these were commercial ventures, dedicated computer programmers created their very own volunteer-based distribution, the famed Debian. With the new graphical user interfaces (like the X Window System, KDE, and GNOME), Linux distributions became very popular.

Meanwhile, amazing things were happening with Linux. Besides the PC, Linux was ported to many different platforms. It was tweaked to run on 3Com's handheld PalmPilot computer. Clustering technology enabled large numbers of Linux machines to be combined into a single computing entity, a parallel computer. In April 1996, researchers at Los Alamos National Laboratory used Linux to run 68 PCs as a single parallel processing machine to simulate atomic shock waves. But unlike other supercomputers costing a fortune, it was rather cheap. The do-it-yourself supercomputer cost only $152,000, including labor (connecting the 68 PCs with cables), about one tenth the price of a comparable commercial machine. It reached a peak speed of 19 billion calculations per second, making it the 315th most powerful supercomputer in the world. And it was a robust one too: three months later it still hadn't needed a reboot.


A Beaming Linus Today

The best thing about Linux today is the fanatical following it commands. Whenever a new piece of hardware comes out, the Linux kernel is tweaked to take advantage of it. For example, within weeks of the introduction of the Intel Xeon® microprocessor, the Linux kernel had been tweaked and was ready for it. It has also been adapted for use on Alpha, Mac, and PowerPC platforms, and even on palmtops, a feat hardly matched by any other operating system. And it continues its journey into the new millennium, with the same enthusiasm that started one fine day back in 1991.


As for Linus, he remains a simple man. Unlike Bill Gates, he is not a billionaire. Having completed his studies, he moved to the USA and landed a job at Transmeta Corporation. After a top-secret research and development project, Transmeta launched the Crusoe™ processor, with Linus an active member of the research team. Married to Tove, he is the proud father of a girl, Patricia Miranda Torvalds. Revered by computer communities worldwide, he remains by far the world's most famous and most popular programmer to this date.



d. After a Decade: Linux Today

Proving all the warnings and prophecies of the skeptics wrong, Linux has completed a decade of development. Today, Linux is one of the fastest growing operating systems in history. From a few dedicated fanatics in 1991-92 to millions of general users at present, it has certainly been a remarkable journey. Big businesses have 'discovered' Linux and poured millions of dollars into the development effort, debunking the anti-business myth of the open-source movement. IBM Corp., once considered the archenemy of the open-source hacker community, has come forward with huge funding for the development of open-source, Linux-based solutions. But what's really amazing is the continuously growing band of developers spread throughout the world who work with fervent zeal to improve the features of Linux. The development effort is not, as many closed-source advocates accuse, engulfed in chaos; a well-designed development model, supervised by a number of maintainers, is followed. Alongside this, there are thousands of developers working to port various applications to Linux.

Commercial enterprises are no longer wary of Linux. With a large number of vendors providing support for Linux-based products, it is no longer a 'use-at-your-own-risk' proposition to run Linux at the office. As for reliability, Linux certainly proved itself during the nasty attacks of the CIH virus in 1999 and the Love Bug a year later, when Linux-based machines proved immune to the damage caused by these otherwise quite simple computer viruses. Linux startups like Red Hat received a cordial response when they went public, and even after the dot-com bust of recent years, these companies continue to thrive and grow. With this added confidence, many large and small businesses have adopted Linux-based servers and workstations as an integral part of their offices.

Rise of the Desktop Linux

What is the biggest complaint against Linux? In the past, it was perhaps the text-based interface, which scared many people away. 'Text mode gives total control', some dedicated hackers and heavy users may explain, but for millions of ordinary people it also means a lot of effort spent learning the system. The existing X Window System and the window managers were not up to general computer users' expectations, and exactly this argument had long been put forward by dedicated followers of the Windows(TM) camp. But things began to change over the last couple of years. The advent of professional-looking desktop environments like KDE (the K Desktop Environment) and GNOME completed the picture. Recent versions of these desktop environments have changed the general perception of the 'user friendliness' of Linux to a great extent. Though hard-core users grumble about the loss of purity of the hacker culture, this great change in the mindset of common users has increased the popularity of Linux.

Today, almost all distributions of Linux include user-friendly GUIs. Installation has also become easier. Gone are the days when users needed detailed expertise in computer hardware to install Linux ... distributions like Ubuntu, Debian, SUSE, Knoppix, and Red Hat's Fedora Core can be installed even by novice users. Most distributions are also available in Live CD format, which users can simply put in their CD drives and boot from without installing anything to the hard drive, making Linux accessible to newbies.

Linux in the Developing World

Perhaps the greatest change is the spread of Linux to the developing world. In the days before Linux, developing countries were far behind in the field of computing. The cost of hardware had fallen, but the cost of software was a huge burden to the cash-strapped computer enthusiasts of the Third World. In desperation, people resorted to piracy of almost all sorts of software products, amounting to billions of dollars in losses. The price tags of most commercial products were simply far beyond the reach of people in developing countries: a typical operating system product costs at least US$100, which in countries with per capita incomes of about US$200-300 is a huge amount.

The rise of Linux and related open-source products has changed it all. Since Linux can be scaled to run on almost any computer with very few resources, it has become a suitable alternative for low-budget computer users. Old 486 and Pentium-class computers that have become a part of history in the developed world are still used in developing countries, and Linux has unleashed the full potential of these machines. The use of open-source software has also proliferated, since the price of software is a big question. In the countries of Asia, Africa and Latin America, Linux has emerged as a way out for the masses of computer enthusiasts. And in a testament to the truly global nature of Linux, local customizations have been made in obscure parts of the world. The Linux documentation now includes documents written in almost all the major languages ... and also many minor ones, for example Vietnamese.

From Desktop to SuperComputing

When Linux was first envisaged by Linus Torvalds, it was just another hacker's hobby. But from Linus's humble Intel 386 machine that ran the first kernel, Linux has come a long way. Its most notable use now is in the field of massively parallel supercomputing clusters.

In August 2001, the BBC reported that the US Government was planning to build what would be a mega computer, capable of performing over 13 trillion calculations per second (13.6 TeraFLOPS). The project, called TeraGrid, would consist of a connected network of four US supercomputing centers: the National Center for Supercomputing Applications (NCSA) at the University of Illinois; the San Diego Supercomputer Center (SDSC) at the University of California; Argonne National Laboratory in Chicago; and the California Institute of Technology in Pasadena. At each of these centers there would be a supercomputer, and in total more than 3,000 processors would run in parallel to create the TeraGrid.

By 2005, the use of Linux had become even more prevalent in supercomputing. The 2005 Top500 list of supercomputers shows that four of the five fastest supercomputers use Linux as their operating system.

The Journey Continues

The journey of Linux from a hacking project to a global phenomenon has been an evolutionary experience. The GNU Project, started in the early 1980s by Richard Stallman, laid the foundation for the development of open-source software. Prof. Andrew Tanenbaum's personal computer operating system MINIX brought the study of operating systems from a theoretical basis to a practical one. And finally, Linus Torvalds's endless enthusiasm for perfection gave birth to Linux. Over the years, the hundreds of thousands of people forming a global community have nurtured it and brought it to its glorious place in the annals of the computer revolution. Today Linux is not just another student's hacking project; it is a worldwide phenomenon bringing together huge companies like IBM and countless millions of people throughout the world in the spirit of the open-source software movement. In the history of computing, it will forever remain one of the most amazing endeavors of human achievement.



e. Tux the penguin: Linux's Dear Logo

The logo of Linux is a penguin. Unlike other commercial computer operating systems, Linux doesn't have a formidable, serious-looking symbol. Rather, Tux, as the penguin is lovingly called, symbolizes the carefree attitude of the whole movement. This cute logo has an interesting history. As Linus tells it, no logo was initially selected for Linux. Then, on a vacation to the southern hemisphere, Linus encountered a penguin not unlike the current logo. As he tried to pat it, the penguin bit his hand. This amusing incident led to the selection of a penguin as the logo of Linux some time later.




f. Some Linux Cookies

Here are some famous words by Linus himself.

Dijkstra probably hates me
(Linus Torvalds, in kernel/sched.c)

"How should I know if it works? That's what beta testers are for. I only
coded it."
(Attributed to Linus Torvalds, somewhere in a posting)

"I'm an idiot.. At least this one [bug] took about 5 minutes to find.."
(Linus Torvalds in response to a bug report.)

"If you want to travel around the world and be invited to speak at a lot
of different places, just write a Unix operating system."
(By Linus Torvalds)

> > Other than the fact Linux has a cool name, could someone explain why I
> > should use Linux over BSD?

> No. That's it. The cool name, that is. We worked very hard on
> creating a name that would appeal to the majority of people, and it
> certainly paid off: thousands of people are using linux just to be able
> to say "OS/2? Hah. I've got Linux. What a cool name". 386BSD made the
> mistake of putting a lot of numbers and weird abbreviations into the
> name, and is scaring away a lot of people just because it sounds too
> technical.
(Linus Torvalds' follow-up to a question about Linux)

> The day people think linux would be better served by somebody else (FSF
> being the natural alternative), I'll "abdicate". I don't think that
> it's something people have to worry about right now - I don't see it
> happening in the near future. I enjoy doing linux, even though it does
> mean some work, and I haven't gotten any complaints (some almost timid
> reminders about a patch I have forgotten or ignored, but nothing
> negative so far).

> Don't take the above to mean that I'll stop the day somebody complains:
> I'm thick-skinned (Lasu, who is reading this over my shoulder commented
> that "thickheaded is closer to the truth") enough to take some abuse.
> If I weren't, I'd have stopped developing linux the day ast ridiculed me
> on c.o.minix. What I mean is just that while linux has been my baby so
> far, I don't want to stand in the way if people want to make something
> better of it (*).
Linus

> (*) Hey, maybe I could apply for a saint-hood from the Pope. Does
> somebody know what his email-address is? I'm so nice it makes you puke.
(Taken from Linus's reply to someone worried about the future of Linux)

`When you say "I wrote a program that crashed Windows", people just stare at
you blankly and say "Hey, I got those with the system, *for free*".'
(By Linus Torvalds)





Wednesday, November 22, 2006

cpu sockets

Starting with the 486, PC processors began to be “socketed” instead of soldered directly to the motherboard. Since then, both Intel and AMD have created several different sockets and slots for their processors. In this tutorial we list all socket and slot types released to date, with examples of compatible CPUs.
Up to the 386, almost all CPUs were soldered directly to the motherboard. There were socket-based Intel 386 CPUs, but a CPU upgrade was a very rare procedure among users and even among technicians. So up to that time, if you wanted a faster CPU, you usually had to replace the motherboard as well.
This story changed with the launch of the 486 and the massive use of the ZIF (Zero Insertion Force) socket, which has a lever that installs and removes the CPU without the user or technician having to press the CPU down into the socket. This socket greatly lowered the chances of breaking or bending the CPU's pins during installation or removal. And since more than one processor could share the same pinout, a user or technician could install different processor models on the same motherboard by simply removing the old CPU and installing the new one, as long as the motherboard was compatible with the new CPU and properly configured.
Since the days of the 486 processor, both Intel and AMD have been developing a series of sockets and slots for their CPUs.
The socket used by the very first 486 processors wasn't ZIF and didn't allow you to replace the CPU with a different processor model. Even though this socket had no official name, let's call it socket 0. After socket 0, Intel released socket 1, which had the same pinout as socket 0 plus one key pin. It also adopted the ZIF standard, allowing several different processor types to be installed on the same socket (i.e. on the same motherboard). Other socket standards were released for the 486 family after socket 1 (socket 2, socket 3 and socket 6), each increasing the number of CPU models that could be installed: socket 2 accepts the same CPUs as socket 1 plus some more models, and so on. Socket 6, though designed, was never used, so we usually refer to the pinout used by 486-class processors as "socket 3". Intel called "overdrive" the ability of a socket to accept more than one CPU model, and also used the name for CPUs that adopted the pinout of an older CPU so they could be installed on an older motherboard.
The first Pentium processors (60 MHz and 66 MHz) used a pinout standard called socket 4, which was fed with 5 V. Pentium processors from 75 MHz on were fed with 3.3 V and thus required a new socket, called socket 5, which was incompatible with socket 4 (a Pentium-60 couldn't be installed on socket 5, and a Pentium-100 couldn't be installed on socket 4, for example). Socket 7 uses the same pinout as socket 5 plus one key pin, accepting the same processors as socket 5 as well as newer CPUs, especially those designed by competing companies. (The real difference between the two is that while socket 5 always fed the CPU with 3.3 V, socket 7 allowed the CPU to be fed with other voltage levels, such as 3.5 V or 2.8 V.) The Super 7 socket is a socket 7 capable of running at up to 100 MHz, used by AMD CPUs. We usually refer to the pinout of the Pentium Classic and compatible CPUs as "socket 7".
As you may have noticed, sockets and pinouts at this stage were very confusing, since a given processor could be installed in several different socket types: a 486DX-33 could be installed in sockets 0, 1, 2, 3 and, had it been released, 6.
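To make that overlap concrete, here is a small illustrative Python sketch (the `sockets_for` helper and the data structure are ours, filled in from the compatibility table in this article, and cover only the 486-era sockets) that looks up which sockets accept a given CPU:

```python
# Which 486-era sockets accept a given CPU?
# Compatibility data taken from the socket table in this article.
SOCKET_CPUS = {
    "Socket 0": {"486 DX"},
    "Socket 1": {"486 DX", "486 DX2", "486 SX", "486 SX2"},
    "Socket 2": {"486 DX", "486 DX2", "486 SX", "486 SX2",
                 "Pentium Overdrive"},
    "Socket 3": {"486 DX", "486 DX2", "486 DX4", "486 SX", "486 SX2",
                 "Pentium Overdrive", "5x86"},
}

def sockets_for(cpu):
    """Return, in release order, the sockets that accept the given CPU."""
    return [name for name, cpus in SOCKET_CPUS.items() if cpu in cpus]

print(sockets_for("486 DX"))
# A plain 486 DX fits every one of these sockets, as the text explains.
```
Note how each newer socket is a superset of the one before it, which is exactly why the scheme became confusing and was abandoned for later CPUs.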
For later CPUs, manufacturers followed a simpler scheme in which each CPU fits only one socket type.
The table below lists all socket and slot types created by Intel and AMD since the 486, with examples of compatible CPUs.
Socket 0 (168 pins): 486 DX
Socket 1 (169 pins): 486 DX, 486 DX2, 486 SX, 486 SX2
Socket 2 (238 pins): 486 DX, 486 DX2, 486 SX, 486 SX2, Pentium Overdrive
Socket 3 (237 pins): 486 DX, 486 DX2, 486 DX4, 486 SX, 486 SX2, Pentium Overdrive, 5x86
Socket 4 (273 pins): Pentium-60 and Pentium-66
Socket 5 (320 pins): Pentium-75 to Pentium-133
Socket 6 (235 pins, never used): 486 DX, 486 DX2, 486 DX4, 486 SX, 486 SX2, Pentium Overdrive, 5x86
Socket 7 (321 pins): Pentium-75 to Pentium-200, Pentium MMX, K5, K6, 6x86, 6x86MX, MII
Socket Super 7 (321 pins): K6-2, K6-III
Socket 8 (387 pins): Pentium Pro
Socket 370 (370 pins): Celeron, Pentium III FC-PGA, Cyrix III, C3
Socket 423 (423 pins): Pentium 4
Socket 463 (463 pins): Nx586
Socket 478 (478 pins): Pentium 4, Pentium 4 Extreme Edition, Celeron, Celeron D, Celeron M, Core Duo, Core Solo, Pentium M, Mobile Pentium III, Mobile Pentium 4, Mobile Celeron
Socket 479 (Socket M, 479 pins): Core Duo, Core Solo, Pentium M, Celeron M, Mobile Pentium III, Mobile Pentium 4, Mobile Celeron
Socket 775 (LGA775, Socket T, 775 pins): Pentium 4, Pentium 4 Extreme Edition, Pentium D, Pentium Extreme Edition, Celeron D, Core 2 Duo, Core 2 Extreme
Socket 603 (603 pins): Xeon, Mobile Pentium 4
Socket 604 (604 pins): Xeon
Socket 771 (771 pins): Xeon
Socket 418 (418 pins): Itanium
Socket 611 (611 pins): Itanium 2
Socket 462 (Socket A, 453 pins): Athlon, Duron, Athlon XP, Sempron
Socket 754 (754 pins): Athlon 64, Sempron, Turion 64
Socket 939 (939 pins): Athlon 64, Athlon 64 FX, Athlon 64 X2, Opteron
Socket 940 (940 pins): Athlon 64 FX, Opteron
Socket AM2 (940 pins): Athlon 64, Athlon 64 FX, Athlon 64 X2, Sempron
Socket S1 (638 pins): Turion 64 X2
Socket F (1,207 pins): Opteron
Slot 1 (242 contacts): Pentium II, Pentium III (Cartridge), Celeron SEPP (Cartridge)
Slot 2 (330 contacts): Pentium II Xeon, Pentium III Xeon
Slot A (242 contacts): Athlon (Cartridge)

Wednesday, November 15, 2006

celeron vs pentium 4

The Celeron and Pentium processors are two of Intel's best-selling CPUs, found in the majority of home computer systems. When comparing the two, it should first be understood that there are several generations of Pentium processors, from the original Pentium all the way to the Pentium 4 (the latest). The Celeron processors are more or less the same design, although you will find them in a wide variety of speeds.
The Intel Celeron processor was always designed as a low-cost alternative to the Pentium processor line. It is much like a car company offering cars at various prices, from the luxury sedan to the economy compact. The Celeron is simply a downgraded Pentium that almost anyone can afford (it is essentially the compact). To begin with, Celeron chips have a smaller L2 cache (128 KB, compared with 512 KB in the Pentium 4 Northwood), which translates into slower processing. In fact, current Celerons have a clock speed limit of about 2.0 GHz, whereas the Pentium 4 is capable of speeds in excess of 3.0 GHz. In addition, the Pentium 4 runs at a lower core voltage (1.5 V vs. 1.75 V), making it more energy efficient.
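The figures quoted above can be lined up side by side. Here is a small illustrative Python sketch (the numbers are the ones from this comparison, not a complete spec sheet, and the dictionaries are our own construction):

```python
# Spec figures quoted in this comparison (Celeron vs. Pentium 4 Northwood).
celeron = {"l2_cache_kb": 128, "max_clock_ghz": 2.0, "core_voltage": 1.75}
pentium4 = {"l2_cache_kb": 512, "max_clock_ghz": 3.0, "core_voltage": 1.5}

# The Pentium 4's L2 cache advantage, as a simple ratio.
cache_ratio = pentium4["l2_cache_kb"] / celeron["l2_cache_kb"]
print(f"The Pentium 4 Northwood has {cache_ratio:.0f}x the Celeron's L2 cache")
```

The 4x cache gap, more than the raw clock speed, is what separates the two lines in everyday use.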
In summary, the Pentium 4 is more powerful than the most advanced Celeron processor on the market, and Intel has planned it that way. Many applications will work just fine with a Celeron processor, despite its having a little less power than the Pentium 4. It is a way to save a little cash when buying a new PC, but don't forget the saying "you get what you pay for." Celeron processors are of good Intel quality, but they will never be as good as the Pentium.
This Celeron vs. Pentium review was brought to you by SciNet Science and Technology Search Engine. SciNet is not affiliated with Intel Corp. and does not specifically endorse the Celeron or Pentium processors. Please consult the Celeron and Pentium product information and configuration before you purchase either processor. It is also a good idea to seek out other up-to-date product reviews and information as necessary.

Type of Computers

There are four basic types of computers: PCs (personal computers), workstations, laptops, and servers.
The PC, or Personal Computer, is the most common category of computers. This category would include your home PCs and most of your business class PCs.
A workstation is a high-performance version of the PC. Workstation manufacturers took many of the high-speed and high-availability components normally found in servers and built them into a PC. Throughout the rest of this tutorial you will find the words PC and workstation used interchangeably; while technically there is a difference, the two are in the same basic category, so from here on we will use "workstation" to describe a PC as well.
Laptops are portable computers. Originally, laptops were large, heavy beasts with short battery lives. Nowadays, laptops (also referred to as notebooks) are light and powerful, have good battery life, and serve as a desktop replacement for many individuals (including myself).
A server is a machine designed for file or print serving, application hosting, or some other task usually involving many simultaneous connections. Common features of servers include redundancy, multiple drives, large amounts of memory, and multiple processors.
PDAs
A Personal Digital Assistant (PDA) is a commonplace item in many businesses and homes. Spurred by the success of Palm Pilots, the PDA industry has had tremendous growth in the past few years.
A PDA is a device which allows an individual to keep their notes, email, schedule, small documents, and other information with them at all times. It is a useful device for record keeping and usually syncs with your computer to allow your contact list and emails to be "in sync" with each other in your handheld and normal computers.
Palm Pilots are the most prevalent PDAs on the market, with a market share of about 30%.
Most PDAs are based on either the Palm operating system or the Microsoft PocketPC operating system.
Palm Pilots present an interesting challenge to the IT staff trying to support them. At one firm I worked at, we went through several different policies regarding PDA support. Originally, we would not support any PDA, though if a member of management purchased one, we would support them individually. Eventually, we settled on a standard: anyone who bought the standard PDA would be supported. This allowed us to designate an expert (which coincidentally happened to be me) and to develop support procedures for the specific PDA we supported.

1394 firewire

Firewire is a high-speed, hot-swappable peripheral interface that supports data transfer rates of up to 800 megabits/second. Firewire was originally developed by Apple Computer Corporation and was adopted as an industry standard (IEEE 1394) in 1995.

Firewire specifications

The original implementation of Firewire operated at a speed of 400 megabits/second, but the latest version (Firewire 800 - IEEE 1394b) has doubled the data transfer rate to a whopping 800 megabits/second. Firewire 800 also doubled the maximum length of a Firewire cable to 15 feet.
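To get a feel for what these rates mean in practice, here is a small Python sketch estimating the theoretical time to move a file at each speed. It is a back-of-the-envelope calculation only: real-world throughput will be lower because of protocol overhead, and the `transfer_seconds` helper is our own.

```python
def transfer_seconds(size_mb, rate_mbps):
    """Theoretical time to move size_mb megabytes at rate_mbps megabits
    per second, ignoring protocol overhead (1 byte = 8 bits)."""
    return size_mb * 8 / rate_mbps

# A 700 MB file (roughly one CD) over each Firewire generation:
print(f"Firewire 400: {transfer_seconds(700, 400):.0f} seconds")
print(f"Firewire 800: {transfer_seconds(700, 800):.0f} seconds")
```

Doubling the bus speed halves the theoretical transfer time, which is why Firewire 800 matters for large files like digital video captures.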

Firewire devices

The amazing speed of the Firewire bus and the ability to connect and disconnect Firewire devices while the computer is running has made Firewire the connection of choice for manufacturers of digital camcorders. Firewire is also supported in a number of hard drives and CD/DVD burners.

Advantages of Firewire over USB

Firewire is twice as fast as USB 2.0.
The Firewire bus can directly power devices of up to 45 watts, which means that many more Firewire devices can operate without a dedicated power supply than with USB.
When used with a Firewire 800 optical repeater, you can use Firewire devices up to 3300' away.

Disadvantages of Firewire

Not all PCs come factory equipped with Firewire ports.
Firewire devices are typically more expensive than USB devices.

Adding Firewire ports to your computer

You can add Firewire capability to virtually any PC that has an open PCI slot simply by purchasing and installing a Firewire adapter, and PCMCIA Firewire adapters are readily available for notebook computers.

Conclusion

More Firewire devices are coming onto the market every day, and the prices of existing Firewire devices continue to drop. If peak performance is your goal, you owe it to yourself to switch over to Firewire.