SCSI is usually considered "server grade" hardware, and has the big advantage over IDE that it allows 8 or 16 IDs per chain (7 or 15 devices plus the controller, depending on whether it's narrow or wide SCSI), which makes it perfect for use in RAID arrays and the like. It also has performance advantages over IDE which get a bit arcane and mostly matter only if you are in a heavy multitasking / multithreaded environment with a lot of background disk I/O heavy tasks going on. IDE is cheaper "desktop" grade hardware. However IDE has arguably gotten better faster than SCSI, so there is less reason to prefer SCSI these days except for specialized server applications.
To put it in woodburner terms, think of it as being like chainsaws - IDE = "homeowner grade" saws, and SCSI = "pro-grade" saws for BIG trees. Used to be the quality of the drives was significantly different, but there has been a lot of convergence so most times these days the same internals are used for both SCSI and IDE, and only the controllers are different.
With SCSI there are a few problems - one is that there are a huge number of different SCSI standards that have evolved over the years, not all of which play nicely together, and some of which are totally incompatible and will cause smoke if connected...
However it's a solvable problem.
Start with the hardware configuration - if the hardware isn't right, it's possible nothing will work.
1. All the devices must be connected in a single, non-branching, chain.
2. There must NOT be cable flopping around past the end devices unless the cable end is terminated.
3. The END of each chain MUST be terminated, and there must NOT be a termination on any device in the middle of the chain.
4. EACH device must have a UNIQUE ID number! By convention / custom, the controller itself is usually ID 7 (the highest-priority ID, and the usual factory default), the boot disk is ID 0, with other hard disks counting up from there, and tape drives and CDs usually taking the higher remaining numbers (6, 5, etc.), counting down.
5. The devices may appear on the chain in any order, and it is possible that a controller can drive some devices on an internal cable from one connector, and other external devices with another cable from a different connector as long as there is no branching.
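To make the rules above concrete, here's a small sketch (Python, purely illustrative - real SCSI setup is all jumpers and cables, not software; the device names and IDs are made up) that checks a proposed chain for the two most common mistakes, duplicate IDs and bad termination:

```python
# Illustrative sketch only - actual SCSI configuration is done with
# jumpers and terminators. Names and IDs below are made up.

def check_chain(devices):
    """devices: ordered list of (name, scsi_id, terminated) tuples for a
    single non-branching chain, listed from one physical end to the other
    (the controller is just another device on the chain)."""
    problems = []

    # Rule 4: every device must have a UNIQUE ID.
    ids = [dev_id for _, dev_id, _ in devices]
    for dev_id in set(ids):
        if ids.count(dev_id) > 1:
            problems.append(f"ID conflict: more than one device on ID {dev_id}")

    # Rules 2 & 3: terminate BOTH physical ends, nothing in the middle.
    for i, (name, _, terminated) in enumerate(devices):
        at_end = i == 0 or i == len(devices) - 1
        if at_end and not terminated:
            problems.append(f"{name} is at the end of the chain but unterminated")
        if not at_end and terminated:
            problems.append(f"{name} is mid-chain but has termination enabled")

    return problems

# Boot disk terminated at one end, controller at the other, tape drive
# in the middle left unterminated - this chain checks out clean:
chain = [("boot disk", 0, True), ("tape", 6, False), ("controller", 7, True)]
print(check_chain(chain))  # -> []
```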
Termination on a drive may be done by a "SIP" resistor pack, a jumper, or a DIP switch. The ID is usually set by jumpers. There may be several other jumpers or switches that you will need to consult the manual for, but most are likely to be OK with the default settings.
A device chain may also be terminated by a special "resistor plug" on the very end of the cable, in which case the drives should NOT be terminated.
The controller may be configured by jumpers, switches or software. You will need to (again) check the manual for details / instructions.
You may need to select IRQs and address spaces for the controller - these must NOT conflict with other devices in the system. (Note that as long as they are on different addresses and IRQs, you can have as many SCSI controllers in the machine as will fit.)
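The "no conflicts" bookkeeping can be sketched in code too - again purely an illustration, with made-up card names and resource values; the real numbers come from your jumpers and BIOS:

```python
# Illustrative only - card names, IRQs, and I/O bases below are made up.

def find_conflicts(cards):
    """cards: list of (name, irq, io_base) tuples for every card in the box.
    Any two cards sharing an IRQ or an I/O base address is a conflict."""
    conflicts = []
    for i in range(len(cards)):
        for j in range(i + 1, len(cards)):
            n1, irq1, io1 = cards[i]
            n2, irq2, io2 = cards[j]
            if irq1 == irq2:
                conflicts.append(f"{n1} and {n2} both claim IRQ {irq1}")
            if io1 == io2:
                conflicts.append(f"{n1} and {n2} both claim I/O base {io1:#x}")
    return conflicts

# Two SCSI controllers can coexist as long as they don't collide:
cards = [("SCSI #1", 11, 0x330), ("SCSI #2", 10, 0x334), ("NIC", 5, 0x300)]
print(find_conflicts(cards))  # -> []
```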
Once you have the hardware right, power up the PC and go into the BIOS setup routines. In the PC BIOS, make sure that you don't have any non-existent IDE devices showing, and set it to boot off the SCSI controller if you want to boot from SCSI. Note that on many BIOSes, if you have both SCSI and IDE hard drives in the same box, you will be forced to boot off the IDE drive. If you aren't using the IDE controllers, it is best to disable them.
In most machines, there is a separate SCSI BIOS; on Adaptec controllers (perhaps the most common brand) it is most often accessed by hitting "[CTRL][A]" at the SCSI BIOS prompt that will come up (hopefully) after the PC finishes its initial POST routine. Go into that BIOS and check that all the drives are recognized in the ways you expect, at the correct locations, etc. Resolve any hardware problems that show up at this point. A drive that is not recognized here will NEVER be seen later, so this is a critical point - the drive MUST be recognized in HARDWARE or it will NOT be recognized in software!
Most controllers will have a bunch of configuration options; set them appropriately per the manual. Adaptec controllers have pretty good internal help functions, mostly with good advice... Pay special attention to things like OS type, the IRQ and address settings, and any termination options.
Once you have everything set up the way you think it should be, most controllers have a test option; go into that and verify that the controller passes the self test and any tests of other devices that it will perform. (Adaptec controllers will only do full tests with hard drives; other devices will come back with an error to the effect that the device is not a disk - this is normal.) CAUTION - some tests will destroy any data on the disk; it is good to run them if possible, but make sure you have backups first. Also check the "empty" device numbers to make sure the controller doesn't think there is anything at them.
Once all that checks out, exit from the controller BIOS (this will normally reboot the machine) and try to boot. After the initial PC BIOS self test, you should see the SCSI BIOS self test, which should list all the devices that the controller sees (which should be all of them), and then you will boot. Note that the SCSI drives may need to be partitioned, formatted and so forth before they are recognized by the O/S.
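As a sanity check once the OS is up - on Linux, at least, where the kernel exposes what it found in /proc/scsi/scsi - you can compare the attached IDs against what the SCSI BIOS showed. A small sketch that pulls the IDs out of that listing (a typical sample is embedded below, since your hosts, IDs, and model strings will differ):

```python
# Illustrative - the sample below is typical /proc/scsi/scsi output on
# Linux; your host/channel/ID numbers and vendor/model strings will differ.
SAMPLE = """\
Attached devices:
Host: scsi0 Channel: 00 Id: 00 Lun: 00
  Vendor: SEAGATE  Model: ST39173W         Rev: 5764
  Type:   Direct-Access                    ANSI SCSI revision: 02
Host: scsi0 Channel: 00 Id: 06 Lun: 00
  Vendor: ARCHIVE  Model: Python 28388-XXX Rev: 5.AC
  Type:   Sequential-Access                ANSI SCSI revision: 02
"""

def list_ids(proc_text):
    """Return the SCSI IDs the kernel actually attached."""
    ids = []
    for line in proc_text.splitlines():
        if line.startswith("Host:"):
            # Line looks like: Host: scsi0 Channel: 00 Id: 00 Lun: 00
            fields = line.split()
            ids.append(int(fields[fields.index("Id:") + 1]))
    return ids

print(list_ids(SAMPLE))  # -> [0, 6]
```

If an ID the BIOS reported is missing here, the problem is between the driver and the OS; if it was already missing in the SCSI BIOS, go back to the cabling/termination/ID checks above.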
If you are still having problems, remove all the drives, and see if you can make the controller work, then add the drives (w/ the system powered off) back one at a time. Dead drives can cause the chain to hang, but more common problems are either termination errors or ID conflicts.
If you are still having problems, please post more info, especially the Controller make / model, whether it's on-board or an add-in card, the make and model numbers of each drive, and the way that you have them cabled and jumpered.
Gooserider