I sent this email to team-members of the Helpdesk/Inventory Project on Monday, after my first day of the Microsoft 2072A class on SQL Server 2000 Administration, taught by Jim Ferguson at New Horizons in Edina:
We just finished my first day of SQL Server class and I wanted to share some ideas about our Asset Nav server config to come. Please share any differences of opinion on this:
1. Use *hardware* RAID controllers to create
three partitions: WINDOWS, DATA, TEMP;
Make WINDOWS at least 10-20 gig for all
anticipated patches etc., TEMP 5-10 gig
for all logs (only), the rest to DATA -
not just for Asset Nav but possibly for
other future databases, just in case
2. Install Win03 plus SQL Server on WINDOWS
(standalone, standard) and patch it all.
We should plan/discuss authentication
methods and administrative passwords
3. Setup all new dbs (just AN now) on DATA
partition, separate from the executables.
4. Setup the AN executables on C: (defaults)
and tweak IIS and Win security as needed
5. Use SQL Server Enterprise Manager to set
restrictions on how big the database can
grow (maybe the whole D: drive minus Y)
and where log files land (the E: drive)
6. Setup autoshrink to have SQL Server make
its own decisions about how and when to
"compact" things (sort of like defrags).
Microsoft claims that users won't see a
performance hit. We can always disable it
7. Periodically defrag partition WINDOWS but
almost never DATA - SQL Server can do its
own reallocations for database(s) on D:
and sometimes fragmentation isn't such a
terrible thing with something as random-
access as the typical OLTP database.
8. Periodically cleanup/backup the log files
or they may eventually fill that E: drive
I'm sure other thoughts will emerge over the
coming days, e.g. about backups and upgrades,
but I wanted your feedback on this before we
get the new server hardware and do anything
about allocating drive space or installing. I
think these tweaks will help performance over
time and help us avoid problems down the road.
P.S. Good New Horizons class. The instructor,
Jim Ferguson, obviously knows his stuff and
holds about every Microsoft advanced cert you
can get. Gave a tip or two on what the book
says vs. what his real-world experience was.
This is a great little blurb on Kerberos security, from MIT:
Target __ Mac __ PC __ Staff __ Public
Description: Staff XP baseline image for [target hardware]
Start date/Date uploaded/Appr date/Appr by/Creator's initials
Who worked on the image?
What’s in this version?
-- sample data --
Started on [brand/model] system with primary drive partitioned into a [size] C: and a somewhat larger D:, the D: drive formatted as FAT32 so that we can use a Win98-based boot disk to ghost from C: to D:. Onto that C: drive I installed WinXP Pro sp2 (mostly default settings) built from a campus-licensed ISO, downloaded from http://download.software.umn.edu. The product code, listed at that website and which one needs for each deployment of this image, is:
After the OS install I configured it for Biomedical Library staff and/or “Green” PCs as follows:
1. Administrator account (new password) renamed to [local admin acct]
2. Account xpuser (old password) member of Administrators and Users
3. Disabled DCOM (used by some viruses) with Start -> Run -> dcomcnfg
4. Network set to support TCP/IP only, enabled NetBIOS over TCP/IP
5. Enabled automatic Windows updates for every day at 4 pm
6. Applied all Windows Updates except a few optional ones like Journal Viewer
7. Simplified the interface a la Win2K and optimized it for performance
8. Reduced the size of the System Restore area to the minimum available
9. Removed MSN Explorer, Windows Messenger, Outlook Express, other baggage
10. In Internet Explorer, set history to one day, deleted cookies, set home page to Biomed’s, disabled automatic completion of passwords and forms, set to delete files upon closing
11. Installed campus-licensed Symantec Antivirus [version] with defaults, updated defs, tweaked to delete bad files it cannot clean and to skip network drives
12. Killed any ASP.NET or related user account(s) created in the optional .NET install
13. Installed Spybot [version] plus all available updates, immunized the system
14. Downloaded and installed Firefox [version] into its default directory
15. Downloaded and installed Acrobat Reader [version], updated to [version]
16. Downloaded and installed Macromedia Shockwave [version] player
17. Downloaded and installed Macromedia Flash [version] player
18. Changed Firefox preferences to make all cookies session cookies, to use pictures only in toolbar, to block most pop-ups, and to save neither passwords nor forms
19. Installed ActiveState Perl [version], mostly defaults
20. Installed standard plug-ins for Chime 2.6 sp5, CN3D 4, and Isis Draw 2.3
21. Installed Scifinder Scholar [version] plus U of MN file C:\SFSCHLR\ site.prf
22. Installed Beilstein Commander [version] with the latest Crossfire connection software
23. Installed UMCal 9 and tweaked settings per OIT advice
24. Installed WinSCP [version] for remote file access (uninstall where not needed)
25. Installed Office 2003 using CD from http://download.software.umn.edu - included Word, Excel, PowerPoint, and InfoPath, all run locally, but excluded Access, Outlook, Publisher to keep this image reasonably-sized and fairly clean. We can add those apps as needed
26. Updated Office 2003 and added the Remove Hidden Data util (to clean docs over time)
27. Created custom default profile based on [xpuser], to apply to all new users of this PC
28. Changed default IP address to DHCP, but left Advanced settings in place
29. Used Sysprep 2 (see k:\systools\wxptweak\sysprep) with all options except nosidgen
30. Booted from CDROM, ghosted from C: to D: with compression
31. Rebooted, let Sysprep run, set IP address (1-2 minutes to become active)
32. Joined PC to domain, rebooted, tested, copied image from D: to K:
Please share any questions or concerns…
Bootable DVD Image Deployment Instructions:
1. Enable target CMOS to boot CD/DVD disk before hard drive
2. Use bootable DVD to wipe target, create C: and D: partitions
3. Reboot, format D: /u/v:D, then xcopy E:\*.* D: (make image local)
4. Reboot with same bootable DVD, then ghost from D: to new C:
5. Remove DVD, reboot, walk away for 10 minutes as it Syspreps
6. Enter the WinXP campus license product code ** listed above **
7. After Sysprep is done login as xpuser (old password)
8. Set the IP address and DNS and give it a minute to take effect
9. Launch Windows Update to verify the IP works and get patches
10. Update Spybot and Symantec Antivirus definitions as needed
11. Setup printers, departmental apps, other local settings as needed
12. If on a public PC, remove xpuser from the Administrators group
If on a staff PC, ask Brad or Dan to join the PC to the domain
13. Login as xpuser (public) or a domain user (staff) to test
I saw this in the sample pages of a demo server:
Aoccdrnig to a rscheearch at an Elingsh uinervtisy, it deosn't mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht frist and lsat
ltteer is at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae we do not raed ervey lteter by itslef but the wrod as a...
Is it a coincidence that in this, the week Anakin becomes Darth, I wiped my older GX100 at home and installed XP sp2? I was already running XP on my newer eMachines. This makes my basement a de facto Windows-only zone. My wife's iMac remains our email/surfing/iPhoto mainstay upstairs.
My old Dell has been a Linux/FreeBSD testbed for years, running everything from Red Hat to SuSE to Gentoo to Mandriva, but Friday night I finally decided to focus on what I/we actually use to get stuff done.
I am not against Linux. If all I wanted were Apache web services, MySQL or PostgreSQL or Oracle, and/or Samba file services, I'd pick FreeBSD or a major Linux (RH or SuSE) in a heartbeat. These are safe, reliable, cheap, run fast on affordable hardware, and for those server apps are very proven.
They do not, however, hold a candle to Microsoft or Apple on the desktop, particularly in games, nonlinear video editing, and Office productivity tools. Even OpenOffice looks and works better on Windows and OS X than on Linux. Most of my time is spent using or supporting desktops, so it makes more sense to build a deeper understanding of Windows and OS X.
Solaris I have to know because our web servers run it. We also run FreeBSD, but just for Samba and Amanda, and these require little daily support. I need to keep up to date on basic Solaris and FreeBSD skills, and on Apache and related technologies, but most of my time goes elsewhere.
From a technical perspective I much favor OS X. It's cleaner, safer, and more attractive than Windows. Most of the software on my Mac mini just works, and I never have to worry much about viruses and spyware. "Tiger" remains my main work environment, and I support a small OS X Server for local DHCP, Netboot, and possible (in time) FMPro Server services.
Windows by contrast demands babysitting. That's a blessing and curse. Despite many efforts by Microsoft to improve Windows security and reliability to the point that XP sp2 is adequate for most users - iff patches plus anti-virus and anti-spyware software are maintained - there is still enough support required to keep me employed for years to come.
If ever I start a "real" company, I may make it all-Mac. The labor I'd save with a Netboot environment and Xserves at the core, not to mention free client licensing, would far outweigh the up-front hardware costs and limited software availability. No need for a full-time IT staff.
But as long as I have a mortgage I'll probably rely indirectly on Microsoft to pay bills. Apple may be the Maytag of PCs, but who wants to be that repairman? If not for Windows, what would I do all day? I also happen to love games, and whatever Microsoft's failings, they make it easy to build great apps. See an earlier posting for more about games and nonlinear video editing.
For all these reasons I might as well accept the sad fact that I'm a Microsoftie. Resistance was futile; Dark Side was strong; pick your analogy. Job security, my desire and ability to tinker with registries, and games and videos matter more to me than the elegance or security of the platform.
Sorry Linux, but if my experience is any indication you're doomed to remain even more niche than Apple on the desktop. Look at the metric system - better does not necessarily guarantee acceptance. People get entrenched. I may be retired long before Windows (inevitably) loses its dominant position.
When the Bio-Medical Library staff first developed Windows 2000 configurations last year, we enabled two local accounts. w2kuser is the account housing user applications and settings. w2kuser was intended to be used mostly in User or Power User mode, to restrict what can be changed, but must become a local Administrator for many software changes to occur.
One of the more time-consuming aspects of maintaining our PCs involves adding w2kuser to the Administrators group before any significant change, and then removing w2kuser from Administrators afterwards. As student worker Haudy describes, there are about 13 steps that not only take time but also are easy to forget.
If we enable the Windows RunAs service and use a simple batch script Haudy developed, we can cut in half the time and trouble of this rights switch. When I first heard of this fix it sounded good, but I told Haudy that we had to look into the security implications of enabling the RunAs service.
The approach is similar in concept to running the “su” command in UNIX. Based on Haudy’s research and what I have seen, I see no good reason not to enable RunAs. It’s a few mouse-clicks per PC to enable, after which Haudy’s script could run from the K: drive. The script requires the Administrator's password - which is not embedded – and the script is based on the common Windows “net” executable, meaning we’re not exposing anything important by making it executable from a public account. No password, no go.
In short, let’s do this ASAP, and thereafter be more diligent about dropping the w2kuser account on public PCs back to User mode after each such change. This should make maintaining security much easier.
The notes below are largely Haudy's from a couple of years ago. Note that most of the Novell-specific stuff is no longer valid for most of our users since we switched in 2004 to Samba file services.
--- Haudy notes ---
Current method to give w2kuser admin level rights in order to make system and software configuration changes, starting from a logged-in w2kuser account:
2.Log out of w2kuser
3.Login as the local Administrator by changing the username on the Novell login screen and entering the local Administrator's password
4.Go to Users and Groups control panel
5.Add w2kuser to Administrators group
6.Logout of the local Administrator account
7.Login as w2kuser (who now has admin level rights)
8.Make necessary configuration changes
9.Apply the novell.reg registry patch that enables auto-logon
10.Apply the w2kpublic.reg registry patch that enables auto-logon
11.Go to Users and Groups control panel
12.Remove w2kuser from Administrators group
13.Reboot, which will auto-logon as w2kuser and auto re-enable WinSelect
Proposed way to give w2kuser admin level rights in order to make system and software configuration changes, starting from a logged-in w2kuser account:
2.Run adminme.bat and enter the local Administrator account password
3.Log out of w2kuser
4.Log in as w2kuser (no username changes on the Novell login screen)
5.Make necessary configuration changes
7.Reboot, which will auto-logon as w2kuser and auto re-enable WinSelect
Benefits of the proposed method:
Streamlines making changes to the w2kuser account
Much easier and faster to make changes to the w2kuser account
Eliminates problems caused by forgetting to re-apply registry patches after every configuration change
Improves security by making security easier to use
Six fewer steps to go through, and those were time-consuming steps

Requirement: the RunAs service must be enabled in automatic startup mode.
RunAs security issues:
I found no security issues with RunAs in numerous searches of SANS, Google, or Google Groups. The only thing I found is a general security tip to turn off unneeded services. RunAs is needed to make the batch file adminme.bat work. The SANS Securing Windows 2000 Step-by-Step document counts RunAs among the services that “need to be running on production systems”.
--- adminme.bat (grants w2kuser admin rights) ---
runas /user:[local Administrator] "net localgroup Administrators %username% /add"
@echo You must logout and log back in again for Admin rights to take effect.

--- removal commands (drop the rights afterwards) ---
net localgroup Administrators %username% /delete
@echo You must logout and log back in again for removal of Admin
@echo rights to take effect.
In July 2002, I used Access 2000 to merge some BIOM ORDER tables with thirteen columns and over 19,000 records. Access is easier to use and more accurate for this than any Excel formula I tried. Some steps might be automated if there were a need (not so far), but the process is fairly quick once you have done it a few times. It takes me about an hour or two on a fast PC, as in 2 gigahertz with lots of memory.
Here is how I did this:
1.Verify that the data is properly delimited in Excel 2000 in two separate XLS files, that each column has a good heading with no punctuation, and that some heading could become the primary field (e.g. Key) and has an entry for every record in both tables. The common column is later used to define a critical one-to-many relationship between tables.
2.Create a temporary c:\projects directory, and then in Access 2000 create a new database "combined.mdb" in that directory. Close the dialogue asking how to create a table so you can do this manually. Use Windows Explorer to copy the source Excel files there as well.
3.In Access 2000 with Tables highlighted under the Objects section of the "combined:Database" window and three "create table" selections on the right, Insert -> Table -> Import Table, select the first Excel file, check the "First Row Contains Column Headings" box, let Access store the data "In a New Table", go with all defaults for field names, and let Access select/create a primary key (you can change it later).
4.Repeat for the second spreadsheet/table. Then rename the tables if the resulting names do not reflect their purposes. Fortunately these did: BIOM ORDER EXTRACT and BIOM ORDER NOTES EXTRACT.
5.Right-click the "parent" table - that is, the one in which the common column ("Key" here) contains only unique entries and no null entries. In this case BIOM ORDER EXTRACT was that parent table.
6.Select Design View, right-click the little box to the left of the name of the common column (again "Key" here), and make it the primary field. Then highlight the former (Access-generated) primary key and use "Delete Rows" to kill it. Close the table (not Access), saving changes.
7.Click Tools -> Relationships and add both new tables to that view. Resize and move windows so all fields can be seen, some space exists between the two, and the parent table (with unique "keys") is on the left. Primary keys should appear in bold. Click and drag the left "key" to the same-named "key" on the right and release.
8.Verify that "key" is the field selected from both tables. If not, cancel. Check the box to "enforce referential integrity" and click Create. The result should show a line between those fields with a "1" above it on the left or "parent" side, and an infinity sign on the right or "child" side. This indicates a new "one-to-many" relationship between tables.
Note: These relationships are at the heart of any "relational" database. If they are missing or configured badly, expect many problems. In a larger system (like ALEPH) there may be hundreds.
9.Close, save your changes, click "Queries" in the list of Objects, and double-click "Create query by using Wizard". Click the double arrows (>>) to pick every field from the first table. Then under Tables/Queries pick the second table and use the double arrows again. You should see a list of every field from both tables on the right. Then click Next.
10.Leave the default "Detail" view because you'll fix it up in Excel. For now we just want the query to show the raw data. Note that tweaking such queries allows you to filter or combine data in many clever ways. For now we want everything. Click Next, then Finish (default name). The results of the query should appear in an Excel-like window.
11.From within the query results window, click File -> Export, select the target directory (usually the same as the sources), pick Excel 97-2000 from the "Save as type" drop-down list, and give the new file a sensible, short name reflecting the contents of this process. I named the results of this one BIOM_ORDER_Combined_Query.XLS.
12.Close Access 2000, open Excel 2000, and open the new file you made.
13.Create a new first column, move the "Key" contents there, and delete the empty column you left behind plus any (Access-generated) "ID" column.
14.From the menus click Data -> Sort -> and select a few useful criteria. In this case I chose "key" then "internal_note" and then "divnote". The users could do this, but it saves them some trouble. Now save the file.
15.Copy this modified Excel file to wherever the users expect to find the data, in this case O:\biomed\aleph\NOTIS Reports\06262002. This directory name is a reminder of when I started working with the source data. A new directory with new data may be created later this summer.
The SQL statement below, hidden within the wizard-generated query, is a VERY simple example of the power of Structured Query Language. With a few tweaks, we could limit the results by any number of criteria.
Even more impressive to some systems admins is how cleanly the SQL client and server can be divided (client/server): systems using pure SQL over TCP/IP with no direct file access are generally more secure, faster, and more reliable than older low-end systems like Paradox. Properly implemented, a small SQL system can even work by modem.
Access 2000 can work as either a low-end traditional database or as a powerful client in a true client/server environment, with SQL Server or any other supported SQL database at the server end. For this reason alone I think it's a wonderful learning tool compared to other low-end databases.
*** SQL from that query ***
SELECT [BIOM ORDER EXTRACT].title,
    [BIOM ORDER EXTRACT].ordunit,
    [BIOM ORDER EXTRACT].scope,
    [BIOM ORDER EXTRACT].vendcode,
    [BIOM ORDER EXTRACT].action,
    [BIOM ORDER EXTRACT].vendnote,
    [BIOM ORDER EXTRACT].internal_note,
    [BIOM ORDER EXTRACT].divnote,
    [BIOM ORDER NOTES EXTRACT].ID,
    [BIOM ORDER NOTES EXTRACT].key,
    [BIOM ORDER NOTES EXTRACT].statement,
    [BIOM ORDER NOTES EXTRACT].type,
    [BIOM ORDER NOTES EXTRACT].[M Date],
    [BIOM ORDER NOTES EXTRACT].[A Date]
FROM [BIOM ORDER EXTRACT]
INNER JOIN [BIOM ORDER NOTES EXTRACT]
    ON [BIOM ORDER EXTRACT].key = [BIOM ORDER NOTES EXTRACT].key;
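For anyone who wants to experiment with this join outside of Access, here is a minimal sketch of the same parent/child (one-to-many) pattern using Python's built-in sqlite3 module. The table and column names here are pared-down stand-ins, not the real extracts, which have many more fields.

```python
import sqlite3

# Illustration only: a tiny in-memory version of the parent/child join
# described above, with made-up rows. "orders" plays the role of the
# parent table (unique keys); "notes" is the child (many per key).
con = sqlite3.connect(":memory:")
cur = con.cursor()

cur.execute("CREATE TABLE orders (key TEXT PRIMARY KEY, title TEXT)")
cur.execute("""CREATE TABLE notes
               (id INTEGER PRIMARY KEY,
                key TEXT REFERENCES orders(key),
                statement TEXT)""")

cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [("A1", "Journal of X"), ("A2", "Journal of Y")])
cur.executemany("INSERT INTO notes (key, statement) VALUES (?, ?)",
                [("A1", "claimed"), ("A1", "paid"), ("A2", "cancelled")])

# The same INNER JOIN shape as the wizard-generated query above:
# every note appears once, paired with its parent order.
rows = cur.execute("""
    SELECT orders.key, orders.title, notes.statement
    FROM orders INNER JOIN notes ON orders.key = notes.key
    ORDER BY orders.key
""").fetchall()
for row in rows:
    print(row)
```

Order A1 has two notes, so it appears twice in the results, exactly the "one-to-many" behavior the relationship diagram promises.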
This is about the University of Washington's Autoclave, as tested in 2003 on a test PC and on my own machine at the time. It seems to work great, despite being beta. We looked at it based on a tip from another SysAdmin.
Autoclave is built on a stripped-down version of Linux that fits on a floppy and includes everything necessary to completely wipe a hard drive clean. More powerful than Windows format, and free.
Since 2003 we have used Autoclave when retiring old PCs at the Bio-Medical Library. To build this disk (for our Systems staff):
1. Copy directory k:\systools\software\autoclave to your C: drive
2. Put a blank, formatted floppy into your a: drive
3. Start -> Programs -> Accessories -> Command Prompt
4. Within the command prompt (not Windows) "CD \autoclave"
5. Enter the command "rawrite" from within that directory
6. Enter "clave03.img" for the image source file
7. Enter "a:" for the target, then press enter twice
8. In a minute or two the floppy is ready. Type "exit"
Rawrite is a very old and widespread standard tool for creating Linux floppies from a DOS command prompt. After the prompt is closed, remove the floppy and label it "Autoclave" or the like. You can then delete directory c:\autoclave or make more disks.
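Rawrite's job is nothing exotic: it copies the image to the floppy byte for byte, block by block. A minimal sketch of that idea in Python (paths are placeholders; writing to a real a: drive needs raw device access, or dd on Unix):

```python
# Sketch of what rawrite does: a raw, block-by-block copy of a floppy
# image to a target. Real floppy writing would open the drive as a raw
# device; here src and dst are just file paths for illustration.
def raw_copy(src_path, dst_path, block_size=512):
    """Copy src to dst in 512-byte blocks, as rawrite writes sectors."""
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            block = src.read(block_size)
            if not block:
                break
            dst.write(block)
```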
When using Autoclave here, option 2 (with one random pass) is probably adequate for old "public" PCs, but Autoclave also offers several more exhaustive (and much slower) choices.
I am impressed with how a complete (albeit stripped) Linux fits so nicely on a floppy. Other floppy Linux versions are listed at www.linuxlinks.com/Distributions/Floppy. They include routers, terminal servers, firewalls and toolkits. One should be careful with such freeware (likewise with commercial software).
These instructions assume that the target computer has been configured for dial-up testing by Systems. The preparation of a computer for this includes starting with a standard staff Windows image, installing a modem and its driver, connecting it to the analog phone port of one of the new phones, creating shortcuts for LAN and University dialup, and changing a group policy right (via gpedit.msc) so that all users of that computer can enable/disable LAN connections on the fly without being administrator. Also, for security reasons all server services (web, ftp, vnc) should be removed or disabled.
That said, here’s how I prefer that staff handle dial-up testing. While it may be possible to use both the LAN (Local Area Network) and the modem simultaneously, please use only one at a time. This will be better for security, and you will only get a true end-user experience if you disable the LAN. Fortunately, there is no need to unplug or move any cables.
How to disable the LAN and connect the modem:
1. Ensure that the modem (to the right of the PC) is on.
2. Hang up the phone so the line is available for modem.
3. Login to the PC and network as you normally would.
4. Close all applications. Do NOT run email or anything.
5. In the tray at the bottom right of the screen, right-click
the LAN icon and then click "disable" to disconnect.
6. Prove to yourself that the LAN is off by trying to open
Internet Explorer. It should fail to find its home page.
7. Double click the new desktop icon for "dialup" and
enter your Internet/email user ID and password.
8. After the modem connects, restart Internet Explorer
and test whatever Internet stuff you like. Speeds in
three tests were around 41k. This is probably typical.
How to disable the modem and reconnect the LAN:
1. Close all applications. Do not run email or anything.
2. In the tray at the bottom right of the screen, right-click
the modem icon and then click "disconnect".
3. Double-click the new desktop icon labelled "LOCAL
AREA CONNECTION" and watch the corresponding
tray icon reappear below (the one you killed earlier).
4. If/as needed, Start -> Programs -> Novell -> Login
to reconnect to server(s) and get drives H:, O:, etc.
Please only use modems for end-user testing. Your mileage may vary. Follow these steps precisely. We tweaked the PC so that this should work for any user. Let me know if you have trouble with this.
In the course of upgrading or moving Eudora, we sometimes find configurations that do not comply with our current Biomedical Library standards. One problem arises when attachments are reset to reside in the standard directory c:\Eudora\attach, but old messages still refer to another attachment directory. This may be the case if the icon for an attachment has an “X” over it, indicating that Eudora cannot find the attachment.
Fortunately (in this case) the Eudora mailbox files are ASCII text, meaning that we can use various text-manipulation tools to search and replace. This is particularly handy when dealing with many mailboxes - in the case of one user, 659 mailboxes. Opening and closing each mailbox file in Notetab could take hours.
Facing this situation, we looked for a free utility that could do global search and replace functions for all mailbox files (*.mbx) under c:\Eudora in one step. Of many options available, we selected and tested the free tool Handy File Find and Replace from http://silveragesoftware.com/handytools.html.
We had some trouble with it at first because we misunderstood regular expressions, but UNIX sysadmin Dan, who often deals with regular expressions, explained it. To specify “\” in a search without it being interpreted as the first half of an escape sequence (e.g. “\n” = newline) you use “\\”. It looks weird, but in the world of regular expressions it works really well. Substitutions like these are often done in scripts or at the command line, but HandyTools gives the process an easy, graphical interface.
The replacement reduces all of those old attachment paths to c:\Eudora\attach. Of the many options available, we chose to recurse subfolders, and within the Properties tab we chose to modify all files, even those marked read-only.
In the rather huge c:\Eudora directory structure I tested, the process only took a few minutes. After the change, mailboxes are fixed but most tables of contents must be rebuilt. We do not yet know how to make this happen globally, but Eudora makes rebuilding individual TOC files easy.
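To make the escaping concrete, here is a small Python sketch of the same kind of substitution. The old attachment path and the sample mailbox line are made up; only c:\Eudora\attach comes from the notes above. Note how each backslash in the search pattern is written as \\, just as in the HandyTools search field.

```python
import re

# Hypothetical old path, for illustration only. In the regex pattern,
# every literal "\" in the search text is doubled to "\\" so it is not
# read as the start of an escape sequence like "\n".
pattern = r"d:\\oldmail\\attach"

# A made-up line in the style of a Eudora mailbox entry.
line = r"Attachment Converted: d:\oldmail\attach\report.doc"

# Backslashes are special in the replacement string too, so passing a
# function sidesteps a second round of escaping.
fixed = re.sub(pattern, lambda m: r"c:\Eudora\attach", line)
```

After the substitution, `fixed` points the attachment at c:\Eudora\attach while leaving the rest of the line alone.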
If you see messages about corrupt or damaged mailboxes, just select the highlighted button (Create New or Please Do), let Eudora fix things, then click OK at the results window. Nothing is guaranteed, but I have yet to see this fail. The user may have to do this for each mailbox opened, but it takes very little time. Many or all fixed messages might then be marked as read, but otherwise everything should be fine.
When the attachment references and tables of contents are fixed to the user’s satisfaction, they should back up all of Eudora to the network. Here at the Biomedical Library we run a little batch file to copy all of c:\Eudora to h:\Eudora_backup. This can take anywhere from a minute to hours, depending on how much data is stored. This is the ONLY backup method we currently support for Eudora. Run it at least weekly.
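For the curious, the effect of that batch file amounts to a recursive directory copy. A sketch in Python (the real batch file is not shown here, and the source and target paths are left as parameters rather than hard-coded drive letters):

```python
import shutil

# Sketch of what the weekly backup batch file does: mirror the whole
# Eudora directory tree (mailboxes, attachments, settings) to the
# backup location, e.g. C:\Eudora to H:\Eudora_backup.
def backup_eudora(src, dst):
    """Copy the Eudora tree to the backup location, overwriting files."""
    shutil.copytree(src, dst, dirs_exist_ok=True)
```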
If you have any concerns, questions, or corrections, please let us know through x45937 or let me know personally. Otherwise I will assume that it works as expected, as it seemed to in our tests.
Biomed Library Access Services now runs a Ricoh copier/scanner/printer. This document explains driver setup for the Ricoh 1232c for Windows 2000 on Bio-Medical Library staff PCs. Other uses of the Ricoh system may be described elsewhere.
To install the 1232c printer driver, first verify that the target PC has a standard staff Win2K configuration and that TCP/IP works. Configuration of the printer driver is a two-step process: installation and settings. To install the driver:
1.Login normally, then close all running applications.
2.Start -> Settings -> Printers, make sure you don’t already have the driver, then double-click Add Printer.
3.Click Next, uncheck the “Automatically detect” box, and select Local Printer.
4.Check for a Ricoh_Color port. If no such port exists, select Create a New Port of type Standard TCP/IP.
5.Click Next, enter the printer’s IP address as 18.104.22.168, and name the port Ricoh_Color.
6.Click Finish, then take a break as Windows finds what it needs.
7.At hardware selection, click Have Disk, replace A:\ with K:\, then click Browse.
8.Double-click down to Systools\Drivers\WXP\RicohColor\OEMSETUP.INF
9.Click OK, pick the RICOH Aficio 1232c (PCL 5), and click Next until…
10.At Name Your Printer, enter “Ricoh_Color” and make it NOT the default.
11.Select “Do not share this printer” and no test page, then click Finish.
12.When done with the install, return to the Start -> Settings -> Printers folder.
Now that the driver is installed, it should be changed to reflect attached accessories and to use the Document Server feature. Document Server allows the printer to hold a job until you go to pick it up, as described in the last section below.
13.From Start -> Settings -> Printers, right-click Ricoh_Color and select Properties.
14.Under the Accessories tab, click the boxes for Bypass Tray, Tray 3 (LCT), Finisher 1000, and Duplex Unit, and set the RAM to 384 MB. Click Apply.
15.Under the General tab, click the Printing Preferences button.
16.Change the Job Type to Document Server, but leave other preferences at defaults.
17.Under the Document Server selection, click Details.
18.Enter the same unique name (ideally your x.500 name) for the User ID and User Name. This identifies your jobs at the printer when you are ready to pick up. Please do not enter a password. You can use passwords on particularly sensitive documents, but protecting everything will complicate troubleshooting by Access Services and Systems.
19.Click OK and OK to close the dialogue and return to the Printers folder.
To retrieve your Document Server print jobs:
20.At the copier/printer/scanner, click the big button on the left labeled Document Server.
21.Use the LCD touch-screen to find the job. If you password-protected it, enter the code.
22.After your job prints to your satisfaction, please delete it on the LCD to make room for others. The printer has a lot of memory, but that memory will fill very quickly if people don’t clear their work.
Once in printer memory a print job looks and acts like a copy job. You can print multiple copies, double-side, collate, staple, whatever the copier does. See www.ricoh-usa.com/productshowroom/digitalimagingsystems/af1224/aficio1224.pdf for more info. If you have trouble using this Windows driver, please call x45937. If the printer seems to have physical problems like low toner, lost jobs, or a paper jam, please inform Access Services (Emily).
The DataEase application used by some Tech Services staff is actually delivered via a Novell Application Launcher (NAL) process from the Wilson server. Check with SysAdmins regarding current Novell login parameters.
Once logged in, the icon for Dataease should appear in the NAL window. If it does not, even after a good connection to the Wilson server(s), then a systems admin at Wilson may need to adjust some rights or settings to enable it.
At the local Win2K PC, the only requirement beyond Win2K and this link to Wilson is a local environment variable "dename" set to match the username being used at that PC for Dataease. Set this via Start -> Settings -> Control Panel -> System, click Advanced, then Environment Variables, and enter the settings for user w2kuser.
Once variable “dename” is set to match the local Dataease (NOT Novell) username, click OK until the box is closed, then shut down the Win2K PC properly and restart.
I set dename for the Bindery Prep users as a System variable because I had the rights and was learning, but as Power User you cannot do this and frankly don’t need to. Just make it specific to w2kuser, and let me know if you have any other concerns or questions.
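For the curious, the lookup a program performs against such a per-user variable behaves like this minimal Python sketch (function name and default value are hypothetical; Dataease itself is of course not Python):

```python
import os

def dataease_username(default="unknown"):
    """Resolve the Dataease username the way a program might: prefer the
    per-user "dename" environment variable, else fall back to a default.
    ("dename" is the variable described above; the rest is illustrative.)"""
    return os.environ.get("dename", default)

# Simulate the per-user Win2K setting described above.
os.environ["dename"] = "w2kuser"
print(dataease_username())  # w2kuser
```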
Notepad, built into Windows, has several limitations that make it unsuitable for use with Aleph for printing call numbers on spine labels. The primary limitations are Notepad's inability to remember Font, Page Setup and Margin settings between Notepad sessions. NoteTab has the ability to remember these settings between sessions. While it is possible to modify Aleph's configuration settings via its INI files, I did not do that because the INI files are regularly updated from a central Aleph server, and any changes I made to local INI's would quickly be overwritten. This procedure does not modify or interact with the Aleph INI files.
Replacing Notepad with NoteTab on Windows 2000 is more challenging than on Win9x because of Windows File Protection (WFP). To successfully replace Notepad with NoteTab while leaving WFP enabled, you must use this or a similar procedure. Totally disabling WFP per Microsoft's instructions is not desirable as you lose the benefits of increased system reliability and self-healing properties that WFP provides, gaining only the ability to more easily replace the Notepad executable (notepad.exe). This procedure does not disable WFP.
I’ve tested my procedures (with appropriate modifications) on the following systems: Windows 95, Windows NT 4.0, and Windows 2000 using the Epson FX-880 drivers. I believe Windows 98 will behave like Windows 95, and I use the term ‘Win9x’ to cover both. Windows NT 4.0 had label printing issues preventing the use of labels spaced closer than ~3.65 inches, so it was not tested as much as Win95 and Win2k. Other Microsoft operating systems like Windows XP, Whistler, or beyond will probably behave more like what I’ve outlined for Windows 2000 than Windows 9x. Lastly, when following the procedures please take note that some steps apply only to certain operating systems.
The installation must be done in the order listed; otherwise WFP will undo the changes you make, as you make them, and prevent you from successfully completing the installation. Be aware that WFP will prompt you during the procedure about replaced and/or missing files. All you need to do is hit 'Cancel' on the prompts for the Windows CD, and ‘Yes’ to “Are you sure you want to ignore changes.” Once you’ve completed the steps below, NoteTab will have completely taken over Notepad’s functions. WFP will not try to replace it again except as explained in some special cases in the Caveats section of this document.
1.)Windows 2000 only: Make sure you can view hidden files, and all file extensions by taking a peek at Explorer’s Tools | Folder Options | View menu. Check “Show hidden files” and uncheck the two “Hide…” options. Depending on how you configure your systems you may want to re-enable these settings at the end of the procedure. Personally I consider the hiding of file extensions to be a very bad thing as it can help viruses masquerade as other file types, and makes changing a file type difficult should a file be saved with the wrong extension.
2.)Windows 2000 only: In C:\WINDOWS\system32\dllcache, rename Notepad.exe to Notepad.old. This step disables the first place WFP looks for its files. (Whether you rename or delete it is your choice.)
3.)Windows 2000 only: In C:\WINDOWS, rename Notepad.exe to Notepad.old. This step disables the second place WFP looks for its files. At this point there should only be a Notepad.exe file in C:\WINDOWS\system32, and this will be replaced next.
4.)Install NoteTab (I used NoteTab Light v4.86c from www.notetab.com) using the default settings except deselect the options for Desktop and QuickLaunch Tray shortcuts.
5.)Run NoteTab, deselect “Show tips on Startup”, and use the Help | "Replace MS Notepad" feature. On operating systems without WFP (Win9x, NT4) there should be no messages. On Windows 2000 only, NoteTab will warn you about WFP, just dismiss the dialog as we already have tended to WFP. WFP may prompt you a few times during these next few steps, just hit ‘Cancel’ to the CD prompt, then ‘Yes’ to the “Sure?” prompt.
6.)Windows 2000 only: Now take a look in C:\WINDOWS\system32 using Windows Explorer. You should see a 19kb file named NOTEPAD.EXE, a 1kb file named NOTEPAD.INI, and a 50kb file named NOTEPAD.EXE.MS that is the original Microsoft version.
7.)Windows 2000 only: In order to make this work on Win2k with Aleph (and possibly other programs that specify %SystemRoot%\NOTEPAD.EXE) you need to copy NOTEPAD.EXE (19kb) and NOTEPAD.INI (1 kb) from C:\WINDOWS\system32 into C:\WINDOWS. This is necessary because Aleph specifically looks for Notepad.exe in C:\WINDOWS. At this point attempting to print in Aleph should load NoteTab.
8.)Run NoteTab again to make a few minor configuration changes:
a.)Look in View | Options | General tab for the setting "Reload open documents" and uncheck it (turn it off).
b.)Look in View | Printing Options and configure the Margins and Fonts. I set the margins as follows with success on our printers and labels. Tweaking may be needed for your system configuration. Make sure to click the "Save" button when you have chosen the settings you want.
Top=0” - if we come up with a better alignment baseline than the top of a given label aligned with the lowest edge of the plastic guide on the printer head assembly, this might change.
Left=1.25” - adjust this and/or paper's physical positioning as needed
Right=1” - setting to 0" doesn't work, value isn't critical
c.)Under Lumina/NOTIS, labels were printed using the printer's built-in font. The same font is selectable in NoteTab under the name “Roman 10CPI”.
Be aware that because this is a printer font (indicated in Windows by a small printer icon next to the font name), it may look different on different printer models using a different print driver. Printer fonts depend on the print driver that is in use. This is in contrast to using Courier New 12pt, a TrueType Font (TTF) that is constant across all printers. I configured NoteTab to use Roman 10CPI (CPI=Characters Per Inch) as the default, so the labels printed will have the same font and look as previously seen with NOTIS-produced labels.
Some good font alternatives besides Roman 10CPI are Roman 12CPI when more, although slightly smaller, text is needed on a label, or Courier New 12pt when you want the benefits of using a TrueType Font. Courier New 12pt looked very similar to Roman 10CPI in my testing. With the Roman 12CPI font default, my testing let me fit up to 10 lines onto a correctly aligned label, corresponding nicely to the 10 available label info lines in Aleph.
9.)Next configure the default paper size and print quality for your label printer. These settings will affect all applications using that copy of the printer driver. The settings I used for the Epson FX-880 printers were:
a.)Windows 9x only: Inside of Start | Settings | Printers | Epson | Properties | Paper: Choose "Custom" paper size. In early testing, I successfully used width=600 and length=294 (units are in .01"). The width is not critical, although the length should be the label-to-label distance on the form fed sheets. I measured our labels to be about 75mm metric (2.9528 in) or 2 15/16 inches (2.9375 in). Click 'OK' to save these as the defaults for this printer. If the printer is used by other applications, please keep in mind we just changed the Windows defaults. Going thru these same steps from within an application's Printer Settings option would not save the default settings we need to use with NoteTab (and Aleph labels), and would force you to switch from "Letter" size paper to "Custom" each time NoteTab opened up. These settings will apply to all applications that use this copy of the printer driver.
Windows NT 4.0 only: Inside of Start | Settings | Printers click on the File | Server Properties menu item. You need to create a new form, which is effectively a custom paper size, for our labels. Click “Create a New Form”, and give it a description; I used “Aleph labels”. In the Measurements section, check that the units are set to “English”, and set width to 6 inches and height to 3.65 inches. Thorough testing on WinNT 4.0 showed me that values lower than 3.63, or 3.61 in some cases, lead to this new custom paper size not showing up on the choice of paper sizes on Page Setup dialogs. With larger values, our custom paper size did show up. This problem was only seen on WinNT 4.0, not Win95, or Win2k. The effect is that using closely spaced labels is problematic on WinNT 4.0 due to operating system driver limitations. These limitations were not observed on Windows 95 or Windows 2000. The “Printer Area Margins” can all remain at 0”; we will set the margin settings inside of NoteTab. Now go into the Epson printer driver and select “Aleph labels” as the default paper size for all applications that use this copy of the printer driver.
Windows 2000 only: Inside of Start | Settings | Printers click on the File | Server Properties menu item. You need to create a new form, which is effectively a custom paper size, for our labels. Click “Create a New Form”, and give it a description; I used “Aleph labels”. In the Measurements section, check that the units are set to “English”, and set width to 6 inches and height to 3 inches. The “Printer Area Margins” can all remain at 0”; we will set the margin settings inside of NoteTab. Click Save and then OK.
Next you need to right-click on the Epson printer driver and select Properties. Click Advanced | Printing Defaults | Advanced | Paper/Quality and select “Aleph labels” for the paper type and 240x144 for the print quality. Click OK. Now click on the Device Settings tab and choose “Aleph labels” for the paper type here too. There are two places in the Epson driver you must select the “Aleph labels” paper type. These steps will select “Aleph labels” as the default paper size for all applications that use this copy of the printer driver.
All operating systems: in later testing I changed the "Custom" paper size from 600 width x 294 height, to 600 width x 300 height. This change of .06 of an inch seemed to help with paper alignment. I am not sure if this change really made a difference...it may simply be too small to be noticeable thru the paper slippage and drift involved in the printer's paper handling. I still believe the value of "294" to be 'ideal' although minor tweaking of it by either increasing or decreasing it may help with vertical alignment issues.
b.)Inside of Start | Settings | Printers | Epson | Properties | Graphics (or Advanced):
Default changed from 120x144 to 240x144 resolution. Not sure why the previous lower resolution was selected by default. This should not negatively affect output, and may improve printing results in some cases.
10.)This completes the initial setup. Once you've done this, the Aleph | Items | Label printing function should work fine, although minor tweaks may be needed to the margins and/or paper size settings. I tested NOTIS printing on Win95 successfully with NoteTab installed and configured as described here.
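As a cross-check on the paper-length numbers in step 9a, the driver values are just the measured label pitch expressed in the driver's 0.01-inch units; a quick Python sketch:

```python
# Convert label-to-label distance into the Epson driver's 0.01-inch units.
MM_PER_INCH = 25.4

def to_driver_units(inches):
    """Round a length in inches to hundredths of an inch."""
    return round(inches * 100)

print(to_driver_units(75 / MM_PER_INCH))  # 75 mm measurement -> 295
print(to_driver_units(2.9375))            # 2 15/16 in measurement -> 294
```

The two hand measurements land at 294-295, which is why values near 294 (or a nudged 300) work for the "Custom" length.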
Caveats (other than these, NoteTab works fine on Win2k in lieu of Notepad):
1.)Windows 2000 only: If in the future you use NoteTab's Help | "Restore MS Notepad" feature, the NOTEPAD.EXE.MS file will be renamed to NOTEPAD.EXE. Then if you run the restored (original) Notepad.exe, WFP will detect the original Notepad is back in the system, and you'll need to go thru the procedure all over again from the beginning.
2.)Windows 2000 only: Beware of using the System File Checker (SFC) because it will detect that you've replaced Notepad and ask for the Windows CD. Once you give it the CD, SFC will undo the replacement you worked so hard upon.
3.)Paper alignment/slippage issues: adjusting the paper in the printer by manually pushing/pulling it around may confuse the printer and cause it to be out of alignment for the first printed line. It is better to use only the turn knob on the side of the printer to move the paper forward and backward. The best option may be to use the printer's "Micro Adjust forward and backward" buttons. In any of these cases, the first line of the first label printed after aligning the paper in the printer may still be corrupted/misprinted; i.e. no guarantees on the first label printed out once you've moved the paper manually. (Labels printed subsequently, without moving the paper manually, should be okay.)
Why I think this happens: the margin is set to zero (0") because we want to print on the very first line of a label, and because aligning the label in the printer in other ways is hard. Because the label is aligned manually, the printer does not know exactly where the paper is until it line feeds, and that doesn't happen on the first line. So the result is you may see the first line printed incorrectly...too close to the 2nd line, or scrunched up, or other problems.
Workarounds: either make the first line blank on the label so the printer must form feed, or consider the first label printed potentially 'lost' if you have just manually adjusted the paper. Try printing again to the 2nd label.
4.)The exact settings and measurements for the margin options described above, and custom paper/label settings could vary from printer to printer depending on how the printers are already configured and how the forms are already aligned. Adjust them as necessary.
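If you preferred to automate the blank-first-line workaround from caveat 3, a hypothetical helper (not part of the Aleph/NoteTab setup) could pad label text before it goes to the printer:

```python
def pad_first_line(label_text):
    """Prepend one blank line so the printer performs a line feed before
    printing any real text (the first-line workaround in caveat 3).
    Illustrative only; Aleph/NoteTab do not do this themselves."""
    if label_text.startswith("\n"):
        return label_text  # already padded
    return "\n" + label_text

print(repr(pad_first_line("QA76.73 .P98")))  # '\nQA76.73 .P98'
```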
In regards to NoteTab Light (free) vs NoteTab Standard and NoteTab Pro:
Using the evaluation versions of NoteTab Standard and NoteTab Pro makes no difference to the printing of Aleph call number labels. The main differences from the free version are availability of technical support and the inclusion of a spell checker and thesaurus. A few other advanced features intended for programmers and web page designers are also included. They have a web page that compares the 3 versions side by side; a copy of that has been printed and put in the NoteTab green license folder at Bio-Medical Library. My impression is that it would be unnecessary to purchase these more advanced versions for support reasons, as there is a free support forum for all NoteTab users to help each other. The advanced features available would likely go unused in our situation, as these capabilities are present in Microsoft Word, and are not in use even there. The only good reason I can see to purchase some copies is to provide support and incentive to the programmers who designed this useful product and encourage them to continue developing it. It'd be a shame to see a good product die because people only use the free version, hence paying for some copies we use may be within the spirit of their offering a free version.
Copies of all these web pages were stored in the same folder as the document you are reading, on the Biomed servers.
Here is a document about the printer’s controls:
Here are two headers from a groups.google.com search for “Windows 2000 & Tractor Printers”. These are relevant threads that discuss printer and label issues other people have had.
From: Eric Blumer (email@example.com)
Subject: Re: Windows 2000 & Tractor Printers
Date: 2001-03-20 16:20:07 PST
From: Chet Swanson (cswansonNOTME@kendra.com)
Subject: Tractor Feed Labels Question
Date: 2002-05-13 09:21:12 PST
For testing purposes I have found it convenient to install the free Apache 2 web server software atop Windows 2000 and XP. Apache 2 normally runs as a service atop Windows, and almost any tech knows that adding services to Windows can become a security risk. This little summary attempts to address that concern.
First, make sure the target Windows 2000 (or later) system starts out clean of other services. That is, disable anything else that might otherwise pose a risk, including IIS, FTP, Telnet, etc. Starting clean can simplify troubleshooting in the unlikely event that these techniques do not help protect your system.
Once your system is clean, log in as a local Windows administrator and run the latest apache2[xxx]nossl.msi, from our K: drive or web download, with defaults. This will install Apache onto your Windows PC with default settings. If the PC’s name matches what the University thinks it should be in DNS, you will be able to reach your server by name. If you have no valid DNS name or don’t know it, don’t worry. You can still use the IP address of the PC.
Near the end of the Apache install you may have the opportunity to edit the Apache configuration file:
C:\Program Files\Apache Group\Apache2\conf\httpd.conf
Take it. If you miss the opportunity, open the file with Notepad. In this file you should be able to seek and find the line shown below. Beneath that are entries controlling who can connect to your new Apache server. Change those lines as needed. For example:
# Controls who can get stuff from this server.
# UPDATED on [date] by [your name]
deny from all
allow from 22.214.171.124/25
allow from 126.96.36.199/25
allow from 188.8.131.52/26
then save the file. Now, to activate the changes, stop and restart the Apache service. You can either do this through Start -> Settings -> Control Panel -> Administrative Tools -> Services or by right-clicking the Apache icon by your desktop clock, opening the Apache Service Monitor, clicking Stop, waiting a minute, then clicking Start. Either way, the modified httpd.conf file should now be active.
The IP entries in this Biomed-specific example indicate three partial subnets: 236.128-236.254, 237.128-237.254, and 141.128-141.191. I excluded the Tech Services partial subnet so I could use one such PC to verify that outsiders are blocked. To allow Tech Services in as well, add “allow from 184.108.40.206/26” as a fourth exception to the deny all, save the change, and restart the service again. You can narrow the list of allowed IP addresses to a single subnet, or even a few specific addresses by omitting the part after the slash.
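The CIDR masks in those allow lines map directly onto the ranges described: a /25 covers a 128-address half-subnet and a /26 covers 64 addresses. A sketch with Python's ipaddress module, using hypothetical 10.0.x.x stand-ins for the real campus prefix:

```python
import ipaddress

# Hypothetical stand-ins for the three campus partial subnets.
allowed = [
    ipaddress.ip_network("10.0.236.128/25"),  # covers 10.0.236.128-255
    ipaddress.ip_network("10.0.237.128/25"),  # covers 10.0.237.128-255
    ipaddress.ip_network("10.0.141.128/26"),  # covers 10.0.141.128-191
]

def is_allowed(addr):
    """Mimic Apache's allow-from matching against the list above."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in allowed)

print(is_allowed("10.0.236.200"))  # True  - inside the first /25
print(is_allowed("10.0.141.200"))  # False - past the /26 boundary
```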
This is a very simple and limited approach to security, not a proper firewall. It only protects Apache, ignoring any other services you might have installed, so be careful about what extras you add. That said, this can be a great way to experiment with technologies like HTML, XHTML, PHP, Perl, access controls, and a hundred other Apache-specific technologies. The inner workings of the Apache server are nearly identical on Windows and UNIX, so understanding Apache on Windows can give an idea of how some of our production web services work.
Note: All research on this was done by Haudy K. a couple years ago -
Symptoms: Some 16-bit internal and external PC Card devices (PCMCIA) with Windows 2000 and Service Pack 3 (SP3) may stop working correctly. Upon creation of the W2K, SP3 image for the Dell Latitude laptops, the internal wireless card did not configure properly.
A. Uninstall existing TrueMobile 1150 Client Manager software
B. Modify Registry Key as follows:
2. Choose Edit – New – DWORD Value
3. Change the value name: DisableIsaToPciRouting
4. Value data should be: 0
C. Reinstall Dell Driver from factory
D. Follow wireless set up and configuration as per OIT specifications:
E. Restart computer
To read more about this fix, consult Microsoft Knowledge Base Article - 327947
Copied from Library staff Monday Memo on 5/15/2005 ---
A staff member recently asked, "How come I never receive any emails from our book vendor?" We responded back, "It's possible that the emails are being blocked." We then asked the staff member if they knew how to check whether a message has been blocked. And so begins today's tech tip, "How to check your incoming email control settings."
First, go to your "Internet Account Options" page on the web. Go to: http://www.umn.edu/dirtools .
Enter your University of Minnesota Internet ID (X.500 username) and password and click on the Login button. You should see a listing of your Internet Account Options. In the "Manage Your E-mail" grouping, click on the link for "Incoming E-mail Controls." On this page we see choices for how we want the University's email servers to behave when they receive your email.
Most staff will have "Allow email from well-behaved servers" chosen, which means campus email servers will block a message if they are not happy with the server that sent the message. While we do recommend this choice for libraries staff, there's a chance that something you thought should come through will get blocked. If you suspect this is happening, there is a way to help those messages get through.
First, go back to the "Internet Account Options" page. Again in the "Manage Your E-mail" grouping, this time click on the link for "Show Blocked Incoming Email." Here you will see your list of emails that were not delivered to you. Notice the last column of this list: "Make an Exception." If you find an email address that you do not want blocked (like mail from that vendor), click on the Make an Exception link for that address and it will be added to your list of email addresses that are allowed to bypass the University's site restrictions. Note, however, that unblocking the email address doesn't deliver the original blocked message; you will still have to ask the sender to resend any messages you missed.
For the technically inclined (and the curious), the University also provides a glossary of reasons that messages get blocked: https://www.umn.edu/dirtools/etc/blockreasons.html .
Questions or comments? Feel free to contact ITS at 4-9094 or firstname.lastname@example.org or your IT staff in Bio-Med, MINITEX, or the Law Library.
-- Mike Sutliff and the friendly folks at ITS
I'm so glad we're talking about doing FMPro7 Server
Advanced with a new G5 tower. What else is there?
MS Access (IIS security?) and one immature option
coming in OpenOffice that I looked at recently. Most
users can't do their own Perl/MySQL nor ASP.NET,
much less the Oracle stuff Jo once wanted for BIS.
Think of what we have saved in IT labor by enabling
power users to build and support their own solutions.
That's not even considering how entrenched we are
with FMPro - that is, the hundreds of hours a major
migration to any other environment would require.
Report formatting alone would take a huge effort.
We will not regret this upgrade. About pricing, see
Separate from Advanced Server (get that regardless),
it looks as if buying 50 FMPro 7 desktop licenses at
once gets us at least 40% off retail. Upgrades run
about 40% off too, so upgrading from FMPro 6 would
probably save us nothing vs. a site license.
Those 50 seats will cost at least $5k for all, as Steve
said, probably more, in addition to Advanced Server,
but we could probably then forego further upgrades
for at least a couple years. Version 7 is a major step
up. If too expensive, we could probably delay most
of the client upgrades, but at least upgrade all those
who will build or test FMPro databases for the web.
That includes a few of us in IT plus Emily and Dave.
A big thing for me is not having to worry about which
staff get which license. Under the 10-pack deal if we
exceeded the limits of any 10-pack the integrated
license checkers could complain. Jo was adamant
about avoiding that embarrassment after it happened
in some meeting (before my time, but remembered).
With a site license I can embed FMPro in images and
save maybe 10 minutes per desktop deployment not
having to install particular licenses afterwards. That's
like getting a free day over the course of any all-staff
PC rollout, more time over the course of a few years.
A few individual upgrades (a ten-pack?) won't break
the bank, but ultimately I'd get that site license. If we
could swing the site license concurrent with the new
PCs we could take advantage of embedding FMPro
in [the next Windows and desktop PC] rollout...
From February 2004 --
Some of you may know that I'm now more
interested in C# than in Java for the same
reasons I was looking at Java: marketability
of skills and ubiquity of available knowledge.
C# also happens to be a great language, or
so I have heard, faster and more flexible on
a Windows PC than a JRE. Not that I know.
The devil is of course in C# being Windows-
centric, but so are 90% of desktops, and if
one really wants to run C# code atop OS X
or Linux it can be done with the free Mono
framework (a bit immature; wait for ver. 2).
If not Mono, VPC7 is a less-than-ideal but
workable fallback for running .NET apps.
C#, .NET, and SQL Server go together like
beer and salt, but SQL Server isn't the only
database option. I think that maybe I'll try
building .NET Windows front-ends to very
simple PostgreSQL databases hosted on
UNIX. A .NET Data Provider does for .NET
what JDBC does for Java, and Npgsql is a
data provider for PostgreSQL servers. See
If the .NET client does all the connectivity
work using data providers like Npgsql for
abstracting access, then client and server
can remain independently tweakable. That
is, the same data can be also accessed via
PHP or Ruby, and clients can also access
other back-ends like Oracle or SQL Server.
If you are also interested in C# or just want
to heckle, feel free to ask/comment. I deny
being a stooge for M$oft, but resistance is
futile and your species will be assimilated.
--------- example of C# with Npgsql ---------
// in C# (sketch; the connection details are placeholders)
public class Test {
    public static void Main(String[] args) {
        String connstring = "Server=192.168.0.1;User Id=postgres;Password=secret";
        NpgsqlConnection conn = new NpgsqlConnection(connstring);
        conn.Open();  // also add "using Npgsql;" at the top of the file
        conn.Close();
    }
}
From December 2004 --
Microsoft's .NET Framework attempts to simplify
and standardize many software components that
previously sucked up countless coding hours
and led to inconsistent and incompatible code.
.NET includes support for several dev languages.
Our current XPBASE04 image includes .NET 1.1,
and the RSS reader I most favor depends on it.
Included "assemblies" are chunks of free, prefab
.NET code you can use as is - or write your own -
all part of a huge object-oriented code hierarchy.
Many see the .NET Framework as Microsoft's
answer to Java - a controlled, standard runtime.
Microsoft published the core of .NET (C# and the
CLI) as ECMA standards, and Ximian (now owned
by Novell) is using them to build a compatible framework
for Linux and OS X. The experiment is maturing...
I have another correction for my email: I checked
and saw that features I noted earlier are planned
for MONO 1.2 (not 2.0), including limited VB.NET
support. Q2/2005. MONO 1.2 also aims to include
ADO.NET 2.0 and ASP.NET 2.0 features, etc.
See that website for a longer list of the features
planned for MONO 1.2 and beyond, including
the System.Windows.Forms 1.1 assembly. Also
see notes on MonoDevelop, a MONO/C# IDE
for Linux and Mac OS X. To develop for C#
and .NET in Emacs, check http://davh.dk/script
I've begun to play with ASP.NET and Visual Basic.NET
a bit. Aside from their currently requiring Windows and
IIS, I'm pretty impressed so far. Not that I've done much
of anything yet, but the tools seem to make some things
easy. I have a Dummies book and one from MS Press.
I intend to keep playing with these as well as Ruby.
I guess it depends on whether one considers Windows
Server '03 a safe and worthy platform for web services
and/or how successfully Novell ports the .NET stuff to
Linux in subsidiary Ximian's MONO effort (March '05).
If that goes well, .NET will no longer be Windows-only.
MONO 1.0 already runs on Linux and OS X, bringing
the C# language. 2.0 is supposed to include VB.NET
and some binary compatibility with Windows (limits?).
No idea how or whether Novell will get around ASP's
requirement of IIS (in addition to the .NET framework).
If ASP.NET doesn't fly on Linux, it may still be possible
to use VB.NET with Apache 2 via the .NET framework.
Of course I still think Ruby or PHP + Apache + MySQL
or PostgreSQL is probably "tighter" as in fewer obvious
risks and lower demands of the server - not to mention
free and open-source. The only downside to the open-
source approach, in my rather uneducated opinion, is
that it may take more time to do anything interesting
with databases. Then again, Ruby on Rails may make
Ruby-based web projects almost as easy as VB.NET.
Some factors against using ASP.NET at Biomed are
that we already have the servers (Solaris) and skills
(Dan) to run open-source apps. ASP.NET would take
new skills, discipline to secure and support a Windows
2003 Server, and cash for hardware and OS license.
I dunno if the benefits of ASP.NET could ever justify
those costs, so for now I consider ASP.NET just a toy.
Part of discussion with a superior in Feb, 2004 ---
If you need to enforce very granular security,
I agree that ACLs can be wonderful. The thing
is, I have yet to see a situation at Biomed
that really requires them. Few if any of the
staff here, or elsewhere I suspect, want to
get very fine-grained about who can do what.
One of the definite downsides of ACLs is the
dearth of good tools for managing them in UNIX.
NetWare has Console One, Windows has whatever,
but on FreeBSD Dan had to write a perl script
just to enable me to tweak ACLs for a big set
of directories and files without mucking up
the timestamps of the files in the process.
ACLs are used to grant two or more existing
groups different rights to one directory,
but this can also be accomplished by making
a new group for each new special-purpose dir
(and all its children except as over-ridden).
Maintenance of a dozen or two special-purpose
groups strikes me as easier than maintaining
ACLs for every directory, and yet does about
the same thing from a user perspective. Samba
can automatically handle file and directory
inheritance of ACLs and standard UNIX rights.
One constraint worth noting is that there are
limits to how many users can be in a group and
to how many groups a user can be assigned to.
In the case of FreeBSD 5.2.1, a default is 16
groups per user and something around a hundred
users in a group - it's really the line length
of a user list = 1024. Tweaking this could be
tricky (Thanks, Dan, for all that detail).
Since we have way less than a hundred staff
and no staff member should need more than 16
groups, assuming we don't go crazy with rights,
then those limits should not be a problem here
at Biomed. They may be in a larger population,
though, so I would not call this idea scalable.
I understand that we are not the only ones who
thought ACLs were critical but later found that
they aren't. The relative merits of ACLs have
been debated extensively in online discussions
regarding FreeBSD and probably other platforms.
It seems easiest from a human perspective to keep
the explanation of security relatively simple:
1. Assume that all staff get either no rights or
read-only to various directories off of drive O:.
BIS and Admin, for example, have hidden theirs.
2. All members of a department get full rights
to their directory and its children. Samba can
handle inheritance of rights below directories.
3. A new subdirectory of an existing directory
(and its children) can be granted other rights
simply by making that new dir owned by a more
inclusive or exclusive group, one that might
include, say, FTEs plus dept. student workers.
4. Directories to be shared across departments
can be created in a separate, general-purpose
area independent of departmental directories.
As noted above, UNIX rights may not be terribly
scalable or granular, but my comments pertain
mostly to the departmental (e.g. Biomed) level
rather than enterprise needs. Fifty staff with
uncomplicated security needs don't need ACLs.
If I'm mistaken, please help me see what I have
missed. I do appreciate the thoughtful feedback.
From December 2004 msg to coworkers ---
I just overwrote Ubuntu on the stage left helpdesk
PC (that Lee and I favor) with Novell Desktop Linux.
This is a limited trial, though it may work forever.
I am configuring it like I did Ubuntu, so account
linuxedu (password linuxedu) has its custom config
automatically wiped and rebuilt at each boot.
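For reference, the wipe-and-rebuild trick amounts to something like this at boot. This is just a sketch - the real paths would be /home/template and /home/linuxedu, run as root from an init script; here everything lives under /tmp so it's safe to try:

```shell
#!/bin/sh
# Boot-time reset sketch: throw away whatever the last session left in
# the linuxedu home dir and rebuild it from a known-good template copy.
# Demo stand-ins for /home/template and /home/linuxedu:
TEMPLATE=/tmp/reset-demo/template
TARGET=/tmp/reset-demo/linuxedu

# Seed a template (in real life this is built once, by hand).
mkdir -p "$TEMPLATE"
echo "custom config" > "$TEMPLATE/.profile"

rm -rf "$TARGET"             # wipe the previous session's state
cp -a "$TEMPLATE" "$TARGET"  # rebuild fresh from the template
# chown -R linuxedu:users "$TARGET"  # needed in real use; requires root
```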
This could become a general-purpose config used
for some Blue PCs with a few more tweaks as well
as (later) a training/testing option on the 5150
laptops. Novell Linux includes many useful tools,
not the least of which is good wireless support.
While the default install would work fine for the
average staff member, I wanted to include a few
extras that seem pertinent to UNIX skill-building,
so here's how I went about it for the helpdesk:
1. Configured (previously) the PC with about 20%
of the drive for Windows C:, 20% for D:, and
the rest to be untouched/reserved for Linux,
then installed/updated image XPBASE04 on C:.
These I partitioned with a Win98 boot disk.
Novell sees any other Linux and overwrites
that by default, but leaves C: and D: alone
as long as there is enough space for Linux.
2. Downloaded the free trial ISO (DVD or CD) and
burned it to DVD; booted the PC from the DVD.
Registering at novell.com (required) is free.
3. Selected KDE as the preferred interface. SuSE
was built around KDE, which Dan and I favor.
Though Novell owns Ximian, maker of Gnome,
their official plan is to support both options.
4. At the main YaST installer screen, I checked
the header for software selection and then
used the drop-down Selections to search for
a number of items, including...
5. Under the main Selections list, check the
boxes for Development and Accessibility.
6. Using the drop-down (upper lefthand) search
for "Ruby" and check that box (version 1.8x).
A great object-oriented language from Japan.
7. Search for "Python" and add Python-curses,
-demo, -devel, -doc, -doc-pdf, -idle, -mysql.
Python is a better scripting lang than Perl
(Linux, Solaris, OS X all now include Python)
yet more mature and widely used than Ruby.
8. Search for "MySQL" and add MySQL-unixODBC,
-devel, perl-DBD-mysql, python-mysql, and
qt3-mysql. MySQL is the most common free
SQL server, though PostgreSQL is also free
and a more powerful option. I favor simple.
9. Search for "Firefox" and check both boxes.
10. Search for "Java" and add everything there
including JBoss, a powerful app server (if
you really want the whole Java shebang).
11. Search for "Samba" and add only samba-doc
and yast2-samba-client (no other options).
We do not want a slew of SMB servers here.
12. Search for "obj" and add libsoap. This is
a library for inter-system communication.
Microsoft and others have embraced SOAP.
13. Search for "qt3" and add qt3-unixODBC
and qt3-extensions. QT is a great cross-
platform (Linux, Win, Mac) GUI toolset
for C/C++ development. wxPython may be
easier, but QT apps may perform better.
14. Search for "autoyast" and add everything.
I doubt we'll use it, but if we ever do
more than a couple dozen Linux installs
it might be worth digging into autoyast.
15. In the window's upper-left dropdown pick
Package Groups -> Productivity -> then
under Archiving, add the app "star".
We use star to backup server Biomed2.
16. Under Package Groups -> Productivity ->
Graphics, check the boxes for dia, tiff,
and ImageMagick. dia is a crude but OK
alternative to Visio, and ImageMagick,
like GIMP, is a Linux alt to PhotoShop.
17. Under Package Groups... Publishing, check
boxes for docbook_4, sablot, and texinfo.
I don't know whether we'll ever use em,
but docbook and/or texinfo can simplify
UNIX-based doc publishing over time.
TeX authors can convert output to RTF.
18. After all these changes click Accept in
the lower-right, then Yes to install. If
YaST complains about package conflicts,
I do NOT install conflicting packages.
19. Back in the main installer screen, if
the partitioning options look wrong you
can click that header to fix them.
20. If Windows was installed and you want
to default to Windows, not Linux, check
the Boot header to change the default.
Stick with boot loader grub. It works.
21. Fix the time zone to USA-Central.
22. Click OK, then Yes to accept the changes,
then let the install run for 40+ minutes.
---- later, after about 30-45 minutes ---
23. Let a sysadmin enter the root password
(we have one common standard for Linux).
If warned that it's too long and will
be truncated, just proceed. No probs.
24. At Network Config, edit the Interfaces
settings to define the PC's static IP,
mask, routing, hostname (e.g. x237-XXX)
and domain (lib.umn.edu), and two name
servers: 220.127.116.11, 18.104.22.168.
NEVER guess. Ask a SysAdmin about this.
25. At the next screen, do not yet test the
connection. It might fail at this point
but work fine after you're all done.
Also skip online update for the moment.
26. Authentication method is local for now
(we may someday try that LDAP option).
27. At "Add a New Local User" I did this:
Full Name: Username is Password
User Login: linuxedu
Do NOT receive System Mail or change
any of the other defaults. It will
complain about the password, but this
won't be a problem because linuxedu
will not be granted special rights.
28. Skim the Readme that comes up, but
don't worry about all those details.
29. Accept the default Hardware Config and
click Finish. Wait for the desktop to
load, then remove the setup CD or DVD.
30. If it's a public Linux setup, ask Brad
or another SysAdmin who knows Linux to
make Linux recreate the linuxedu home
dir at each boot, and to set up a printer.
Printer setup in Linux is a bit weird.
Wow, eh? Actually, this shouldn't take more than
an hour of human intervention (plus file copying)
once you've done it a few times. More importantly,
after Linux is installed and tweaked to rebuild
linuxedu at boot, I suspect that we won't have to
maintain it for a long time. We won't enable any
easily-hacked server services, and spyware and
viruses are almost nonexistent (as with OS X).
This is not to say we will never need to patch
Linux, just that it's a lot safer and easier to
keep clean than Windows, and cheaper than OS X.
When we do patch Linux, YaST makes that easy.
Please share any concerns or questions about
this or about experiences with Novell Linux.
From October 2004. Support for these laptops has since been handed off to a co-worker.
This is a very quick one-off and may contain errors,
but here is what I am doing/have done for every
Dell Inspiron 5150 laptop in 555. Copying some
interested librarians just FYI - I expect that only
we in Systems will be doing this to the Inspirons.
Note that this assumes the laptops have already
been partitioned for Linux (these were last week):
1. Download the SuSE Personal 9.1 ISO image
from www.suse.com and burn to CD. I did 7
so that I could install on seven concurrently.
2. Boot the 5150 and hit F12 right away to get a
one-time boot menu, then insert SuSE CD,
and select the boot from CD/DVD option.
3. As it boots, right away scroll to the second
choice, being Installation. Otherwise it will
just boot the default OS from the hard disk.
4. After SuSE's installer (YaST) loads from CD,
select English. Give it a few seconds to think.
5. Scroll down to the "booting" choice, click
that, double-click "default" and then single-
click Windows. Click "Set as Default" at the
bottom of the screen, then OK, then Finish.
6. Scroll to change Time Zone to USA/Central.
7. Don't change anything else in the defaults.
Double-check your changes, click the Accept
button and then Yes. Take a half-hour break.
--- it restarts in default OS, which is Windows ---
8. Shutdown/restart, and at boot, select Linux.
9. Ask Brad or Dan to enter root password
(for now). This should be consistent on all
5150s, different from our other passwords.
10. Click Next to accept the first defaults.
11. Skip the Internet test (no IP to test yet).
13. Full Name = Username is Password
Username = linuxedu
Password = linuxedu
Otherwise stick with defaults, including
autologin as this user, don't get sys mail.
14. Click Next, and Yes to password warning.
15. Click Finish. Shutdown and restart.
16. The first time the GUI loads, it needs to adjust
video. This demands a root password. Ask
Brad (who will record password in our book).
----- the following are still under development -----
17. Apply automatic login fixes using script and
symbolic link copied from Brad's CD and a
dupe of /home/linuxedu to /home/template.
Do this as root, being extremely careful.
18. Eventually (not today) we can add a compiler
and other tools using www.openpkg.org...
Please share any questions or concerns. This should
be as easy and self-correcting, once done, as Fedora
was, but cleaner and more reliable.
---- Way back on 28 Apr 2005, David Farmer declared:
NAT is EVIL! There I have balanced the universe again. :)
Well, actually NAT is a tool and like most tools it is morally neutral.
Also like most tools in the hands of a professional or someone else
who knows what they are doing, it can be useful and even a good
thing. However, in the hands of someone who doesn't know what
they are doing, most any tool can be bad and even dangerous.
---- Brad's response ----
Thanks for the detailed response. I obviously don't
share your concern over NAT drawbacks, particularly
in the cases of firewire, VPC, or home configs, but
you do make an interesting case with good details.
Generally speaking, you raised valid criticisms,
but based on extensive experience I'll explain in
more detail (for the last time, I promise) why I
still think NAT is actually good for some things.
The main drawback is that NAT breaks an external
admin's ability to ID (e.g. traceroute) which node behind
NAT is causing or experiencing net problems, and to
directly reach the IP address from outside the NAT.
A few apps will also misbehave, but most users don't
do videoconferencing and such. Surfing, email, ssh,
some VPN clients, most other client apps work fine.
There are benefits, too. NAT frustrates hackers and
prevents users from running world-reachable servers.
I'm not saying NAT is a firewall, but it does help
and can be used in conjunction with a real firewall.
Breaking central control of course matters when you
manage ten thousand nodes (as you do). More nodes,
more of a problem, but not every department needs to
be seen that way. Some small shops could easily and
safely operate as locally-managed abstractions. If
you find major network probs coming from IP address
X, kill X - temporarily. Then it's that admin's prob.
If you don't see any probs on a given IP, why worry?
A local NAT would not scale well beyond a couple of
hundred nodes (including printers, etc.) but in one
smaller org I ran NAT for 200+ PCs over four years.
Success depends on a halfway-competent admin, a few
compromises, and intelligence when troubleshooting.
Where I worked we paid just a few hundred bucks per
month for shared Internet over most years, saving
taxpayers over the period maybe ten thousand bucks
vs. allocating around 200 real IPs. Many use this
logic at home: Cable modem plus router is usually
much cheaper than getting several real IPs in DSL.
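On Linux, by the way, the whole NAT trick comes down to a couple of netfilter rules; a rough sketch of the ruleset (the interface name eth0 is an assumption, and applying it requires root):

```
# Rewrite everything leaving the outside interface so it appears to
# come from that interface's single public address (masquerading),
# and let the kernel forward packets between the interfaces.
iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
echo 1 > /proc/sys/net/ipv4/ip_forward
```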
Responses to Krukenberg claims about limitations:
1. Global addressability - so what? My home
boxes (OS X, Windows, and Linux) all work
fine behind a cable/DSL router. See above.
2. Global uniqueness - again, so? As long as
the provider (at home, Comcast) is able to
associate a problem with a particular IP,
they can blame or restrict the main node.
3. Persistence of host-to-address binding -
Breaks a few apps, kills remote control
from outside the NAT, but otherwise no big deal.
4. Address structure - I don't buy this.
From outside, one whole NAT world can
be seen as one IP address (as VPC can).
The local admin can map what's inside.
5. Deployability of applications - Partly
true. Apps are usually not deployed from
way up top but from departmental servers.
In NAT I could deploy apps from server(s)
in my LAN, and NAT can span buildings if
the respective network is so configured
(not here; it was where I last worked).
6. Reliability - Why would NAT be any less
reliable than "proper" routing? I've seen
over a million emails route fine via NAT.
Reliability depends on hardware and a good
provider, not on how the IPs get assigned.
7. Scalability - yes, limited, but I see no
inherent issues up to a couple hundred nodes.
8. Private address spaces and VPNs - a
concern if remote access into the boxes
matters, less so if key servers get real
IP addresses using secondary NICs - as I
set it up to reach GroupWise from home.
---- the next is from Dave ----
Most people don't know what NAT really is, how it works, what its
limits are, what it breaks....
---- Brad's response ----
Most folks won't care. See above about the
apps I can use at home without caring it's
NAT. I set up the same for several relatives,
including one who does all her work for IBM
over a VPN connection via NAT + cable modem.
Comcast has never had a reason to complain.
---- the next is from Dave ----
Let's look at another tool, a hand gun; this is a very useful tool in
some situations. But I don't think anyone would argue that it is not
dangerous in the wrong hands. I'll also note that the University has
rules about having hand guns on campus, counter to the way the
state legislature thinks things should be, I'll add.
---- Brad's response ----
That is not a good analogy. The default behavior
of a handgun is to destroy things. It takes skill
and thought to avoid nasty results. This is not
the case with NAT, which in typical uses (VPC
or cable/DSL routers) requires almost no skill
to configure and use safely and effectively.
---- also from Dave ----
I know, you have a weak argument when you have to bring hand
guns into the argument. Here are some much better arguments!
[the discussion goes on in this vein, but this is enough]
Disclaimer: This is just wild brainstorming, and
doesn't necessarily reflect any particular plans
or goals of my employer (at this point, anyhow).
The March issue of Linux Journal included part 2
of a two-part series on Centralized Authentication
and Authorization by Alf Wachsmann. While the article
acknowledged LDAP as a capable option for open-
source centralized authentication, it suggested
that the older NIS standard can be lighter on the
server, lighter over the wire, and easy to integrate
with filesystem auth for various kinds of groups.
The only real negative I've heard about NIS is that
there were security issues, but Dr. Wachsmann's
article mentioned something about integrating NIS
with Kerberos, which I've heard is very secure. He
also suggested tweaks so that NIS only talks to
systems within given IP ranges, rejecting others.
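As I understand it, that IP-range restriction is done with ypserv's securenets file; a sketch (the subnets here are made up, not ours - first field is the netmask, second is the network):

```
# /var/yp/securenets - only hosts matching these entries get answers
255.255.255.255  127.0.0.1
255.255.255.0    192.168.1.0
```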
One thing I've heard about NIS that appeals to me
is that it looks to various applications - I would
hope Samba, for instance - just like traditional
UNIX auth/auth, meaning an app may not require
much customization to get the benefits of NIS.
Tying Samba into LDAP takes some tweaking.
I don't know if this is true but am curious about
anything that simplifies management of user
accounts and rights and keeps overhead on the
server and network to a minimum. So, I'm curious...
a. Do you use NIS for authentication/authorization?
b. Do you use Samba in conjunction with this stuff?
c. Do Windows clients connect to your NIS domain?
d. Is your connection from Windows to NIS secure?
...and anything that's relevant.
An entirely separate question is how and whether
this could somehow be tied into X.500 passwords.
If NIS won't do or can't be made secure enough
then it's not even worth trying to tie into X.500.
Part of an email to Biomed coworkers in April 2005 ---
Below are some alternatives to WinSelect that
may obviate (in part) the need to manually hack
public Windows configs into shape for security.
Note that there are two aspects: RESTRICTING,
which I'd minimize if possible, giving users
as many tools as we safely can, and RESTORING
public PCs to known good/safe states upon exit
or reboot (vs. constantly re-ghosting).
I think we could do restriction better, that we
could do both, and that we could get rid of the
menu system and make the desktops look and work
more like home PCs, making for a more familiar
and flexible user experience. The background
could be customized to present a Biomed "look"
but users mostly just want to get things done,
and our current approach seems to unnecessarily
get in the way of that (it's why we're here).
Some alternative tools:
1. Deep Freeze from http://www.faronics.com
[many other] techs have used this, and it gets
great press. It's not about restriction
so much as restoring a standard config
every boot. Very affordable in bulk.
2. FreezeX, also from Faronics, can become
part of a baseline config and prevent the
execution of anything that wasn't in the
whitelist you made (or later tweaked).
More expensive than Deep Freeze, but still
very reasonably priced for bulk purchase.
3. Secure PC, at www.citadel.com/securepc.asp
This product uses security controls already
in Windows to restrict operations, offering
far more control than WinSelect. It may be
aging, but got great reviews. Not cheap.
I'll forgo including SpyLock because I get the
impression it's a bit like WinSelect, blocking
the front door but not really posting a guard
inside the system, meaning you have to create
your own guards (policies), which I already
indicated is difficult to get right.
I'd much rather use Deep Freeze plus FreezeX
and a few OIT-recommended tweaks, otherwise
keeping the public PCs vanilla and standard.
The more we hack configs, the more likely we
are to see little problems like the ones I've
witnessed. Customization also takes time.
Recently I've speculated with [two coworkers]
about what we might do long-term with the many
smaller databases we have, especially in the
event that one key database support person
were to leave. Not that we anticipate this.
One thought has been to narrow our focus to
a smaller set of tools more commonly used in
other UMN libraries, including Ruby and PHP.
New databases could be built for web access,
and databases already in FileMaker or Access
could be gradually rewritten around MySQL or
PostgreSQL backends. We have server space.
Until this week I've been all for migrating
to more open-source, UNIX-based tools. For
apps that need to be very scalable, that are
enterprise in nature, or that must integrate
with other open-source tools like cookieauth,
Ruby, PHP, and Perl make a lot of sense.
For smaller/departmental databases, though,
higher priorities can include fast/easy dev,
fine-grained control over the look and feel
of forms and reports, and quick tweaks later.
Desktop database tools were built with these
needs in mind. FileMaker (especially), Access,
and maybe the HSQLDB-based database to be in
OpenOffice 2.0 (hsqldb.sourceforge.net) ease
and speed construction of small databases.
I for one love FileMaker. It is nicely cross-
platform (vs. Access) and very mature/proven
(vs. hsqldb), and we have a huge investment
in both locally-developed databases and local
expertise. Nothing in the larger open-source
client/server world, except maybe the beta
of OpenOffice 2, comes close in enabling the
non-programmer to develop powerful databases.
Moreover, were we to move FMPro databases to
PostgreSQL or MySQL with front-ends in PHP,
Ruby, VB.NET, whatever, we'd be taking on for
a long time to come all responsibility for
database migration and subsequent tweaks to
the databases. This would run counter to one
of the ideas from that IT Assessment process
- empowering users to support themselves.
From a management perspective I may want a
lot of control over departmental databases,
but from the perspective of users it may
seem, to put it nicely, counter-productive
to take away a tool that already works well.
Bottom line: our goals should include making
computing as transparent and convenient as
possible for our users, providing the tools
they need, and teaching/helping them to fish vs.
doing all the fishing (dev work) for them.
For smaller databases, I think we should keep
FileMaker Pro and try to take better advantage
of it, boosting what we have (v7, web access).
Sent this to the Apple-L email list at the U of MN last week.
I just joined this group and have some experience
to share and a question. First tips about installing
Tiger from scratch, then your apps:
1. As many learned, the image needs to be burned
in a SuperDrive or the like, not an external DVD
drive like a Plextor PX-708UF. One can, however,
use such an external with firewire to install 10.4.
You can convert the image to iso but the result
won't boot and the installer starts by rebooting.
There are workarounds, but use a SuperDrive.
2. The University image, as I'm sure many of you
know, does not include iLife apps. It is possible
(see question below) to reinstall apps off 10.3
install CDs that came with your system or that
were purchased earlier. Disk 2 of the Panther
install set includes a packages folder with an
installer for additional packages. For most you
get newer versions in Tiger or can get em free
online, so I'd customize that install and then
uncheck everything except iMovie and iPhoto.
3. Some apps obviously don't work yet, including
Cisco VPN and VPC. Add to that list Carbon
Copy Cloner. To backup my Tiger install using
just provided tools I first made two partitions,
made sure the second was at least 5 gig, did
a separate Tiger install to that, removing all of
the options (e.g. other language support), and
then booted from that and used Disk Utility to
make a dmg of the first partition, having done
all configuration already on the first partition.
This may not make a bootable image, but at
least I have a way to restore it all (a la ghost).
4. Whether building an image for Mac or Win, I
always start with a primary account and get
everything just right, add a secondary admin
account, then if needed lower the rights of
the primary account (say, macuser) to non-
admin. That way I know everything has been
configured for the account to be used and I
won't have to fix things if some app needs
account-specific tweaks (e.g. Office on either).
5. Haven't tried this yet with Tiger, but one can
build in Panther a third "template" account,
use sudo in Terminal to duplicate the entire
contents of that primary account's home dir
(including hidden files and subdirs) into the
template account's home dir, chown and
chmod -R so the template user owns all the
copied-in files, and then -- this is the great
thing -- script so at boot the main account's
home dir gets (a.) emptied, (b.) duped from
the template account, and (c.) chowned &
chmod'd to work right. Then you always get
the same config and can tweak it using the
template account. It's great for public Macs.
Script could also empty the Shared folder.
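The one-time template build in step 5 looks roughly like this. This is a sketch, not a recipe: the paths below are demo stand-ins under /tmp so it can be tried without sudo; on a real Mac you'd use /Users/macuser and /Users/template and prefix each command with sudo:

```shell
# Duplicate a configured home dir into a template account's home.
# Demo stand-ins for /Users/macuser and /Users/template:
SRC=/tmp/users-demo/macuser
DST=/tmp/users-demo/template

mkdir -p "$SRC/Library"                 # stand-in for a configured home
echo "prefs" > "$SRC/.hidden-settings"  # hidden files must come along too

cp -Rp "$SRC" "$DST"    # -R recurses (dotfiles included), -p keeps modes
chmod -R u+rwX "$DST"   # make sure the owner can touch everything copied
# chown -R template:staff "$DST"  # real use: hand ownership to 'template'
```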
Question: If we (University department) own a
Mac and it came with Panther including all the
iLife apps, is it legal to reinstall all those iLife
apps atop Tiger as described? I prefer this to
a simple upgrade because a fresh install will be
cleaner and more predictable than upgrading.
From March 2005; upon handing off responsibility for 15 laptops to a co-worker I summarized my support to date of these laptops as follows. This does not include all the customization details, which were very similar to what we do for Biomed staff configs (see separate future post).
In preparing Inspiron 5150 laptops I did this:
0.1 Partition drives as 20%=C:, 20%=D:, the
rest left for Linux (currently SuSE 9.1/2)
0.2 Format D: as FAT-32 so it's readable by
Win98-based boot disk, C: as NTFS...
0.3 Install SuSE 9.x (free) with some tweaks
I've discussed elsewhere. You can skip this step.
0.4 Build or update image in "cleanroom"
as in don't use laptop for anything else
before changes are all uploaded to K:.
For me, proper image updates take up
to a week, so plan on one less laptop
when updating. All 15 are now in use.
To deploy the latest image I did the following:
1. Boot via F12 from DVD, accept all defaults
2. Prepare what looks to be C: (D: when in XP)
with "format c: /q/u/v:D" on every target 5150
3. XCOPY D:*.* C: (five at once), take a break.
This copies both image and newer ghost
4. Enter C: and then GHOST to launch ghost
5. Take DVD out and start step 1 on the next
row. I've worked on 10 Inspirons at a time
6. As the next PC xcopies, ghost the first PC's
NTFS partition FROM the image on that C:
7. When ghost finishes, reboot and walk away
as Sysprep operates. Check that next row...
8. After sysprep, enter the XPsp2 product ID:
[get ID from download.software.umn.edu]
9. Login to Windows as XPUSER, no password
10. Double-click wireless green signal icon and
connect to the U's wireless network. Ignore
for now the warning about it being unprotected
11. After wireless connects, close that thing,
right-click the desktop, open its Properties
12. At ScreenSaver tab, click Power settings,
change it to Always On, and change that to
never turn anything off, even if on batteries.
Otherwise some class situations will choke.
13. Save the power changes, then use Start ->
Settings -> Control Panel -> Users... to
drop XPUSER rights from admin to limited
14. Shutdown and restart, then test by surfing
If you skip any of these steps, sooner or later
things may get messed up and instructors may
get annoyed before bothering to say what's up.
Deploying well helps prevent embarrassment.
Dropping XPUSER to limited rights, keeping the
admin password secret (Biomed IT staff only), and
other tweaks can help ensure that these 5150s
need little day-to-day support. Better to deploy
some new app as part of the image after testing
than to have to manually apply every tweak.
From March 2005 - note that I've since had second thoughts about becoming too dependent on Terminal Server as a single point of failure for Macs needing to run Windows apps. I like systems to be self-sufficient, and a good eMac with VPC is more so than an RDP client to Terminal Server. Open to discussion. Anyhow...
---- original message ----
Dave and one of the student workers helped
me verify Friday and Monday that we can now
run Aleph via RDP on that shared CCR eMac
via Windows Terminal Server from Biomed1
(which does double-duty for helpdesk tests).
RDP is a free client for Terminal Server from
Microsoft, available for Windows and OS X.
I set up RDP on that eMac to automatically
login to Biomed1 as user "biomcirc" and to
immediately run c:\al400\circ\bin\circ.exe,
starting in directory c:\al400\circ\bin. Users
never see a Windows desktop nor notable
delay, just Aleph. Closing Aleph closes RDP.
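For anyone replicating this: a saved Windows-side .rdp file with the equivalent settings would look roughly like the below (the hostname is a placeholder; on the eMac the same values go into the RDP client's preferences):

```
full address:s:biomed1.example.edu
username:s:biomcirc
alternate shell:s:c:\al400\circ\bin\circ.exe
shell working directory:s:c:\al400\circ\bin
```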
When printing, one chooses either locally-
connected printer(s) or CCR_A or CCR_B.
I can easily add printer CIRC_A to the list.
Three questions remain: (a.) how well it can
work for concurrent Aleph users, which we
can test using the existing Circ PCs and a
Win2K RDP client from k:\systools\software,
(b.) how well an Intermec USB adapter does
with handheld scanners - I'll test that soon,
and (c.) whether our superiors will OK this.
I have already seen that Aleph via Terminal
Server seems faster and starts MUCH faster
than via VPC, that Terminal Server can end
the need to support decentralized Windows
software at Circ PCs, and that requirements
on the client are easy: any current Mac with
512 mb RAM (demands less, but get 512+).
The goal of this for me is to reduce desktop
support. Having only OS X on Mac Minis at
Circ desks will do that. Terminal Server is
easy to support and cheap under U deals,
and we already have Biomed1 as a server
that can do double-duty. If ever we need to
part out Biomed1 to keep Biomed2 alive we
can fall back to using VPC7 for Aleph. Dave
has already helped verify that Aleph works
with VPC7 (pending handheld scan tests).
Why not just use thin clients if we're going to
use Terminal Server? (a.) Thin clients aren't
much cheaper than Mac minis, (b.) with true
thin clients there'd be no VPC7 fallback, (c.)
local Macs enable all other apps, including
browsers and FMPro, to run in native OS X,
(d.) desktop OS X is safer than any Windows
configuration, (e.) a local OS may be needed
for flash drives, and (f.) many student workers
already favor OS X and would welcome this.
I'd replace all Circ PCs, except maybe Copy
Center, with Mac Minis. Warranties on those
five 866MHz PCs have long since expired. If
that went well, I'd consider doing the same
for other student worker stations and/or any
willing FTEs who do not depend on certain
Windows-only clients like Illiad or Novell's.
If we don't try this we'll have to pay at least as
much for PCs and then maintain Windows
on every Circ desk, rather than one shared
Terminal Server. The chance to easily move
staff to Macs only comes around so often. I
say take it now, even if only for Circ staff. If
not via Terminal Server, then using VPC7.
From February '05, for U of MN Bio-Medical Library
As some of you know, Dave has been helping me
evaluate how well Aleph 16 runs atop OS X. I
have been hoping that using OS X in certain
capacities, e.g. at the Circ/CCR desks, might
decrease maintenance over time. The jury is
still out on that, but here's what I've seen:
---- OS X Pros ----
First, thanks to the intro of $499 Mac Minis,
gradual decreases in the prices of other Macs,
and the fact that the U now licenses Virtual
PC 7 for OS X, the base price of buying a Mac
and running Windows apps on it is comparable
to the cost of a reliable PC - e.g. a Dell
Optiplex (not a consumer-oriented Dimension).
Consumer Reports and others rate Apple way
above other vendors for reliability, so from
that perspective it's easy to justify Macs.
My own experience, and that of almost all
regular Mac users, seems to confirm that a
Mac with OS X is a very reliable and stable
platform, certainly as good as any other PC.
One other pro, and this is a huge point for
Macs, is that OSX-native apps and configs
are easy to secure and maintain. Unlike
Windows, OS X has never been subject to a
significant virus or spyware outbreak. Holes
in OS X security are rare and patched fast.
Some argue that's only because OS X gets
less attention with its small market share,
but I believe that the UNIX foundation of
OS X coupled with smart development choices
in the Aqua interface make OS X inherently
more secure than Windows. For example, with
few exceptions (all patched), rogue code
cannot just install itself atop OS X. Even
if you login as admin, installers ask you
for permission to do anything significant.
I have run OS X since its beta in 1999 on
a home eMac and have NEVER had a security
or virus problem, and I surf a lot of sites
and mess with it a lot. Steve can probably
say the same about public Macs. One just
has to apply periodic patches (very easy).
---- OS X Cons ----
There are few cons to running OS X, but in
some cases they may be significant. First,
if an app you require (Aleph, Illiad, Ariel)
is Windows-only then you probably need to
run Virtual PC atop OS X. This works well,
but there are drawbacks: (a.) VPC starts a
lot slower than native OS X apps, (b.) VPC
demands a lot of memory - increasing the
minimum hardware requirements slightly,
and (c.) some apps behave differently.
For example, Aleph 16 works great on Dave's
Mac once VPC is started, but the start time
can be up to a minute depending on what else
is running. One could launch VPC at login,
but this is still less than ideal. VPC can
print fine to a Mac's default printer but
apparently needs tweaking to see special
configs, e.g. extra trays and settings.
One quirk is that the Internet Explorer (IE)
now included with Macs is old, version 5.x.
You can of course run other browsers like
Safari (my favorite), Firefox, or Opera on
OS X, but a few $^$@ websites demand
not just IE but the latest version of IE.
Most importantly, Windows within VPC has
many of the same security issues as Windows
on x86 hardware, requiring extra care. If a
user only uses VPC for a particular app and
uses native OS X apps for everything else -
FileMaker, Office, email, surfing - then the
risk is small, but one must be careful.
I saw that some installs and updates don't
work well within XP atop VPC7, including
Windows Update and the Sun JRE 1.5x setup.
We got around the latter by installing
older Microsoft Java runtime in two parts
(to enable printing Aleph loan receipts),
but I can see how such workarounds might
be cause for concern over coming years.
One extra cost if we were to move Circ
staff to Macs would be a USB adapter for
the Intermec hand scanners, raising the
cost per desktop by at least $95, since
our scanners are PS/2-only by default.
A last con or at least consideration for
OS X, as with Linux, is that techs must
then know how to support at least two
operating systems. I'm not too concerned
about this, having worked for years with
both, but some consider this a downside.
--- Summary and a Bit About Linux ---
Some might say any computing environment
that ties you to a single vendor is risky.
In Apple's case I consider that poppycock,
both because Apple will be around for at
least as long as the average life of most
any PC (five years or more) and because
this can be more advantage than drawback.
OS X is nicely tailored to run on Apple
hardware, whereas Windows and Linux have
to contend with a thousand different x86
configs. Buying into any complex system
forces choices. Macs offer fewer choices
in hardware and software, but enough for
most purposes, and what's there is very high-quality.
Given the idiosyncrasies I have seen with
VPC7 I must qualify my otherwise unbridled
enthusiasm for OS X. Windows apps start
slower on VPC7, and some Windows apps may
have some trouble. To the degree that one
can use native OS X (whenever possible)
and otherwise deal with rare quirks in
VPC7, I'd certainly recommend Macs.
I should note that SuSE Linux has many of
the same security/reliability advantages
of OS X and runs nicely on most modern x86
hardware. The main drawbacks of SuSE are
(a.) the user interface and many of the
free open-source apps are not as polished
as those included in or available for OS X,
and (b.) running Windows on SuSE Linux demands
the purchase of VMWare or other extras,
whereas VPC7 is free for U-owned equipment.
I'd certainly choose SuSE Pro (about $80)
over most other Linux versions. It's very
complete, reliable, and works on a wide
variety of x86 hardware, including laptops.
That said, I'd choose OS X over Linux for
almost any end user because the interface
and apps are cleaner and easier to use.
Questions and comments welcome.
From February 05
Here are some sites I mentioned yesterday.
Sites for managing Mac OS X servers, labs and clients:
Site for info on cross platform issues:
Font related tools:
From September 2004...
Regarding desktop patches, I'll divide the topic
into operating system (OS), antivirus (AV), apps,
and spyware patches, interspersed with some
of my own philosophy of systems maintenance.
My views may or may not reflect a consensus.
Sorry this is long. I want to be as clear as I can.
Bottom line is that I think we do most PC support
just fine without server-driven patch deployment,
though we could improve how data is gathered.
We're a small shop and would not gain as much
from a server-driven approach as would a larger organization.
I'd make an exception for a particularly clean
and reliable server-based approach like I think
we might find in Netboot (part of OS X Server),
but Windows desktops are neither as monolithic
nor as easy to manage as Macs, and this can make
balancing usability and control a tricky issue.
In short, there is no "silver bullet" for Windows.
Hands-on support is inevitable as long as staff
have Admin or even Power User control of their
systems, and it would be unwise to remove that
- for morale, because some of our apps/tweaks
demand it, and because it eases ad hoc fixes.
I simply take advantage of occasional software
rollouts (like Office 2003) to clean up whatever
wasn't already automatically patched.
About those four subtopics:
1. For 2K and XP patches we do what they do
[elsewhere], using Windows Automatic Updates to
locally download and apply (or offer to apply)
critical patches every day. This seems to work
fine, mostly, though some cleanup is needed
when staff ignore/decline critical patches. On
public PCs the Automatic Updates are forced.
Note that whenever a big nasty hits campus,
network operators scan the networks for infected
PCs and will block the IP of any infected PC. We
fix the infected PC (just ghosting and patching)
and then I call ops back to re-enable the IP(s).
Apple OS X patches should be automatically
offered to those with admin rights to their Macs.
Dangerous Mac viruses and hacks are rare.
2. Norton/Symantec Antivirus has for years been
configured in all our public and staff images to
patch itself from Symantec's website, and this
has worked extremely well. Setting up a push
server might cut the deployment time from a
day to an hour, but that would also add another
point of failure and another server to maintain.
I'd prefer to just stick with what we've got. We've
only been caught off-guard once by a very fast
virus, and we had that cleaned up the next day.
3. Most of our desktop apps don't need regular
patches. Some apps get automatic updates as
needed during login, and Office gets patched
manually when we must for security (e.g. GDI).
It would be nice to automate that, but it's only
been once in two years that it was necessary.
Most MS Office patches are in no way critical.
4. Spyware: this has become a real challenge.
We do not yet have a good way to automate
blocking or watching for it. Spybot 1.3 works
pretty well ad hoc, and the increasing use of
Firefox/Mozilla with popup-blocking helps a
LOT (many users now favor IE-alternatives),
as do some recent IE fixes in XPsp2, but we
still see more spyware than we should.
Aside from Spyware we're not in bad shape. I've
been gradually replacing Win2K on staff PCs with
XPsp2 (with Windows Firewall), which seems to
reduce both spyware and the risks of hackers. A
dozen non-IS staff PCs are now running WinXP.
More generally, it has been my experience and
IT training that it is best to keep separate systems
as self-sufficient as possible, to avoid unnecessary
interdependencies. Always aim for fewer "moving
parts". That's one reason I replaced all our print
queues with direct-IP, and one reason I like how
Automatic Updates works. One less middleman.
from September 2004
This is partly fun, partly to record how I got around an annoying restriction with a particular piece of software we use to control door locks. This software must run on DOS 6.22, not Windows 98 command line, not in a DOS window, but native DOS. Very particular.
I had been installing DOS 6.22 onto a spare laptop, loading the software and then our data, running the program, backing up the data, wiping the laptop's
drive, and restoring our standard Win2K setup via ghost. That worked, but it killed a couple of hours every time I had to run that program, and I wasn't
inclined to just dedicate a laptop to this one thing. It should be a laptop because it must run in close proximity to the door lock being reprogrammed.
So here's what I did instead:
1. Installed the software as described, made
sure it works and has a current data set.
2. Loaded DOS drivers for the CDROM and
a RAM drive. Then the program failed to load.
3. Ran DOS 6.22 memmaker to optimize the
memory a bit. This took a couple of tries.
4. Cleaned up autoexec.bat and config.sys
so the memmaker settings would mostly
be retained but not address-specific refs,
since addressing might change as I...
5. Added some conditional branching to
config.sys and autoexec.bat (see below),
and made copies (replacing C: with A:)
6. Created a DOS 6.22 bootable floppy and
copied these tweaked autoexec.bat and
config.sys files plus critical DOS stuff and
the CDROM driver to the floppy, and then
tested to see that the drivers would load.
7. When satisfied that the floppy worked as
needed and left enough memory to run
the DOS program, copied its contents to
a backup on the network along with all
the lock program's data to another dir.
Copying the program took some doing
because DOS didn't include network
stuff. I actually ended up using DOS
interlnk/intersvr to copy to another box
with a FAT16 partition (boot from floppy)
and then using Windows to copy from
that FAT16 partition out to the network.
8. From that network backup, copied the
program (with included baseline data)
to a local drive on a PC with CD writer.
9. Used my custom DOS 6.22 boot floppy
with Nero 6 to start building a bootable
CDROM. This CD boots as a virtual A:
drive, sees the real A: as B:, and with
the DOS drivers loads its own CD and
RAM drives as C: and D: (when asked).
10. Burned all the program's data plus
a batch file to automate some steps
onto the non-floppy-emulating part
of the CD (the majority of the CD).
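The interlnk/intersvr shuffle in step 7 went roughly like this (a sketch from memory; the drive letter and directory names here are hypothetical, and you need a null-modem serial or parallel cable between the two machines):

```bat
rem On the DOS 6.22 laptop that holds the program: run the
rem server end, which shares its drives over the cable.
intersvr

rem On the PC with the FAT16 partition, boot DOS with this
rem line added to config.sys to load the client driver:
rem    device=c:\DOS\INTERLNK.EXE
rem The laptop's drives then appear as extra drive letters
rem (say E:), and an ordinary copy does the rest:
xcopy e:\lockprog\*.* c:\fromdos\ /s
```

From that FAT16 partition, Windows can then move everything out to the network, as described above.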
Remember that the goal here was to not have to rebuild the dang laptops. This way I can load DOS 6.22 and a baseline of the program plus data from CD, copy the program into the writeable RAM drive, load updated data from B: (the real floppy) to the RAM drive, run the program, save changes back to B:, remove
the CDROM, and be done.
The laptop's hard drive never gets touched, and this makes any standard PC or laptop instantly available as a fallback. As long as the data backup is preserved along with the bootable CD (or image thereof), this should work until the death of standard x86 PCs.
It was a trip down memory lane, having left DOS behind (I thought) half a decade ago. Here are the floppy's boot files. The actual CDROM driver is not CDROM.SYS. Use whatever DOS CD driver works for you:
------------ config.sys -------------------
[menu]
menuitem=NONE, Load neither of these drivers...
menuitem=RAMDRIVE, Load only the RAMDRIVE driver
menuitem=CDONLY, Load only the CDROM driver
menuitem=ALL, Load RAMDRIVE and CDROM drivers
[NONE]
[RAMDRIVE]
devicehigh=a:\DOS\RAMDRIVE.SYS 8192 512 /E
[CDONLY]
devicehigh=a:\DOS\CDROM.SYS /D:cd
[ALL]
devicehigh=a:\DOS\RAMDRIVE.SYS 8192 512 /E
devicehigh=a:\DOS\CDROM.SYS /D:cd
------------ autoexec.bat -------------------
@echo off
choice /t:n,5 Load SMARTDRV (default is no in 5 seconds)
if errorlevel 2 goto prompter
lh a:\DOS\SMARTDRV.EXE
:prompter
lh a:\DOS\MSCDEX.EXE /d:cd
if not exist d:\yabadaba.bat goto end
echo Next I'll call batch file d:\yabadaba.bat, which should
echo include DOS or data tweaks specific to using whatever
echo software was copied onto the larger portion of this CD.
echo To prevent calling d:\yabadaba.bat, break now. If you
echo have questions about this CDROM, just ask its creator.
pause
call d:\yabadaba.bat
:end
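The autoexec checks for d:\yabadaba.bat on the CD. A minimal sketch of what such a batch file could contain, assuming both drivers were loaded (RAM drive as C:, CD as D:) and using made-up program and file names throughout:

```bat
@echo off
rem Copy the lock program's baseline from the CD into
rem the writeable RAM drive.
xcopy d:\lockprog\*.* c:\lockprog\ /s /e

rem Overlay updated data from the real floppy, which is
rem B: once the CD has booted as virtual A:.
copy b:\*.dat c:\lockprog\

rem Run the program from the RAM drive, then save any
rem changed data back to the real floppy.
c:
cd \lockprog
lockprog
copy c:\lockprog\*.dat b:\
```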
The following is a copy of a very rough draft laptop policy from September 2004. I don't think we ever did anything specific with it, but here it is for future reference...
The [org] has A [brand and model] laptops available for [specify] use in room B of [building], plus a limited number of other laptops reserved for [org] staff use, but which have also occasionally helped absorb overflow demand for [blah].
This policy applies primarily to the [brand and model] laptops but could also apply to any staff laptop used within or outside the [org] for [blah] purposes. Note that all such laptop use assumes the active participation of a [org] staff member who can keep an eye on the laptops. These laptops should never be checked out to end-users nor left unsupervised in an unlocked room.
Classes normally take place at [place]. While various [org] staff teach these classes, laptop configurations will remain the responsibility of the [org] IT department.
Consistent, reliable performance of these laptops depends on the use of a standard configuration and on minimizing unauthorized changes to that configuration over time. I.S. will periodically update the standard configuration to meet evolving needs. We try to include any licensed, standard software that will typically be needed in classes. We do not include unlicensed or clearly extraneous software.
If there is a software package that an instructor believes should become part of the standard configuration, that is paid for (or free) and can be legally installed on ALL our laptops, and that we can realistically support in addition to other software in the image, we can try to add that to the next configuration (with adequate advance notice).
Software to be used beyond our standard configuration must be first approved and installed by the [org] IT department. All software to be added, either to the
standard configuration or ad hoc, must bear proof that it can be legally installed. Unapproved software and all user data will be removed without notice as laptops are patched, upon reconfiguration, or sooner depending on needs and staffing. Never leave user data on a laptop.
If any laptops sport multi-boot configurations to load a choice of operating systems (e.g. Windows and Linux) the maintenance procedures may differ, but the principles remain the same for any operating system or software
package: it must be legal, approved, and installed by IT.
Our latest [brand and model] configurations include (at least):
Windows XP sp2 with most of the common accessories,
recent patches, and a "Windows Classic" layout/style
Adobe Acrobat Reader 6.0.2
ActiveState ActivePerl 5.8
Microsoft Word 2003
Microsoft Excel 2003
Microsoft PowerPoint 2003
Mozilla Firefox (recent version) with Macromedia patches
Spybot Search & Destroy 1.3
Symantec Antivirus 9 (with automatic updates)
WinSCP 3 (mostly for software maintenance)
This list is subject to revision without notice, but we will try to maintain a good list of included software. Some or all laptops may also feature SuSE Linux or Fedora Core with all included free components loaded in a dual-boot configuration.
Each [brand and model] includes the following....
When laptops are in [building] or otherwise under the care of [org] staff, the [org] is responsible for them. When any laptop is in use by [other org or individual],
the [other org or individual] assumes all responsibility for the laptop.
Staff members and users from all departments agree not to install software, alter system hardware or software configurations, or otherwise violate the [larger org] guidelines for acceptable use. Any files or programs saved on the laptops will be deleted. The [org] is not responsible for damage to patrons' personal disks, for loss of data, or for files left on a laptop's hard drive.
Well in advance of any given class, the staff member planning a class should...
Aside from periodic reconfigurations, patches to Windows and Symantec Antivirus should be automatic. IT will regularly check the laptops to see that these are being patched, to patch anti-spyware software as well, and to scan for problems and remove any non-standard data or applications. Please direct any
maintenance support questions to [org] IT staff.
User Responsibilities (very rough)...
The following provisions describe conduct prohibited under these guidelines:
1. Altering system software or hardware configurations without authorization, or disrupting or interfering with the delivery or administration of computer resources.
2. Attempting to access or accessing another's account, private files, or e-mail without the owner's permission; or misrepresenting oneself as another individual.
3. Installing, copying, distributing or using software in violation of: copyright and/or software agreements; applicable state and federal laws; or [larger org] standards
4. Using computing resources to engage in conduct which interferes with others' use of shared computer resources and/or the activities of other users, including studying, teaching, research, and University administration.
5. Using computing resources for commercial or profit-making purposes without written authorization from the [larger org].
6. Failing to adhere to individual departmental or unit lab and system policies, procedures, and protocols.
7. Allowing access to computer resources by unauthorized users.
8. Using computer resources for illegal activities. Criminal and illegal use may include obscenity, child pornography, threats, harassment, copyright infringement, defamation, theft, and unauthorized access.
Composed in September 2004, Revised xxxx
In early 2000 my soon-to-be-wife and I bought a MacGregor 26x motor/sailor and outboard motor. We chose a Mercury 30HP "bigfoot" four-stroke, which cost around four grand. It was quiet and a good fit, but it had a huge problem that ended up stressing me out every trip and almost stranded us once.
Carbureted engines (http://en.wikipedia.org/wiki/Carburator) like to run hot for long periods on newer gas. Burns all the carbon. We usually run slowly for short periods to get in and out of docks. In typical sailboat usage we found that carbon built up quickly in the three carbs of a Merc 30.
We then had to have the engine serviced too often, the engine would die at low speeds, we could never achieve full speed, and it ultimately got so bad that we had to have all three carbs replaced for about $1300. Ouch! I think that was due in part to a bad servicing at a place on Central Ave (can't recall the name).
Dan's Southside Marine in Bloomington, MN replaced the carbs and it helped a little, but we later decided we'd had enough and traded in the Merc, still worth about two grand, on a Suzuki 40HP with EFI - electronic fuel injection. The trade-up wasn't cheap. With tilt/trim, start, mounting, deluxe controls and all, it cost us our entire tax return plus the trade-in, but the new Suzuki purrs like a kitten at any speed and runs as fast as the old one ever could - without even breaking a sweat (4000 rpm).
I have yet to push our new motor beyond 4100 rpm, which gets us a nice 10 mph even upwind - plenty for the average motor/sailor. If we drain our ton of water ballast and push it to 5000 rpm I bet we can hit 14 mph, plenty for tubing or maybe to get from Madeline Island in the Apostles (WI) to Barker's Island near Duluth, and back, in two days. We plan to try that trip this summer.
EFI is the way to go. Never again will I buy a mid-sized boat motor without EFI. Upgrading when we did also got us double the warranty, so we're covered through 2011. Having a good motor removes much of the stress of owning a boat. The rest depends on planning, crew, and experience.