February 2, 2009

From Windows to Unix: a mental journey

January 24th, 2009

Posted by Paul Murphy

Last week reader leigh@ wrote:

OK I get the picture but…

When will, or how will we get an article that helps us unfortunates who were trained on M$ across the line with Linux?

The second comment that cited the article as pro M$ made me laugh, and the response to that is typical and I didn’t read any more of the inevitable OS flame wars. Could we have a clear concise article on what they should have done in the transition from NT4 to Linux or even better…the same article covering how to transition from what they have now to Linux.

We use Fedora 9 in a VM at work, on a M$2008 server. I’d like to go away from M$ servers, retain .net stuff and move a lot of stuff to php. Troubles is the ratio of info about ‘How OS xyz is better’n OS $’ to ‘How to architect a change to OS xyz and why’ is about a hundred to one. I know mono may help me but I am having trouble finding time and information because juvenile jingoistic OS pundits write reams of crap. Help I’m drowning in FUD, and some of it is open source…

I will be revising my Unix Guide to Defenestration before serializing it here later this year – and that book, originally written in 1999/2000, is dedicated to meeting his needs.

Notice that I’m not concerned, and I assume Leigh isn’t either, with the specifics of individual conversion processes – i.e. the question isn’t “how do you convert from .net to mono?” but “how do you convert from a Wintel orientation to a Unix one?”

The single most important pre-condition for success in doing this is to be very clear that Unix is not Windows, not zOS, and not VM – and that what you know about managing these other architectures has co-evolved with those architectures and therefore may, but more likely will not, apply in the Unix world.

Some skills and ideas are transferable, but many of the management and technology assumptions, particularly those people are surest of (and are therefore least likely to think about), are likely to prove wrong in subtle and devastating ways.

Two very basic differences, for example, tend to so utterly confound data processing (and now Wintel) people that they never learn to use Unix properly:

  1. With Windows (and zOS) you care mostly about applications, with Unix you care mostly about users.

    This has many consequences; for example, the natural thing to do with Windows (and any other data processing architecture) is to put your people as close to the gear as you can – where with Unix you do the opposite: spreading your people throughout the organization by putting them as close as possible to users.

  2. With Windows (and zOS) the IT job is mostly about managing the IT resource: infrastructure, people, and applications – but with Unix, the IT job is mostly about serving users.

Both of these are consequences of differences in costs, risks, and performance. With zOS adding another job can upset a delicate balance between limited time and expensive resources; in Windows adding an application for a few users can have unexpected consequences across the entire infrastructure, and, of course, in both cases performance and flexibility are limited while change costs are high.

In contrast, the risk of adding a new application in the ideal Unix environment – big, central, processors with smart displays – is trivial; and the cost of doing things like creating a container for users who want “the database” as it was last February 19th at 10:00AM please, is, like its performance impact, essentially zero.

From a grunt’s perspective the key operational difference is that with Windows you spend most of your time keeping things working, but with Unix you set systems up to work and trust that they do, freeing yourself to spend most of your time not futzing with the system but acting as the human facilitator in the system’s interface to users.

As a manager the big difference between Unix and traditional data processing gets expressed most clearly in the default response to user-originated change requests. With zOS (and now Wintel) the risks, and costs, of change are so high that the right answer is almost always “no” – and shunting persistent requesters into the budget process is an appropriate learned reflex because it works to provide both budget growth and time.

In contrast, Unix costs and risks are so low that the right answer is almost always to simply say “yes” and move directly to the how and when.

This difference has a major organizational consequence with respect to role separation. When Finance spun out data processing in the 1920s, role separation naturally came along – and is still embedded in the CoBIT/ISACA data center operational standard. Unix, however, came from the science side and has no evolutionary history to justify any of this – meaning that the right thing to do is to wear a suit and a bland look in meetings with your auditors, but actually cross-train everyone to do just about everything while leaving within-team assignments for team members to sort out among themselves.

In practice, of course, you see rigid role separation applied to Unix, but this is almost always the result of organizational evolution and the decision making roles played by people whose assumptions reflect Finance, audit, or data processing backgrounds. In general what happens in those cases is that Unix gets used as a cheaper something else – and that can work, but doesn’t take advantage of the technology’s real strengths.

Most IT executives find it extraordinarily difficult to accept that you get the best results with Unix by cross-training your people, freeing them to make their own operational decisions, and centralizing processing while distributing real functional control to people working one on one with users; but this is the only known route to making corporate IT what it should be: a cheap, fast, and trusted “nervous system” for the business.

As I say in defen, the difference is that between management and leadership. With management you organize to get a known job done in repeatable, controllable ways – and that’s the right way to address something like printing customer lading reports for a railway in the 1920s: you train people to operate each machine, put someone in charge of each line, yell “go” at the right moment, and then wander around ensuring that each batch gets handled “by the book” by exactly (and only) the right people at each stage from keypunch to print distribution.

With IT, however, the job changes daily and you need leadership: the process of focusing more brains on goals; not management: the process of organizing hands to execute well-understood processes. Thus the very basis of applying data processing methods, whether with zOS or Windows, is antithetical to the IT job – and therefore to Unix as a tool for doing the IT job. Basically, most corporate data processing is organized and equipped to pound square pegs into round holes – and thus the amazing thing about them isn’t that they constrain organizational change while costing far too much and doing far too little, it’s that they work at all.

Permalink • Print • Comment

10 mistakes new Linux administrators make

  • Date: November 29th, 2008
  • Author: Jack Wallen

If you’re new to Linux, a few common mistakes are likely to get you into trouble. Learn about them up front so you can avoid major problems as you become increasingly Linux-savvy.


For many, migrating to Linux is a rite of passage that equates to a thing of joy. For others, it’s a nightmare waiting to happen. It’s wonderful when it’s the former; it’s a real show stopper when it’s the latter. But that nightmare doesn’t have to happen, especially when you know, firsthand, the most common mistakes new Linux administrators make. This article will help you avoid those mistakes by laying out the most typical Linux missteps.

Note: This information is also available as a PDF download.

#1: Installing applications from various package types

This might not seem like such a bad idea at first. You are running Ubuntu, so you know the package management system uses .deb packages. But there are a number of applications you will find only in source form. No big deal, right? They install, they work. Why shouldn’t you? Simple: your package management system can’t keep track of what you have installed if it’s installed from source. So what happens when package A (which you installed from source) depends upon package B (which was installed from a .deb binary) and package B is upgraded from the update manager? Package A might still work, or it might not. But if both package A and package B are installed from .debs, the chances of them both working are far higher. Also, updating packages is much easier when all packages are of the same binary type.

#2: Neglecting updates

Okay, this one says less about Linux than it does about poor administration skills. But many admins get Linux up and running and think they have to do nothing more. It’s solid, it’s secure, it works. Well, new updates can patch new exploits, and keeping up with your updates can make the difference between a compromised system and a secure one. Just because you can rest on the security of Linux doesn’t mean you should. For security, for new features, for stability – the same reasons we have all grown accustomed to updating with Windows – you should always keep up with your Linux updates.

#3: Poor root password choice

Okay, repeat after me: “The root password is the key to the kingdom.” So why would you make the key to the kingdom simple to crack? Sure, make your standard user password something you can easily remember and/or type. But that root password — you know, the one that’s protecting your enterprise database server — give that a much higher difficulty level. Make that password one you might have to store, encrypted, on a USB key, requiring you to slide that USB key into the machine, mount it, decrypt the password, and use it.
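A minimal sketch of that USB-key idea, assuming openssl and the GNU shred utility are installed; the filenames and passphrase here are placeholders, not a recommendation of a specific scheme:

```shell
# Encrypt the root password with a passphrase before copying the
# encrypted file to the USB key (all names are illustrative).
echo 'Sup3r-s3cret-r00t-pw' > rootpw.txt
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in rootpw.txt -out rootpw.enc -pass pass:usb-key-phrase
shred -u rootpw.txt   # destroy the plaintext copy

# Later, with the USB key mounted, recover it:
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in rootpw.enc -pass pass:usb-key-phrase
```

In practice you would type the passphrase interactively rather than putting it on the command line, where other users can see it in the process list.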

#4: Avoiding the command line

No one wants to have to memorize a bunch of commands. And for the most part, the GUI takes care of a vast majority of them. But there are times when the command line is easier, faster, more secure, and more reliable. Avoiding the command line should be considered a cardinal sin of Linux administration. You should at least have a solid understanding of how the command line works and a small arsenal of commands you can use without having to RTFM. With a small selection of command-line tools on top of the GUI tools, you should be ready for just about anything.
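A small starter arsenal might look like this; all of these are standard tools, and the paths are just examples:

```shell
# Four commands worth knowing cold, no GUI required:
df -h                    # free space on every mounted filesystem
du -sh /tmp              # total size of a directory tree
ps aux | head -n 5       # a quick look at running processes
grep -c '' /etc/passwd   # count the lines in a file
```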

#5: Not keeping a working kernel installed

Let’s face it, you don’t need 12 kernels installed on one machine. But you do need to update your kernel, and the update process doesn’t delete previous kernels. What do you do? You keep at least the most recent working kernel at all times. Let’s say you have 2.6.22 as your current working kernel and 2.6.20 as your backup. If you update to 2.6.26 and all is working well, you can remove 2.6.20. On an rpm-based system, list the installed kernels with rpm -qa | grep -i kernel and remove an old one with rpm -e kernel-{VERSION}.
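A sketch of that housekeeping on an RPM-based system; the echo makes this a dry run, and you would swap it for rpm -e only once the new kernel has booted cleanly:

```shell
# List installed kernel packages oldest-first (version sort) and mark
# everything except the newest two as a removal candidate (dry run).
rpm -qa 'kernel*' | sort -V | head -n -2 | while read -r k; do
  echo "candidate for removal: $k"   # when satisfied: rpm -e "$k"
done
```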

#6: Not backing up critical configuration files

How many times have you upgraded X11 only to find the new version fubar’d your xorg.conf file to the point where you can no longer use X? It used to happen to me a lot when I was new to Linux. But now, anytime X is going to be updated I always back up /etc/X11/xorg.conf in case the upgrade goes bad. Sure, an X update tries to back up xorg.conf, but it does so within the /etc/X11 directory. And even though this often works seamlessly, you are better off keeping that backup under your own control. I always back up xorg.conf to the /root directory so I know only the root user can even access it. Better safe than sorry. This applies to other critical backups, such as Samba, Apache, and MySQL, too.
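A small helper along those lines; the paths are examples, and cp -a preserves ownership, permissions, and timestamps:

```shell
# Copy a config file into a root-only directory with a date suffix
# before letting an upgrade touch it.
backup_conf() {
  dest=${2:-/root}   # default destination: readable by root only
  cp -a "$1" "$dest/$(basename "$1").$(date +%Y%m%d)"
}

# Example calls (run only if the files exist on this system):
[ -f /etc/X11/xorg.conf ] && backup_conf /etc/X11/xorg.conf
[ -f /etc/samba/smb.conf ] && backup_conf /etc/samba/smb.conf
```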

#7: Booting a server to X

When a machine is a dedicated server, you might want to have X installed so some administration tasks are easier. But this doesn’t mean you should have that server boot to X, which wastes precious memory and CPU cycles. Instead, stop the boot process at runlevel 3 so you are left at the command line. Not only will this leave all of your resources to the server, it will also keep prying eyes out of your machine (unless they know the command line and the passwords to log in). When you do need a desktop, log in and run startx to bring it up.
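On classic SysV-init distributions that default is a one-line change to /etc/inittab; systemd-based systems use a default target instead. A hedged sketch of both:

```shell
# Change the default runlevel from 5 (graphical) to 3 (text console).
# The path and the id: line are the traditional SysV-init form.
INITTAB=${INITTAB:-/etc/inittab}
[ -f "$INITTAB" ] && sed -i 's/^id:5:initdefault:/id:3:initdefault:/' "$INITTAB"

# systemd equivalent (assumption: a systemd-based distro):
# systemctl set-default multi-user.target
```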

#8: Not understanding permissions

Permissions can make your life really easy, but done poorly, they can make life really easy for hackers. The simplest way to handle permissions is the rwx method: r=read, w=write, x=execute. Say you want a user to be able to read a file but not write to it. To do this, you would issue chmod u+r,u-wx filename. What often happens is that a new user sees an error saying they do not have permission to use a file, so they hit the file with something akin to chmod 777 filename to avoid the problem. But this can actually cause more problems because it gives the file executable privileges. Remember what the octal shorthand means (each applies to all three classes: owner, group, and other):

  • 777 gives rwx permissions to all users
  • 666 gives rw privileges to all users
  • 555 gives rx permissions to all users
  • 444 gives r privileges to all users
  • 333 gives wx privileges to all users
  • 222 gives w privileges to all users
  • 111 gives x privileges to all users
  • 000 gives no privileges to any user
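A concrete example, using stat from GNU coreutils to confirm the result:

```shell
# Make a file readable by its owner and by nobody else.
cd "$(mktemp -d)"
touch report.txt
chmod u+r,u-wx report.txt   # owner: read only (r--)
chmod go-rwx report.txt     # group and other: no access at all
stat -c '%a' report.txt     # prints 400
```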

#9: Logging in as root user

I can’t stress this enough: do NOT log in as root. If you need root privileges to execute or configure an application, log in as a standard user and su to root. Why is logging in as root bad? When you log in as a standard user, all running X applications have only that user’s privileges; when you log in as root, X runs with full root permissions. This causes two problems: 1) a big mistake made via a GUI can be catastrophic to the system, and 2) running X as root makes your system more vulnerable.

#10: Ignoring log files

There is a reason /var/log exists: it is the single location for all log files, which makes it simple to remember where to look first when there is a problem. Possible security issue? Check /var/log/secure. One of the very first places I look is /var/log/messages, the common log where generic errors and messages land: networking events, media changes, and so on. When administering a machine you can also use a third-party application such as logwatch, which creates reports for you based on your /var/log files.
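A quick scan along those lines; note that the filename is distro-specific (/var/log/messages on Red Hat-style systems, /var/log/syslog on Debian and Ubuntu):

```shell
# Show the 20 most recent suspicious lines from the main system log.
LOG=${LOG:-/var/log/messages}
[ -f "$LOG" ] && grep -iE 'error|fail|refused' "$LOG" | tail -n 20
```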

Sidestep the problems

These 10 mistakes are pretty common among new Linux administrators. Avoiding the pitfalls will take you through the Linux migration rite of passage faster, and you will come out on the other side a much better administrator.


Just what does it take to switch to desktop Linux (part 2)?

December 1st, 2008

Posted by Christopher Dawson

At well over 300 talkbacks and counting, plenty of folks took my challenge (and my reader’s challenge) to sort out just what it would take to switch from Windows to desktop Linux. Obviously, there was plenty of the standard Windows vs. Linux bickering, but there were also a lot of well-thought-out responses. Given that our hypothetical office to be converted (the superintendent’s office) largely runs vanilla productivity applications, with our mission-critical (and proprietary, Windows-only) applications running via Terminal Services on a Windows 2003 server, it seemed as though the conversion would be pretty straightforward.

Here are the highlights from the talkbacks, though, with some important considerations. None of these seem to be deal breakers, but they certainly need to be part of a well-planned and successful conversion if we decide to head down that road:

  • Printing: Do all of the printers we access have Linux drivers? As much as we might want to be paperless, the super’s office, perhaps more than any other district administrative unit, must produce printed documents.
  • Backup: With our Windows machines, we can redirect desktops and user folders to a regularly backed-up server; Vista does a particularly nice (if slow) job of dealing with offline file synchronization. There are plenty of ways to handle this in Linux, but as far as I know, there isn’t anything quite as slick as either group policies in Windows for the redirects or the similar functionality enabled in OS X server (feel free to post a link or instructions for making this happen easily in Linux).
  • Replacing group policy and domain/enterprise levels of control in general: as noted above, while AD may have its share of issues, it makes pushing updates, enforcing policies, etc., really easy. Anyone have a good “Linux administration for dummies” link that covers good ways to handle policy for workstations across a network?
  • Remote access: A relative was visiting for Thanksgiving and couldn’t access his web-based VPN client on our Ubuntu laptop. Again, there are plenty of remote access solutions that will work quite well with Linux, but any existing infrastructure needs to be tested for compatibility.
  • Complex Excel files: Compatibility between OpenOffice 3 and Microsoft Office is generally quite good. However, since the super’s office also handles budget administration, there are most likely some fairly complicated spreadsheets floating around. A period of testing should certainly go on with OO.org, but a more important consideration may actually be the impact on productivity for budget admins who are extremely proficient in Excel.
  • “Extracurricular crap”: I really like this one, actually. Reader JoeMama_z makes a very good point: “Check out any extra curricular crap they may have, iTunes, Skype, etc. Yes these are silly but if users are pissed off you took away music they’ll be more likely to resist and sabotage.” Reader Ye offered this advice: “In my experience it’s not the mainstream applications that prevent a switch but rather the myriad of smaller programs which have no OSS replacements. Be sure to identify and factor these programs into any migration strategy.”
  • ADA compliance: This hadn’t even hit my radar screen, but it’s a very good point made by ZDNet contributor Marc Wagner. As he asks, “What about a superintendent (or staffer) with special needs? Are there sufficient ADA-compliant tools in the open-source community?” Any feedback on experience with ADA compliance and open source applications that can meet a variety of needs would be much appreciated.

So there you have it. Some new questions, some new considerations, and several good points. For us, I don’t think that any of these are insurmountable, particularly because so much of what we do is either web-based or strictly productivity-oriented.

We’re a small district that (since we now have a tech director instead of the occasional teacher or parent who jumps in and does some tech stuff) is finally starting to build infrastructure and think “enterprise,” so enterprise tools like Exchange haven’t even entered the picture yet. In some ways, now is the time to decide whether we fully embrace a Windows ecosystem or move to a much more open system, with all the advantages and disadvantages that might carry.


Just what does it take to switch to desktop Linux (part 1)?

November 30th, 2008

Posted by Christopher Dawson

Last week, when I asked “Are you sure you don’t just want to use Ubuntu?” I received a record number of talkbacks, good, bad, and in between. One of the more interesting, though, came from reader ksheppard, who responded with a challenge:

…Here’s a challenge to you: Make a list of everything – absolutely everything – a hypothetical school superintendent would be required to do to switch his personal laptop (which he uses at work and at home) from XP to Ubuntu, while maintaining, as much as possible, his usual behavior. Allow your blog followers to critique and refine the list until you think it is reasonably complete. Then have the research staff at ZDNet assign time and money expense to each item on the list. Allow your blog followers to help you critique and refine those figures.

Whaddaya think?

I think I wish we had research staff here at ZDNet, but I also think that it’s a really useful exercise to undertake. I think our district represents a good starting point for the exercise since we don’t actually use any applications in the superintendent’s office on the desktop that don’t have an open source alternative. As ksheppard pointed out, some districts may have bus routing applications, for example, that are Windows only.

Server-side, as well, we’re a Windows shop due to payroll and budgeting applications that are Windows only (these are accessed via RDP, so the desktop platform is irrelevant). However, on the desktop, where we spend most of our resources and my users in the superintendent’s office spend most of their time, we have very few barriers to adoption. This is also where I spend most of my support resources; as readers have pointed out, while Linux is not necessarily more secure than Windows, it is far more immune to attack right now than Windows (or even Mac) platforms, simply by virtue of market share.

I’d like to turn this into a series of posts, ultimately asking what it takes to switch to desktop Linux in education. Starting small, though, I’d like to answer ksheppard’s question. What is required to switch my superintendent to Linux on his desktop?

Here’s my initial list. Keeping the caveats above in mind, take ksheppard’s challenge with me and critique the list in the talkbacks.

  • Know how to access Windows shares on the network
  • Have a rudimentary understanding of the file system to ensure that he could copy, paste, and otherwise move around his files, including how to make backups
  • Understand how to save documents in PDF, ODF, and Office formats
  • Understand the differences in interface between OpenOffice and Office 2003 (the current system used in the superintendent’s office)
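For the first item, one common approach is an /etc/fstab entry that mounts the Windows share at boot (the cifs-utils package is assumed to be installed). Everything here — server, share, mount point, and credentials file — is a placeholder:

```
# /etc/fstab line mounting a Windows share via CIFS (names are examples)
//fileserver/shared  /mnt/shared  cifs  credentials=/etc/samba/creds.txt,uid=1000,iocharset=utf8  0  0
```

The credentials file holds username= and password= lines and should be readable by root only; after adding the entry, mount -a picks it up.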

These are all free. Because our desktop deployments (keep in mind that I’m not talking about server deployments yet) are quite simple in this office, I’m seeing very little downside and very little cost. This will obviously become far more complicated as I start to look enterprise-wide, where we do have some Mac- or Windows-only software. In this office, though, the only concerns my users have are:

  1. Where are my files?
  2. How do I access the budgeting software?
  3. How do I access the student information system (web-based)?
  4. How do I communicate (email and chat systems are currently web-based)?

What am I missing (just for this microcosm – we’ll get into the other environments later this week)?
