February 4, 2009

Vista gets a halo effect from Windows 7

February 2nd, 2009

Posted by Ed Bott

Conventional wisdom says corporations have completely rejected Windows Vista. But I’m seeing evidence lately that Vista’s image is improving with age. A new report issued today by Benjamin Gray and his colleagues at Forrester Research confirms that Vista is getting a new lease on life in the enterprise.  Microsoft’s well-executed development of Windows 7 might be a big part of the reason.

Forrester surveyed 962 IT decision-makers at North American and European companies with more than 1000 seats (more than a quarter of the survey respondents represent organizations of 20,000 employees or more) and found that Vista is now installed on just under 10% of all PCs within enterprises. One-third of all respondents have already begun Vista deployments, and another 26% have plans to begin deploying Vista this year or next. Another 15% are going to skip Vista and go straight to Windows 7.

In other words, the supposedly despised Vista is about to do what its predecessors did and begin significant adoption after a few years of apparent snubbing. That’s what happened to XP, which had less than 10% total market share (corporate and consumer) after a year on the market and didn’t hit the 50% mark until four years into its lifecycle. Based on Forrester’s numbers, I would expect Vista to approach 50% share by the end of 2010, with IT pros watching Windows 7 to see whether its performance in the field justifies the great early reviews.
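
For readers who like to see the arithmetic, here’s a rough back-of-the-envelope version of that extrapolation in Python. The two data points are rounded from the paragraph above (Vista at roughly 10% two years in, XP at roughly 50% four years in); the straight-line ramp between them is purely an illustrative assumption, not Forrester’s model.

  # Back-of-the-envelope projection: assume Vista climbs at roughly the same
  # straight-line rate XP did (about 10% at year two, 50% at year four).
  # Rounded figures; the linear ramp is an assumption for illustration, not a forecast.
  share_year_2 = 0.10   # roughly where Forrester puts Vista in early 2009
  share_year_4 = 0.50   # roughly where XP sat four years after its launch
  yearly_gain = (share_year_4 - share_year_2) / 2

  # The end of 2010 is about four years after Vista's launch.
  projected_share = share_year_2 + 2 * yearly_gain
  print(f"Projected Vista share at the end of 2010: {projected_share:.0%}")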

I remember reading surveys of IT pros about their intended adoption rates back in 2006, before Vista shipped. Most of those numbers predicted that Vista would be at least a modest success for Microsoft. A year later, after Vista’s troubled launch and a tidal wave of bad publicity and devastating Apple ads, the numbers had swung to extreme pessimism.

And now, two years into Vista’s life, those opinions have swung back to a fairly normal adoption curve. Why? The number one reason is Service Pack 1, which made a big difference for Vista. The overwhelming consensus among reviewers was that it fixed a long list of bugs, including some deployment blockers, and improved performance noticeably. SP2 is just around the corner, and anyone who’s doing their own testing instead of believing what they read on Slashdot has had plenty of time to decide whether it’s a smooth stable update (it is).

Vista is part of the same family as Windows Server 2008 and Windows 7, both of which have earned almost universal rave reviews. Server 2008 is built on the same code base as Vista SP1, which adds credence to the idea that Vista wasn’t fatally flawed, only badly botched at launch.

Another factor in Vista’s favor is that the same management team that is doing so well with Windows 7 is also in charge of keeping Vista running. By hitting a steady series of public milestones with Windows 7, Windows boss Steven Sinofsky is restoring corporate confidence in Microsoft’s ability to ship software on a reliable schedule with predictable quality. That confidence makes it easier for IT pros to conclude that the early troubles with Vista were a temporary glitch and not a sign of things to come.

Ironically, deploying Vista SP2 is the most conservative option for Windows shops. XP is about to enter the extended support phase (on April 14, 2009). By contrast, Vista has more than three years left in mainstream support, which runs until at least April 10, 2012. The same instincts that make an IT pro conservative enough to stick with XP for more than seven years will also prevent him from adopting Windows 7 too quickly, no matter how glowing its reviews. Caution dictates waiting at least one year or one service pack, whichever comes later. All of which makes the currently supported, well-documented Vista SP2 the surprisingly safe choice.
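
Here’s that rule of thumb as a few lines of Python, just to make the date arithmetic explicit. The Windows 7 dates below are placeholders (neither an RTM date nor an SP1 date was known when this was written), so treat it as a sketch of the rule rather than a schedule.

  # The conservative-adoption rule above: wait at least one year after release
  # or until the first service pack, whichever comes later.
  from datetime import date, timedelta

  def earliest_safe_deployment(rtm: date, sp1: date) -> date:
      """Return the later of (RTM + one year) and the SP1 release date."""
      return max(rtm + timedelta(days=365), sp1)

  # Placeholder dates for illustration only: RTM in late 2009, SP1 in mid-2011.
  print(earliest_safe_deployment(date(2009, 12, 31), date(2011, 6, 30)))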


XP, Vista, Win 7: The brewing of a perfect storm

February 2nd, 2009

Posted by Mary Jo Foley

Whenever Microsoft releases a new version of Windows, there’s always some period of uncertainty when customers face the choice of moving to the current release or waiting for the new product. This year, however, that transition period is especially uneasy.

Windows 7 is — by all accounts (except from the Microsoft honchos) — due out later this year and is looking faster, smaller and more stable than any Windows release out there. Windows Vista is here, but not a user favorite (to put it mildly). And eight-year-old Windows XP is still the dominant version of Windows out there.

So what’s a Windows user to do? Follow Microsoft’s corporate guidance and upgrade to Vista now in preparation for 7? Hang on a bit longer with XP? Try mixing and matching the three in your IT shop?

Microsoft’s Windows brass have been reticent to provide a detailed answer to the question “What should my desktop strategy be?” But Mike Fiorina, a Microsoft account tech specialist based in New England, grabbed the Windows-upgrade-confusion bull by the horns in a blog post this past weekend.

Fiorina explained that a perfect storm is brewing: XP SP2 mainstream support is set to end in April 2009 (and all support for it by July 2010). XP SP3 extended support isn’t retiring until April 2014, which, Fiorina said, “gives XP environments some breathing room, but not necessarily as much as you might think.”

Even though Vista SP1 has been out for a year (and Vista SP2 is expected some time in the next few months), Vista still is suffering from both real and imagined limitations, Fiorina admitted. From his January 30 post:

“The one recurring theme in discussions with corporate customers is that (Vista) application compatibility is a problem. Applications may not run in Vista, or maybe they can, but it’s not supported by the vendor. Remediation will be costly and time consuming. We get it. Many of the acquisitions and investments we’ve made in the past few years are targeting that problem specifically (Application Virtualization – SoftGrid, Enterprise Desktop Virtualization – Kidaro, etc.)”

Fiorina noted that the generally positive beta reviews of Windows 7 have meant “we’re hearing from a lot of folks ‘Why should I upgrade to Vista when Windows 7 is right around the corner?’” His answer:

“If we look at it from the perspective of an enterprise with fairly unaggressive adoption cycles, then you’ll see that you may be putting yourself in an untenable situation a few years down the road.”

Untenable? Fiorina continued his line of reasoning with the caveat, “for the sake of argument, make these assumptions”:

  • “Company A doesn’t deploy new operating systems or major applications until Service Pack 1 (or a similar bug-fix milestone) has been provided by the vendor
  • Company A probably won’t even begin testing their application footprint against the new OS until said SP1 is available
  • Windows 7 ships in the fourth quarter of 2009
  • Service Pack 1 for Windows 7 would likely not be final until the first half of 2011, if not later (going by our historical timelines for SP1 releases)
  • So, Company A would begin testing migration from Windows XP to Windows 7 SP1 in 2011 sometime. How long would it take to perform adequate testing of your application suite to certify\remediate it for Windows 7? For most, this is at least a 6 to 12 month process…so, now we’re in mid-2012.  At that point, you’re ready to start building an image (hopefully using the MDT to make your lives easier).  Maybe the image is ready to go in early 2013. Then you have a little over a year to get it out company-wide until Windows XP hits end-of-life. Is that enough time?  Perhaps…but is it worth backing yourself into a corner?”
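
To make the math concrete, here is a minimal sketch of Fiorina’s timeline as Python date arithmetic. Every date is an assumption lifted from his scenario (a Q4 2009 ship date, SP1 in the first half of 2011, six to twelve months of application testing, an image ready in early 2013), plus the April 2014 retirement of XP SP3 extended support; none of it is an official Microsoft schedule.

  # A minimal sketch of the timeline arithmetic in Fiorina's scenario.
  # All dates are assumptions taken from the post, not an official schedule.
  from datetime import date

  timeline = {
      "Windows 7 ships":              date(2009, 12, 31),  # "fourth quarter of 2009"
      "Windows 7 SP1 final":          date(2011, 6, 30),   # "first half of 2011, if not later"
      "App testing/remediation done": date(2012, 6, 30),   # 6-12 months after SP1
      "Deployment image ready":       date(2013, 3, 31),   # "early 2013"
      "XP SP3 extended support ends": date(2014, 4, 8),
  }

  rollout = timeline["XP SP3 extended support ends"] - timeline["Deployment image ready"]
  print(f"Company-wide rollout window: about {rollout.days // 30} months")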

Sure, you could argue that Fiorina is a sales guy and is looking for any way possible to chalk up a few more Vista sales while Windows 7 is gaining steam. But, to me, his post highlights what’s likely to be one of the biggest IT questions in 2009: On which version of Windows should I standardize as my corporate desktop?

Corporate users: What’s your thinking here? Has your first taste of Windows 7 led you to change your deployment plans?


February 3, 2009

Windows Defender

Just For Vista?

I received a phone call yesterday morning from a fellow newsletter reader who was wondering if the Windows Defender security package only works with Windows Vista. At first, I told him yes, because from what I understood from articles I've read in the past, that's how it's set up. He then asked me to double check on it and I gladly said I would. Well, let me tell you, I'm really glad I did! I have been wrong this whole time and I'm not too proud to admit it. Here's the scoop!

After doing some more research, I found out that Windows Defender automatically comes along with Vista, but you can also download a free version of it for earlier Windows operating systems. Wow, who knew?! The free version of Windows Defender is available for Windows XP SP2, as well as Windows 2000 and Windows Server 2003. (It will only work with the last two if you validate it through the Windows Genuine Advantage program first, though).

Now, I know you’re probably wondering how the free version stacks up next to the Vista version, right? Well, as far as I can see, it seems to work just as well. If you download Windows Defender to an earlier operating system, it will run through all the same scans, etc. that it does for Vista. It’s obviously going to work a little better for Vista, because it’s the security program that was made specifically for it, but if you’re looking for something new for an older Windows computer, the free download will do the trick! So, if you’re not using Vista and would like to download the free version of Windows Defender, you can do so right here. I hope you enjoy it!


Microsoft confirms no more betas for Windows 7

January 30th, 2009

Posted by Mary Jo Foley

Microsoft officials on January 30 reiterated that there will be no public Beta 2 of Windows 7 and the next milestone will be the Release Candidate (RC) test build of the operating system.

On the “Engineering Windows 7” blog, Windows development chief Steven Sinofsky reiterated what officials stated less plainly at the Professional Developers Conference last year: There will be just one beta of Windows 7.

Sinofsky emphasized in his new post that Microsoft is not sharing any new ship-date targets for Windows 7. As has been known for a while now, Microsoft’s delivery plan for Windows 7 is to deliver a public beta, an RC (it’s not clear at this point if that will be public or private) and then release to manufacturing (RTM). The RC will be “Windows 7 as we intend to ship it,” Sinofsky blogged. More about the RC from his post:

“We will continue to listen to feedback and telemetry with the focus on addressing only the most critical issues that arise. We will be very clear in communicating any changes that have a visible impact on the product. This release allows the whole ecosystem to reach a known state together and make sure that we are all ready together for the Release to Manufacturing. Once we get to RC, the whole ecosystem is in “dress rehearsal” mode for the next steps.”

Microsoft’s “official” response when asked for a ship-date target for Windows 7 remains three years after Vista’s general availability date (which was January 30, 2007). Many customers and partners believe Microsoft is continuing to target Q3 of this year as its RTM date.

For those hoping Microsoft might rush Windows 7 and release it now? Don’t hold your breath.


February 2, 2009

From Windows to Unix: a mental journey

January 24th, 2009

Posted by Paul Murphy

Last week reader leigh@ wrote:

OK I get the picture but…

When will, or how will we get an article that helps us unfortunates who were trained on M$ across the line with Linux?

The second comment that cited the article as pro M$ made me laugh, and the response to that is typical and I didn’t read any more of the inevitable OS flame wars. Could we have a clear concise article on what they should have done in the transition from NT4 to Linux or even better…the same article covering how to transition from what they have now to Linux.

We use Fedora 9 in a VM at work, on a M$2008 server. I’d like to go away from M$ servers, retain .net stuff and move a lot of stuff to php. Troubles is the ratio of info about ‘How OS xyz is better’n OS $’ to ‘How to architect a change to OS xyz and why’ is about a hundred to one. I know mono may help me but I am having trouble finding time and information because juvenile jingoistic OS pundits write reams of crap. Help I’m drowning in FUD, and some of it is open source…

I will be revising my Unix Guide to Defenestration before serializing it here later this year – and that book, originally written in 1999/2000, is dedicated to meeting his needs.

Notice that I’m not concerned, and I assume Leigh isn’t either, with the specifics of individual conversion processes – i.e. the question isn’t “how do you convert from .net to mono?” but “how do you convert from a Wintel orientation to a Unix one?”

The single most important pre-condition for success in doing this is to be very clear that Unix is not Windows, not zOS, and not VM – and that what you know about managing these other architectures has co-evolved with those architectures and therefore may, but more likely will not, apply in the Unix world.

Some skills and ideas are transferable, but many of the management and technology assumptions, particularly those people are surest of (and are therefore least likely to think about), are likely to prove wrong in subtle and devastating ways.

Two very basic differences, for example, tend to so utterly confound data processing (and now Wintel) people that they never learn to use Unix properly:

  1. With Windows (and zOS) you care mostly about applications, with Unix you care mostly about users.

    This has many consequences; for example, the natural thing to do with Windows (and any other data processing architecture) is to put your people as close to the gear as you can – whereas with Unix you do the opposite: spreading your people throughout the organization by putting them as close as possible to users.

  2. With Windows (and zOS) the IT job is mostly about managing the IT resource: infrastructure, people, and applications – but with Unix, the IT job is mostly about serving users.

Both of these are consequences of differences in costs, risks, and performance. With zOS adding another job can upset a delicate balance between limited time and expensive resources; in Windows adding an application for a few users can have unexpected consequences across the entire infrastructure, and, of course, in both cases performance and flexibility are limited while change costs are high.

In contrast, the risk of adding a new application in the ideal Unix environment – big, central processors with smart displays – is trivial; and the cost of doing things like creating a container for users who want “the database” as it was last February 19th at 10:00AM please, is, like its performance impact, essentially zero.

From a grunt’s perspective the key operational difference is that with Windows you spend most of your time keeping things working, but with Unix you set systems up to work and trust that they do, thus freeing yourself to spend most of your time not in futzing with the system, but as the human facilitator in the system’s interface to users.

As a manager the big difference between Unix and traditional data processing gets expressed most clearly in the default response to user originated change requests. With zOS (and now Wintel) the risks, and costs, of change are so high that the right answer is almost always “no” – and shunting persistent requesters into the budget process is an appropriate learned reflex because it works to provide both budget growth and time.

In contrast, Unix costs and risks are so low that the right answer is almost always to simply say “yes” and move directly to the how and when.

This difference has a major organizational consequence with respect to role separation. When Finance spun out data processing in the 1920s, role separation naturally came along – and is still embedded in the CoBIT/ISACA data center operational standard. Unix, however, came from the science side and has no evolutionary history to justify any of this – meaning that the right thing to do is to wear a suit and a bland look in meetings with your auditors, but actually cross train everyone to do just about everything while leaving within-team assignments for team members to sort out among themselves.

In practice, of course, you see rigid role separation applied to Unix, but this is almost always the result of organizational evolution and the decision making roles played by people whose assumptions reflect Finance, audit, or data processing backgrounds. In general what happens in those cases is that Unix gets used as a cheaper something else – and that can work, but doesn’t take advantage of the technology’s real strengths.

Most IT executives find it extraordinarily difficult to accept that you get the best results with Unix by cross training your people, freeing them to make their own operational decisions, and centralizing processing while distributing real functional control to people working one on one with users; but this is the only known route to making corporate IT what it should be: a cheap, fast, and trusted “nervous system” for the business.

As I say in defen, the difference is that between management and leadership. With management you organize to get a known job done in repeatable, controllable ways – and that’s the right way to address something like printing customer lading reports for a railway in the 1920s: you train people to operate each machine, put someone in charge of each line, yell “go” at the right moment, and then wander around ensuring that each batch gets handled “by the book” by exactly (and only) the right people at each stage from keypunch to print distribution.

With IT, however, the job changes daily and you need leadership: the process of focusing more brains on goals; not management: the process of organizing hands to execute well understood processes. Thus the very basis of applying data processing methods, whether with zOS or Windows, is antithetical to the IT job – and therefore to Unix as a tool for doing the IT job. Basically, most corporate data processing is organized and equipped to pound square pegs into round holes – and thus the amazing thing about them isn’t that they constrain organizational change while costing far too much and doing far too little, it’s that they work at all.
