NT Couldn’t Crush Unix

Sunday, February 22, 2015 at 10:07 am

Who says last rites need to be administered to Unix? The Dealer Services Group of Automatic Data Processing Inc. is among the many faithful keeping the operating system alive and well. Last month, the $750 million technology division of ADP signed a $100 million deal with Digital Equipment Corp., committing to Digital’s 64-bit AlphaServer Unix system to drive enterprise applications across its 18,000 car and truck dealerships.

DSG is part of a much larger flock of followers that believe in the virtues of Unix for mission-critical and enterprise applications. Officials at companies across all industries who run some flavor of the operating system say its scalability, reliability and management capabilities still make it the most logical environment to host large applications, including SAP AG’s R/3 and Oracle Corp.’s Financials, along with data warehouses and all the back-office operations that make their organizations run.

While a Windows NT setup can cost significantly less, these Unix devotees maintain that Microsoft Corp. still has significant work to do before NT can compete with the traditional strengths of Unix. But these believers are by no means blind followers. They acknowledge that in some cases, platform preference will be dictated by the availability of appropriate third-party applications. Therefore, many are keeping a watchful eye on Microsoft’s progress to decide if–and when–the time is ripe to convert.

“In the enterprise area, we feel Unix today is the right place to be … and [that it] offers the best performance and reliability, although we’re carefully watching NT,” says Miles Lewitt, vice president of DSG’s Global Product Development, in Portland, Ore. “We intend to be pragmatists. It’s not a religious thing for us.”

The principal issue for Unix remains unchanged: If the operating system is to maintain its position in the market, vendors need to accelerate their progress in developing standard extensions as well as pare down the number of versions available. “ISVs are having to port to 20 different Unix systems, and that makes life very difficult,” says David Floyer, an analyst at International Data Corp., in Framingham, Mass. “The … Unixes have to consolidate … and that’s happening.” Floyer says there’s still work to be done to enhance Unix standards for clustering (an architecture that provides continuous, uninterrupted service), high availability (24-by-7) and administration.

Clearly, there is a market for NT, primarily in the desktop and workgroup arenas, analysts say. IDC forecasts NT spending to increase from $6.9 billion to $27 billion between 1997 and 2000. By contrast, worldwide hardware sales of Unix systems will grow more slowly, from $24.7 billion to $39.5 billion over the same period.

Seamless Web integration

Internet Shopping Network Inc. counts itself among the disciples of Unix. Although the online computer superstore evaluated both NT and Sun Microsystems Inc. Unix systems in 1994, it chose the latter’s SPARC servers running Solaris as the platform to host its first Web site (www.isn.com). While NT admittedly didn’t have a strong presence in 1994, it did last year. It was then, once again, that ISN opted for Sun over NT to roll out its second site (www.firstauction.com), this time using 64-bit UltraSPARC servers running Solaris.

With page views for both sites running slightly over 1 million per day and daily order transactions for both just under 1,000 per day, ISN needed a platform that could accommodate its rapid growth. “Unix has been great for us, and we’ve stuck with it for three reasons: scalability, manageability and flexibility,” says Brett Colbert, director of quality assurance and IS for ISN, in Sunnyvale, Calif.

Unix also excels from a systems administration perspective, says Colbert, who cites the availability of tools compared with what’s out there for NT. For example, if an IS administrator wants to pinpoint a directory on a loaded hard drive to free up space, it requires only a simple command in Unix, Colbert explains. On NT, however, the administrator must go into the NT Explorer utility to view the system’s entire configuration as the first route for figuring out which directories are most full.
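
On most Unix systems, the “simple command” Colbert likely has in mind is du, which totals disk usage per directory. As a rough illustration of the same idea (a hypothetical sketch, not a tool ISN is known to have used), a few lines of C can walk a directory tree and report how much space it consumes:

```c
/* Minimal sketch: total the bytes used under a directory, in the spirit of
 * `du`. The starting path comes from the command line; details are
 * illustrative only. */
#define _XOPEN_SOURCE 500
#include <ftw.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>

static long long total_bytes;

/* nftw() calls this once for every file and directory under the start path. */
static int add_size(const char *path, const struct stat *sb,
                    int typeflag, struct FTW *ftwbuf)
{
    (void)path; (void)ftwbuf;
    if (typeflag == FTW_F)          /* count regular files only */
        total_bytes += sb->st_size;
    return 0;                       /* keep walking */
}

int main(int argc, char **argv)
{
    const char *dir = (argc > 1) ? argv[1] : ".";
    if (nftw(dir, add_size, 16, FTW_PHYS) == -1) {
        perror("nftw");
        return EXIT_FAILURE;
    }
    printf("%s uses %lld bytes\n", dir, total_bytes);
    return EXIT_SUCCESS;
}
```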

Similarly, says Colbert, the operating systems offer striking differences in their approaches to remote access. The Telnet protocol built into Unix allows an administrator to easily tap into clients via remote dial-up, while NT requires a third-party program such as Symantec Corp.’s pcAnywhere or Compaq Computer Corp.’s Carbon Copy to control the system remotely, he says. “That’s important if you get paged at 3 a.m. and you want to just log in, take care of the problem and go back to sleep,” says Colbert. “It’s much more complicated with NT.”

Scaling new heights

The ability of Unix to scale reliably as business needs grow is also unmatched by NT, users say. For running SAP R/3 alone, Chevron Corp. is using about 150 HP-UX servers from Hewlett-Packard Co. Granted, Chevron is more of an ardent Unix believer than most. It has been running other large applications on Unix for several years, specifically for engineering and processing geologic data, says Bob Washa, technical manager for SAP R/3 implementation at Chevron, in San Ramon, Calif.

“We did not consider NT a viable option at that time. … HP [Unix] met all of Chevron’s requirements, and it’s continued to get stronger each year in functionality, scalability and performance,” he says. “We feel today it’s the only viable choice” for handling Chevron’s needs, he adds. At Chevron, that’s no small task. The company’s SAP installation has 7,000 users connected to three HP 9000 servers with one 350GB database, Washa says. In that configuration, R/3 produces an average of 12 million online transactions per month, covering all aspects of corporate finance.

For now, Skyway Freight Systems Inc. also views Unix as the platform of choice to run its mission-critical business applications, including Oracle Financials. The $160 million logistics and supply chain management company does, however, deploy a number of NT servers to operate desktop applications, according to Tom Duck, vice president of IS at Skyway, in Watsonville, Calif. “Today we wouldn’t be able to run our systems on NT–it’s not big enough,” says Duck. “We have applications that need to have high-end, fast servers,” and Unix systems can handle that need, he adds.

Skyway’s Unix spread includes HP 9000 systems used for distributed processing, which can be expanded from a single machine with less than 1GB of memory to systems containing six processors and 4GB of memory, Duck explains. “We can move our processes from one [Unix] server to another fairly easily and upgrade boxes easily,” says Duck, who says Skyway is committed to Unix for the foreseeable future.

“Five years from now we may have a different opinion, but I don’t think things will change radically,” Duck says. The mix of desktop and PC application servers deployed on NT 4.0, along with database servers and large-scale application servers standardized on HP-UX, “seems to be a good combination,” he adds.

Skyway maintains an electronic data interchange server that processes 5,000 files a day, and Concerto, its desktop-based supply chain management suite of applications, processes around 4,000 new shipments every day on HP-UX. “There has not been a circumstance where we could not upgrade to cover the growth,” says Duck. “As the business grows, we add CPUs. … We cluster Unix boxes together so if there is a hardware failure, the other box in a cluster can pick up processes and run with it so you don’t have an interruption.” Skyway will add several more HP-UX servers this year, primarily for expanding Concerto’s functionality, he says.

With the expected roll-out of the 64-bit Intel Merced Unix architecture late next year, loyal followers say they’re covered for any kind of growth. Nokia Mobile Phones, which has used primarily HP servers for the last 15 years, is one company that is very interested in Merced. “That will be a consideration in sticking with Unix,” says Bob Schultz, an IT manager at Nokia, in San Diego. Nokia is running about 200 Unix servers (including some Solaris) for research and development, mechanical design, and engineering applications. The company also has about a half-dozen NT servers in play. But, says Schultz, “a lot of high-end tools are not available on NT,” specifically office automation applications.

“While some other operating systems are spending their time fixing bugs and problems, Unix has been working well and has been improved upon for over 25 years,” observes Mike Dotson, program manager for professional development programs at Florida Institute of Technology, in Orlando, which is running SCO Unix on Intel processors. Longevity has given Unix vendors “the luxury of spending their time making improvements,” Dotson says. “The Unix [community] will be the first to come out with the 64-bit operating system,” and that will allow for even greater scalability, speed and functionality, he says.

In many cases, companies are dependent on application availability to dictate their choice of operating system platform. Take SSM Health Care, in St. Louis, which owns and manages 24 entities, including 19 acute-care hospitals. SSM has approximately 40 HP-UX and Sun SPARC servers running its back-end applications for all clinical day-to-day patient care functions, including lab, radiology, admissions, discharge, census information and patient accounting. The company also uses some NT and Novell Inc. NetWare servers.

If given the choice, SSM officials say they’d probably stick with Unix. However, that will not be an option, since the health care provider is committed to rolling out Pathways Care Manager, software from HBO & Co., of Atlanta, which will only be available on NT 4.0. PCM allows SSM’s member hospitals to manage multiple contracts with different health care providers.

“It is being rolled out on NT simply because it was only available on NT,” says Jack Adams, director of operations for SSM. “I think we’d standardize on Unix if we had a choice, but unfortunately we don’t.” PCM will, however, interface to the HBOC Unix system, Adams adds.

Striving for diversity

With many ISVs strengthening their commitment to NT, some Unix shops are hedging their bets and picking vendors that embrace both disciplines. Magnet Interactive Communications has chosen Silicon Graphics Inc.’s Origin system as the primary platform for its Web site, application development and hosting environment. However, the multimedia company also maintains a heterogeneous operations center that includes Unix systems from Sun, along with PC-based Unix and NT servers. Like others, Magnet’s IT officials maintain that NT is limited in the number of processors it can support and in bus speeds. But they have “some doubts as to our vendors’ commitment to the Unix way of things,” admits David Brookshire, director of IT at Magnet, in Washington.

“SGI is making a major play into the NT market later this year,” Brookshire says. With the price of high-end graphics PCs coming down, “SGI is having to focus on NT-based systems to compete with the likes of Intergraph. … We’re a little insecure along these avenues for obvious reasons, but that is why we remain a diverse environment.”

Microsoft is “doing a great job of convincing software vendors to port to NT–sometimes exclusively,” agrees Mike Prince, CIO of Burlington Coat Factory Warehouse Inc., in Burlington, N.J. “So I think more and more, you’ll find an increase in the use of NT in the application server space.”

Before that can happen, though, NT will have to be weighed in the balance alongside Unix and prove its equal in all features–not just continue to excel in price/performance. And that may be a tough challenge, given the strategy of many Unix loyalists that “fewer is better.”

For example, Chevron’s Washa maintains it is ultimately less expensive to run fewer big Unix servers that can be integrated rather than lots of small, isolated NT machines. “Fewer is better in our book–they’re easier to manage and there’s a lower total cost of ownership associated with that [approach] in software maintenance and repairs,” he says.

Burlington Coat Factory’s Prince agrees. “Spreading enterprise processing to lots of little systems makes an administrative nightmare. You wind up with cross-dependencies in boxes,” he says. For example, one Unix box that provides file server capabilities might depend on another server for other functions, and they get to be interdependent. “Then it becomes hard to unravel the two when there is a problem,” Prince explains.

With an industry shift back to centralized computing, Microsoft faces the issue of being unable to build an NT system that’s big enough, says Prince. To compensate, you get into “racks of NT [servers], and administration of those racks is undesirable,” he explains. “We’re going back to a more centralized model.”

ISN’s Colbert also buys into the fewer-is-better model, noting that when ISN’s load grew significantly on the first auction site, “the beauty of the [Sun] box was we could pop in” six additional processors to double capacity.

Given NT’s limitations, Unix-based systems will remain the primary high-end server of choice for Magnet Interactive for the next few years, “but we’re not wearing blinders,” Brookshire says. With responsibility for managing more than 8,000 Unix servers in North America, DSG’s Lewitt is inclined to agree: “When [Unix] ceases to be the best product, it will be time for us to do something different.”

For that reason, DSG has also hedged its bets by committing to Digital, which offers both platforms. Lewitt says the company’s belief that both Unix and NT systems have a role to play means preserving his customers’ investments over the long term. “We’re committed [to Unix] for as long as being committed is the right thing to do in the market,” he says.

So, for the foreseeable future, many users are remaining faithful, but the message is clear: No one is such a Unix zealot that they can’t be tempted to convert. Burlington Coat Factory’s Prince says it won’t be a problem for him if the market changes and NT gains momentum for enterprise applications.

“The truth of the matter is, what everybody ought to be concerned about is what’s the best way to deploy computing power,” Prince says. “Today, that answer is clearly Unix. That’s not a Unix bigot’s opinion. That’s the opinion of a lot of CIOs. … It’s what works best that counts.”

Categories: Platforms

Hard Drive Data Recovery Using A Software

Friday, February 13, 2015 at 5:27 am

If you have deleted an important folder, mistakenly formatted the wrong drive, or otherwise put your hard drive out of order, you may be able to fix it with software. Luckily, software-related problems are the easiest kind to deal with when it comes to hard drive data recovery. To recover your data from a software-related problem, there are a few steps you need to take.

The most important thing to keep in mind when attempting hard drive data recovery is to stop using the failed drive immediately. That is because you obviously do not want your data to be overwritten and lost for good! Once you have shut down the computer, make a clone of the hard drive and attempt to recover your data from that clone.
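
For readers who want to see what that cloning step involves, the sketch below is a minimal illustration for a Unix-like system; the device path /dev/sdb and the image name drive.img are placeholders, and in practice a dedicated imaging tool such as dd or GNU ddrescue is the safer choice:

```c
/* Minimal sketch of cloning a failing drive to an image file before
 * attempting recovery. The source device and destination path are
 * placeholders; run against a drive you are not otherwise using. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *src = fopen("/dev/sdb", "rb");    /* the failing drive (read-only) */
    FILE *dst = fopen("drive.img", "wb");   /* the clone to recover from     */
    if (!src || !dst) { perror("fopen"); return EXIT_FAILURE; }

    char buf[1 << 16];                      /* copy in 64 KB chunks          */
    size_t n;
    while ((n = fread(buf, 1, sizeof(buf), src)) > 0) {
        if (fwrite(buf, 1, n, dst) != n) { perror("fwrite"); break; }
    }

    fclose(src);
    fclose(dst);
    return EXIT_SUCCESS;
}
```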

Now scan the clone you have created with one of the many recovery programs that are readily available. You can choose a free or a paid hard drive data recovery program according to your own preferences. Recuva is a good, reliable free option, while Zero Assumption Recovery is a well-regarded choice if you decide to pay for a program.

Hard Drive Repair – Selecting The Most Suitable Professional Service

Hard drive repair is, in most cases, something that cannot be performed without the help of trained professionals, especially if the damage is serious. A problem here, however, is how much you are willing to pay for such services. Depending on the nature of the damage and the type of device, hard drive repair services can cost much more than you expect. If you consider your data more valuable than the fee, it is important to make an informed choice when selecting a professional service.

Before you commit to a particular hard drive repair service, make sure the company has a high success rate in recovering data and repairing drives. Beyond that track record, confidentiality and the standard of the cleanrooms the company maintains are the two factors of prime importance; select a company that uses certified cleanrooms only. Last but not least, round-the-clock help and support is a must so that you can keep track of progress. Choose a company that provides 24-hour support and offers more than one way to communicate with it.

Fixing The Problem With Drobo Disk

Fixing a problem with a Drobo disk can be challenging. Any wrong move can affect the efficiency of your device or permanently delete your stored files. This is why you should call an expert computer technician when you are experiencing a problem with a Drobo disk. Usually, the technician will assess the condition of your device and run tests to identify the cause of the problem. Once that is done, he will restore the lost files and make the Drobo disk functional again. Remember that attempting to repair the device yourself carries a lot of risk; if you are unsure of what you are doing, seek a professional.

You can also call the manufacturer or contact them by email for guidance on this kind of problem. Your Drobo disk will not get fixed unless the right person works on it, so research the best computer technician in your area to ensure the problem is solved with minimal risk.

Categories: Platforms

Unix Knowledge Just Not As Important To Today’s Job Market

Friday, February 6, 2015 at 12:37 am

Anecdotal evidence suggests Lotus Notes expertise is still very much in demand, so maybe this has just been an abnormal quarter (last time its growth rate was 33%). None the less, it has fallen six places in the table as a result.

The biggest growth was shown by Java, with the jobs on offer up nearly threefold to 2,800, raising it 12 places in the table. This is testimony to the continuing rise in interest in Web-based applications, as shown also by a 72% increase in demand for generic Internet-related skills (now 27th) and a doubling in demand for HTML expertise (now 34th).

Ensconced

Only one other skill in the top 25 featured in more than twice as many ads as a year ago, and that was SQL, up from 2,800 to 5,700 posts. It remains firmly ensconced in the top 10 in eighth place. Two other skills that just failed to reach 100% growth were Windows NT – up from 4,900 to 9,700 posts – and, one place outside the top 25, SAP, which appeared in just under 1,500 advertisements this time.

The others that showed more than average growth in demand were, in descending order, Visual Basic, object-oriented programming, Access, Delphi, Oracle, RPG400, Ingres and Cics.

Apart from the last three, all the skills listed as growing in popularity in 1998 are new wave products. This suggests the boom in IT recruitment is being fuelled by a combination of a flourishing UK economy and a chronic shortage of skilled IT personnel, rather than short-term factors such as the year 2000 issue.

RPG400 owes its return to the top 10 to a resurgence in recruitment by AS/400 sites – this sector showed the biggest increase in demand over the quarter. Cics, similarly, is showing a growth rate almost identical to that of IBM mainframe recruitment overall.

Curiously, DB2 has not shared in this growth – demand here was up just 28%, compared to 55% for IBM mainframe staff generally and 57% for Cics expertise.

Two older IBM mainframe database products, IMS and IDMS, showed much bigger growth. IMS appeared in twice the number of ads as a year ago (620) and IDMS in nearly three times as many (500, 75% of which were in IBM sites), and these two skills are now in 40th and 44th places respectively.

Other IBM mainframe legacy skills to appear in more than twice as many ads as a year ago were PL/I (850 posts, now 33rd in the list, its highest position since 1993), DL1 (490, 46th) and JCL (450, 50th). There is perhaps a year 2000 factor here – IBM mainframe sites are clearly looking for a greater proportion of staff with legacy skills than in early 1997.

But the numbers are so small in the context of 65,000 jobs on offer in total that this does not materially affect the overall picture.

Epitome

Cobol, the epitome of legacy skills, has shown growth of 52%, which is very much in line with the overall market growth and of the rise in IBM mainframe recruitment (more than four out of five Cobol jobs are in these sites). As a result it has actually fallen a place in the table, as demand for Visual Basic has risen at a significantly faster rate.

Taking a long-term view, it is instructive to look back four years to that first quarter of 1994, when Unix moved into the first place it has held till now. The changes in the skills most in demand then and now are remarkably few, and most of these were not widely forecast in 1994.

Of the 10 skills most in demand four years ago, eight remain in the top 10 today.

Those that have dropped out are Ingres and Lan, and their replacements are Windows NT (which was down in 43rd place in the first quarter of 1994) and Visual Basic.

Apart from Ingres, only four others dropped out of the top 20: MS-Dos, VMS, graphical user interface (GUI) and Informix. Their replacements are Java, Powerbuilder, Office and object programming.

So, comparing the two tables, we can see the rise of Windows NT, and the growth in interest in the Internet (as represented by Java), in object-oriented programming (both as a generic skill and in terms of the displacement of C by C++), and in more modern application development methods (as represented by Visual Basic and Powerbuilder).

Offsetting that, MS-Dos and Vax VMS have fallen from favour, while all the open systems databases apart from Oracle have lost significant popularity.

The biggest surprises, though, are what have not happened. Cobol and RPG400, far from disappearing, are today in more or less the same places as they were then.

The mainframe, which many in 1994 thought would be extinct by now, is still represented by DB2 (in exactly the same place) and by Cics (actually five places higher), as well as by Cobol.

Categories: Platforms

ERP Proved A Solid Foundation For NT

Friday, December 12, 2014 at 11:52 am

For a long time, Unix looked to be the most fertile ground for seeding Hydro-Agri North America’s extensive ERP applications. Installed at the agricultural chemical and fertilizer distributor were two high-powered Hewlett-Packard Co. Unix servers running SAP AG’s R/3, with a third off-site for disaster recovery purposes. Systems analyst Jim Wiedrich, who ran the operation, was a self-described Unix fan, dubbing the “backslash” mark in Windows/DOS “backwards.” His preference? Leave Windows to the desktops and other non-mission-critical applications.

Wiedrich began to consider plowing new fields, however. With information systems dollars scarce and a low crop yield of Unix programmers, Hydro-Agri last year decided to root its future enterprise resource planning applications in Windows NT. The Tampa Bay, Fla., company pulled out its Unix servers, save for the database server, and replanted six Compaq Computer Corp. ProLiant servers with SAP R/3 applications on NT.

Norsk Hydro embraced NT heavily.

Why the shift? As Wiedrich admits, conventional wisdom would dictate choosing the more scalable Unix platform, given that Hydro-Agri is on a rapid expansion path. The $900 million North American subsidiary of Norsk Hydro, of Oslo, Norway, is farming new terrain in chemical and fertilizer manufacturing as well as in distribution. But it was that very growth that precipitated the move to Windows NT. “As we grew, we needed to upgrade the performance of our systems,” Wiedrich explains. “With the HP systems, we had to have an HP guy come out and upgrade them after-hours. With the Compaq [Windows NT] platform … we can go in and add components without bringing the system down. And we can do it ourselves.”

Hydro-Agri’s shift to Windows NT is emblematic of what’s happening with ERP in general. At ERP leader SAP, more than half of all new licenses are for NT platforms. Even last year, with NT versions of R/3 modules only first available, 22 percent of SAP’s new customers chose that platform. The trend is expected to continue this year–particularly among small and mid-size organizations (from $200 million to $2.5 billion in revenues annually) implementing ERP, according to Scott Lundstrom, director of research and enabling technologies at Advanced Manufacturing Research, a Boston-based consulting group.

One major reason is Microsoft Corp.’s dead aim at the enterprise market. “Microsoft is a much more focused application platform than Unix,” Lundstrom says. Despite many attempts to unify the “Heinz 57 varieties” of Unix flavors, each provider adds its own mix. And “backwards” slash or not, Microsoft has given application vendors a focused product, he says.

There are other reasons users are migrating from Unix into NT territory, Lundstrom says. “Windows NT is simply lower-cost than Unix solutions,” he says. A PC server capable of running an ERP application typically starts at around $10,000–about half the cost of a Unix server. Add to that some aggressive pricing by Microsoft Corp. on the operating system side, and the Unix system may be out of reach for most small to medium-size businesses, Lundstrom says.

But it’s not a perfect growing season for Windows NT. Performance and scalability issues need to be addressed, and some changes to the NT file structure are required before the fruits of the platform can be fully realized with ERP. Also, the fact that Unix is a mature operating system with a large contingent of ERP add-on software gives it an advantage.

For Hydro-Agri, Unix definitely had a leg up two years ago when the company sent out an RFP to 25 ERP vendors. A user panel from the company chose SAP, however, partly because Hydro-Agri liked its aggressive plans for NT, along with the software’s data accessibility and configuration flexibility. Hydro-Agri implemented Phase 1 of its ERP deployment in eight months, running initially on the HP 9000 Unix platform with an Oracle Corp. database. The company estimates that the installation of back-office functions, such as accounting, finance and cash management, has saved the company more than $1 million by better managing and controlling inventory flow and costs.

In Phase 2 of the installation, Hydro-Agri added profitability analysis and reporting applications to the system, giving managers real-time cost and revenue information online. Then, last year, the company began plans to migrate its SAP applications to NT from Unix, leaving the Oracle database on a single remaining Unix server. “It worked better than expected. On Friday, we were running on the Unix boxes, and on Monday morning, we had the applications running on Windows,” Wiedrich says. By the end of this year, the Oracle database will be transplanted as well, moving onto a new Compaq ProLiant 7000 server running under NT.

Widespread migration

It’s not just users that are finding NT to be fertile ERP ground. The operating system has become the testbed platform of choice for leading ERP vendors SAP and Baan Co. “You’ll see all of our newest versions of our applications show up first on NT,” says Allen Brault, director of NT business development at SAP’s Waltham, Mass., office.

While SAP has led the way among ERP vendors pushing the NT platform, other vendors are catching up. Baan’s NT products have only been available for eight months and they already comprise more than 10 percent of new shipments, says Anil Gupta, vice president of industry marketing at the Menlo Park, Calif., company.

The ERP vendors are finding that moving to NT opens up new markets among smaller, non-Fortune 500 companies. “Our growth market is now from the sub-$200 million- up to the $2.5 billion-size company,” says Steve Rietzke, head of SAP’s Microsoft partnership, based in Bellevue, Wash. “That’s where NT is making the most impact.”

Driving ERP’s success in that market is price. “Windows NT hardware and software are cheaper than the Unix versions,” Lundstrom says. That lower price comes from both prices for the software licenses and the cost of the hardware platform, he adds.

And even if Unix vendors lower hardware prices, as Sun Microsystems Inc. has done at times to counter Intel Corp. servers, PC servers still have an advantage, notes Hydro-Agri’s Wiedrich. “When I’m finished with a Unix server, it doesn’t go anywhere else for use. With a Compaq server, I can use it as a Notes server or, if worst comes to worst, I can cannibalize the parts for other servers and even desktops,” he explains.

That lower price also attracts customers who couldn’t afford ERP software on a Unix platform. That was the case at $47 million Green Mountain Coffee Roasters Inc., in Waterbury, Vt. “We wanted a solid ERP system, but it would have been nearly impossible to justify if we had chosen a Unix system,” notes Jim Prevo, CIO at the coffee retailer and wholesaler. Green Mountain opted for a three-tier client/server system after attempting to brew its own ERP system. Already a Compaq shop, the company decided the high-end servers would be more than adequate to meet its ERP needs. In early 1997, Green Mountain purchased 17 modules from PeopleSoft Inc., and by early June, seven were percolating: general ledger, accounts payable, purchasing, production management, bill of materials and routing, cost management, and inventory management.

Scaling new heights

The momentum behind NT doesn’t mean the combination of Unix and ERP is going away, however. “Among our large customers with large deployments, Unix remains their primary platform,” SAP’s Brault admits. NT is simply not ready for those large deployments, explains Lundstrom, because it lacks the fault tolerance and redundancy of Unix systems. Former Unix devotee Wiedrich agrees. “We run an 8 a.m.-to-8 p.m. shop. If we were 7-by-24, I don’t know if we could have chosen Windows NT at the time we did,” he says.

Others acknowledge that NT’s areas of vulnerability include its scalability. That may be remedied soon in NT 5.0 when Microsoft, of Redmond, Wash., includes an LDAP (Lightweight Directory Access Protocol) directory structure. Without the LDAP directory, converting files from the database to the NT front end is sluggish.

NT 5.0 will also include a single-sign-on feature that will allow users to log in to NT and get rights to SAP R/3 at the same time, says Edmund Yee, manager of network operations at Chevron Canada, a user of SAP on NT. Chevron Canada is beta testing NT 5.0. “Currently, you have to log on twice, so the single sign-on will make things a bit easier for the users,” Yee says.

Adding to NT’s limitations are its close ties to Microsoft’s own SQL Server database, which doesn’t scale as well as others, according to AMR’s Lundstrom. Coming soon, however, is a new release of SQL Server with improved scalability.

But perhaps one of the biggest hurdles to running ERP applications on NT is the relatively limited crop of add-on tools–for example, performance enhancement and application management software–that have been ported to the platform. “There are a number of those types of tools that simply aren’t available to Windows NT users,” Lundstrom says. Prevo at Green Mountain agrees, although he notes that many of those tools may not be of much use to a company of his size. “Even if we had those tools, it would have taken way too long to learn how to use them. That’s why we used an integrator and some PeopleSoft contractors–people with experience there–for our project,” he says.

However, notes Baan’s Gupta, many of those tools are being built into NT because it is application-oriented, which he sees as a plus for the operating system. “Backup and file copying are fully integrated into NT,” he says. “With a mainframe environment, you’d have to go buy a Tivoli [Systems Inc.] product or a [Computer Associates International Inc.] Unicenter product to do that.”

Further down the line, there are other technology improvements under way on the hardware side to strengthen NT’s roots in ERP. Compaq, in Houston, has made ERP a priority on its systems and has established several SAP competency centers to provide a testing environment for companies interested in SAP on an Intel platform. Compaq is also bundling Baan software along with Microsoft’s BackOffice on its servers. And recently, Intel has announced that it will begin working with SAP to ensure that the Intel platform is well-suited for ERP applications.

A close fit with the hardware will be key when the Intel platform moves to 64-bit microprocessing next year with Merced, making it roughly equivalent to Unix microprocessors. Realistically, though, it may be another year or so before NT servers are robust enough to stand shoulder-to-shoulder with large, entrenched Unix servers.

For one thing, apart from a few vendors, such as Sequent Computer Systems Inc., NCR Corp. and HP, large eight-way servers are not generally available on the Intel platform running NT. That will change with Merced, but it won’t happen overnight, analysts say. And even if it does, it would take some time and a compelling reason to dislodge bulletproof Unix installations. “You’d have to have a pretty compelling case to toss a system that’s working, just to replace it with new technology,” AMR’s Lundstrom says.

And Unix vendors themselves aren’t standing still. “As Windows NT is moving more towards 7-by-24 operations, Unix systems are moving to five 9’s [99.999 percent uptime] in reliability. So the bar is moving in both areas,” says James McDonnell, group marketing manager for personal information products at HP, in Palo Alto, Calif.

Dual approach

Vendors’ strong backing of NT has given some large companies the confidence to move at least some of their ERP applications to the platform. For example, Chevron Corp., in the United States, has standardized on SAP running on Unix, but the company’s smaller Canadian branch chose NT, says Yee, manager of network operations for the company, based in Vancouver, British Columbia. While the U.S. operation has more than 7,000 users, the group in Canada counts only about 180 active users. Because SAP has provided ways to migrate between the two platforms, the groups are able to share information. “We grab information from their servers when we need it,” Yee says.

R/3 can be configured to have the database and application running on Unix, while the presentation appears on Windows. In Chevron’s case, the Canadian database is run on NT, while the U.S. database runs on Unix. While the two systems don’t interact on a constant basis, Yee says there is no problem in converting the Unix database information into NT.

And while Unix has been appropriately advertised as a more stable platform than NT, Hydro-Agri’s Wiedrich has found a hidden advantage to moving to the more desktop-oriented platform. “When the Unix server went down, we had to either call me in or call someone from HP in to fix it. With the Compaq servers, about half the time, the desktop guys can make any adjustments. It’s nice not having your beeper go off every weekend,” he says. Even if that backslash does look “backwards” to him.

Categories: Platforms

Real Time Gets Real

Friday, October 3, 2014 at 12:33 am

Real time is all about providing a result in a bounded amount of time. It is about juggling multiple inputs from the outside world and supplying outputs back to it exactly when needed. An example of a real-time application is the antilock brakes on your car (they must be accurately pulsed tens of times per second). To satisfy the needs of real-time systems, APIs must be available to support accurate timing, fast communications and I/O, and precise, priority-driven scheduling. Conventional wisdom held that without proprietary APIs and OSes, it was impossible to achieve the level of performance required to solve real-time problems. However, there are significant costs associated with coding solutions to a proprietary product.

Recognizing this, OS vendors, researchers, and users participated in an IEEE working group known as Posix.4. The group’s goal was to refine existing Posix APIs and develop new APIs to address the needs of the real-time environment. The result of this effort was the Posix 1003.1b-1993 standard, or Posix.4. To address the need for efficient communications, the group added APIs that support memory mapping, message queues, semaphores, signals, and asynchronous I/O, or extended existing calls. It also added timers, memory locking, and programmable scheduling capabilities to support the accurate timing and scheduling necessary for time-critical tasks. Many legacy OSes offered one or more of these features, but often as a clumsy, heavyweight, kernel-based implementation, rather than the simple and fast implementation necessary for real-time applications.

Real-Time Additions

Posix specifies an OS environment where multiple processes operate independently, each with its own protected address space. While the protected address space ensures that processes do not affect a system’s integrity or that of other processes, this environment is confining for real-time communications. This is because programs must perform time-consuming OS calls to communicate with each other and with the outside world. The fastest form of communication is through memory itself. Therefore, for processes to interoperate at the highest possible speed with each other or with devices, they must be able to share physical memory. Posix.4 defines a sophisticated yet elegant function called mmap() that uses a file descriptor to establish shared-memory mapping. Here’s how it works.

First, the shm_open() call provides an easy means to obtain a file descriptor corresponding to a supplied path. If multiple processes call shm_open() with the same path argument, and each process supplies this returned file descriptor to mmap(), this effects a mapping to the same physical memory. That is, all the processes access the same block of memory, as shown in the figure “Client/Server Application of Shared Memory.”

Mmap() is extremely powerful. It maps memory among processes. Also, you can use it to map a disk file into memory. You can read or alter data in the file through fast memory operations, rather than through the traditional set of open/read/write/close calls. To ensure safe operation, mmap() provides control over which processes are allowed to read or write given shared-memory areas.

Traditional synchronous I/O (e.g., writing to a disk file) puts an application to sleep while the I/O operation is pending (an unbounded time). Clearly, this action is not suitable for a real-time application that must always be ready to handle an event. To satisfy this requirement, Posix.4 provides asynchronous I/O (e.g., aio_read()), so that the application can continue executing and be notified (via a signal) when the operation completes. A list I/O (lio_listio()) function can execute many synchronous or asynchronous I/O operations via a single command.
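
As a concrete illustration of that sequence, here is a minimal sketch in C, assuming a Posix.4-capable system (linked with -lrt on many platforms); the object name /demo_shm and the shared counter are hypothetical:

```c
/* Minimal sketch of Posix.4 shared memory: shm_open() names the object,
 * mmap() maps it into the process's address space. */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    /* shm_open() returns a file descriptor for a named shared-memory object. */
    int fd = shm_open("/demo_shm", O_CREAT | O_RDWR, 0600);
    if (fd == -1) { perror("shm_open"); return EXIT_FAILURE; }

    /* Size the object, then map it. Any other process that opens "/demo_shm"
     * and calls mmap() the same way sees the same physical memory. */
    if (ftruncate(fd, sizeof(long)) == -1) { perror("ftruncate"); return EXIT_FAILURE; }

    long *counter = mmap(NULL, sizeof(long), PROT_READ | PROT_WRITE,
                         MAP_SHARED, fd, 0);
    if (counter == MAP_FAILED) { perror("mmap"); return EXIT_FAILURE; }
    close(fd);                      /* the mapping stays valid after close */

    (*counter)++;                   /* fast IPC: a plain memory write      */
    printf("counter is now %ld\n", *counter);

    munmap(counter, sizeof(long));
    /* shm_unlink("/demo_shm"); */  /* uncomment to remove the object      */
    return EXIT_SUCCESS;
}
```

Run in two cooperating processes, each increment is immediately visible to the other, with no system call on the data path.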

When a real-time process executes on a demand-paged OS, it must ensure that any memory it uses stays locked in physical RAM. If it does not, the OS might page that memory out to disk. When the process next accesses this memory, the system’s memory management unit (MMU) must schedule a synchronous I/O operation to reload the memory page into RAM. Meanwhile, the process is put to sleep, leaving it vulnerable to missing time-critical events. Posix.4 provides the mlock() and mlockall() calls to accomplish the desired memory locking.
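
A minimal sketch of that locking step, assuming the process has the privilege that memory locking usually requires:

```c
/* Minimal sketch of locking a real-time process's memory into RAM. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>

int main(void)
{
    /* Lock everything mapped now (MCL_CURRENT) and anything mapped later
     * (MCL_FUTURE), so no page of this process can be swapped to disk. */
    if (mlockall(MCL_CURRENT | MCL_FUTURE) == -1) {
        perror("mlockall");
        return EXIT_FAILURE;
    }

    /* ... time-critical work here: no page faults to disk can occur ... */

    munlockall();   /* release the locks when the critical phase is over */
    return EXIT_SUCCESS;
}
```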

Control and Access

Shared memory is great for fast interprocess communications (IPC), but some means must be provided to manage access to it. Confusion will result if a process attempts to read or write shared memory that is being updated by another process. The Posix.4 solution to this problem is the semaphore. Both named and unnamed semaphores are provided. You create and access named semaphores via a path name, through the sem_open() call. Unnamed semaphores are created directly in shared memory and managed by the user. Through the use of shared memory and unnamed semaphores, it is possible to build elaborate shared data structures with fine-grained locking.

Event notification is vital to real-time applications, because processes must react quickly to outside events–such as releasing the brake on a wheel just about to lock up. Posix.4 provides an extension to traditional Posix signals called real-time signals. Traditional Posix signals simply set a bit, so it is impossible to know how many signals were actually sent to a process, or why. In contrast, real-time signals are queued so that none are lost, and they have been extended to contain additional information. If a signal is sent by sigqueue(), an integer or pointer value can be passed to the recipient, providing some indication of the actual event to be processed.
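
The queued-signal mechanism is compact enough to show directly. The sketch below, with a made-up event code of 42, queues a real-time signal to the calling process itself and then retrieves the payload with sigwaitinfo():

```c
/* Minimal sketch of Posix.4 real-time signals carrying a payload. */
#include <signal.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void)
{
    sigset_t set;
    sigemptyset(&set);
    sigaddset(&set, SIGRTMIN);
    sigprocmask(SIG_BLOCK, &set, NULL);   /* hold the signal until we ask */

    /* Queue a signal to ourselves with an integer payload; unlike classic
     * Posix signals, several of these can be pending at once, in order. */
    union sigval payload;
    payload.sival_int = 42;               /* illustrative event code */
    if (sigqueue(getpid(), SIGRTMIN, payload) == -1) {
        perror("sigqueue");
        return EXIT_FAILURE;
    }

    /* Wait for the signal and recover the payload identifying the event. */
    siginfo_t info;
    if (sigwaitinfo(&set, &info) == -1) {
        perror("sigwaitinfo");
        return EXIT_FAILURE;
    }
    printf("got real-time signal %d with value %d\n",
           info.si_signo, info.si_value.sival_int);
    return EXIT_SUCCESS;
}
```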

Messaging is a staple of real-time applications. Posix.4 provides an elegant set of APIs to implement message queues. To address real-time requirements, Posix.4 message queues support at least 32 levels of priority and can use real-time signals to notify a recipient of delivery. Message queues are efficient: Tests show that they are two to four times faster than traditional communications mechanisms such as sockets. Message queues are accessed via mq_open(), which uses a path name to identify the message queue to access.

Real-time applications must ensure that operations occur on schedule. To meet these requirements, Posix.4 provides real-time clocks (clock_gettime()) with up to nanosecond resolution and real-time timers (timer_create()). Unlike traditional Unix timers, many real-time timers can coexist in one process. Posix.4 timers use real-time signals to notify the process when an interval has expired. If all that is needed is a simple time delay, the nanosleep() call can delay the current thread of execution for a precise amount of time. In addition, Posix.4 provides a set of scheduling APIs that let a process define, query, and alter the scheduling policies and characteristics that apply to that process.
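
A minimal sketch of a priority message queue, again assuming a Posix.4 system with librt; the queue name /demo_mq and the messages are made up for illustration:

```c
/* Minimal sketch of a Posix.4 message queue with priorities. */
#include <fcntl.h>
#include <mqueue.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    struct mq_attr attr = { 0 };
    attr.mq_maxmsg  = 8;     /* queue depth               */
    attr.mq_msgsize = 64;    /* maximum bytes per message */

    mqd_t q = mq_open("/demo_mq", O_CREAT | O_RDWR, 0600, &attr);
    if (q == (mqd_t)-1) { perror("mq_open"); return EXIT_FAILURE; }

    /* Send a routine message at priority 0 and an urgent one at priority 31;
     * Posix.4 guarantees at least 32 priority levels. */
    mq_send(q, "routine status", strlen("routine status") + 1, 0);
    mq_send(q, "brake event",    strlen("brake event") + 1,    31);

    /* Higher-priority messages are delivered first, regardless of send order. */
    char buf[64];
    unsigned prio;
    for (int i = 0; i < 2; i++) {
        if (mq_receive(q, buf, sizeof(buf), &prio) == -1) {
            perror("mq_receive");
            break;
        }
        printf("priority %u: %s\n", prio, buf);
    }

    mq_close(q);
    mq_unlink("/demo_mq");
    return EXIT_SUCCESS;
}
```

Because the queue honors priorities, the “brake event” message is received first even though it was sent second.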

Categories: Platforms

Notes From The 90s

Monday, August 25, 2014 at 9:44 pm

I’m writing this in early January, and it will be in editorial offices from Istanbul to Tokyo by the 8th. The first part of this column will be online in the United States before the end of the month. That’s progress. It gets the awards information out faster. More importantly, I don’t have to guess each month what’s going to be interesting three months later.

The User’s Choice Awards are subjective and my own, and generally follow from what I have found useful here. I can’t pretend to have looked at everything, so when I use phrases like “Best of the Year” please understand there are some restrictions here, and perhaps I ought to say “Best I have looked at and used.” That’s truthful, but overly restrictive. I hear about a lot of stuff from readers, from associates, from Byte.Com’s excellent editorial staff, and even from press releases, particularly if they come from press people I know and respect.

Incidentally, there are a lot of respectable public relations people in this business: people who will put the best face on their products, but won’t lie about them, and don’t try to waste my time getting me to look at something that would be, well, a waste of time for me. My thanks to all of them. And once I decide I will probably like something, it’s not all that hard for me to get it. So, while I haven’t seen everything, I do see and use a lot of the best hardware and software around. It’s nice work if you can get it.

The User’s Choice Systems Awards

We have Wintel PC Systems, Macs, and Linux boxes. Of those, the most useful systems to me are:

The Compaq SP 750 Dual Pentium III Professional Workstation running Windows 2000 Professional. This machine does all my Net cruising, e-mail, accounting and books, expense reports, most artwork, and generally everything except creative writing. It would do that, too, but I generally employ a different “main machine” for writing, because I like to have a workstation nailed online to get e-mail and such like, and it can be distracting when important mail comes in. It’s simpler to have two machines. Regina, the Compaq SP 750, gets the User’s Choice Award as most useful workstation for the year.

The best all-around system I have found is an Intel D815EEAL motherboard and an Intel Pentium III of 800 MHz or greater. I’m fond of the new 1-GHz chips, but they cost significantly more than the 933 for no great performance improvement. Put in at least 128 megs of either Kingston or Crucial memory, add a Seagate Barracuda ATA III 40 gigabyte hard drive and a BTC 8x DVD drive as the CD drive, put it into a PC Power and Cooling Personal Mid-Tower case, and the result will be a splendid all-purpose machine, with video and sound good enough for office work and most games. It assembles easily and without problems from Intel’s Good Enough documentation. This combination gets my User’s Choice Award for Best All Around System for the year 2000.

The D815EEAL just got better. Analog Devices and Intel have got together, and the drivers will be available to make use of all the features of SoundMAX that are built into the board. You need the SoundMAX 2.0 drivers, which you can get from Analog Devices. Big orchids to both Intel and Analog Devices. You can now have excellent sound built onto your motherboard.

If you want to turn that D815EEAL with 1-GHz P III into a screaming games machine, add an Nvidia GeForce2 AGP board (which will displace the AGP video built onto the motherboard). Those come in several flavors, and they’re all good. The Ultra is the fastest, but quite expensive. The GeForce2 MX is a lot of bang for the buck. The GeForce2 GTS is a good compromise.

Whichever flavor you get, you will now have as good a game machine as anyone you’re likely to know, and that combination gets my User’s Choice Award for Games Machine, and the Nvidia GeForce2 series of video cards are hands down the winner of the User’s Choice Award for Best Graphics Board. The Hercules version of the Nvidia has a slight edge over the Elsa version, but they are both excellent. One thing, whichever implementation of the board you buy, check the Nvidia website for drivers. You’ll be glad you did. Incidentally, this system with Nvidia video board will also make a crackerjack video-editing system, although for serious video work you’ll want more and larger hard drives.

I built my own copies of the above systems without problems. It’s of course possible to buy off-the-shelf systems with those components.

Finally, a medium orchid to Compaq for the iPac “legacy free” workstation (see December 18, 2000 column). This is designed to run Windows 2000 Professional, and has a drive bay with components identical to those in the Armada series of portables. There are no slots. Sound, video, and Ethernet are built in and are good enough. If you want other peripherals, add them with USB. All told, it’s hard to beat these for simple, cost-effective off-the-shelf workstations, so long as you are satisfied with the built-in graphics.

Operating Systems

The User’s Choice Award for operating systems goes to Microsoft Windows 2000 Professional, which is to say, Workstation. This is the OS to use for most SOHO applications, either for stand-alone ops or as workstations on a network or, in my case, both.

I not only have Windows 2000 Professional on my laptop, but I routinely disconnect an Intel Pentium III Windows 2000 Professional system from the network, stuff it into the Explorer, and carry it down to the beach house to use for communications and writing. It works very well for both.

I have used Windows 2000 Professional as stand-alone, in peer-to-peer networks, in a Windows NT 4 Domain, and now in a Windows 2000 Server domain environment. It has worked well in all those cases. With a minor exception I can run all my legacy software other than games. The exception has to do with certain DOS programs I wrote in compiled BASIC in 1986. I find I have to keep a Windows 98 system to use that because, for reasons not entirely clear, I can’t make it print properly under 2000, although it does under Windows 98. I can live with that.

People heavily into older games will not want Windows 2000 Professional. Very few of the old DOS full-screen games will run properly (if at all), and some early Windows games don’t work either. On the other hand, most modern Windows games run just fine, and so do a few older WIN-G games that won’t run under 98.

If there’s a game or some other DOS legacy program you just have to have, you might want to test it on someone else’s Windows 2000 Professional system before making the change. Otherwise, go with 2000 Pro. It’s more stable than Windows 98, it’s FAR more stable than Windows Me, and it’s just a great deal easier to work with than Windows NT Workstation. Windows 2000 Professional is plug and play with most hardware you’re likely to have. Once again, if you have some special legacy hardware you can’t live without you’d better test before you commit. And installation of new hardware is infinitely easier than it was with NT 4.

While we are on operating systems, Microsoft gets a tiny orchid and a very large onion for Windows Me. The orchid is for fixing some legacy problems. Most of those legacy games that would not work with Windows 98 work just fine with Windows Me.

The onion is for the general instability and kludginess of Windows Me, which was rushed out before it was ready. Microsoft clearly hoped to come up with a single operating system based on NT technology, with a “home” and a “professional” flavor. It didn’t achieve that in time, and Me was a stopgap, designed to capture a new revenue stream and fill the gap for those who just had to have something new. It is incompatible with some modern games, including Microsoft’s own Crimson Skies. It is prone to odd failures for inexplicable reasons. It does have the virtue of running a number of older DOS programs — mostly games — that Windows 98 won’t run, but it also breaks some older DOS programs (including my accounting program that will run under Windows 98, runs but won’t print under Windows 2000 Professional, and won’t run at all under Windows Me). I can’t think of many reasons to “upgrade” to Windows Me from Windows 98, and I know of a number of reasons not to. Stay with 98 or go to 2000 Professional and give Windows Me a miss.

CD-R And CD-RW

A Chaos Manor User’s Choice Award to Plextor’s PlexWriter, which reliably burns CD-R and CD-RW disks without problems or concerns. These come in different speeds, and they all work. Mine is the 12-10-32A. Plextor also gets a big orchid for building “Burn-Proof” into the hardware. This long-needed technology turns off the laser if, due to a buffer underrun, there’s nothing to write. It’s astonishing that it took so long for someone to think of doing that, but I’m sure glad Plextor did.

A second User’s Choice Award goes to Ahead Software’s Nero Burning ROM, a program that, in combination with a PlexWriter, has been 100 percent effective in making CDs rather than coasters. Everyone needs the capability to burn CDs. In my case, I periodically make a copy of every word I ever wrote, along with all the editors required to read the files. I store copies of this “Full Monty” in various places, including at Niven’s house so even if Chaos Manor burns to the ground I’ll have all my creative work.

Another use is file transfer. The other day, Roland wanted to reconfigure our Linux boxes. He needed a number of files available only online. Chaos Manor still has only a 56K modem connection (with luck that will change soon), while Roland has a T-1 connection at home. It was a great deal faster to have him take home a Backpack CD-Rewriter, download the files, burn them onto CDs, and bring them back here. The Backpack CD-Rewriter isn’t anything as fast as an internal Plextor PlexWriter, but it’s external, portable, and works off either the parallel or the USB port. It’s worth taking one along in checked luggage if you are on assignment where backup is important. The User’s Choice Award for portable CD-R, CD-RW drives goes to the Backpack.

A rather grudging Chaos Manor orchid to Roxio DirectCD. As I said last month, this is the latest edition of what was one of the most hated programs around, Adaptec’s DirectCD. The latest version, though, actually works with all flavors of Windows including 2000 and Me, and works invisibly and well. It took Adaptec six years and having to spin the product off to another company to do it, but once done, they did it well. If you have Nero Burning ROM, you don’t absolutely have to have Direct CD, but if you use your CD-RW for incremental backups and as a file safe, you’ll want it. With DirectCD, your system sees a CD-RW drive as just another drive and you can read, write, and rewrite files the same as you would to any other drive. That’s very convenient, and Roxio gets an orchid.

DVD-RAM

An orchid to the Hitachi/Panasonic-led coalition that is bringing out DVD-RAM. I don’t have an award for a system yet because I don’t have one of the new versions with double (4.7 GB/side) capacity; I expect I will be giving a User’s Choice next year. DVD-RAM will eventually replace DVD, Magneto-Optical, CD-R, and CD-RW drives as the removable read/write storage device of choice. CD-ROM drives may stay around in about the same way that 3.5 inch floppies have survived, but since you can read an ordinary CD-ROM in a DVD-RAM drive, they’ll slowly vanish, as will DVD-ROM (read only) drives, as DVD-RAM prices fall.

DVD-RAM is permanent read/write storage, and while the cartridges are expensive now, their prices will fall as more drives are installed and demand rises. At 4.7 gigabytes a side on those cartridges, DVD-RAM can do many backup jobs now done by tape. They’re faster and far more convenient than tape, more permanent, and more reliable. You may still want tape to do enormous comprehensive backups, but for small establishments doing a weekly incremental update, DVD-RAM will be good enough and more convenient. I think this will be the year DVD-RAM begins to take off.

Categories: Platforms

What Other Things Can You Do With Dell PowerEdge Recovery?

Tuesday, July 15, 2014 at 4:26 am

A lot of people nowadays are becoming crafty about using hardware for something other than its original purpose, and the Dell PowerEdge recovery system is a common target. Many say that instead of buying expensive anti-virus software, you can simply back up your files through Dell PowerEdge recovery so they are protected against file corruption no matter what the circumstances. Others try to hack the system and make Dell’s recovery software usable on other brands of computers. The reason they do this is to save money on protecting their machines: rather than buying generic tools, they take a branded recovery system, adapt it, and use it on their own device, confident that both the computer and its files will be protected.

Many people are also convinced that, besides using this system to retrieve your files, you can use it as a vault for the important files you want to keep safe and maintained for a long time.

What Makes RAID 5 Recovery Better Than Other Recovery Systems?

Like other recovery systems, RAID 5 recovery is very useful for retrieving files. What makes it more special than the others, though? Besides being easy to use, RAID has evolved from RAID 1 through RAID 5, with each level refining how data and redundancy are spread across the drives, and further improvements may well arrive that make recovery easier for users of any experience level. Recovery systems exist to help users retrieve whatever they have lost or accidentally deleted, so what sets this one apart?

One reason it stands out is its automatic protection: you never have to choose whether to start a backup now or some other time. Whenever the machine is on, the array keeps its parity information up to date on its own, without asking for your consent (strange but true). That continuous, hands-off protection is what makes RAID 5 recovery a better system than the others.
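To make the idea concrete, here is a toy sketch in Python of the principle RAID 5 relies on: each stripe stores a parity block that is the XOR of its data blocks, so if any single drive dies, its blocks can be rebuilt from the survivors. This is only an illustration of the arithmetic, not any vendor’s recovery software, and the block contents are made up.

```python
# Toy RAID 5 sketch: parity is the XOR of the data blocks in a stripe,
# so any single missing block can be rebuilt from the remaining ones.
from functools import reduce

def xor_blocks(blocks):
    """XOR equal-length byte blocks together, byte by byte."""
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

# Three data blocks on three drives, parity stored on a fourth (toy sizes).
data = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_blocks(data)

# Pretend drive 1 failed: rebuild its block from the survivors plus parity.
survivors = [data[0], data[2], parity]
rebuilt = xor_blocks(survivors)
assert rebuilt == data[1]
print("rebuilt block:", rebuilt)
```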

Categories: Uncategorized

Windows ME – What A Stinker!

Tuesday, May 27, 2014 at 7:52 pm

Though the Windows Me upgrade itself costs as little as $49, there’s another price to consider: Dozens of utilities and other apps designed for earlier versions of Windows won’t work with the OS (see “Using Windows Me–the Hidden Costs of Upgrading,” page 52, for a list of the most prominent). If you plan to keep using one or more such programs under Windows Me, you’ll need to expand your upgrade budget to pay for new versions–or at least allocate time for lengthy downloads.

And those aren’t the only difficulties. By mid-December, a search of Microsoft’s knowledge base (search.support.microsoft.com/kb) for the text ‘Microsoft has confirmed this to be a problem’ retrieved 200 incompatibilities, “issues,” and other difficulties that the company blames on Windows Me. Searching for the same text for Windows 98 yielded the same number–which probably indicates that 200 is the upper limit on records returned by the site’s search engine.

But the same search directed at Windows 98 SE identified only 184 items. And although this data provides only the roughest of measures (we don’t know how many problems have been found for Windows Me and Windows 98 in total, for starters), we can say that Windows Me has generated more problem reports in less than three months than Windows 98 SE has in more than a year.

Most of the 200 problems that our search uncovered don’t afflict previous versions of Windows. They range from the silly (a pointer problem in Hasbro’s Tonka Search and Rescue) to the stupefying (system freezes when you switch between an MS-DOS window and Me’s Full Screen mode).

Several upgraders reported an incompatibility between Me and the Point-to-Point Protocol over Ethernet (PPPoE) DSL software commonly used by DSL providers. At least two major services, Verizon and BellSouth, were working on Windows Me updates as of mid-December.

Microsoft has long touted the Windows 9x family as the OSs most compatible with both new and aging consumer hardware and software. So why the compatibility issues in this swan-song edition?

Microsoft consumer Windows product manager Tom Laemmel attributes the absence of some drivers to a combination of factors. More-exacting compatibility testing washed some older drivers out, and a number of manufacturers simply didn’t submit new drivers to Microsoft in time for inclusion with the upgrade.

You can still pull out your device’s installation CD and reinstall those older drivers after Windows Me is up and running. But if you own a digital camera or scanner, you will almost certainly run into another difficulty: The new Windows Image Acquisition (WIA) subsystem is incompatible with manufacturers’ Win 98 software. You can reinstall the older software, but you can’t use the features of WIA.

STABLE, BUT INCOMPATIBLE

SIMILAR incompatibilities cause other “no-run” situations. Windows Me’s System File Protection feature makes the OS more stable by monitoring key system files in real time to ensure that no one, and no program, changes them. Several applications that want to change those files therefore can’t run under Windows Me. The vendors of most such apps have released Windows Me-compatible upgrades, but typically you must pay for the new versions.
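Microsoft hasn’t published the internals of System File Protection, but the underlying idea (keep a known-good record of protected files and catch anything that silently alters them) is easy to illustrate. The sketch below is a conceptual Python example with a made-up file list, not Windows Me code; the real feature also restores the original file from a cache rather than merely reporting the change.

```python
# Conceptual sketch of file protection: record a hash of each protected file,
# then flag any file whose contents later change. The file list is a made-up
# example; Windows Me's actual System File Protection works differently.
import hashlib
from pathlib import Path

PROTECTED = [Path(r"C:\Windows\System\example.dll")]  # hypothetical list

def snapshot(files):
    """Return {path: sha256 digest} for every protected file that exists."""
    return {f: hashlib.sha256(f.read_bytes()).hexdigest() for f in files if f.exists()}

def changed_files(baseline, files):
    """List files whose current hash no longer matches the baseline."""
    current = snapshot(files)
    return [f for f, digest in baseline.items() if current.get(f) != digest]

baseline = snapshot(PROTECTED)
# ... later, after an installer has run ...
for f in changed_files(baseline, PROTECTED):
    print(f"{f} was modified; a protection service would restore the original here")
```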

Finally, some readers report that Windows Me failed to identify and install drivers for several of Microsoft’s own mice and keyboards. This problem extends to the company’s software, too. For example, although he was satisfied overall with Windows Me, reader Troy Clarke reports that his keyboard began to malfunction after he installed Microsoft’s Internet Explorer 5.5 Service Pack 1.

HEALING THE HURT

CLARKE WAS undaunted by the update snafu with Internet Explorer 5.5, however. In previous versions of Windows, attempting to remove an Internet Explorer version upgrade or service pack didn’t always succeed–the bugs checked in, but they didn’t check out. This time around, instead of uninstalling Service Pack 1, Clarke simply rolled the system back to its pre-SP1 state, using Windows Me’s new System Restore feature. Dozens of readers lauded System Restore’s ability to undo buggy software installations.

“System Restore alone is worth the price of the upgrade,” writes Douglas Emerick of Langhorne, Pennsylvania. When an application that he installed somehow disabled his computer’s USB ports, Emerick says, System Restore saved him hours of troubleshooting.

But not everyone in our informal survey had a good experience with System Restore. “It didn’t work,” reports Gene Adamski of St. Augustine, Florida, adding that a dialog box simply announced that the system could not be restored, providing no further explanation. Other users say that they had to disable System Restore because it demanded too much space on their hard disk.

In addition, many of the readers grouse about Windows Media Player 7, calling it a slow, crash-prone memory hog that has proved to be no match for such leaner, meaner players as MusicMatch Jukebox, RealPlayer, and Winamp, or even for previous versions of Media Player itself.

Likewise, readers report little interest in the limited Movie Maker video-editing software, with many objecting to the fact that the operating system installs it by default. Others grumbled about how it saves video only in a proprietary Microsoft file format.

As if software incompatibilities and lackluster extras were not enough, Windows Me’s reduced MS-DOS support angers other readers. Many of them express confusion over the details: You can still run DOS programs, open a DOS prompt window, and issue certain commands, but you cannot boot the computer directly to a DOS prompt (except from a start-up floppy disk that you can make from within Me), and you cannot reboot in MS-DOS Mode.

DOESN’T DO DOS

WINDOWS EXPERTS who were accustomed to using DOS text commands for backing up, editing, and restoring the Windows Registry in previous versions can do so no more. And those are not the only command-line tools that won’t work in a DOS box under Windows Me. Many of the existing antivirus, disk-maintenance, and hardware-configuration utilities won’t function with Windows Me either.

UPGRADE RESISTANT

FOR SOME READERS, such fundamental changes are reason enough not to upgrade. Donald Matschull, business manager for a church in Plano, Texas, says he’s not interested in Me because it means training people to use and support a new OS.

Matschull says he’ll resist replacing his aging Windows 95- and 98-based machines as long as new computers are available only with Windows Me or Windows 2000 preinstalled. He resents the way the industry abandons old OSs when new ones come along. “I question the efficiency of new technology that forces workers to relearn procedures they already know,” he comments.

With readers reporting such a broad range of experiences, it’s hard to offer definitive advice to prospective upgraders. At the very minimum, you should take a careful look at Microsoft’s step-by-step upgrade guide at www.microsoft.com/windowsme/upgrade/checklist.asp before you buy Windows Me. In particular, visit the hardware compatibility guide (www.microsoft.com/windowsme/upgrade/compat). In addition, be sure to download and install the latest Windows Me–compatible drivers for your hardware, if they are available.

CAUTION: DON’T BURN YOUR BRIDGES

IF YOU DO decide to perform the upgrade, be careful not to skip the steps that enable you to return to your current version. More than one respondent to the PC World survey lived to regret their failure to back up the old configuration and drivers before performing a clean install.

Categories: Platforms

Choosing An OS For Your Older Machine

Wednesday, May 14, 2014 at 9:53 pm

PERHAPS THE SINGLE MOST important factor in your productivity is your computer’s operating system (OS). It determines how you store and retrieve files; how you connect to the Internet and to your home network; how you use devices like PDAs and peripherals like printers and CD-RW or DVD drives. Most important, your OS governs your interface or interaction with all your other software–and indeed, dictates which programs you can use.

Not surprisingly, the first thing to do when choosing an OS is to assess your workstyle. Just what do you want your computer to do? What software will perform the tasks you require? What operating system runs that software? For example, if you’re a graphics or publishing professional, the Mac OS is the right fit–the best software for you exists on that platform. If you want the broadest choice of business programs or hardware add-ons, you want Windows.

This month, we take a look at five popular operating systems, four for Intel- or AMD-powered PCs and one for Macs. Three–Microsoft Windows 2000 Professional, Corel Linux OS Second Edition, and Red Hat Linux 6.2–can serve as network operating systems as well, meaning that they can handle more than one user request at a time. The others, Microsoft Windows Millennium Edition (or Me) and Apple’s Mac OS 9, are designed primarily for a single user at a single desktop or laptop, though both can easily set up a simple network for a few systems to share files and a printer or Internet connection.

We used a total of four machines, each with 128MB of memory, to test the five operating systems. Hopping from one OS to another left us cognitively confused, but we found the essential elements of each interface to be the same–clicking and double-clicking were the basic maneuvers we had to perform, and hopefully the only things you’ll need to master before getting to work.

Apple Mac OS 9

Users either idolize the Mac OS or shrug it off as too far outside today’s business standard. Nevertheless, Mac OS 9 provides a worthy alternative to Windows for home-based workers, especially those who create graphics, video, or audio content.

In this version, attention has been paid to convenient access to applications and files. Rather than promoting layered menus, users are encouraged to use desktop icons and the Control Strip, a toolbar that can reside near the bottom of your screen or be tucked away. Mac OS 9 is billed as an “Internet operating system,” with Web tools always within easy reach via hotkey, icon, or toolbar. What used to be simply “Find Files” is now a search engine called Sherlock that integrates Internet searches with local hard disk hunts. Unfortunately, this also puts considerable memory and performance overhead on what used to be a streamlined task.

One new feature we enjoyed was “speakable commands.” By pressing the Escape key and speaking to our machine, we could give a large number of common commands to most of our applications. Mac OS 9 also shines at switching among multiple users’ settings or preferences and managing their various Internet passwords.

Though relatively speedy and stable, Apple’s OS has long lacked the multitasking muscle of its rivals. That should change in early 2001 with the release of Mac OS X, which combines a nearly crash-proof Unix core with a more colorful new interface for new applications. Apple promises most OS 9 programs will run under OS X, though they won’t look any different, and that the new operating system will run on all G3 and G4 Macs.

For now, we feel that Mac OS 9, while not truly innovative, continues to be a solid and stylish alternative platform.

Apple Mac OS 9 Rating: 8

A Classy, powerful; crashes are rare

B Fewer apps for the Mac

Corel Linux OS Second Edition

* ($25 Standard, $80 Deluxe with phone support; 800-772-6735, www.corel.com)

Linux has been the province of computer pros and serious hobbyists. Corel’s version goes quite a way to change that, but not far enough to reach the home office mainstream.

We found installation to be easy. Corel Linux even offered to repartition our PC’s hard disk so we could switch between Windows and Linux; this is an excellent option for business owners who want to give Linux a proper tryout but may need to fall back on familiar applications.

There are two hurdles for the average user to overcome when it comes to Linux: adjusting to a new interface, and determining whether the software available for the platform will meet your needs. Corel sets you up with KDE, a Windows-like graphical interface that makes the OS easier to navigate. As for software, Corel bundles WordPerfect Office, Sun’s StarOffice, and some powerful freeware. Linux applications remain scarcer than Windows or Mac programs.

There’s also a daunting learning curve to climb. While it’s easy to get advice and technical support from Linux Web sites, it’s a good idea to have at least one person in your office with an information technology (IT) background.

Today, Linux is for intrepid technophiles only. But as the platform evolves, Corel Linux will be seen as a friendly OS.

Corel Linux OS Second Edition Rating: 6

A Everything you need is in the box

B … everything but an IT department

Microsoft Windows Millennium Edition

* ($109 upgrade, $209 full version; 800-426-9400, www.microsoft.com)

Think of Windows Me as Windows 95 Sixth Edition, with a variety of multimedia and system enhancements that formerly had been the domain of third-party vendors.

For home-based workers, the most valuable new features are the under-the-hood utilities, including System Restore, which takes periodic snapshots of your Registry and Program Files folders to rewind your system to a previous state after a crash. Also, what was Sleep is now Hibernate, allowing you to shut down your PC with files open and return to your work in progress at restart. A System File Protection feature attempts to ensure that third-party program installation routines won’t trash existing programs by replacing important resources.
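System Restore’s internals aren’t documented here, but the snapshot-and-rollback idea behind it can be sketched in a few lines. The Python below copies a watched folder into a timestamped restore point and can copy a chosen point back over the original; the folder names are invented for illustration, and the real feature tracks far more than a single directory.

```python
# Conceptual snapshot/rollback sketch in the spirit of System Restore:
# copy watched data into a timestamped restore point, and roll back by
# copying a chosen point over the original. Folder names are made up.
import shutil
import time
from pathlib import Path

WATCHED = Path(r"C:\ImportantSettings")    # hypothetical data to protect
RESTORE_ROOT = Path(r"C:\RestorePoints")   # hypothetical snapshot store

def create_restore_point() -> Path:
    point = RESTORE_ROOT / time.strftime("%Y%m%d-%H%M%S")
    shutil.copytree(WATCHED, point / WATCHED.name)
    return point

def roll_back(point: Path) -> None:
    backup = point / WATCHED.name
    shutil.rmtree(WATCHED)             # discard the current, possibly broken state
    shutil.copytree(backup, WATCHED)   # put the snapshot back in place

if __name__ == "__main__":
    restore_point = create_restore_point()
    print("restore point created at", restore_point)
    # roll_back(restore_point)  # run this after a bad install to rewind
```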

We found the interface a bit cleaner than earlier versions, with frequently used menus easily accessible. Behind the scenes, Windows Me boots up and runs slightly faster than its predecessors.

Installation was faster than previous versions, as well, and Win Me checks for incompatibilities, which it’s likely to find since many Windows 95/98 communications and Internet-related programs require upgrading.

Even so, Windows Me has far more software available to it than any of its competitors, and comes preinstalled on far more new PCs. In short, it’s likely to be the new home office standard–and although still more crash-prone than the other platforms here, it proved stable enough to be a worthwhile upgrade.

Microsoft Windows Millennium Edition Rating: 8

A The future home office standard

B Still more crash-prone than others

Microsoft Windows 2000 Professional

* ($219 upgrade, $319 full version; 800-426-9400, www.microsoft.com)

This successor to Windows NT 4.0 is not as friendly to individual PC users as Windows 98/Me; it doesn’t work with as many popular software packages; and it’s not as universally supported by hardware add-ons and home networks (other than Ethernet).

Win 2000’s one big plus is that it proved much more stable, though not noticeably faster, than other Windows variants we tested. Its greatest power lies in corporate network environments; as such, it made good use of other Microsoft applications on our Ethernet LAN. However, its learning curve and demand for network manager/administrator aptitude call for a setting with on-site tech support–not common in home offices and small businesses.

Like Red Hat Linux, Windows 2000 is much better suited to the enterprise than the home office environment at this stage of the game. While wonderfully stable and powerful, Microsoft’s flagship OS just isn’t ready for a small office that values productivity and compatibility above all else.

Microsoft Windows 2000 Professional Rating: 6

A Excellent network support

B Not the best choice for the home

Red Hat Linux 6.2

* ($30 Standard, $80 Deluxe with phone support; 888-REDHAT1, www.redhat.com)

Of all the OSes we looked at, Red Hat Linux was the most stable, most flexible, and most awkward to use–we had to get our hands dirty several times during the configuration process.

On the positive side, Red Hat’s Deluxe distribution includes one of the best available Linux software bundles, plus excellent support. Once you have everything set up, there’s little you need to worry about. Red Hat gives you a choice of both the KDE and GNOME graphical desktop environments, and offers great flexibility for customizing your interface.

However, most of the caveats we gave for Windows 2000 apply here as well. Linux is not for the faint of heart, and Red Hat’s version is targeted more toward the IT expert than the casual user. We had to reinstall and tweak the OS a few times before we were able to achieve the configuration we wanted, and that’s not something a home business owner will want to spend time doing. As such, we can’t recommend Red Hat unless you’re a Linux enthusiast ready to take advantage of its power and low cost.

Red Hat Linux 6.2 Rating: 5

A Today’s most stable and flexible OS

B … is the hardest to learn and use

RATINGS

We rate products on a scale of 1 to 10–with few 9s or 10s–based on value, performance, innovation (medals go to rare standouts in these areas), ease of use, and suitability for home offices. The A and B symbols indicate pros and cons.

LINUX GLOSSARY

Enterprise A large business organization and the computer and network systems it uses.

GNOME (GNU Network Object Model Environment) A graphical user interface (GUI) for the Linux OS, designed to make both using the OS and developing software for the OS easier.

KDE (K Desktop Environment) The most popular graphical user interface used with the Linux operating system.

Linux An open-source version of Unix that runs on several popular platforms. The basic source code, or kernel, is developed into distributions by vendors including Caldera, Corel, and Red Hat.

Open Source Software that is distributed for free, with the source code available to anyone who wants to develop it.

Categories: Platforms

Ah Microsoft – Your Licensing Is Evil!

Monday, April 28, 2014 at 9:55 pm

I have spent the past couple of weeks looking into the new Microsoft licensing policies, and while I don’t understand everything that is going on, I have a better handle on it. A good part of what I’ve learned is off the record, but I have this on the record from Simon Hughes, program manager in Microsoft Business Licensing: “We do not see reimaging licensing policy as a source of revenue.”

Now what that means depends in part on your attitude toward Microsoft. If you simply don’t believe Microsoft you may come to one conclusion. For my part, after more than 20 years in this field, I have yet to have Microsoft tell me a direct untruth on the record. Like all companies, it will refuse to answer certain questions, or answer in a misleading way; but I have never had any suspicion that any Microsoft official, from Bill Gates down, has ever looked me in the eye and lied to me.

Thus, I believe what it said. The re-licensing policies are not a revenue grab, and Microsoft isn’t trying to soak people by making them pay twice for an operating system license.

So what is it trying to do?

The issue came up because there are new reimaging technologies available, particularly to large enterprise customers, and these customers asked to have things clarified. In these days of lawsuits, disgruntled employees who claim to be (and may be) whistle blowers, racketeering suits, class-action suits, triple damages, and the rest of the baggage that comes along with a litigious society, large companies simply cannot stand the notion of legal ambiguities. They want to know that what they are doing is legal and sanctioned and within licensing policy. Microsoft issued a policy statement to that effect.

Unfortunately, that wasn’t the end of the matter. The statement covered a particular group of licensees who happen to be really big companies. Now, the next size down said “Wait a minute! Does that mean that what we are doing is illegal?” Microsoft thought about that, and came out with another policy statement applying to all enterprise customers.

And that leaves the rest of us in limbo, because strictly speaking what many of us do is not precisely legal. An example helps.

Suppose you are the outside MIS for a small outfit. It wants 15 PCs. You go buy them from Compaq or Gateway or Dell. Each has Windows installed. The company also buys 15 copies of, say, Office 2000 at a discount. Now, you configure one machine the way you want it and install Office 2000. When everything is the way you and your client want it, you use Ghost or some other reimaging program to clone that machine 14 times. Now, all 15 machines have identical operating systems and identical copies of Office. Are you legal? I put this to the Microsoft executives.

Strictly speaking, no, unless you have certain group licenses from Microsoft, and you probably don’t.

Does Microsoft care? Will it pursue you for this? Clearly not. The spirit of the licensing agreements is carried out: each copy of both OS and applications in use has been paid for.

That is a clear case. I can come up with less clear scenarios, and did, but so can Microsoft. Suppose you are not merely cloning the new system but in doing so are upgrading older ones? Suppose you don’t have as many applications licenses as you do machines, but you are also certain that all those machines will never be in use at the same time? How do we word a policy so that it conforms to common sense, and at the same time can survive legal twisters? I don’t know, and neither does Microsoft.

The bottom line is this: given the world of piracy, Microsoft is very careful about licensing policy statements and their wording, and as I write, it has not come up with an acceptable policy statement for individual customers. It will, and when it does, despite all the accusations and speculations in some publications, it will not be trying to use this as a means of revenue enhancement.

The whole issue turns out to be a tempest in a teapot.

System Disks

A more disturbing trend has been distribution of computers without operating system CD-ROMs. The system comes with Windows installed, and a “recovery” disk, but the installation CD isn’t there. The CAB files (compressed files that expand to become the actual operating system files; search for *.cab to see where yours are) may or may not all be present, and if they are they may be well hidden. In most cases you can get an installation CD for a few dollars, but you have to know to do that, and it can take considerable effort.
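If you want to follow the “search for *.cab” advice programmatically, here is a trivial sketch in Python that lists the CAB files under a folder and totals their size. The starting folder is a made-up example; point it wherever your OEM hid things.

```python
# Quick sketch of the "search for *.cab" suggestion: list every .cab file
# under a folder and total their size. The starting folder is a made-up
# example; adjust it to wherever your OEM put the files.
from pathlib import Path

root = Path(r"C:\Windows")  # hypothetical starting point
cabs = sorted(root.rglob("*.cab"))
total_bytes = sum(f.stat().st_size for f in cabs)
for f in cabs:
    print(f)
print(f"{len(cabs)} CAB files, {total_bytes / 1_000_000:.1f} MB total")
```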

I haven’t been able to find out just what is going on here. Microsoft says this is up to the manufacturer. OEMs blame Microsoft. I have asked Microsoft for an official statement on this, but I don’t have it yet. It’s not clear to me who’s doing what because I am not much affected: I tend either to build my own systems or work with professional equipment, and of course I have installation CDs for all operating systems I use.

Indeed, every one of the Compaq professional workstations and laptops I have had over the years has come with a sealed package containing the Windows installation CD and a certificate of authenticity, as did the AMD Athlon system. Microsoft clearly didn’t forbid that. My other systems have been built out of parts. I keep spare copies of Windows available to install on those: as one system is retired, the OS license goes to another. If one gets out of my labs to someone else, I send along the OS certificate as well.

I understand, though, that many consumer-level computers are delivered without installation CDs. My advice in those cases is to take whatever effort you must to get your installation CD. You may never need it, but if you do, you’ll need it bad and in a hurry. Alternatively, when you see a big sale on the latest Windows OS at a ridiculously low price, grab it. It’s cheap insurance and, unlike the procedures for getting an installation CD, takes none of your time.

If you don’t do that, you may be subject to the system’s “critical-need detector,” which kicks in to disable the machine and clobber the operating system precisely when you most need it to work. Of course, there are people to whom it doesn’t matter if their computer is down for a few days. I just don’t know any of them. (Except Harlan Ellison, but then he doesn’t have a computer.)

(For information on where and how to get Windows OS disks from Microsoft — at hopefully the least-worst price — See “Getting Windows CDs From Microsoft: It Can Be Done!” in Byte.com executive editor Daniel Dern’s August 14, 2000 “Letter From The Editor.”)

Windows ME/Voodoo 5 Loses Text

For a writer, the cardinal sin of any system is losing original text. In reality, it may not have been very good text, but it will not take long for a writer to convince himself that the computer has just erased the best thing he has ever done, words to shame Shakespeare by.

I’ve just lost text.

The system is Galacticus, an Intel 933 system on an Intel D815EEA “Easton” motherboard with a 3dfx Voodoo 5 video board. When I first began writing with computers I got in the habit of saving fairly often, so I lost no more than a couple of paragraphs, but it’s annoying. I was working on this column when the system just froze up. No mouse, no keyboard. Control-alt-delete did nothing. The system was entirely frozen, and worse, the last things I had done to the column were not visible on the screen.

After about a minute the system reset itself. When it came back all seemed normal. Invoking Word, I got a “recovered” document, but I was still missing a couple of paragraphs. It’s no big deal, but it is annoying.

I suspected the Voodoo drivers, so I went to the 3dfx website and downloaded an 8-Mbyte program dated September 20 or so. That took about 15 minutes. Installation was simple. It’s a self-executing file. We will now see just what that does for me, but I am still a bit distressed. I have not actually lost a whole paragraph of text on a computer in quite a while. With luck and the new drivers it will never happen again. If it does, out goes that Voodoo board, and I’ll rely on the built-in video from the Intel Easton board, but I suspect it was a driver problem.

Categories: Platforms