Upside: Y2K: The HOAX of the Century
greenspun.com : LUSENET : Poole's Roost II : One Thread
http://www.upside.com/Ebiz/3a244a9e1.html
Hoax of the century Upside New England
November 30, 2000 12:00 AM PT
by Geoffrey James

Remember Y2K? Hard to believe it, but just a year ago, everyone from the U.S. government to BusinessWeek was terrified that we were approaching a technological Armageddon that many believed would herald the end of the world as we know it. We now know that the Y2K disaster scenario was, in fact, a hoax.
Don't get me wrong. The Y2K glitch was real, and some programs might not have functioned perfectly if it hadn't been fixed. But despite all the brouhaha, there was little likelihood that the Y2K glitch would have a significant impact on anything other than a few irate customers. We know this because the new millennium began without any significant computer-related problems, even in countries like Russia, Bulgaria and Vietnam, where next to nothing was spent on the problem.
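The glitch itself was mundane: legacy programs stored years as two digits, so date arithmetic across the century boundary produced nonsense. A minimal sketch of the failure mode (hypothetical code, not taken from any system mentioned in this thread):

```python
# Two-digit year storage: the arithmetic that made Y2K a real, if bounded, bug.
def years_elapsed(start_yy, end_yy):
    """Elapsed years between two 2-digit years, as naive legacy code computed it."""
    return end_yy - start_yy

# A loan opened in 1995 ("95"), evaluated in 2000 ("00"):
print(years_elapsed(95, 0))   # -95 instead of 5: interest, aging, and sort order all break
print(years_elapsed(90, 99))  # 9: within one century the arithmetic is fine
```

The sign flip is why the bug was concentrated in billing, interest, and expiration-date logic rather than in anything catastrophic.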
The supposed massive impact wasn't just a hoax, it was an expensive hoax. John Gantz, chief research officer at Framingham, Mass.-based International Data Corp., estimates at least $70 billion was wasted on Y2K work that wasn't really necessary.
That estimate, however, doesn't take into account the money spent by frightened citizens on "Y2K preparedness," some of whom sold everything they owned and headed for the hills. And then there was the extra government expense, which included $50 million to create a Y2K "crisis center."
Where it started
Where did the hoax originate? While Y2K glitch worries had been bouncing around the industry for a decade or so, the supposed problem was thrown into the public eye when the Gartner Group published a news release describing the supposed "dangers" of the Y2K glitch, predicting that it would cost $300 billion to $600 billion to fix it.
These figures were obviously based upon pure guesswork (give or take $300 billion?), but that didn't keep the Gartner Group and other firms from building an entire business selling "information" about the hazards of Y2K.
Like its colleague companies, the Gartner Group is widely quoted in the media as an "authority" on high-tech matters. But anybody who has ever worked with these market research companies knows that the quality of their research is frequently questionable and that the opinions in their reports are often tailored to entice computer vendors into buying pricey reports.
The "expert" status of the Gartner Group put it in an excellent position to capitalize on the growing interest in the Y2K issue. Leading the charge at the Gartner Group was Vice President Lou Marcoccio, a Massachusetts resident who was quoted frequently on the subject and even was called down to Washington, D.C., to present expert testimony to the U.S. Special Committee on the Year 2000 Technology Problem.
During that presentation, he outlined the dangers of Y2K and even predicted that 1999 would be marked by "fiscal year system failures" (which would hit companies whose fiscal years begin ahead of the calendar year).
When the fiscal year failures didn't happen in 1999, I called Marcoccio to ask why other analysts were beginning to question Gartner Group's numbers. Despite repeated requests for an interview, he refused to return my calls.
Another local Y2K monger, Capers Jones, chairman of Burlington, Mass.-based Software Productivity Research, was willing to talk to me, however. It was Jones who popularized the notion that Y2K eventually would cost $3.2 trillion, based in part upon his belief that there would be $1 trillion in lawsuits as a result of Y2K failures.
I asked Jones where he got the $1 trillion figure. He told me the number came from another locally based industry analyst, Stephanie Moore of GIGA Information Group.
While Moore admitted to using the $1 trillion number in her Y2K articles, she insisted she got the number from Capers Jones. In other words, nobody was willing to own up to the forecast -- despite the fact that at that time, the U.S. Congress was wasting taxpayer money by debating whether to put financial caps on Y2K litigation.
In my view, these so-called Y2K experts, along with their many colleagues, were largely responsible for providing the intellectual underpinnings for the entire Y2K hoax -- despite the fact that they clearly didn't know squat.
I might add that Upside magazine, alone among the major publications, was telling its readers as early as November 1998 that Y2K disasters were overwrought hype -- a position that resulted in a flurry of hostile letters from the true believers.
Y2K is now history, but it's interesting to note that there has yet to be anything resembling an apology from the Y2K "experts" for starting the hoax in the first place -- and for keeping it alive even when it was clear (from the lack of fiscal year failures in mid-1999) that Y2K was going to be an enormous non-event.
While I suspect they were fooling themselves as much as they were fooling the rest of the world, these are the people who are paid big bucks to provide reliable advice and perspective. In this case they failed miserably, and the result was an enormous waste of time, money and energy on a problem that simply wasn't life-threatening.
Related story: Who's Afraid of the Y2K? 11/02/98 http://www.upside.com/texis/mvm/story?id=363e09320
-- Anonymous, November 30, 2000
For more details by James published BEFORE 1/1/2000, see: http://www.marketingcomputers.com/issue/aug99/feature2.asp
Out Like A Lamb
By Geoffrey James
Have you heard the latest Y2K prediction?
According to some experts, the projected value of Y2K lawsuits is expected to reach $1 trillion! Golly! You'd expect that every public relations, marketing, communications and investor relations group in every technology company would be busy managing expectations, contacting every high and low-lying customer in their database, working their tails off to get their customers through the critical hours. If nothing else, such good faith effort would help against the torts headed their way at the hands of ambulance chasing lawyers working on behalf of every Joe-six-pack who's still running a non-Y2K compliant version of Quicken 92.
And yet, even the threat of $1 trillion has brought little more than a yawn. Most software companies have positioned themselves against Y2K only to tout the event as another reason to buy new software. Take the Oracle site. It spins Y2K as "the perfect time to migrate to the latest Oracle products, with their Y2K compliance and superior technological advances." Or Germany-based SAP AG, whose site includes links to the lunatic fringe of the Y2K controversy. It seems unfazed by the coming cataclysm. "The year 2000 is going to come and go and we need to offer our customers services beyond that," says Steve Rietzke, business development manager of SAP America.
In fact, the most fervent positioning is not coming from software firms, but rather from service firms -- more specifically, the Y2K service firms, which are rapidly trying to increase their life spans beyond 12/31/99. Ted Kempf of San Jose-based Dataquest's Consulting and Systems Integration program recently surveyed a dozen Y2K-oriented service firms to discover their future plans. All were attempting to grow new service products -- mostly in the areas of e-commerce and data warehousing.
SNIP
Hearing those dire predictions, it's hard to keep a sense of perspective, until one considers that polls of Y2K experts are naturally going to be skewed, because Y2K experts have a vested interest in keeping the issue visible. And we're not talking chump change. The Y2K issue has given birth to a number of best-selling books, and there's plenty of money for those in a position to cash in. "A speaker on the Y2K issue can command a fee from $7,500 to as high as $40,000," according to Bob Parsons of the Washington Speakers Bureau.
An organization called the Washington D.C. Year 2000 Group sent a questionnaire to 700 email addresses of people who had indicated an interest in Y2K. The survey asked for an estimate of the impact of Y2K within the U.S., on an escalating scale of zero to 10. The survey revealed that an "overwhelming majority of the respondents believe that the United States will experience a significant economic impact from the Year 2000 issue," and that "one-third (34 percent) believe it will at the least result in a strong recession, local social disruptions, and many business bankruptcies." A significant number even believed that Y2K would result in "martial law, the collapse of U.S. government and possible famine."
-- Anonymous, November 30, 2000
http://channel.nytimes.com/2000/01/09/technology/09year.html
January 8, 2000
Experts Puzzled by Scarcity of Y2K Failures
By BARNABY J. FEDER
Whether it is with scorn, anger or resignation, most computer experts and Year 2000 program managers brush off suggestions that they overreacted to the Y2K threat, taken in by computer companies and consultants positioned to profit from fear.
Still, like the skeptics, many wonder: How did countries that started so late -- and appeared to do so little -- manage to enter 2000 as smoothly as nations like the United States and Britain that got an early jump?
"That question is plaguing all of us, although some people won't admit it," said Maggie Parent, Morgan Stanley Dean Witter's representative to Global 2000, an international banking group formed to coordinate and stimulate Year 2000 work. "We expected there to be some significant blowouts."
A World Bank survey published last January concluded that just 54 of 139 developing countries had national Year 2000 programs outlined and only 21 were actually taking concrete steps to prepare.
Japan, China, Italy and Venezuela showed up as high-profile question marks in various studies. Paraguay's Year 2000 coordinator was quoted last summer saying the country would experience so many disruptions its government would have to impose martial law. Russia, Ukraine, Belarus and Moldova were seen as so risky that the State Department issued travel advisories in November and called nonessential personnel home over New Year's.
So what accounts for the surprisingly quiet rollover? Computer experts cite several factors. Even they may have underestimated how hard many countries worked in the last few months, when the problems were better understood, and how much help came from others that started early. And in many cases, assessments of overseas readiness were based on scarce or vague data.
But the simplest if most embarrassing explanation is that some public and private analysts who testified before Congress and were widely quoted overestimated the world's dependence on computer technology. Most countries had much less to do to prepare because they are far less computerized than the United States. The computers they do have are much less likely to be tied together in complex systems and are often so old that they run much simpler software, according to Louis Marcoccio, Year 2000 research director for the Gartner Group, a technology consulting firm.
At a briefing last week on why Pentagon analysts overestimated the risks in many countries, Deputy Defense Secretary John Hamre said, "If we had a failing, it may be that we extrapolated to the rest of the world the kind of business practices that we have developed here."
Once adjustments are made for technology dependence, some analysts say, the investment of the United States and other pacesetters in Year 2000 preparations was not that far out of line with those that started late. But the figures from many countries are so unreliable that it is hard to be sure. Russia, for example, is estimated to have spent anywhere from $200 million to $1 billion.
Mr. Marcoccio suspects the lower figures are closest to the truth but he adds that based on the government's estimate that the United States spent $100 billion, "If Russia spent $400 million, they spent proportionally more than the United States, because the United States is 300 times more reliant on computers."
Such assessments lead down a pathway that only a statistician could love. Use Gartner's estimate that the United States spent $150 billion to $225 billion, and the comparable Russia investment jumps to a minimum of $500 million. Tamper with Gartner's guess that the United States is 300 times as computer-dependent, and the figures dance in another direction.
But nearly everyone agrees that the figures for the United States include substantial sums toward preparations abroad by American multinationals. Motorola said its $225 million Year 2000 budget included not just repairs at its overseas factories but, for example, helping its Asian suppliers pinpoint potential Year 2000 flaws. It also paid overtime for support that helped paging and radio networks in Italy function flawlessly over New Year's.
The federal government picked up part of the tab for foreign nations. To jump-start lagging nations, the government paid for many of them to send representatives to the first United Nations meeting on Year 2000 in late 1998. It distributed hundreds of thousands of CD's in 10 languages providing background and suggestions for how to organize Year 2000 projects. More recently, the Defense Department provided $8 million to set up a joint observation post in Colorado as insurance against miscommunication that could lead to missiles' being launched.
"We got a lot of free consulting from the United States and agencies like the Inter-American Development Bank," said Rodrigo Martin, a Chilean who headed a regional Year 2000 committee in South America.
Such aid played a bigger role in helping late starters to catch up than most people realize, some computer experts say. As John Koskinen, chairman of the President's Council on Year 2000 Conversion, sees it, hype about the magnitude of the problem misled fewer people than hype about the impossibility of getting it fixed.
"This was a process that could move faster than the preparedness surveys," Mr. Koskinen said, noting that alarming press releases and testimony frequently relied on research that was obsolete within weeks.
Del Clark, who led the Year 2000 program at Phillips Petroleum, concurred, saying: "China was the big question mark for us. Part of what happened was that they were working hard late in 1999 and the status information was out of date."
It helped that repair efforts became less expensive toward the end because of the experience gained by those who did the work early and the tools developed for them, according to Brian Robbins, senior vice president in charge of the Year 2000 project at Chase. In addition, Mr. Robbins said, it turned out that some countries like Italy had done more work than reported.
By 1998, the pacesetters were far enough along for a sense to develop that others were lagging, and fears about the consequences began building. There were extenuating circumstances in some cases, like the economic slump in Asia, and many realized the problems would not be as daunting as in the United States. But with time short, industry groups like Global 2000 and a few countries began trying a variety of tactics to accelerate Year 2000 preparations.
"People outside of information technology don't realize how incredibly mobilized the world became," Ms. Parent said.
Still, many of those most familiar with the relative preparedness and spending levels in many foreign countries wonder whether it will be possible to figure out why things ended up going so smoothly.
Information was always hard to come by and hard to compare since sources varied so widely in what costs they attributed to Year 2000 work. In general, foreign countries have not included labor costs in their Year 2000 figures while the United States and Britain have, but practices have varied widely.
Now that Year 2000 has arrived, the pressure to sort out such data is disappearing rapidly.
Still, questions about the transition will not go away. What actually happened might figure in insurance lawsuits because if courts were to decide insurers were liable for the money companies spent to avoid problems, the insurers would undoubtedly cite the success of laggards and low spenders as a sign that budgets for American companies were needlessly bloated.
More broadly though, comparing preparations and the results achieved may shed valuable light on cultural differences in how technology is set up and managed, according to Edward Tenner, author of "Why Things Bite Back." That in turn could help society deal with problems like global warming and the proper use of biotechnology. "We really need to look at the sociology of computing in detail," he said.
-- Anonymous, November 30, 2000
Well, here we go again. As is usual in CPR's better posts, there is food for thought, but I want more detailed information.
For example, John Gantz's estimate that $70 billion was wasted. I happen to think highly of John Gantz, have read him for years. And I wouldn't be a bit surprised if $70 billion was wasted. I have my ideas on how it was wasted, and of course Gantz has his. I would very much like to know how exactly Gantz feels the waste occurred; any link would be appreciated.
And then there is the old business about how foreign countries "did about nothing" and got away with it.
It is my belief that they simply installed fixes sent them by their foreign software suppliers. Easy for them, cost them next to nothing.
But that's not the same as doing nothing.
As someone who has spent practically his entire professional career on complicated financial systems, I say don't go overboard on what a supposedly minor threat Y2K really was. The famous (infamous to me) Nick Z. article kissed off the whole problem by saying "easy to find, easy to fix." Easy to fix once you've found them, OK, but always easy to find? No way.
-- Anonymous, November 30, 2000
Geoff James speaks for me. You wouldn't UNDERSTAND a more detailed explanation after all this time, Nitwit.
-- Anonymous, November 30, 2000
Geoffrey always did have a way with the words. :)
-- Anonymous, November 30, 2000
Turn the page already...
-- Anonymous, December 01, 2000
http://www.y2k.gov/docs/LASTREP3.htm
RETROSPECTIVE ON THE MAGNITUDE OF THE PROBLEM
There is general agreement that the Year 2000 rollover went more smoothly than expected. The incredible success of the transition has prompted a number of questions about the effort and the results it produced.
Was Y2K an insignificant, over-hyped problem?
In the weeks since the rollover, some have expressed doubt about the magnitude of the Y2K problem and whether or not the significant investment of time and money to avoid disruptions was necessary. However, it has been difficult to find executives who worked on Y2K in a major bank, financial institution, telephone company, electric power company or airline who believe that they did not confront -- and avoid -- a major risk of systemic failure.
One indication of the difficulty of the Y2K problem is the fact that many large, sophisticated users of information technology revealed in regular filings with the Securities and Exchange Commission that they had been required to increase the funds allocated to their Y2K programs. These increases, which in some cases were in the hundreds of millions of dollars, were not for public relations purposes. Rather, they reflected the difficult effort of remediating large, complicated and often antiquated IT systems.
The Federal Government experienced a similar phenomenon. Cumulative agency estimates for the costs to solve the Y2K problem increased over four years from under $3 billion to the $8.5 billion that was actually spent. This was still significantly less than the $20 to $30 billion estimated by outsiders. But here too, the job of ensuring Y2K compliance proved to be more challenging than initially expected.
The range of actual failures during the January 1 and February 29 (Leap Day) rollovers served as a reminder of the major economic and operating disruptions that had been avoided by the development of Y2K compliant IT systems:
- A classified Defense Department intelligence satellite system was totally inoperable for several hours during the rollover period. The problem originated not in the satellite itself but in the ground- based switching and software equipment used to download and process information from the satellite.
- Bank credit card companies identified a Y2K glitch involving some credit card transactions. Merchants that did not make use of free upgrades provided during 1999 for a particular software package charged customers for orders every day after a single purchase was made. The problem affected primarily smaller retailers since most major retailers use their own customized software.
- A Y2K computer glitch at a Chicago-area bank temporarily interrupted electronic Medicare payments to some hospitals and other health care providers. As a work-around, Medicare contractors -- private insurance companies that process and pay Medicare claims -- were forced to send diskettes containing processed claims to the bank by courier or Federal Express so that the payments could be made in a timely manner.
- Florida and Kentucky unemployment insurance benefit systems encountered a Y2K glitch in an automated telephone call processing system. The Y2K glitch in customized code prevented some claimants from claiming earned income for the week ending 01/01/2000. Claimants reporting the problem had to be given an alternative means for filing their claims pursuant to State contingency plans.
- Low-level Windshear Alert Systems (LLWAS) failed at New York, Tampa, Denver, Atlanta, Orlando, Chicago O'Hare and St. Louis airports during the date rollover. The systems displayed an error message. Air transportation system specialists at each site were forced to reboot LLWAS computers to clear the error. Fortunately, the weather was mild across the United States.
- Seven nuclear power plant licensees reported problems with plant computer systems used for supporting physical plant access control, monitoring operating data, and calculating meteorological data. The affected systems did not have an impact on the safety of operations at the plants.
- During the Leap Day rollover, several hotels reportedly were unable to issue room keys to guests because of a failure in hotel key-producing software.
- The Council Chair, traveling in March, received a car rental contract that included a $10 daily charge as an underage driver since the software indicated he was born in 2039.
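The car-rental anecdote above is a classic artifact of the "windowing" fix, in which a pivot year decides which century a two-digit year belongs to. A minimal sketch, with a hypothetical pivot of 40 (the actual pivot in the rental system is not reported here):

```python
def expand_year(yy, pivot=40):
    """Windowing: map a 2-digit year into a 4-digit year around a pivot.
    With pivot=40, 00-39 -> 2000-2039 and 40-99 -> 1940-1999."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(39))  # 2039: a 1939 birth year lands in the future, hence the "underage" driver
print(expand_year(95))  # 1995: years at or above the pivot expand correctly
```

Windowing was cheap because it touched no data formats, but as the anecdote shows, any date outside the chosen window silently lands in the wrong century.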
These and other glitches would have been more serious had they occurred in an environment in which a wide range of other Y2K problems had also surfaced. If there had been a flurry of other difficulties, some glitches would have gone undetected for a longer period of time. Glitches also could have had a multiplier effect by creating problems through interfaces to other systems or could have resulted in a gradual degradation of service. As it happened, organizations were able to focus all of their attention on the relatively few problems that did occur, which resulted in much faster restoration of normal operations.
Some of the failed expectations about more serious Y2K problems can be traced to the skepticism and disbelief with which some people greeted company and government progress reports on Y2K, believing that these institutions were inevitably covering up the possibilities of major Y2K failures. However, as the Council noted on numerous occasions, individuals in positions of responsibility who were claiming success in their Y2K efforts would be easily found after January 1, and held accountable, if subsequent system failures proved that they had misrepresented the facts. But many people continued to assume the worst would materialize even as much of the self-reporting pointed to a fairly orderly transition into the new millennium.
Why weren't there more Y2K-related problems abroad, especially in less-developed nations?
Some of those who have discounted, after the fact, the significance of the Y2K threat point to the relative lack of major disruptions abroad as evidence of how exaggerated the problem was. How did countries that appeared to have spent so little, and were thought to be relatively unprepared, emerge unscathed?
A number of factors created the mismatch between perception about the Y2K readiness of foreign countries and the actual outcome. Chief among them was the difficulty in obtaining accurate status reports internationally on a fast moving issue such as Y2K. Information three months old was out of date, and much of the international information reported was second hand and anecdotal. But, in many cases, this was the best information available until countries began to report more publicly on their Y2K work. Without more current, detailed reports, people often relied on such older information and were then surprised when it was overtaken by subsequent progress. A report about risks from April or June 1999 was assumed to still be operative in December.
A related problem was the stereotype of countries doing nothing to prepare for Y2K. While this was probably true for three-quarters of the countries in the world in early 1998, by mid-1999 virtually every country had a Y2K program in place and was devoting a high level of attention to the problem. In many cases, the fact that some countries may have spent the bulk of their funds in a concentrated effort the last six to nine months of 1999 was largely ignored. For some commentators, therefore, it has been easier to suggest that the problem was overstated rather than to consider the possibility that perceptions before the rollover were inaccurate.
Additionally, outside of the world's largest users of information technology -- countries like the United States, Canada, Japan, and the United Kingdom -- the reliance upon IT drops off quickly. In many of these less IT-dependent countries, other factors also made for an easier transition into the Year 2000. Fixes in these countries were frequently more straightforward than in the United States since the technology being used was more likely to be "off the shelf," and not customized. Also, unlike the United States, countries such as Spain and Italy that had moved into IT more recently were not saddled with old legacy systems that were built with antiquated, customized code by people who had long since retired.
Countries starting later also had the benefit of lessons learned by those who had been working on Y2K for several years. The sharing of technical information about problems, products, fixes and testing techniques that was encouraged by international organizations and the Council paid enormous dividends. Elevators provide a good example. In 1998, everyone was testing to see if elevator-specific systems had a Y2K problem. Once it became clear that they did not, no one else had to spend time and money pursuing the issue. Similar experiences took place in industries such as banking, finance, telecommunications, air traffic and electric power where information was being exchanged and shared globally in a way never seen before. And in many industries, large multi-national companies actually worked directly with their local counterparts and host countries to fix basic systems.
Finally, technology itself helped countries that had gotten a late start on Y2K. One of the reasons those that started late spent less on their Year 2000 efforts was that the technology to fix the problem improved dramatically. By 1999, automated tools could fix millions of lines of code quickly and at a dramatically lower cost than was possible just two years earlier. This technology helped late-starting countries to fix the problem quickly -- and more cheaply.
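What such automated tools did can be caricatured in a few lines. This is only an illustration of the windowing rule they applied, under assumed conditions: a hypothetical MM/DD/YY date format and a pivot of 40. Real remediation products parsed actual source languages and data files rather than matching text patterns:

```python
import re

# Toy remediation pass: find 2-digit-year date literals and widen them to 4 digits.
def widen_date(match, pivot=40):
    mm, dd, yy = match.groups()
    century = "20" if int(yy) < pivot else "19"
    return f"{mm}/{dd}/{century}{yy}"

def remediate(text):
    # MM/DD/YY patterns only; a real tool would handle many formats and code contexts.
    return re.sub(r"\b(\d{2})/(\d{2})/(\d{2})\b", widen_date, text)

print(remediate("expires 01/15/03"))  # expires 01/15/2003
print(remediate("opened 06/30/87"))   # opened 06/30/1987
```

The point is that once the rule was understood, applying it across millions of lines was mechanical, which is consistent with the report's claim that late starters could fix code quickly and cheaply.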
Why weren't there more problems among small businesses?
Small business was another area about which many, including the Council, had expressed concerns. While there were relatively few reports of Y2K-related failures among small businesses, for firms large and small, there is a natural inclination not to report problems that are fixed in very short time frames. This phenomenon was revealed before the rollover when surveys showed that over 70 percent of companies reported they had experienced Y2K glitches, even though the public was unaware of virtually all of them. Some said the number of failures indicated the pervasive nature of the Y2K problem. The Council believed that the experience of companies with Y2K failures before January 1, 2000 also demonstrated that most Y2K problems could be fixed without people being inconvenienced or even knowing that anything had happened.
The lack of information about how small businesses were doing was an ongoing challenge for the Council and others following Y2K. The sheer number of these companies -- over 23 million -- and the absence of regular reporting relationships made it difficult to gather information on the progress of small businesses prior to January 1, and equally difficult to determine how many actually experienced Y2K difficulties after the date change.
What happened to fears of overreaction by the public?
While a very small, but visible, minority engaged in excessive stockpiling of goods in advance of the New Year, most Americans took Y2K in stride. Anxiety about the date change, which seemed to peak in 1998, declined throughout 1999 as more and more information became available about organizations that were completing their Y2K work. By the end of the year, there was very little evidence of overreaction among the general public to the potential consequences of Y2K.
The availability of information - both positive and negative -- about Y2K efforts played a major role in reversing the trend toward overreaction. The Council's position was that people are more inclined to panic when they lack information, which can lead to a general feeling that the system is out of control. But, given the facts, whatever they are, people have great common sense and will respond appropriately. Even when the information about industry and government Y2K efforts revealed that there was still substantial work left to do, people were not alarmed. Instead, they seemed reassured in the knowledge that organizations were treating the problem seriously, were working together to solve it, and would keep the public informed about their progress. Americans knew Y2K was an important problem, but they also knew that organizations were spending large amounts of time and money to minimize any difficulties that could have been created by the date change.
Was the money well spent?
In hindsight, it is always easy to see what was not a problem and say that less money could have been spent. It's a little like saying you could have saved money spent on building safer roads when fewer accidents occur. But part of the reason for the smooth transition, in the face of thoughtful analyses noting that IT projects generally finish late and over budget with remediation work creating errors as well as removing them, was that people did test, retest, and then test their systems once again. Never before had so much independent verification and validation been done for IT work -- and it showed in the positive results and the on-time performance.
Ultimately each organization had to make its own judgement about the potential implications of failures and the appropriate cost necessary to minimize such problems. Any organization that cut back on its work to save money and subsequently experienced serious system failures would have been pilloried as badly managed and foolish.
-- Anonymous, December 01, 2000
As someone who has spent practically his entire professional career on complicated financial systems, I say don't go overboard claiming Y2K was a minor threat all along. The famous (infamous to me) Nick Z. article kissed off the whole problem by saying "easy to find, easy to fix." Easy to fix once you've found them, OK, but always easy to find? No way. Apples and oranges. Y2K was a DUD; those who would have had a problem were already working on it or had compensated for it years before the outcry started with the public. Very few problems would have existed if the ITs of the past decade had been trained enough to know what they were doing when they continued to create applications that were not Y2K compliant. Certainly all of the big problems with the infrastructure were never a threat.
-- Anonymous, December 01, 2000
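For readers who never met the underlying glitch the posters above are arguing about, here is a minimal sketch in Python (the function names are invented for illustration, not taken from any real system) of the classic two-digit-year bug and the "windowing" repair that much remediation work applied. As the poster notes, the fix is trivial once an instance is found; the expensive part was finding every place such arithmetic was buried:

```python
def age_in_years_buggy(birth_yy: int, current_yy: int) -> int:
    """Naive arithmetic on two-digit years -- nonsense once '00' means 2000."""
    return current_yy - birth_yy

def expand_year_windowed(yy: int, pivot: int = 50) -> int:
    """The common "windowing" fix: years below the pivot mean 20xx, else 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

# A customer born in 1960 ('60'), evaluated on 1/1/2000 ('00'):
print(age_in_years_buggy(60, 0))                           # -60: the bug
print(expand_year_windowed(0) - expand_year_windowed(60))  # 40: correct age
```

Windowing was only one of several repair strategies (full four-digit expansion was the other common one), and the pivot year of 50 here is an arbitrary example.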
Certainly all of the big problems with the infrastructure were never a threat.

The idea that there could be infrastructure problems here and there was still believed by some experts in Oct. 1999.
http://www.intellnet.org/resources/cia_worldfactbook_99/pr101399.html
. . . .Where effective prevention action has been taken in advance of 1 January, disruptions will likely be random, temporary, and of localized impact. In the absence of effective remediation and contingency plans, Y2K-related problems could cause widespread, possibly prolonged disruptions in vital services that could have serious humanitarian and economic consequences.
Y2K failures will occur before and as the date rollover approaches, peaking on 1 January and persisting well beyond that. In some countries, such as Russia, it will likely take a significant amount of time to overcome Y2K failures. . . .
. . . .Humanitarian Crises. Y2K-related malfunctions have the potential to cause or exacerbate humanitarian crises through prolonged outages of power and heat, breakdowns in urban water supplies, food shortages, degraded medical services, and environmental disasters resulting from failures in safety controls. Russia, Ukraine, China, Eastern Europe, Egypt, India, and Indonesia are especially vulnerable, due to their poor Y2K preparations and, in some cases, the difficulty of coping with breakdowns in critical services in the middle of winter. We are also concerned that Y2K failures in chemical plants—which are often located in urban areas—could result in environmental degradation and hazards to the nearby population.
Even the poorest countries rely on essential services that are computerized to some extent, such as power, telecommunications, food and fuel distribution, and medical care. Remediation work in these sectors, however, has proceeded slowly.
Few governments outside the West would be capable of managing widespread humanitarian needs should they arise from a breakdown of basic infrastructure in their countries, especially in urban areas. . . .
Why did they still believe this in October, 1999?
-- Anonymous, December 01, 2000
Uncle Bob's on-target advice of "turn the page" notwithstanding, I am still LMAO at whoever posts these ".gov" and "Koskinen" (whose name seems to be spelled correctly now that the non-event is in the history books; anyone remember when the usual spelling was "Ko-Skin-Em"?) and (this is the funniest) "CIA" links.
Just remember that you listened to the "experts" that were saying Exactly What You Wanted to Hear.
Ah.....disconnect. I can still smell it in the air.
-- Anonymous, December 01, 2000
Disconnect but held together by the same cut and paste jobs from the same "linkmeisters" always ready to reinforce the Meme. JERKS.
-- Anonymous, December 01, 2000
To CPR: So Geoff James speaks for you. That figures.
He says "there was little likelihood that the Y2K glitch would have a significant impact on anything other than a few irate customers."
That is just about the dumbest thing I've ever read, and I've read plenty.
-- Anonymous, December 01, 2000
Actually, I am in agreement, and always have been, with a great deal of what the "pollies" have had to say. A lot of the Y2K hype was indeed just plain ridiculous. And the predicted infrastructure collapse was not a live possibility.
-- Anonymous, December 01, 2000
Ever-ErringBoy, you should read what you write if you want to see new lows in DUMB. Go and do your own homework and stop questioning the work of those who have done theirs with your rhetorical, nonsensical "opinions posed as questions".
-- Anonymous, December 01, 2000
Peter: [He says "there was little likelihood that the Y2K glitch would have a significant impact on anything other than a few irate customers."
That is just about the dumbest thing I've ever read, and I've read plenty.]
Well, what makes it less dumb is that it turned out to be *absolutely true*!
We have two separate issues being munged together here:
1) Just how serious was the y2k problem to begin with? Well, without question it was WAY overhyped. We know this because essentially NOTHING went wrong, despite the vast scope to do so. It's unlikely that the maintenance effort was divinely inspired or superhuman. Except for PR aspects, it was fairly routine. Yes, we know the date bugs were very real, and needed to be fixed at least enough so that post-rollover firedrills could contain them. Clearly, we achieved this, and it seems equally clear that had the problem been even 10% of what the pessimists described, we'd have had no hope of reaching that point. In hindsight, y2k really was a routine maintenance project that just happened to be common to most application code.
2) Just how clearly could we have *known* how overhyped the problem was? Here is where I think your statement comes in. You looked at your part of the elephant, assumed it was MUCH more important than it was, saw real problems presenting real threats, and extrapolated from there. Cherri and I and others in the embedded world saw no problems at all, and extrapolated from OUR experience. Nobody had the Big Picture clearly focused, but most people saw no problems happening and decided to play it by ear, and went on with their lives.
I agree entirely with Patricia on this one. Yes, we had some people looking to make money off the issue, by selling books or remediation services or testing services. Nothing wrong with that, people will look for ways to make money off *anything*. We also have people who live for bad news, exaggerate from force of long habit, and are very vocal. Finally, we have people who are cautious, and believe in taking no chances if at all possible. And most of these people lost interest and wandered off during 1999 when every single alarm proved false. We reached the point where the only people talking about "the problem" were those who believed there *was* a problem. Nobody could quote all the articles that were never written about problems that never arose, because of normal daily routine that's never newsworthy.
What was left at the end were the hardcore doomies for whom the facts were secondary at best. Although it was becoming increasingly difficult to find *anything* to worry about, the sheer effort to find dire speculations, or to redefine good news into bad news through the most contorted special pleading, was simply astounding. But remember that except for a handful of us here, nearly nobody was aware of this. People bought a few extra cans of food or they didn't, whatever satisfied their sense of necessary preparation, and otherwise didn't get involved.
Now, just how extreme does exaggeration need to become before we have a "hoax"? I can't answer that. The problem was real, but the response was appropriate and sufficient. Some of the louder voices may have known perfectly well they were exaggerating, but hype sells. And I think Gary North was sincerely stunned by the non-event.
-- Anonymous, December 01, 2000
To Flint: What actually happened, vs. what could have been intelligently predicted in 11-98 when James wrote his article, are two altogether different things.
Anyway, you know from the discussions we two have had in recent months what I was worried about. Whether I was right or wrong, what I was worried about is totally beyond CPR's comprehension.
-- Anonymous, December 01, 2000
Here's one example of the "circle jerk" from last year. I compare it to Paula Gordon posting on TB2000 and then using TB2000 as a reference to back up her claims.

Another local Y2K monger, Capers Jones... It was Jones who popularized the notion that Y2K eventually would cost $3.2 trillion, based in part upon his belief that there would be $1 trillion in lawsuits as a result of Y2K failures. Hoax of the century
I asked Jones where he got the $1 trillion figure. He told me the number came from another locally based industry analyst, Stephanie Moore of GIGA Information Group.
While Moore admitted to using the $1 trillion number in her Y2K articles, she insisted she got the number from Capers Jones.
-- Anonymous, December 01, 2000
I think this is how I remember it:

And the wheels go round and round
and the painted ponies up and down.
We're captive on a carousel of life.
-- Anonymous, December 01, 2000
Here you go: Ever-ErringBoy. James, Aug. 1999. I told you to read before inserting your foot in your mouth. You really should. It eases the pain when you then insert your head up your ass.
http://www.marketingcomputers.com/issue/aug99/feature2.asp
Out Like A Lamb

By Geoffrey James

Have you heard the latest Y2K prediction?
According to some experts, the projected value of Y2K lawsuits is expected to reach $1 trillion! Golly! You'd expect that every public relations, marketing, communications and investor relations group in every technology company would be busy managing expectations, contacting every high and low-lying customer in their database, working their tails off to get their customers through the critical hours. If nothing else, such good faith effort would help against the torts headed their way at the hands of ambulance chasing lawyers working on behalf of every Joe-six-pack who's still running a non-Y2K compliant version of Quicken 92.
And yet, even the threat of $1 trillion has brought little more than a yawn. Most software companies have positioned themselves against Y2K only to tout the event as another reason to buy new software. Take the Oracle site. It spins Y2K as "the perfect time to migrate to the latest Oracle products, with their Y2K compliance and superior technological advances." Or Germany-based SAP AG, whose site includes links to the lunatic fringe of the Y2K controversy. It seems unfazed by the coming cataclysm. "The year 2000 is going to come and go and we need to offer our customers services beyond that," says Steve Rietzke, business development manager of SAP America.
In fact, the most fervent positioning is not coming from software firms, but rather from service firms-more specifically, the Y2K service firms, which are rapidly trying to increase their life spans beyond 12/31/99. Ted Kempf of San Jose-based Dataquest's Consulting and Systems Integration program recently surveyed a dozen Y2K-oriented service firms to discover their future plans. All were attempting to grow new service products-mostly in the areas of e-commerce and data warehousing.
One reason for the retreat from Y2K is that the service firms discovered the coming apocalypse wasn't the financial windfall they expected. "Every vendor I spoke to said a good portion of the Y2K work was done in-house," he explains, "and they didn't see the demand for external services that others had projected." In fact, Y2K-related services revenues have been so disappointing that "financial analysts have been very unkind to these companies."
Far from finding Y2K to be a pot of gold, some firms launched Y2K practices only to abandon them, according to Kempf. The Y2K services business is so bad that some CEOs are congratulating themselves for not jumping in. Jim Lavalee, chairman and CEO of Cotelligent, a San Francisco services contractor employing over 3,500 technical consultants, seems quite happy to have avoided a pitfall. "Back in 1997 when the Y2K clamor began growing in volume, the investor community kept asking, 'Why don't you chase this?'" he chuckles. "Today, the same analysts are saying, 'Aren't you glad you didn't go after the Y2K business?'"
The weakness of the Y2K services market, as well as the declining interest among software vendors in Y2K marketing messages, is simply a reaction to the IT/IS community-which isn't buying the message that Y2K is a life or death issue. The branding research firm Addison Whitney recently surveyed 1,100 IS managers worldwide from a cross-section of large businesses, industries, and organizations, concerning their future spending plans. The study concluded that "Y2K, while an issue, has diminished in importance," and noted that over 50 percent of companies have postponed all or some Y2K work to implement e-business initiatives, which were considered "simply too critical to the success of the enterprise to be put off."

Only 8 percent of IS managers considered Y2K an issue that "kept me awake at night." But the most telling fact emerged when they were asked what they'd be doing on New Year's Eve. Sixty-two percent said they'd be partying; 13 percent would be sleeping; 5 percent would be watching TV; 4 percent would be on the Net. Only 12 percent planned to be checking up on possible computer crashes. "What can I say?" says Serge Timacheff, director of corporate communications at Attachmate, the survey's funder. "IT managers don't think that Y2K is all that big a deal."
This lack of a sense of urgency about Y2K among vendors and IS runs contrary to many predictions made by leading market research firms. According to testimony presented to the U.S. Senate Special Committee on the Year 2000 Technology Problem on October 7, 1998 by Lou Marcoccio of Gartner Group, 1999 was supposed to be marked by "fiscal year system failures," which occur for companies with fiscal years that are ahead of the calendar year.
Unfortunately for Marcoccio (but fortunately for everyone else), such problems have failed to materialize. For example, ICL, one of Europe's largest computer support organizations, recently put a monitoring line on its call desk for U.K. customers to determine the frequency of Y2K-related calls. "Because we support legacy systems and software from companies like Microsoft, we expected Y2K-related failures," explains Jane Burns, Year 2000 marketing manager. The number of failures? "None," she says. "We've been watching the help desks, but there's been nothing yet"-even on such "dangerous days" as 1/1/99 and 4/1/99 (because of new fiscal years), and 4/9/99 (the 99th day of the 99th year, reputed to be a problem for old COBOL programs). The June 30 fiscal year turnover has also proven to be a disappointment to the Y2K doom-mongers. Rather than a horde of systems failures and a resulting drop in stock prices, the Dow Jones average after the July 4 weekend clocked its highest numbers in history.
In his expert testimony, Marcoccio predicted that in 1999, "11 percent of vendors [will] default on compliance with their products." This was supposed to create a spate of lawsuits-the first signs of the fabled $1 trillion. In fact, there's actually been very little Y2K lawsuit activity to date, which makes Congress' recent overwhelming passage of a bill to cap punitive damages on Y2K lawsuits look like wasted effort.
There are two kinds of potential Y2K lawsuits: contractual and tort. In a contractual lawsuit, the customer sues the vendor under the terms of a pre-existing contract to do Y2K work that otherwise would cost the customer money. Torts deal with damages after the fact, e.g., people suing vendors if somebody is injured in an elevator that fails due to a Y2K bug.
While it says little about what torts may pop up, it appears that contractual Y2K lawsuits simply aren't going to be a big-money business, according to attorney Tobey Marzouk of Washington D.C.'s Marzouk & Parry, one of the world's foremost law firms dealing in high technology law and litigation.
Marzouk notes that three years ago Y2K litigation was the hot topic at legal conferences, but lately "the level of interest has plummeted," he says. The cases simply aren't holding up in court. "In most cases, the warranty on non-compliant software ran out years ago," Marzouk explains, "and almost all contracts are covered by statutes of limitations."
That's been the experience at Mountain View-based Intuit, which, being responsible for Quicken, could be a likely target. "Intuit has a basic philosophy that we do right by our customers," says Intuit spokesman Jeff Larsen, "and we've made every effort to bring all current versions of our software to Y2K compliance."
It appears that Intuit has been going beyond the call of duty. Earlier this year, the Santa Clara Superior Court dismissed three Y2K-related lawsuits filed against the company with regard to the Y2K readiness of the online banking functionality of certain versions of Quicken. (The Supreme Court of New York dismissed three similar lawsuits at the end of 1998.) "It's market-driven," says Marzouk. "Vendors are simply giving away their Y2K fixes," which shows more good faith than may be contractually required.

But what about the torts? "It's anybody's guess," says Marzouk. "If airplanes fall out of the sky, you'll have litigation."
In other words, the legal profession will have a field day if Y2K turns out to be a major disaster. However, it appears less and less likely that that will happen. As a result, many Y2K experts are beginning to back-pedal, distancing themselves from the aggressive predictions they and their colleagues have made.
Take, for example, one of the most widely quoted figures-that the cost of fixing the Y2K bug was going to be $300 billion to $600 billion. This figure originally came from the Gartner Group, a Stamford, Conn.-based market research firm. Though the figure has been quoted everywhere from Time to The Wall Street Journal, it now seems highly unlikely that it will ever be reached. "The Gartner Group numbers are pretty much valueless," laughs Howard Adams, director at Gartner competitor GIGA Information Group in Cambridge, Mass. "It's going to be much less than $300 billion."
Even Ted Kempf of Dataquest, which is owned by the Gartner Group, is reluctant to defend the numbers. "It's going to be a bit less, but I don't know by how much," he says. As for the Gartner Group itself, despite repeated calls to the company's Y2K analysts-including two calls to Marcoccio's home office-nobody was willing to talk, leaving the impression that the market research firm has lost confidence.
Despite gathering evidence to the contrary, there are plenty of Y2K "experts" who continue to predict disasters. For example, Russ Kelly Associates, a systems integration company focusing on Y2K solutions, has kept a running poll of 20-odd self-proclaimed Y2K experts regarding the "seriousness" of Y2K on a scale from zero ("absolutely no concern") to 10 ("major worldwide social, economic and technological disruptions"). The average "expert" opinion, as of this writing, remains in the 8-9 range-despite the fact that no natural disaster, war, economic depression or combination of the above has ever scored higher than five.
Hearing those dire predictions, it's hard to keep a sense of perspective, until one considers that polls of Y2K experts are naturally going to be skewed, because Y2K experts have a vested interest in keeping the issue visible. And we're not talking chump change. The Y2K issue has given birth to a number of best-selling books, and there's plenty of money for those in a position to cash in. "A speaker on the Y2K issue can command a fee from $7,500 to as high as $40,000," according to Bob Parsons of the Washington Speakers Bureau.

An organization called the Washington D.C. Year 2000 Group sent a questionnaire to 700 email addresses of people who had indicated an interest in Y2K. The survey asked for an estimate of the impact of Y2K within the U.S., on an escalating scale of zero to 10. The survey revealed that an "overwhelming majority of the respondents believe that the United States will experience a significant economic impact from the Year 2000 issue," and that "one-third (34 percent) believe it will at the least result in a strong recession, local social disruptions, and many business bankruptcies." A significant number even believed that Y2K would result in "martial law, the collapse of U.S. government and possible famine."
Once again, it's a dire prediction that seems credible-unless you know something about market research. The Year 2000 Group study employed a research methodology known as a "self-selected listener opinion poll"-SLOPs for short. It's a methodology that nobody who does real market research takes seriously. The classic example of a SLOP is the typical mail-in magazine survey. For example, a women's magazine might have a mail-in survey entitled "Have you ever been cheated on?" asking readers to mail in responses. Most responses are naturally going to be from the scorned, because they're the ones likely to care. In the next issue, the magazine publishes the results, which indicate (not surprisingly) that the country is experiencing an epidemic of cheating boyfriends. It sells magazines, but the research is meaningless. In order to have statistical validity, the poll would need to include a broad cross-section of women, none of whom had previously indicated an interest in the issue.
Dire predictions of social unrest seem credible - except to those who know something about market research.
The Washington D.C. Year 2000 Group SLOP is even less scientific than the magazine example. Because the email addresses were from a list of those who had indicated an interest in Y2K, the baseline was pre-selected to favor those who thought Y2K important. Not only that, of 700 surveys, only 230 came back-presumably from people who cared most- thereby creating an additional level of self-selection. By contrast, the Addison Whitney survey cited earlier in this article simply included Y2K questions in the context of a broader survey, rendering results that are a statistically valid representation of the IT community.
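The self-selection effect described above can be made concrete with a toy simulation. This is an illustrative sketch only -- the population size, the 10 percent "worried" rate, and the response rates are invented numbers, not data from the actual 1998 survey:

```python
# Toy model of a self-selected listener opinion poll ("SLOP") vs. a random sample.
import random

random.seed(42)

# A population of 1,000 where 10% are seriously worried about an issue.
population = [1] * 100 + [0] * 900  # 1 = worried, 0 = not worried

# Random sample of 230: everyone is equally likely to be polled.
random_sample = random.sample(population, 230)

# Self-selected poll: assume the worried respond at 80%, the unworried at 10%.
def responds(worried: int) -> bool:
    return random.random() < (0.8 if worried else 0.1)

slop_responses = [p for p in population if responds(p)]

rand_prop = sum(random_sample) / len(random_sample)
slop_prop = sum(slop_responses) / len(slop_responses)
print(f"random sample:  {rand_prop:.0%} worried")
print(f"self-selected:  {slop_prop:.0%} worried")
```

Under these assumptions the random sample lands near the true 10 percent, while the self-selected poll reports several times that figure, purely as an artifact of who bothers to respond.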
Now that early signs of the doomsday scenario have failed to materialize, some Y2K analysts seem to be taking a distinctly less aggressive stance when predicting Y2K disasters. For example, Stephanie Moore, a Y2K analyst at GIGA, has appeared on CNBC Market Wrap, Fox News and numerous radio talk shows to discuss Y2K issues. Moore will be in Australia on December 31, 1999, reporting by satellite on the Y2K problem as it hits that part of the world some 15 hours before it will hit the U.S. Despite this dramatic stunt, Moore is low-key. "I'm not one of the big chicken littles, saying the sky is falling," she says. "I think there will be inconveniences-like no phone service to Africa."
One Y2K expert who hasn't changed his tune is Capers Jones, chairman of Software Productivity Research, a consulting firm in Massachusetts. Jones, who frequently testifies as an expert witness in software litigation suits, believes Y2K will, over a 10-year period, cost $3.6 trillion (including hardware, software, litigation, et al.).

Jones cites the possibility of major power outages, pointing to the widely-reported story of a power plant that moved its dates forward to test Y2K and experienced a shutdown. It's a compelling story, one that's proven difficult to verify. The story seems to float around, with the power plant located "somewhere in Ohio," or "in Arizona" or "in the U.K." or even as far away as "in Australia." In fact, it appears that the power test outage story-like many Y2K cautionary yarns-is an urban legend.
The mythical nature of the power outage story became clear when Y2K guru Peter de Jager penned an article for Scientific American. He raised the red flag about power outages but was unable to cite a single example of a power plant that had blacked out during a Y2K test. The closest event he could cite was an extensive outage in New Zealand that was caused not by a software problem, but by a severed cable. A strangely sensationalist story for a magazine that "demands extensive fact-checking for every article it publishes," according to the editor at Scientific American who worked on the de Jager piece.
It appears that, despite all the hand-wringing, there never was much danger to the power grid. At the San Onofre nuclear power plant-an installation considered among the most complicated in the world-only 2,900 out of 190,000 devices (about 1.5 percent) used computer chips and only a "few hundred" needed replacement. And even if all the Y2K glitches aren't located, it's unlikely that any would cause a major power outage.
As John Ballance, manager of power grid dispatch and operations for Southern California Edison told The Los Angeles Times: "The control systems don't care what date it is or what year it is; they are just there for logging purposes." The same article pointed out that less sophisticated power plants, such as those running fossil fuels, are at even less risk.
This doesn't keep some self-proclaimed Y2K "experts" from spreading disinformation. For example, the web site of author Steven Heller features his opinion that Y2K will mean "the end of Western civilization," citing as evidence data on the web site of the Nuclear Regulatory Commission (www.nrc.org). However, rather than containing evidence that backs up his claim, the site contains a document dated April 1999, "Nuclear Power Plants on Track for Achieving Y2K Compliance," stating that the "NRC has no indication that Y2K computer-related problems exist with safety-related systems in nuclear power plants. Most commercial nuclear power plants have protection systems that do not rely on computer dates and hence, are not vulnerable to the Y2K 'bug.'" The article states that the audit did "identify problems with non-safety related, but important computer-based applications," but noted that work on these systems would be completed by the autumn of 1999. In other words, there is some bug fixing to be done, but the problem is less extensive and complicated than originally anticipated.

Other so-called Y2K "facts" are simply lies. For example, on a November 27, 1998 radio broadcast from WDJC, a 100,000 watt FM station in Birmingham, Ala., a self-proclaimed Y2K expert (who was hawking a book) claimed that IBM's Watson Research center was "hunkered down" with its own food supply and power generators. Another urban legend, says Paul Horn, head of IBM Research. "We do have a small group of people who are helping build tools to find Y2K problems in our code, but it's not a significant piece of what we're doing here," he insists.
Another popular canard is that the world's financial systems are in danger. But, once again, the reality seems quite different. As early as summer of last year, Y2K tests of 29 securities firms and 12 xmail went smoothly, according to John Panchevy, Year 2000 project manager for the Securities Industry Association, the organization that conducted the test. "The financial industry in the United States will be fine," he says.
The nature of the power outage myth became clear when Scientific American couldn't cite one single case.
But what about overseas where, according to alarmists, they're far behind? Not a problem, according to tests of 190 international banks conducted in mid-June. "Everything has gone very well for us so far," according to Gerhard Singer, a vp of Deutsche Bank quoted in the June 14 edition of The South China Morning Post.
The computer trade press is becoming skeptical of the doomsday scenarios being bruited in the popular press. In a study of trade books conducted by Boston-based Press Access, Y2K reporters revealed "concern that much of the news about Y2K has been hyperbole and rumor, and many doom and gloom messages are unfounded." According to the study, editors feel the media, in general, is doing a poor job of "reporting the facts, squelching rumors, staying calm and educating readers." According to the report, editors are finally noticing that many of the information sources, such as companies that sell their Y2K expertise, have an interest in keeping the story alive.
The fact that Y2K reporting is hyperbolic is not to say that there won't be some problems. The bug is real, but it appears the frequency of its occurrence, the negative impact when it occurs, and the complexity of fixing the problem have all been vastly exaggerated. Even Capers Jones admits the problems that have been encountered thus far have been less than spectacular. "It's been accounts that don't balance and some applications that have stopped cold. Mostly they've been localized," he says, adding that a ripple effect "wouldn't be impossible." Anything catastrophic? "No," he admits, disappointment in his voice.
Still, Jones insists there would have been at least "$300 billion" in Y2K-related lawsuits, had Congress not passed legislation limiting Y2K litigation. As for the $1 trillion figure, he claims that the number came from Stephanie Moore of GIGA. While Moore admits to using the $1 trillion number in her articles, she insists she got it from-guess where?-Capers Jones. In other words, nobody seems willing to own up to the forecast.

Like so many other Y2K "facts," the $1 trillion appears to be an urban myth-which isn't surprising when you consider that the actual figure is probably too "pat" an estimate to be based upon research. Why not $935,252,785,080? Or for that matter, why not a "zillion" or even a "gazillion?" Like Gartner Group's $300 billion to $600 billion (give or take $300 billion?!), "$1 trillion" has been devised to shock rather than to inform.
Despite mounting evidence that Y2K is unlikely to create major disruptions, either to the computing infrastructure or to society at large, the big numbers and dire forecasts are likely to continue being quoted. Meanwhile, the people who are most concerned-the IS managers and the companies that sell to them-are quietly moving on to bigger and better things.
The lackadaisical attitude may be, as Capers Jones puts it, "a state of complete denial" or it may be the result of a healthy perspective on how technology crazes come and go. Let's face it, this won't be the first time that computer industry hype has far outstripped reality, and, to be sure, it probably won't be the last. MC
Geoffrey James (www.geoffreyjames.com) is a frequent MC contributor and the author of the cult classic The Tao of Programming (InfoBooks).
Copyright © 1999 MC. All rights reserved.
-- Anonymous, December 01, 2000
AND just to drive home the point for those who continue to insist that "nobody knew what was going to happen".....here is James calling Bruce Webster's cherished users group SURVEY **SLOP**. AND it was that survey that found its way to YARDENI's site and most likely was the reason that Yardeni was a pessimist but not a doomer even into Fall, 1999. It was that **SLOP** that was cited repeatedly and spread over the net as "even the experts can't agree" or "some of the people working on the problem think it will be a disaster". BLAH BLAH AND SLOP.

The Washington D.C. Year 2000 Group SLOP is even less scientific than the magazine example. Because the email addresses were from a list of those who had indicated an interest in Y2K, the baseline was pre-selected to favor those who thought Y2K important. Not only that, of 700 surveys, only 230 came back-presumably from people who cared most-thereby creating an additional level of self-selection. By contrast, the Addison Whitney survey cited earlier in this article simply included Y2K questions in the context of a broader survey, rendering results that are a statistically valid representation of the IT community.
-- Anonymous, December 01, 2000
Peter, about the only thing that could have been "intelligently predicted" prior to 1999 was an IF/THEN scenario; e.g., IF we don't fix this, THEN this *could* happen. And I don't consider that a "prediction" as much as it was an "analysis"; a "business case", if you will.
I think that's where many people made their mistakes (as it were). The predictions that were around in 1998 were virtually the same as the ones in 1997 and 1996; of course they were going to be "dire". You have to look at where the predictions were coming from and, more importantly, who was "spreading" them. Then you had to consider the "agendas" of those sources.
IMO, it wasn't until the "JoAnn" dates of 1999 came and went with nary a squeak that anything could be "intelligently predicted" (in addition to report after report of businesses, governments, agencies, entities, etc. who were indicating they were "OK"). A good many who were still "predicting" "doom" after September, 1999 were the book- and supply-sellers, and those predisposed to doom anyway. This can be stated pretty much as fact because if one looked at the actual facts after September, 1999, one could only reach the conclusion that virtually nothing earth-shattering was going to happen. This, of course, assumes one listened to the people who really knew; versus the talking-heads.
To this day, I'm still curious as to who all those alleged "whistle-blowers" on TB2K and related fora were; how much you want to bet they were connected in some way to someone who was selling books and/or supplies? (And no, I don't mean anyone specific; just in general.) A great marketing ploy, if you think about it.
Flint, I'll bet you're right about North. Would almost have loved to have seen his face as the realization set in...."I've blown yet another one".
That's funny, Anita. I saw a number of examples of that. Stephanie would get her info from Gartner, who would claim to have obtained that info from GIGA....and so on. Funniest part was that in at least a few of these cases, this could almost always be traced back to Peter de Jager, circa 1996/1995. The "info" and the "claims" never changed.
-- Anonymous, December 01, 2000
To CPR: Although I remain firm in my belief that the Geoff James sentence above that I quoted is truly stupid, I thought James' August 99 article was quite good. He takes a lot of people who made ridiculous claims to the woodshed. But since I never believed in the ridiculous claims, I don't see what the article has to do with me.
-- Anonymous, December 01, 2000
Peter: Whether or not you consider James' views intelligent predictions depends on what you mean by both "intelligent" and "prediction". I'd compare predicting the details of y2k impacts in 11/98 with predicting the result of the Super Bowl in March. This prediction isn't totally uninformed, since we know what all the teams will be, and have a fairly good idea which are likely to be the strongest. The team we pick will nearly always have a winning season, and probably reach the playoffs.
But considering our prediction "obvious" or "self-evident" is silly. At that time, there were sincere and knowledgeable y2k predictions all over the map, and only those people whose teams really DID win the Super Bowl bother to dredge up their early predictions. I suspect James himself was surprised that we didn't even have irate customers.
And of course, "a few" customers is a damn flexible term. If he's just trying to say that systemic infrastructure breakdown was unlikely, hey, you and I were saying the same.
-- Anonymous, December 01, 2000
Well Stephen, despite your best admonishment when starting this forum that we "talk about anything but y2k", it looks like this is a dog that just won't die.
-- Anonymous, December 02, 2000
The HOAX of the Century
Yep. Even cpr was fooled for a while.
http://home.swbell.net/buytexas/y2k3link.htm
YEAR 2000 !!!!
The Gathering Storm
New Year's Eve in 1999 is the Millennium New Year's Eve
The morning after might be THE WORLD'S BIGGEST HEADACHE !!!
-- Anonymous, December 02, 2000
Hardly "fooled". That was **1996** and the web site discusses a COMPUTER PROBLEM (as Geoff James does), NOT "THE END OF THE WORLD" with an ORDER FORM FOR **BOOKS, TAPES.......OR BULL SHIT "SURVIVAL GOODS"**. THAT CAME AFTER GARY NORTH ENTERED in mid-1997, when he and YOUR-TOAST_ED POURED THE MYTHS ON FOR SUCKERS.
-- Anonymous, December 03, 2000
NOTE WHERE MY LINKS GO ALSO (with the sad exception of Westergaard, who ended up hosting all the FruitLOOPS of Y2k from LORD JIM DUMBO to ROLEIGH GUMP): THESE WERE LINKS TO ***COMPUTER SOLUTION VENDORS***, NOT "Walden Foods" or AMAZON SURVIVAL BOOKS:
The IBM Executive Summary Page
IBM's FAQ page on Year 2000
The IBM System 390 Year 2000 page
Larry Towner's Bibliography of Y2K articles
The J.P. Morgan Analysis of the Y2K Problem
The Year 2000.com Web Site
MISSION CONTROL by Peter de Jager. A vital resource for anyone.
The Software Productivity Group Web site. Contains the Capers Jones detailed Analysis of the Year 2000 Problem.
John Westergaard's Site Re: Financial Implications of Year 2000
Legal Issues Concerning Y2K by Jeff Jinnett ("MUST" reading for all!!)
William Tannenbaum's Paper on Legal Issues
Easy to Read Series from Yorktown Technologies, a vendor.
-- Anonymous, December 03, 2000
Gary North's Y2K Links and Forums
The Year 2000 Problem: The Year the Earth Stands Still
By now, you have heard about the Year 2000 computer problem, also known as Y2K or the Millennium Bug. When I started this Web site in January of 1997, not many people had heard of it. There were no books for a general audience on it. Now, there are hundreds.
Yet even this late in the game, the press is convinced that readers do not understand it. Today, in almost every published article on y2k, the journalist feels compelled to include this: "The problem exists because programmers for three decades used the last two digits of the century as substitutes for all four digits. Thus, 1967 is written 67, and 1999 is written 99. The problem will come in 2000 when unrepaired computer programs will read 00 as 1900."
Actually, unrepaired IBM-clone personal computers will revert to either 1980 or 1984, but the problem still exists, and not just in ancient models (pre-1998). Millions of old PCs are still used in running the infrastructures of most of the world. There may be 300 million PCs still in use, worldwide, and most are not compliant.
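The two-digit shortcut the boilerplate paragraph above describes, and the "windowing" repair that many Y2K remediation projects applied instead of expanding every stored date, can be sketched in a few lines of Python. This is an illustrative sketch only; the pivot year of 50 is an assumption for the example, not a value taken from the article.

```python
# A minimal sketch of the two-digit-year bug and the "windowing" fix.
# Pivot value (50) is an illustrative assumption, not from the article.

def naive_expand(yy: int) -> int:
    """Pre-remediation logic: assume every two-digit year means 19xx."""
    return 1900 + yy

def windowed_expand(yy: int, pivot: int = 50) -> int:
    """Windowing fix: years below the pivot map to 20xx, the rest to 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

# The glitch: "00" parsed as 1900 makes the year 2000 appear to
# precede 1999, so date comparisons and sorts go wrong.
assert naive_expand(0) == 1900
assert naive_expand(99) == 1999
assert naive_expand(0) < naive_expand(99)      # 1900 < 1999: wrong order

# After windowing, ordering is restored within the chosen window.
assert windowed_expand(0) == 2000
assert windowed_expand(99) == 1999
assert windowed_expand(0) > windowed_expand(99)
```

The trade-off windowing makes is that the ambiguity never disappears, it just moves: with a pivot of 50, a stored "50" still means 1950, which is why it was a patch rather than a cure.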
Add to this 50 billion embedded chips -- or maybe 70 billion. Perhaps only 1% of these are noncompliant. Or 3%. No one seems to know. (Three percent of 70 billion chips is over two billion chips.) We know only that there are a lot of chip-based systems to test, replace bad chips in, and test again. (See Noncompliant Chips.)
But it is not just computer programs that are noncompliant. The data stored in these computers are noncompliant. So are computer operating systems, including DOS, Windows 95, and early versions of Windows 98.
Over the last three to five years, large organizations around the world have been paying programmers to fix these systems. With only a few weeks to go before the century date change, the vast majority of these firms and governments are still noncompliant. This includes the largest money center banks on earth. The threat is two-fold: bank runs by depositors and, far more important, what Federal Reserve Board Chairman Alan Greenspan calls cascading cross defaults, where banks cannot settle accounts with each other, and the banking system goes into gridlock, worldwide.
Because corporate computer systems are noncompliant, they have not been subjected to rigorous final testing, which can take months. This was always a major problem, as I have said on this site from the beginning. (See Testing.) In 1997 and 1998, Fortune 1000 company after company promised to be finished with all repairs, leaving "a full year for testing." With very few exceptions, organizations missed this crucial deadline. The press, which had quoted it faithfully, promptly dropped the missed deadline down the Orwellian memory hole. The U.S. government, still noncompliant, has had numerous deadlines, beginning with September 30, 1998. It never meets these deadlines. No mainstream reporter ever mentions this fact in print.
Yet few people have changed their minds about y2k since March 1, 1999. In that month, all signs of panic, even among the 1% or less of American y2k-preparationists, disappeared overnight. The U.S. press has cooperated with the U.S. government and large trade groups in assuring the public that there is no big problem, that the December 31 deadline will not be missed by any important segment of the society. (Why will this deadline be any different from all the earlier ones? No one asks.)
In every country, the public has been assured that there is no need to panic, that everything will be all right. Especially banks. Over and over, the public is assured that banks will be all right, that there is no reason to get more than a few days' worth of currency.
If everything is all right, why have the vast majority of organizations missed the numerous deadlines that they have publicly announced?
The U.S. government assures the nation that y2k will seriously affect only foreign nations (rarely named, and when named, issue immediate official denials) and small businesses. But in the U.S., small businesses (under 500 employees) number 24 million. One-third of them are thought by the U.S. government's Small Business Administration to have done nothing to repair y2k. These businesses employ tens of millions of people. They also supply the largest businesses that are "not quite compliant."
Oil-exporting nations are not compliant. The U.S. imports half of its oil.
The largest companies that convert oil into finished products were not compliant as of early 1999. The industry promised it would be compliant by September 30. So far, no such announcement has been made. There is a new deadline for the industry: December 31.
U.S. ports are noncompliant. But 95% of all imported goods come through these ports.
We are told that the electrical power generating industry is almost compliant, but the basis of these assurances is a series of unverified, self-reported data from anonymous firms. These reports have been assembled by a private agency financed by the U.S. power industry, NERC.
What happens to electrical power generation if fuel and spare parts cease to be produced? The typical urban power company relies on more than 5,000 suppliers. The reports issued by NERC never discuss this aspect of the y2k problem.
As for the chemical industry, the news is not reassuring. The U.S. government's Chemical Safety Board sent a warning about noncompliant chemical plants to all 50 state governors on July 22, 1999. Yet this industry is the major goods-exporting industry in the U.S.
The U.S. Navy published on the Web, and then pulled (no explanation offered), a report on the risks to 144 U.S. cities due to failures of public utilities. The U.S. government and then the Navy went into damage control mode when the findings of this report were posted on a Web site that, within days, received so many hits that it had to be shut down and redesigned. Updates to the Navy's June, 1999 "Master Utilities" report have reduced many risk assessments, but the risks are still serious.
There is little but bad news coming from the nation's water and sewer utilities. Think of your community without water or sewer services for, say, a month.
The universal refrain is: "We can run it manually." For a few hours, maybe. But where is there publicly available evidence that large public utilities have produced detailed operations manuals and have implemented extensive training programs to be sure that employees can run all systems manually for days or weeks or months? There is no such evidence. The slogan is a public relations ploy.
I am not a computer programmer. My Ph.D. is in history. For over three decades I have studied the operations of bureaucracies. I have served as a Congressman's research assistant. I have seen how the U.S. government operates. All things are going according to standard operating procedure: public relations handouts, unverified positive statements, and verbal assurances that everything is fine here. Serious y2k problems are limited to the Other Guys Over There.
But the computers of the Other Guys Over There exchange data with "our" computers. Bad data from their computers can reinfect our computers and their data. This, the PR people never discuss in public. Even if our computers somehow can be programmed to lock out noncompliant data, then the computerized systems that rely on shared data will break down. Think "banking system." (See Imported Data.)
Things will not break down all at once in early January unless the power grid goes down and stays down. But the domino effect will create ever-increasing institutional noise and confusion throughout January and beyond. Your check will not be in the mail. (See Domino Effect.)
(To view my original home page, which I removed on October 20, 1999, click here.)
Click here to go directly to the links (with comments) and forums.
-- Anonymous, December 06, 2000