A big reason for all the spending, of course, was national health-care reform. As you recall, the Supreme Court struck down the original financing vehicle–the Kennedy-Helms Health Fairness Act of 1994, which (a) raised the federal tax on cigarettes and (b) required the rich to purchase them in large quantities. Meanwhile, costs were rising rapidly, especially after the creation of the cabinet-level Department of Redundant Health Paperwork Department, which will eventually be housed in the Rodham Building, which, upon completion, will be nearly seven times the size of the former Pentagon (which had to be torn down in 1997 following the discovery that acronyms cause cancer).
Despite the high cost, health-care reform has been a major success, as measured by the one standard that really matters: job creation. Throughout the 1990s the nation’s fastest-growing employment category was people keeping track of people attempting to provide health care. Granted, there have been some problems in the area of actually providing health care. There was that much-publicized incident in 1998, when a 57-year-old woman in perfect health went to the office of the South Central Illinois Health Security Sector (motto: “A Sector That Cares”) in an attempt to stop the computer system from sending her increasingly threatening letters demanding that she provide certain documentation relating to her alleged vasectomy. Police later concluded that the woman spent six days there, wandering from department to department and becoming increasingly disoriented, before she stumbled into a supply closet and was crushed under an avalanche of informational pamphlets entitled “Las Opciones De Su Vesicula Biliar” (“Your Gallbladder Options”). But there was a silver lining: the ensuing public outrage led to the passage of the historic Health Care Reform Reform Act of 1998, mandating tough new standards for federal storage shelves.
The government was able to create jobs in other areas as well, thanks to continued rapid advances in computer technology. A fine example can be found in the Charleston, W.Va., home of the $125 million Robert C. Byrd Virtual Reality Naval Base, where today thousands of workers are gaining self-esteem–and earning good salaries–by wearing special sensory suits that enable them to experience exactly what it feels like to be employed in a vital defense industry, yet at the same time sparing the taxpayers the billions of additional dollars it would cost to physically maintain a bunch of ships. In a similar example of cost-cutting, the Agriculture Department’s Virtual Subsidized Farming Program has enabled thousands of farmers to have the total sensory experience of not growing crops, while taking up only a fraction of the land they formerly required for this purpose.
This is not to say that technology was an unadulterated plus in the ’90s. The Information Superhighway was pretty much of a dud. Remember that? By the mid-’90s, just about everybody was hooked up to the vast international computer network, exchanging vast quantities of information at high speeds via modems and fiber-optic cable with everybody else. The problem, of course, was that even though the information was coming a lot faster, the vast majority of it, having originated with human beings, was still wrong. Eventually people realized that the Information Superhighway was essentially CB radio, but with more typing. By late in the decade millions of Americans had abandoned their computers and turned to the immensely popular new VirtuLib 2000, a $14,000 device that enables the user to experience, with uncanny realism, the sensation of reading a book.
Speaking of entertainment, the ’90s saw an explosion of cable TV channels, offering innovative programming such as the Home Shopping Network’s hugely successful show “Interactive Salad Bar,” in which viewers, by manipulating a console in their homes, can direct the selection of ingredients in a studio thousands of miles away, and then have the actual salads delivered to their homes, ready to eat, within two working days.
But the major TV success story in the ’90s has been the wildly popular syndicated afternoon show “It’s None of Our Business,” wherein guests with various psychological dysfunctionalities go before a live studio audience and are prevented by the host from talking about themselves:
GUEST: … it was actually on our wedding night that Roberta and I discovered that she could not achieve climax without the aid of a squid.
HOST: That’s really none of our business, is it?
GUEST: Well, I guess not.
AUDIENCE: (Applause)
Professional sports did not fare well in the ’90s: bitter baseball fans will not soon forget the sorry spectacle of the 1998 World Series, which the Southern Connecticut Microsoft/Wal-Mart Yankees lost in the bottom of the ninth inning of the deciding 11th game when right fielder Rodd (The Bodd) Sprankel, then making $4.2 million per week, failed to score because his agent–speaking to Sprankel through the tiny receiver that most pros now wear in their ears during games so they can keep abreast of market fluctuations–informed him that he was not contractually obligated to advance more than two bases in any given three-inning period.
Musically, the most important innovation of the ’90s was “subsonic,” which resulted from the ceaseless quest of young people to find new forms of music that are even more hateful to their parents. Subsonic is produced by huge amplified speakers pumping out “music” at a pitch so low that the only way that the young people can tell whether they have enjoyed it is by measuring the amount of tissue damage.
Politically, the ’90s were a decade of great upheaval, especially the string of events touched off by then-Vice President Al Gore’s 1996 preconvention decision–shocking at the time, but, in retrospect, inevitable–to become a Rastafarian. Less understandable was President Clinton’s choice of a replacement running mate: many experts had predicted that he would choose a woman, but nobody thought that the specific woman, in yet another example of the questionable staff work that plagued the Clinton administration from day one, would be Mary Lou Retton. Prospects looked very bright for the Republicans, especially in light of Ross Perot’s decision, after consulting his volunteers, to become president of Belize (where he remained until 1998, when he was finally picked up by the Mother Ship).
But the 1996 GOP convention, held at the newly constructed LimbaughLand complex, degenerated into vitriolic squabbling over the wording of the proposed constitutional anti-witchcraft amendment. It was a wounded Quayle/Robertson ticket that emerged from that convention, only to be weakened still further during the campaign when Quayle, in a pattern reminiscent of 1992, was forced to explain incorrect and at times utterly ridiculous statements made by her husband, Dan.
Thus Clinton was able to eke out a narrow re-election victory, sealing the win with his hugely popular campaign promise that, if elected, he would stop holding town meetings. But what might have been a triumphant second term was marred by continued administration blunders, including the ill-advised attempt to nationalize Toys “R” Us, the highly unpopular FDA ban on frozen yogurt and the series of State Department misjudgments that enabled Saddam Hussein to acquire, and ultimately move into, a condo in Vail.
These events, coupled with Congress’s passage of the OK We Are Increasing the Deficit Limit to $10²² Or At Most $10²⁵ But That’s IT and We Are Really Not Kidding This Time Darn It Act of 1998, left the voters feeling disgusted with politicians of both parties. It was this widespread public anger that made possible the astounding series of events that followed, leading to the historic Constitutional Convention of 1999, which opened with the now legendary four-word keynote speech by former president Ronald Reagan (“It’s cold in here”) and ended with a genuine reinvention of the American political system.
And thus it is that, as this eventful decade draws to a close, we Americans prepare to enter the 21st century with what is certainly the most democratic system of government ever devised: Virtual President Bob. He is wise, and he is good. If we flip a switch, he is a she. He comes to our home whenever we want, day or night, and he listens to us, and he agrees with whatever we say. If we’re in a cranky mood, he’ll make a funny speech for us, or launch a virtual nuclear strike against France. We don’t know who’s paying for all of this. There is a Virtual Congress, but nobody ever turns it on. It doesn’t matter. What matters is that we love President Bob, and he loves us. Happy days are here again.
By the end of the century, AIDS will overwhelmingly be a disease of poor people. That creates a huge political problem. Our society didn’t grapple with AIDS in the ’80s, because it was the disease of gay men. But in the ’90s we are going to have a greater problem with the disease because it’s not just affecting gays but blacks and Hispanics too. Our society is being split into two worlds. For people who have AIDS, it is a factor in their lives that dwarfs any other concern. For people who don’t have it, the disease is a trivial concern, unless they actually know someone who is dying from it.

One of the fascinating aspects of the ’90s is the women’s movement. We are seeing that what was talked about in the early ’70s–women should have equal opportunity in the professions–is now becoming a reality. Heterosexual men just don’t know what to do about that; all of the rules they grew up with don’t apply anymore, and no one has created new rules for how men should define themselves. For women it’s been nothing short of a revolution. Little girls grow up knowing they can be an astronaut, a Supreme Court justice or a U.S. senator. But men feel that the pie is only so big and that if somebody else gets a bigger piece, they are going to have less. The gay issue plays into this, because it’s an issue that is redefining the roles of what men and women should be. Men don’t have to be with women; women don’t have to be with men. The rules have changed.