web stats analysis
 
 
 
 
Michael Ubaldi, July 20, 2003.
 

Yesterday - all day - the database connection was acting up, so certain archives were unavailable and leaving comments was impossible. My apologies. Luck is with me in that the error is intermittent (it hadn't happened for the better part of two weeks), but I'll nevertheless get to the bottom of it.

 
 
 
 
Michael Ubaldi, July 17, 2003.
 

The Drudge Report has spent the last two days giving headlines to the major networks' assault on morale at home by presenting desertion-level morale in the warzone.

The blogosphere has yet to pick up on it - at least at a big-time level like Instapundit or Andrew Sullivan - but it does appear to have caught some people's attention.

What do you think: is Drudge simply gaming for attention and sensation, is he merely conveying what news outlets are reporting, or might he be playing the wily conservative and exposing the panic-button stories as much as possible to hoist the press by its own petard?

 
 
 
 
Michael Ubaldi, July 16, 2003.
 

It was the First World War that introduced airplanes to a military role, and the desperation of the conflict pressed both sides to achieve an unprecedented technological metamorphosis that arguably would not have been possible in peacetime. In 1914 and 1915, pilots rode what the day offered them: canvas-and-wood planes, their roll controlled by a cord system - known as wing-warping - that twisted the wings sympathetically to directional intent.

The airplane was initially intended for observation and reconnaissance. Combat usage was minimal and inconsistent; stories tell of early Allied and German pilots waving to each other, briefly enjoying the status of an echelon privileged enough to escape the horror of trench warfare. Inevitably, conflict worked its way into aviation; in the proverbial sense, one man bid good day to another with lead, and the race began.

Firearms and flying machines were a rough match. A pilot's ability to fire was greatly dictated by his imperative need to fly the aircraft; unfortunately, the only logical location for a gun requiring the least distraction and aiming correction - the nose of the plane - invited a dangerous intersection of propeller and bullet. Two-seater scouts allowed for a gunner to fire rear-mounted machine guns, but a restricted field of fire combined with the ponderous gait of the primitive craft amounted to a marginal defense at best.

The Germans revolutionized air combat with the Fokker Eindecker E.1 monoplane in May of 1915. Its mechanism synchronized a nose machine gun's discharge with the engine's propeller rotation so that no bullet could strike a propeller blade, letting a pilot easily and safely target enemy aircraft. The Fokker Scourge, a direct result of this enormous advantage over the Allies, continued throughout the rest of the year.

Thus began a tight struggle of deadly one-upmanship: the French Nieuport 17 and Spad 13 biplanes - the former light and maneuverable, the latter rugged and fast - utilized the same fire-synchronization technology and, with ailerons for rolling instead of wing-warping, immediately outmatched the Eindecker E.3.

The Germans responded in 1916 by introducing the enduring Albatros series and reclaimed air superiority with the well-powered and heavily armed D.1. Months later the British deployed the successful RAF SE5a and the first of the Sopwith series, the Pup. The 1917 Albatros D.3 entered combat as a respected fighter that controlled the skies, but the mediocre D.5 marked a German stumble in the technological escalation - an opening for the British Sopwith Triplane to rattle German pilots and for the Sopwith Camel to reign supreme.

The Camel was idiosyncratic, to say the least: deadly to inexperienced pilots but a terrifying weapon in the hands of veterans. Its rotary engine produced a clockwise torque that heavily influenced the plane's handling; left rolls pulled the nose skyward and bled airspeed, while right rolls pulled the aircraft into an extremely tight turn, threatening a fatal spin without proper counteradjustment. These characteristics, however, allowed the Camel to outperform even the legendary German triplane, the Fokker Dr.1.

Copied directly from the Sopwith Triplane, the Dr.1 performed with similar strengths and weaknesses; it was underpowered, light and slow, but boasted climbing rates and maneuverability that easily flew circles around all aircraft but a well-piloted Camel in a right turn. Unlike its British inspiration - and rather unique in its own right - the Dr.1 lacked a vertical stabilizer, and a pilot, using nothing but hard rudder, could simply yaw through a turn nearly as fast as other planes could roll. More heavily armed than the Sopwith Triplane, the Dr.1 succeeded on the battlefield as far more than a competitive technical experiment. Though Manfred von Richthofen - the infamous man we all know, minus the fallacious mustache - made most of his kills at the controls of biplanes, the Dr.1's lethality endeared it to him in some respect, and it was the plane the Red Baron rode to the ground to his death in 1918.

The loss of Richthofen came as the Allies were overwhelming a flagging Germany with numerical advantage. Not even the authoritative Fokker D. VII - an airplane that legendarily would be seen in formation on vertical climbs and was indeed the best fighter of the war - could prevent the inevitable collapse of German warmaking. Following the Armistice, in a gesture split between respect, awe, envy and fear, the Allies specifically ordered the destruction of every D. VII through the Treaty of Versailles.

In 1914, airplanes were sluggish curiosities that were limited by low operational ceilings, climbed poorly and could barely maintain 80 or 90 miles per hour in level flight; by 1918, they were nimble and irrepressible, or else so well-built that they could sustain power dives in excess of 200 miles per hour. In the face of the pandemic destruction and death caused by the war, four years managed to accomplish more for aviation technology than the eleven preceding them.

Despite the grim nature of the warrior biplane's inception, American culture has attached a particularly tenacious romance to the thought of a sky staggered with tiny, frail airplanes from a younger world, each piloted by a gentleman without a parachute who, above the faceless slaughter of the front lines, would seek to best his adversary in an aerial duel.

Another wartime employment for aviators was the tethered balloon, used for observation and artillery sighting. Balloons were as predictable as the wind and logistically simple, providing stable, inexpensive platforms for static reconnaissance. More than twenty years before the Hindenburg's crash, design philosophy and industrial convenience dictated that balloons which weren't hot-air powered be filled with hydrogen. As with any early modern mechanism, this was an act of placing superior performance over operator safety. Marginally more buoyant and exceedingly more abundant than helium, hydrogen was, unlike its inert cousin, dangerously flammable.
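For the curious, the hydrogen-versus-helium tradeoff is easy to quantify, and the buoyancy gap is smaller than often assumed: net lift per unit volume is simply the density of air minus the density of the lifting gas. A quick sketch (sea-level densities at 0°C, approximate values):

```python
# Net buoyant lift per cubic meter = (density of air) - (density of lifting gas).
# Approximate densities in kg/m^3 at sea level and 0 degrees C.
AIR = 1.293
HYDROGEN = 0.090
HELIUM = 0.179

def lift_per_m3(gas_density: float) -> float:
    """Mass, in kg, that one cubic meter of lifting gas can buoy."""
    return AIR - gas_density

h2 = lift_per_m3(HYDROGEN)   # 1.203 kg/m^3
he = lift_per_m3(HELIUM)     # 1.114 kg/m^3
print(f"hydrogen: {h2:.3f} kg/m^3, helium: {he:.3f} kg/m^3")
print(f"hydrogen lifts about {100 * (h2 / he - 1):.0f}% more than helium")
```

Hydrogen's edge works out to only about eight percent; its real advantages in 1915 were cost and availability, which made the flammability gamble an easy one for quartermasters to take.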

Precautions were taken. Each side would hoist groups of two or three of them several hundred feet in the air from the front lines, barricade them within a veritable phalanx of zeroed antiaircraft and machine guns, and have crews ready to winch the balloons back down as quickly as possible if aircraft approached. British crews were, not unexpectedly, issued parachutes.

And airplanes came. Using special incendiary bullets to set the gas alight, audacious pilots would brave forbidding walls of flak hurled from below in attempts to flame as many balloons as they could before the balloons were winched down to altitudes well within the deadly range of ground fire.

Even reasonably protected, balloons were, it's natural to conclude, highly vulnerable sitting ducks. And what with the advances of aviation and sensory technology, one might assume that balloons in wartime are charming relics from lost days of old.

Not quite:

With nearly one hundred years of technological evolution, the aerostat is unmanned, fiber-optic-sensor-packed and rises with the aid of now-accessible helium - but it's a tethered, inexpensive alternative for military monitoring that has shed neither the look nor the function of its predecessor. While we won't see double-winged, canvas-and-wood vehicles in touted defense contracts, some things never change.

 
 
 
 
Michael Ubaldi, July 15, 2003.
 

Because African and Asian elephants simply aren't good enough:

Scientists hoping to clone prehistoric woolly mammoths are preparing their first frozen DNA samples in a bid to revive the species.

The specimens of bone marrow, muscle and skin were unearthed last August in the Siberian tundra where they had been preserved in ice for thousands of years.

Researchers at the Gifu Science and Technology Centre and Kinki University want to use the genetic material in the cells to clone a woolly mammoth, according to Akira Irytani, a scientist at Kinki University in western Japan.


Staunchly against the cloning of humans for any purpose, I believe that the genetic interpretation of our license to dominion over animals - up to creating bizarro mutants like spider-pigs, giraffe-seals and hyena-maples, of course - is as beneficial to our development as it is utterly fascinating.

Read the article. I'd love to see mammoths bounding about in wildlife refuges.

One little niggling discrepancy - isn't it a bit presumptuous, given the particular lack of hard evidence, for the Independent to assume that mammoths were cleaned out not by obviously detrimental climate changes but by Homo sapiens' overhunting?

Oh, wait; that's right. The Independent prefers enlightened opinion over fact for just about everything.

 
 
 
 
Michael Ubaldi, July 14, 2003.
 

Here's one dead-paper journalist who's confident that both her crowd and their inverted-magnet colleagues could learn quite a bit from the indefatigable blogosphere:

I'm not an expert on blogging, but I am a fan. As a regular visitor to a dozen or so news and opinion blogs, I'm riveted by the implications for my profession. Bloggers are making life interesting for reluctant mainstreamers like myself and for the public, whose access to information until now has been relatively controlled by traditional media.

I say "reluctant mainstreamer" because what I once loved about journalism went missing some time ago and seems to have resurfaced as the driving force of the blogosphere: a high-spirited, irreverent, swashbuckling, lances-to-the-ready assault on the status quo. While mainstream journalists are tucked inside their newsroom cubicles deciphering management's latest "tidy desk" memo, bloggers are building bonfires and handing out virtual leaflets along America's Information Highway.

[...]

The best bloggers, who are generous in linking to one another -- alien behavior to journalists accustomed to careerist, shark-tank newsrooms -- are like smart, hip gunslingers come to make trouble for the local good ol' boys. The heat they pack includes an arsenal of intellectual artillery, crisp prose, sharp insights and a gimlet eye for mainstream media's flaws.


Well said, Madam - and, witnessing her own kind's gradual supersession as the media instruments of first response and final thought, her words are borne on humility. She even kept her online column philosophically contradistinguished, pleasantly free of hyperlinks.


UPDATE: A brief, after-read moment of devil's advocacy (unusual for me). Might the hobby-like, for-the-love-of-it nature of today's blogging be the reason for such an inexplicably fraternal community - and if it ever became lucrative, would serious bloggers grow triangular dorsal fins? Perhaps not: hyperlinking is the purest form of flattery, deference and fair representation - and it's what many have recognized as the hallmark of internet commentary.

 
 
 
 
Michael Ubaldi, July 14, 2003.
 

This is a bit of a chilling screen capture from Truth Laid Bear's Blog Ecosystem. Many more times than I'd prefer to count, the digits in question have turned out to be the time of day when I glance at a clock:

Brrrrrr. And I'm not even superstitious!

UPDATE: I just realized that this is post number 666 (#685 externally). Rabbit's foot, anyone? I might as well congratulate myself for achieving the most remote blogging coincidence ever before a meteorite thumps me.

 
 
 
 
Michael Ubaldi, July 7, 2003.
 

Thankfully, the Bush administration is defiant of the axiom, "If the sheriff rides unarmed, the outlaws will, too." Unfortunately, the technological development of weaponry is anyone's game; if the free world does not remain at the forefront of nuclear explosive research, someone will - probably those who would seek to use the devices for offensive leverage, not as a last-resort preventative measure. Says USA Today:

..."Icecap," the test of a bomb 10 times the size of the one that devastated the Japanese city of Hiroshima in 1945, was halted when the first President Bush placed a moratorium on U.S. nuclear tests in October 1992. The voluntary test ban came two years after Russia stopped its nuclear tests.

In the 11 years since, the United States has worked to halt the spread of nuclear weapons around the world and has often touted its own self-imposed restraint as a model for other nations.

Last year the White House released, to little publicity, the 2002 Nuclear Posture Review. That policy paper embraces the use of nuclear weapons in a first strike and on the battlefield; it also says a return to nuclear testing may soon be necessary. It was coupled with a request for $70 million to study and develop new types of nuclear weapons and to shorten the time it would take to test them.


The lack of attention paid may well be attributed to the fact that most Americans prefer their country to maintain military superiority in all respects. The bomb-cutting game, after all, was simply a dance we played with the Soviets. Now that the Bear is gone, "mutual gestures" with Russia are largely irrelevant - the worst fear today being that her orphan cubs are hawking atomic material because it sells better on the black market than hammer-and-sickle-stamped tractors.

The United States is not in a strategic position where the destruction of an industrial center, such as Hiroshima or Nagasaki, is remotely necessary. Big bombs are deterrents - enter the Cold War doctrine of "mutually assured destruction." Future uses of nuclear energy's effortlessly destructive potential come in small packages: tactical nukes.

The main reason offered by the Pentagon is that "rogue" nations such as North Korea, Iran and Libya have gone deep, building elaborate bunkers hundreds of feet underground where their leaders and weapons could ride out an attack by the biggest conventional weapons U.S. forces could throw at them. U.S. officials also theorize that the vaporizing blast of a nuclear bomb might be the only way to safely destroy an enemy's chemical or biological weapons.

The Pentagon says developing new nuclear weapons makes sense in a dangerous world. "Without having the ability to hold those targets at risk, we essentially provide sanctuary," J.D. Crouch, an assistant secretary of Defense, told reporters earlier this year.

[...]

Gen. Richard Myers, chairman of the Joint Chiefs of Staff, says nuclear weapons could be crucial tools for destroying chemical and biological weapons stocks without causing wider harm.

"In terms of anthrax, it's said that gamma rays can ... destroy the anthrax spores, which is something we need to look at," Myers told reporters at the Pentagon on May 20. "And in chemical weapons, of course, the heat (of a nuclear blast) can destroy the chemical compounds and make them not develop that plume that conventional weapons might do, that would then drift and perhaps bring others in harm's way."


I put my stake on the answer to Iraq's weapons riddle being highly scattered and easily concealed components couched in clandestine, underground sites. That's Saddam's trick. Nations that are not confronted with authoritative scrutiny, however, would not necessarily be required to disassemble weapons stockpiles and industry; such sites, therefore, would be perfect targets for the consummate extirpative power of small-scale nuclear bombs.

Incidentally, one must admire the tenacity of journalists to qualify the morality of a nation by placing rogue status in quotation marks.

And of course, the usual counterargument from the usual sources:

[O]thers argue that moving toward a new generation of nuclear weapons, instead of improving conventional and non-nuclear ways to attack deep targets or chemical weapons sites, is fraught with danger.

"They are opening the door to a new era of a global nuclear arms competition," says Daryl Kimball, executive director of the Arms Control Association in Washington, D.C. "As we try to turn the tide of nuclear proliferation, the last thing we should suggest is that nuclear weapons have a role in the battlefield, and these weapons are battlefield weapons. This is a serious step in the wrong direction."


Director Kimball should have made his statement against competitive accoutering some millennia ago, when one fur-cloaked Homo sapiens knocked his neighbor on the back of the head with a leg bone. Arms races never stop - certainly not with dictators, whose power rests solely on the broadcast of violence, still dotting the globe. The free world needs a leg up. See the sheriff proverb above.

 
 
 
 
Michael Ubaldi, June 26, 2003.
 

Winds of Change linked to this rather forceful notch-dropping on blogging, its impact and its longevity.

As an argument, I'd say that it's a bit over the top and in some places needlessly pejorative. The author, I'm sure, didn't mean to denigrate the medium - but truly, blogging has matured, and as Glenn Reynolds explained recently, quite positively so. That can't be overlooked. If blogging has a societal problem, it is, as the essay similarly describes, when bloggers become laptop rappers: blogging seriously about how hip they are, how hopelessly "yesterday" non-blogging media is, and how URL-hatin', sucka HTTPs are stealing game and gots to go down. Even while political weblogs burst daily with for-keeps debate and heated rhetoric on the most controversial, explosive topics, I have yet to stumble upon one and read about the latest developments in some extended, pointless, private feud. Strictly personal weblogs seem to be far more vulnerable to this abuse and, unfortunately, their exploits help paint the picture.

Not that it's unexpected. Like it or not, blogs are extensions of ourselves and most are competitive: if people can't battle for ideas they'll battle for attention, or space, or even the will to continue showing up in the form of snarky hypertext. A few months ago, on a bit of a bad day, I almost made the mistake of falling deep into one of these awful pits of pride and vanity. I scrambled out, literally within minutes, but not without making myself look like a part-time lout (complete with apology e-mail and corrected entry).

However, it's not unfair to say that every human endeavor is at risk of becoming a chess board or rugby field. Even if you think the loss of innocence is a 20th-Century thing, consider Bill Buckley's verbal scuffle with Gore Vidal in 1968 a very large chomp into a certain apple.

In that sense, the essay in question underestimates not only the staying power of blogging but its eventual place in journalism, commentary and newsmaking. The colossi, print and television, have taken notice and, held by their own ponderous hulk, now reluctantly acknowledge the hirundine bloggers as both pioneers and archaeologists. Thoroughly decentralized, yet wholly and frighteningly focused when great matters arise and the like-minded chatter. Thousands upon thousands have been exposed to authors and other creators who might have remained the sole acquaintance of politico-mag wonks; the wonks still rule in online politics, of course - most people I know who aren't web-savvy haven't the slightest idea who Andrew Sullivan or Glenn Reynolds are - but the degrees of separation are evaporating. And the more people go online, the more likely it is they'll inevitably bump into a reference, an article, or a homepage itself.

All this suggests that the endurance of the weblog, particularly the newsworthy weblog, is remarkable. How is it unreasonable, then, for the author of even a modestly attended weblog to feel obligated to his readers? Music, art, writing: a following is a following is a following. It's a blessing, not a nag. To blow an audience off would certainly not be an act of humility.

Natalie, the author, believes that "the blogging 'community' as it were...is preparing to implode on itself." No, it's inward reflection that we should expect over the coming years: the sieve will come out to separate the dead, failed, and impugned blogs from time to time. A regular molting.

Style, technology and necessity will alter its look and feel, but blogging is established as a concept - so its next challenge is to prove itself through trial, error and success. Fads die out within a few years; is the weblog flashy, creampuff ephemera? Ask Trent Lott and Howell Raines.

 
 
 
 
Michael Ubaldi, June 24, 2003.
 

From the Harvard Journal of Law & Technology archives, a 1990 symposium with the then-Slight-Radio-Frequency-Delay-Pundit, Dan Quayle and William J. Brennan, Jr.

I've skimmed it; quite compelling, especially considering the event's age. (But what's this, no Prime Directive? Amateurs.) Note that a bill for which Reynolds testified, H.R. 2946, had a companion in the Senate sponsored by the Right Honorable Albert Gore, Jr.

Excellent reading, even for a legal layman like myself. Foresight, stay our course.

 
 
 
 
Michael Ubaldi, June 20, 2003.
 

Movable Type - or, to be fair, my server's MySQL - has held my two morning posts hostage. All day.

Nefarious blogging schemes were totally ruined. If I had a handlebar mustache, I'd be tweaking it. Drat! Curses!