A Daily History of Holes, Dots, Lines, Science, History, Math, the Unintentional Absurd & Nothing |1.6 million words, 7000 images, 3.6 million hits| Press & appearances in The Times, The Paris Review, Le Figaro, MENSA, The Economist, The Guardian, Discovery News, Slate, Le Monde, Sci American Blogs, Le Point, and many other places... 3,000+ total posts
This is another in a series of posts on images found in the glorious Fr. Athanasius Kircher's masterpiece, Mundus Subterraneus, printed in 1668. (One of the main posts on these images occurs here; there are many more if you search under Kircher's name in the Google box at left.)
This fine example of the depth of his observation shows the great Jesuit finding things in rock and mineral samples like agate--in this case he looks at a collection of samples that display found letters of the alphabet and geometrical shapes.
The original of this image may be purchased via the blog bookstore, here.
I think that if you squinted your eyes a little to deform your visual field and then looked at this map of the Earth's ocean currents, what you might see is...Mars. At least that's what I can see if I concentrate on it--or at least the Mars of 1880 or thereabouts, with its large and blotchy, seemingly mobile and ambiguous forms skirting around the planet.
This map may be purchased at the blog's bookstore, here.
That said, this map does represent a high achievement, displaying the elements of the still young-ish science of oceanography (or at least in the form presented by Matthew F. Maury) and showing the movements of the ocean currents.
This map comes about a century after the first map to truly distinguish the Gulf Stream (the B. Franklin/Folger map of 1768-1785) and shows the remarkable activity that must have come with the acquisition of the data necessary for even this popular map. There were earlier attempts, of course, notably by the great/problematic Athanasius Kircher in his Mundus... of 1662, though his effort was largely theoretical, given the lack of available hard data (and, for what it was worth, it likened the currents to something like blood moving through the body, which at least drew on the work of William Harvey). (Eberhard Happel was on a similar wavelength as Kircher with his currents map, published a few years later; though he was mainly a science popularizer who "relied" heavily upon the Kircher map and was nowhere near to being Kircher's intellectual rival, his map is still interesting given its very heavy lining and beauty.)
In any event, the Houston map is clear and concise--and given its 7"x 9" size, remarkably compelling.
Years ago I phone-met the physicist Al Wattenberg. He started his long and fine career under the bleachers at the University of Chicago, working with Enrico Fermi where on December 2, 1942, they achieved the first controlled nuclear chain reaction. (The moment was celebrated with a bottle of Chianti donated by Eugene Wigner, and it was signed by all those present at the creation. It turned out that Wattenberg was the last man to leave the area, and he saw the abandoned Chianti bottle and rescued it--he gave it to the Argonne National Lab where it lives to this day.)
One day I asked him about Fermi and what he was like when he wasn't working. Wattenberg said that he used to play tennis with Fermi, who (paraphrasing here) played tennis like he did physics--meticulous, methodical, careful. He said Fermi was frustrating to play against, and that he never beat him at the game.
I like these offbeat cross-interest metaphors for explaining complicated things. And it was this thinking that brought me to Oppenheimer and the New York Yankees--and that Boston team--in the hope of dislodging a little piece of insight that might come from comparing disparate things. And so far as I can tell discussing Robert Oppenheimer in terms of Ted Williams and Joe DiMaggio doesn't get very much red carpet room as an idea, and I suspect that there are good reasons for it.
Let's face it—Robert Oppenheimer left his theoretical physics career sorta behind when he went to win the war at Los Alamos in 1943. I was surprised to see today that his publishing career for the hard stuff stopped around 1950. He was an absolutely brilliant light when he started out in 1926, and a massive influence in his field by 1940.
Here's what the production looked like:
1926-29: 16 papers
1930-39: 36 papers
1940-42: 10 papers
Which makes 62 papers for the 1926-1942 period. After the war, from 1946-50, there were five papers. And after that, none. That's not to say that he didn't publish, because he did, and was prolific—it was just a different career. From 1926-1942 he was a theoretical physicist, with wholly different responsibilities than those he would take on in 1942, when he became not only the brilliant physicist but also the brilliant administrator/director/curator, working an almost-impossible job with thousands of creative people and the U.S. Army in shacks in the desert, trying to beat the Nazis to a bomb that would end the war.
After the war Oppenheimer helped to formulate the direction of physics in the U.S., leading Princeton's Institute for Advanced Study as its director; then he helped formulate American nuclear policy going into the Cold War as chairman of the Atomic Energy Commission's General Advisory Committee. And then there was the great tragedy of his "security hearings" (1954), which crushed him on a McCarthyite slab, costing him much of the rest of his career. And then he was dead in 1967.
Had he not gone to war, had he not taken the job only he could have done, what would he have become? His accomplishments were already large--the Born-Oppenheimer approximation, the Oppenheimer-Phillips process, neutron stars, quantum tunneling. More than likely one of the things he would be well known for today would be his development of the discovery and understanding of black holes—a singularity that he described in a paper printed in Physical Review on September 1, 1939, though this was somewhat premature for the time. And then came the war.
Millions of other people went to war, too, stopping their lives on one end, starting a new life on the other, and then returning to their previous lives with varying degrees of retention or creation. Two of those folks were Ted Williams and Joe DiMaggio.
Ted Williams went to war too, from 1943-1945. Williams had just started out in his brilliant career, hitting .406 in 1941; then two years later, he was gone. Williams was 24-26 years old for the war, Oppenheimer 38-40, though Ted got to play from 1946-1960 when he came back, absent a year that he spent as a Marine fighter pilot during the Korean War ("strafing Commies"). No doubt Williams would have piled up the numbers in a big way had he played ball in those years.
Then there's Joe DiMaggio, who also served for those years, when he was 28-30. He had finished up seven years with the Yankees before he left, brilliant—perhaps the most stellar thing about it all (long pointed out to me by my friend, Mr. Baseball, Andy Moursund) was that he hit 219 home runs while striking out only 196 times in those 7 years. His batting averages were very high, and he was a slugger, and he didn't walk all that much compared to Williams. This seems more in line with Oppenheimer—a different sort of precision, one where DiMaggio hit for power and rarely came up with nothing at all, going up swinging but rarely going down swinging at nothing on the third strike. Although he was relatively young when he left for the war, he came back at 30, and played another 6 years.
It may be too weird or nonsensical to think of these sorts of things, let alone thinking of Oppenheimer in units of Joe DiMaggios, especially since Oppenheimer knew/cared nothing about sports2. Plus in the alternative histories world we really can't express the potentials of missing years, especially in terms of metaphors from non-related entities.
On the other hand, I did create a physicists vs. mathematicians chess set, matching up people with what their possible positions on the board might be, so we can probably match Oppenheimer to his baseball equivalent.
Perhaps this is also my pneumonia talking rather than my brain, what with Spring Training approaching, and perhaps this was all useless--but for some reason if Oppenheimer suddenly appeared out in the front yard wearing a baseball jersey, I'd see him in pinstripes, with a "5" on his back.
Blanche Murray wrote this short pamphlet in 1947, the designer completing the cover in the Russian-spider/American-fly motif. Ms. Murray was trying to warn the U.S. that World War III was "happening now" and that Russia had crept into the U.S. on "cushioned paws".
The author was more or less correct in the WWIII part, though not so much in the creeping Commie part--my quick-browse didn't find any "Cold War" reference exactly, but that is what she was talking about there in 1947, two years before the Soviets had their own nuclear weapons (with the so-called Joe-1 shot) and three years before the Korean War began. Mr. Orwell called it in October 1945:
"For forty or fifty years past, Mr. H. G. Wells and others have been warning us that man is in danger of destroying himself with his own weapons, leaving the ants or some other gregarious species to take over. Anyone who has seen the ruined cities of Germany will find this notion at least thinkable. Nevertheless, looking at the world as a whole, the drift for many decades has been not towards anarchy but towards the reimposition of slavery. We may be heading not for general breakdown but for an epoch as horribly stable as the slave empires of antiquity. James Burnham's theory has been much discussed, but few people have yet considered its ideological implications — that is, the kind of world-view, the kind of beliefs, and the social structure that would probably prevail in a state which was at once unconquerable and in a permanent state of ‘cold war’ with its neighbors."--George Orwell, Tribune 19 October 1945 ("You and the Atomic Bomb") Here, via Project Gutenberg.
In any event I like the cover design, which is probably the best part of the pamphlet. Sometimes that is all you really need.
Richard Feynman wrote this about symmetry in section 52 of the first volume of his Lectures on Physics (1963), the three volumes now beautifully available online at CalTech here. The last two paragraphs are also quoted in Mario Livio's "The Equation That Couldn't Be Solved", Martin Gardner's "The Ambidextrous Universe", and many other places, probably hundreds of times; I've included the previous two paragraphs for interest's sake. Feynman, the symmetry master, included a statement about a beautiful gate in "Neiko" Japan, which must be "Nikko", though I can hardly identify the gate he was talking about. There seem to be many--see the UNESCO Shrines and Temples of Nikko, here--though the candidate in my mind is the extraordinary Yomeimon gate, here.
52–9 Broken symmetries
"The next question is, what can we make out of laws which are nearly symmetrical? The marvelous thing about it all is that for such a wide range of important, strong phenomena—nuclear forces, electrical phenomena, and even weak ones like gravitation—over a tremendous range of physics, all the laws for these seem to be symmetrical. On the other hand, this little extra piece says, “No, the laws are not symmetrical!” How is it that nature can be almost symmetrical, but not perfectly symmetrical? What shall we make of this?..."
Through the years I've known a little bit of something about the Freedmen's Bureau, but I think I've never known its full name:
This was an act of federal relief proposed by Abraham Lincoln in 1865, empowered to help the newly-freed slaves transition to their new lives. The "Abandoned Lands" part was very unsettling, in a way equating refugees and newly-freed slaves with real estate. But it was the language and practice of the time, and economy of presentation, and logic, I guess, that led to this title. The Bureau was placed under the military, and was run by General O.O. Howard. The Bureau functioned from 1865 until it ran out of steam (along with other Reconstruction efforts and measures) in 1872.
In a way it reminded me somewhat of the fight for the capitalization of the "N" in the word "Negro" (see here)--something that seems beyond the scope of thinking about nowadays, but it was a battlefield for decades after the Civil War, with that "N" not being decided positively in common usage until the 1920's. "Abandoned Lands" is not the same thinking-point as "N", though it does make one think about what was in the common thinking that these three titles could be brought together as a single bureau, regardless of the hyper- and sub-text of the meaning associations via proximity.
Sec. 1. "...a bureau of refugees, freedmen, and abandoned lands, to which shall be committed, as hereinafter provided, the supervision and management of all abandoned lands, and the control of all subjects relating to refugees and freedmen from rebel states, or from any district or county within the territory embraced in the operations of the army..."
Sec. 2 "And be it further enacted, That the Secretary of War may direct such issues of provisions, clothing, and fuel, as he may deem needful for the immediate and temporary shelter and supply of destitute and suffering refugees and freedmen and their wives and children..."
[Source: with thanks to Rebecca Onion at Slate Magazine's The Vault, who posted the original document earlier today and where the name of the bureau struck me so heavily.]
“I believe that eight million Americans are entitled to a capital letter.”--W.E.B. DuBois
The history of the power of words is long and complex, and for the most part is on one side or the other of the political and social mirror, at least in the United States. Controlling the meaning of a word or phrase controls the idea which alters the way people approach it, defining the very heart of what may control the impulse for war or peace, which means that people may die as much for words as they will for ideas.
Sometimes the idea of control is even simpler than the word—it may be a simple letter.
Like the letter “N”.
Since the American Civil War (now in its 150th year as of last month), when the word “Negro” came into use to describe African Americans, it appeared in print overwhelmingly with a small “n”. The idea was simple—using a capital “N” would give a certain amount of respect and social diligence in referring to this race of people with a proper salutation; the small “n” minimized all of that, a symbol that these people were not worthy of having the initial letter of their race capitalized, and that because of their supposed inferiority.
This was also the case with the word “Colored”, which was used in the decades before the Civil War and then lightly after that, giving way for a short time to “Freedman” and then to “Negro”--“Colored” appeared in print as “colored” the vast majority of the time.
The capital N was a rallying point, a common point of singularity for a large percentage of the Black population in the U.S.--and a very tiny percentage of the White population. We can deduce this because major papers such as the New York Times did not adopt a policy of using the capital “N” until 1930. And as a matter of fact the federal government documents printed “negro” small “n” beyond 1930, even though heavily lobbied to use the more enlightened and respectful “Negro”. The issue was evidently sidestepped throughout the Hoover administration.
There was in the country a racism so entrenched and engrained that African Americans were seen as being wholly unworthy of being dignified with a capital “N”, and it was the natural way of things. As the editor of the Messenger, the Eatonton, Georgia, newspaper, said when asked about the capitalization issue, he would not be a party to it, because “it would lead to social equality.”
And that's just what everyone was trying to avoid. And like the federal government, the control of the capital “N” extended into historical control as well. When W.E.B. DuBois wrote an article for the venerable American Historical Review, the editor, J. Franklin Jameson, refused to allow the use of DuBois' capitalized “Negro”. As editor of the Dictionary of American Biography, Jameson likewise refused the capital “N” in that publication until it was terminally embarrassed into doing so, in 1937.
And so it goes, 60 or 70 years of fighting for the minimum respect of capitalization.
And this doesn't even address the use of the reviled “n” word, which is a story unto itself, and in which there was also debate over the years as to whether or not that word should be capitalized.
Notes: The major source of the information in this post comes from the fantastic article by Donald L. Grant (12/01/1975), "Some Notes on the Capital "N"", Phylon (0031-8906), 36 (4), p. 435, who did a splendid job of research. There's also a bit of memory pulled from H.L. Mencken's The American Language (4th edition), which was clear and concise, even though Mr. M. had a small social problem here and there with minorities. It must be said in his defense that the magazine he edited, The American Mercury, "always" used the capital N.
I wanted to reproduce Wolfgang Pauli's letter of 4 December 1930--in it he thinks very widely of missing stuff, of some of the basic bits of the universe, in a rather open but guarded way, about the ghost of the neutron. He didn't feel very comfortable with his ideas yet, at least for professional consumption--that would have to wait another three years, when the idea was discussed at the 7th Solvay Conference (1933), and another three after that, when it first came into print (1936). The name "neutron" would also be changed to the familiar "neutrino" ("little one") by Enrico Fermi in 1933, to differentiate it from the much larger nuclear particle discovered the year earlier by James Chadwick--Chadwick's paper was published in Nature, which would reject Fermi's neutrino paper in 1934 as too radical a leap.
[Source: Exhibition of the ETH-Bibliothek on the occasion of the 100th birthday of Wolfgang Pauli, http://www.library.ethz.ch/exhibit/pauli/neutrino_e.html]
[Still from the IBM 2013 video "A Boy and His Atom", where a team manipulated carbon monoxide atoms on a 45x25 nanometer frame. Just for reference, a human hair is about 10^5 nm across, and there are about 2.54x10^7 nanometers to the inch. Small.]
I was looking around, trying to figure out a chronology of small, of how small things can really be, when I decided to check out the basic terms of conversation in the Oxford English Dictionary. I was surprised to see that the first reference to "sub-atomic" was much earlier than I expected, finding a place in the five-year-old science journal Nature, in 1874. Just for the sake of it I've made a list of the atomic-related words that came to mind, just to see how they were entering relatively common usage in English. And so, below: sub-atomic, inter-atomic, atom, split the atom, proton, electron, and neutron.
I was looking around for one of the original references to the earliest human-tech definition of "singularity" and found it in a roundabout way, a classic reference referenced in a classic paper on singularity. Vernor Vinge wrote a breakout paper in 1993 called “The Coming Technological Singularity: How to Survive in the Post-Human Era"1. Among many other things the San Diego State math prof quotes how the great Stan Ulam paraphrased John von Neumann saying: “One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.” This was in 1958, and it appeared in Ulam's "Tribute to John von Neumann" in the Bulletin of the American Mathematical Society (volume 64, number 3, part 2, pp. 1-49).
It struck me as ironic that the "singularity" would appear just at the time von Neumann2--perhaps without equal in this century in thinking in terms of the computer and its applications and overall sheer brain-power--died, Ulam surfacing the term in what was basically a memorial/obituary/celebration issue of the Bulletin, the carbon-based life-form container finally failing the great mind.
It was then that I came to realize how much biologicalization has taken place in compsci terminology--not the least of which is the self-replicating and damaging "virus", which itself of course is a massive biological deal, though in the digital world it is not its most abundant entity3. E-virology is found just about everywhere, much like its bio counterpart, which is located in every ecosystem on Earth.
Even the word "computer" has an earlier biological counterpart--the "computer" was a human tabulator, a person grunting out figures into some sort of tabulating device. (Tracts for Computers, a series that began in 1919 and edited by Karl Pearson, is filled with statistical elements intended for the human computer...)
But what strikes me first are the bio references for the bad stuff: the viruses, and then later, the worms and Trojan horses. (I should point out that "bug" enters the computer vernacular fairly early, in 1949, via (later Admiral) Grace Murray Hopper, though it doesn't get listed by her in her 1954 glossary of computer terms, as published in two parts in Computers and Automation, volume 4, 1954. There's no "bug", though there is "de-bug".)
"Virus" emerges in a science fiction effort by David Gerrold in 1972, a few years before such programs were actually produced, which was itself a few years before a virus was released into the e-sphere ("in the wild"). In 1975 John Brunner unleashes a "worm" in his The Shockwave Rider.
Other early viruses (and virus-like programs) have biological names: Creeper (1970), Rabbit (1974), ANIMAL (by John Walker, though not created to be malicious, 1975), Top Cat (1980), Elk Cloner (1982), Whale (1990), Hare (1996), Blackworm (2006). There are of course many more names for viruses (and company) that are not biological, but it struck me how many of the earliest examples do have animal names. I'm not sure that I have much to say about this presently, though I did want to put the general observation out there in this note.
1. The abstract of the paper begins: "The acceleration of technological progress has been the central feature of this century. I argue in this paper that we are on the edge of change comparable to the rise of human life on Earth. The precise cause of this change is the imminent creation by technology of entities with greater than human intelligence. There are several means by which science may achieve this breakthrough..."
2. Perhaps of most interest here is von Neumann's 1949 paper, "Theory and Organization of Complicated Automata", which looks at the logic required for the self-replicating machine, in A. W. Burks, ed., Theory of Self-Reproducing Automata [by] John von Neumann, University of Illinois Press, Urbana, pp. 29-87. It was based on transcripts of lectures delivered at the University of Illinois in December 1949, and then edited for publication by A.W. Burks.
3. "Virus" is an old word, Latin for "poison" or "poisonous", which first appeared in English in 1392. "Virulent" appears in English in 1728, "viral" in 1948, "virion" in 1958. "Virus" as we know it biologically today has a somewhat complicated history, escaping Pasteur and his microscope until it emerges (again) with Martinus Beijerinck in 1898.
(Almost) in the beginning were monsters. The epic battle that Moses wages early on is not with Pharaohs, but with the dragon(s?) that the creator itself had fought on the opening whisper of creation (Book of Job 26:12, Psalms 89:10) and who would again be met at the very last bits of the closing days. Behemoth, Leviathan, and Rahab may all well have been monsters to these ancient folks, but they very well might look like rhinoceros or crocs or hippos to us. Monster demons like Rahab (Psalm 87:4), a slaughtering beast who would be reintroduced to a different part of the world as Tiamat, and who would or could also be known as the Red Sea, were lifted straight from Mesopotamian mythology and placed directly into that of the Old Testament among the rest of the borrowed stories and beliefs, a problem by any other name.
Following names and their cyclonic twists, and absences and sudden re-emergences, through the history of storytelling is dizzying—just consult your Robert Graves on myths if you want to have your memory plumbed (the great poet and writer doing not such a poetical or writerly job in this effort, in my opinion, though most people love it). Keeping an eye on the mix and mash of gods and goddesses and associated super-beings from thousands of years ago, the god of the Old Testament makes it very clear and precise just who he is in his self-introduction to Moses: “I am the God of thy father, the God of Abraham, the God of Isaac, and the God of Jacob” (Exodus 3:6)
Equally almost in the beginning, again so far as the Bible is concerned, are colors—before that maybe everything was black and white, or just white, or maybe just black, depending on your epistemological concept of everythingness or nothingness. (It looks like green may be the first color mentioned in the Bible (Gen 1:30, "And to every beast of the earth, and to every fowl of the air, and to every thing that creepeth upon the earth, wherein there is life, I have given every green herb for meat: and it was so"), though the mixing up of meat and veggie is a little confusing to me.)
Over the years color names themselves have creepethed among themselves like a vocabularic ocean, a fluid dynamic of naming. Names have flowed across their individual spectra, some names sticking, some not; the originator of the concept of the naming of "red", the original namer of the color, is lost to the earliest and deepest part of the collective human memory.
I don't know where many of the names of colors come from, or why. The Index to Color Names and Color Numbers of the Standard and Season Color Cards of America (created and published by the Textile Color Card Association in 1923) is filled with color names whose meaning and origin are a mystery and whose necessity seems to hinge on sunspots. Which is fine, though it might be interesting to have had color names more dependent on that which went before. The names of the streets along most of Connecticut Avenue in D.C., for example, are alphabetical, and once the first 26 letters or so are monosyllabically employed, the second set starts with two syllables, and then three. It is a system that usefully indicates where along the long avenue you might be. It might be useful to employ such a method in color names; or not.
And I suspect it would be "or not", unless the poetry and art and music inherent in these formulations were imaginatively employed.
But on to the color names: Ambulance, Basketball, Bosom, Cowboy, Squirrel, Chit, Old, Nymph, Old mephisto, Pelt, Racket are examples of some of the mystery colors.
Some names which were part of institutionalized racism I'm sure are now gone: Arab, Negro, African; Bagdahd (?), Bombay Brown, Coolie Yellow, Coolie, Congo Brown, Egyptian Husk, Hankow (yellow), Kyoto Yellow, Korea Yellow, Mandarin Yellow, Punjab Brown, Kafir, Tar Baby.
But I've got to say, even though the names may not have much to do with the colors, most of the names in the pamphlet sound quite lovely, and many are yummy: London Smoke, Log Cabin, Leadville, Madonna, Naked, Pitchpine, Pompeii, Prelate, Smoked Pearl, Swamp, Lucky Stone, to name a few. Overall I doubt that this is what Newton, Goethe, Chevreul, Rood, Maxwell and the rest had in mind when they were figuring out what color *is*, but I do think that all of them had large enough poetic natures (Newton the weakest and Goethe by far the strongest) to appreciate the occasional beauty of naming. The unbelievable Shakespeare seems not to have spent that much time on color (so far as I know, and I don't know much about the Bard), but (in a dear-sweet-god understatement) other people did: people like Richard Feynman synesthetically thought in mathematical color terms, others created musical instruments which would produce color from music, while most of the rest of the world produced music which conveyed color, and on and on. And then of course there's the whole world of art.
But I won't go there now--I just wanted to follow this loose thread in what seemed to be a pretty inert pamphlet--in the end it opened itself to a lot of possibilities with just a little thinking.
It is interesting to think about the newness of old things, particularly English words that we have in use every day. In this case I'm referring to words in the sciences--and not necessarily the words coming in the 20th century following the explosion of modernity beginning in 1895. It is surprising sometimes to realize the relative newness of some terms--like, for example, "scientist". The word "science" is very old, and old in English as well, but the word "scientist" was coined only in 1833. It is surprising to think of the modernity of some of these words when by their constant use they seem as though they must be ancient, but of course they are not.
Here's a quick listing of some interesting candidates, everyday words with a not-very-old lineage, their dates taken from first usages identified by the Oxford English Dictionary:
I always thought that the word "scientist" came to us from William Whewell in his Philosophy of the Inductive Sciences (volume 1, page 113) in 1840, the group of dedicated, intrepid seekers of standards and anomalies finally receiving a short and concise (if three-syllable) name for what it is they are (for even in death a scientist is still a scientist, no past tense there, like a Marine): once one, always one.
"We need very much a name to describe a cultivator of science in general. I should incline to call him a Scientist."
But I notice that an anonymous note in the Quarterly Review peeks its head under the tent, using the word six years earlier, in 1834, though not favorably:
"Science..loses all traces of unity. A curious illustration of this result may be observed in the want of any name by which we can designate the students of the knowledge of the material world collectively. We are informed that this difficulty was felt very oppressively by the members of the British Association for the Advancement of Science, at their meetings..in the last three summers... Philosophers was felt to be too wide and too lofty a term,..; savans was rather assuming,..; some ingenious gentleman proposed that, by analogy with artist, they might form scientist, and added that there could be no scruple in making free with this termination when we have such words as sciolist, economist, and atheist—but this was not generally palatable."--Quarterly Review, volume 51, page 59, via the Oxford English Dictionary
Later in the year a prettier offer emerges, though it has more to do with beauty than truth:
What brought me to the word "scientist" was an article/notice in Nature in 1873, where scientists are terribly put out by what the writer felt was a disparaging of the sciences--in short, a pissed-off mini-screed on the societal attack of ennui against the very idea of Science. The thing is, in the two paragraphs the word "scientist" doesn't come into play. So far as I can tell, this is the earliest such screed in this high-ish Victorian journal of proper and popular science. Nature began in 1869, and even though it was eight volumes and four years into publication, I'm surprised that it took this long for a note like this to appear. It isn't exactly a white-knuckled jumbleup venom-bomb from Dr. Hunter S. Thompson, but the anonymous writer does get the message across:
The War Production Board (WPB)--the entity responsible for printing the leaflets below--was instituted 16 January 1942 as a federal effort to direct U.S. wartime production and allocation of essential commodities. And time. The WPB centralized the effort to convert some peacetime industry to wartime production, as well as direct the production of existing wartime industries in the manufacture of critical goods, and rationed or prohibited the manufacture of other material that could possibly hinder the war effort. Thus the WPB rationed things like sugar, heating oil, plastic, gasoline, paper, rubber, nylon and metals of all descriptions; it also controlled large swaths of the workforce, restricting wages and benefits as well as prices. It was an essential element of fighting the war, and brought necessary control to an economy and industrial base that needed organization for what would become a total war.
Instituting a war-production base for American industry allowed production to ratchet way up, producing (for example) multiples more aircraft in 1943 than in 1941: 1940, 6k aircraft; 1941, 19k; 1942, 47k; 1943, 85k; 1944, 56k; 1945, 46k. Also there were enormous production advancements in total shipping tonnage (reaching a wartime production total of 33 million tons), across-the-board huge increases in coal and iron ore (and especially in crude oil), and on and on. The U.S. was able to mobilize itself and take advantage of its enormous natural resources, industrial base and civilian workforce in what was essentially an unreachable island economy, forming what was actually an unbeatable combination of war goods production. (Plus of course there was the atomic bomb, which is another story, but which also could only have been produced in the U.S., given the enormous quantities of energy and material needed to begin its production.)
There have been many Kings of the Hoboes, and Emperors of the Hoboes, in the history of American Hobodom. The most widely recognized of all this royalty is, probably, Mr. Jeff Davis, who was elected King of the Hoboes each year from 1908 to 1935--in that year, at the Pittsburgh meeting of the annual "Hoboes of America" convention, his minions gave up and elected him King for Life. Of hoboes, that is, the Knights of the Road. Davis was also a real hobo, unlike the pretenders and throne-seekers, who in general were not. (Nels Anderson, in his Men On The Move (1940), written just as the Depression was broken, observed: "Whatever else may be said of King Jeff, his romanticizing the hobo is not without a basis in reality, and his poetic interest in the species arose from experience. But King Jeff has placed on a pedestal a man who belongs to the past. The hobo belongs with the pre-Hollywood cowboy and the lumberjacks of the Paul Bunyan legends.")