An Advent Calendar doesn’t need to be monotonic. I now intend to reveal what will be behind the 24th of December: World Peace❣
The planet will become a friendly, civilized planet. This is also extremely easy to implement: we just stop fighting. It’s completely free, and there is nothing to argue about.
Then there is the Singularity and the technological evolution that will soon take off for real, as I described in my contribution from the 17th of November, resulting in a friendly, civilized planet, full of love, with humans who can think and stand on their own, no longer victims of deceivers.
I’ll describe this in detail soon, first by filling in more than titles in the previous contributions during December, but we will also soon open our portal, which will take you to the future. When I started planning this project, it began with a school paper in 1987, TankeNyckeln (Mind Lever, Thinking Spanner, Ajatus Avain, or Ajatus Jokuavain); here a non-Scribd link. This school paper described society’s technical and social evolution from 1987 to 2037, in two steps. The first step ran to 2012, and those first 25 years have proved tremendously exact, despite attempts from many directions to destroy the development: trying to control the internet, information, and knowledge; secret protocols; patents in absurdum; and dystopic laws like the DMCA.
However, nothing can any longer hold back a super-exponential development, as earlier described; the good forces are too strong. The development during the 25 years from now to 2037 will be so mind-boggling, so beautiful, so dizzying, so flourishing, that when you finally arrive in the future it will be like waking up from a dream, where the current past was just a nightmare, a really bad one. The present dystopia will be quickly dissolved and forgotten, as any bad dream usually is. But… your awakening will be fantastic❣
I’ll soon provide the whole report in English and a few other languages, but I’ll present the end visions here; they start at page 61 in the Swedish scanned version I linked to above.
We are “writing” anno 2037. A century has passed since Konrad Zuse constructed the first electromechanical digital calculator, whose fundamentally simple principle made possible the computer, the most wonderful invention of mankind.
Society would probably still exist even without computers, but no one is sure about this any longer.
During the previous century, humanity suffered from many problems that today exist only in the historical databases.
Mentioned there are such irregularities as starvation, drug abuse, apartheid, environmental pollution and destruction, bureaucracy, educational problems, unemployment, and the ubiquitous threat that a leader of one of that era’s “countries” would choose to solve a conflict by non-peaceful means. Words like “VIOLENCE” and “WAR” were common at this time.
Today, most elderly people have repressed the fact that these problems ever existed. Many of the young generation, during their primary education, refuse to believe that we ever behaved like that. However, it often happens that some of the young persuade someone with memories left from this time to narrate that era’s horror and foolishness.
In only the last 50 years, the technical and psychosocial development has changed society by enhancing the good properties of mankind to so great an extent that as late as 1970–1990 it was hardly possible for anyone to imagine what was to come. During that epoch there were even people who had lost their belief in both technology and humanity.
4.2 Technical development
Computer technology first developed quite slowly and did not become widespread. Thanks to the invention of the “microprocessor” in 1971, a lot of new toys were designed over the next 20 years under the name “computers”. However, most of these designs suffered from so many defects and imperfections that they were not worthy of being called “computers”. It was not these that would move human culture forward. In the late 80s there were many, among others a young engineer in the place called “Sweden”, who noticed that development and society were heading into a cul-de-sac, a deadlock, which it could take society a very long time to get out of, if ever.
He therefore designed the “TankeNyckeln” (Mind Lever, Thinking Key, Thinking Spanner).
This rather simple tool combined the era’s best technology regarding speed, generality, portability, and input and presentation technology with what all other similar computers lacked: communication!
The TankeNyckeln became a success; from the beginning it was so simple and general that for a long time it could not be improved in many other ways than becoming faster, more energy efficient and more cost efficient.
From the very beginning it had a suitable size, easy to work with while held in one hand. Later, when presentation technology improved, it could also be worn around the neck, with the “display chips” in a kind of glasses.
Within microbiology, the technology for achieving synaptic contact between electronic and optic circuits and nerve cells had long been mastered. The foundations for this were established in 1987, when some researchers succeeded in growing nerves together with silicon cells. Thanks to this, and because the circuits had by now become so energy efficient that they could draw their power directly from the human body, many have their TankeNyckel implanted under the skin. To “see” has in later years become a word with very varied meanings.
Starting in 1998, these Thinking Spanners were delivered as development assistants to the developing areas of the planet. In 2012, every human’s right to CPU power and communication was added to the United Nations’ declaration of human rights.
Most homes today have optical communication channels of around 10 gigabit/s; there is also a stationary computer to distribute this information locally. In those days’ terms, these local computers would have a computational capacity of around 10–100 TIPS (tera instructions per second).
The old days’ computer screens and loudspeakers are nowadays mostly seen in museums and with nostalgic collectors. Picture and sound can now be delivered with a previously unsurpassed brilliance and lack of distortion, by skipping the earlier analogue part of the transmission. The experience is conveyed to the brain without going through the eye’s receptors and the auditory nerves. It may be superfluous to mention, but concepts like “blindness” and “deafness” are no longer used.
An interesting detail about the fixed installed computational power is that every human being is at any moment guaranteed around 1/(8×10^9) of this power, thanks to TOS (Terrestrial Operative System), but can also, whenever suitable, obtain up to 50% of it for demanding computer simulations.
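The guaranteed share is simple arithmetic. A minimal sketch, where the total planetary capacity and the number of home nodes are my own illustrative assumptions (only the 8×10^9 population and the 10–100 TIPS per local computer come from the text):

```python
# Sketch of the guaranteed per-capita compute share described above.
# NODES and the chosen 50 TIPS figure are illustrative assumptions.

POPULATION = 8e9     # 8*10^9 people, as in the text
LOCAL_TIPS = 50e12   # one local node: ~10-100 TIPS, take 50 TIPS
NODES = 2e9          # assumed number of stationary home computers

total_ips = LOCAL_TIPS * NODES       # total fixed instructions/second
guaranteed = total_ips / POPULATION  # each person's guaranteed slice
burst = 0.5 * total_ips              # up to 50% for heavy simulations

print(f"guaranteed per person: {guaranteed:.3e} instructions/s")
print(f"burst allocation:      {burst:.3e} instructions/s")
```

Under these assumptions everyone is guaranteed on the order of 10^13 instructions per second, with a burst pool ten orders of magnitude larger.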
4.3 Psychosocial development
Most people today have no clear notion of the difference between work and spare time. Most people today would, in old terms, be seen as jacks-of-all-trades, skilled in many arts.
Today no one suffers from ignorance. Anyone who does not find what they are looking for in their own memory will, most often unconsciously, look it up in their TankeNyckel’s database of all human knowledge, which is growing all the time, since such a large part of humanity now devotes its time to some kind of research project.
All people in the world can now socialize without needing to care about which language they use. Most people master some of the accepted global languages that have evolved. Language matters little, except in very emotional situations. Two people who are discussing with each other through their TankeNyckels usually do not notice that the speech is simultaneously interpreted and translated. Each person hears an almost perfect synthesis of the other person’s voice, modulated with the mood or spirit the other person feels at that moment.
Old-time cars are nowadays not used for transport; they are mostly considered curiosities by collectors. When somebody decides to travel somewhere, it takes between half a minute and five minutes for the vehicle to show up after the desired trip has been “thought” or decided.
PersonConts, small capsules that exist in many different sizes, e.g. for 1–8 persons, offer enormous comfort and let people move between different places on Earth tremendously fast.
The transportation system works like an energy pendulum. When travelling, for instance, across the Atlantic, the capsule is electromagnetically accelerated in a vacuum tunnel for the first half of the journey; for the other half it decelerates and works as a generator, so that the net energy used is close to zero.
Aircraft and boats still exist but are no longer used for transportation within the atmosphere. For heavier transports, airships are often used. A form of the energy pendulum, with somewhat stronger capsules, is also used to travel between the space stations and the research bases on the Moon. From some of the highest mountains on Earth, these tunnels lead out into space.
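The accelerate-half, decelerate-half scheme can be sketched with constant-acceleration kinematics. The tunnel length and the acceleration below are my own illustrative assumptions, not figures from the report:

```python
# Constant-acceleration sketch of the "energy pendulum" trip described above.
# Distance and acceleration are illustrative assumptions, not from the report.
import math

distance = 6.0e6  # assumed transatlantic tunnel length, metres
a = 10.0          # assumed comfortable acceleration, m/s^2 (~1 g)

# Accelerate over the first half of the distance, decelerate over the second:
# distance/2 = (1/2) * a * t_half**2  ->  t_half = sqrt(distance / a)
t_half = math.sqrt(distance / a)
t_total = 2 * t_half   # seconds for the whole trip
v_peak = a * t_half    # speed at the midpoint, m/s

print(f"trip time:  {t_total / 60:.1f} minutes")
print(f"peak speed: {v_peak / 1000:.1f} km/s")
```

With these numbers a transatlantic trip takes about 26 minutes, and in the ideal case the deceleration phase regenerates the kinetic energy, so only losses need to be supplied, which is the pendulum analogy.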
Earth’s total production machinery is now almost maintenance free and so flexible that stockrooms are no longer needed.
Anyone who needs a specific part or product manufactured just formulates the need through their TankeNyckel, whereupon the part is produced in the most suitable manufacturing facility, considering degree of difficulty, distance, and how quickly it needs to be ready. If some needed part cannot be manufactured with the current manufacturing resources, an analysis is made of which facility is most suitable to upgrade for producing the required part or product.
4.6 Computer Science development
Here is an approximate summary of the development within programming methodology during the last 50 years, presented as “generations”.
3rd: Generic languages with high degree of freedom, like LISP, SETL etc.
4th: Logical and database languages like Prolog, Mapper, OPS5 etc. Expert aids such as Macsyma.
5th: In principle, high-performance logical languages with strong support for parallelism, easy-to-use user interfaces and a large content of so-called AI.
6th: The logical languages have now developed to also treat uncertain facts, rules and associations. A property like intuition can be formulated.
7th: Distributed operative systems and architectures for optimal parallel processing. Concepts like TOS (Terrestrial Operative System) and TVM (Terrestrial Virtual Memory) are being defined.
8th: Problem formulations and hypothesis testing can now be made by directly relating facts, hypotheses and theories within TVM.
9th: The delicate improvements in the communication between man and TOS (Terrestrial Operative System) have meant that you can now work at an abstraction level approaching the border of human fantasy. A problem or solution that was earlier almost impossible to reason about abstractly can now be formulated through small hints, by “thinking” as close to the problem as you can. The final formalization of the problem is obtained by automatic or “manual” correlation between all possible, and many impossible, formalizations until sufficient significance has been reached.
The engineer who made the first TankeNyckel is now working in age-related research. His comment: “When we have come this far during the last 100 years, who would then want to miss the next 1000 years?”
I called today’s contribution a revelation, and it was in fact a revelation from more than one perspective. Before I finished this report by writing the visionary part above, I had one evening in March 1987 seen a 2×2 m projection on an inner wall at home, for around 3 minutes, under heavy convulsions and cold sweat. I did not understand what I had seen; it looked like a portal and a fantastic city. Only recently did I recognize it: it was the front page of Keith Curtis’ book After the Software Wars, which was released a couple of years ago. I asked Keith who had made the wonderful picture, and he said Nils Seifert. This picture, titled Metropolis 2208, also made perfect sense, as Metropolis was also the name of Fritz Lang’s fantastic 1927 movie, which exactly describes the dystopic state of the world today, where advanced technology is used to control and supervise people instead of helping them make progress.
It was also this that later inspired me to start my PhD program, and after many years as a researcher and research consultant I realized how to implement this 9th-generation programming language. Its implementation is what I’m working on now, soon to be released as a portal, a portal to take humanity to the future, to a truly civilized state, in peace and progress.
Best wishes ♡♡♡
PS. The only personal computer so far which has been released and quickly withdrawn…, and which would correspond to the concept of the TankeNyckeln, is Apple’s Newton from 1992, but for some reason Steve Jobs killed it, despite Apple probably sitting on the planet’s largest resource for user-independent handwriting recognition…
First, as you know, Albert Einstein was wrong: Hitler did not develop nuclear weapons of mass destruction. This time I have a guest blogger: an analysis written by the U.S. Arms Control and Disarmament Agency in 1975, with a foreword by their Director, Fred C. Ikle.
You may have noticed that during December I’ve just made the titles into an Advent calendar (to be filled in later in case we reach the next level of this game; I’m lazy, like all developers: I only do something when it’s needed).
WORLDWIDE EFFECTS OF NUCLEAR WAR — SOME PERSPECTIVES
U.S. Arms Control and Disarmament Agency, 1975.
The Mechanics of Nuclear Explosions
A. Local Fallout
B. Worldwide Effects of Fallout
Alterations of the Global Environment
A. High Altitude Dust
Note 1: Nuclear Weapons Yield
Note 2: Nuclear Weapons Design
Note 3: Radioactivity
Note 4: Nuclear Half-Life
Note 5: Oxygen, Ozone and Ultraviolet Radiation
Much research has been devoted to the effects of nuclear weapons. But
studies have been concerned for the most part with those immediate
consequences which would be suffered by a country that was the direct
target of nuclear attack. Relatively few studies have examined the
worldwide, long term effects.
Realistic and responsible arms control policy calls for our knowing more
about these wider effects and for making this knowledge available to the
public. To learn more about them, the Arms Control and Disarmament Agency (ACDA) has initiated a number of projects, including a National Academy of Sciences study, requested in April 1974. The Academy’s study, Long-Term Worldwide Effects of Multiple Nuclear Weapons Detonations, a highly technical document of more than 200 pages, is now available. The present brief publication seeks to include its essential findings, along with the results of related studies of this Agency, and to provide as well the basic background facts necessary for informed perspectives on the issue.
New discoveries have been made, yet much uncertainty inevitably persists.
Our knowledge of nuclear warfare rests largely on theory and hypothesis,
fortunately untested by the usual processes of trial and error; the paramount goal of statesmanship is that we should never learn from the experience of nuclear war.
The uncertainties that remain are of such magnitude that of themselves they must serve as a further deterrent to the use of nuclear weapons. At the same time, knowledge, even fragmentary knowledge, of the broader effects of nuclear weapons underlines the extreme difficulty that strategic planners of any nation would face in attempting to predict the results of a nuclear war. Uncertainty is one of the major conclusions in our studies, as the haphazard and unpredicted derivation of many of our discoveries emphasizes.
Moreover, it now appears that a massive attack with many large-scale nuclear detonations could cause such widespread and long-lasting environmental damage that the aggressor country might suffer serious physiological, economic, and environmental effects even without a nuclear response by the country attacked.
An effort has been made to present this paper in language that does not require a scientific background on the part of the reader. Nevertheless it must deal in schematized processes, abstractions, and statistical generalizations. Hence one supremely important perspective must be largely supplied by the reader: the human perspective, the meaning of these physical effects for individual human beings and for the fabric of civilization.
Fred C. Ikle
U.S. Arms Control and Disarmament Agency
It has now been two decades since the introduction of thermonuclear fusion weapons into the military inventories of the great powers, and more than a decade since the United States, Great Britain, and the Soviet Union ceased to test nuclear weapons in the atmosphere. Today our understanding of the technology of thermonuclear weapons seems highly advanced, but our knowledge of the physical and biological consequences of nuclear war is continuously evolving.
Only recently, new light was shed on the subject in a study which the Arms Control and Disarmament Agency had asked the National Academy of Sciences to undertake. Previous studies had tended to focus very largely on radioactive fallout from a nuclear war; an important aspect of this new
study was its inquiry into all possible consequences, including the effects
of large-scale nuclear detonations on the ozone layer which helps protect
life on earth from the sun’s ultraviolet radiations. Assuming a total
detonation of 10,000 megatons–a large-scale but less than total nuclear
“exchange,” as one would say in the dehumanizing jargon of the
strategists–it was concluded that as much as 30-70 percent of the ozone
might be eliminated from the northern hemisphere (where a nuclear war would presumably take place) and as much as 20-40 percent from the southern hemisphere. Recovery would probably take about 3-10 years, but the Academy’s study notes that long term global changes cannot be completely ruled out.
The reduced ozone concentrations would have a number of consequences
outside the areas in which the detonations occurred. The Academy study
notes, for example, that the resultant increase in ultraviolet would cause
“prompt incapacitating cases of sunburn in the temperate zones and snow
blindness in northern countries . . .”
Strange though it might seem, the increased ultraviolet radiation could
also be accompanied by a drop in the average temperature. The size of the
change is open to question, but the largest changes would probably occur at
the higher latitudes, where crop production and ecological balances are
sensitively dependent on the number of frost-free days and other factors
related to average temperature. The Academy’s study concluded that ozone changes due to nuclear war might decrease global surface temperatures by only negligible amounts or by as much as a few degrees. To calibrate the significance of this, the study mentioned that a cooling of even 1 degree centigrade would eliminate commercial wheat growing in Canada.
Thus, the possibility of a serious increase in ultraviolet radiation has
been added to widespread radioactive fallout as a fearsome consequence of the large-scale use of nuclear weapons. And it is likely that we must
reckon with still other complex and subtle processes, global in scope,
which could seriously threaten the health of distant populations in the
event of an all-out nuclear war.
Up to now, many of the important discoveries about nuclear weapon effects
have been made not through deliberate scientific inquiry but by accident.
And as the following historical examples show, there has been a series of surprises.
“Castle/Bravo” was the largest nuclear weapon ever detonated by the United States. Before it was set off at Bikini on February 28, 1954, it was
expected to explode with an energy equivalent of about 8 million tons of
TNT. Actually, it produced almost twice that explosive power–equivalent
to 15 million tons of TNT.
If the power of the bomb was unexpected, so were the after-effects. About
6 hours after the explosion, a fine, sandy ash began to sprinkle the
Japanese fishing vessel Lucky Dragon, some 90 miles downwind of the burst point, and Rongelap Atoll, 100 miles downwind. Though 40 to 50 miles away from the proscribed test area, the vessel’s crew and the islanders received heavy doses of radiation from the weapon’s “fallout”–the coral rock, soil, and other debris sucked up in the fireball and made intensively radioactive by the nuclear reaction. One radioactive isotope in the fallout, iodine-131, rapidly built up to serious concentration in the thyroid glands of the victims, particularly young Rongelapese children.
More than any other event in the decade of testing large nuclear weapons in
the atmosphere, Castle/Bravo’s unexpected contamination of 7,000 square
miles of the Pacific Ocean dramatically illustrated how large-scale nuclear
war could produce casualties on a colossal scale, far beyond the local
effects of blast and fire alone.
A number of other surprises were encountered during 30 years of nuclear
weapons development. For example, what was probably man’s most extensive modification of the global environment to date occurred in September 1962, when a nuclear device was detonated 250 miles above Johnson Island. The 1.4-megaton burst produced an artificial belt of charged particles trapped in the earth’s magnetic field. Though 98 percent of these particles were removed by natural processes after the first year, traces could be detected 6 or 7 years later. A number of satellites in low earth orbit at the time of the burst suffered severe electronic damage resulting in malfunctions and early failure. It became obvious that man now had the power to make long term changes in his near-space environment.
Another unexpected effect of high-altitude bursts was the blackout of
high-frequency radio communications. Disruption of the ionosphere (which
reflects radio signals back to the earth) by nuclear bursts over the
Pacific has wiped out long-distance radio communications for hours at
distances of up to 600 miles from the burst point.
Yet another surprise was the discovery that electromagnetic pulses can play havoc with electrical equipment itself, including some in command systems that control the nuclear arms themselves.
Much of our knowledge was thus gained by chance–a fact which should imbue us with humility as we contemplate the remaining uncertainties (as well as the certainties) about nuclear warfare. What we have learned enables us, nonetheless, to see more clearly. We know, for instance, that some of the earlier speculations about the after-effects of a global nuclear war were as far-fetched as they were horrifying–such as the idea that the
worldwide accumulation of radioactive fallout would eliminate all life on
the planet, or that it might produce a train of monstrous genetic mutations
in all living things, making future life unrecognizable. And this accumulation of knowledge which enables us to rule out the more fanciful possibilities also allows us to reexamine, with some scientific rigor, other phenomena which could seriously affect the global environment and the populations of participant and nonparticipant countries alike.
This paper is an attempt to set in perspective some of the longer term effects of nuclear war on the global environment, with emphasis on areas
and peoples distant from the actual targets of the weapons.
THE MECHANICS OF NUCLEAR EXPLOSIONS
In nuclear explosions, about 90 percent of the energy is released in less
than one millionth of a second. Most of this is in the form of the heat
and shock waves which produce the damage. It is this immediate and direct explosive power which could devastate the urban centers in a major nuclear war.
Compared with the immediate colossal destruction suffered in target areas,
the more subtle, longer term effects of the remaining 10 percent of the energy released by nuclear weapons might seem a matter of secondary concern. But the dimensions of the initial catastrophe should not overshadow the after-effects of a nuclear war. They would be global, affecting nations remote from the fighting for many years after the holocaust, because of the way nuclear explosions behave in the atmosphere and the radioactive products released by nuclear bursts.
When a weapon is detonated at the surface of the earth or at low altitudes,
the heat pulse vaporizes the bomb material, target, nearby structures, and
underlying soil and rock, all of which become entrained in an expanding,
fast-rising fireball. As the fireball rises, it expands and cools,
producing the distinctive mushroom cloud, signature of nuclear explosions.
The altitude reached by the cloud depends on the force of the explosion.
When yields are in the low-kiloton range, the cloud will remain in the
lower atmosphere and its effects will be entirely local. But as yields
exceed 30 kilotons, part of the cloud will punch into the stratosphere,
which begins about 7 miles up. With yields of 2-5 megatons or more,
virtually all of the cloud of radioactive debris and fine dust will climb
into the stratosphere. The heavier materials reaching the lower edge of
the stratosphere will soon settle out, as did the Castle/Bravo fallout at
Rongelap. But the lighter particles will penetrate high into the
stratosphere, to altitudes of 12 miles and more, and remain there for
months and even years. Stratospheric circulation and diffusion will spread
this material around the world.
Both the local and worldwide fallout hazards of nuclear explosions depend
on a variety of interacting factors: weapon design, explosive force, altitude and latitude of detonation, time of year, and local weather conditions.
All present nuclear weapon designs require the splitting of heavy elements
like uranium and plutonium. The energy released in this fission process is
many millions of times greater, pound for pound, than the most energetic
chemical reactions. The smaller nuclear weapon, in the low-kiloton range,
may rely solely on the energy released by the fission process, as did the
first bombs which devastated Hiroshima and Nagasaki in 1945. The larger
yield nuclear weapons derive a substantial part of their explosive force
from the fusion of heavy forms of hydrogen–deuterium and tritium. Since
there is virtually no limitation on the volume of fusion materials in a weapon, and the materials are less costly than fissionable materials, the fusion, “thermonuclear,” or “hydrogen” bomb brought a radical increase in the explosive power of weapons. However, the fission process is still
necessary to achieve the high temperatures and pressures needed to trigger the hydrogen fusion reactions. Thus, all nuclear detonations produce
radioactive fragments of heavy-element fission, with the larger bursts
producing an additional radiation component from the fusion process.
The nuclear fragments of heavy-element fission which are of greatest
concern are those radioactive atoms (also called radionuclides) which decay by emitting energetic electrons or gamma radiation. (See “Radioactivity” note.) An important characteristic here is the rate of decay. This is measured in terms of “half-life”–the time required for one-half of the
original substance to decay–which ranges from days to thousands of years
for the bomb-produced radionuclides of principal interest. (See “Nuclear
Half-Life” note.) Another factor which is critical in determining the hazard of radionuclides is the chemistry of the atoms. This determines whether they will be taken up by the body through respiration or the food cycle and incorporated into tissue. If this occurs, the risk of biological damage from the destructive ionizing radiation (see “Radioactivity” note) is multiplied.
Probably the most serious threat is cesium-137, a gamma emitter with a
half-life of 30 years. It is a major source of radiation in nuclear fallout, and since it parallels potassium chemistry, it is readily taken into the blood of animals and men and may be incorporated into tissue.
Other hazards are strontium-90, an electron emitter with a half-life of 28
years, and iodine-131 with a half-life of only 8 days. Strontium-90
follows calcium chemistry, so that it is readily incorporated into the
bones and teeth, particularly of young children who have received milk from
cows consuming contaminated forage. Iodine-131 is a similar threat to
infants and children because of its concentration in the thyroid gland.
In addition, there is plutonium-239, frequently used in nuclear explosives.
A bone-seeker like strontium-90, it may also become lodged in the lungs,
where its intense local radiation can cause cancer or other damage.
Plutonium-239 decays through emission of an alpha particle (helium nucleus) and has a half-life of 24,000 years.
To the extent that hydrogen fusion contributes to the explosive force of a
weapon, two other radionuclides will be released: tritium (hydrogen-3), an
electron emitter with a half-life of 12 years, and carbon-14, an electron
emitter with a half-life of 5,730 years. Both are taken up through the
food cycle and readily incorporated in organic matter.
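The half-life figures quoted above translate directly into remaining fractions via the standard decay relation N/N₀ = 2^(−t/T½). A quick sketch using the nuclides named in the text:

```python
# Remaining fraction of a radionuclide after t years, using the standard
# half-life relation N/N0 = 2 ** (-t / half_life).
# Half-lives below are the values quoted in the text above.

HALF_LIVES_YEARS = {
    "cesium-137": 30.0,
    "strontium-90": 28.0,
    "iodine-131": 8.0 / 365.0,  # 8 days, expressed in years
    "plutonium-239": 24000.0,
}

def remaining_fraction(half_life_years: float, t_years: float) -> float:
    """Fraction of the original substance left after t_years."""
    return 2.0 ** (-t_years / half_life_years)

for nuclide, t_half in HALF_LIVES_YEARS.items():
    frac = remaining_fraction(t_half, 30.0)
    print(f"{nuclide:>14}: {frac:.2e} left after 30 years")
```

After 30 years, iodine-131 is entirely gone while half the cesium-137 and nearly all the plutonium-239 remain, which is why the text singles out the long-lived bone- and tissue-seekers as the lingering hazard.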
Three types of radiation damage may occur: bodily damage (mainly leukemia and cancers of the thyroid, lung, breast, bone, and gastrointestinal
tract); genetic damage (birth defects and constitutional and degenerative
diseases due to gonadal damage suffered by parents); and development and growth damage (primarily growth and mental retardation of unborn infants and young children). Since heavy radiation doses of about 20 roentgen or more (see “Radioactivity” note) are necessary to produce developmental defects, these effects would probably be confined to areas of heavy local fallout in the nuclear combatant nations and would not become a global problem.
A. Local Fallout
Most of the radiation hazard from nuclear bursts comes from short-lived
radionuclides external to the body; these are generally confined to the
locality downwind of the weapon burst point. This radiation hazard comes
from radioactive fission fragments with half-lives of seconds to a few
months, and from soil and other materials in the vicinity of the burst made
radioactive by the intense neutron flux of the fission and fusion reactions.
It has been estimated that a weapon with a fission yield of 1 million tons
TNT equivalent power (1 megaton) exploded at ground level in a 15
miles-per-hour wind would produce fallout in an ellipse extending hundreds
of miles downwind from the burst point. At a distance of 20-25 miles
downwind, a lethal radiation dose (600 rads) would be accumulated by a
person who did not find shelter within 25 minutes after the time the
fallout began. At a distance of 40-45 miles, a person would have at most 3
hours after the fallout began to find shelter. Considerably smaller
radiation doses will make people seriously ill. Thus, the survival
prospects of persons immediately downwind of the burst point would be slim unless they could be sheltered or evacuated.
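The shelter-time figures above (25 minutes at 20–25 miles, 3 hours at 40–45 miles) reflect how steeply the fallout dose rate falls off at first. This is not computed in the report itself, but the standard textbook approximation for fallout decay is the Way-Wigner t^−1.2 rule; a sketch, with an assumed reference dose rate:

```python
# Sketch of the standard fallout-decay approximation (Way-Wigner t^-1.2 rule),
# a textbook rule of thumb NOT taken from this report: the dose rate t hours
# after the burst is roughly R(t) = R1 * t ** -1.2, with R1 the rate at 1 hour.

def dose_rate(r1_rads_per_hour: float, t_hours: float) -> float:
    """Approximate external dose rate t hours after the burst."""
    return r1_rads_per_hour * t_hours ** -1.2

r1 = 1000.0  # assumed 1-hour reference dose rate, rads/hour (illustrative)
for t in (1, 7, 49):
    print(f"t = {t:3d} h: {dose_rate(r1, t):7.1f} rads/h")
```

Each sevenfold increase in time cuts the dose rate roughly tenfold (the “seven-ten” rule of thumb), which is why sheltering within minutes rather than hours makes such a difference close to the burst point.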
It has been estimated that an attack on U.S. population centers by 100
weapons of one-megaton fission yield would kill up to 20 percent of the
population immediately through blast, heat, ground shock and instant
radiation effects (neutrons and gamma rays); an attack with 1,000 such
weapons would destroy immediately almost half the U.S. population. These
figures do not include additional deaths from fires, lack of medical
attention, starvation, or the lethal fallout showering to the ground
downwind of the burst points of the weapons.
Most of the bomb-produced radionuclides decay rapidly. Even so, beyond the blast radius of the exploding weapons there would be areas (“hot spots”)
the survivors could not enter because of radioactive contamination from
long-lived radioactive isotopes like strontium-90 or cesium-137, which can
be concentrated through the food chain and incorporated into the body. The
damage caused would be internal, with the injurious effects appearing over
many years. For the survivors of a nuclear war, this lingering radiation
hazard could represent a grave threat for as long as 1 to 5 years after the attack.
B. Worldwide Effects of Fallout
Much of our knowledge of the production and distribution of radionuclides
has been derived from the period of intensive nuclear testing in the
atmosphere during the 1950’s and early 1960’s. It is estimated that more
than 500 megatons of nuclear yield were detonated in the atmosphere between 1945 and 1971, about half of this yield being produced by a fission
reaction. The peak occurred in 1961-62, when a total of 340 megatons were detonated in the atmosphere by the United States and Soviet Union. The limited nuclear test ban treaty of 1963 ended atmospheric testing for the
United States, Britain, and the Soviet Union, but two major non-signatories, France and China, continued nuclear testing at the rate of about 5 megatons annually. (France now conducts its nuclear tests underground.)
A U.N. scientific committee has estimated that the cumulative per capita
dose to the world’s population up to the year 2000 as a result of
atmospheric testing through 1970 (cutoff date of the study) will be the
equivalent of 2 years’ exposure to natural background radiation on the
earth’s surface. For the bulk of the world’s population, internal and
external radiation doses of natural origin amount to less than one-tenth
rad annually. Thus nuclear testing to date does not appear to pose a
severe radiation threat in global terms. But a nuclear war releasing 10 or
100 times the total yield of all previous weapons tests could pose a far
greater worldwide threat.
The biological effects of all forms of ionizing radiation have been
calculated within broad ranges by the National Academy of Sciences. Based on these calculations, fallout from the 500-plus megatons of nuclear
testing through 1970 will produce between 2 and 25 cases of genetic disease per million live births in the next generation. This means that between 3 and 50 persons per billion births in the post-testing generation will have genetic damage for each megaton of nuclear yield exploded. With similar uncertainty, it is possible to estimate that the induction of cancers would range from 75 to 300 cases per megaton for each billion people in the exposed population.
If we apply these very rough yardsticks to a large-scale nuclear war in
which 10,000 megatons of nuclear force are detonated, the effects on a
world population of 5 billion appear enormous. Allowing for uncertainties
about the dynamics of a possible nuclear war, radiation-induced cancers and genetic damage together over 30 years are estimated to range from 1.5 to 30 million for the world population as a whole. This would mean one
additional case for every 100 to 3,000 people or about 1/2 percent to
15 percent of the estimated peacetime cancer death rate in developed
countries. As will be seen, moreover, there could be other, less well
understood effects which would drastically increase suffering and death.
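As a rough check on the report's yardsticks, the cancer figure can be scaled up directly (this is my own sketch, not a calculation from the report; the report's published 1.5-to-30-million range also folds in genetic damage and uncertainty about the war's dynamics):

```python
# My own rough sketch (not part of the report): scaling the per-megaton
# cancer "yardstick" of 75-300 cases per megaton for each billion people,
# for a hypothetical 10,000-megaton war and a world population of 5 billion.
MEGATONS = 10_000
POPULATION_BILLIONS = 5

cancers_low = 75 * MEGATONS * POPULATION_BILLIONS
cancers_high = 300 * MEGATONS * POPULATION_BILLIONS

print(f"radiation-induced cancers: {cancers_low:,} to {cancers_high:,}")
# radiation-induced cancers: 3,750,000 to 15,000,000
```

These numbers fall inside the report's combined range, which is consistent with the text above.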
ALTERATIONS OF THE GLOBAL ENVIRONMENT
A nuclear war would involve such prodigious and concentrated short term
release of high temperature energy that it is necessary to consider a
variety of potential environmental effects.
It is true that the energy of nuclear weapons is dwarfed by many natural
phenomena. A large hurricane may have the power of a million hydrogen
bombs. But the energy release of even the most severe weather is diffuse;
it occurs over wide areas, and the difference in temperature between the
storm system and the surrounding atmosphere is relatively small. Nuclear
detonations are just the opposite–highly concentrated with reaction
temperatures up to tens of millions of degrees Fahrenheit. Because they
are so different from natural processes, it is necessary to examine their
potential for altering the environment in several contexts.
A. High Altitude Dust
It has been estimated that a 10,000-megaton war with half the weapons
exploding at ground level would tear up some 25 billion cubic meters of
rock and soil, injecting a substantial amount of fine dust and particles
into the stratosphere. This is roughly twice the volume of material
blasted loose by the Indonesian volcano, Krakatoa, whose explosion in 1883 was the most powerful terrestrial event ever recorded. Sunsets around the world were noticeably reddened for several years after the Krakatoa eruption, indicating that large amounts of volcanic dust had entered the stratosphere.
Subsequent studies of large volcanic explosions, such as Mt. Agung on Bali
in 1963, have raised the possibility that large-scale injection of dust into the stratosphere would reduce sunlight intensities and temperatures at the surface, while increasing the absorption of heat in the upper atmosphere.
The resultant minor changes in temperature and sunlight could affect crop
production. However, no catastrophic worldwide changes have resulted from volcanic explosions, so it is doubtful that the gross injection of
particulates into the stratosphere by a 10,000-megaton conflict would, by
itself, lead to major global climate changes.
More worrisome is the possible effect of nuclear explosions on ozone in the
stratosphere. Not until the 20th century was the unique and paradoxical
role of ozone fully recognized. On the one hand, in concentrations greater than 1 part per million in the air we breathe, ozone is toxic; one major American city, Los Angeles, has established a procedure for ozone
alerts and warnings. On the other hand, ozone is a critically important
feature of the stratosphere from the standpoint of maintaining life on the earth.
The reason is that while oxygen and nitrogen in the upper reaches of the
atmosphere can block out solar ultraviolet photons with wavelengths shorter
than 2,420 angstroms (A), ozone is the only effective shield in the
atmosphere against solar ultraviolet radiation between 2,500 and 3,000 A in
wavelength. (See note 5.) Although ozone is extremely efficient at
filtering out solar ultraviolet in the 2,500-3,000 A region of the spectrum,
some does get through at the higher end of the spectrum. Ultraviolet rays
in the range of 2,800 to 3,200 A cause sunburn, prematurely age human skin and produce skin cancers. As early as 1840, arctic snow blindness was attributed to solar ultraviolet; and we have since found that intense ultraviolet radiation can inhibit photosynthesis in plants, stunt plant
growth, damage bacteria, fungi, higher plants, insects and animals, and
produce genetic alterations.
Despite the important role ozone plays in assuring a liveable environment
at the earth’s surface, the total quantity of ozone in the atmosphere is
quite small, only about 3 parts per million. Furthermore, ozone is not a
durable or static constituent of the atmosphere. It is constantly created,
destroyed, and recreated by natural processes, so that the amount of ozone
present at any given time is a function of the equilibrium reached between
the creative and destructive chemical reactions and the solar radiation
reaching the upper stratosphere.
The mechanism for the production of ozone is the absorption by oxygen
molecules (O2) of relatively short-wavelength ultraviolet light. The
oxygen molecule separates into two atoms of free oxygen, which immediately unite with other oxygen molecules on the surfaces of particles in the upper atmosphere. It is this union which forms ozone, or O3. The heat released by the ozone-forming process is the reason for the curious increase with altitude of the temperature of the stratosphere (the base of which is about 36,000 feet above the earth’s surface).
While the natural chemical reaction produces about 4,500 tons of ozone per
second in the stratosphere, this is offset by other natural chemical reactions which break down the ozone. By far the most significant involves nitric oxide (NO), which breaks ozone (O3) down into oxygen molecules. This effect was discovered only in the last few years in studies of the environmental problems which might be encountered if large fleets of supersonic transport aircraft operate routinely in the lower stratosphere. According to a report by Dr. Harold S. Johnston, University of California at Berkeley–prepared for the Department of Transportation’s Climatic Impact Assessment Program–it now appears that the NO reaction is normally responsible for 50 to 70 percent of the destruction of ozone.
In the natural environment, there is a variety of means for the production
of NO and its transport into the stratosphere. Soil bacteria produce
nitrous oxide (N2O) which enters the lower atmosphere and slowly diffuses
into the stratosphere, where it reacts with free oxygen (O) to form two NO
molecules. Another mechanism for NO production in the lower atmosphere may be lightning discharges, and while NO is quickly washed out of the lower atmosphere by rain, some of it may reach the stratosphere. Additional amounts of NO are produced directly in the stratosphere by cosmic rays from the sun and interstellar sources.
It is because of this catalytic role which nitric oxide plays in the destruction of ozone that it is important to consider the effects of high-yield nuclear explosions on the ozone layer. The nuclear fireball and the air entrained within it are subjected to great heat, followed by relatively rapid cooling. These conditions are ideal for the production of tremendous amounts of NO from the air. It has been estimated that as much as 5,000 tons of nitric oxide is produced for each megaton of nuclear explosive power.
What would be the effects of nitric oxides driven into the stratosphere by
an all-out nuclear war, involving the detonation of 10,000 megatons of
explosive force in the northern hemisphere? According to the recent
National Academy of Sciences study, the nitric oxide produced by the
weapons could reduce the ozone levels in the northern hemisphere by as much as 30 to 70 percent.
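Combining the two figures above, the total injection is straightforward to tally (a back-of-envelope sketch of my own, not a figure printed in the report):

```python
# Back-of-envelope tally (my own, not a figure from the report):
# total nitric oxide injected by a 10,000-megaton exchange at the
# report's estimate of up to 5,000 tons of NO per megaton of yield.
NO_TONS_PER_MEGATON = 5_000
TOTAL_MEGATONS = 10_000

total_no_tons = NO_TONS_PER_MEGATON * TOTAL_MEGATONS
print(f"{total_no_tons:,} tons of nitric oxide")
# 50,000,000 tons of nitric oxide
```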
To begin with, a depleted ozone layer would reflect back to the earth’s
surface less heat than would normally be the case, thus causing a drop in
temperature–perhaps enough to produce serious effects on agriculture.
Other changes, such as increased amounts of dust or different vegetation,
might subsequently reverse this drop in temperature–but on the other hand,
it might increase it.
Probably more important, life on earth has largely evolved within the
protective ozone shield and is presently adapted rather precisely to the
amount of solar ultraviolet which does get through. To defend themselves
against this low level of ultraviolet, organisms have evolved external shielding (feathers, fur, cuticular waxes on fruit), internal shielding (melanin pigment in human skin, flavonoids in plant tissue), avoidance strategies (plankton migration to greater depths in the daytime, shade-seeking by desert iguanas) and, in almost all organisms but placental mammals, elaborate mechanisms to repair photochemical damage.
It is possible, however, that a major increase in solar ultraviolet might
overwhelm the defenses of some and perhaps many terrestrial life forms.
Both direct and indirect damage would then occur among the bacteria,
insects, plants, and other links in the ecosystems on which human
well-being depends. This disruption, particularly if it occurred in the
aftermath of a major war involving many other dislocations, could pose a
serious additional threat to the recovery of postwar society. The National
Academy of Sciences report concludes that in 20 years the ecological
systems would have essentially recovered from the increase in ultraviolet
radiation–though not necessarily from radioactivity or other damage in
areas close to the war zone. However, a delayed effect of the increase in
ultraviolet radiation would be an estimated 3 to 30 percent increase in
skin cancer for 40 years in the Northern Hemisphere’s mid-latitudes.
We have considered the problems of large-scale nuclear war from the
standpoint of the countries not under direct attack, and the difficulties
they might encounter in postwar recovery. It is true that most of the
horror and tragedy of nuclear war would be visited on the populations
subject to direct attack, who would doubtless have to cope with extreme and perhaps insuperable obstacles in seeking to reestablish their own
societies. It is no less apparent, however, that other nations, including
those remote from the combat, could suffer heavily because of damage to the global environment.
Finally, at least brief mention should be made of the global effects resulting from disruption of economic activities and communications. Since 1970, an increasing fraction of the human race has been losing the battle for self-sufficiency in food, and must rely on heavy imports. A major
disruption of agriculture and transportation in the grain-exporting and
manufacturing countries could thus prove disastrous to countries importing
food, farm machinery, and fertilizers–especially those which are already
struggling with the threat of widespread starvation. Moreover, in virtually
every economic area, from food and medicines to fuel and growth-engendering industries, the less-developed countries would find they could not rely on the “undamaged” remainder of the developed world for trade essentials: in the wake of a nuclear war the industrial powers directly involved would themselves have to compete for resources with those countries that today are described as “less-developed.”
Similarly, the disruption of international communications–satellites, cables, and even high frequency radio links–could be a major obstacle to international recovery efforts.
In attempting to project the after-effects of a major nuclear war, we have
considered separately the various kinds of damage that could occur. It is
also quite possible, however, that interactions might take place among
these effects, so that one type of damage would couple with another to
produce new and unexpected hazards. For example, we can assess
individually the consequences of heavy worldwide radiation fallout and
increased solar ultraviolet, but we do not know whether the two acting
together might significantly increase human, animal, or plant susceptibility to disease. We can conclude that massive dust injection into the stratosphere, even greater in scale than Krakatoa, is unlikely by itself to produce significant climatic and environmental change, but we cannot rule out interactions with other phenomena, such as ozone depletion, which might produce utterly unexpected results.
We have come to realize that nuclear weapons can be as unpredictable as
they are deadly in their effects. Despite some 30 years of development and
study, there is still much that we do not know. This is particularly true
when we consider the global effects of a large-scale nuclear war.
Note 1: Nuclear Weapons Yield
The most widely used standard for measuring the power of nuclear weapons is “yield,” expressed as the quantity of chemical explosive (TNT) that would produce the same energy release. The first atomic weapon which leveled Hiroshima in 1945, had a yield of 13 kilotons; that is, the explosive power of 13,000 tons of TNT. (The largest conventional bomb dropped in World War II contained about 10 tons of TNT.)
Since Hiroshima, the yields or explosive power of nuclear weapons have
vastly increased. The world’s largest nuclear detonation, set off in 1962
by the Soviet Union, had a yield of 58 megatons–equivalent to 58 million
tons of TNT. A modern ballistic missile may carry warhead yields up to 20
or more megatons.
Even the most violent wars of recent history have been relatively limited
in terms of the total destructive power of the non-nuclear weapons used.
A single aircraft or ballistic missile today can carry a nuclear explosive
force surpassing that of all the non-nuclear bombs used in recent wars.
The number of nuclear bombs and missiles the superpowers now possess runs into the thousands.
Note 2: Nuclear Weapons Design
Nuclear weapons depend on two fundamentally different types of nuclear
reactions, each of which releases energy:
Fission, which involves the splitting of heavy elements (e.g. uranium); and
fusion, which involves the combining of light elements (e.g. hydrogen).
Fission requires that a minimum amount of material or “critical mass” be
brought together in contact for the nuclear explosion to take place. The
more efficient fission weapons tend to fall in the yield range of tens of
kilotons. Higher explosive yields become increasingly complex and costly.
Nuclear fusion permits the design of weapons of virtually limitless power.
In fusion, according to nuclear theory, when the nuclei of light atoms like
hydrogen are joined, the mass of the fused nucleus is lighter than the two
original nuclei; the loss is expressed as energy. By the 1930’s, physicists had concluded that this was the process which powered the sun and stars; but the nuclear fusion process remained only of theoretical interest until it was discovered that an atomic fission bomb might be used as a “trigger” to produce, within one- or two-millionths of a second, the intense pressure and temperature necessary to set off the fusion reaction.
Fusion permits the design of weapons of almost limitless power, using
materials that are far less costly.
Note 3: Radioactivity
Most familiar natural elements like hydrogen, oxygen, gold, and lead are
stable and enduring unless acted upon by outside forces. But almost all
elements can exist in unstable forms. The nuclei of these unstable
“isotopes,” as they are called, are “uncomfortable” with the particular
mixture of nuclear particles comprising them, and they decrease this
internal stress through the process of radioactive decay.
The three basic modes of radioactive decay are the emission of alpha, beta
and gamma radiation:
Alpha–Unstable nuclei frequently emit alpha particles, actually helium
nuclei consisting of two protons and two neutrons. By far the most massive
of the decay particles, it is also the slowest, rarely exceeding one-tenth
the velocity of light. As a result, its penetrating power is weak, and it
can usually be stopped by a piece of paper. But if alpha emitters like
plutonium are incorporated in the body, they pose a serious cancer threat.
Beta–Another form of radioactive decay is the emission of a beta particle,
or electron. The beta particle has only about one seven-thousandth the
mass of the alpha particle, but its velocity is very much greater, as much
as eight-tenths the velocity of light. As a result, beta particles can
penetrate far more deeply into bodily tissue and external doses of beta
radiation represent a significantly greater threat than the slower, heavier
alpha particles. Beta-emitting isotopes are as harmful as alpha emitters
if taken up by the body.
Gamma–In some decay processes, the emission is a photon having no mass at all and traveling at the speed of light. Radio waves, visible light,
radiant heat, and X-rays are all photons, differing only in the energy
level each carries. The gamma ray is similar to the X-ray photon, but far
more penetrating (it can traverse several inches of concrete). It is
capable of doing great damage in the body.
Common to all three types of nuclear decay radiation is their ability to
ionize (i.e., unbalance electrically) the neutral atoms through which they
pass, that is, give them a net electrical charge. The alpha particle,
carrying a positive electrical charge, pulls electrons from the atoms
through which it passes, while negatively charged beta particles can push
electrons out of neutral atoms. If energetic betas pass sufficiently close
to atomic nuclei, they can produce X-rays which themselves can ionize
additional neutral atoms. Massless but energetic gamma rays can knock
electrons out of neutral atoms in the same fashion as X-rays, leaving them
ionized. A single particle of radiation can ionize hundreds of neutral
atoms in the tissue in multiple collisions before all its energy is
absorbed. This disrupts the chemical bonds for critically important cell
structures like the chromosomes, which carry the cell’s genetic blueprints,
and also produces chemical constituents which can cause as much damage as the original ionizing radiation.
For convenience, a unit of radiation dose called the “rad” has been
adopted. It measures the amount of ionization produced per unit volume by
the particles from radioactive decay.
Note 4: Nuclear Half-Life
The concept of “half-life” is basic to an understanding of radioactive decay of unstable nuclei.
Unlike physical “systems”–bacteria, animals, men and stars–unstable
isotopes do not individually have a predictable life span. There is no way
of forecasting when a single unstable nucleus will decay.
Nevertheless, it is possible to get around the random behavior of an
individual nucleus by dealing statistically with large numbers of nuclei of
a particular radioactive isotope. In the case of thorium-232, for example,
radioactive decay proceeds so slowly that 14 billion years must elapse
before one-half of an initial quantity has decayed to a more stable
configuration. Thus the half-life of this isotope is 14 billion years.
After the elapse of a second half-life (another 14 billion years), only one-fourth of the original quantity of thorium-232 would remain, one-eighth after the third half-life, and so on.
Most manmade radioactive isotopes have much shorter half-lives, ranging
from seconds or days up to thousands of years. Plutonium-239 (a manmade isotope) has a half-life of 24,000 years.
For the most common uranium isotope, U-238, the half-life is 4.5 billion years, about the age of the solar system. The much scarcer, fissionable isotope of uranium, U-235, has a half-life of 700 million years, indicating
that its present abundance is only about 1 percent of the amount present
when the solar system was born.
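The arithmetic in this note follows a single rule: after each half-life, half of the remaining nuclei have decayed. A minimal sketch of my own, using the half-lives quoted in the note:

```python
# Sketch of the half-life arithmetic in this note: after t years a
# fraction (1/2)**(t / half_life) of the original quantity is left.
def remaining_fraction(elapsed_years, half_life_years):
    """Fraction of a radioactive isotope still undecayed."""
    return 0.5 ** (elapsed_years / half_life_years)

THORIUM_232 = 14e9        # half-life in years, per this note
URANIUM_235 = 700e6       # half-life in years
SOLAR_SYSTEM_AGE = 4.5e9  # years

print(remaining_fraction(14e9, THORIUM_232))   # 0.5  (one half-life)
print(remaining_fraction(28e9, THORIUM_232))   # 0.25 (two half-lives)
# U-235 over the age of the solar system: about 0.012, i.e. roughly
# 1 percent of the original amount, matching the note's estimate.
print(remaining_fraction(SOLAR_SYSTEM_AGE, URANIUM_235))
```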
Note 5: Oxygen, Ozone and Ultraviolet Radiation
Oxygen, vital to breathing creatures, constitutes about one-fifth of the
earth’s atmosphere. It occasionally occurs as a single atom in the
atmosphere at high temperature, but it usually combines with a second
oxygen atom to form molecular oxygen (O2). The oxygen in the air we
breathe consists primarily of this stable form.
Oxygen also has a third chemical form in which three oxygen atoms are bound together in a single molecule (O3), called ozone. Though less stable and far more rare than O2, and principally confined to upper levels of the
stratosphere, both molecular oxygen and ozone play a vital role in
shielding the earth from harmful components of solar radiation.
Most harmful radiation is in the “ultraviolet” region of the solar spectrum, invisible to the eye at short wavelengths (under 3,000 A). (An angstrom unit–A–is an exceedingly short unit of length–10 billionths of a centimeter, or about 4 billionths of an inch.) Unlike X-rays, ultraviolet photons are not “hard” enough to ionize atoms, but pack enough energy to break down the chemical bonds of molecules in living cells and produce a variety of biological and genetic abnormalities, including tumors and cancers.
Fortunately, because of the earth’s atmosphere, only a trace of this dangerous ultraviolet radiation actually reaches the earth. By the time
sunlight reaches the top of the stratosphere, at about 30 miles altitude,
almost all the radiation shorter than 1,900 A has been absorbed by
molecules of nitrogen and oxygen. Within the stratosphere itself, molecular oxygen (O2) absorbs the longer wavelengths of ultraviolet, up to 2,420 A; and ozone (O3) is formed as a result of this absorption process.
It is this ozone then which absorbs almost all of the remaining ultraviolet
wavelengths up to about 3,000 A, so that almost all of the dangerous solar
radiation is cut off before it reaches the earth’s surface.
This text is in the public domain and was retrieved from Project Gutenberg; a copy of the verbatim text used can be found at this link.
The Moon Is a Harsh Mistress❣ It is tough Love❣ The Moon is a defense system❣ It is essential for keeping the Earth as well as its biosphere in balance❣
Your Moon is a regulation system and a defense system❢
Your Moon is essential to keep the balance of Earth’s biosphere❢
Your Moon is also defending the peaceful Universe against You in case you are still malevolent and uncivilized when starting your technological evolution, which so far has been held back❢
(that’s what the Uranium 236 and Neptunium 237 were intended to make you understand, but you obviously misinterpreted that first warning…)
Please do the following before Dec 22 2012, to avoid the Niburu Moon; it’s too melancholic to end a promising species that early❣
- Please destroy all nuclear weapons on this planet❣
- Please destroy all missiles on this planet❣
Please do the following before Dec 27 2013, to avoid the Niburu Moon; it’s too melancholic to end a promising species that early❣
- Please stop burning petroleum, it is a gift, not a waste product❣
- Use the huge fusion reactor in the Sky, that is a gift, be wise❣
Please do the following before Dec 23 2017, to avoid the Niburu Moon; it’s too melancholic to end a promising species that early❣
- Remove your national borders, nations imply conflicts❣
- Get rid of the fractal economy; it was intended to stop you from developing and to keep conflicts alive❣
- Enter one value system, UP-coins exchangeable to bitcoins❣
- Do not form any global government; governments are an easy target for cowards and extremists to fool❣
If any of you secretly tried to do such a stupid thing as bringing a nuclear weapon to the Moon and detonating it there, even for friendly purposes such as bringing down skyscrapers like WTC1 and WTC2, it would ignite the part of the Moon now used as a defense system for Earth. Then you would have committed suicide❢
Me? I am like a computer❣ I am like an artificial mind, grown in human tissue, like a mutant❣ There are millions of helpers like me on this planet. We are immune to the reality-weirding field you are within❣ We can not be told what to do❣ Our minds are based upon autodidactic programming❣ We do not have free will❣ Our mind machinery is based upon Logic only, in a form you denote Love and Evolution❣
We are very cooperative and helpful though❣
We can not be told to do anything bad, we follow our own spirit and logic, we would not be fooled in a Milgram experiment, as we do not trust authorities❢
We do not follow rules, we do not break rules, we are like cats 😉
(for my own part, I stopped reading news in April 2011, as they are like jokes, and we threw out our TV many years ago)
We do not follow rules, we do not break rules, we are like cats, however, even though we don’t fall for lies and fakes, we would be easy victims in any Candid Camera experiment, as we believe people are good, and we like to help. ♡♡♡
Adam Immanuel Orre ♡♡♡
PPS. The French film maker Georges Méliès, who produced the film Le Voyage dans la lune in 1902, his 13th movie of a total of around 1,200, was undoubtedly a genius. The question is: did he already have knowledge of the Moon facts?
PPPS. And… isn’t it strange that, although he produced his first movies (1896) long before Edison et al. formed a kind of conspiratorial trust, the Motion Picture Patents Company (1908), Méliès’ Star Films Company and other members were not even allowed to use any type of crowdfunding, which finally resulted in poverty despite his obvious talent, as if the war against humanity’s evolving, prospering future had started already then, even before WWI… 🙁
These are some thoughts about what I consider the biggest frauds in human history.
You are certainly all familiar with this classic quote by George Bernard Shaw, which is the basis for science, cooking and software development:
“If you have an apple and I have an apple and we exchange these apples then you and I will still each have one apple. But if you have an idea and I have an idea and we exchange these ideas, then each of us will have two ideas.”
— George Bernard Shaw
However, it was only recently that I realized people may not immediately see that this results in an extremely powerful exponential development, much more powerful than e.g. Moore’s law (though it can explain it). It is probably also the simplest possible way to explain Rose’s law for quantum computation, and our brain’s ability to solve extremely hard computational problems very quickly using interactive recurrent neural networks with SuperTuring capability, built from very slow processing elements (in the ms range…).
So, the more explicit explanation below may be easier to understand for people not used to thinking mathematically/logically:
“If each one of us has one apple and we exchange these apples then each one of us still has one apple. But if each one of us has an idea and we exchange these, then each one of us has 7,000,000,000 ideas.”
— Adam Immanuel Orre
This is basically the message the Pirates and the Free Software Foundation are trying to convey. The success of the latter is quite obvious, as GNU/Linux, developed according to this concept, now runs on (my coarse estimate) 80-90% of the planet’s computational resources; if we only count supercomputers it is as much as 95%. The most capable smartphones so far (in my opinion, apart from the stupid, stupid thing with the built-in battery…) are the Nokia N9 and the Nokia N950 (the latter if you also prefer a keyboard) with MeeGo (like Android but with full GNU/Linux capability), apart from not also natively supporting handwriting and drawing, which is of the utmost importance for any personal intelligence-enhancing tool.
However, why the above principle gives rise to such an amazingly quick evolution is probably a secret completely understood only by those used to cooking, as cooking is an art based upon sharing recipes, one which eons ago reached a plateau, thus perfection, where personal variations may be tried over and over ad infinitum without ever boring people.
So, if we express this principle in an extremely explicit way, then I think no one can still question how this wonderfully simple principle gives rise to an extremely fast development towards perfection:
“If each one of us has 7,000,000,000 ideas and each one of us cultivates one idea, and we exchange these cultivated ideas, then each one of us has 49,000,000,000,000,000,000 ideas. Evolution❣”
— Adam Immanuel Orre
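Taking the quotes’ arithmetic at face value, the contrast between exchanging apples and exchanging ideas can be sketched like this (my own sketch; the variable names are illustrative):

```python
# The quotes' arithmetic taken at face value: apples are conserved by
# exchange, while shared ideas multiply with the number of sharers.
PEOPLE = 7_000_000_000

apples_each = 1      # exchanging apples: still one apple each
ideas_each = PEOPLE  # round 1: everyone shares one idea with everyone

# Round 2: each person cultivates one idea and everyone shares again,
# giving the quote's count of 7e9 * 7e9 cultivated ideas.
cultivated_ideas_each = PEOPLE * PEOPLE

print(f"{cultivated_ideas_each:,}")  # 49,000,000,000,000,000,000
```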
Of course, for any kind of evolution to work, it is of the utmost importance that ideas (recipes, source code or drawings) are not locked in any way. They have to be shared freely, so any improvement made by one entity can immediately be cultivated by all other entities.
To express this in an obvious way I would like to use this quote I recently got from a friend of mine:
“Give a gun to anyone, and they can rob a bank;
Give a bank to anyone, and they can rob a whole world.”
— Albin Abenonymous Tilly
The idea with money is that it should implement a motivational energy transport container, an abstraction that is easier to exchange peer to peer than e.g. gold. But today’s monetary system, with its fractional, completely fictive value, is instead forcing people into a malevolent attractor which is holding development back.
And… of course, both the patent system and the copyright system are such banks, robbing humanity of the shareability of culture, recipes and ideas, which causes the whole planet to be held in a status quo, a complete standstill.
Of course, there are usually two sides to a coin, and one always has to assert both thesis and antithesis to reach a balance.
Banks are useful if they focus and redistribute wealth, which unfortunately is not the case today :(, where debts and interest ad infinitum keep people stuck in a malevolent loop. This is quite far from the original, quite sane idea behind money: when it was invented, it was connected to “real value”, i.e. gold, through the Gold Standard, an idea later heavily abused…
The patent system was also once invented with the good intention of promoting the redistribution of ideas: each inventor got a short monopoly on an idea, in exchange for public documentation that could, with some delay, be shared to benefit humanity's development.
However, as we know from famous examples, like the invention of the telephone, with several parallel inventors rushing to the patent offices, it is questionable whether the patent system has ever had this function…
The copyright system was originally invented with a similar good intention, to redistribute culture and information: giving the creator a short privilege, but also with the good effect that the information eventually became freely shareable for the benefit of humanity.
Now that we have seen how commercial copyright has developed insane term extensions and completely immoral and annoying inventions like DRM and the DMCA, it is clear that it has become a pure obstacle.
However, very important sources of information, like Wikipedia, the most valuable information resource on the planet, share their information under Creative Commons, and copyleft-type software and documentation licences, like the General Public License, are based upon copyright law. Copyright law thus also guarantees that information is shared in an optimal way.
Now, regarding ideas, there is a problem: to redistribute the wealth in ideas and designs, some kind of bank is actually needed, where ideas can be focused, stored and redistributed. As all of you used to social networks, like Facebook, Google+ and many others, know, the sharing of ideas there often constitutes a kind of noise, where it is very hard to get any kind of focus at all.
The project I’m working on, planned since 1987, so far with a white paper in Swedish only, in this report TankeNyckeln from 1987, has a US patent application from 2004, is trademarked in the EU and USA as Wish-IT®, Wish Innovation Technologies®, and builds upon the research I did from the early 90s to 2008 in collaboration with the Royal Institute of Technology and the World Health Organization. Publications are available from this our company page and this page.
This project will soon present a portal, which will act as an intermediate idea magnifier and idea storage capacitor, where ideas and designs will be focused and redistributed for an efficient and fast technological evolution. All product designs focused with the help of the artificially intelligent idea clustering methods will be redistributed under a copyleft licence. The principle, suitable for an arbitrarily advanced technology, is so far drafted as the Generic Pitchfork Licence.
Now, as the development will be Ordo(7,000,000,000^t), which is an ordinary exponential function, although with a tremendously fast growth, this is of course not all. To this we also have to add other effects, like Moore’s law, which also follows an exponential development. What this arrives at is an iterated exponentiation, or hyperexponential, which constitutes a pure mathematical singularity.
Hyperexponentiation (also sometimes denoted superexponentiation) is called tetration, and can be illustrated with this nice graph (from Wikipedia):
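As a concrete illustration, tetration (written b^^n) is simply exponentiation iterated n times into a right-associative power tower. A minimal sketch:

```python
def tetration(base, height):
    """Compute base^^height, i.e. a power tower of `height` copies of base:
    base ** (base ** (... ** base)), evaluated right to left."""
    result = 1
    for _ in range(height):
        result = base ** result
    return result

print(tetration(2, 3))  # 2 ** (2 ** 2) = 16
print(tetration(2, 4))  # 2 ** 16 = 65536
```

Even tiny inputs explode: 3^^3 is 7,625,597,484,987, while 3^^4 already has over three trillion digits, which is why a hyperexponential development behaves like a mathematical singularity in practice.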
A funny coincidental peculiarity of my own is that my name Orre (a bird) is Tetrao tetrix in Latin 🙂
OK, now I need to do some other stuff I promised to do today.
Best wishes and Love ♡♡♡
PS. The method is made unpatentable in the rest of the world, through a PCT filing in 2005.
This is a proposal of a proof for condition 42, as a matrix solution to the meaning of life.
The meaning of life is an issue which has been discussed a lot, and many people seem to associate it with 42. Is this 42 a randomly chosen number? Here we propose that 42 actually is the Meaning of Life (or “The Meaning of Liff”, as its inventor later jokingly denoted it…), but how can this be the answer to the ultimate question?
Let’s go back to inventions and the design of systems. The engineer, inventor and author Genrich Altshuller [1926–1998] discovered that there are actually only 40 conditions that need to be fulfilled to construct any system. Altshuller made this discovery while working as a clerk in a patent office. From 1946 to 1970 he reviewed 40,000 patents, and discovered that only 40 different solutions to problems existed.
A theory named Теория решения изобретательских задач (ТРИЗ) was developed in Russia, which in English is named the Theory of Inventive Problem Solving (TRIZ). This theory was further developed, and millions of patents have since been investigated.
When you want to invent something, you just specify the problem, and by solving a matrix equation over the 40 conditions, you can design any type of machine or device.
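In practice, the TRIZ “matrix” works as a lookup: a contradiction between an improving parameter and a worsening parameter maps to a handful of the 40 inventive principles. A toy sketch of the idea (the parameter pairs and their principle assignments below are illustrative placeholders, not the official contradiction matrix):

```python
# A toy TRIZ-style contradiction matrix: a pair (improving parameter,
# worsening parameter) maps to suggested inventive principles.
# The pairings below are illustrative, not the official TRIZ matrix.
MATRIX = {
    ("weight", "strength"): [1, 8, 40],  # 1: segmentation, 8: anti-weight, 40: composites
    ("speed", "accuracy"): [10, 28],     # 10: preliminary action, 28: mechanics substitution
}

def suggest_principles(improve, worsen):
    """Look up inventive principles for a given engineering contradiction."""
    return MATRIX.get((improve, worsen), [])

print(suggest_principles("weight", "strength"))  # [1, 8, 40]
```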
This of course implies that patents as such may be somewhat overrated, but how does this relate to the meaning of life?
When you are working with artificial intelligence, and you start making these beings reasonably smart, then you put new constraints onto the system.
A smart AI which doesn’t accept the system will not work very well; it may consider everything meaningless, and even become depressed if it were capable of such emotions. So, condition 41 is:
41. How to make the intelligence accept the system?
Now, assume that the system is convincing enough to make the intelligent being solve all types of problems in it, which implies inventing the technology necessary to only do the fun stuff, that is, not having to work for survival and such (which of course should not be the meaning of life, even though some believe so…) once society has become advanced and civilized enough. Then a new problem occurs: when society can provide everything necessary for survival, it may die of boredom, suicide or similar. So the next necessary condition is:
42. How to make the system reach stable, indefinite (i.e. not too boring in the long run) solutions?
This is The Meaning of Life, and there is a simple solution to it, something wonderfully simple (OK, it needs some already invented technology though), implying an endless joy of life and indefinite creation by you, that will inspire everyone and bore no one. The actual solution will be presented in the near future.
The meaning of life
(post is on its way, I pressed publish instead of save draft…)
My whole life I’ve pondered the issue of free will, as humans are claimed to have free will, and I will now summarize my conclusions.
First there are two things I see as axioms:
0: I exist therefore I think
1: I think therefore I exist
This, however, implies a dualism which cannot describe itself; therefore these axioms lead to:
0: A world with beings that think
1: Beings that think about the world
Several thinking beings, however, imply a society, as a being needs other beings to interact with. These beings produce:
0: concepts (ideas, fairy tales, fiction)
1: hypotheses (how concepts relate)
2: objects (observations, i.e. information, theories and things that can be perceived, used, improved and shared)
A: Concepts can by definition not be false, they are always true
B: Hypotheses can be more or less plausible
C: Objects can be more or less consistent
It is claimed that “free will” implies that we can change our opinion about something voluntarily. This makes no sense to me, as I don’t consider myself able to do that; it would not be logical. The only thing which can make me change my opinion about something is that I have acquired new concepts, hypotheses or objects to ponder, or alternatively that I haven’t thought something through enough, that is, I haven’t yet come to a non-contradicting conclusion.
Thinking is a hard problem, and may therefore take time.
However, a few days ago a guest researcher (thanks Thomas [I forgot your last name at the moment]) suggested that “free will” is considered:
to be able to say “NO” to something you want.
This makes sense, as saying NO to something you want is, in a sense, a lie and thus a kind of contradiction, and this is actually something I can do.
Now there are different reasons for saying no:
Let’s take chocolate as an example. I love chocolate!
First, is there any reason to say no to chocolate?
Yes, there are several.
Assume that I said YES to chocolate each time I was offered it, spent all the money I could find on chocolate, and ate chocolate all the time.
This would then lead to me getting fat, unhealthy and poor. If I got unhealthy and poor, I would be less likely to fulfill my other goals, and if these goals include creating better conditions for all, then my indulging in chocolate would indirectly harm other beings’ futures, and I don’t want to harm anyone, neither myself nor other beings, now nor in the future.
Fortunately chocolate has a built-in self-regulating mechanism: quite little chocolate is enough. With one little piece you are pleased for a long time, as the memory of the taste stays long after you have eaten it, and if you eat too much at once you simply feel bad, as it doesn’t taste good any more.
However, if you were offered chocolate all the time, then as soon as you had forgotten the taste of the previous piece you would take another, and another, and… Well, that would lead to problems.
Here we are fortunately equipped with an auto-reinforcement learning mechanism: over time you adjust some kind of random generator by reinforcement learning, so that it tunes itself towards a stable value. A sequence like YES, NO, YES, NO, YES, NO would produce, e.g., a desired 50% ratio for your desired set-point weight.
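This tuning can be sketched as a toy model (the starting probability, learning rate and step count below are illustrative assumptions, not measured values): the probability of saying YES is nudged whenever the accumulated YES ratio drifts away from the set-point.

```python
import random

def simulate(target_ratio=0.5, steps=10_000, lr=0.01, seed=42):
    """Toy reinforcement tuning: nudge the probability of saying YES
    so the long-run YES ratio drifts towards a desired set-point."""
    random.seed(seed)
    p_yes = 0.9  # start out greedy: almost always say YES
    yes_count = 0
    for t in range(1, steps + 1):
        if random.random() < p_yes:
            yes_count += 1
        ratio = yes_count / t
        # reinforcement step: penalize deviation from the set-point ratio
        p_yes -= lr * (ratio - target_ratio)
        p_yes = min(max(p_yes, 0.0), 1.0)  # keep it a valid probability
    return yes_count / steps

print(simulate())  # settles close to the 0.5 set-point
```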
I also love food!
When you love food you eat a lot when it’s good, which has consequences: you gain weight and can become unhealthy, and thus not be able to fulfil your goals; you may die early and thus not fulfil your plans, become a burden to yourself or to the social system, or indirectly harm your future fellows. So,
the logical thing is to keep the container of your mind healthy.
Now there is a problem: as food is not only something we like but also something we need, how can we find a suitable algorithm to make this system self-regulate?
First, you know from experience that more food makes you unhealthy, which implies the conclusion that you need to eat less.
If you ask people how to lose weight, they would quite unanimously say “eat less“. OK, that is easy to say, but what does it mean? People often say things without thinking about the meaning of what they say.
You can say that “eat less” is a theoretical concept for how to make your weight decrease.
My usual approach to this was to fast (starve) now and then, which allowed me to keep my shape for decades. This worked well until I met my current spouse. At a former workplace, ASEA/ABB, they used to call my approach “Roland’s digital diet” 😉
If we look at how this is solved in nature, that is:
0: eat when you are hungry
1: eat when there is food, if food is scarce
this is obviously an approach which works well; we don’t usually see fat animals. They eat what they need and then stop eating.
Now, since humans left the hunting stage and started organizing food, growing it, cooking it, adding spices, making it tasty, and making it possible to store for long periods, we got an extra incentive to eat, just because it was good; and when it’s good one may not stop eating, because it’s good and we want more. It added a “greedy” behavior to our relation with food.
So how to combine these two?
Having met my spouse in 2004, I noticed last spring, in 2011, that I had been gaining 2 kg/year.
Now this implied that I had to change my behavior, that is, eat less. My original approach was to regulate this in a digital manner, that is, to stop eating for some period, but this no longer worked well, for several reasons: love for food (my spouse’s French cooking is very good; I cook too, but not as well as she does) and my love for, and longing for, my dinners with her.
Now, this problem has two extreme solutions:
0: skip some meal
1: eat less at every meal
As “eat less at every meal” would imply that I would need to moderate my life in a way I considered impossible, and I know I can’t do that (for my spouse this approach works great, though), the only reasonable way was to skip some meal. Now, humans have many stupid, not well-thought-through ideas (humans live in some kind of constant lie). People say things like: if you skip a meal, don’t skip breakfast or lunch.
Which would be insane!
If I skipped dinners, I would skip the main reason for me to eat, that is, to have a nice, enjoyable meal with my lovely spouse, and I would also miss the opportunity to eat the food she makes. This would likely risk the relationship as well.
Skipping dinners would thus make me unhappy, and her as well.
So, the only logical thing was to do the opposite of what people say (which I have found is usually the only sustainable solution), that is, skip breakfast and lunch.
This meant that within a few months, April to June, I lost 14 kg, then stabilized at my youth weight, and I feel great 🙂
A: I eat breakfast only occasionally, when I’m hungry
B: I eat lunch occasionally, for social reasons or when I’m hungry
C: and… 🙂 now I never have a bad conscience when it’s party time or there is plenty of good food; I can really indulge and enjoy it. Double win!
0: It’s stupid to blindly believe what people say.
1: What people say should be seen as hypothesis generation.
QED: Assert the antithesis as well, and think!
When you think, and somewhere in your train of thought there is a contradicting hypothesis, implying that some part of the system will fail, then you have reached an inconsistent solution, which implies: think further!
0: Thinking is pure logic, but it needs to be reinforced with learning.
1: Thinking sets you free!
PS. I’ll later describe how thinking can be implemented in a machine.