Wednesday, November 27, 2019

Roots Of Individualism In Europe Essays - Christian Philosophy

During the Middle Ages, independent thought was viewed disdainfully. Almost any idea deviating from the status quo, largely determined by the Roman Catholic Church, was condemned as heresy. One convicted of such a grievous offense was often excommunicated or killed, either by means of a proper execution or by a hostile mob. However, with the decline of the Middle Ages, the conditions arose for the birth of individualism, the development of which can be traced through the Renaissance, the Reformation, and the rise of capitalism. Individualism was a radical ideological revolution that forever altered the face of Europe and the rest of the world. The beginning of individualism's gradual evolution was first manifested in the Renaissance era. The Renaissance was a time ripe for change. The weakening role of the Catholic Church led to an increase in power for the masses. Corruption plagued Church officials, and many sought theological respite elsewhere. The reemphasis of ancient Greek and Roman texts proffered alternatives for many to satisfy their religious needs. This helped contribute to the abolishment of the Church's imposition of its absolute truth and its claim to ultimate authority. As the Church lost power, so did the political units. The bonds between church and state began to erode. Feudalism declined, hence giving rise to new political opportunity. The noble class no longer held a monopoly on the valued positions in society. Rather, one was able to pursue wealth and fame through various endeavors ranging from artist to soldier. The most empowering change of this era was the dominance of a secular attitude and the decline of church absolutism. This secular viewpoint altered man's reason for existence from an otherworldly quest to an intimate, immediate appreciation for that which exists on earth.
Humanism is a primary source of individualism. Pico della Mirandola's Oration on the Dignity of Man captures the essence of the humanist movement. He writes that God gave man the ability to make of himself what he wills. Although man is capable of depraved acts, he also possesses the profundity to distinguish him as a holy being. Pico praises the goodness of mankind when he writes that man "is rightly called and judged a great miracle and a wonderful creature indeed." This Renaissance perspective varies from the idea it replaced, which held man to be an intrinsically evil being. Pico's oration, representative of the Renaissance itself, placed a higher importance on mankind, hence endowing members of society with a sense of pride rather than shame in their humanity. No longer did the Church determine piety and greatness; it was the common man who was now able to make this measurement. This represents a drastic step towards individualism. The Reformation was an epoch that increased the rights and power of the individual. As a reaction against rampant church corruption, Martin Luther publicized complaints against the Church. Luther's criticism sparked a revolution that resulted in the formation of several new religions: Lutheranism, Calvinism, Anabaptism, and Protestantism. These new faiths undermined the Church's claim to an absolute truth because each religion claimed an absolute truth of its own, separate from its counterparts. One now had the option to freely choose his or her faith rather than accept beliefs that were forced upon him. Also, theology adapted from one dictatorial faith to a variety that better suited society and its members. The people, rather than an establishment, deemed which theological ideas were to be embraced and rejected. Lutheranism differs from Catholicism in the understanding and interpretation of three major areas: determination of salvation, source of truth, and basis of the church.
The Catholic Church believed that salvation was achieved through God's grace. In other words, humans were at the mercy of God as to whether they were to be saved or not. Luther professed that faith was the necessary element for salvation. He wrote, "Faith redeems, corrects, and preserves our consciences so that we know that righteousness does not consist in works; our righteousness is not in them, but in faith." Humans, therefore, had the power of self-determination in relation to their salvation. Through faith, one could achieve salvation. The Catholic Church believed that priests, who held the power of absolution, and

Saturday, November 23, 2019

General Philip Kearny in the Civil War

General Philip Kearny in the Civil War Major General Philip Kearny, Jr. was a renowned soldier who saw service with the US and French armies. A native of New Jersey, he distinguished himself in the Mexican-American War, where he lost his left arm, and later served in Emperor Napoleon III's forces during the Second War of Italian Independence. Returning to the United States after the outbreak of the Civil War, Kearny quickly gained a position of prominence in the Army of the Potomac. A tenacious fighter who relentlessly trained his men, he earned the nickname "One-Armed Devil" from the Confederates. Kearny's career ended on September 1, 1862, when he was killed leading his men at the Battle of Chantilly. Early Life Born June 2, 1815, Philip Kearny, Jr. was the son of Philip Kearny, Sr. and Susan Watts. Leading one of New York City's richest families, the Harvard-educated Kearny, Sr. had made his fortune as a financier. The family's situation was bolstered by the immense wealth of Susan Watts' father, John Watts, who had served as New York City's last Royal Recorder in the years before the American Revolution. Raised on the family's estates in New York and New Jersey, the younger Kearny lost his mother when he was seven. Known as a stubborn and temperamental child, he showed a gift for horsemanship and was an expert rider by age eight. As patriarch of the family, Kearny's grandfather soon took responsibility for his upbringing. Increasingly impressed with his uncle Stephen W. Kearny's military career, the young Kearny expressed a desire to enter the military. Into the Army These ambitions were blocked by his grandfather, who desired that he pursue a career in law. As a result, Kearny was compelled to attend Columbia College. Graduating in 1833, he embarked on a tour of Europe with his cousin John Watts De Peyster. Arriving back in New York, he joined the law firm of Peter Augustus Jay. In 1836, Watts died and left the bulk of his fortune to his grandson.
Freed from his grandfather's constraints, Kearny sought assistance from his uncle and Major General Winfield Scott in obtaining a commission in the US Army. This proved successful, and he received a lieutenant's commission in his uncle's regiment, the 1st US Dragoons. Reporting to Fort Leavenworth, Kearny aided in protecting pioneers on the frontier and later served as an aide-de-camp to Brigadier General Henry Atkinson. Kearny le Magnifique In 1839, Kearny accepted an assignment to France to study cavalry tactics at Saumur. Joining the Duke of Orleans' expeditionary force to Algiers, he rode with the Chasseurs d'Afrique. Taking part in several actions during the campaign, he rode into battle in the style of the Chasseurs, with a pistol in one hand, a saber in the other, and the reins of his horse in his teeth. Impressing his French comrades, he earned the nickname "Kearny le Magnifique." Returning to the United States in 1840, Kearny found that his father was terminally ill. Following his death later that year, Kearny's personal fortune again expanded. After publishing Applied Cavalry Tactics Illustrated in the French Campaign, he became a staff officer in Washington, DC and served under several influential officers, including Scott. Boredom In 1841, Kearny married Diana Bullitt, whom he had met earlier while serving in Missouri. Increasingly unhappy as a staff officer, his temper began to return, and his superiors reassigned him to the frontier. Leaving Diana in Washington, he returned to Fort Leavenworth in 1844. The next two years saw him become increasingly bored with army life, and in 1846 he decided to leave the service. Putting in his resignation, Kearny quickly withdrew it with the outbreak of the Mexican-American War in May. Mexican-American War Kearny was soon directed to raise a company of cavalry for the 1st Dragoons and was promoted to captain in December.
Based at Terre Haute, IN, he quickly filled the ranks of his unit and used his personal fortune to purchase matching dapple gray horses for it. Initially sent to the Rio Grande, Kearny's company was later directed to join Scott during the campaign against Veracruz. Attached to Scott's headquarters, Kearny's men served as the general's bodyguard. Unhappy with this assignment, Kearny prophetically lamented, "Honors are not won at headquarters...I would give my arm for a brevet (promotion)." As the army advanced inland and won key victories at Cerro Gordo and Contreras, Kearny saw little action. Finally, on August 20, 1847, Kearny received orders to take his command to join Brigadier General William Harney's cavalry during the Battle of Churubusco. Attacking with his company, Kearny charged forward. In the course of the fighting, he received a severe wound to his left arm which required its amputation. For his gallant efforts, he was given a brevet promotion to major. Frustration Returning to New York after the war, Kearny was treated as a hero. Taking over the US Army's recruiting efforts in the city, his relationship with Diana, which had long been strained, ended when she left him in 1849. Having adjusted to life with one arm, Kearny began to complain that his efforts in Mexico had never been fully rewarded and that he was being ignored by the service due to his disability. In 1851, Kearny received orders for California. Arriving on the West Coast, he took part in the 1851 campaign against the Rogue River tribe in Oregon. Though this was successful, Kearny's constant complaining about his superiors, along with the US Army's slow promotion system, led to him resigning that October. Back to France Leaving on an around-the-world trip, which took him to China and Ceylon, Kearny finally settled in Paris. While there, he met and fell in love with New Yorker Agnes Maxwell. The two openly lived together in the city while Diana became increasingly embarrassed back in New York.
Returning to the United States, Kearny sought a formal divorce from his estranged wife. This was refused in 1854, and Kearny and Agnes took up residence at his estate, Bellegrove, in New Jersey. In 1858, Diana finally relented, which opened the way for Kearny and Agnes to marry. The following year, bored with country life, Kearny returned to France and entered the service of Napoleon III. Serving in the cavalry, he took part in the Battles of Magenta and Solferino. For his efforts, he became the first American to be awarded the Légion d'honneur. The Civil War Begins Remaining in France into 1861, Kearny returned to the United States following the outbreak of the Civil War. Arriving in Washington, Kearny's initial attempts to join the Union service were rebuffed, as many remembered his difficult nature and the scandal surrounding his second marriage. Returning to Bellegrove, he was offered command of the New Jersey Brigade by state officials in July. Commissioned a brigadier general, Kearny joined his men, who were encamped outside Alexandria, VA. Stunned by the unit's lack of preparation for battle, he quickly commenced a rigorous training regime and used some of his own money to ensure that they were well-equipped and fed. Part of the Army of the Potomac, Kearny became frustrated by a lack of movement on the part of its commander, Major General George B. McClellan. This culminated in Kearny publishing a series of letters which severely criticized the commander. Into Battle Though his actions greatly angered the army leadership, they endeared Kearny to his men. Finally, in early 1862, the army began moving south as part of the Peninsula Campaign. On April 30, Kearny was promoted to command the 3rd Division of Major General Samuel P. Heintzelman's III Corps. During the Battle of Williamsburg on May 5, he distinguished himself when he personally led his men forward.
Riding ahead with a sword in his hand and his reins in his teeth, Kearny rallied his men, yelling, "Don't worry, men, they'll all be firing at me!" Ably leading his division throughout the doomed campaign, Kearny began to earn the respect of both the men in the ranks and the leadership in Washington. Following the Battle of Malvern Hill on July 1, which ended the campaign, Kearny formally protested McClellan's orders to continue withdrawing and advocated for a strike on Richmond. One-Armed Devil Feared by the Confederates, who referred to him as the "One-Armed Devil," Kearny was promoted to major general later in July. That summer Kearny also directed that his men wear a patch of red cloth on their caps so that they could rapidly identify each other on the battlefield. This soon evolved into an army-wide system of insignia. With President Abraham Lincoln tiring of McClellan's cautious nature, the aggressive Kearny's name began to surface as a potential replacement. Leading his division north, Kearny joined in the campaign that would culminate with the Second Battle of Manassas. With the beginning of the engagement, Kearny's men occupied a position on the Union right on August 29. Enduring heavy fighting, his division almost broke through the Confederate line. The next day, the Union position collapsed following a massive flank attack by Major General James Longstreet. As Union forces began fleeing the field, Kearny's division was one of the few formations to stay composed and helped cover the retreat. Chantilly On September 1, Union forces became engaged with elements of Major General Thomas "Stonewall" Jackson's command at the Battle of Chantilly. Learning of the fighting, Kearny marched his division to the scene to reinforce Union forces. Arriving, he immediately began preparing to assault the Confederates. As his men advanced, Kearny rode forward to investigate a gap in the Union line despite his aide urging caution.
In response to this warning he allegedly replied, "The Rebel bullet that can kill me has not yet been molded." Encountering Confederate troops, he ignored their demand to surrender and attempted to ride away. The Confederates promptly opened fire, and one bullet pierced the base of his spine and killed him instantly. Arriving on the scene, Confederate Major General A.P. Hill exclaimed, "You've killed Phil Kearny, he deserved a better fate than to die in the mud." The next day, Kearny's body was returned under a flag of truce to the Union lines, accompanied by a letter of condolence from General Robert E. Lee. Embalmed in Washington, Kearny's remains were taken to Bellegrove, where they lay in state before being interred in the family crypt at Trinity Church in New York City. In 1912, following a drive led by New Jersey Brigade veteran and Medal of Honor recipient Charles F. Hopkins, Kearny's remains were moved to Arlington National Cemetery.

Thursday, November 21, 2019

Application of Electrical Technology Assignment Example | Topics and Well Written Essays - 500 words

Application of Electrical Technology - Assignment Example In industrial applications, switchgear is constructed with high-voltage circuit breakers, and it may be lined up together with the transformers in one unitized substation (USS). Switchgear de-energizes loads in order to allow work to be done and also to enable the clearing of faults in downstream power systems. In the construction of protection relays, the current coil of the relay is connected to the secondary current coil of the transformer. Moreover, the secondary voltage coil of the transformer is connected to the voltage coil of the protection relay. When a fault occurs in the circuit feeder, an increased mmf in the current coil of the relay is triggered. The increased mmf closes the normally open contact of the relay, which in turn completes the DC trip coil circuit. The mmf of the trip coil initiates a tripping mechanical movement on the circuit breaker that causes it to isolate the fault. A substation refers to a part of an electrical generation, transmission, and distribution system that performs the function of voltage transformation from high to low and vice versa. A power plant refers to an installation that is used for the production of electricity, while power equipment refers to any equipment that is powered by electricity. A ring is an electrical wiring technique that enables the use of wires of smaller diameter than the ones used in a radial circuit, but of equivalent total current. On the other hand, feeders refer to a set of electric conductors that transmit power from primary distribution centers to secondary distribution centers or branch-circuit distribution centers. An isolation transformer is installed between an AC power source and medical-grade equipment in order to protect patients and staff from electric shocks in case faults occur due to defective medical-grade equipment or the use of non-medical-grade equipment.
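The fault-clearing sequence described above (fault current raises the relay coil's mmf, the normally open contact closes, the DC trip coil circuit is completed, and the breaker isolates the fault) can be sketched in a few lines of logic. This is only an illustrative sketch: the function names, the CT ratio, and the 5 A pickup setting are hypothetical values chosen for the example, not taken from any real relay standard.

```python
# Illustrative sketch of the overcurrent trip sequence; all names and
# threshold values are hypothetical, not from any real relay standard.

def relay_contact_closes(ct_secondary_amps: float, pickup_amps: float = 5.0) -> bool:
    """The relay's current coil sees the CT secondary current; a fault
    raises it above the pickup setting, closing the normally open contact."""
    return ct_secondary_amps > pickup_amps

def breaker_state(feeder_amps: float, ct_ratio: float = 100.0) -> str:
    """A closed relay contact completes the DC trip coil circuit,
    which trips the breaker and isolates the fault."""
    secondary = feeder_amps / ct_ratio  # current transformer steps the feeder current down
    if relay_contact_closes(secondary):
        return "open"    # trip coil energized: breaker opens, fault isolated
    return "closed"      # normal load current: breaker stays closed

print(breaker_state(300.0))   # normal load, 3 A on the secondary
print(breaker_state(2000.0))  # fault, 20 A on the secondary
```

Under these assumed settings, a normal 300 A load leaves the breaker closed, while a 2000 A fault trips it.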

Tuesday, November 19, 2019

Annotated bibliography Essay Example | Topics and Well Written Essays - 1500 words - 2

Annotated bibliography - Essay Example The truth and reconciliation process offers a restorative form of justice that seeks to repair and solidify relationships between conflicting parties, victims, and the general society. Restorative justice differs from retributive justice in that it does not recognize punishment for those found guilty of facilitating conflict. The purpose of a TRC in championing the truth and reconciliation process is to determine and unveil truth to the public regarding the issues behind a particular conflict. The truth and reconciliation process therefore seeks to facilitate acknowledgement of a certain conflict, public mourning, healing, and forgiveness among the people. In light of the above, Derek Rasmussen's recommendation of reconciliation to forgive remains the only viable and practical way to achieve restorative justice, which is the basis of the truth and reconciliation process. This is a news article written by the Canadian Broadcasting Corporation in reflection of the factors that led to the creation of the Canadian Truth and Reconciliation Commission. The article also identifies and recognizes the commissioners appointed to serve in the TRC. According to the article, the move by the Canadian government, through the court, to establish the TRC related to the need to reconcile and console aboriginal Canadians who suffered in the Indian Residential Schools. The mandates of the TRC, as indicated in the article, were to investigate the reasons that led to the creation of the IRS system of learning and its impacts on the children. The article also reveals a series of resignations by chairs of the committee as one of the challenges the TRC of Canada faced. This article constitutes very important material in facilitating the study, bearing the clear information it depicts. Its importance to this study also relates to its succinct definition and description of the truth and reconciliation process, as well as its differentiation of restorative from retributive forms of justice.
This article is an interim report of the

Sunday, November 17, 2019

Nazi Ideology Essay Example for Free

Nazi Ideology Essay Nazism, officially known as National Socialism, is defined as the ideology and practices associated with the National Socialist German Workers' Party under the leadership of Adolf Hitler. In relation to this, Nazism is also regarded as the political policies adopted by the dictatorship of Nazi Germany from 1933 to 1945. Nazism strongly advocated the superiority of an Aryan race that supposedly made the Germanic people stand above others. During the leadership of Hitler, Nazis supported a centralized government led by the Führer, which claimed to have the responsibility of defending and protecting Germany and the German people, in their country and abroad, against the forces of Communism and Jewish subversion (Thomas). As such, the recurring themes of Nazism include extreme nationalism, xenophobia, and the glorification of the Aryan race (Levy 497). These themes of Nazism had many negative effects on people who did not belong to the Aryan race, especially Jews, as well as on the Germans themselves. The themes of this ideology created prejudices and stereotypes against other races, especially the Jewish people, that often became the cause of conflict and violent practices. Due to this, Jewish people and members of other races that Hitler deemed dirty or a threat to the Aryan race were hunted down. The Holocaust is a clear example of the violent practices that he implemented in line with Nazi ideology. Nevertheless, the German people also experienced the negative effects of this ideology, as some of them experienced xenophobia. They thought that they should not interact with other races, especially those that they believed were a threat to their racial superiority. Lastly, Germans also experienced difficulties in interacting and relating with other races, especially when they had the mentality that they were better in comparison (Levy 497-498). Works Cited Levy, Richard S. Antisemitism.
California: ABC-CLIO, 2005. Thomas, Robert. "The Nature of Nazi Ideology." 11 June 2009 http://www.libertarian.co.uk/lapubs/histn/histn015.pdf.

Thursday, November 14, 2019

ECB vs. Banque de France :: Economy Monetary Europe Papers

ECB vs. Banque de France With the introduction of a single currency for twelve different countries came the introduction of a new banking system in France and 11 other countries. This system was officially put to work in January of 1999, when the euro became the currency in 11 countries in Europe; Greece became the twelfth in June of 2001. At the time of introduction, the countries could still use their own currencies, and the exchange rate between each currency and the euro was set by the new banking system. This system is officially named the European System of Central Banks (ESCB), and it is composed of the European Central Bank (ECB), in Frankfurt, Germany, and the National Central Banks (NCBs) of the European Union Member States. All 15 EU members participate, but the countries which did not introduce the euro have a special status: they may implement their own monetary policy and do not partake in any of the decision making by the ESCB; they basically just observe what is going on. The ECB and the Banque de France, which is the national bank of France, work together, but one may also separate their roles within the ESCB. Even though the ECB has much control in determining the financial status of France, the national bank has ways to fine-tune the economy. In the end, the Banque de France is often held back by the ECB, one of the main reasons being that the ECB is always torn in how to change its regulations and rates, because its decisions apply to twelve countries that are never all going to be at exactly the same point economically, financially, or monetarily. As a whole, the Eurosystem has four basic tasks that it is responsible for carrying out through the ECB and the national banks. The first is to define and implement the monetary policy. The second is to conduct foreign exchange operations.
The third is to hold and manage the official foreign reserves of the Member States, and the fourth is to promote the smooth operation of the payment systems (Organization of the ESCB, 1). These are just the broad goals and duties of the whole Eurosystem. When you break it down into the responsibilities of the ECB and the Banque de France, things get more specific. Ultimately, the ECB is responsible for defining the single monetary policy and making sure that the Banque de France and the other national banks implement the policies efficiently.

Tuesday, November 12, 2019

Global Warming Essay

The global warming theory has become increasingly popular over the past few years. Citizens of the world are being encouraged to be more environmentally conscious by others, including politicians, celebrities, and world organizations. The problem with the theory lies in the fact that it has become more and more controversial as it gains publicity and attention. The basic concept behind this theory is that the earth was made with a balance of "greenhouse gases". These gases occur naturally within the atmosphere, and they essentially make the planet habitable by keeping it warm. When the world became more industrialized, the amount of carbon dioxide that was being emitted into the atmosphere increased. In the past hundred years or so, that number has more or less skyrocketed compared to the previous amount, due to the invention and mass distribution of automobiles, aircraft, trains, and boats. The surplus of greenhouse gases is beyond what the earth can store and is creating a pseudo-blanket around the world. The earth has thus become hotter and hotter: "...the globe has heated up by about one degree Fahrenheit over the past century—and it has heated up more intensely over the past two decades." (IPCC, 2001) To put that into perspective, one must know that the temperature difference between the ice age and current times is nine degrees. The potential repercussions of the increase in global temperature include more intense storms, more severe droughts, and rising sea levels. There are several ways to have a more positive impact on the environment. Everything from buying more energy-efficient cars to helping control the population has the potential to help reduce the effects of global warming. One of the largest contributors to the increase in global temperature in the past century has been the ever-expanding population.
It is a simple math problem: if one person produces too many greenhouse gases and then proceeds to have seven children, the greenhouse emissions will increase even more. The more people inhabiting the planet, the more potential there is for the emission of greenhouse gases. In short, the basis of this argument, other than years of documented research and scientific discovery, is: what harm could we do by being more conservative in our use of energy and other things that produce more than their share of greenhouse gases? The answer, as it stands now, is none. The harmful effects of ignoring this crisis are all but proven fact. The problem that the pro-global warming theorists have created is that of social standing and little else. While there may be scientific backing to support some of the theory, the media presents the problem with great sensationalism. Global warming and energy conservation have thus become a trend, and the cause loses some of its validity through this. The scare tactics used by the media to "promote awareness" are just that, a linguistic ploy to gain favor. "Awareness of this global threat reinforced public concern and environmental problems and thereby provided environmental activists, scientists, and policy makers with new momentum in their efforts to promote environmental protection." (McCright, 2000) This statement draws a line to the potential benefits that would be received if the pro-global warming theorists were to draw enough attention to the issue. Driven by social empowerment and conviction to environmental protection, these activists misrepresent the actual threat and paint it as being much more intense and imminent than the scientific evidence concludes. The fact that the planet's temperature is ever-changing is solid; however, there is no solid proof that humans are responsible for this rise.
The earth’s temperature has experienced extreme highs and lows throughout its millions of years in existence and we as humans understand little about what has caused those fluctuation. If humans did not exist billions of years ago, yet the temperature still changed dramatically, then why is it that scientists’ claim that humans are the cause for this current phenomena? The answer to that question is unknown however one could make the assumption that it may be due to our lack of understanding about the way in which carbon dioxide exists in the atmosphere. The public has no easy access to this information therefore it is easily forgotten or removed from the argument. Global warming is a theory that has been wildly blown out of proportion. Media backing and celebrity endorsements combined with political scare tactics have been used to create the sense of responsibility in this matter. The true concern is being masked by the â€Å"solution† that is being presented to the world. The supposed solution to the global warming theorists is to conserve. The real issue at hand is discovery. Science must be perfected or at least further tested before conclusions can be drawn. While evidence exists on both sides, the side that acknowledges the existence of global warming at least has the appearance of more science on its side. The publicity brought to global warming can have the ability to cause people to accept it as fact rather than simply a theory. That being said, it is the belief of this writer that the pro-global warming theory is more valid than the anti-global warming theory. This conclusion was made mostly due to the distrust that this student grew for the anti-global warming supporters. The anti-side invalidates their own arguments by using some basic fallacious thinking and aggressive tactics. Many of the websites and other sources to gain knowledge about the potential non-threat of global warming are extremist and use many common fallacies. 
For example, here is an excerpt from just one of the many anti-global warming websites available to all on the World Wide Web: "We all know that the artificial construct known as 'cap and trade' is nothing more than a fraud to get companies to pay more taxes. It will have very little, to no, impact on CO2 levels, much less global warming." (Casey, 2009). The author assumes that the "cap and trade" issue is universally known and understood, which is a fallacy of hasty generalization. The second flaw in that particular argument is that it appears to be an argument of outrage, in the sense that it degrades the government and paints it as money-hungry and unconcerned with the well-being of its people. Websites against global warming make personal attacks on the politicians or celebrities who endorse the idea of global warming. While these personal attacks may hold some general truth, they do not address that person's ability to become involved with an organization to create awareness of global warming. Personal attacks are used to dissuade the American public from siding with the pro-global warming theory. Celebrities are often used as spokespeople for different causes and charities, but they do not embody what they are speaking for. They are solely a resource to raise awareness among the general population. While it may be true that some scientific evidence has been withheld or made less accessible to the general population due to its potential harm to the pro-global warming theory, this writer has not been dissuaded from her former school of thought. It is very true that there are more humans on the planet than there were many years ago. It is also true that humans now use more advanced technologies that have the tendency to produce a lot of carbon dioxide. The link between what we understand about humans and their production habits and the atmosphere may be blurry, but the question remains: why not conserve?
In conclusion, the moral of the global warming story is moderation. Little personal changes may affect the way in which the world climate is changing, but if they do not, what harm would we have done? Riding a bike to work or walking to school not only reduces the amount of carbon dioxide being produced, it saves that person money. Global warming may not be as big a threat as the media makes it out to be, but there is scientific evidence supporting the fact that it just may be a threat. Global Warming Essay Global warming is an issue that concerns almost everybody worldwide: it is the primary cause of the erratic and sometimes devastating weather that is experienced around the world. Global warming is causing the rise in sea level, which in turn causes the flooding of coastal areas and areas with low elevation. Is global warming really happening today? Scientists with the United Nations Environment Programme (UNEP) believe it is so (Mank, 2005). It is indisputable that there has been a rise in the concentration of greenhouse gases (GHG) in the atmosphere during the last century, which scientists think may be one of the causes of global warming. The climate change, however, is not a direct result of the rise in greenhouse gases. Will global warming spell doom for our world? Scientists believe this to be so. "Much depends on what actions we take now and in the coming years." Meteorologist Jagadish Shukla of the University of Maryland found that deforestation would cause rainfall in the Amazon River basin to decline by more than 26 percent, from the current 2.5 m to about 1.8 m a year (Bellamy & Gifford, 2000). At the same time, the burning of fossil fuels, particularly coal and oil, produces sulfur dioxide and nitrogen oxides, which are hazardous to the atmosphere. Findings show that a single smokestack may produce as much as 500 tons of sulfur dioxide a day. When these gases combine with oxygen and moisture, sulfuric acid and nitric acid are formed.
The rain will carry the acids to the ground (acid rain), which may cause the depletion of calcium and magnesium in the soil, elements needed by plants for the formation of chlorophyll and wood, or it may cause the release of aluminum in the soil, which is poisonous and can kill the roots of trees (Carwardine, 2000). This study intends to: (1) examine the effects of global warming worldwide, including the question of global warming and doomsday; and (2) widen our knowledge about the ozone slayer and determine whether humans are the cause of global warming or whether it is just a natural process that the earth goes through.

II. Literature Review

Ozone is an unstable form of oxygen that occurs naturally in the stratosphere (also called the isothermal region), the upper portion of the atmosphere above 7 miles, where clouds are rare. The ozone layer absorbs the dangerous ultraviolet-B (UV-B) rays while it allows the needed safe light to pass through. Though easily broken down by other gases in the stratosphere, it is constantly being repaired by the sun's rays. However, man is destroying the ozone layer, which serves as a protective umbrella against the sun's harmful rays. In fact, the ozone layer is being destroyed faster than the sun's rays can produce it. It is being destroyed by industrial gases like CFCs (Johnston, 2000). CFCs were discovered by Thomas Midgley Jr., a chemist working at the Frigidaire Division of General Motors, but were found to be hazardous only in 1974, after 44 years of use as coolants in refrigerators and air conditioners, aerosol propellants in spray cans, medical sterilizers, cleaning solvents for electronic components, and raw materials for making plastic foams such as coffee cups. CFCs are estimated to account for 14 percent of global warming. It is dismaying to know that ozone depletion can be found at the south (Antarctica) and north (Greenland) poles (Dolan, 2006). According to British scientist Joe Farman, 40 percent of ozone depletion can be found at the South Pole.
At the South Pole is a huge vortex with clouds composed of tiny ice particles, giving chlorine millions of tiny spaces through which it can perform its deadly dance with ozone even faster (Simpson, 2000). Both holes at the poles are seasonal, opening and closing each year. In the northern hemisphere, a more populous region, the ozone depletion rate has been between three percent and seven percent over 17 years, compared with only three percent over the previous 100 years. On the other hand, what are the effects of ultraviolet-B rays on human beings and the ecosystem in general? In humans, they can cause skin cancer and cataracts and damage the immune system. In the ecosystem, they can kill plankton (the basic element of the ocean food chain), destroy plant life and crops, and change global wind and weather patterns. In 1978, Canada, Sweden, the United States, and other countries banned the use of CFCs in aerosols. However, other uses for CFCs were found, causing an increase in their production. The US still uses one-fourth of the world's annual supply of CFCs (Turner, 2000). In September 1987, however, 24 nations cooperated for the first time to solve this environmental problem and passed the Montreal Protocol. The agreement called for developed nations to freeze the use and production of CFCs while cutting use and production by 50 percent by 1999. Still, the CFCs currently rising through the troposphere will take seven to 10 years to drift up to the stratosphere. The troposphere is the portion of the atmosphere below the stratosphere, extending outward about seven to 10 miles from the earth's surface (Bellamy & Gifford, 2000).

III. Methodology

In order to accomplish this study, the researcher used two different methods to make the investigation more informative, accurate, and successful. Aside from gathering information through the internet, the researcher gathered information through statistics, charts, and observation.

IV.
Results and Analysis

During the earliest times, the lifestyles of our ancestors were very simple. The air they breathed was clean. The streams were clear and free of harmful organisms. They used natural fertilizers for their agricultural crops. The surroundings were free of household throwaways. Today, there has been tremendous growth in science and technology. Such advances have brought about changes in terms of new products, improved equipment, and more effective methodologies. Unfortunately, this same technology which made life easier for us has produced wastes which are now affecting the quality of our surroundings: air, water, and land. Factories and motor vehicles send tons of pollutants into our air. Excessive air pollution poses a danger to our health and environment. It can likewise cause stunted growth and even death in our plants. Our streams are polluted by discharges from industrial plants that use chemicals. Garbage and sink wastes are carelessly thrown into our surroundings. Synthetic fertilizers and insecticides pollute our land and farm products (Johnston, 2000). At the same time, the burning of fossil fuels, particularly coal and oil, produces sulfur dioxide and nitrogen oxides, which are hazardous to the atmosphere. Findings show that a single smokestack may produce as much as 500 tons of sulfur dioxide a day. When these gases combine with oxygen and moisture, sulfuric acid and nitric acid are formed (Jenner, 1999). The rain will carry the acids to the ground (acid rain), which may cause the depletion of calcium and magnesium in the soil, elements needed by plants for the formation of chlorophyll and wood, or it may cause the release of aluminum in the soil, which is poisonous and can kill the roots of trees. Moreover, nitrous oxide or "laughing gas," a colorless gas with a sweet taste and odor that is used as an anesthetic in minor surgery, is responsible for about 6 percent of the human contribution to greenhouse warming.
Methane or "cow gas," on the other hand, makes up about 18 percent of human contributions to the greenhouse effect. Cattle, sheep, goats, and other cud-chewing animals give off methane in burps and flatulence as they digest (Cairncross, 2002). Experts say that what is happening right now is not a matter of adding a few degrees to the average temperature of a community. A rise of this magnitude may cost lives, for without a liveable environment, creatures on earth cannot survive (Davidson, 1999). With all this, are we aware of the extent of the damage brought about by modernization? Have we contributed to this environmental dilemma? What have we done to minimize the danger to our lives? How can we take care of our environment? We must undertake measures to preserve our resources and minimize the use of energy before it is too late. Our fight against pollution is an initial step toward conserving our environmental resources and energy. We must all join hands for this common goal. If present-day emissions of greenhouse gases continue, it is estimated that the rate of increase in global mean temperatures will reach about 0.3 °C per decade. This will mean a likely increase of 1 °C above the present level by the year 2025, and 3 °C before the end of the next century.

A. Resolution

a.) Recycling and Reuse of Solid Wastes

Solid wastes are now viewed as a potential resource which must be recovered and reused whenever possible. Since disposal sites and forest resources are rapidly being depleted, recycling solid wastes offers a solution to both problems. Consider the element phosphorus. Mined from phosphate ores, it is manufactured into fertilizers.
It enters plant tissues, and we obtain it when we eat plants as vegetables. It is later excreted and joins the sewage system. The sewage sludge can be used directly as fertilizer or soil conditioner. Used bottles can be used over and over again. Durable plastic containers can be saved for more household uses. Tires can be recapped and used again. Old clothing materials can be used as kitchen towels and bags (see Environment Matters: Industry's Guide to the Issues, the Challenges and the Solutions, 1999). If materials cannot be reused several times, then they can be shredded and converted into a new form. Old newspapers are repulped into new paper. Broken glass is ground and manufactured into new glassware. Tires are processed into raw rubber. Protein leftovers are manufactured into animal feeds.

b.) Conserving Our Forests

Every now and then we receive alarming news about our forests being denuded. Big logging concessionaires indiscriminately cut down trees without undertaking reforestation measures. Without trees, the soil is loosened and rapid erosion occurs. As a result, fertile topsoil is washed away, which makes the growth of other forms of vegetation almost impossible. We suffer great losses of timber, wildlife, and other forest products. But the greatest danger is the occurrence of floods and global warming, which cause losses of food, property, and lives (Davidson, 1999).

Sunday, November 10, 2019

History of Digital Computers

The History of Digital Computers

B. RANDELL, Computing Laboratory, University of Newcastle upon Tyne

This account describes the history of the development of digital computers, from the work of Charles Babbage to the earliest electronic stored-program computers. It has been prepared for Volume 3 of "l'Histoire Generale des Techniques," and is in the main based on the introductory text written by the author for the book "The Origins of Digital Computers: Selected Papers" (Springer Verlag, 1973).

1. Charles Babbage

The first electronic digital computers were completed in the late 1940's. In most cases their developers were unaware that nearly all the important functional characteristics of these computers had been invented over a hundred years earlier by Charles Babbage. It was in 1821 that the English mathematician Charles Babbage became interested in the possibility of mechanising the computation and printing of mathematical tables. He successfully constructed a small machine, which he called a "difference engine," capable of automatically generating successive values of simple algebraic functions by means of the method of finite differences. This encouraged him to plan a full-scale machine, and to seek financial backing from the British government.
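The method of finite differences that the Difference Engine mechanised can be sketched in a few lines of modern code. This is an illustration only: the example polynomial and the column layout are our own choices, not Babbage's, but the key property is his, namely that once the initial value and differences are set up, every further tabulated value needs nothing but additions.

```python
# Sketch of the method of finite differences: successive values of a
# polynomial are generated using additions only, as in Babbage's engine.

def difference_engine(initial_column, steps):
    """Generate successive polynomial values from an initial column
    [f(0), first difference, second difference, ...]."""
    col = list(initial_column)
    values = [col[0]]
    for _ in range(steps):
        # Each entry absorbs the next higher-order difference, just as
        # adjacent adding mechanisms in the engine fed one another.
        for i in range(len(col) - 1):
            col[i] += col[i + 1]
        values.append(col[0])
    return values

# Example: f(x) = x^2 + x + 1, so f(0) = 1, the first difference
# f(1) - f(0) = 2, and the second difference is constant at 2.
print(difference_engine([1, 2, 2], 4))  # [1, 3, 7, 13, 21]
```

The constant highest-order difference is exactly the limitation Babbage chafed against, as the next paragraphs describe: transcendental functions have no such constant difference.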
During the next 12 years both Babbage and the government poured considerable sums of money into the attempt at building his Difference Engine. However the project, which called for the construction of six interlinked adding mechanisms, each capable of adding two multiple-digit decimal numbers, together with an automatic printing mechanism, was considerably beyond the technological capabilities of the era – indeed it has been claimed that the efforts expended on the Difference Engine were more than justified simply by the improvements they generated in mechanical engineering equipment and practice. Although Babbage's plans for a Difference Engine were somewhat premature, the basic scheme was vindicated when in 1843, inspired by their knowledge of his work, Georg and Edvard Scheutz successfully demonstrated a working prototype difference engine. A final version of this model was completed 10 years later, with financial assistance from the Swedish government. Several other difference engines were constructed in the decades that followed, but such machines never achieved the importance of more conventional calculating machines, and when multi-register accounting machines became available in the 1920's it was found that these could be used essentially as difference engines. However Babbage's ideas soon progressed far beyond that of a special-purpose calculating machine – in fact almost as soon as he started work on his Difference Engine he became dissatisfied with its limitations. In particular he wished to avoid the need to have the highest order of difference constant, in order to be able to use the machine directly for transcendental as well as algebraic functions. In 1834 Babbage started active work on these matters, and on problems such as division and the need to speed up the part of the addition mechanism which dealt with the assimilation of carry digits.
He developed several very ingenious methods of carry assimilation, but the time savings so obtainable would have been at the cost of a considerable amount of complex machinery. This led Babbage to realise the advantages of having a single centralised arithmetic mechanism, the "mill," separate from the "figure axes," i.e., columns of discs which acted merely as storage locations rather than accumulators. Babbage's first idea for controlling the sequencing of the various component mechanisms of the engine was to use "barrels," i.e., rotating pegged cylinders of the sort used in musical automata. He first planned to use a set of subsidiary barrels, with over-all control of the machine being specified by a large central barrel with exchangeable pegs. However in June 1836 he took the major step of adopting a punched card mechanism, of the kind found in Jacquard looms, in place of the rather limited and cumbersome central barrel. He did so in the realisation that the "formulae" which specified the computation that the machine was to perform could therefore be of almost unbounded extent, and that it would be a simple matter to change from the use of one formula to another. Normally formula cards, each specifying an arithmetic operation to be performed, were to be read by the Jacquard mechanism in sequence, but Babbage also envisaged means whereby this sequence could be broken and then recommenced at an earlier or later card in the sequence. Moreover he allowed the choice of the next card which was to be used to be influenced by the partial results that the machine had obtained. These provisions allowed him to claim that computations of indefinite complexity could be performed under the control of comparatively small sets of formula cards.
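The power of those two provisions, re-commencing at an earlier card and letting partial results steer the choice, can be made concrete with a minimal sketch. The card format, operation names, and store layout below are our own modelling assumptions, not Babbage's notation; the point is only that a data-dependent backward transfer over a short card sequence yields iteration.

```python
# Minimal sketch of a card-controlled sequence with a conditional backward
# transfer (modelled loosely on the Analytical Engine; the instruction
# format here is invented for illustration).

def run_cards(cards, store):
    """Execute 'formula cards' in order; a ('jump_if_positive', axis, offset)
    card moves the reading position when the named figure axis is positive."""
    pc = 0
    while pc < len(cards):
        op = cards[pc]
        if op[0] == "add":
            _, src, dst = op
            store[dst] += store[src]
        elif op[0] == "sub":
            _, src, dst = op
            store[dst] -= store[src]
        elif op[0] == "jump_if_positive":
            _, axis, offset = op
            if store[axis] > 0:
                pc += offset  # re-commence at an earlier or later card
                continue
        pc += 1
    return store

# Repeated addition (total += x, five times), counted down in 'n': three
# cards suffice for a computation of arbitrary length.
store = {"x": 7, "total": 0, "n": 5, "one": 1}
cards = [
    ("add", "x", "total"),
    ("sub", "one", "n"),
    ("jump_if_positive", "n", -2),  # loop back while n > 0
]
print(run_cards(cards, store)["total"])  # 35
```

This is exactly the sense in which a "comparatively small set of formula cards" controls a computation of indefinite extent.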
Babbage talked at one time of having a store consisting of no less than 1000 figure axes, each capable of holding a signed 40-digit decimal number, and planned to provide for reading numbers from cards into the store, and for punching or printing the values of numbers held in the store. The movement of numbers between the mill and the store was to be controlled by a sequence of "variable cards," each specifying which particular figure axis was involved. Therefore an arithmetic operation whose operands were to be obtained from the store and whose result was to be returned to the store would be specified by an operation card and several variable cards. He apparently intended these different kinds of control cards to be in separate sequences, read by separate Jacquard mechanisms. Thus in the space of perhaps 3 years Babbage had arrived at the concept of a general-purpose digital computer consisting of a store, arithmetic unit, punched card input and output, and a card-controlled sequencing mechanism that provided iteration and conditional branching. Moreover although he continued to regard the machine, which he later came to call the Analytical Engine, as being principally for the construction of mathematical tables, he had a very clear grasp of the conceptual advances he had made. Basing his claim on the unbounded number of operation and variable cards that could be used to control the machine, the ease with which complicated conditional branches could be built from a sequence of simple ones, and the fact that automatic input and output, and multiple-precision arithmetic, were provided, he stated that ". . . it appears that the whole of the conditions which enable a finite machine to make calculations of unlimited extent are fulfilled in the Analytical Engine . . . . I have converted the infinity of space, which was required by the conditions of the problem, into the infinity of time."
Because separate, but associated, sequences of cards were needed to control the Analytical Engine the concept of a program as we know it now does not appear very clearly in contemporary descriptions of the machine. However there is evidence that Babbage had realised that the information punched on the cards which controlled the engine could itself have been manipulated by an automatic machine – for example he suggested the possibility of the Analytical Engine itself being used to assist in the preparation of lengthy sequences of control cards. Indeed in the description of the use of the Analytical Engine written by Lady Lovelace, in collaboration with Babbage, there are passages which would appear to indicate that it had been realised that an Analytical Engine was fully capable of manipulating symbolic as well as arithmetical quantities. Probably Babbage himself realised that the complete Analytical Engine was impractical to build, but he spent much of the rest of his life designing and redesigning mechanisms for the machine. The realisation of his dream had to await the development of a totally new technology, and an era when the considerable finances and facilities required for an automatic computer would be made available, the need at last being widely enough appreciated. He was a century ahead of his time, for as one of the pioneers of the modern electronic digital computer has written: "Babbage was moving in a world of logical design and system architecture, and was familiar with and had solutions for problems that were not to be discussed in the literature for another 100 years." He died in 1871, leaving an immense collection of engineering drawings and documents, but merely a small portion of the Analytical Engine, consisting of an addition and a printing mechanism, whose assembly was completed by his son, Henry Babbage. This machine and Babbage's engineering drawings are now in the Science Museum, London.

2.
Babbage's direct successors

Some years after Babbage's death his son Henry Babbage recommenced work on the construction of a mechanical calculating machine, basing his efforts on the designs his father had made for the Mill of the Analytical Engine. This work was started in 1888 and carried on very intermittently. It was completed only in about 1910, when the Mill, which incorporated a printing mechanism, was demonstrated at a meeting of the Royal Astronomical Society. By this date however the work of a little-known successor to Charles Babbage, an Irish accountant named Percy Ludgate, was already well advanced. Ludgate started work in 1903, at the age of 20, on an entirely novel scheme for performing arithmetic on decimal numbers. Decimal digits were to be represented by the lateral position of a sliding metal rod, rather than the angular position of a geared disc. The basic operation provided was multiplication, which used a complicated mechanism for calculating the two-digit products resulting from multiplying pairs of decimal digits together. The scheme involved first transforming the digits into a form of logarithm, adding the logarithms together, and then converting the result back into a two-digit product. This scheme is quite unlike any known to have been used in earlier mechanical calculators, or for that matter since, although there had been several calculating machines constructed that used built-in multiplication tables to obtain two-digit products – the earliest known of these was that invented by Bollee in 1887. It is in fact difficult to see any advantages to Ludgate's logarithmic scheme, although his form of number representation is reminiscent of that used in various mechanical calculating devices in the following decades. So striking are the differences between Ludgate's and Babbage's ideas for mechanical arithmetic that there is no reason to dispute Ludgate's statement that he did not learn of Babbage's prior work until the later stages of his own.
It seems likely that Babbage was the eventual inspiration for Ludgate to investigate the provision of a sequence control mechanism. Here he made an advance over the rather awkward system that Babbage had planned, involving separate sets of operation and variable cards. Instead his machine was to have been controlled by a single perforated paper tape, each row of which represented an instruction consisting of an operation code and four address fields. Control transfers simply involved moving the tape the appropriate number of rows forwards or backwards. Moreover he also envisaged the provision of what we would now call subroutines, represented by sequences of perforations around the circumference of special cylinders – one such cylinder was to be provided for division. The machine was also to be controllable from a keyboard, a by-product of whose operation would be a perforated tape which could then be used to enable the sequence of manually controlled operations to be repeated automatically. Ludgate estimated that his Analytical Machine would be capable of multiplying two twenty-digit numbers in about 10 seconds, and that, in considerable contrast to Babbage's Analytical Engine, it would be portable. However there is no evidence that he ever tried to construct the machine, which he apparently worked on alone, in his spare time. He died in 1922, and even if at this time his plans for the Analytical Machine still existed there is now no trace of them, and our knowledge of the machine depends almost entirely on the one description of it that he published. The next person who is known to have followed in the footsteps of Babbage and to have worked on the problems of designing an analytical engine was Leonardo Torres y Quevedo.
Torres was born in the province of Santander in Spain in 1852. Although qualified as a civil engineer he devoted his career to scientific research, and in particular to the design and construction of an astonishing variety of calculating devices and automata. He gained great renown, particularly in France and in Spain, where he became President of the Academy of Sciences of Madrid, and where, following his death in 1936, an institute for scientific research was named after him. Torres first worked on analog calculating devices, including equation solvers and integrators. In the early 1900's he built various radio-controlled devices, including a torpedo and a boat which, according to the number of pulses it received, could select between various rudder positions and speeds, and cause a flag to be run up and down a mast. In 1911 he made and successfully demonstrated the first of two chess-playing automata for the end game of king and rook against king. The machine was fully automatic, with electrical sensing of the positions of the pieces on the board and a mechanical arm to move its own pieces. (The second machine was built in 1922, and used magnets underneath the board to move the pieces.) In all this work, he was deliberately exploiting the new facilities that electromechanical techniques offered, and challenging accepted ideas as to the limitations of machines. He picked on Babbage's Analytical Engine as an important and interesting technical challenge, and in 1914 published a paper incorporating detailed schematic designs for a suitable set of electromechanical components. These included devices for storing, comparing and multiplying numbers, and were accompanied by a discussion of what is now called floating-point number representation. He demonstrated the use of the devices in a design for a special-purpose program-controlled calculator.
The program was to be represented by areas of conductive material placed on the surface of a rotating drum, and incorporated a means for specifying conditional branching. Torres clearly never intended to construct a machine to his design, but 6 years later he built, and successfully demonstrated, a typewriter-controlled calculating machine, primarily to demonstrate that an electromechanical analytical engine was completely feasible. He in fact never did build an analytical engine, although he designed, and in many cases built, various other digital devices including two more calculating machines, an automatic weighing machine, and a machine for playing a game somewhat like the game of Nim. However there seems little reason to doubt that, should the need have been sufficiently pressing, Torres would indeed have built a complete analytical engine. In the event, it was not until the 1939-1945 war that the desirability of large-scale fully automatic calculating machines became so clear that the necessary environment was created for Babbage's concept to become a reality. Before this occurred there is known to have been at least one further effort at designing an analytical engine. This was by a Frenchman, Louis Couffignal, who was motivated mainly by a desire to reduce the incidence of errors in numerical computations. He was familiar with the work of Babbage and Torres y Quevedo but, in contrast to their designs, proposed to use binary number representation. The binary digits of stored numbers were to be represented by the lateral position of a set of parallel bars controlled by electromagnets. The various arithmetic operations were to be performed by relay networks, the whole machine being controlled by perforated tapes.
Couffignal apparently had every intention of building this machine, in association with the Logabax Company, but presumably because of the war never did so. However after the war he was in charge of an electronic computer project for the Institut Blaise Pascal, the design study and construction of the machine being in the hands of the Logabax Company. With Couffignal's pre-war plans, the line of direct succession to Babbage's Analytical Engine seems to have come to an end. Most of the wartime computer projects were apparently carried out in ignorance of the extent to which many of the problems that had to be dealt with had been tackled by Babbage over a century earlier. However in some cases there is clear evidence that knowledge of Babbage's work was an influence on the wartime pioneers, in particular Howard Aiken, originator of the Automatic Sequence Controlled Calculator, and William Phillips, an early proponent of binary calculation; various other influential people, including Vannevar Bush and L. J. Comrie, were also well aware of his dream.

3. The contribution of the punched card industry

An initially quite separate thread of activity leading to the development of the modern computer originated with the invention of the punched card tabulating system. The capabilities of Herman Hollerith's equipment, first used on a large scale for the 1890 US National Census, were soon extended considerably. The original equipment allowed cards to hold binary information representing the answers to a Census questionnaire. These cards could be tabulated, one by one, using a machine which sensed the presence of holes in the card electrically and could be wired to count the number of cards processed in which particular holes or combinations of holes had been punched.
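The core tabulating operation, counting cards punched with particular holes or combinations of holes, amounts to a subset test per card per wired counter. A hedged sketch, in which the hole labels and "wiring" names are invented for illustration rather than drawn from the 1890 Census card layout:

```python
# Hollerith-style tabulation sketch: each card is modelled as a set of
# punched hole positions; a "wired" combination counts the cards that
# contain all of its required holes.

def tabulate(cards, combinations):
    """Count, per wired combination, the cards punched with all its holes."""
    counts = {name: 0 for name in combinations}
    for card in cards:
        for name, holes in combinations.items():
            if holes <= card:  # subset test: every required hole is punched
                counts[name] += 1
    return counts

cards = [{"A", "C"}, {"A"}, {"B", "C"}, {"A", "C"}]
wiring = {"A punched": {"A"}, "A and C punched": {"A", "C"}}
print(tabulate(cards, wiring))  # {'A punched': 3, 'A and C punched': 2}
```

The sorting and addition units described next extend this same card-at-a-time model rather than replacing it.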
A device could be attached to such a tabulator which assisted the manual sorting of cards into a number of separate sequences. Within 10 years automatic card handling mechanisms, which greatly increased the speed of machine operation, and addition units, which enabled card tabulators to sum decimal numbers punched on cards, had been provided. The system soon came into widespread use in the accounting departments of various commercial organisations, as well as being used for statistical tabulations in many countries of the world. After the 1900 US Census relations between Hollerith and the Census Bureau deteriorated, and the Bureau began to manufacture its own equipment for use in the 1910 Census. The person in charge of this work was James Powers, who circumvented Hollerith's patents by producing a mechanical card reading apparatus. He retained the patent rights to his inventions and formed his own company, which eventually merged with Remington Rand in 1927. In 1911 Hollerith sold his own company, the Tabulating Machine Company, which he had formed in 1896, and it was shortly afterwards merged with two other companies to form the Computing-Tabulating-Recording Company. This company, which was under the direction of Thomas J. Watson from 1914, became the International Business Machines Corporation in 1924. During the 1920's and 1930's punched card systems developed steadily, aided no doubt by the stimulus of competition, not only in the USA but also in Britain, where the Hollerith and Powers-based systems continued to be marketed under the names of their original inventors, while in France a third manufacturer, Compagnie Machines Bull, was also active. Unfortunately the people involved in this work did not in general publish technical papers and their work has received little public recognition. Thus full appreciation of the contribution of IBM development engineers, such as J. W.
Bryce, one of the most prolific inventors of his era, will probably have to await an analysis of the patent literature. One inventor whose work has, however, been documented is Gustav Tauschek, a self-taught Viennese engineer with more than 200 patents in the computing field to his credit. While working for Rheinische Metall- und Maschinenfabrik he designed and built a punched card electromechanical accounting machine. His other patents, many of which were filed whilst he was under contract to IBM during the 1930's, also included a "reading-writing-calculating machine" which used photocells to compare printed input characters with templates held on photographic film, a number storage device using magnetised steel plates, and an electromechanical accounting machine designed for use in small banks, capable of storing the records of up to 10 000 accounts. By the 1930's printing tabulators were available which worked at approximately 100 cards per minute, and there were sorters which worked at 400 cards per minute. The machines were controlled by fairly intricate plugboards, but arithmetic and logical computations involving sequences of operations of any great complexity were carried out by repeated processing of sets of cards, under the direction of operators. Various attempts were made to supplement the functional capabilities of punched card systems by linking together otherwise independent machines. One such system, the Synchro-Madas machine, incorporated a typewriter/accounting machine, an automatic calculating machine and an automatic card punch. These were linked together so that a single action by the operator sitting at the typewriter/accounting machine would control several operations on the different machines. One other system involving a set of inter-linked card machines, although very different in concept and scale from the Synchro-Madas machine, is worth mentioning.
This is the Remote-control Accounting system which was experimented with in a Pittsburgh department store, also in the mid-1930's. The system involved 250 terminals connected by telephone lines to 20 Powers card punch/tabulators and 15 on-line typewriters. The terminals transmitted data from punched merchandise tags, which were used to produce punched sales record cards, later used for customer billing. The typewriter terminals were used for credit authorisation purposes. The intended peak transaction rate was 9000 per hour. Even during the 1920's punched card systems were used not only for accounting and the compilation of statistics, but also for complex statistical calculations. However the first important scientific application of punched card systems was made by L. J. Comrie in 1929. Comrie was Superintendent of HM Nautical Almanac Office until 1936, and then founded the Scientific Computing Service. He made a speciality of putting commercial computing machinery to scientific use, and introduced Hollerith equipment to the Nautical Almanac Office. His calculations of the future positions of the Moon, which involved the punching of half a million cards, stimulated many other scientists to exploit the possibilities of punched card systems. One such scientist was Wallace J. Eckert, an astronomer at Columbia University, which had already been donated machines for a Statistical Laboratory by IBM in 1929, including the "Statistical Calculator," a specially developed tabulator which was the forerunner of the IBM Type 600 series of multiplying punches, and of the mechanisms used in the Harvard Mark I machine. With assistance from IBM, in 1934 Eckert set up a scientific computing laboratory in the Columbia Astronomy Department, a laboratory which was later to become the Thomas J. Watson Astronomical Computing Bureau.
In order to facilitate the use of his punched card equipment Eckert developed a centralised control mechanism, linked to a numerical tabulator, a summary punch and a multiplying punch, so that a short cycle of different operations could be performed at high speed. The control mechanism, which was based on a stepping switch, enabled many calculations, even some solutions of differential equations, to be performed completely automatically. The potential of a system of inter-connected punched card machines, controlled by a fully general-purpose sequencing mechanism, and the essential similarity of such a system to Babbage’s plans for an Analytical Engine, were discussed in an article published by Vannevar Bush in 1936. Bush was at this time already renowned for his work on the first differential analyser, and during the war held the influential position of Director of the US Office of Scientific Research and Development. In fact an attempt was made to build such a system of inter-connected punched card machines at the Institut für Praktische Mathematik of the Technische Hochschule, Darmstadt, in Germany during the war. The plans called for the inter-connection of a standard Hollerith multiplier and tabulators, and specially constructed divider and function generators, using a punched tape sequence control mechanism. Work was abandoned on the project following a destructive air raid in September 1944. However, by this stage, in the United States much more ambitious efforts were being made to apply the expertise of punched card equipment designers. The efforts originated in 1937 with a proposal by Howard Aiken of Harvard University that a large-scale scientific calculator be constructed by inter-connecting a set of punched card machines via a master control panel. This would be plugged so as to govern the transmission of numerical operands and the sequencing of arithmetic operations. Through Dr.
Shapley, director of the Harvard College Observatory, Aiken became acquainted with Wallace Eckert’s punched card installation at Columbia University. These contacts helped Aiken to persuade IBM to undertake the task of developing and building a machine to his basic design. For IBM, J. W. Bryce assigned C. D. Lake, F. E. Hamilton and B. M. Durfee to the task. Aiken later acknowledged these three engineers as co-inventors of the Automatic Sequence Controlled Calculator, or Harvard Mark I as it became known. The machine was built at the IBM development laboratories at Endicott and was demonstrated there in January 1943 before being shipped to Harvard, where it became operational in May 1944. In August of that year IBM, in the person of Thomas J. Watson, donated the machine to Harvard, where it was used initially for classified work for the US Navy. The design of the Harvard Mark I followed the original proposals by Aiken fairly closely, but it was built using a large number of the major components used in the various types of punched card machines then manufactured, rather than from a set of complete machines themselves. It incorporated 72 “storage counters” each of which served as both a storage location and as a complete adding and subtracting machine. Each counter consisted of 24 electromechanical counter wheels and could store a signed 23-digit decimal number. A special multiply/divide unit, and units for obtaining the value of previously computed functions held on perforated tape, and for performing interpolation, were provided together with input/output equipment such as card readers and punches, and typewriters. The various mechanisms and counter wheels were all driven and synchronised by a single gear-connected mechanical system extending along nearly the entire length of the calculator. A main sequence control mechanism incorporating a punched tape reader governed the operation of the machine.
Each horizontal row on the tape had space for three groups of eight holes, known as the A, B and C groups. Together these specified a single instruction of the form “Take the number out of unit A, deliver it to unit B, and start operation C.” Somewhat surprisingly, in view of Aiken’s knowledge of Babbage’s work and writings, no provision was made originally for conditional branching. As it was, such provision was only made later, when a subsidiary sequence control mechanism was built at Harvard and incorporated into the machine. The Harvard Mark I was a massive machine over 50 feet long, built on a lavish scale. Being largely mechanical, its speed was somewhat limited – for example, multiplication took 6 seconds – but it continued in active use at Harvard until 1959. It has an important place in the history of computers, although the long-held belief that it was the world’s first operational program-controlled computer was proved to be false once the details of Zuse’s wartime work in Germany became known. It marked a major step by IBM towards full involvement in the design of general-purpose computers and, with ENIAC and the Bell Telephone Laboratories Series, represents the starting point of American computer developments. After completion of the Mark I, Aiken and IBM pursued independent paths. Aiken, still distrustful of the reliability of electronic components, moved to electromagnetic relays for the construction of the Harvard Mark II, another paper-tape-sequenced calculator. This machine had an internal store which could hold about 100 decimal floating point numbers. One of the most interesting aspects of the machine was that it could be operated either as a single computer or as two separate ones. The complete system incorporated four of each type of input/output device, namely sequence tape readers, data tape readers and punches, numerical function tape readers and output printers.
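The three-group tape format described above can be made concrete with a toy interpreter. The counter numbers and the ADD/SUB operation codes below are purely illustrative assumptions, not the Mark I's actual codes; the sketch only shows how a row (A, B, C) drives a transfer-and-operate step.

```python
# Toy interpreter for a Mark-I-style three-group instruction: "take the
# number out of unit A, deliver it to unit B, and start operation C".
# Operation codes and unit numbers are invented for illustration.

ADD, SUB = 1, 2  # hypothetical operation codes

def run(tape, counters):
    """Execute each tape row (a, b, c) against a dict of storage counters."""
    for a, b, c in tape:
        value = counters[a]          # take the number out of unit A
        if c == ADD:
            counters[b] += value     # deliver to unit B, operation: add
        elif c == SUB:
            counters[b] -= value     # deliver to unit B, operation: subtract
    return counters

counters = {1: 5, 2: 3, 3: 0}
run([(1, 3, ADD), (2, 3, ADD), (1, 2, SUB)], counters)
# counter 3 now holds 5 + 3 = 8; counter 2 holds 3 - 5 = -2
```

Since each counter is itself an adding/subtracting machine, "deliver and operate" is a single step, which is why the format needs no separate accumulator field.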
It also had multiple arithmetic facilities, including two adders and four multipliers (taking 0.7 second) which could all be used simultaneously. Detailed design of the machine, which was intended for the US Naval Proving Ground, Dahlgren, Virginia, began at Harvard early in 1945, and the machine was completed in 1947. Afterwards Aiken and his colleagues went on to design the Mark III, an electronic computer with magnetic drum storage, completed in 1950, and the Mark IV, which incorporated 200 magnetic core shift registers, completed in 1952. The designers of IBM’s next machine, the Pluggable Sequence Relay Calculator, included two of the Harvard Mark I’s design team, namely C. D. Lake and B. M. Durfee, but the machine in fact had more in common with IBM’s earlier calculating punches than with the Mark I; like the punches it was controlled using plugboard-specified sequencing, rather than by a sequence control tape of essentially unlimited length. Its relay construction resulted in its basic operation speed being considerably faster than the Mark I’s, although it lacked the Mark I’s ease and flexibility of programming, demanding instead the kind of detailed design of parallel subsequencing that one sees nowadays at the microprogramming level of some computers. Great stress was laid by the designers on the efficient use of punched card input/output, and it was claimed that in many cases, where other machines’ internal storage capacity proved inadequate, the IBM relay calculators could outperform even the contemporary electronic computers. Several machines were built, the first of which was delivered in December 1944 to the Aberdeen Proving Ground, and two were installed at the Watson Scientific Computing Laboratory that IBM had set up at Columbia University under the directorship of Wallace Eckert. The Relay Calculator was followed by the giant IBM Selective Sequence Electronic Calculator, a machine which was very much in the tradition of the Mark I.
Wallace Eckert was responsible for the logical organisation of the machine, with Frank Hamilton being the chief engineer on the project. The design was a compromise between Eckert’s wish, for performance reasons, to use electronic components to the full, and Hamilton’s preference for electro-mechanical relays, on grounds of reliability. As a result vacuum tubes were used for the arithmetic unit, the control circuitry, and the 8 word high-speed store, relays being used elsewhere. In addition to the 8 word store there was a 150 word random access electro-magnetic store and storage for 20 000 numbers in the form of punched tapes. Numbers would be read from the electro-magnetic store, or in sequence from the punched tape store, at the speed of the multiplier, i.e., every 20 milliseconds. The design was started in 1945, and the machine was built in great secrecy at Endicott, before being moved to New York City, where it was publicly unveiled at an elaborate dedication ceremony in January 1948. The most important aspect of the SSEC, credited to R. R. Seeber, was that it could perform arithmetic on, and then execute, stored instructions – it was almost certainly the first operational machine with these capabilities. This led to IBM obtaining some very important patents, but the machine as a whole was soon regarded as somewhat anachronistic and was dismantled in 1952. It had however provided IBM with some valuable experience – for example, Hamilton and some of his engineering colleagues went on to design the highly successful IBM 650, and many of the SSEC programmers later became members of the IBM 701 programming group. Finally, mention should be made of one other machine manufactured by IBM which can be classed as a precursor to the modern electronic digital computer. This was the Card Programmed Calculator, a machine which along with its predecessors now tends to be overshadowed by the SSEC.
Like the Pluggable Sequence Relay Calculator, the CPC can trace its origins to the IBM 600 series of multiplying punches. In 1946 IBM announced the Type 603, the first production electronic calculator. The IBM 603, which incorporated 300 valves, was developed from an experimental multiplier designed at Endicott under the direction of R. L. Palmer in 1942. One hundred machines were sold, and then IBM replaced it with the Type 604, a plugboard-controlled electronic calculator, which provided conditional branching but, lacking backward jumps, no means of constructing program loops. Deliveries of the 604, which incorporated over 1400 valves, started in 1948 and within the next 10 years over 5000 were installed. In 1948 a 604 was coupled to a Type 402 accounting machine by Northrop Aircraft Company, in order to provide the 604 with increased capacity and with printing facilities. This idea was taken up by IBM, and formed the basis of the CPC. Nearly 700 CPC’s were built, and this machine played a vital role in providing computing power to many installations in the USA until stored program electronic computers became commercially available on a reasonable scale. In the years that followed the introduction of the CPC, IBM continued to develop its range of electronic calculators and, starting in 1952 with the IBM 701, an electronic computer in the tradition of von Neumann’s IAS machine, took its first steps towards achieving its present dominant position amongst electronic computer manufacturers.
4. Konrad Zuse
Konrad Zuse started to work on the development of mechanical aids to calculation as early as 1934, at the age of 24. He was studying civil engineering at the Technische Hochschule, Berlin-Charlottenburg, and sought some means of relief from the tedious calculations that had to be performed.
His first idea had been to design special forms to facilitate ordinary manual calculation, but then he decided to try to mechanise the operation. Continuing to use the special layouts that he had designed for his forms, he investigated representing numerical data by means of perforations, and the use of a hand-held sensing device which could communicate the data over an electrical cable to an automatic calculating machine. The idea then arose of using a mechanical register rather than perforated cards, and, realising that the layout was irrelevant, Zuse started to develop a general purpose mechanical store, whose locations were addressed numerically. By 1936 he had the basic design of a floating point binary computer, controlled by a program tape consisting of a sequence of instructions, each of which specified an operation code, two operand addresses and a result address. Thus, apparently quite independently of earlier work by Babbage and his successors on analytical engines, Zuse had very quickly reached the point of having a design for a general-purpose program-controlled computer, although the idea of conditional branching was lacking. More importantly, even though the various basic ideas that his design incorporated had, it now turns out, been thought of earlier (i.e., binary mechanical arithmetic (Leibniz), program control (Babbage), instruction formats with numerical storage addresses (Ludgate) and floating point number representations (Torres y Quevedo)), Zuse’s great achievement was to turn these ideas into reality. Zuse had considerable trouble finding sponsors willing to finance the building of his machine. Despite his financial difficulties his first machine, the Z1, which was of entirely mechanical construction, was completed in 1938, but it proved unreliable in operation.
He then started to construct a second, fixed-point binary, machine which incorporated the 16 word mechanical binary store of the Z1, but was otherwise built from second-hand telephone relays. Although the Z2 computer was completed, it was inadequate for any practical use. However, by this time a colleague, Helmut Schreyer, was already working with Zuse on the problem of producing an electronic version of the Z1. This led to the construction of a small 10-place binary arithmetic unit, with approximately 100 valves, but proposals that Schreyer and Zuse made to the German government for a 1500 valve electronic computer were rejected and the work was discontinued in 1942. Earlier, in 1939, Zuse was called up for military service, but managed to get released after about a year, and for the first time received significant government backing for his plans. This enabled him to build the Z3 computer, a binary machine with a 64 word store, all built out of telephone relays. This computer, since it was operational in 1941, is believed to have been the world’s first general-purpose program-controlled computer. It incorporated units for addition, subtraction, multiplication, division and square root, using a floating point number representation with a sign bit, a 7-bit exponent and a 14-bit mantissa. Input was via a manual keyboard and output via a set of lights, in each case with automatic binary/decimal conversion, and the machine was controlled by a perforated tape carrying single address instructions, i.e., instructions specifying one operand and an operation. In addition to his series of general-purpose computers, Zuse built two special-purpose computers, both used for calculations concerning aircraft wing profiles. The first of these was in use for 2 years at the Henschel Aircraft Works, before being destroyed through war damage.
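The Z3 word layout stated above (a sign bit, a 7-bit exponent and a 14-bit mantissa) can be sketched as a 22-bit packing. The text specifies only the field widths, so the exponent bias, the field ordering and the absence of a hidden bit are assumptions made for illustration, not the Z3's documented encoding.

```python
# Sketch of a 22-bit word with the Z3's stated field widths:
# [ sign:1 | exponent:7 | mantissa:14 ].  Bias and layout are assumed.

BIAS = 63  # hypothetical excess-63 bias for the 7-bit exponent field

def pack(sign, exponent, mantissa):
    """Pack sign (0/1), signed exponent and 14-bit mantissa into one word."""
    assert sign in (0, 1)
    assert 0 <= exponent + BIAS < 128      # must fit in 7 bits
    assert 0 <= mantissa < (1 << 14)       # must fit in 14 bits
    return (sign << 21) | ((exponent + BIAS) << 14) | mantissa

def unpack(word):
    """Recover (sign, exponent, mantissa) from a packed 22-bit word."""
    return (word >> 21) & 1, ((word >> 14) & 0x7F) - BIAS, word & 0x3FFF

w = pack(0, 3, 0b10100000000000)
assert unpack(w) == (0, 3, 0b10100000000000)
```

Whatever the real bias was, the round-trip property shown is what the field widths guarantee: any value expressible in the three fields survives packing and unpacking.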
Both computers had fixed programs, wired on to rotary switches, and performed calculations involving addition, subtraction and multiplication by constant factors. Soon after completion of the Z3, the design of an improved version, the Z4, was started. This was mainly electro-mechanical but incorporated a purely mechanical binary store similar to that which had been used for the Z1 and Z2 machines. The partially completed Z4 was the only one of Zuse’s machines to survive the war – indeed it eventually was completed and gave years of successful service at the Technische Hochschule, Zurich. The Z4 was inspected shortly after the war by R. C. Lyndon, whose report on the machine for the US Office of Naval Research was published in 1947. At this stage the Z4 had only manual input and output, and no means of conditional branching, although it was planned to add four tape readers and two tape punches, and facilities for repeating programs and for choosing between alternate subprograms. The machine was housed in the cellar of a farmhouse in the little village of Hopferau in Bavaria, and was not fully operational, but the mechanical store and various arithmetic operations and their automatic sequencing were successfully demonstrated to Lyndon. His report, although it gives a fairly full description of the Z4 (with the exception of the mechanical store, which he was not allowed to examine in detail), made virtually no mention of Zuse’s earlier work. Indeed it was many years before any other English language accounts of Zuse’s work were published, and Zuse’s rightful place in the chronology of computer development became at all widely appreciated.
5. Bell Telephone Laboratories
The potentialities of telephone equipment for the construction of digital calculation devices were not realised for many years. The first automatic telephone exchange, which used the step-by-step or Strowger switch, was installed in 1892.
As early as 1906 Molina devised a system for translating the pulses representing the dialled decimal digits into a more convenient number system. Exchanges based mainly on the use of electromechanical relays started to come into use at the turn of the century, the earliest successful centralised automatic exchanges dating from about 1914. However, from the late 1920’s various different calculating devices were developed using telephone equipment. Perhaps the most spectacular of these was the automatic totalisator. Totalisator, or “pari-mutuel,” betting became legal on British race courses in July 1929. Development of fully automatic totalisators, consisting of ticket-issuing machines situated in various parts of the race course, a central calculating apparatus, and display boards which indicated the number and total value of bets made on each horse and on the race as a whole, was already well under way. There were several rival systems. The Hamilton Totalisator and the totalisator produced by the British Automatic Totalisator Company were fully electrical, both as regards the calculations performed and the operation of the display boards, whereas the Lightning Totalisator used electrical impulses from remote ticket machines only to release steel balls which fell through tubes and actuated a mechanical adding apparatus. In January 1930 the Racecourse Betting Control Board demonstrated at Thirsk Racecourse a new standard electric totalisator supplied by British Thomson-Houston, built from Strowger switches. This machine, which was transportable from racecourse to racecourse, could accumulate bets on up to six horses at a maximum rate of 12 000 per minute. The machine had in fact been designed in Baltimore, Maryland, in 1928, but the first complete machine to be used in the USA was installed by the American Totalisator Company at Arlington Park only in 1933.
In succeeding years much more sophisticated totalisators, involving hundreds of remote ticket-issuing machines, were used at racecourses all over the USA, and it was not until many years after the advent of the electronic computer that one was used as a replacement for the central calculating apparatus of the totalisator. One early little-known design for a calculating machine to be built from telephone relays was that of Bernard Weiner in Czechoslovakia in 1923. Weiner, in association with the Vitkovice Iron Works, went on during the 1930’s to design a more powerful automatic calculator. He did not survive the war, and nothing is known about the results of this work. Other early work was done by Nicoladze, who in 1928 designed a multiplier based on the principle of Genaille’s rods. (These were a non-mechanical aid to multiplication which enabled a person to read off the product of a multi-digit number by a single digit number.) Four years later Hamann described not only various different styles of relay-based multiplier, but also a device for solving sets of simultaneous linear equations, and shortly afterwards Weygandt demonstrated a prototype determinant evaluator, capable of dealing with 3 x 3 determinants. Undoubtedly in the years that followed many other digital calculating devices were developed based on telephone relay equipment, particularly during the war for such military applications as ballistics calculations and cryptanalysis – indeed, as mentioned earlier, some of Zuse’s machines made extensive use of telephone relays. It is perhaps a little surprising that it was not until 1937 that Bell Telephone Laboratories investigated the design of calculating devices, although from about 1925 the possibility of using relay circuit techniques for such purposes was well accepted there. However, in 1937 George Stibitz started to experiment with relays, and drew up circuit designs for addition, multiplication and division.
At first he concentrated on binary arithmetic, together with automatic decimal-binary and binary-decimal conversion, but later turned his attention to a binary-coded decimal number representation. The project became an official one when, prompted by T. C. Fry, Stibitz started to design a calculator capable of multiplying and dividing complex numbers, which was intended to fill a very practical need, namely to facilitate the solution of problems in the design of filter networks, and so started the very important Bell Telephone Laboratories Series of Relay Computers. In November 1938, S. B. Williams took over responsibility for the machine’s development and together with Stibitz refined the design of the calculator, whose construction was started in April and completed in October of 1939. The calculator, which became known as the “Complex Number Computer” (often shortened to “Complex Computer,” and, as other calculators were built, the “Model I”), began routine operation in January 1940. Within a short time it was modified so as to provide facilities for the addition and subtraction of complex numbers, and was provided with a second, and then a third, teletype control, situated in remote locations. It remained in daily use at Bell Laboratories until 1949. The Complex Computer was publicly demonstrated for the first time in September 1940 by being operated in its New York City location from a teletypewriter installed in Hanover, New Hampshire, on the occasion of a meeting of the American Mathematical Society, a demonstration that both John Mauchly and Norbert Wiener attended. During 1939 and 1940 Stibitz started work on the idea of automatic sequencing and on the use of error-detecting codes. These ideas were not pursued actively until, a year or so later, the onset of the war provided a strong stimulus and the necessary financial climate. They then formed the basis of the second of the Bell Laboratories relay calculators, the “Relay Interpolator.”
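The Complex Computer's central task, multiplying and dividing complex numbers, reduces to short sequences of real operations, which is what its relay circuits had to supply. A minimal sketch of the underlying identities (of the mathematics, not of Stibitz's actual circuit design):

```python
# Complex arithmetic expressed as real operations:
# (a + bi)(c + di) = (ac - bd) + (ad + bc)i
# (a + bi)/(c + di) = ((ac + bd) + (bc - ad)i) / (c^2 + d^2)

def complex_multiply(a, b, c, d):
    """Real and imaginary parts of (a + bi) * (c + di)."""
    return a * c - b * d, a * d + b * c

def complex_divide(a, b, c, d):
    """Real and imaginary parts of (a + bi) / (c + di), with c + di != 0."""
    denom = c * c + d * d
    return (a * c + b * d) / denom, (b * c - a * d) / denom

assert complex_multiply(1, 2, 3, 4) == (-5, 10)
```

A multiplication thus costs four real multiplications and two additions, and a division adds two more multiplications and two real divisions; for a relay machine this fixed, small operation count made complex arithmetic a natural first target.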
This was a special-purpose tape-controlled device, with self-checking arithmetic, designed to solve fire control problems, and was built for the National Defense Research Council, to which Stibitz had been lent by Bell Laboratories. Although mainly used for interpolation it was also used for a few problems in harmonic analysis, calculation of roots of polynomials and solution of differential equations. It became operational in September 1943, and after the war it was handed over to the US Naval Research Laboratory, where it was in use until 1961. The Model III relay calculator, the “Ballistic Computer,” work on which started in 1942, was a much more complete realisation of Stibitz’s early plans for an automatic computer, and although once again intended for fire control problems was much more versatile than the Model II. It was tape-controlled, and had a ten-register store, a built-in multiplier (designed by E. L. Vibbard), and devices for performing automatic look-up of tables held on perforated paper tape. Perhaps most impressive was the fact that the machine was 100 per cent self-checked. The machine was completed in June 1944, and remained in use until 1958. The Model IV relay calculator was little different from the Model III, and the series culminated in the Model V, a truly general-purpose program-controlled computer, complete with convenient conditional branching facilities. (The final member of the series, Model VI, was essentially just a simplified version of the Model V.) Two copies of the Model V were built, the first being delivered in 1946 to the National Advisory Committee on Aeronautics at Langley Field, Virginia, and the second in 1947 to the Ballistics Research Laboratory at Aberdeen, Maryland. With its multiple computing units, the Model V, which used floating point arithmetic, was what we would now call a multiprocessing system, and its “problem tapes” were the forerunners of the early simple batch-processing operating systems.
Each of the two computing units comprising a complete system contained 15 storage registers. A single register could hold a floating point number consisting of a sign, a seven-decimal digit mantissa and a two-digit exponent. Decimal digits were stored in a bi-quinary form, using seven relays, and each register used a total of 62 relays. Each unit had independent provision for the addition, subtraction, multiplication and division and for taking the square root of floating point numbers, and for printing or punching its results. In addition a large set of tape readers, intended for tapes of input data, tabulated functions and programs, and for the problem tapes which controlled the running of series of separate programs, was shared by the two computer units. These units normally functioned as independent computers, but for large problems would be arranged to work cooperatively. Although somewhat slow in execution, the Model V set new standards for reliability, versatility and ease of switching from one task to another, and in so doing must surely have had an important influence on the designers of the earliest round of general-purpose electronic computers. In later years, quite a number of relay calculators were constructed, in both the USA and Europe, even after the first stored program electronic computers became operational, but the importance of their role in the history of computers hardly matches that of the Bell Laboratories Model V and its contemporaries.
6. The advent of electronic computers
The earliest known electronic digital circuit, a “trigger relay,” which involved a pair of valves in a circuit with two stable states and was an early form of flip-flop, was described by Eccles and Jordan in 1919. The next development that we know of was the use by Wynn-Williams at the Cavendish Laboratory, Cambridge, of thyratrons in counting circuits including, in 1932, a “scale-of-two” (binary) counter.
By the end of the decade quite a few papers had been published on electronic counters intended for counting impulses from Geiger-Müller tubes used in nuclear physics experiments. Wynn-Williams’ work had a direct influence on the ideas of William Phillips, who apparently in 1935 attempted to patent a binary electronic computing machine. He built a mechanical model, which still exists, of the intended electronic multiplication unit, but no other details are presently known of his planned machine. The first known attempt to build an electronic digital calculating machine was begun by John V. Atanasoff in the mid-1930’s at Iowa State College, where there had been an active interest in statistical applications using punched card equipment since the early 1920’s. As an applied mathematician Atanasoff had many problems requiring generalisations of existing methods of approximating solutions of linear operational equations. He first explored the use of analog techniques and with Lynn Hannum, one of his graduate students, developed the “Laplaciometer,” a device for solving Laplace’s equation in two dimensions with various boundary conditions. By 1935 the realisation of the sharp limitations of analog computing forced Atanasoff to digital methods. The disadvantages of mechanical techniques and his knowledge of electronics and of the work of Eccles and Jordan then led him to consider an electronic approach. He soon found that in these circumstances a base two number system would have great advantages. In 1936-1937 Atanasoff abandoned the Eccles-Jordan approach and conceived a system employing memory and logic circuits, whose details were worked out in 1938. He received a grant from Iowa State in 1939, and was joined by Clifford E. Berry. With Berry’s assistance a prototype computing element was built and operating by the autumn of that year.
They then undertook the design and construction of a large machine intended for the solution of up to 30 simultaneous linear equations. At the heart of the machine there was a pair of rotating cylinders around the surface of which a set of small electrical condensers was placed. Each condenser could, by the direction of its charge, represent a binary digit; although the charge would leak away slowly, it was arranged that as the cylinders rotated the charge on each condenser was detected and reinforced at 1 second time intervals, so that information could be stored for as long as required. The condensers were arranged so as to provide two sets of 30 binary words, each consisting of 50 bits, the condensers corresponding to a single word being arranged in a plane perpendicular to the axis of the cylinders. The results of intermediate steps of a computation were to be punched in binary form on cards, for later re-input to the machine. In order that card punching and reading should be fast enough to keep pace with the computation, special devices were designed that made and detected holes in cards by means of electrical sparks. Ordinary input and output was to be via conventional punched cards, with the machine providing automatic binary/decimal conversions. The machine, with binary addition, subtraction and shifting as its basic arithmetic facilities, was designed to solve sets of simultaneous linear equations by the method of successive elimination of unknowns.
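The "method of successive elimination of unknowns" is what we now call Gaussian elimination. A minimal sketch, without the pivoting that a robust implementation would need (it assumes nonzero diagonal entries):

```python
# Solve A x = b by successive elimination of unknowns: subtract multiples
# of equation k from the later equations to remove unknown k, then
# back-substitute.  No pivoting: a[k][k] must be nonzero throughout.

def solve(a, b):
    n = len(b)
    # Forward elimination: remove unknown k from equations k+1 .. n-1.
    for k in range(n):
        for i in range(k + 1, n):
            factor = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= factor * a[k][j]
            b[i] -= factor * b[k]
    # Back substitution: the last equation now has a single unknown.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(a[i][j] * x[j] for j in range(i + 1, n))) / a[i][i]
    return x

# 2x + y = 5 and x + 3y = 10 give x = 1, y = 3:
assert solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]) == [1.0, 3.0]
```

Eliminating one unknown from one equation needs only multiplication by a factor and subtraction, which is why a machine offering just addition, subtraction and shifting, with intermediate results cycled out to cards, was sufficient for the method.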
The electronic part of the computer was operational but the binary card reader was still unreliable when, in 1942, Atanasoff and Berry left Iowa State for wartime jobs, so that the machine was abandoned, never having seen actual use. In the late 1930’s and early 1940’s several groups started to investigate the use of digital electronic circuits as replacements for mechanical or electro-mechanical calculating devices, including several of the American business machine manufacturers such as IBM, whose work was described briefly above. The earliest known efforts at applying electronics to a general-purpose program-controlled computer were those undertaken by Schreyer and Zuse, also mentioned earlier. The next development which should be mentioned is the still classified series of electronic cryptanalytic machines that were designed and built in Britain during the war. The machines that are of particular interest with respect to the development of electronic computers are the Colossi, the first of which was operational in late 1943, while by the end of the war ten had been installed. Each Colossus incorporated approximately 2000 valves, and processed a punched data tape that was read at a speed of 5000 characters per second. Preset patterns that were to be compared against the input data were generated from stored component patterns. These components were stored in ring registers made of thyratrons and could be manually set by plug-in pins. The Colossi were developed by a team led by M. H. A. Newman.
Alan Turing, who had been one of the main people involved in the design of an electro-mechanical predecessor to the Colossi, was apparently not directly associated with the new design, but with others provided the requirements that the machines were to satisfy. The comparative lack of technical details about the design of these machines makes it unreasonable to attempt more than a preliminary, and somewhat hesitant, assessment of the Colossi with respect to the modern digital computer. It would appear that the arithmetical, as opposed to logical, capabilities were minimal, involving only counting rather than general addition or other operations. They did, however, have a certain amount of electronic storage. Although fully automatic, even to the extent of producing printed output, they were very much special-purpose machines, but within their field of specialisation the facilities provided by plug-boards and banks of switches afforded a considerable degree of flexibility; in fact several of the people involved in the project have since characterised the machines as being “program-controlled.” Their importance as cryptanalytic machines, which must have been immense, can only be inferred from the number of machines that were made and the honours bestowed on various members of the team after the end of the war; however, their importance with respect to the development of computers was twofold. They demonstrated the practicality of large-scale electronic digital equipment, just as ENIAC did, on an even grander scale, approximately 2 years later. Furthermore, they were also a major source of the designers of some of the first post-war British computers, namely the Manchester machine, the MOSAIC, and the ACE at the National Physical Laboratory.
Fascinating though they are, none of the efforts described so far comes near to matching the importance of the work at the Moore School of Electrical Engineering, University of Pennsylvania, which led to the design of first the ENIAC and then the EDVAC computers.

By 1942 the Moore School had, because of the pressures of war, become closely associated with the Ballistic Research Laboratory of the US Army Ordnance Department, and the Moore School's differential analyser was being used to supplement the work of the one at the Ballistic Research Laboratory on the production of ballistic tables. (The two analysers were identical and had been patterned on the original differential analyser invented by Vannevar Bush in 1930.) One of the people who had worked with the analyser was John Mauchly, then an assistant professor at the Moore School. Mauchly was by this time well aware of what could be done with desk calculating machines and punched card equipment, although he was apparently unaware of the work Aiken was then doing on what became the Harvard Mark I, or of Babbage's efforts 100 years earlier. He did, however, know of the work of Stibitz, and had visited Iowa State in June 1941 in order to see Atanasoff's special-purpose computer.

Another person who worked on the Moore School differential analyser, and in fact made important improvements to it by replacing its mechanical amplifiers with partially electronic devices, was J. Presper Eckert, a research associate at the School. Eckert had met Mauchly in 1941, and it was their discussions about the possibility of surmounting the reliability problems of complex electronic devices that laid the groundwork for a memorandum that Mauchly wrote in August 1942. This proposed that an electronic digital computer be constructed for the purpose of solving numerical difference equations of the sort encountered in ballistics problems. Also at the Moore School, acting as a liaison officer for Colonel Paul N.
Gillon of the office of the Chief of Ordnance, was Herman H. Goldstine, who before the war had been an assistant professor of mathematics at the University of Michigan.

In early 1943 Goldstine and Gillon became interested in the possibility of using an electronic calculating machine for the preparation of firing and bombing tables. By this time Mauchly's 1942 memorandum had been mislaid, and it had to be recreated from his secretary's notes. The second version of the memorandum, together with more detailed plans drawn up by Mauchly and Eckert, was included in a report dated April 1943 which formed the basis for a contract between the University of Pennsylvania and the US Government to develop an electronic computer. A large team was assembled at the Moore School in order to design and build the computer under the supervision of J. G. Brainerd, with Eckert as chief engineer and Mauchly as principal consultant.

As the project progressed its aims broadened, so that the ENIAC, as it became known, turned out to be much more of a general-purpose device than had originally been contemplated, and although programs were represented by plugged interconnecting wires, it provided full conditional branching facilities. It was an incredibly ambitious machine, incorporating over 19,000 valves and consuming approximately 200 kilowatts of electric power! (The number of valves largely resulted from their use for high-speed storage, and from the choice of number representation, which can best be described as "unary-coded decimal.") The ENIAC incorporated twenty 10-digit accumulators, which could be used for addition and subtraction and for the temporary storage of numbers, a multiplier, and a combination divider and square-rooter. Addition took 200 microseconds, and multiplication of two 10-digit numbers approximately 3 milliseconds.
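The phrase "unary-coded decimal" refers to ENIAC's use of ten-stage ring counters: each decimal digit was held as the position of a single active stage in a ring of ten valves, advanced one stage per pulse, which is why the representation was so costly in valves. The sketch below models a 10-digit accumulator of this kind; the class and method names are ours, and the carry behaviour is an assumption, shown only to make the idea of pulse-driven decimal addition concrete.

```python
class RingAccumulator:
    """Toy model of a decimal accumulator built from ten-state ring counters,
    one ring per digit, least-significant digit first."""

    def __init__(self, digits=10):
        self.rings = [0] * digits

    def add(self, number):
        """Add a number by sending the appropriate count of pulses to each
        digit ring (unary coding: a digit d costs d pulses)."""
        for pos in range(len(self.rings)):
            pulses = (number // 10**pos) % 10
            for _ in range(pulses):
                self._pulse(pos)

    def _pulse(self, pos):
        if pos >= len(self.rings):
            return  # overflow past the most significant ring is dropped
        self.rings[pos] = (self.rings[pos] + 1) % 10
        if self.rings[pos] == 0:      # the ring wrapped past 9 ...
            self._pulse(pos + 1)      # ... so emit a carry pulse onward

    def value(self):
        return sum(d * 10**p for p, d in enumerate(self.rings))

acc = RingAccumulator()
acc.add(975)
acc.add(48)
print(acc.value())  # prints 1023
```

The unary cost is visible here: adding a digit 9 takes nine pulses, whereas a binary-coded machine would set four flip-flops at once. That trade-off is part of why ENIAC needed so many valves for so little storage.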
Storage was provided for approximately 300 numerical constants in function tables, which could be set up on manual switches prior to commencing a computation. Input and output were via punched cards, using standard IBM devices. Early in its career the method of programming the machine was modified so that the program was represented by settings of the function tables, without the need for changing the interconnecting cables.